Real-Time 6D Pose Estimation and Multi-Target Tracking for Low-Cost Multi-Robot System


Abstract

In the research field of multi-robot cooperation, reliable and low-cost motion capture is crucial for system development and validation. To address the high costs of traditional motion capture systems, this study proposes a real-time 6D pose estimation and tracking method for multi-robot systems based on YolPnP-FT. Using only an Intel RealSense D435i depth camera, the system achieves simultaneous robot classification, 6D pose estimation, and multi-target tracking in real-world environments. The YolPnP-FT pipeline introduces a keypoint confidence filtering strategy (PnP-FT) at the output of the YOLOv8 detection head and employs Gaussian-penalized Soft-NMS to enhance robustness under partial occlusion. Based on these detection results, a linearly weighted combination of Mahalanobis distance and cosine distance enables stable ID assignment in visually similar multi-robot scenarios. Experimental results show that, at a camera height below 2.5 m, the system achieves an average position error of less than 0.009 m and an average angular error of less than 4.2°, with a stable tracking frame rate of 19.8 FPS at 1920 × 1080 resolution. Furthermore, the perception outputs are validated in a CoppeliaSim-based simulation environment, confirming their utility for downstream coordination tasks. These results demonstrate that the proposed method provides a low-cost, real-time, and deployable perception solution for multi-robot systems.
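The abstract names two concrete mechanisms: Gaussian-penalized Soft-NMS for robustness under partial occlusion, and a linearly weighted combination of Mahalanobis and cosine distance for ID assignment. A minimal sketch of both is given below; the function names, the decay parameter `sigma`, and the mixing weight `lam` are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def iou(box, boxes):
    """IoU between one box and an array of boxes, format [x1, y1, x2, y2]."""
    x1 = np.maximum(box[0], boxes[:, 0])
    y1 = np.maximum(box[1], boxes[:, 1])
    x2 = np.minimum(box[2], boxes[:, 2])
    y2 = np.minimum(box[3], boxes[:, 3])
    inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
    area = (box[2] - box[0]) * (box[3] - box[1])
    areas = (boxes[:, 2] - boxes[:, 0]) * (boxes[:, 3] - boxes[:, 1])
    return inter / (area + areas - inter)

def soft_nms_gaussian(boxes, scores, sigma=0.5, score_thresh=0.001):
    """Gaussian-penalized Soft-NMS: instead of suppressing overlapping
    detections outright, decay their scores by exp(-IoU^2 / sigma), so
    partially occluded robots are down-weighted rather than discarded.
    Returns (kept indices in selection order, their decayed scores)."""
    scores = scores.astype(float).copy()
    idxs = np.arange(len(scores))
    keep = []
    while len(idxs) > 0:
        m = np.argmax(scores[idxs])
        best = idxs[m]
        keep.append(int(best))
        idxs = np.delete(idxs, m)
        if len(idxs) == 0:
            break
        overlaps = iou(boxes[best], boxes[idxs])
        scores[idxs] *= np.exp(-(overlaps ** 2) / sigma)  # Gaussian decay
        idxs = idxs[scores[idxs] > score_thresh]          # prune near-zero scores
    return keep, scores[keep]

def fused_distance(d_maha, d_cos, lam=0.7):
    """Linearly weighted motion/appearance cost for ID assignment:
    d_maha is a Mahalanobis (motion-gate) distance, d_cos a cosine
    (appearance) distance; lam is an assumed mixing weight."""
    return lam * d_maha + (1 - lam) * d_cos
```

With two heavily overlapping boxes, the lower-scoring one survives with a reduced score instead of being removed, which is what preserves detections under partial occlusion.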
