Binocular stereo vision-based relative positioning algorithm for drone swarm


Abstract

To address the high computational complexity and poor real-time performance of binocular vision-based Unmanned Aerial Vehicle (UAV) formation flight, this paper introduces a UAV localization algorithm built on a lightweight object detection model. First, we optimize the YOLOv5s model using lightweight design principles, producing Yolo-SGN, which reduces the parameter count by 65.5% and FLOPs by 62.7% while improving accuracy by 1.8% over the original detection model. Yolo-SGN is then used to extract target regions from the binocular images, and feature point matching is performed only within these regions, avoiding unnecessary computation in non-target areas. Experimental results show that combining Yolo-SGN with the Oriented FAST and Rotated BRIEF (ORB) algorithm reduces feature-matching computation to a quarter of that of the original ORB algorithm, significantly improving real-time performance. Finally, the feature points extracted from the UAVs are fed into a binocular vision localization model to compute their three-dimensional coordinates, and the average of the three-dimensional coordinates of all feature points gives the three-dimensional position of the target UAV. Experimental results confirm that the proposed UAV binocular vision localization algorithm performs well in both precision and real-time capability.
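The final step the abstract describes (triangulating matched feature points and averaging their 3-D coordinates) can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a calibrated, rectified stereo pair under a pinhole camera model, and all function names and camera parameters (`fx`, `fy`, `cx`, `cy`, `baseline`) are hypothetical placeholders.

```python
# Hypothetical sketch: stereo triangulation of matched feature points,
# then averaging to estimate the target UAV's 3-D position.
# Assumes a rectified stereo pair, so matched points share a scanline (v).

def triangulate(u_left, u_right, v, fx, fy, cx, cy, baseline):
    """Recover a 3-D point (X, Y, Z) in the left-camera frame from one
    matched pixel pair (u_left, u_right) on the same rectified row v."""
    disparity = u_left - u_right
    if disparity <= 0:
        raise ValueError("non-positive disparity: bad match or point at infinity")
    Z = fx * baseline / disparity   # depth from disparity
    X = (u_left - cx) * Z / fx      # back-project horizontal pixel offset
    Y = (v - cy) * Z / fy           # back-project vertical pixel offset
    return (X, Y, Z)

def localize_uav(matches, fx, fy, cx, cy, baseline):
    """Average the 3-D coordinates of all matched feature points, as the
    abstract describes, to estimate the target UAV's position.
    `matches` is a list of (u_left, u_right, v) pixel tuples."""
    points = [triangulate(ul, ur, v, fx, fy, cx, cy, baseline)
              for (ul, ur, v) in matches]
    n = len(points)
    return tuple(sum(coord) / n for coord in zip(*points))
```

For example, with fx = 800 px and a 0.1 m baseline, a match at (420, 400) on row 320 has disparity 20 px, giving a depth of 800 × 0.1 / 20 = 4.0 m. In practice the matched pairs would come from the ORB matching step restricted to the Yolo-SGN detection boxes.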
