Surface Vessels Detection and Tracking Method and Datasets with Multi-Source Data Fusion in Real-World Complex Scenarios



Abstract

Environment sensing plays an important role in the safe autonomous navigation of intelligent ships. However, the inherent limitations of individual sensors, such as the low update frequency of the automatic identification system (AIS), the blind zone of marine radar, and the lack of depth information in visible images, make it difficult to achieve accurate sensing from a single modality of sensor data. To overcome this limitation, we propose a new multi-source data fusion framework and associated techniques that integrate AIS, radar, and visible-image data. The framework leverages the complementary strengths of these sensors to enhance sensing performance, especially in real-world complex scenarios where single-modality data are significantly degraded by blind zones and adverse weather. We first design a multi-stage detection and tracking method, named MSTrack. By feeding historical fusion results back to earlier tracking stages, the method identifies and refines potential missed detections in the layered detection and tracking of radar and visible images. We then propose a cascade association matching method to associate multi-source trajectories: it first performs pairwise association in a high-accuracy aligned coordinate system, then association in a low-accuracy coordinate system, and finally integrated matching across the multi-source data. These cascaded operations effectively reduce association errors caused by measurement noise and systematic projection errors. Furthermore, we develop the first multi-source fusion dataset for intelligent vessels, WHUT-MSFVessel, and use it to validate our methods. The experimental results show that our multi-source data fusion methods significantly improve sensing accuracy and the identity consistency of tracking, achieving average MOTA scores of 0.872 and 0.938 on radar and visible images, respectively, and IDF1 scores of 0.811 and 0.929. Additionally, the fusion accuracy reaches 0.9, providing vessels with a comprehensive perception of the navigation environment for safer navigation.
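The cascade association described above, matching first in a high-accuracy aligned coordinate system and only then, for the leftovers, in a low-accuracy one, can be illustrated with a minimal two-modality sketch. This is not the paper's implementation: the function names, the greedy nearest-neighbour matcher, the coordinate keys (`pos_high`, `pos_low`), and the gate thresholds are all hypothetical simplifications, and the real method additionally handles AIS trajectories and the integrated matching stage.

```python
import math


def greedy_match(radar, visual, key, gate):
    """Greedy nearest-neighbour association under a distance gate.

    `radar` / `visual` map track id -> dict of coordinate keys, e.g.
    {"pos_high": (x, y), "pos_low": (x, y)} (hypothetical layout).
    Tracks lacking `key` (e.g. outside the aligned region) are skipped.
    """
    candidates = sorted(
        (math.dist(r[key], v[key]), rid, vid)
        for rid, r in radar.items()
        for vid, v in visual.items()
        if key in r and key in v
    )
    pairs, used_r, used_v = {}, set(), set()
    for dist, rid, vid in candidates:  # closest pairs claimed first
        if dist <= gate and rid not in used_r and vid not in used_v:
            pairs[rid] = vid
            used_r.add(rid)
            used_v.add(vid)
    return pairs


def cascade_associate(radar, visual, gate_high=20.0, gate_low=80.0):
    """Two-stage cascade: a tight gate in the high-accuracy aligned frame
    first, then a looser gate in the low-accuracy frame for tracks that
    remain unmatched (gate values are illustrative, not from the paper)."""
    matches = greedy_match(radar, visual, "pos_high", gate_high)
    rest_r = {rid: t for rid, t in radar.items() if rid not in matches}
    rest_v = {vid: t for vid, t in visual.items()
              if vid not in matches.values()}
    matches.update(greedy_match(rest_r, rest_v, "pos_low", gate_low))
    return matches
```

Running the tight stage first means that a precise match, when available, is never overridden by a coarse one, which is one plausible way the cascade limits the influence of projection errors on association.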
