Research on Visual Perception of Speed Bumps for Intelligent Connected Vehicles Based on Lightweight FPNet

Authors: Wang Ruochen, Luo Xiaoguo, Ye Qing, Jiang Yu, Liu Wei
In the field of intelligent connected vehicles, precise, real-time identification of speed bumps is critical to the safety of autonomous driving. Existing visual perception algorithms struggle to maintain both identification accuracy and real-time performance under image distortion and complex environmental conditions. To address this, this study proposes an enhanced lightweight neural network framework, YOLOv5-FPNet, which strengthens perception at two key stages: feature extraction and loss constraint. First, FPNet, built on FasterNet and Dynamic Snake Convolution, is developed to accurately and adaptively extract the structural features of distorted speed bumps. Next, the C3-SFC module is proposed to improve the adaptability of the neck and head components to distorted features. Furthermore, the SimAM attention mechanism is embedded in the backbone to strengthen the extraction of key features. Finally, an adaptive loss function, Inner-WiseIoU, based on a dynamic non-monotonic focusing mechanism, is designed to improve the generalization and fitting ability of bounding-box regression. Experimental evaluations on a custom speed-bump dataset demonstrate the superior performance of FPNet, with improvements in mAP, mAP50-95, and FPS of 38.76%, 143.15%, and 51.23%, respectively, over conventional lightweight neural networks. Ablation studies confirm the effectiveness of each proposed improvement. This research provides a fast and accurate speed-bump detection solution for autonomous vehicles and offers theoretical insight for obstacle recognition in intelligent vehicle systems.
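The abstract gives no implementation details, but two of the named components have well-known published formulations. SimAM (Yang et al., ICML 2021) is a parameter-free attention mechanism, so embedding it adds no learnable weights to the backbone. The PyTorch sketch below follows the reference formulation; the `e_lambda` regularizer and the sigmoid gating are the published defaults, not details confirmed by this paper.

```python
import torch
import torch.nn as nn


class SimAM(nn.Module):
    """Parameter-free SimAM attention (Yang et al., ICML 2021).

    Each activation is re-weighted by an energy-based saliency score,
    so the module adds attention without adding learnable weights.
    """

    def __init__(self, e_lambda: float = 1e-4):
        super().__init__()
        self.e_lambda = e_lambda  # regularizer from the SimAM paper

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, height, width)
        n = x.shape[2] * x.shape[3] - 1
        # Squared deviation of each activation from its per-channel mean.
        d = (x - x.mean(dim=[2, 3], keepdim=True)).pow(2)
        # Per-channel variance estimate over the spatial dimensions.
        v = d.sum(dim=[2, 3], keepdim=True) / n
        # Inverse energy: distinctive neurons get higher scores.
        e_inv = d / (4 * (v + self.e_lambda)) + 0.5
        # Sigmoid gating, as in the reference implementation.
        return x * torch.sigmoid(e_inv)


# Shape-preserving, so it can follow any convolutional stage:
# SimAM()(torch.randn(1, 64, 80, 80)) has shape (1, 64, 80, 80).
```

Likewise, the "dynamic non-monotonic focusing mechanism" in Inner-WiseIoU matches the description of Wise-IoU v3 (Tong et al., 2023), which rescales each box's IoU loss by an outlier-degree coefficient. The sketch below shows that coefficient alone; it is an assumption about what the paper's loss builds on, not its confirmed implementation, and the `alpha`/`delta` defaults are those reported in the Wise-IoU paper.

```python
import torch


def wiou_v3_focusing(liou: torch.Tensor, liou_mean: torch.Tensor,
                     alpha: float = 1.9, delta: float = 3.0) -> torch.Tensor:
    """Non-monotonic focusing coefficient of Wise-IoU v3 (Tong et al., 2023).

    beta is the "outlier degree" of a predicted box: its IoU loss
    relative to a running mean of IoU losses. The coefficient
    r = beta / (delta * alpha ** (beta - delta)) peaks at moderate
    beta, so medium-quality boxes get the largest gradient gain while
    very poor (likely mislabeled) boxes are down-weighted.
    """
    beta = liou.detach() / liou_mean  # detach: beta must not carry gradient
    return beta / (delta * alpha ** (beta - delta))
```

In Wise-IoU v3 the running mean `liou_mean` is maintained with momentum across training batches; the Inner-IoU part of the name suggests the per-box IoU loss is additionally computed on ratio-scaled auxiliary boxes, though the exact combination used here is not specified in the abstract.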
