Research on Visual Perception of Speed Bumps for Intelligent Connected Vehicles Based on Lightweight FPNet.

Authors: Wang Ruochen, Luo Xiaoguo, Ye Qing, Jiang Yu, Liu Wei
In the field of intelligent connected vehicles, precise, real-time identification of speed bumps is critical to the safety of autonomous driving. To address the difficulty existing visual perception algorithms have in maintaining both detection accuracy and real-time performance under image distortion and complex environmental conditions, this study proposes an enhanced lightweight neural network framework, YOLOv5-FPNet. The framework strengthens perception in two key phases: feature extraction and loss constraint. First, FPNet, built on FasterNet and Dynamic Snake Convolution, is developed to adaptively and accurately extract the structural features of distorted speed bumps. Next, the C3-SFC module is proposed to improve the adaptability of the neck and head to distorted features. In addition, the SimAM attention mechanism is embedded in the backbone to strengthen the extraction of key features. Finally, an adaptive loss function, Inner-WiseIoU, based on a dynamic non-monotonic focusing mechanism, is designed to improve bounding-box generalization and fitting. Experimental evaluations on a custom speed bump dataset demonstrate the superior performance of FPNet, with improvements in mAP, mAP50_95, and FPS of 38.76%, 143.15%, and 51.23%, respectively, over conventional lightweight neural networks. Ablation studies confirm the effectiveness of the proposed improvements. This research provides a fast and accurate speed bump detection solution for autonomous vehicles and offers theoretical insights for obstacle recognition in intelligent vehicle systems.
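The paper's implementation is not reproduced here; purely as an illustration of the SimAM attention mentioned above, a minimal PyTorch sketch of the parameter-free mechanism could look like the following. The class name, the default lambda value of 1e-4, and the placement inside the backbone are assumptions for illustration, not details taken from the paper.

    import torch
    import torch.nn as nn

    class SimAM(nn.Module):
        # Parameter-free attention: each position is weighted by an inverse-energy
        # score derived from how much it deviates from its channel's mean.
        def __init__(self, e_lambda: float = 1e-4):  # lambda value assumed, not from the paper
            super().__init__()
            self.e_lambda = e_lambda

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            b, c, h, w = x.shape
            n = h * w - 1
            # squared deviation of each spatial position from the per-channel mean
            d = (x - x.mean(dim=[2, 3], keepdim=True)).pow(2)
            # per-channel variance estimate over spatial positions
            v = d.sum(dim=[2, 3], keepdim=True) / n
            # inverse energy: more distinctive positions get larger scores
            e_inv = d / (4 * (v + self.e_lambda)) + 0.5
            # sigmoid maps scores to (0, 1) attention weights applied to the input
            return x * torch.sigmoid(e_inv)

Because the module introduces no learnable parameters, it can in principle be inserted after a backbone stage without increasing model size, which is consistent with the lightweight design goal described in the abstract.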
