Research on Orchard Navigation Line Recognition Method Based on U-Net


Abstract

To address the complex image backgrounds and numerous sources of interference that visual navigation systems face in orchard environments, this paper proposes an orchard navigation line recognition method based on U-Net. First, the drivable areas in the collected images are labeled with Labelme (a graphical image-annotation tool) to build an orchard dataset. A Spatial Attention (SA) mechanism is then inserted into the downsampling stage of the standard U-Net semantic segmentation network, and a Coordinate Attention (CA) mechanism is added to the skip connections, so that the model captures more complete context and better restores drivable-area features, improving overall segmentation accuracy. The improved U-Net is trained on the augmented dataset to obtain the drivable-area segmentation model. From the predicted segmentation mask, navigation line information is extracted by computing the geometric center point of the drivable region row by row; after sliding-window smoothing and bidirectional interpolation filling of these center points, the navigation line is generated by spline interpolation. Finally, the proposed method is compared against the U-Net, SegViT, SE-Net, and DeepLabv3+ networks. The results show that the improved drivable-area segmentation model achieves a recall of 90.23%, a precision of 91.71%, a mean pixel accuracy (mPA) of 87.75%, and a mean intersection over union (mIoU) of 84.84%. When the recognized navigation line is compared with the actual center line, the average distance error is 56 mm, indicating that the method can serve as an effective reference for visual autonomous navigation in orchard environments.
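The navigation-line extraction pipeline described above (row-wise geometric centers, sliding-window smoothing, interpolation filling, then a spline) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the window size, and the use of a cubic spline and moving-average smoothing are assumptions, and the input is assumed to be a binary NumPy mask of the drivable area.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def extract_navigation_line(mask, window=15):
    """Sketch of navigation-line extraction from a binary drivable-area mask.

    For each image row, take the geometric center of the mask pixels,
    fill rows with no mask pixels by interpolating between neighbors,
    smooth the centers with a sliding-window mean, and fit a spline
    through the result. All parameter choices here are illustrative.
    """
    h, _ = mask.shape
    centers = np.full(h, np.nan)
    for y in range(h):
        xs = np.flatnonzero(mask[y])
        if xs.size:
            centers[y] = xs.mean()  # row-wise geometric center
    valid_rows = np.flatnonzero(~np.isnan(centers))
    if valid_rows.size < 4:
        return None  # not enough points to fit a spline
    # fill gaps by linear interpolation between valid rows (both directions)
    filled = np.interp(np.arange(h), valid_rows, centers[valid_rows])
    # sliding-window (moving-average) smoothing with edge padding
    pad = window // 2
    padded = np.pad(filled, pad, mode="edge")
    kernel = np.ones(window) / window
    smoothed = np.convolve(padded, kernel, mode="valid")
    # spline through the smoothed center points: x = spline(row)
    return CubicSpline(np.arange(h), smoothed)
```

The returned spline maps an image row index to the navigation-line column, so the line can be sampled at any vertical resolution when overlaying it on the image or feeding it to a path-tracking controller.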
