AHG-YOLO: multi-category detection for occluded pear fruits in complex orchard scenes


Abstract

INTRODUCTION: To achieve fast detection of pear fruits in natural orchards and to optimize path planning for harvesting robots, this study proposes AHG-YOLO, a model for multi-category detection of occluded pear fruits in complex orchard environments.

METHODS: Using the Red Delicious pear as the research object, fruits are classified into three categories by occlusion status: non-occluded fruits (NO), fruits occluded by branches or leaves (OBL), and fruits in close contact with other fruits but not obstructed by branches or leaves (FCC). YOLOv11n serves as the base model for a lightweight design. First, the sampling method in the backbone and neck networks is replaced with ADown downsampling to capture higher-level image features while reducing floating-point operations and computational complexity. Next, shared weight parameters and group convolution are introduced in the head network to obtain a lightweight detection head. Finally, the bounding-box loss function is replaced with Generalized Intersection over Union (GIoU), which accelerates convergence and further improves detection performance.

RESULTS: AHG-YOLO achieves an AP of 93.5% (FCC), 95.3% (NO), and 93.4% (OBL), with an mAP@0.5 of 94.1% across all categories. Compared with the base YOLOv11n network, precision, recall, mAP@0.5, and mAP@0.5:0.95 improve by 2.5%, 3.6%, 2.3%, and 2.6%, respectively. The model size is only 5.1 MB, with a 16.9% reduction in parameter count.

DISCUSSION: The improved model is better suited to deployment on embedded pear-harvesting devices and provides technical support for the path planning of fruit-picking robotic arms.
