An Object Feature-Based Recognition and Localization Method for Wolfberry


Abstract

To improve the object recognition and localization capabilities of wolfberry harvesting robots, this study introduces an object feature-based image segmentation algorithm for segmenting and localizing wolfberry fruits and branches under unstructured lighting. First, a feature fusion algorithm based on the a-channel of the Lab color space and the I-channel of the YIQ color space, combined with the wavelet transform, is proposed to achieve pixel-level fusion of the two feature images, significantly improving segmentation quality. Experimental results show that the method achieved 78% segmentation accuracy for wolfberry fruits on 500 test images under complex lighting and occlusion conditions, demonstrating good robustness. Second, to address the similarity between branch color and the background, a K-means clustering segmentation algorithm in the Lab color space is proposed, combined with morphological processing and a length-filtering strategy, to achieve precise segmentation of branches and localization of gripping-point coordinates. Experiments confirmed the high accuracy of the improved algorithm in branch localization. The results indicate that the proposed algorithm effectively handles illumination changes and occlusion in complex harvesting environments. Compared with traditional segmentation methods, it significantly improves the segmentation accuracy of wolfberry fruits and the localization accuracy of branches, providing technical support for the vision systems of field-based wolfberry harvesting robots and offering a theoretical basis and practical reference for research on automated agricultural harvesting.
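
The abstract gives only the outline of the fusion step, not its implementation. The following is a minimal sketch, assuming OpenCV and PyWavelets, a db2 wavelet at two decomposition levels, averaged approximation coefficients, and a max-magnitude rule for detail coefficients; none of these choices are specified in the abstract, and the YIQ I-channel is computed from the standard NTSC coefficients since OpenCV has no direct BGR-to-YIQ conversion.

```python
import cv2
import numpy as np
import pywt  # PyWavelets


def wavelet_fuse(chan_a, chan_b, wavelet="db2", level=2):
    """Pixel-level fusion of two feature images via 2-D DWT.

    Approximation coefficients are averaged; each detail coefficient
    keeps whichever source has the larger magnitude (a common fusion
    rule; the paper's exact rule is not given in the abstract).
    """
    ca = pywt.wavedec2(chan_a.astype(np.float32), wavelet, level=level)
    cb = pywt.wavedec2(chan_b.astype(np.float32), wavelet, level=level)

    fused = [(ca[0] + cb[0]) / 2.0]  # average the approximation band
    for (ha, va, da), (hb, vb, db) in zip(ca[1:], cb[1:]):
        fused.append(tuple(
            np.where(np.abs(x) >= np.abs(y), x, y)
            for x, y in ((ha, hb), (va, vb), (da, db))))

    out = pywt.waverec2(fused, wavelet)
    out = out[:chan_a.shape[0], :chan_a.shape[1]]  # trim DWT padding
    return cv2.normalize(out, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)


img = cv2.imread("wolfberry.jpg")  # hypothetical input image
lab_a = cv2.cvtColor(img, cv2.COLOR_BGR2LAB)[:, :, 1]  # Lab a-channel

b, g, r = cv2.split(img.astype(np.float32))
yiq_i = 0.596 * r - 0.274 * g - 0.322 * b  # NTSC YIQ I-channel

fused = wavelet_fuse(lab_a, yiq_i)
_, fruit_mask = cv2.threshold(fused, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
```

Otsu thresholding of the fused feature image is one plausible way to obtain the fruit mask; the paper's actual binarization step may differ.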
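For the branch pipeline, the sketch below again fills in unstated details: k = 3 clusters, a 5×5 elliptical structuring element, an 80-pixel length threshold, and selection of the branch cluster by its mean a-channel value are all illustrative assumptions, not values from the paper.

```python
import cv2
import numpy as np


def segment_branches(img_bgr, k=3, min_len=80):
    """Sketch of the described branch pipeline: K-means clustering on
    Lab pixels, morphological cleanup, then length filtering of
    connected components, whose centroids serve as candidate
    gripping points."""
    lab = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2LAB)
    pixels = lab.reshape(-1, 3).astype(np.float32)

    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
    _, labels, centers = cv2.kmeans(pixels, k, None, criteria, 5,
                                    cv2.KMEANS_PP_CENTERS)

    # Assume the branch cluster has the lowest mean a-channel value
    # (greenish-brown); in practice this choice would be made empirically.
    branch_cluster = int(np.argmin(centers[:, 1]))
    mask = (labels.reshape(lab.shape[:2]) == branch_cluster)
    mask = mask.astype(np.uint8) * 255

    # Morphological opening/closing to remove speckle and bridge gaps.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)

    # Length filtering: keep only elongated components and take each
    # surviving component's centroid as a candidate gripping point.
    n, _, stats, centroids = cv2.connectedComponentsWithStats(mask)
    grip_points = []
    for i in range(1, n):  # label 0 is the background
        w = stats[i, cv2.CC_STAT_WIDTH]
        h = stats[i, cv2.CC_STAT_HEIGHT]
        if max(w, h) >= min_len:
            grip_points.append(tuple(centroids[i]))
    return mask, grip_points
```

A bounding-box extent is used here as a cheap proxy for branch length; a skeleton-based length measure would be a closer match to the "length filtering" the abstract describes but is omitted for brevity.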
