Abstract
INTRODUCTION: To address the insufficient accuracy of autonomous steering in soybean headland areas, this study proposes a dynamic navigation-line visualization method that fuses deep learning with feature detection, enhancing the path-planning capability of autopilot systems during the soybean V3-V8 growth stages.

METHODS: First, an improved lightweight YOLO-PFL model was used for efficient headland detection (precision, 94.100%; recall, 92.700%; mAP@0.5, 95.600%); with 1.974 M parameters and 4.816 GFLOPs, it meets the embedded-deployment requirements of agricultural machinery. A 3D positioning model was built using binocular stereo vision; distance error was kept within 2.000%, 4.000%, and 6.000% over the ranges 0.000-3.000 m, 3.000-7.000 m, and 7.000-10.000 m, respectively. Second, interference-resistant crop-row centerlines (average orientation-angle error, -0.473°, indicating a small systematic leftward bias; mean absolute error, 3.309°) were obtained by enhancing contours through HSV color-space conversion and morphological operations, then fitting feature points extracted from the ROIs and the crop-row intersection area with the least-squares method. This approach resolved centerline offsets caused by straw, weeds, illumination changes, and holes or sticking (adhesion) regions in the segmentation. Finally, the 3D positioning and orientation parameters were fused to generate circular-arc paths in the world coordinate system, which were dynamically projected onto the image plane to visualize the navigation lines.

RESULTS AND DISCUSSION: Experiments demonstrated that the method generates real-time steering paths with acceptable errors, providing a navigation reference for automatic wheeled machinery in soybean fields and technical support for the advancement of intelligent precision-agriculture equipment.
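The binocular ranging step rests on the standard rectified-stereo relation Z = fB/d (depth from disparity). A minimal sketch follows; the focal length, baseline, and disparity values are illustrative assumptions, not parameters from the paper:

```python
import numpy as np

def stereo_depth(disparity_px, focal_px, baseline_m):
    """Depth from a rectified binocular pair: Z = f * B / d.

    disparity_px : horizontal pixel offset between matched left/right points
    focal_px     : focal length in pixels (from camera calibration)
    baseline_m   : distance between the two camera centres in metres
    """
    disparity_px = np.asarray(disparity_px, dtype=float)
    return focal_px * baseline_m / disparity_px

# Illustrative calibration: f = 700 px, B = 0.12 m (hypothetical values).
depths = stereo_depth([28.0, 14.0, 8.4], focal_px=700.0, baseline_m=0.12)
# Yields depths of 3 m, 6 m, and 10 m, spanning the three error bands
# (0-3 m, 3-7 m, 7-10 m) evaluated in the abstract.
```

Because depth is inversely proportional to disparity, a fixed matching error of one pixel produces a larger relative depth error at long range, which is consistent with the error bounds growing from 2% to 6% across the three distance bands.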
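The centerline extraction described in METHODS ends with a straight least-squares fit through the feature points taken from the ROIs. A minimal numpy sketch of that fitting step; parameterising the line as x = f(y) is an assumption made here so that near-vertical crop rows stay numerically well conditioned:

```python
import numpy as np

def fit_centerline(points):
    """Least-squares line through crop-row feature points.

    points : (N, 2) array of (x, y) pixel coordinates.
    Returns (slope, intercept) of the line x = slope * y + intercept.
    """
    pts = np.asarray(points, dtype=float)
    slope, intercept = np.polyfit(pts[:, 1], pts[:, 0], 1)
    return slope, intercept

def orientation_deg(slope):
    """Row orientation relative to the vertical image axis, in degrees;
    the sign convention (negative = leftward lean) is an assumption."""
    return float(np.degrees(np.arctan(slope)))

# Synthetic feature points along a nearly vertical row (hypothetical data).
ys = np.linspace(0.0, 400.0, 9)
xs = 0.05 * ys + 120.0
slope, intercept = fit_centerline(np.column_stack([xs, ys]))
```

A robust variant could down-weight outlier points from straw or weeds before fitting; the abstract's reported orientation errors (mean -0.473°, MAE 3.309°) are statistics over such fitted lines against ground truth.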
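The final step, generating a circular-arc path in the world coordinate system and projecting it onto the image plane, can be sketched with a pinhole camera model. This assumes the world frame is aligned with the camera frame and uses illustrative intrinsics; the paper's actual extrinsic calibration is not reproduced here:

```python
import numpy as np

def arc_points(radius_m, sweep_rad, n=16):
    """Sample a circular steering arc on the ground plane, starting at the
    origin, heading along +Z, and turning toward +X."""
    t = np.linspace(0.0, sweep_rad, n)
    x = radius_m * (1.0 - np.cos(t))          # lateral offset
    z = radius_m * np.sin(t)                  # forward distance
    return np.column_stack([x, np.zeros(n), z])  # (X, Y, Z) in camera frame

def project(points_xyz, fx, fy, cx, cy):
    """Pinhole projection onto the image plane:
    u = fx * X / Z + cx,  v = fy * Y / Z + cy."""
    X, Y, Z = points_xyz[:, 0], points_xyz[:, 1], points_xyz[:, 2]
    return np.column_stack([fx * X / Z + cx, fy * Y / Z + cy])

pts = arc_points(radius_m=4.0, sweep_rad=np.pi / 3, n=16)
# Skip the first sample (Z = 0 lies at the camera centre and cannot project).
uv = project(pts[1:], fx=700.0, fy=700.0, cx=640.0, cy=360.0)
```

Re-running this projection each frame as the 3D positioning and orientation parameters update is what makes the visualized navigation line dynamic.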