Abstract
With the rapid development of precision agriculture, seed germination detection has become crucial for crop monitoring and variety selection. Existing fully supervised detection methods rely on large-scale annotated datasets, which are costly and time-intensive to obtain in agricultural scenarios. To tackle this issue, we introduce a knowledge distillation-enhanced semi-supervised germination detection framework (KD-SSGD) that requires no pre-trained teacher and supports end-to-end training. Built on a teacher-student architecture, KD-SSGD introduces a lightweight distilled student branch and three key modules: Weighted Boxes Fusion (WBF) to optimize pseudo-label localization, Feature Distillation Loss (FDL) for deep semantic knowledge transfer, and Branch-Adaptive Weighting (BAW) to stabilize multi-branch training. On the Maize-Germ (MG) open-access dataset, KD-SSGD achieves 47.0% mAP with only 1% labeled data, outperforming Faster R-CNN (35.6%), Mean Teacher (41.9%), Soft Teacher (45.1%), and Dense Teacher (45.0%), and reaches 59.3%, 62.8%, and 65.1% mAP at 2%, 5%, and 10% labeled ratios, respectively. On the Three Grain Crop (TGC) open-access dataset, KD-SSGD achieves 73.3%, 75.3%, 75.6%, and 76.1% mAP at 1%, 2%, 5%, and 10% labeled ratios, surpassing mainstream semi-supervised methods and demonstrating robust cross-crop generalization. These results indicate that KD-SSGD can generate high-quality pseudo-labels, effectively transfer deep knowledge, and achieve stable, high-precision detection under limited supervision, providing an efficient and scalable solution for intelligent agricultural perception.