KD-SSGD: knowledge distillation-enhanced semi-supervised germination detection


Abstract

With the rapid development of precision agriculture, seed germination detection is crucial for crop monitoring and variety selection. Existing fully supervised detection methods rely on large-scale annotated datasets, which are costly and time-intensive to obtain in agricultural scenarios. To tackle this issue, we introduce a knowledge distillation-enhanced semi-supervised germination detection framework (KD-SSGD) that requires no pre-trained teacher and supports end-to-end training. Built on a teacher-student architecture, KD-SSGD introduces a lightweight distilled student branch and three key modules: Weighted Boxes Fusion (WBF) to optimize pseudo-label localization, Feature Distillation Loss (FDL) for deep semantic knowledge transfer, and Branch-Adaptive Weighting (BAW) to stabilize multi-branch training. On the Maize-Germ (MG) open-access dataset, KD-SSGD achieves 47.0% mAP with only 1% labeled data, outperforming Faster R-CNN (35.6%), Mean Teacher (41.9%), Soft Teacher (45.1%), and Dense Teacher (45.0%), and reaches 59.3%, 62.8%, and 65.1% mAP at 2%, 5%, and 10% labeled ratios. On the Three Grain Crop (TGC) open-access dataset, KD-SSGD achieves 73.3%, 75.3%, 75.6%, and 76.1% mAP at 1%, 2%, 5%, and 10% labels, surpassing mainstream semi-supervised methods and demonstrating robust cross-crop generalization. The results indicate that KD-SSGD generates high-quality pseudo-labels, effectively transfers deep knowledge, and achieves stable high-precision detection under limited supervision, providing an efficient and scalable solution for intelligent agricultural perception.
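To make the two detection-specific modules named above concrete, the core ideas behind pseudo-label box fusion (WBF) and feature distillation (FDL) can be sketched in plain Python. This is a minimal illustrative sketch, not the paper's implementation: the function names, the IoU threshold of 0.55, boxes as axis-aligned `(x1, y1, x2, y2)` tuples, and the plain mean-squared-error form of the distillation loss are all assumptions made here for clarity.

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def weighted_boxes_fusion(boxes, scores, iou_thr=0.55):
    """Fuse overlapping predictions into confidence-weighted average boxes.

    Returns a list of (fused_box, mean_score) pairs, one per cluster;
    a teacher's noisy proposals collapse into tighter pseudo-labels.
    """
    order = sorted(range(len(boxes)), key=lambda i: -scores[i])
    clusters = []  # each cluster: {"rep": fused box, "members": [(box, score), ...]}
    for i in order:
        b, s = boxes[i], scores[i]
        for c in clusters:
            if iou(c["rep"], b) > iou_thr:
                c["members"].append((b, s))
                # Re-fuse: coordinate-wise average weighted by confidence.
                tot = sum(w for _, w in c["members"])
                c["rep"] = [sum(bb[k] * w for bb, w in c["members"]) / tot
                            for k in range(4)]
                break
        else:
            clusters.append({"rep": list(b), "members": [(b, s)]})
    return [(c["rep"], sum(w for _, w in c["members"]) / len(c["members"]))
            for c in clusters]

def feature_distillation_loss(student_feats, teacher_feats):
    """Mean squared error between paired (flattened) feature maps,
    pulling the lightweight student's representations toward the teacher's."""
    total, n = 0.0, 0
    for s_map, t_map in zip(student_feats, teacher_feats):
        for s, t in zip(s_map, t_map):
            total += (s - t) ** 2
            n += 1
    return total / n
```

For example, fusing two heavily overlapping boxes with scores 0.9 and 0.8 yields a single box whose coordinates lean toward the higher-confidence prediction, while a distant third box survives as its own pseudo-label. In practice these operations would run on detector tensors rather than Python lists; the sketch only captures the logic.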
