Machine learning approach for wheat variety identification using single-seed imaging


Abstract

Accurate identification of wheat varieties is essential for seed certification and precision agriculture, yet traditional visual inspection is subjective, labor-intensive, and often unreliable due to the morphological similarity among cultivars. This study presents a comprehensive comparative framework for automated wheat varietal classification using both handcrafted and deep-learning-based feature extraction methods. A controlled imaging system was used to capture seed images from six Iranian wheat cultivars. Handcrafted morphological, color, and texture descriptors were extracted and reduced using principal component analysis (PCA) prior to classification with a multi-layer perceptron (MLP). In parallel, convolutional neural networks (CNNs) were trained to learn deep features directly from raw images, and two classifier-head strategies, global average pooling (GAP) and fully connected layers (FCL), were systematically compared. Hyperparameters were optimized through structured experimentation, and model stability was assessed using repeated training runs, one-way ANOVA, and 95% confidence intervals. Results show that the CNN-GAP model achieved the highest accuracy (92.19%) and demonstrated superior generalization stability compared with EfficientNet-B4 and Inception-ResNet-v2 models. PCA-based dimensionality reduction enhanced MLP performance, yielding 86.0% accuracy. Cross-domain testing on chickpea seeds highlighted sensitivity to domain shifts and emphasized the need for species-specific training data. Practical considerations revealed that the lightweight CNN-GAP architecture, with an average inference time of 13.6 ms per image, is suitable for real-time deployment on low-cost agricultural hardware.
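The handcrafted-feature branch described above (descriptors → PCA → MLP) can be sketched with scikit-learn. This is a minimal illustration, not the study's implementation: the feature dimensions, number of principal components, hidden-layer size, and the synthetic stand-in data are all assumptions chosen for demonstration.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for handcrafted seed descriptors:
# 600 seeds x 50 morphological/color/texture features, 6 cultivars.
# (Dimensions are illustrative, not taken from the paper.)
rng = np.random.default_rng(0)
X = rng.normal(size=(600, 50))
y = rng.integers(0, 6, size=600)
# Inject a class-dependent shift so there is signal to learn.
X[:, :5] += y[:, None]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Standardize, reduce with PCA, then classify with an MLP,
# mirroring the PCA-before-MLP pipeline the abstract describes.
model = make_pipeline(
    StandardScaler(),
    PCA(n_components=10),
    MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0),
)
model.fit(X_tr, y_tr)
acc = model.score(X_te, y_te)
print(f"held-out accuracy: {acc:.2f}")
```

The design choice PCA addresses is the same one motivating GAP in the CNN branch: shrinking the representation fed to the classifier head reduces parameters and the risk of overfitting on a modest seed-image dataset.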
