A machine vision dataset for automated quality inspection and grading of sweetpotatoes


Abstract

Sweetpotato grading, particularly for defects, is a labor-intensive operation often constrained by inconsistencies in manual inspection. To facilitate the development of automated quality grading systems, this article presents a unified machine vision dataset specifically for multi-view quality inspection and grading of sweetpotatoes. The dataset consolidates data from two independent experimental campaigns to enhance diversity in sample characteristics and imaging conditions. It comprises 390 samples divided into two subsets: Subset A consists of 123 fresh-market sweetpotatoes sourced from grocery stores and imaged under ambient indoor lighting at a resolution of 1920 × 1080 pixels, and Subset B includes 267 sweetpotatoes of two varieties harvested from a research station and imaged in an enclosed chamber under controlled illumination at a resolution of 1280 × 720 pixels. In both subsets, samples were rotated on a custom-designed roller conveyor for multi-view imaging, and RGB (red-green-blue) frames were extracted from recorded video streams and labeled for individual sweetpotato instances. The curated dataset contains a total of 1400 images (Subset A: 232; Subset B: 1168) with 3700 annotated instances, along with 39 raw video recordings and physical measurements. Each instance is labeled with a polygon segmentation mask and assigned a quality grade (Grade 1, Grade 2, or Grade 3) based on surface defect severity. This dataset represents the first publicly available machine vision dataset dedicated to automated sweetpotato grading. It provides a diverse and valuable resource for training and evaluating computer vision algorithms and models for instance segmentation and surface-quality assessment of sweetpotatoes and beyond.
