Study on starch content detection and visualization of potato based on hyperspectral imaging



Abstract

Starch is an important quality index of potato, contributing greatly to its taste and nutritional quality. At present, starch determination depends on chemical analysis, which is time consuming and laborious. Rapid and accurate detection of the starch content of potatoes is therefore important. This study combined hyperspectral imaging with chemometrics to predict potato starch content. Two potato varieties, Kexin No. 1 and Holland No. 15, were used as experimental samples. Hyperspectral data were collected from three sampling sites (the top, umbilicus, and middle regions). Standard normal variate (SNV) was used for spectral preprocessing, and three methods, competitive adaptive reweighted sampling (CARS), iterative variable subset optimization (IVSO), and the variable iterative space shrinkage approach (VISSA), were used for characteristic wavelength selection. Linear partial least-squares regression (PLSR) and nonlinear support vector regression (SVR) models were then established. The results indicated that the sampling site has a considerable impact on the accuracy of the prediction model; the CARS-SVR model built on the umbilicus region gave the best performance, with a correlation coefficient in calibration (Rc) of 0.9415, a correlation coefficient in prediction (Rp) of 0.9346, a root mean square error in calibration (RMSEC) of 15.9 g/kg, a root mean square error in prediction (RMSEP) of 17.4 g/kg, and a residual predictive deviation (RPD) of 2.69. The starch content in potatoes was visualized using the best model in combination with pseudo-color technology. Our research provides a method for the rapid and nondestructive determination of starch content in potatoes, laying a good foundation for potato quality monitoring and grading.
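The SNV preprocessing and RPD metric mentioned in the abstract are standard chemometric computations. The sketch below (a minimal illustration in Python with NumPy, not the authors' implementation) shows how each spectrum is centered and scaled by its own mean and standard deviation, and how RPD is computed as the standard deviation of the reference values divided by the RMSEP:

```python
import numpy as np

def snv(spectra):
    """Standard normal variate: center and scale each spectrum (row)
    by its own mean and standard deviation to correct scatter effects."""
    spectra = np.asarray(spectra, dtype=float)
    mean = spectra.mean(axis=1, keepdims=True)
    std = spectra.std(axis=1, keepdims=True)
    return (spectra - mean) / std

def rpd(y_true, y_pred):
    """Residual predictive deviation: SD of the reference values
    divided by the root mean square error of prediction (RMSEP)."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    rmsep = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return y_true.std(ddof=1) / rmsep

# Toy example: three synthetic "spectra" with different offsets and scales
X = np.array([[1.0, 2.0, 3.0],
              [10.0, 20.0, 30.0],
              [2.0, 4.0, 6.0]])
Xs = snv(X)
# After SNV, every row has mean ~0 and standard deviation ~1
print(np.allclose(Xs.mean(axis=1), 0.0))  # True
print(np.allclose(Xs.std(axis=1), 1.0))   # True
```

By convention, an RPD above about 2 (as with the 2.69 reported here) is taken to indicate a model suitable for quantitative prediction.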
