A novel CNN gap layer for growth prediction of palm tree plantlings


Abstract

Monitoring palm tree seedlings and plantlings is a formidable challenge because of their small size and the absence of distinguishing morphological characteristics. Restoration specialists need technical approaches that provide palm tree seedling monitoring systems that are high-resolution, fast, and environmentally friendly. Counting plantlings and identifying them down to the genus level can be an extremely time-consuming and challenging task. Convolutional neural networks (CNNs) have proven effective in many aspects of image recognition, but their performance varies with the application, and the performance of existing CNN-based models for monitoring and predicting plantling growth can be improved further. To this end, a novel Gap Layer modified CNN architecture (GL-CNN) is proposed, combined with an effective IoT monitoring system and UAV technology. The UAV captures plantling images, while the IoT model provides ground-truth information on plantling health. The proposed model is trained to predict successful and poor seedling growth for a given set of palm tree plantling images. The GL-CNN architecture is novel in its defined convolution layers and in the gap layer designed for output classification: the input image is processed by two 64×3 conv layers, two 128×3 conv layers, two 256×3 conv layers, and one 512×3 conv layer, and the output of the gap layer is passed through a ReLU-based classifier to determine the seedling class. To evaluate the proposed system, a new dataset of palm tree plantling images was collected in real time using UAV technology. The evaluation results show that the proposed GL-CNN model outperforms existing CNN architectures, with an average accuracy of 95.96%.
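Assuming the "gap layer" denotes global average pooling (GAP) — the standard operation that collapses each feature map from the final conv layer into a single per-channel value before classification — the core idea can be sketched in plain Python. The feature-map values, channel counts, and function names below are illustrative assumptions, not taken from the paper:

```python
def global_average_pool(feature_maps):
    """Collapse each H x W channel into its mean, yielding one value per channel.

    This is the GAP operation commonly used in place of fully connected
    layers: a C x H x W tensor becomes a length-C vector.
    """
    return [
        sum(sum(row) for row in channel) / (len(channel) * len(channel[0]))
        for channel in feature_maps
    ]

def relu(values):
    """Rectified linear activation applied to the pooled vector."""
    return [max(0.0, v) for v in values]

# Two hypothetical 2x2 channels standing in for the last conv layer's output
# (the paper's final conv stage would produce 512 channels).
fmaps = [
    [[1.0, 3.0], [5.0, 7.0]],    # channel mean: 4.0
    [[-2.0, -4.0], [0.0, -6.0]], # channel mean: -3.0
]

pooled = global_average_pool(fmaps)  # [4.0, -3.0]
scores = relu(pooled)                # [4.0, 0.0]
```

Because GAP has no trainable parameters, it reduces overfitting relative to a dense classification head, which may be part of why the modified gap layer helps on a small, UAV-collected dataset.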
