A deep hybrid inception network model with entropy based attention for automated iron ore image characterization


Abstract

Iron ores are an important mineral resource for the industrial development of an economy, and grading of ores is an important task at different stages of ore processing. The present study focuses on the grading of iron ores and uses a reflected-light microscopy iron ore image dataset. The ore samples were collected from different mines of the Singhbhum Craton of Eastern India. The aim of the study is to develop a robust, generalized model for automating the characterization of iron ores belonging to four different grades. For this purpose, a deep learning model has been developed that implements a directed acyclic graph network architecture via a hybrid Inception topology. The network is designed to combine the feature extraction strengths of different pre-trained models. It uses MobileNetV2, InceptionV3, and Xception as base classifiers for feature extraction, followed by an attention channel that enhances the extracted features and an encoder channel that reduces the dimensionality of the enhanced feature set. This encoder channel helps produce a more generalized model. The performance of the proposed model has been compared to that of the existing state-of-the-art deep learning models MobileNetV2, InceptionV3, and Xception: it achieves a final classification accuracy of 97%, compared with 91%, the best accuracy among the individual participating base classifiers. The individual base classifiers exhibited varying performance across classes, with certain classes suffering notably high misclassification rates, which was a major concern. In contrast, the proposed model significantly reduces class-wise misclassification rates relative to the base classifiers.
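The fusion pipeline the abstract describes (features from three base extractors, an attention channel, then an encoder for dimensionality reduction) can be sketched numerically. The NumPy sketch below is illustrative only: the feature dimensions, the specific attention rule (weighting each channel by one minus its normalized Shannon entropy across the batch), and the linear encoder are assumptions standing in for the paper's actual trained layers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated pooled feature vectors from three frozen base extractors
# (stand-ins for MobileNetV2, InceptionV3, and Xception outputs).
batch = 8
feats = [rng.random((batch, d)) for d in (1280, 2048, 2048)]

# Step 1 - fusion: concatenate the feature sets along the channel axis.
fused = np.concatenate(feats, axis=1)  # shape (8, 5376)

def entropy_attention(x, eps=1e-12):
    """Weight each channel by 1 - normalized Shannon entropy of its
    activation distribution across the batch (an assumed form of
    entropy-based attention; low-entropy channels are emphasized)."""
    p = x / (x.sum(axis=0, keepdims=True) + eps)  # per-channel distribution
    h = -(p * np.log(p + eps)).sum(axis=0)        # entropy per channel
    h = h / np.log(x.shape[0])                    # normalize into [0, 1]
    return x * (1.0 - h)

# Step 2 - attention channel: enhance the more informative channels.
enhanced = entropy_attention(fused)

# Step 3 - encoder channel: project to a compact, lower-dimensional code
# (a random linear map here; the paper would use a learned encoder).
W = rng.standard_normal((fused.shape[1], 128)) * 0.01
code = enhanced @ W  # shape (8, 128)

print(fused.shape, enhanced.shape, code.shape)
```

In the actual model, the reduced code would feed a softmax classification head over the four ore grades; the dimensionality reduction is what the abstract credits with improving generalization.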
