Comparison of CNN-Based Image Classification Approaches for Implementation of Low-Cost Multispectral Arcing Detection


Abstract

Camera-based sensing has benefited in recent years from advances in machine learning data processing methods, as well as from improved data collection options such as Unmanned Aerial Vehicle (UAV)-mounted sensors. However, cost considerations, both for the initial purchase of sensors and for updates, maintenance, or potential replacement if damaged, can limit the adoption of more expensive sensing options in some applications. To evaluate more affordable alternatives built from less expensive, more widely available, and more easily replaceable hardware, we examine machine learning-based image classification on custom datasets, using deep learning-based image classifiers and ensemble models for sensor fusion. By applying the same models to each camera to reduce technical overhead, we show that, given a sufficiently representative training dataset, camera-based detection of electrical arcing can be successful. We also evaluate the custom data against multiple validation datasets chosen to represent conditions of varying expected difficulty. The results show that ensemble models over different data sources can mitigate risks from gaps in the training data, though the system is less redundant in those cases unless other precautions are taken. We found that, with good-quality custom datasets, data fusion models can be used without designs specialized to the specific cameras, allowing less specialized, more accessible equipment to serve as multispectral camera components. This approach can provide an alternative to expensive sensing equipment for applications in which lower-cost or more easily replaceable sensing hardware is desirable.
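The sensor-fusion idea described above, combining per-camera classifier outputs into a single decision, can be sketched as a simple late-fusion ensemble. The function name, class labels, and weighting scheme below are illustrative assumptions, not the paper's implementation:

```python
# Hypothetical sketch of late fusion: each camera (e.g. visible, near-infrared)
# runs the same classifier architecture and produces class probabilities;
# the ensemble averages them and picks the winning class.

def fuse_predictions(per_camera_probs, weights=None):
    """Weighted average of per-camera class-probability vectors.

    per_camera_probs: list of equal-length probability lists, one per camera.
    weights: optional per-camera weights; defaults to a uniform average.
    Returns (predicted class index, fused probability vector).
    """
    n_cameras = len(per_camera_probs)
    n_classes = len(per_camera_probs[0])
    if weights is None:
        weights = [1.0 / n_cameras] * n_cameras
    fused = [0.0] * n_classes
    for w, probs in zip(weights, per_camera_probs):
        for i, p in enumerate(probs):
            fused[i] += w * p
    return max(range(n_classes), key=lambda i: fused[i]), fused

# Example: camera A weakly favors "arcing" (class 1), camera B strongly does;
# the fused decision is class 1.
label, fused = fuse_predictions([[0.45, 0.55], [0.10, 0.90]])
```

Averaging probabilities (soft voting) rather than hard class votes lets a confident camera outweigh an uncertain one, which is one way an ensemble can cover gaps in a single camera's training data.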
