Research and optimization of a multilevel fire detection framework based on deep learning and classical pattern recognition techniques


Abstract

Fire detection technology is essential for safeguarding public safety and minimizing property damage. Despite advancements in both traditional methodologies and modern deep learning models, challenges such as suboptimal accuracy and elevated false alarm rates persist, particularly in complex environmental scenarios. This paper introduces the Fire Focused Detection Network (FFDNet), a flame detection framework that integrates classical pattern recognition approaches with deep learning strategies. By combining an enhanced Real-Time DEtection TRansformer (RT-DETR) model with a Vector Quantized Generative Adversarial Network (VQGAN), our method improves flame detection sensitivity and precision while significantly reducing the false alarm rate. Specifically, we integrate a novel loss function, the Innovative Minimum Perimeter Distance IoU (InnMPD-IoU), into the RT-DETR model, enabling the detection of a wider range of flames and flame-like phenomena. Additionally, Completed Local Binary Pattern (CLBP) features capture flame texture, and VQGAN-based sample reconstruction verifies candidate flame regions, further distinguishing true flames from flame-like distractors. Experimental results demonstrate the strong performance of the model, which achieves precision, recall, F1 score, and accuracy of 98.23%, 96.33%, 97.33%, and 95.08%, respectively, on the Dataset for Fire and Smoke Detection (DFS), substantially surpassing existing methods. Our objective is to further develop FFDNet into a robust, efficient, and widely applicable tool for flame detection, thereby providing significant technical support for fire prevention and response initiatives.
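The CLBP features mentioned above extend the classic Local Binary Pattern (LBP) operator, which thresholds each pixel's neighbourhood against its centre value. As a rough, illustrative sketch only (not the paper's implementation; the function name and neighbourhood ordering are our own choices), the basic LBP sign component underlying CLBP can be computed as follows:

```python
import numpy as np

def lbp_codes(img):
    """8-neighbour Local Binary Pattern codes for the interior pixels of a
    2-D grayscale array. CLBP extends this idea with magnitude and centre
    components; only the basic sign component is sketched here."""
    img = np.asarray(img, dtype=float)
    h, w = img.shape
    center = img[1:-1, 1:-1]
    # Neighbour offsets, clockwise from the top-left pixel.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(center, dtype=int)
    for bit, (dy, dx) in enumerate(offsets):
        neigh = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        # Set this bit when the neighbour is at least as bright as the centre.
        codes += (neigh >= center).astype(int) << bit
    return codes
```

A histogram of such codes over an image patch gives a compact texture descriptor; flame regions tend to produce distinctive distributions compared with smooth flame-like distractors such as lamps or sunsets.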
