BiLSTM-LN-SA: A Novel Integrated Model with Self-Attention for Multi-Sensor Fire Detection


Abstract

Multi-sensor fire detection technology has been widely adopted in practical applications; however, existing methods still suffer from high false alarm rates and inadequate adaptability in complex environments due to their limited capacity to capture deep time-series dependencies in sensor data. To enhance robustness and accuracy, this paper proposes a novel model named BiLSTM-LN-SA, which integrates a Bidirectional Long Short-Term Memory (BiLSTM) network with Layer Normalization (LN) and a Self-Attention (SA) mechanism. The BiLSTM module extracts intricate time-series features and long-term dependencies. The incorporation of Layer Normalization mitigates feature distribution shifts across different environments, thereby improving the model's adaptability to cross-scenario data and its generalization capability. Simultaneously, the Self-Attention mechanism dynamically recalibrates the importance of features at different time steps, adaptively enhancing fire-critical information and enabling deeper, process-aware feature fusion. Extensive evaluation on a real-world dataset demonstrates the superiority of the BiLSTM-LN-SA model, which achieves a test accuracy of 98.38%, an F1-score of 0.98, and an AUC of 0.99, significantly outperforming existing methods including EIF-LSTM, rTPNN, and MLP. Notably, the model also maintains low false positive and false negative rates of 1.50% and 1.85%, respectively. Ablation studies further elucidate the complementary roles of each component: the self-attention mechanism is pivotal for dynamic feature weighting, while layer normalization is key to stabilizing the learning process. This validated design confirms the model's strong generalization capability and practical reliability across varied environmental scenarios.
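The abstract does not include implementation details, but the two components it highlights, Layer Normalization for stabilizing feature distributions and Self-Attention for recalibrating the importance of time steps, can be illustrated with a minimal stdlib-only sketch. The BiLSTM backbone is omitted here because it requires a deep-learning framework; all function names and the toy sensor values below are hypothetical, not taken from the paper.

```python
import math

def layer_norm(x, eps=1e-5):
    """Normalize one feature vector to zero mean, unit variance (illustrative LN;
    the learnable gain/bias parameters of full LayerNorm are omitted)."""
    mean = sum(x) / len(x)
    var = sum((v - mean) ** 2 for v in x) / len(x)
    return [(v - mean) / math.sqrt(var + eps) for v in x]

def softmax(scores):
    """Numerically stable softmax over a list of attention scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def self_attention(seq):
    """Scaled dot-product self-attention with Q = K = V = seq (a simplification:
    the learned query/key/value projections of the full mechanism are omitted).
    Each output step is a weighted mix of all time steps."""
    d = len(seq[0])
    out = []
    for q in seq:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in seq]
        weights = softmax(scores)  # how much each time step contributes
        out.append([sum(w * v[j] for w, v in zip(weights, seq)) for j in range(d)])
    return out

# Toy sequence: 3 time steps of 4-dim sensor features (hypothetical values);
# the middle step mimics a fire-like spike that attention should emphasize.
seq = [[0.2, 0.1, 0.0, 0.3],
       [0.9, 0.8, 0.7, 1.0],
       [0.1, 0.2, 0.1, 0.2]]
normed = [layer_norm(x) for x in seq]
attended = self_attention(normed)
```

In the full model these operations would act on BiLSTM hidden states rather than raw sensor readings, so that the attention weights reflect learned temporal context from both directions of the sequence.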
