NRGAMTE: Neurophysiological Residual Gated Attention Multimodal Transformer Encoder for Sleep Disorder Detection


Abstract

BACKGROUND/OBJECTIVE: Sleep is essential to human mental and physical health. Sleep disorders pose a serious health risk, and a large portion of the world population suffers from them. Efficient identification of sleep disorders is therefore essential for effective treatment. However, precise, automatic detection remains challenging due to inter-subject variability, overlapping symptoms, and reliance on single-modality physiological signals. METHODS: To address these challenges, a Neurophysiological Residual Gated Attention Multimodal Transformer Encoder (NRGAMTE) model was developed for robust sleep disorder detection using multimodal physiological signals, including Electroencephalogram (EEG), Electromyogram (EMG), and Electrooculogram (EOG). Raw signals are first segmented into 30-s windows and processed to capture salient time- and frequency-domain features. Each modality is independently embedded by a One-Dimensional Convolutional Neural Network (1D-CNN), which preserves signal-specific characteristics. A Modality-wise Residual Gated Cross-Attention Fusion (MRGCAF) mechanism is introduced to select significant cross-modal interactions, while a learnable residual path ensures that the most relevant features are retained during gating. RESULTS: The developed NRGAMTE model achieved an accuracy of 94.51% on the Sleep-EDF Expanded dataset and 99.64% on the Cyclic Alternating Pattern (CAP) Sleep database, significantly outperforming existing single- and multimodal algorithms in terms of robustness and computational efficiency. CONCLUSIONS: The results show that NRGAMTE achieves high performance across multiple datasets, substantially improving detection accuracy and demonstrating its potential as a reliable tool for clinical sleep disorder detection.
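The abstract does not give the exact equations of the MRGCAF mechanism, but the idea it describes (cross-attention between modality embeddings, a sigmoid gate selecting the attended features, and a learnable residual path preserving the query modality's own features) can be sketched in NumPy. All weight names, shapes, and the scalar residual coefficient `alpha` below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def residual_gated_cross_attention(query_mod, key_mod, W_q, W_k, W_v, W_g, alpha):
    """Illustrative sketch: one modality (e.g. EEG) attends to another
    (e.g. EOG); a sigmoid gate weights the attended features, and a
    learnable residual path (alpha) keeps the query modality's features."""
    Q = query_mod @ W_q                              # (T, d) queries
    K = key_mod @ W_k                                # (T, d) keys
    V = key_mod @ W_v                                # (T, d) values
    d = Q.shape[-1]
    attn = softmax(Q @ K.T / np.sqrt(d), axis=-1)    # (T, T) cross-modal attention
    attended = attn @ V                              # (T, d) attended features
    # Gate computed from both the query features and the attended features
    gate = sigmoid(np.concatenate([query_mod, attended], axis=-1) @ W_g)
    return gate * attended + alpha * query_mod       # gated fusion + residual path

# Toy example: T = 8 time steps per 30-s window, d = 16 embedding dims
rng = np.random.default_rng(0)
T, d = 8, 16
eeg = rng.normal(size=(T, d))
eog = rng.normal(size=(T, d))
W_q, W_k, W_v = (rng.normal(size=(d, d)) * 0.1 for _ in range(3))
W_g = rng.normal(size=(2 * d, d)) * 0.1
fused = residual_gated_cross_attention(eeg, eog, W_q, W_k, W_v, W_g, alpha=1.0)
print(fused.shape)
```

In the full model, one such block would presumably run per modality pair, with the fused representations passed to the Transformer encoder for classification.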
