Attention-Enhanced U-Net for Sensor-Efficient High-Density EEG Reconstruction in Wearable Brain Monitoring Systems


Abstract

High-channel-density (HCD) electroencephalography (EEG) enables fine-grained neural sensing but is constrained by high hardware costs, spatial complexity, and limited portability. This study developed a deep learning-based method to reconstruct high-density EEG signals from low-channel-density (LCD) inputs, enabling more practical and affordable brain-monitoring systems. The proposed VEEG-A-U-Net is a lightweight U-Net architecture enhanced with attention gates and residual learning. The model combines spherical spline interpolation with a learnable correction signal to adaptively model spatial-temporal features. The framework was trained and evaluated on the SEED dataset, using normalized mean square error (NMSE), signal-to-noise ratio (SNR), and the Pearson correlation coefficient (PCC) to assess reconstruction performance. Validation was conducted through leave-one-subject-out cross-validation (LOSO-CV) and cross-dataset experiments to examine generalizability. Under the same reconstruction setting (scale factor = 2), VEEG-A-U-Net achieved reconstruction performance competitive with state-of-the-art methods while requiring substantially fewer parameters and computational operations. Cross-dataset evaluations confirmed stable performance across different EEG paradigms. Inference-time analysis showed low computational latency, indicating practical feasibility for deployment in resource-constrained and edge-computing environments. A preliminary clinical EEG evaluation was also conducted to explore feasibility in clinical settings. The proposed framework offers an effective and lightweight solution for reconstructing high-density EEG from sparse measurements. These findings may support the development of sensor-efficient and portable EEG systems for practical neuroengineering and brain–computer interface applications.

SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1007/s10916-026-02374-5.
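The three reconstruction metrics named in the abstract can be sketched as follows. This is a minimal, illustrative NumPy implementation, not the paper's code; the channel count and segment length in the toy example are assumptions for demonstration only.

```python
import numpy as np

def nmse(x_true, x_rec):
    # Normalized mean square error: reconstruction error relative to signal power (lower is better).
    return float(np.sum((x_true - x_rec) ** 2) / np.sum(x_true ** 2))

def snr_db(x_true, x_rec):
    # Signal-to-noise ratio in decibels (higher is better).
    return float(10.0 * np.log10(np.sum(x_true ** 2) / np.sum((x_true - x_rec) ** 2)))

def pcc(x_true, x_rec):
    # Pearson correlation coefficient between the flattened signals.
    xt = x_true.ravel() - x_true.mean()
    xr = x_rec.ravel() - x_rec.mean()
    return float(np.dot(xt, xr) / (np.linalg.norm(xt) * np.linalg.norm(xr)))

# Toy example with a hypothetical 62-channel, 200-sample EEG segment.
rng = np.random.default_rng(0)
ground_truth = rng.standard_normal((62, 200))
reconstruction = ground_truth + 0.1 * rng.standard_normal((62, 200))

print(f"NMSE = {nmse(ground_truth, reconstruction):.4f}")
print(f"SNR  = {snr_db(ground_truth, reconstruction):.1f} dB")
print(f"PCC  = {pcc(ground_truth, reconstruction):.4f}")
```

With additive noise at 10% of the signal's standard deviation, NMSE lands near 0.01, SNR near 20 dB, and PCC near 1, which illustrates how the three metrics jointly capture error magnitude and waveform similarity.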
