Exploring physical and functional EEG connectivity with multilayer graph transformer convolutional networks for emotion recognition


Abstract

Electroencephalogram (EEG)-based emotion recognition has emerged as a compelling direction in affective computing, driven by its ability to provide objective, neural-level insights into emotional states. However, the high-dimensional and complex spatial and functional characteristics of EEG data present substantial challenges for accurate modeling. To address this, we propose Multilayer-GTCN (Multilayer Graph Transformer Convolutional Network), which combines the strengths of Graph Convolutional Networks (GCNs) and Graph Transformer layers to effectively capture both local and global dependencies in EEG signals. The framework employs a dual-graph design over feature nodes: a physical proximity graph instantiated as a complete topology to stabilize information flow, and a functional connectivity graph whose edge weights are correlations derived from inter-feature relationships. Within this representation, GCN layers consolidate stable relational patterns, while transformer-based graph convolutions capture long-range dependencies and transient interactions across the feature space. Fusing the two encoded views yields representations that jointly capture localized structure and global context, providing a robust basis for affective decoding. Extensive experiments on benchmark datasets confirm the effectiveness of our approach, achieving 98.24 ± 1.74% on SEED, 95.82 ± 1.89% on SEED-IV, and 93.35 ± 4.08% (valence) / 94.11 ± 2.98% (arousal) on DEAP. These results highlight the efficiency and flexibility of Multilayer-GTCN across varied datasets. By merging a physical proximity graph with correlation-based functional connectivity in a multilayer architecture, this study lays a foundation for scalable affective-computing systems and provides a framework to guide future advances in neural signal analysis.
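The dual-graph design described above can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the node count, feature dimension, correlation threshold, and layer shapes below are all illustrative assumptions. It shows the core idea only: a complete physical-proximity graph processed by a symmetric-normalized GCN step, a correlation-based functional graph used to mask a self-attention step standing in for the graph-transformer branch, and a concatenation of the two views.

```python
import numpy as np

def normalize_adj(A):
    """Symmetric normalization D^{-1/2} (A + I) D^{-1/2}, standard for GCNs."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def gcn_layer(X, A_norm, W):
    """One graph-convolution step with ReLU: aggregates local neighborhoods."""
    return np.maximum(A_norm @ X @ W, 0.0)

def graph_attention_layer(X, Wq, Wk, Wv, mask):
    """Scaled dot-product self-attention over nodes, restricted to graph edges."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])
    scores = np.where(mask > 0, scores, -1e9)  # attend only along functional edges
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w = w / w.sum(axis=1, keepdims=True)
    return w @ V

rng = np.random.default_rng(0)
n_nodes, d_in, d_hid = 8, 16, 4            # hypothetical sizes
X = rng.standard_normal((n_nodes, d_in))   # stand-in EEG feature nodes

# Physical proximity graph: complete topology (every node linked to every other)
A_phys = np.ones((n_nodes, n_nodes)) - np.eye(n_nodes)

# Functional connectivity graph: absolute inter-feature correlations as weights
A_func = np.abs(np.corrcoef(X))
np.fill_diagonal(A_func, 0.0)

W_g = rng.standard_normal((d_in, d_hid)) * 0.1
Wq, Wk, Wv = (rng.standard_normal((d_in, d_hid)) * 0.1 for _ in range(3))

H_phys = gcn_layer(X, normalize_adj(A_phys), W_g)      # local, stable view
mask = (A_func > 0.2).astype(float) + np.eye(n_nodes)  # threshold is an assumption
H_func = graph_attention_layer(X, Wq, Wk, Wv, mask)    # global, transient view

H = np.concatenate([H_phys, H_func], axis=1)  # fused node representation
```

In this sketch the self-loop added to the attention mask guarantees every node attends to at least one edge; a classifier head over the fused `H` would complete the decoding pipeline.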
