Multi-View Feature Enhancement Based on Self-Attention Mechanism Graph Convolutional Network for Autism Spectrum Disorder Diagnosis


Abstract

Functional connectivity (FC) networks based on resting-state functional magnetic resonance imaging (rs-fMRI) have become an important tool for exploring and understanding the brain, and they can provide an objective basis for diagnosing neurodevelopmental conditions such as autism spectrum disorder (ASD). However, most FC networks consider only the unilateral features of nodes or edges and ignore the interaction between them, even though integrating the two can provide more comprehensive and crucial diagnostic information. To address this issue, this article proposes a new multi-view brain network feature enhancement method based on a self-attention-mechanism graph convolutional network (SA-GCN), which enhances node features through the connection relationships among nodes and then extracts deeper, more discriminative features. Specifically, we first plug a self-attention pooling operation into the graph convolutional network (GCN), allowing it to consider node features and graph topology simultaneously and thereby capture more discriminative features. In addition, the sample size is augmented with a "sliding window" strategy, which helps avoid overfitting and improves generalization. Furthermore, to fully explore the complex connection relationships among brain regions, we construct a low-order functional graph network (Lo-FGN) and a high-order functional graph network (Ho-FGN) and enhance the features of both functional graph networks (FGNs) with SA-GCN. Experimental results on benchmark datasets show that (1) SA-GCN serves as a feature enhancer and effectively extracts more discriminative features, and (2) integrating Lo-FGN and Ho-FGN achieves the best ASD classification accuracy (79.9%), revealing the complementary information between the two views.
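The abstract describes two preprocessing ideas: augmenting samples by sliding a window over each subject's rs-fMRI time series, and building low-order FC (correlations between regional time series) alongside high-order FC (correlations between connectivity profiles). The sketch below illustrates both under common conventions; the window length, stride, and the use of Pearson correlation for both orders are illustrative assumptions, not parameters taken from the paper.

```python
import numpy as np

def sliding_window_fc(ts, win_len=40, stride=10):
    """Augment one subject's rs-fMRI time series (T timepoints x N ROIs)
    into several low-order FC matrices via overlapping windows.
    win_len and stride are illustrative values, not from the paper."""
    T, N = ts.shape
    fcs = []
    for start in range(0, T - win_len + 1, stride):
        seg = ts[start:start + win_len]      # (win_len, N) window
        lo_fc = np.corrcoef(seg.T)           # (N, N) Pearson FC between ROIs
        fcs.append(lo_fc)
    return fcs

def high_order_fc(lo_fc):
    """High-order FC: correlations between the connectivity profiles
    (rows) of a low-order FC matrix, capturing ROI-pair relationships."""
    return np.corrcoef(lo_fc)

# Toy example: 120 timepoints, 16 ROIs of synthetic data.
rng = np.random.default_rng(0)
ts = rng.standard_normal((120, 16))
lo_list = sliding_window_fc(ts)              # 9 windowed low-order FCs
ho = high_order_fc(lo_list[0])               # high-order FC of first window
print(len(lo_list), lo_list[0].shape, ho.shape)
```

Each windowed FC matrix would then serve as the adjacency/feature input of one graph sample for the SA-GCN, multiplying the effective training-set size per subject.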
