Convolutional neural networks decode finger movements in motor sequence learning from MEG data


Abstract

OBJECTIVE: Non-invasive brain-computer interfaces can accurately classify the laterality of hand movements. However, distinguishing the activation patterns of individual fingers within the same hand remains challenging because their representations in the motor cortex overlap. Here, we validated a compact convolutional neural network for fast and reliable decoding of finger movements from non-invasive magnetoencephalographic (MEG) recordings. APPROACH: We recorded MEG in healthy participants performing a serial reaction time task (SRTT), with buttons pressed by the left and right index and middle fingers. We built classifiers to discriminate left- vs. right-hand movements and to distinguish among the four finger movements using a recently proposed decoding approach, the Linear Finite Impulse Response Convolutional Neural Network (LF-CNN). We also compared LF-CNN with existing deep learning architectures such as EEGNet, FBCSP-ShallowNet, and VGG19. RESULTS: Sequence learning was reflected in decreasing reaction times during SRTT performance. All approaches decoded movement laterality with accuracy above 95%, whereas decoding of individual finger movements reached 80-85%. LF-CNN stood out for (1) its low computational time and (2) its interpretability in both the spatial and spectral domains, allowing examination of neurophysiological patterns reflecting task-related motor cortex activity. SIGNIFICANCE: We demonstrated the feasibility of finger movement decoding with a tailored convolutional neural network. Our approach performed comparably to complex deep learning architectures while providing faster and more interpretable results. This algorithmic strategy holds high potential for investigating the mechanisms underlying non-invasive neurophysiological recordings in cognitive neuroscience.
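To illustrate the structure of an LF-CNN-style decoder, the sketch below implements a minimal forward pass in NumPy: a linear spatial demixing of MEG sensors into latent sources, a 1-D FIR convolution per latent source, rectification and temporal max-pooling, and a softmax readout over the four finger classes. All dimensions and weights here are hypothetical placeholders (random values, not trained parameters from the paper); the point is only to show why the model is interpretable: the spatial weights can be mapped back to sensor topographies, and the FIR kernels have a direct spectral reading.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (illustrative only, not taken from the paper)
n_channels, n_times = 204, 100   # MEG sensors x time samples per trial
n_latent, n_classes = 8, 4       # latent sources; 4 finger classes
kernel_len, pool = 7, 5          # FIR kernel length; temporal pool width

# Random stand-ins for trained parameters
W_spatial = rng.standard_normal((n_channels, n_latent)) * 0.01
W_temporal = rng.standard_normal((n_latent, kernel_len)) * 0.1
W_out = rng.standard_normal((n_classes, n_latent * (n_times // pool))) * 0.1
b_out = np.zeros(n_classes)

def lf_cnn_forward(x):
    """x: (n_channels, n_times) single-trial MEG -> class probabilities."""
    # 1) Spatial demixing: project sensor data onto latent sources
    latent = W_spatial.T @ x                              # (n_latent, n_times)
    # 2) Temporal FIR filtering: one 1-D convolution per latent source
    filtered = np.stack([np.convolve(latent[k], W_temporal[k], mode="same")
                         for k in range(n_latent)])
    # 3) Rectify, then max-pool over non-overlapping temporal windows
    act = np.maximum(filtered, 0.0)
    n_pooled = n_times // pool
    pooled = act[:, : n_pooled * pool].reshape(n_latent, n_pooled, pool).max(axis=2)
    # 4) Linear readout + softmax over the four finger classes
    logits = W_out @ pooled.ravel() + b_out
    e = np.exp(logits - logits.max())
    return e / e.sum()

x = rng.standard_normal((n_channels, n_times))  # one simulated trial
p = lf_cnn_forward(x)                            # probability per finger class
```

Because every learned operation before the readout is linear (spatial projection, FIR convolution), the columns of `W_spatial` and the frequency responses of the rows of `W_temporal` can be inspected directly, which is the interpretability property highlighted in the abstract.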
