Exploring EEG-based motor imagery decoding: a dual approach using spatial features and spectro-spatial Deep Learning model IFNet


Abstract

INTRODUCTION: In recent years, the decoding of motor imagery (MI) from electroencephalography (EEG) signals has become a focus of research for brain-machine interfaces (BMIs) and neurorehabilitation. However, EEG signals are challenging to work with because of their non-stationarity and the substantial noise commonly present in recordings, which makes it difficult to design highly effective decoding algorithms. Such algorithms are vital for controlling devices in neurorehabilitation tasks, as they engage the patient's motor cortex and contribute to recovery. METHODS: This study proposes a novel approach for decoding MI during pedalling tasks from EEG signals. A widespread strategy is feature extraction with Common Spatial Patterns (CSP) followed by a linear discriminant analysis (LDA) classifier. The first approach covered in this work investigates the efficacy of this task-discriminative pipeline based on CSP filters and an LDA classifier. The second approach explores whether a spectro-spatial Convolutional Neural Network (CNN) can further improve on the first. The proposed CNN architecture combines a preprocessing pipeline based on filter banks in the frequency domain with a convolutional network for spectro-temporal and spectro-spatial feature extraction. RESULTS AND DISCUSSION: To evaluate the two approaches and their respective advantages and disadvantages, EEG data were recorded from several able-bodied users while pedalling on a cycle ergometer and used to train motor imagery decoding models. The results show accuracy levels of up to 80% in some cases. The CNN approach achieves higher accuracy, albeit with greater variability.
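The CSP+LDA baseline described in the abstract can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: CSP filters are obtained from a generalized eigendecomposition of the two class covariance matrices, log-variance of the filtered signals serves as the feature vector, and an LDA classifier separates the classes. The 8-channel "EEG" here is synthetic toy data invented for the demo (the paper's recordings are not public); channel counts, trial counts, and the variance-boost trick are all assumptions for illustration.

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def class_cov(trials):
    """Average trace-normalized spatial covariance over trials (n_trials, n_ch, n_samp)."""
    covs = [X @ X.T / np.trace(X @ X.T) for X in trials]
    return np.mean(covs, axis=0)

def csp_filters(trials_a, trials_b, n_pairs=2):
    """Spatial filters maximizing variance for one class while minimizing it for the other."""
    Ca, Cb = class_cov(trials_a), class_cov(trials_b)
    # Generalized eigenproblem Ca w = lambda (Ca + Cb) w; eigenvalues come out ascending.
    _, W = eigh(Ca, Ca + Cb)
    # Keep filters from both ends of the spectrum -- the most discriminative ones.
    picks = np.r_[:n_pairs, -n_pairs:0]
    return W[:, picks].T                       # shape (2*n_pairs, n_ch)

def log_var_features(trials, W):
    """Log of normalized variance per CSP component: the classic MI feature."""
    Z = np.einsum('fc,tcs->tfs', W, trials)    # spatially filter every trial
    v = Z.var(axis=2)
    return np.log(v / v.sum(axis=1, keepdims=True))

# --- Toy demo on synthetic 8-channel data (NOT the paper's dataset) ---
rng = np.random.default_rng(0)
def make_trials(n, boosted_ch):
    X = rng.standard_normal((n, 8, 200))
    X[:, boosted_ch] *= 3.0                    # class-specific variance increase
    return X

train_a, train_b = make_trials(30, 0), make_trials(30, 1)
test_a, test_b = make_trials(10, 0), make_trials(10, 1)

W = csp_filters(train_a, train_b)
X_train = np.vstack([log_var_features(train_a, W), log_var_features(train_b, W)])
y_train = np.r_[np.zeros(30), np.ones(30)]
X_test = np.vstack([log_var_features(test_a, W), log_var_features(test_b, W)])
y_test = np.r_[np.zeros(10), np.ones(10)]

clf = LinearDiscriminantAnalysis().fit(X_train, y_train)
acc = clf.score(X_test, y_test)
```

On this easy synthetic task the pipeline separates the two classes almost perfectly; on real MI EEG, band-pass filtering (e.g. 8–30 Hz) before CSP and regularized covariance estimates are standard additions, and the filter-bank CNN of the second approach replaces the hand-crafted log-variance features with learned spectro-spatial ones.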
