Improved Automatic Deep Model for Automatic Detection of Movement Intention from EEG Signals


Abstract

Automated detection of movement intention is crucial for brain-computer interface (BCI) applications, as it can help patients with movement disorders regain mobility. This study introduces a novel approach for the automatic identification of movement intention through finger tapping. A database of EEG signals was compiled from left finger taps, right finger taps, and a resting condition. After the requisite pre-processing, the captured signals are fed into the proposed model, which is built on graph theory and deep convolutional networks. We introduce a novel architecture based on six deep convolutional graph layers, specifically designed to capture and extract essential features from EEG signals. The proposed model achieves an accuracy of 98% in the binary classification task of distinguishing left from right finger tapping, and an accuracy of 92% in the more complex three-class scenario that adds the resting condition. These results highlight the effectiveness of the architecture in decoding motor-related brain activity from EEG data. Moreover, relative to recent studies, the proposed model exhibits significant resilience under noisy conditions, making it suitable for online BCI applications.
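The abstract does not specify the internals of the six graph-convolutional layers, so the following is only a minimal NumPy sketch of the general idea: EEG electrodes are treated as graph nodes, an (assumed) electrode adjacency matrix is symmetrically normalized, and six graph-convolution layers of the common form ReLU(Â X W) are stacked. All sizes, the random adjacency, and the weight initialization here are illustrative placeholders, not the paper's actual design.

```python
import numpy as np

def normalize_adjacency(adj):
    """Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}."""
    a_hat = adj + np.eye(adj.shape[0])            # add self-loops
    deg = a_hat.sum(axis=1)                       # node degrees (>= 1 here)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
    return d_inv_sqrt @ a_hat @ d_inv_sqrt

def graph_conv(features, adj_norm, weights):
    """One graph-convolution layer: ReLU(A_hat @ X @ W)."""
    return np.maximum(0.0, adj_norm @ features @ weights)

# Toy setup: 8 EEG electrodes as graph nodes, 16 features per electrode.
rng = np.random.default_rng(0)
n_electrodes, n_features, n_layers = 8, 16, 6

# Hypothetical electrode connectivity (e.g., spatial proximity), made symmetric.
adj = (rng.random((n_electrodes, n_electrodes)) > 0.5).astype(float)
adj = np.maximum(adj, adj.T)
adj_norm = normalize_adjacency(adj)

x = rng.standard_normal((n_electrodes, n_features))
for _ in range(n_layers):                         # six stacked graph-conv layers
    w = rng.standard_normal((n_features, n_features)) * 0.1
    x = graph_conv(x, adj_norm, w)

print(x.shape)  # per-electrode feature map after six layers
```

In a full classifier, the resulting node features would typically be pooled across electrodes and passed to a softmax head over the two or three movement classes; those stages are omitted here.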
