[Cross-subject electroencephalogram emotion recognition based on maximum classifier discrepancy]


Abstract

Affective brain-computer interfaces (aBCIs) have important application value in the field of human-computer interaction. Electroencephalography (EEG) has attracted wide attention in emotion recognition because of its advantages in temporal resolution, reliability, and accuracy. However, the non-stationary nature of EEG signals and individual differences between subjects limit how well emotion recognition models generalize across time and across subjects. To recognize emotional states across different subjects and sessions, this paper proposes a new domain adaptation method: maximum classifier discrepancy for domain-adversarial neural networks (MCD_DA). A neural-network emotion recognition model is built in which a shared shallow feature extractor is trained adversarially against a domain classifier and a pair of emotion classifiers, so that the extractor learns domain-invariant representations while the classifiers learn task-specific decision boundaries, approximating joint distribution adaptation. Experimental results show that the method achieves an average classification accuracy of 88.33%, compared with 58.23% for a conventional generic classifier. It improves the generalization ability of emotion brain-computer interfaces in practical applications and provides a new route toward putting aBCIs into practice.
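At the core of the maximum classifier discrepancy approach is a discrepancy loss: the distance between the predicted class distributions of two task classifiers on target-domain samples, which the classifiers are trained to maximize and the feature extractor to minimize. As a minimal NumPy sketch (function names are illustrative, not from the paper; the standard MCD formulation uses the L1 distance between softmax outputs):

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def classifier_discrepancy(logits1, logits2):
    # L1 distance between the two classifiers' predicted class
    # distributions, averaged over the batch. Zero when the
    # classifiers agree; at most 2 when they fully disagree.
    p1, p2 = softmax(logits1), softmax(logits2)
    return float(np.mean(np.sum(np.abs(p1 - p2), axis=-1)))

# Example: logits from two hypothetical emotion classifiers
# on one target-domain EEG sample (3 emotion classes).
c1 = np.array([[2.0, -1.0, 0.5]])
c2 = np.array([[0.1, 1.2, -0.3]])
d = classifier_discrepancy(c1, c2)
```

In the full MCD training loop, this quantity is maximized with respect to the classifier parameters (to expose target samples near the decision boundary) and then minimized with respect to the feature extractor (to pull target features inside the source support), which is the adversarial game the abstract describes.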
