EmoTrans attention based emotion recognition using EEG signals and facial analysis with expert validation

EmoTrans is an attention-based emotion recognition system that uses EEG signals and facial analysis, validated by experts.


Abstract

Emotion recognition via EEG signals and facial analysis has become a key aspect of human-computer interaction and affective computing, enabling scientists to gain insight into human behavior. Classic emotion recognition methods usually rely on controlled stimuli, such as music and images, which limits their ecological validity and scope. This paper proposes the EmoTrans model, which uses the DEAP dataset to analyze physiological signals and facial video recordings. The dataset consists of EEG recordings from 32 participants who each watched 40 one-minute video clips, along with facial videos from 22 of them, annotated with emotional states along four variables: valence, arousal, dominance, and liking. To increase the model's validity, expert validation in the form of a survey of psychologists was conducted. The model integrates features extracted from EEG signals in the time, frequency, and wavelet domains, as well as facial video data, to provide a comprehensive understanding of emotional states. The proposed EmoTrans architecture achieves accuracies of 89.3%, 87.8%, 88.9%, and 89.1% for arousal, valence, dominance, and liking, respectively, and an overall classification accuracy of 89% for emotions such as happiness, excitement, calmness, and distress. The statistical significance of these performance improvements was confirmed using a paired t-test, which showed that EmoTrans significantly outperforms baseline models. The model was validated against machine learning and deep learning classifiers and by leave-one-subject-out cross-validation (LOSO-CV). The proposed attention-based architecture effectively prioritizes the most relevant features from EEG and facial data, pushing the boundaries of emotion classification and offering a more nuanced understanding of human emotions across different states.
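The abstract mentions leave-one-subject-out cross-validation (LOSO-CV) but gives no implementation details. A minimal sketch of how LOSO splitting works, assuming each trial row carries a subject identifier (the function and variable names here are illustrative, not from the paper):

```python
def loso_splits(subject_ids):
    """Leave-one-subject-out: for each subject, yield the held-out subject
    plus the train/test row indices. Every trial belonging to the held-out
    subject goes to the test fold; all other trials form the training fold."""
    subjects = sorted(set(subject_ids))
    for held_out in subjects:
        train = [i for i, s in enumerate(subject_ids) if s != held_out]
        test = [i for i, s in enumerate(subject_ids) if s == held_out]
        yield held_out, train, test


# Toy example: 5 trials from 3 subjects (DEAP would have 32 subjects x 40 trials).
trial_subjects = ["s01", "s01", "s02", "s02", "s03"]
for held_out, train_idx, test_idx in loso_splits(trial_subjects):
    print(held_out, train_idx, test_idx)
```

Because no trial from the test subject appears in training, LOSO-CV estimates how well a model generalizes to unseen people, which is the relevant setting for subject-independent emotion recognition.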
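The significance claim rests on a paired t-test over matched per-fold (or per-subject) accuracy scores. A standard-library sketch of the test statistic, assuming paired accuracy lists from EmoTrans and a baseline (the example numbers are invented for illustration):

```python
import math
from statistics import mean, stdev


def paired_t_statistic(scores_a, scores_b):
    """t statistic for a paired t-test: mean of the per-pair differences
    divided by the standard error of those differences. The resulting t
    is compared against the t-distribution with n-1 degrees of freedom."""
    diffs = [a - b for a, b in zip(scores_a, scores_b)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / math.sqrt(n))


# Hypothetical matched accuracies over four folds (not the paper's numbers).
emotrans = [0.90, 0.92, 0.88, 0.91]
baseline = [0.85, 0.86, 0.84, 0.87]
t = paired_t_statistic(emotrans, baseline)
```

In practice one would use `scipy.stats.ttest_rel`, which also returns the p-value; the pairing matters because the same folds are evaluated under both models, removing between-fold variance from the comparison.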
