Temporal Capsule Feature Network for Eye-Tracking Emotion Recognition


Abstract

Eye Tracking (ET) parameters, as physiological signals, are widely applied in emotion recognition and show promising performance. However, emotion recognition relying on ET parameters still faces several challenges: (1) insufficient extraction of temporal dynamic information from the ET parameters; (2) a lack of sophisticated features with strong emotional specificity, which restricts the model's robustness and individual generalization capability. To address these issues, we propose a novel Temporal Capsule Feature Network (TCFN) for ET parameter-based emotion recognition. The network incorporates a Window Feature Module to extract eye-movement temporal dynamic information and a specialized Capsule Network Module to mine complementary and collaborative relationships among features. An MLP Classification Module performs the feature-to-category conversion, and a Dual-Loss Mechanism is integrated to optimize overall performance. Experimental results demonstrate the superiority of the proposed model: the average accuracy reaches 83.27% for Arousal and 89.94% for Valence (three-class tasks) on the eSEE-d dataset, and the four-class cross-session emotion recognition accuracy is 63.85% on the SEED-IV dataset.
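The abstract does not specify implementation details, but two of the ingredients it names have well-known standard forms: sliding-window segmentation of a temporal signal (the usual first step for extracting temporal dynamics, as a Window Feature Module might) and the capsule "squash" nonlinearity from the original capsule-network paper (Sabour et al., 2017). The sketch below illustrates both in NumPy; the function names, window sizes, and the pupil-diameter example signal are illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np

def sliding_windows(signal, win_len, stride):
    """Segment a 1-D ET parameter sequence (e.g. a pupil-diameter
    trace) into overlapping windows. Hypothetical helper; the
    paper's Window Feature Module is not specified here."""
    starts = range(0, len(signal) - win_len + 1, stride)
    return np.stack([signal[s:s + win_len] for s in starts])

def squash(vectors, axis=-1, eps=1e-8):
    """Standard capsule 'squash' nonlinearity: shrinks short vectors
    toward 0 and long vectors toward unit length, so a capsule's
    length can be read as a probability of feature presence."""
    sq_norm = np.sum(vectors ** 2, axis=axis, keepdims=True)
    scale = sq_norm / (1.0 + sq_norm) / np.sqrt(sq_norm + eps)
    return scale * vectors

# Example: 100 samples of a synthetic trace, windows of 20, stride 10
trace = np.sin(np.linspace(0.0, 6.0, 100))
wins = sliding_windows(trace, win_len=20, stride=10)
print(wins.shape)  # (9, 20): nine overlapping temporal windows

caps = squash(wins)  # every squashed window vector has norm < 1
print(np.linalg.norm(caps, axis=-1).max())
```

In a full capsule network these squashed vectors would then be combined by dynamic routing, which is what lets the model weigh complementary features against each other; that step is omitted here for brevity.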
