Shared Neural Codes for Emotion Recognition in Emoji and Human Faces

Abstract

Facial expressions are critical social signals that support human communication. In digital contexts, emojis serve as a primary surrogate for nonverbal cues such as facial expressions; however, little is known about the extent to which emoji expressions are processed using neural mechanisms similar to those engaged by real human faces. To address this question, we used EEG-based multivariate pattern analysis (MVPA) to examine the neural dynamics of emotional expression processing in real faces and emoji faces. Across two experiments using identical paradigms, independent groups of participants viewed facial expressions (happy, angry, sad, neutral) in real faces (4 female and 4 male identities, n = 24) or emojis (6 platforms, n = 25) while performing a two-alternative forced-choice emotion recognition task. Time-resolved multivariate classification and spatio-temporal searchlight analyses revealed robust decoding of emotional expressions within and across experiments. Consistent effects emerged early and peaked between 145 and 160 ms over posterior-occipital and parietal regions. Notably, robust cross-classification between real and emoji faces demonstrated that face-like emoji stimuli evoke neural responses comparable to those elicited by real faces, with more sustained effects over right posterior sites. These findings suggest that the brain uses partially overlapping spatio-temporal codes for naturalistic and symbolic facial expressions, providing new insights into the neural coding of social signals and the representational overlap between natural and artificial emotional expressions.
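The time-resolved cross-classification described above (training a classifier on EEG patterns evoked by one stimulus type and testing it on the other, separately at each timepoint) can be sketched as below. This is a minimal illustration with synthetic data standing in for the epoched EEG recordings; the array shapes, classifier choice, and variable names are assumptions, not the authors' actual pipeline.

```python
# Hedged sketch of time-resolved cross-decoding, assuming epoched EEG
# arrays of shape (trials, channels, timepoints). Synthetic data stands
# in for the real-face and emoji recordings.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_trials, n_channels, n_times = 80, 32, 50
y = rng.integers(0, 4, n_trials)  # 4 emotion labels (happy/angry/sad/neutral)

# Synthetic epochs that share a label-dependent signal across stimulus types
signal = y[:, None, None] * 0.5
X_real = rng.normal(size=(n_trials, n_channels, n_times)) + signal
X_emoji = rng.normal(size=(n_trials, n_channels, n_times)) + signal

# Train on real-face trials and test on emoji trials at each timepoint;
# above-chance accuracy at a timepoint indicates a shared neural code there
acc = np.empty(n_times)
for t in range(n_times):
    clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    clf.fit(X_real[:, :, t], y)
    acc[t] = clf.score(X_emoji[:, :, t], y)

print(acc.mean())  # well above the 0.25 chance level for 4 classes
```

In practice the resulting accuracy time course is compared against chance (0.25 for four classes) with cluster-based permutation statistics, and the same train/test logic can be run over channel neighborhoods to obtain a spatio-temporal searchlight.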
