Happy mouth and fearful eyes: insights into emotional facial features from ERP


Abstract

Facial expressions enable individuals to assess and understand the emotions conveyed by others. Two crucial sources of expressive cues on the human face, the eyes and the mouth, capture attention and serve as reliable shortcuts for expression recognition. However, how the brain extracts emotional information from these diagnostic features remains unknown. We investigated this question using electroencephalography (EEG) combined with a rapid serial visual presentation task in which participants recognized facial expressions (fear, happiness, and neutrality) presented in three formats (whole face, eye region, and mouth region). Participants recognized happy expressions from the mouth region more accurately than the other expressions, affirming the role of diagnostic features in facilitating bottom-up attentional capture. The isolated eye region, which has higher visual saliency, induced the largest P1 component. Diagnostic features, such as a happy mouth and fearful eyes, elicited a larger N170 component than non-diagnostic features, such as a fearful mouth and happy eyes. Source analysis of the N170 showed that the fusiform gyrus exhibited similar response patterns to these emotional features. The P3 effectively discriminated between different emotional contents. When whole faces were visible, fearful and happy expressions were not distinguishable in the N170, whereas the P3 amplitude was larger for fearful faces than for happy faces. Our study contributes to understanding the distinct roles that facial features play in emotional perception, attention, and face processing.
