Parallel processing of distinct facial signals for the rapid evaluation of social agents


Abstract

The distributed model of primate face perception proposes that distinct facial signals, such as emotional valence and sex, are processed by separate neural mechanisms. A key prediction is that cues about a face's emotion and sex are extracted at different processing stages. To test this, we decoded time-resolved patterns of brain activity evoked by a large set of unfamiliar, naturalistic faces. Behavioral ratings were first collected to characterize the perceived emotional valence and sex of 900 faces. Electrophysiological recordings were then obtained while 40 participants passively viewed all face stimuli. The brain reliably distinguished both emotional valence and perceived sex from a brief glance, with decoding emerging in under 95 ms. Crucially, emotional valence showed earlier peak decoding than perceived sex, consistent with time-resolved representational similarity analyses. These findings indicate separable processing pipelines for changeable and stable facial signals, providing empirical support for the distributed model of face perception.
