Abstract
The distributed model of primate face perception proposes that distinct facial signals, such as emotional valence and sex, are processed by separate neural mechanisms. A key prediction is that cues about a face's emotion and sex are extracted at different processing stages. To test this, we decoded time-resolved patterns of brain activity evoked by a large set of unfamiliar, naturalistic faces. Behavioral ratings were first collected to characterize the perceived emotional valence and sex of 900 faces. Electrophysiological recordings were then obtained while 40 participants passively viewed all face stimuli. Neural responses reliably discriminated both emotional valence and perceived sex from a brief glance, with above-chance decoding emerging within 95 ms of stimulus onset. Crucially, decoding of emotional valence peaked earlier than decoding of perceived sex, a dissociation corroborated by time-resolved representational similarity analyses. These findings indicate separable processing pipelines for changeable and stable facial signals, providing empirical support for the distributed model of face perception.