Abstract
The rapid emotional evaluation of objects and events is essential in daily life. Although visual scenes reliably evoke emotions, it remains unclear whether the emotion schemas evoked by daily-life scenes depend on object processing systems or are extracted independently of them. To address this question, we collected emotion ratings for 4913 daily-life scenes from 300 participants and predicted these ratings from representations in deep neural networks and from functional magnetic resonance imaging (fMRI) activity patterns in visual cortex. AlexNet, an object-based model, outperformed EmoNet, an emotion-based model, in predicting emotion ratings for daily-life scenes, whereas EmoNet excelled for explicitly evocative scenes. Emotion information was processed hierarchically within the object recognition system, consistent with the hierarchical organization of visual cortex. Activity patterns in the lateral occipital complex (LOC), an object-selective region, reliably predicted emotion ratings and outperformed those in other visual regions. These findings suggest that the emotional evaluation of daily-life scenes is mediated by visual object processing, with additional mechanisms engaged when object content is uninformative.