Abstract
Object categorization is essential for navigating everyday life. It is ultra-rapid, can be completed by purely feedforward mechanisms, and is therefore thought to rely on robust neural representations. But how do these representations adapt when category boundaries change (e.g., buying fruit versus buying apples)? We tested this by asking participants to categorize images at different levels of abstraction while measuring their scalp electrical activity (EEG) with high temporal resolution. Participants categorized images either at the superordinate (animal/non-animal) or at the basic (bird/non-bird) level. We compared classification accuracy and representational similarity of EEG signals across birds, non-bird animals, and vehicles to determine whether neural representations are modified according to categorical requirements. We found that neural representations of birds and non-bird animals were indistinguishable in the superordinate task but became separable in the basic task from ~250 ms. In contrast, the separability of neural representations between non-bird animals and vehicles did not differ by task. These findings suggest that top-down influences modulate categorical representations as needed, but only when discrimination is difficult. We conclude that neural representations of categories are adaptively altered to suit current task requirements.