The emergence of semantic categorization in early visual processing: ERP indices of animal vs. artifact recognition


Abstract

BACKGROUND: The neuroimaging and neuropsychological literature shows functional dissociations in brain activity during the processing of stimuli belonging to different semantic categories (e.g., animals, tools, faces, places), but little information is available about the time course of object perceptual categorization. The aim of this study was to characterize the timing of processing for stimuli from different semantic domains, without using verbal or naming paradigms, in order to observe the emergence of non-linguistic conceptual knowledge in the ventral-stream visual pathway. Event-related potentials (ERPs) were recorded in 18 healthy right-handed individuals as they performed a perceptual categorization task on 672 pairs of images of animals and man-made objects (i.e., artifacts). RESULTS: Behavioral responses to animal stimuli were ~50 ms faster and more accurate than those to artifacts. At early processing stages (120-180 ms), the right occipito-temporal cortex was more activated in response to animals than to artifacts, as indexed by the posterior N1 response, while the frontal/central N1 (130-160 ms) showed the opposite pattern. In the next processing stage (200-260 ms), the response at anterior temporal sites was stronger to artifacts and usable items. The P300 component was smaller, and the centro-parietal N400 component larger, in response to artifacts than to animals. CONCLUSION: The effect of animal vs. artifact categorization emerged at ~150 ms over the right occipito-temporal area as a stronger response of the ventral stream to animate, homomorphic entities with faces and legs. The larger frontal/central N1 and the subsequent anterior temporal activation for inanimate objects might reflect the prevalence of a functional, rather than perceptual, representation of manipulable tools compared to animals. Late ERP effects might reflect semantic integration and cognitive-updating processes. Overall, the data are compatible with a modality-specific semantic memory account, in which sensory and action-related semantic features are represented in modality-specific brain areas.
