BMC Neuroscience (Apr 2007)
The emergence of semantic categorization in early visual processing: ERP indices of animal vs. artifact recognition
Abstract
Background
Neuroimaging and neuropsychological studies show functional dissociations in brain activity during the processing of stimuli belonging to different semantic categories (e.g., animals, tools, faces, places), but little information is available about the time course of object perceptual categorization. The aim of this study was to characterize the timing of processing stimuli from different semantic domains, without using verbal or naming paradigms, in order to observe the emergence of non-linguistic conceptual knowledge in the ventral-stream visual pathway. Event-related potentials (ERPs) were recorded in 18 healthy right-handed individuals as they performed a perceptual categorization task on 672 pairs of images of animals and man-made objects (i.e., artifacts).
Results
Behavioral responses to animal stimuli were ~50 ms faster and more accurate than those to artifacts. At early processing stages (120–180 ms) the right occipital-temporal cortex was more activated in response to animals than to artifacts, as indexed by the posterior N1 response, whereas the frontal/central N1 (130–160 ms) showed the opposite pattern. At the next processing stage (200–260 ms) the response at anterior temporal sites was stronger to artifacts and usable items. The P300 component was smaller, and the central/parietal N400 component larger, in response to artifacts than to animals.
Conclusion
The effect of animal versus artifact categorization emerged at ~150 ms over the right occipital-temporal area as a stronger ventral-stream response to animate, homomorphic entities with faces and legs. The larger frontal/central N1 and the subsequent anterior temporal activation for inanimate objects may reflect the prevalence of a functional rather than perceptual representation of manipulable tools compared with animals. Late ERP effects may reflect semantic integration and cognitive updating processes. Overall, the data are compatible with a modality-specific semantic memory account, in which sensory and action-related semantic features are represented in modality-specific brain areas.
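As an illustrative aside, component effects of the kind reported above (e.g., the posterior N1 in the 120–180 ms window) are typically quantified as mean amplitudes of condition-averaged epochs within a fixed time window over a set of electrodes. The sketch below shows one way such a measure could be computed with the MNE-Python library; the event codes, electrode labels, and the posterior_n1_amplitudes function are assumptions for illustration only, not the authors' actual analysis pipeline.

    import mne

    def posterior_n1_amplitudes(raw: mne.io.BaseRaw) -> dict:
        """Mean posterior-N1 amplitude (120-180 ms) per condition.

        `raw` is a preloaded EEG recording whose stimulus triggers use the
        illustrative event codes 1 (animal) and 2 (artifact); these codes
        and the electrode labels below are assumptions, not the study's.
        """
        events = mne.find_events(raw)
        epochs = mne.Epochs(
            raw, events,
            event_id={"animal": 1, "artifact": 2},
            tmin=-0.1, tmax=0.6,        # 100 ms baseline, 600 ms post-stimulus
            baseline=(None, 0),
            preload=True,
        )
        # Right occipito-temporal electrodes (assumed 10-20 montage labels).
        channels = ["O2", "PO8", "P8"]
        amplitudes = {}
        for condition in ("animal", "artifact"):
            evoked = epochs[condition].average()
            data = evoked.copy().pick(channels).crop(0.120, 0.180).data
            amplitudes[condition] = data.mean() * 1e6  # volts -> microvolts
        return amplitudes

In a typical ERP workflow, such per-subject window amplitudes would then be compared across conditions at the group level (e.g., with a repeated-measures ANOVA).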