Frontiers in Psychology (Apr 2012)
Gesture’s neural language
Abstract
When people talk to each other, they often make arm and hand movements that accompany what they say. These manual movements, called co-speech gestures, can convey meaning through their interaction with the spoken message. Another class of manual gestures, called emblematic gestures or emblems, also conveys meaning, but in contrast to co-speech gestures, emblems can do so directly and independently of speech. There is currently significant interest in the behavioral and biological relationships between action and language. Since co-speech gestures are actions that depend on spoken language, while emblems convey meaning so effectively that they can sometimes substitute for speech, these represent important and potentially very informative examples of language-motor interactions. Researchers have recently been examining how the brain processes these actions, and the findings do not yet paint an unambiguous picture. For the most part, however, it seems that two complementary sets of brain areas respond when people see gestures, reflecting their role in disambiguating meaning. These include areas thought to be important for understanding actions and areas ordinarily involved in processing language. What is just beginning to emerge is the pattern of shared and distinct responses across these two networks during communication. In this review, we discuss how the brain responds when we see gestures, how these responses relate to brain activity when we process language, and how the two might interact in normal, everyday communication.
Keywords