Language goes beyond sight, sound in brain

Now hear this: Two brain areas long considered crucial for perceiving and speaking words may deal in more than speech. These patches of neural tissue spring into action in deaf people who are using sign language or watching others do so, a new brain-scan study finds.

These brain regions, one near the front of the brain and the other toward the back in the so-called auditory cortex, handle fundamental features of language that can be expressed either through speech or signing, concludes a team of neuroscientists led by Laura-Ann Petitto of McGill University in Montreal.

The new findings underscore the need to explore the brain’s flexibility in facilitating language, the researchers assert in the Dec. 5 Proceedings of the National Academy of Sciences (PNAS).

“What is certain is that the human brain can entertain multiple pathways for language expression and reception, and that the cerebral specialization for language functions is not exclusive to the mechanisms for producing and perceiving speech and sound,” Petitto and her coworkers say.

Earlier studies indicated that considerable overlap exists between brain regions involved in spoken and signed language. For example, strokes damaging the left brain undermine both types of communication. Researchers have also used brain-imaging devices to link several left-brain sections to sentence comprehension during both reading and signing (SN: 11/23/96, p. 326).

Petitto’s group narrowed the search for brain regions common to both linguistic forms by identifying neural activation unique to understanding and producing words.

The researchers studied 11 adults who had been deaf from birth, 5 of whom had learned American Sign Language and 6 of whom had learned another type of sign language used in Quebec and other parts of French Canada. Ten hearing adults who spoke English and had no knowledge of sign language also completed the experiment.

A positron-emission tomography (PET) scanner measured blood-flow surges in the deaf volunteers’ brains—an indirect marker of increased neural activity—during each of five tasks: staring at a point on a computer screen, watching a string of meaningless hand signs, observing meaningful signed words, imitating a new set of meaningful signed words, and generating appropriate signed verbs for a series of signed nouns.

Hearing participants performed the first four of these tasks. Half of them watched and imitated signs from one of the two sign languages; the other half did so for the other. The entire hearing group then produced spoken verbs in response to a series of printed nouns.

A frontal area of the left brain increased its activity uniquely when both deaf and hearing volunteers generated verbs, Petitto’s group reports. This region may search for and retrieve the meanings of words, whether linguistic expression hinges on sight or sound, they theorize.

Tissue located farther back on both sides of the brain lit up only when deaf adults viewed meaningful signs. Long linked to speech perception and the formation of associations between sounds and meanings, this area may process abstract properties of language that get expressed through both speech and signing, the researchers propose. It’s also possible that this brain region responds to the rapid presentation of any sensory information in units that form patterns, not just to language, they add.

Further research needs to establish which of these contrasting theories holds up, comments neuroscientist David Caplan of Massachusetts General Hospital in Boston in the same issue of PNAS.

The new report “reinforces the view that the functions carried out in what is widely thought of as auditory . . . cortex need to be reconsidered,” Caplan holds.

Bruce Bower has written about the behavioral sciences for Science News since 1984. He writes about psychology, anthropology, archaeology and mental health issues.