Shifting sets of brain networks may be activated to make sense of what other people say
In the brain, the gift of gab — or at least the gift of knowing what someone’s gabbing about — depends on sight, not just sound. If a listener sees a talker’s lips moving or hands gesturing, certain brain networks pitch in to decode the meaning of what’s being said, a new study suggests.
In daily life, the number of brain networks recruited to understand spoken language varies with the kinds of communication-related visual cues available, proposes a team led by neuroscientist Jeremy Skipper, now at Weill Medical College of Cornell University in New York City.
This idea contrasts with a longstanding notion that language comprehension is handled solely by a relatively narrow set of brain regions.
“The networks in the brain that process language change from moment to moment during an actual conversation, using whatever information is available to predict what another person is saying,” Skipper says.