Imaging scans show where symbols turn to letters in the brain
A functional MRI study maps the path symbols take as they gain meaning and sound
In learning to read, squiggles and lines transform into letters or characters that carry meaning and conjure sounds. A trio of cognitive neuroscientists has now mapped where that journey plays out inside the brain.
As readers learn to associate symbols with a word's pronunciation and meaning, a hierarchy of brain areas processes the information, the researchers report August 19 in the Proceedings of the National Academy of Sciences. The finding unveils some of the mystery behind how the brain learns to tie visual cues to language (SN Online: 4/27/16).
“We didn’t evolve to read,” says Jo Taylor, who is now at University College London but worked on the study while at Aston University in Birmingham, England. “So we don’t [start with] a bit of the brain that does reading.”
Taylor — along with Kathy Rastle of Royal Holloway, University of London in Egham and Matthew Davis of the University of Cambridge — zoomed in on a region at the back and bottom of the brain, called the ventral occipitotemporal cortex, that is associated with reading.
Over two weeks, the scientists taught made-up words written in two unfamiliar, archaic scripts to 24 native English–speaking adults. The words were assigned the meanings of common nouns, such as lemon or truck. Then the researchers used functional MRI scans to track which tiny chunks of brain in that region became active when participants were shown the words learned in training.
The way letters look — curves or straight lines — takes hold in the back of the ventral occipitotemporal cortex, the team found. But when sounds and meanings come into play, an area farther forward in that brain region, one better suited to handling abstract concepts, seemed to kick into gear.
The study “very clearly depicts the transition of a word … as you go from the eye to the cortex, and as you move along the cortex,” says David Rothlein, a cognitive neuroscientist at the VA Boston Healthcare System not involved with the work.
Words in the two scripts that had similar pronunciations or meanings triggered similar brain activity, the team found. The brain can make sense of words written in different fonts or sizes not just from visual cues, but also by connecting that information with what it knows about spoken language, Taylor says. And eventually, "when you see a word, you immediately get its sound and its meaning without any effort."
The brain region mapped in the study is known to process visual information. A few recent studies suggest that learning to read makes parts of the ventral occipitotemporal cortex more attuned to reading, either by displacing other functions, such as recognizing objects, or by encroaching on areas that are less tied to specific functions, Rastle says. This reorganization could be how reading becomes automatic. "Without that pathway … we would be like children reading letter by letter," she says.