Noises On, Language Off: Speech impairment linked to unsound perception

A common childhood language disorder stems from a brain-based difficulty in discerning the acoustic building blocks of spoken words, especially in noisy settings such as classrooms, a new study suggests.

Researchers estimate that as many as 7 percent of U.S. elementary school students experience substantial problems in understanding what others say and in speaking comprehensibly, despite good physical health, normal hearing, and average-or-better intelligence. The precise cause of this condition, known as specific language impairment (SLI), remains controversial.

Psychologist Johannes C. Ziegler of the University of Provence in Marseille, France, and his colleagues find that these children’s subtle problems in identifying spoken consonants in quiet settings become far worse with the addition of background noise.

In kids free of language problems and in youngsters with SLI, steady background noise disrupted consonant identification more than fluctuating background noise did. But both types of background noise undermined speech perception much more in children with the language disorder than in the others, the scientists report in the Sept. 27 Proceedings of the National Academy of Sciences.

Ziegler’s group studied 20 French children, most of them 10 to 11 years old, who had been diagnosed with SLI at a Marseille hospital. Testing also included 20 kids ages 10 to 11 and another 20 kids, most of them 8 to 9 years old, none of whom had language problems. The researchers included the younger group because their language skills were similar to those of the 10- and 11-year-olds with SLI.

Participants listened through headphones to a woman uttering a series of vowel-consonant-vowel combinations, such as aba, ada, and aga. Their job was to repeat each utterance or to point it out from among 16 choices displayed on a computer screen.

In some trials, the woman spoke with no background noise. In others, she spoke over either steady background noise or noise that faded in and out in a regular pattern.

The children with SLI correctly identified 95 percent of the consonants presented without background noise, only slightly below the near-perfect performance of the other two groups. However, the SLI group’s accuracy fell to 72 percent with fluctuating background noise and to 62 percent with steady background noise.

Corresponding drop-offs for the other two groups were slight in comparison: in the fluctuating and steady noise conditions, the older children scored 94 percent and 86 percent, and the younger ones scored 91 percent and 83 percent.

Ziegler’s group proposes that children with SLI hear just fine, but that their brains have difficulty picking out speech sounds from a stream of acoustic information.

Some other scientists disagree, however. For instance, Mabel L. Rice of the University of Kansas in Lawrence suspects that the condition stems from miswiring or delayed growth of brain networks responsible for grammar use.

“It’s hard to know if the kids in this new study had SLI as many researchers now define it,” Rice remarks. Other investigators have found that speech-perception deficits of the kind that Ziegler’s group observed rarely accompany SLI in the general population of elementary school children but often turn up among children referred to medical clinics. So, Rice argues, the test group had problems beyond typical SLI.
