By Bruce Bower
People listen with their skin, not just their ears. Air puffs delivered to volunteers’ hands or necks at critical times alter their ability, for better or worse, to hear certain speech sounds, a new study finds.
Tactile and auditory information, as well as other sensory inputs, interact in the brain to foster speech perception, propose linguists Bryan Gick and Donald Derrick, both of the University of British Columbia in Vancouver.
In many languages, speakers expel a small burst of air to make aspirated sounds. In English, for example, aspiration distinguishes ta from da and pa from ba.
Volunteers were more likely to identify aspirated syllables correctly when they heard those syllables while receiving slight, inaudible air puffs to the skin, Gick and Derrick report in the Nov. 26 Nature. Air puffs enhanced detection of aspirated ta and pa sounds and increased the likelihood of mishearing non-aspirated da and ba sounds as their aspirated counterparts, the researchers say.
Participants integrated skin sensations into what they heard despite having had few opportunities to feel the air flow that they or others produce while talking.
“This speaks to the power of the human perceptual system to make use of whatever information is available from any of the senses,” Gick says.
Psychoacoustics researcher Charlotte Reed of MIT agrees. “These new findings make it clear that we can intuitively use tactile information during speech perception,” Reed remarks.
Until recently, speech investigators focused mainly on how seeing a speaker’s face can alter the intelligibility of what the speaker says. Consider what researchers call the McGurk effect: Someone mouthing, say, ga on a videotape while saying ba on the soundtrack is often perceived as emitting a compromise sound — da.
Gick and Derrick wanted to know whether people automatically make use of tactile information while listening to others talk. They tested 66 volunteers, 22 in each of three conditions. All participants sat in a soundproof booth and listened through headphones to the syllables pa and ba spoken eight times each. They then heard ta and da spoken eight times each. Background noise that played through the headphones increased the difficulty of telling syllables apart.
A valve attached to an air compressor delivered an air puff either to the back of participants’ right hands or the front of their necks during half of the syllable presentations. Air puffs were also delivered near their headphones but directed away from their bodies. Participants were aware of the valve but did not know when it would deliver air puffs.
Air puffs on the hands and neck had comparable effects. But air puffs that could not be felt had no effect on syllable perception, the researchers say. An additional trial, with another 22 volunteers, found that light taps to the back of the right hand also did not influence syllable perception.
“Perceivers in our study were not merely responding to some tactile stimulation,” Gick says. “They really did ‘hear’ the puff of air via their skin.”
This line of research might lead to hearing aids designed to deliver mild puffs of air to the neck upon detecting aspirated sounds, he adds.
Gick’s results align with other evidence suggesting that the brain quickly and effectively integrates information from the skin and ears, Reed notes.
In the October Journal of the Acoustical Society of America, she and her colleagues reported that presenting barely detectable tones at the same time as mild skin vibrations makes both stimuli easier to detect than when either is presented alone.