Brain-computer interfaces promise new freedom for the paralyzed and immobile
Thick Velcro straps cinched the robot’s legs to Steve Holbert’s calves and thighs. The straps were snug — they helped secure his body to the machine. But they had to be fastened just right. Too loose and Holbert might slip around. Too tight and the straps could cause pressure sores. Not that Holbert could tell the difference: He hadn’t felt his legs since 2009.
Holbert was testing out a brain-controlled walking device for people who are paralyzed. He had tried the machine before, but couldn’t quite get it to sync up with his thoughts. This afternoon, he was giving it another shot. Already, researchers in José “Pepe” Contreras-Vidal’s lab at the University of Houston had stretched an electrode-studded cap over Holbert’s head and strapped him into the robot, an 80-pound hulk of high-tech machinery and electronics named the NeuroRex. “It reminds me of the cargo-loader Sigourney Weaver drives in Aliens,” Holbert says.
He grabbed the robot’s armrests, scrolled through the LCD menu and selected “stand.” The robot’s motors whirred, and scientists hovered nearby in case the machine tipped over. At Contreras-Vidal’s cue, the bustling lab hushed and Holbert cleared his brain. Then he mentally commanded the NeuroRex to move.
“I’m thinking, ‘Go, go, go,’ and it started walking!” he says. “It took off at exactly the same time I was trying to make it move.” As Holbert marched slowly across the lab, he smiled. “Pepe, it feels like these legs are mine.” The whole room started clapping. “It was an emotional moment,” Contreras-Vidal says. Holbert remembers the scientist looking calm. “Pepe knew it was going to work,” he says. “He’s been optimistic the whole time.”
For years, Contreras-Vidal has been working to decode brain signals collected from the scalp. Because skin and bone distort these electroencephalography, or EEG, signals, deciphering messages from the brain can be challenging. But Contreras-Vidal and a pack of scientists around the world have been finding creative ways around the problem. The researchers are gleaning new information from EEG and creating hybrid systems that merge brain signals with intelligent robotics, or draw upon several types of signals. Together, the new work is clearing a path for practical-use brain-computer interfaces, or BCIs.
Robotic gadgets guided by people’s thoughts aren’t a new idea: Quadriplegics with brain-implanted electrodes can use their minds to move prosthetic arms, scientists reported in 2012 and February of this year. Volunteers could reach, grasp and even sip coffee from a bottle. But those BCIs require neurosurgery — an expensive operation that carries significant risks. “You get a hole drilled in your head, you risk infection and you have scarring for the rest of your life,” says Brendan Allison, a neuroscientist at the University of California, San Diego.
For some disabled people, the benefits of invasive techniques might outweigh the potential hazards. Implants can plug directly into neurons, tapping into detailed movement data. And patients don’t have to fiddle with an electrode cap every time they want to use their device. But for other users, paralyzed or healthy, noninvasive BCIs may be a better fit.
Though the technology is still a little clunky for everyday use, Contreras-Vidal hopes patients could one day easily integrate noninvasive BCIs into their lives. “This is the beginning of a dream where we can restore walking capabilities to patients,” he says. “We don’t want this technology to stay in the lab.”
After a motocross crash paralyzed him from the chest down, Holbert tried anything that might help him use his legs again. “I lived over 50 years with a fully functioning body,” he says. “To have that suddenly ripped away was extremely traumatic.”
In 2012, after trying physical therapy and experimental treatments, Holbert met Contreras-Vidal, a neuroengineer who was working on fusing a robotic walking machine with a new type of BCI. But getting users to control machines with their minds is tricky. For the system to work, an electrode cap fitted over the skull has to pick up brain signals and send them to a computer. Then, the computer has to translate the signals into commands and deliver them to the robot. And it all has to happen fast.
With some BCIs, electrodes resting above the motor cortex, the brain area responsible for movement, pick up electrical signals when users imagine moving their hands and feet. Thinking about moving different body parts sparks different brain signals. Using computer programs, researchers can link these signals to specific commands. When patients imagine moving their left arms, for example, their robotic devices could turn left.
In other systems, electrodes sit above the occipital lobes, visual brain regions bundled in the back of the skull. By flashing lights at different speeds, researchers can make people’s brains send out specific signals. As in the imagined movement system, researchers can tie different brain signals to commands. In this BCI, volunteers could look at a quickly blinking light to move a device in one direction, and a slowly blinking one to move in another.
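Staring at a light flashing at a given rate drives matching oscillations in the visual cortex, which show up as a power peak at that frequency in the EEG spectrum. The detection step can be pictured with a Fourier transform, as in this minimal Python sketch; the flicker rates, sampling rate and simulated signal are invented for illustration, not drawn from any particular lab's system:

```python
import numpy as np

def detect_flicker(eeg, fs, candidates):
    """Return the candidate flicker frequency (Hz) with the most EEG power."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(eeg.size, d=1.0 / fs)
    # Power at the FFT bin nearest each candidate flicker rate
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in candidates]
    return candidates[int(np.argmax(powers))]

# Simulated response: the user is watching a light flashing at 12 Hz.
fs = 250.0                                # illustrative sampling rate
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(1)
eeg = np.sin(2 * np.pi * 12 * t) + 0.5 * rng.standard_normal(t.size)
command_freq = detect_flicker(eeg, fs, candidates=[8.0, 10.0, 12.0, 15.0])
```

A BCI would then map each detectable flicker rate to a command, such as "turn left" for the 8 Hz light and "turn right" for the 12 Hz light.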
Scientists have been studying EEG for decades, but decoding brain signals can be complicated and time-consuming. Volunteers often need weeks of training just to move a cursor in a virtual three-dimensional space, a feat accomplished only recently, in 2010. Even for simpler systems, the mental focus needed for control can be exhausting. So Contreras-Vidal’s team wanted to tap into the neural wiring of walking itself. “Our patients used to walk, so that memory is still there,” he says. He just had to find a way to pluck the information from the brain.
Like a radio tower, the brain broadcasts information at different wavelengths. Scientists can tune into these brain waves with EEG. For most BCIs, researchers have zeroed in on FM, or frequency-modulated, signals, which encode information in the waves’ frequency, or how tightly packed the oscillations are over time. But as talk radio fans know, there’s another source of information. Contreras-Vidal’s group flipped the receiver to AM, or amplitude modulation, and tuned into a low-frequency station called the delta band.
AM signals carry information in their waves’ power, or amplitude, which varies over time. “So many groups have focused on frequency,” says Contreras-Vidal. But AM signals carry a lot of information, he says. Picking up only FM stations means researchers miss out on other useful signals.
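In signal-processing terms, tuning into the AM side of the delta band means filtering the EEG down to its lowest frequencies (the delta band is commonly defined as roughly 0.5 to 4 Hz) and then tracking how the amplitude of those slow waves rises and falls. A minimal sketch in Python using the SciPy library; the sampling rate, filter order and synthetic signal are illustrative assumptions, not the lab's actual parameters:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def delta_band_envelope(eeg, fs, low=0.5, high=4.0):
    """Isolate the delta band and return its amplitude envelope over time.

    eeg: 1-D array of EEG samples from one electrode
    fs:  sampling rate in Hz
    """
    sos = butter(4, [low, high], btype="band", fs=fs, output="sos")
    delta = sosfiltfilt(sos, eeg)        # delta-band signal
    return np.abs(hilbert(delta))        # amplitude over time (the "AM" part)

# Synthetic example: a 2 Hz wave whose amplitude grows over 4 seconds,
# plus a little noise -- the envelope should track the growth.
fs = 250.0
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(0)
eeg = (1 + t) * np.sin(2 * np.pi * 2 * t) + 0.1 * rng.standard_normal(t.size)
env = delta_band_envelope(eeg, fs)
```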
His team tuned the BCI to AM and trained a computer program to recognize the brain signals that urge legs to walk. To teach the program to spot these signals, Contreras-Vidal’s team strapped Holbert into the NeuroRex and switched on the robot’s remote controls. Then they told Holbert to think about walking, while researchers stepped the machine forward. As the robot walked, stopped and walked again, the team collected EEG signals from Holbert’s electrode cap.
Feeding these signals to the computer program teaches it how to recognize brain patterns for movement. The signals act like the computer’s cheat sheet. When volunteers control the NeuroRex mentally, the BCI decodes their thoughts by checking for the brain patterns it learned during training sessions.
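The cheat-sheet step is, in essence, supervised learning: feature windows of EEG recorded while the robot walked or stood are labeled, and a classifier learns to tell the two brain states apart. A toy sketch using scikit-learn; the features here are random stand-ins for real EEG measurements, and the classifier choice is an assumption, not the team's published pipeline:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Stand-in features: each row is one time window of EEG-derived values.
# "Walk" windows are shifted to mimic a movement-related signal change.
stand = rng.normal(0.0, 1.0, size=(100, 8))
walk = rng.normal(1.5, 1.0, size=(100, 8))

X = np.vstack([stand, walk])
y = np.array([0] * 100 + [1] * 100)      # the cheat sheet: 0 = stand, 1 = walk

clf = LinearDiscriminantAnalysis().fit(X, y)

# On well-separated training data, the classifier should score highly;
# decoding a live session means calling clf.predict on each new window.
accuracy = clf.score(X, y)
```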
In July, Contreras-Vidal’s team reported preliminary findings at the Engineering in Medicine and Biology Society Conference in Osaka, Japan. In training sessions, their system could recognize the brain patterns for movement about 98 percent of the time. And when Holbert took control of the NeuroRex, he could stop and start the machine with his mind. The system may even skirt the lengthy training sessions that plague other BCIs: The team’s training takes less than five minutes.
“This is good, original, helpful work,” says Allison, “and it seems to be leading to something real.”
This fall, Contreras-Vidal’s team is partnering with Houston Methodist Hospital in a phase I clinical trial to test the system in a larger group of paraplegic patients. The researchers suspect that the robot may offer patients more than mobility. The act of standing and walking could help them regain strength, as well as improve their cardiovascular health, bone density and bladder and bowel function, Contreras-Vidal says.
But the researchers still have to iron out some kinks. Two of the first patients the team has worked with, Holbert and Gene Alford, can’t always control the robot with their thoughts. Getting the machine to walk outside of a controlled lab setting could make things even harder. “This has to work on the streets where you are bombarded with information,” says Contreras-Vidal. Even a system that works 90 percent of the time isn’t good enough, he says. Contreras-Vidal thinks pairing EEG signals with others, such as nerve signals in the muscles, could make the NeuroRex even more reliable, and that could give Holbert and other paralyzed people more options for movement.
To get around today, Holbert rides in a manual or an electric wheelchair. His accident wiped out muscular control from the nipples down, so he can still use his arms and shoulders but not his abs. Still, he has enough upper body strength to roll a wheelchair, and can almost hoist himself into the NeuroRex on his own.
Many paralyzed people have far less muscular control. Some have to rely on more limited movements to drive electric wheelchairs, such as lifting a finger, blowing a poof of air or twitching a cheek — the method Stephen Hawking uses. “Imagine that you need to ‘poof’ all day long,” says bioengineer José del R. Millán of the Swiss Federal Institute of Technology in Lausanne. “It’s hard. Your muscles will get very, very tired.”
In recent years, several research groups have experimented with electric wheelchairs coupled to EEG caps. In some of these chairs, cap-wearing users focus their eyes on specific commands such as “far right” or familiar household locations flashing on a screen. Focusing on these images spikes EEG activity, letting the BCI know which command to deliver to the wheelchair.
But Millán wanted to free patients from looking at a screen, and he wanted to ease the mental burden of constantly giving commands. So his team built a brain-controlled wheelchair with two key changes. It uses an imagined-movement mental task, and perhaps most important, it’s intelligent: The chair works with its operator to decide where to drive, by nabbing information about the environment from an army of sensors.
Two webcams perch above the chair’s tiny front wheels, a laptop sits on a shelf in the back, gadgets called wheel encoders gauge speed and 10 sonar sensors collect data about the chair’s perimeter. Together, the extra hardware continuously maps out the chair’s surroundings. When an operator decides to turn left or right, the computer program refers to the map before responding. This “shared control” system is kind of like a friendly GPS: Rather than spouting directions, it takes the wheel occasionally to give drivers a chance to rest.
As in Contreras-Vidal’s system, Millán’s volunteers wear an EEG cap. But instead of trying to walk, they imagine moving their right or left hands to turn different directions. Usually, such a system would require people to constantly direct the chair’s movement. That’s where shared control comes in, Millán says. If a user is driving across a room toward a desk, “The wheelchair can say, ‘Aha, it seems that you want to reach that object, so I will help you.’ ” Then the chair can roll forward, avoiding obstacles automatically. This cuts down a user’s mental workload, Millán says. “At that moment, they can relax a bit.”
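In spirit, shared control is an arbitration rule: the decoded mental command steers, but the chair's sensor map can override a command that would cause a collision, and can keep driving toward a likely goal when the user rests. A toy sketch of that decision logic with invented names; Millán's actual controller blends continuous signals and probabilities rather than making discrete choices like these:

```python
def shared_control(user_command, obstacle_ahead, goal_nearby):
    """Combine a decoded mental command with the chair's sensor map.

    user_command:   "left", "right", or None (user is resting)
    obstacle_ahead: True if the sonar/camera map shows a blockage in front
    goal_nearby:    True if the map suggests a likely target (e.g. a desk)
    """
    if obstacle_ahead:
        return "stop"            # safety veto overrides everything
    if user_command is not None:
        return user_command      # explicit intent wins
    if goal_nearby:
        return "forward"         # chair helps finish the approach
    return "forward"             # default: keep cruising

# The user relaxes while the chair closes in on a desk it has mapped.
decision = shared_control(None, obstacle_ahead=False, goal_nearby=True)
```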
In March, Millán and a colleague reported that four healthy volunteers could sustain mental control of the wheelchair while steering it through an obstacle course. And because the system gave users a smooth ride and continuous control, it may outperform other brain-controlled wheelchairs, Millán suggests in IEEE Robotics and Automation Magazine. He and his colleagues have since tested their chair with two disabled people. The team hasn’t published its results yet, but Millán says the volunteers can drive the chair as well as healthy people do.
Still, training to use the system takes hours. And though “there will always be some people who are natural whizzes with imagined-movement BCIs,” says Allison, others may have trouble getting the hang of it. Millán thinks his wheelchair could work with other types of BCI, such as Contreras-Vidal’s system, or those that draw upon signals from the brain and body. In 2011, Millán and colleagues reported a BCI that fused EEG signals with signals picked up from the muscles. Hybrids like this, or others that combine different types of EEG signals, could open up BCIs for a wider range of people.
And it’s not just for patients. Healthy people can tap into the technology too, Allison says. Hybrid systems could one day help users explore virtual environments, remotely control robotics or even enhance prosthetics so that wounded soldiers can return to duty.
Though many people could likely use BCIs for simple tasks, such as moving a cursor across a screen, even slightly more complicated maneuvers are much more difficult. After scientists figured out how to master 1-D cursor movement with noninvasive BCIs in 1991, it took more than a decade to upgrade the technique to accurate 2-D control and six more years to achieve 3-D movement. “It’s harder than it sounds,” says Allison. With an imagined-movement BCI, as few as 5 percent of the population may be able to control a cursor in two dimensions.
But there might be another way to simultaneously move a cursor in 2-D, Allison thought. He wanted to combine the imagined-movement task with flickering lights, which can be used to control cursor movement in a different dimension.
“Maybe it’s too overwhelming to think about moving your hand while also looking at a flickering light,” Allison says, “but I didn’t think so.” This summer, as he watched a friend’s toddler dash over rocks, he was reminded of just how simple it is to combine movement and visual attention tasks. “It’s very basic,” he says. “Even a 3-year-old kid can do it.”
He and colleagues designed a BCI system where users imagined different movements to move a cursor up and down on a screen and looked at different flickering lights to move the cursor left or right. Using this hybrid system, some users were able to steadily move the cursor in two directions, such as up and right, at the same time, Allison’s team reported last year in the Journal of Neuroscience Methods. The team estimates that its hybrid BCI could work for up to 20 percent of the population.
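Conceptually, the hybrid works because the two decoders are independent channels: one supplies the vertical component of each cursor move, the other the horizontal, and combining them yields diagonal motion. A simplified sketch of that combination step, with invented function and command names; the real system works on continuous decoder outputs rather than discrete labels:

```python
def hybrid_cursor_step(motor_imagery, flicker_choice, step=1.0):
    """Combine two BCI channels into one 2-D cursor move.

    motor_imagery:  "up", "down", or None  (imagined-movement decoder)
    flicker_choice: "left", "right", or None (flickering-light decoder)
    Returns (dx, dy); both nonzero means diagonal movement.
    """
    dx = {"left": -step, "right": step}.get(flicker_choice, 0.0)
    dy = {"up": step, "down": -step}.get(motor_imagery, 0.0)
    return dx, dy

# Imagining hand movement while watching the right-hand light
# moves the cursor up and to the right at the same time.
move = hybrid_cursor_step("up", "right")
```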
Allison can envision his system, or hybrids like it, helping disabled people communicate. Using a combination of BCI technologies could make tasks like checking e-mail or surfing the Web easier. “It’s like having a keyboard and a mouse,” he says. People can get by using only one or the other, but it’s far less efficient. Allison’s volunteers didn’t find the dual BCI system annoying or even especially difficult. That’s crucial if the technology is ever going to find a place in people’s homes, says BCI researcher Theresa Vaughan of the Wadsworth Center, a research institute at the New York State Department of Health. “The technology has to be reliable, it has to be simple to operate and it has to be useful,” she says.
Vaughan and colleagues have spent years investigating how to translate BCIs for practical use. In 2005, her group teamed up with Helen Hayes Hospital in West Haverstraw, N.Y., to supply BCIs to people with amyotrophic lateral sclerosis, or ALS. In 2009, the researchers joined with the Veterans Administration to deliver 25 systems to vets with ALS. The collaborations are the first effort to track long-term use of BCIs in patients’ homes.
These BCIs help patients communicate: The system lets them spell out words by focusing on letters flashing on a screen. Vaughan says the study is revealing how people fit the technology into their lives, and could one day help researchers make improvements. For new BCI technologies, she says, “These kinds of translational experiments are going to be key.” Getting these technologies into real-life situations could hasten the delivery of a new toolbox to help people walk, move and talk.
“I’m ready for it right now,” Holbert says. After walking in the mind-controlled robot, he and Contreras-Vidal’s other volunteers are eager to try again. “They call me and say, ‘Pepe, can we come back?’ ” Contreras-Vidal says. The NeuroRex isn’t quite ready for its debut in people’s homes: Its bulk can be unwieldy, and its slow, preprogrammed steps aren’t exactly agile. But Holbert says even regaining a bit of walking ability could make a huge difference in his life. “Just being able to take a step closer to the sink or to grab something out of a cabinet,” he says. “That would be tremendous.”
Bionic leg listens to muscles
In just a few years, stepping out of a car, climbing up stairs and getting dressed might be a little easier for Zac Vawter. He’s young and strong enough to conquer these tasks now, but with a prosthetic leg even pulling on a pair of jeans can be challenging.
After a 2009 amputation to remove his lower right leg, Vawter worked with biomedical engineer Levi Hargrove of the Rehabilitation Institute of Chicago and colleagues to test a new type of prosthetic limb — a thought-controlled bionic leg.
As with other types of prostheses, a molded plastic socket suctioned to his residual limb holds the robotic leg in place. But Hargrove’s team added in a few high-tech improvements. Inside the socket, researchers embedded 10 electrodes that lie against the skin and pick up electrical signals from the muscles. These electromyographic, or EMG, signals travel to an onboard computer that also pulls data from 13 mechanical sensors that measure speed, position and pressure, among other variables.
A computer algorithm fuses information from all the sensors to figure out how a patient is trying to move the bionic leg — whether Vawter wants to point his foot, walk up a hill or stretch out his knee. “We’re really going after what the person is thinking about,” says Hargrove.
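One way to picture the fusion step is as concatenation: each time window contributes features from the 10 EMG electrodes and the 13 mechanical sensors, and a single classifier maps the combined vector to an intended movement. A hedged sketch with random stand-in data (not Vawter's recordings) and an assumed classifier, since the article doesn't specify the algorithm:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)

def make_windows(n, emg_shift, mech_shift):
    """Stand-in feature windows: 10 EMG channels + 13 mechanical sensors."""
    emg = rng.normal(emg_shift, 1.0, size=(n, 10))
    mech = rng.normal(mech_shift, 1.0, size=(n, 13))
    return np.hstack([emg, mech])        # fused 23-feature vector per window

# Two intended movements, separated in both sensor streams.
level_walk = make_windows(150, emg_shift=0.0, mech_shift=0.0)
stair_climb = make_windows(150, emg_shift=1.0, mech_shift=1.0)

X = np.vstack([level_walk, stair_climb])
y = np.array([0] * 150 + [1] * 150)      # 0 = level ground, 1 = stairs

clf = LogisticRegression(max_iter=1000).fit(X, y)
accuracy = clf.score(X, y)
```

The reported error rates suggest why fusion helps: either stream alone leaves ambiguity that the other can resolve.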
Thinking about walking sparks signals in the brain that zip down the spinal cord to the nerves in the legs. These nerves instruct muscles to contract, forcing joints to move. Researchers can eavesdrop on these instructions with the bionic leg’s EMG sensors, rather than capturing EEG signals from the scalp.
Scientists can even gather information from nerves that once plugged into muscles in the amputated limb. The brain signals that command legs to walk are healthy up until the point where the nerves have been cut, Hargrove says. By surgically rewiring these nerves to hook up to muscles in the thigh, his team can tap into commands that would have once controlled Vawter’s ankle. The surgery also offers a medical bonus: Rerouting snipped nerve fibers onto muscle tissue prevents painful scars called neuromas.
Hargrove’s team tested the bionic leg by having Vawter walk on flat surfaces, slopes and stairs. Using only the mechanical sensors, the leg’s movements failed to match up to Vawter’s intentions about 13 percent of the time. But combining data from the mechanical sensors with that from Vawter’s muscles dropped the error rate to less than 2 percent. And none of the errors were big enough to make Vawter stumble, Hargrove and colleagues reported September 26 in the New England Journal of Medicine.
Vawter was able to switch smoothly between walking and climbing stairs. “He said, ‘This is how I used to do stairs before my amputation,’ ” Hargrove recalls. But for Vawter, the bionic leg especially outshone other prosthetics in simple joint movements. “Being able to reposition the knee and ankle just by thinking about it — that’s something he’s never been able to do before,” Hargrove says.
Vawter has been coming to the lab a few times a year to try out the leg’s latest enhancements. It’s good for cruising around indoors and venturing outside a bit, but Hargrove wants to make it rugged enough for everyday use. He thinks Vawter will be able to take home a model in three to five years. As Vawter continues to test the device, Hargrove says, “I suspect it will get more and more difficult to leave it here each time he comes.”
L. Hargrove et al. Robotic leg control with EMG decoding in an amputee with nerve transfers. New England Journal of Medicine. Vol. 369, September 2013, p. 1237–1242.