If these monkeys were 1970s TV stars, they would play crime-fighting cyborgs in “The Six Million Dollar Monkeys.”
Macaque monkeys with electrodes implanted in their brains learned to control a robotic arm with their thoughts, researchers report.
Scientists gently restrained the monkeys’ own arms and positioned the mechanical arm at each animal’s left shoulder as if it were a real arm. After practicing for several days, the monkeys appeared to treat the robotic arm as their own and could feed themselves with the arm using fluid, rapid motions.
“The thing that struck me was how naturally the animals interacted with the device,” comments John Kalaska, a neuroscientist from the University of Montreal who wrote a commentary that appeared with the research online May 28 in Nature. “It’s a further proof of principle that, down the line, we will be able to develop all the hardware necessary to allow paraplegic or quadriplegic patients to have prosthetic limbs that they can control in a natural way with their thoughts.”
Such devices for humans are still years away, Kalaska cautions. The computers that interpreted the monkeys’ brain signals in the current experiments are bulky, making them impractical for a portable prosthetic. And in past research, electrodes implanted into the brains of animals or humans lost contact with the nerve cells after weeks or months because cells in the brain treated the electrodes as foreign objects and attacked them. Both of these obstacles would have to be overcome before thought-controlled robotic arms or legs for people would be feasible, Kalaska says.
In previous research, monkeys and even quadriplegic people have controlled the movement of cursors on computer screens through electrodes implanted in their brains. Animals have also learned to open and close a simple robotic hand in a similar way.
But the new research, performed by Andrew Schwartz and his colleagues at the University of Pittsburgh and Carnegie Mellon University in Pittsburgh, is the first to use animals’ brain activity to manipulate a physical object that moves in complex ways.
“It’s a significant advance of the practical demonstration of a brain-machine interface,” Kalaska says.
Schwartz’s team implanted an array of tiny electrodes in a region of the monkeys’ brains called the motor cortex. This area controls voluntary movement, so the researchers could learn to associate patterns of electrical activity in this brain region with the monkeys’ desire to reach toward pieces of food placed in various locations.
Computer software interpreted where the monkeys wanted to reach — and whether they wanted to open or close the hand — based on that brain activity. The computer would then calculate the specific movements of the robotic arm’s shoulder and elbow joints to perform the task without the monkeys having to think about these details.
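The article does not describe the team’s actual control software, but the step it mentions — turning a desired hand position into shoulder and elbow angles — is a classic inverse-kinematics calculation. As a hypothetical illustration (the link lengths and function name here are invented, not from the study), a planar two-joint version looks like this:

```python
import math

def two_link_ik(x, y, l1=0.30, l2=0.25):
    """Illustrative planar inverse kinematics: given a hand target (x, y),
    return the shoulder and elbow angles (radians) that reach it.
    l1 and l2 are assumed upper-arm and forearm lengths in meters."""
    d2 = x * x + y * y  # squared distance from shoulder to target
    # Law of cosines gives the elbow angle (0 = arm fully extended).
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= cos_elbow <= 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)
    # Shoulder angle: direction to the target, corrected for the
    # offset the bent elbow introduces.
    shoulder = math.atan2(y, x) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow)
    )
    return shoulder, elbow
```

The point of such a layer is exactly what the article describes: the decoder only has to estimate *where* the animal wants the hand to go, and the joint-level details are computed automatically.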
Rapid interpretation of the monkeys’ brain signals helped the robotic arm move naturally: the computer converted the monkeys’ thoughts into arm movements in about 150 milliseconds, comparable to the delay of a real arm.
Not only could the monkeys grab food with the robotic arm and put the food in their mouths, but one monkey learned to give morsels of food dangling from its lips an extra push with the mechanical hand, a behavior that had not been trained.
[Video: the monkeys maneuver a robotic arm to feed themselves. Courtesy of A. Schwartz]