Brain-inspired computer chip mimics 1 million neurons

By processing data in parallel, device could improve pattern recognition

ENERGY SIPPER  The brain-inspired TrueNorth computer chip (left) requires less energy and thus stays cooler than a traditional chip (right), as shown in this thermal image.

IBM Research

Human brainpower has produced a computer chip reminiscent of the human brain.

The new chip, reported in the Aug. 8 Science, scraps the design that has underpinned computers for decades in favor of an architecture that resembles a bundle of 1 million neurons. Such technology could pinch-hit for conventional computers on tasks they struggle with, such as identifying objects in photos and videos.

“It’s an impressive piece of silicon,” says Stephen Furber, a computer engineer at the University of Manchester in England. “A million neurons on a single chip is a big number.”

Computers’ basic architecture hasn’t changed much since the 1940s (SN: 10/19/13, p. 28). A central processor, following a sequence of instructions, takes data from memory, manipulates it and then returns it. The need to shuttle so many 1s and 0s back and forth limits the speed of computers and bloats their energy usage.
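
To make that bottleneck concrete, here is a minimal sketch of the fetch-process-store cycle, written as hypothetical Python rather than real machine code: every operand makes a round trip between memory and the processor, one instruction at a time.

```python
# A toy von Neumann machine: a central processor follows a sequence of
# instructions, pulling data from memory and writing results back.
memory = {"a": 2, "b": 3, "result": None}

def fetch(addr):
    return memory[addr]   # data travels from memory to the processor...

def store(addr, value):
    memory[addr] = value  # ...and back again after every operation

# Strictly sequential execution; the constant shuttling of 1s and 0s
# is what limits speed and bloats energy use.
x = fetch("a")
y = fetch("b")
store("result", x * y)
print(memory["result"])   # 6
```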

The human brain can’t do things like multiplication and division nearly as fast as a computer, but it works far more efficiently at other tasks that require recognizing patterns rather than crunching numbers. The brain’s roughly 85 billion neurons transport electrical signals across junctions called synapses, and some of those interneuron connections result in the storage of memories. All of this work is done in parallel, allowing the brain to perform complex tasks while consuming on average about 20 watts of power, just a fraction of what’s used by energy-hogging chips in most computers.

If a computer works by sequentially opening up large parcels of data, Furber says, then the brain works by reading stacks of postcards at the same time.

Mimicking the neuron-synapse architecture, a team led by Dharmendra Modha at the IBM Almaden Research Center in San Jose, Calif., introduced brain-inspired circuits called neurosynaptic cores in 2011. Instead of containing centralized blocks of processing and memory, each silicon core carried a network of decentralized components to transport, process and store data, much the way a bundle of neurons does.
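
A cartoon version of one such core can be sketched in a few lines of hypothetical Python (the class, weights and threshold below are illustrative, not IBM's design): the synaptic memory sits beside the neurons that use it, and many cores can step forward independently.

```python
import numpy as np

class NeurosynapticCore:
    """Toy core: local synaptic memory plus local integrate-and-fire logic."""
    def __init__(self, n_neurons=256, seed=0):
        rng = np.random.default_rng(seed)
        # Memory lives on the core: a 256-by-256 crossbar of synaptic weights.
        self.weights = rng.random((n_neurons, n_neurons))
        self.potential = np.zeros(n_neurons)  # per-neuron state, also on-core
        self.threshold = 60.0                 # arbitrary firing threshold

    def step(self, incoming_spikes):
        # Processing happens beside the memory: integrate incoming spikes,
        # fire wherever the potential crosses threshold, reset those neurons.
        self.potential += self.weights @ incoming_spikes
        fired = self.potential > self.threshold
        self.potential[fired] = 0.0
        return fired.astype(float)  # outgoing spikes, routed to other cores

core = NeurosynapticCore()
spikes = (np.random.default_rng(1).random(256) > 0.5).astype(float)
print(int(core.step(spikes).sum()), "neurons fired")
```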

Now the researchers have shrunk those cores and merged 4,096 of them into a keyboard-key-sized chip called TrueNorth, which contains an equivalent of 1 million neurons. Each neuron can communicate with 256 others, creating 256 million synapses, and the neurons relay electrical signals across synapses in parallel. The Department of Defense’s DARPA contributed $53.5 million to help fund the development of the technology.  
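
The reported figures follow from simple multiplication; this back-of-the-envelope check in Python just spells out the arithmetic (the variable names are ours, not IBM's):

```python
cores = 4096              # neurosynaptic cores merged onto TrueNorth
neurons_per_core = 256
fan_out = 256             # each neuron can communicate with 256 others

neurons = cores * neurons_per_core   # 1,048,576 -- the "1 million neurons"
synapses = neurons * fan_out         # 268,435,456 = 256 * 2**20,
                                     # reported as "256 million synapses"
print(f"{neurons:,} neurons, {synapses:,} synapses")
```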

Still, IBM’s chip doesn’t rival the brain of even organisms with far fewer than 1 million neurons. Real neurons can connect to many thousands of others and can adjust those connections; each of TrueNorth’s neurons is locked into a predetermined set of 256 others. “We have not built a brain, but we have come closest to approaching its structure in silicon,” Modha says. Multiple research teams are building similar cognitive computing devices with greater connectivity between neurons, in hopes of simulating the brain more accurately.

Yet reverse engineering even an oversimplified brain has the potential to improve computing, Modha says. His team used TrueNorth to scan the pixels of a video in parallel to identify people and vehicles. While it’s hard to compare TrueNorth directly with a conventional chip, the latter would have to test each pixel sequentially and would consume at least a thousand times as much energy per second to complete the task.
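
The contrast is easy to caricature in hypothetical Python (the threshold test below stands in for a real classifier; it is not IBM's software): a conventional chip visits pixels one by one, while a parallel design applies one test to every pixel of the frame at once.

```python
import numpy as np

rng = np.random.default_rng(0)
frame = rng.random((480, 640))  # stand-in for one grayscale video frame
threshold = 0.9                 # toy proxy for "looks like a person/vehicle"

# Conventional-chip style: run the test on each pixel in sequence.
hits_sequential = 0
for row in frame:
    for pixel in row:
        if pixel > threshold:
            hits_sequential += 1

# Parallel style, in the spirit of TrueNorth: one test across the whole frame.
hits_parallel = int((frame > threshold).sum())

assert hits_sequential == hits_parallel  # same answer, very different workflow
```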

Future iterations of brain-inspired chips could enable driverless cars to identify obstacles on the road, Furber says, or allow computers to find patterns in huge amounts of data. He also ponders the potential of using brain-inspired devices to complement the latest supercomputers, such as IBM’s Jeopardy-dominating Watson. “If you can use cognitive computing to enhance Watson,” Furber says, “then some really interesting capabilities will emerge.”

Editor’s Note: This article was updated August 18, 2014, to correct that TrueNorth was tested by identifying people and objects in a video, not a photograph.
