Saturday, August 9, 2014



Crossing a traditional computer with a biological one, the brain, is no easy task — but researchers at IBM have been hard at work doing just that. On Thursday they announced the results of 10 years of research and $53 million in DARPA funding. The company's new brain-inspired chip achieves unparalleled power efficiency, and IBM claims it may change the fundamental methods of computing.
The latest chip boasts more than 5 billion transistors, putting it in supercomputer-on-a-chip territory, but that's not the impressive part — those transistors are organized into 4,096 "neurosynaptic cores," on which are more than a million simulated "neurons" and 256 million "synapses." IBM's chip, you see, is an attempt to replicate the brain's style of processing information. It's described in a paper featured this week in the journal Science.
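For a rough sense of how those figures fit together, here's a quick back-of-the-envelope sketch in Python; the even split across cores is an assumption, though it matches IBM's published description of 256 neurons per core.

```python
# Back-of-the-envelope breakdown of the figures quoted above, assuming the
# neurons and synapses are spread evenly across the 4,096 cores (IBM's own
# description puts 256 neurons on each core).
cores = 4_096
neurons_per_core = 256
synapses_per_neuron = 256

neurons = cores * neurons_per_core
synapses = neurons * synapses_per_neuron

print(neurons)   # 1,048,576 -- the "more than a million" neurons
print(synapses)  # 268,435,456 = 256 * 2**20, the quoted "256 million" synapses
```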
"We derived insight from two sides of neuroscience," said Dharmendra Modha, chief scientist of Brain-Inspired Computing for IBM, in an interview with NBC News. "Neuroanatomy for structure and neurophysiology for systems."
Image credit: IBM. A close-up view of one of the "neurosynaptic cores," many of which work together to form the "TrueNorth" chip.
That may sound a bit like a cross between tech and biology jargon, but it really is a totally different way of looking at computing. Here's why.
The vast majority of computers, ever since the days of vacuum tubes and punch cards, use what is called the "Von Neumann" architecture, named after John von Neumann, one of the earliest pioneers in the field. It basically means that data is stored somewhere, then sent to a central processing unit to be crunched, and the result is sent back to storage.
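To make the contrast concrete, here's a toy sketch (plain Python, nothing to do with IBM's hardware) of that shuttle-everything-through-one-processor pattern:

```python
# A toy illustration of the Von Neumann pattern described above: data sits in
# one place, travels to a single central processor to be crunched, and the
# result travels back to memory.

memory = {"input": [3, 1, 4, 1, 5], "output": None}

def cpu(values):
    # Every computation funnels through this one processing unit.
    return sum(values)

memory["output"] = cpu(memory["input"])  # fetch, process centrally, store back
print(memory["output"])  # 14
```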
But the brain doesn't work anything like that. Your visual and auditory systems don't have to send sights and sounds to a central location and wait for them to be processed and sent back — instead, a huge network of interlocking neurons works together in complex and subtle ways to both store and process information.
Image credit: IBM. Even putting the brain on the same graph as regular processors is a bit misleading, but the brain outperforms traditional computers in both speed and efficiency. In fact, it's in a class of its own.
And the result is a system that's not only powerful and versatile, but incredibly efficient. The brain does the same work as a whole data center while using around 20 watts of power. The IBM team aimed to reproduce this, and they appear to have had significant success.
The SyNAPSE chip (Systems of Neuromorphic Adaptive Plastic Scalable Electronics) uses no central processor; every "neuron" stores and processes data independently. Networks programmed onto the chip let data flow quickly and easily between adjacent neurons, performing complex computing tasks in real time while using a fraction of a percent of the power of traditional systems.
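As a loose illustration of that style, here's a minimal sketch of a generic integrate-and-fire network in plain Python. It is not IBM's design, but it shows the two properties described above: every neuron keeps its own state, and activity travels as spikes along fixed local connections rather than through a central processor.

```python
# A highly simplified, generic integrate-and-fire model: each neuron stores its
# own potential (its local "memory") and passes spikes only to its neighbors.

import random

NUM_NEURONS = 16
THRESHOLD = 1.0
LEAK = 0.9  # membrane potential decays a little each time step

# Per-neuron state; there is no shared central memory.
potential = [0.0] * NUM_NEURONS
# Fixed "programmed" connections: neuron i feeds its two nearest neighbors.
synapses = {i: [(i + 1) % NUM_NEURONS, (i + 2) % NUM_NEURONS]
            for i in range(NUM_NEURONS)}

def tick(external_input):
    """One time step: integrate input locally, fire spikes, deliver them."""
    spikes = []
    for i in range(NUM_NEURONS):
        potential[i] = potential[i] * LEAK + external_input.get(i, 0.0)
        if potential[i] >= THRESHOLD:
            spikes.append(i)
            potential[i] = 0.0  # reset after firing
    # Spikes travel only along local connections, never through a central hub.
    for src in spikes:
        for dst in synapses[src]:
            potential[dst] += 0.5
    return spikes

for step in range(10):
    stimulus = {random.randrange(NUM_NEURONS): 0.6}  # a little random input
    print(step, tick(stimulus))
```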
One capability demonstrated by the researchers is real-time tracking and identification of objects in a video feed. Using only a few milliwatts of power, the chip is able to pick out cars and people separately, highlighting their position as they move. It's possible to do this with traditional computers, sure, but the way they do it is hugely inefficient and consumes lots of power. By using a chip like IBM's, small devices could carry out complex tasks without taxing their batteries.
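Purely as a hypothetical illustration of how that kind of spiking output might be turned into position highlights (this is not how IBM's demo works, and the names and numbers are invented), imagine counting spikes per grid cell of a frame and flagging the busy cells:

```python
# Hypothetical sketch: treat each video frame as a grid of cells, total the
# spikes attributed to each cell, and report cells whose activity crosses a
# threshold as likely objects. Names and numbers are illustrative only.

def highlight_regions(spike_counts, threshold=20):
    """spike_counts maps (row, col) grid cells to spike totals for one frame."""
    return [cell for cell, count in spike_counts.items() if count >= threshold]

frame_activity = {(0, 0): 3, (1, 2): 27, (3, 1): 41}  # made-up activity map
print(highlight_regions(frame_activity))  # [(1, 2), (3, 1)]
```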
Image credit: IBM. A demonstration of the system shows visual stimuli being categorized and tracked in real time while using very little power.
Brain-inspired chips have their weaknesses too, of course. "We think of existing Von Neumann computers as essentially left-brain," said Modha, citing the common (if only vaguely correct) wisdom that the left side of the brain is more focused on logic and calculation. "This is a right-brain machine. We don't think this will replace Von Neumann computers, we think it will enhance them, complement them."
Having a computer that "perceives" rather than calculates is a bit of a difficult concept to grasp, but that's partly because we're so used to the latter. Think about the things your brain is much better at than computers: not massive long division or complex 3-D simulations, but pulling faces out of a crowd, recognizing danger, or detecting when someone's voice indicates anger or humor.
We're still a long way from computers that do all those things, but brain-inspired computing is a powerful tool for getting there. "The next step is to make the impossible real," Modha concluded, "to take it from the lab to the real world."
Link: NBC
