Will Robots Dream?

Posted by Chuck Harrell on Nov 25, 2014 1:51:20 PM

The robots in our factories are quite "dumb": they receive instructions from programs, carry out those tasks, and occasionally make simple "yes" or "no" decisions, whether collecting goods from a shelf or welding one part onto another. This is all well and good, but what if we want our robots to start making decisions for themselves in situations that would be unsuitable for humans?

Cognitive computing, more popularly known as artificial intelligence, is the development of computer systems based on the human brain. The field got its start in the 1950s, and computer companies have since given computers the sort of intelligence we have seen around us for a while, where a computer can make decisions based on the choices it is given or recognize a voice command. But now, as scientists discover more about how the human brain works, we're moving towards computers that can learn and produce independent thought.

IBM's SyNAPSE (Systems of Neuromorphic Adaptive Plastic Scalable Electronics) project has seen some amazing results. The team demonstrated the first neurosynaptic chips, which will allow the building of "computing systems that emulate the brain's computing efficiency, size, and power usage".

To get these chips working so that sensory-based cognitive computing applications could be developed on them, they had to be programmed, and since no existing language was suitable, the team needed to develop their own. As Dr. Dharmendra S. Modha, Principal Investigator and Senior Manager at IBM Research, says: "We are working to create a FORTRAN for neurosynaptic chips. While complementing today's computers, this will bring forth a fundamentally new technological capability in terms of programming and applying emerging learning systems."

To make these cognitive systems easier to build, the team has created reusable building blocks called corelets. Each corelet has a specific function, which allows corelets to be combined in different configurations, as best explained on the Neurosynaptics website: "a corelet could include all of the individual cores that perceive sound. The programmer could use that corelet in conjunction with others that represent edge detection and color identification to develop a new application that takes advantage of all those features."
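To make the idea concrete, here is a minimal sketch in Python of what corelet-style composition might look like. This is only an analogy: the Corelet class, the compose function, and the names sound_perception, edge_detection, and color_id are hypothetical illustrations, not IBM's actual Corelet language or API.

    # Hypothetical sketch of corelet-style composition (illustrative only;
    # these names and classes are not IBM's Corelet language).

    class Corelet:
        """A reusable building block wrapping a group of cores behind one named function."""
        def __init__(self, name, process):
            self.name = name
            self._process = process  # callable that transforms an input signal

        def __call__(self, signal):
            return self._process(signal)

    def compose(*corelets):
        """Combine several corelets into one application; each contributes its own feature."""
        def application(signal):
            return {c.name: c(signal) for c in corelets}
        return application

    # Three single-purpose corelets, mirroring the examples in the quote.
    sound_perception = Corelet("sound", lambda s: f"sound features of {s}")
    edge_detection   = Corelet("edges", lambda s: f"edges found in {s}")
    color_id         = Corelet("color", lambda s: f"colors found in {s}")

    # A new application built from existing corelets, without reprogramming them.
    scene_analyzer = compose(sound_perception, edge_detection, color_id)
    print(scene_analyzer("camera and microphone frame"))

The point is the same as in the quote: single-purpose blocks are written once and then recombined into new applications.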

While the chips themselves are still some way off, developers at the lab have been able to write and test code using a simulator. So far, 150 corelets have been developed, and there are plans to open the platform up to third parties.

These neurosynaptic chips and corelets handle right-brain functions such as smell, sight, and sound; IBM has also developed a left-brain system called Watson. Watson focuses on language and analytical thinking and is being opened up to the general public, who can use the system to "improve diagnosis and treatment" and "improve investment decisions and customer satisfaction".

It may take a good few years for androids to be able to dream of anything, but IBM's research team is aiming to build a neurosynaptic chip with 10 billion neurons and 100 trillion synapses (approximately the same number as in a human brain) that will be less than two liters in volume and use only one kilowatt of energy. It will handle the right-brain functions and eventually meld with IBM's existing left-brain system.
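As a quick back-of-the-envelope check on those figures (a sketch only; the per-neuron ratio below is derived from the numbers above, not stated in the original):

    # Rough arithmetic on the chip figures quoted above.
    neurons = 10_000_000_000          # 10 billion neurons
    synapses = 100_000_000_000_000    # 100 trillion synapses
    print(synapses // neurons)        # 10,000 synapses per neuron, on average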


Topics: Did you know?, Hot Topics
