For years, scientists and programmers have tried to make software work the way the human brain does, with an ability to learn and, at least in some cases, to adapt quickly to changing circumstances. IBM says a new type of chip, emerging from a collaboration with Cornell University and iniLabs, will mimic the brain's abilities of perception and cognition. The technology could theoretically be harnessed by banks for the kinds of analytics work they run on high-performance computing today: evaluating risk, predicting defaults, pricing complex products, determining creditworthiness, and making quick decisions about pricing and products.
Most computers are designed for sequential processing according to a pre-defined program. The human brain, IBM notes, operates comparatively slowly and at low precision but excels at tasks such as recognizing, interpreting, and acting upon patterns, while consuming the same amount of power as a 20-watt light bulb and occupying the volume of a two-liter bottle.
In August 2011, IBM demonstrated a building block of a brain-inspired chip architecture based on an interconnected, configurable network of "neurosynaptic cores." Each core brings memory, processors, and communication in close proximity and is designed to emulate the brain's ability to respond to biological sensors and analyze data from many sources at once.
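The core idea is easier to picture with a toy spiking-neuron model of the kind neuromorphic designs are built around. The sketch below is an illustrative leaky integrate-and-fire neuron; the function name and all parameters are invented for illustration and do not reflect IBM's actual core design.

```python
# Toy leaky integrate-and-fire (LIF) neuron, a common abstraction in
# neuromorphic hardware. Illustrative only -- not IBM's core design.

def lif_neuron(spike_inputs, weight=0.4, leak=0.9, threshold=1.0):
    """Integrate weighted input spikes, leak potential each time step,
    and emit an output spike whenever the threshold is crossed."""
    potential = 0.0
    output = []
    for spike in spike_inputs:
        potential = potential * leak + weight * spike  # leak, then integrate
        if potential >= threshold:
            output.append(1)   # fire an output spike
            potential = 0.0    # reset membrane potential after firing
        else:
            output.append(0)
    return output

# A sustained burst of input spikes drives the neuron over threshold.
print(lif_neuron([1, 1, 1, 0, 0, 1, 1, 1]))
# → [0, 0, 1, 0, 0, 0, 0, 1]
```

Unlike a sequential program, the neuron's behavior emerges from how stimuli accumulate over time, which is why such cores suit pattern-driven, sensory workloads.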
IBM and its collaborators were recently awarded $12 million in new funding from the Defense Advanced Research Projects Agency for the next phase of this project, called Systems of Neuromorphic Adaptive Plastic Scalable Electronics, thus bringing the funding to approximately $53 million.
IBM's long-term goal is to build a chip system with ten billion neurons and a hundred trillion synapses, while consuming merely one kilowatt of power and occupying less than two liters of volume.
Systems built from these chips could bring the real-time capture and analysis of data closer to the point of collection. They would gather not only symbolic data, which is fixed text or digital information, but also sub-symbolic data, which is sensory-based and whose values change continuously. This raw data reflects activity spanning commerce, social interaction, logistics, location, movement, and environmental conditions.
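The symbolic/sub-symbolic distinction can be made concrete with a small sketch. The record fields, sensor signal, and smoothing step below are all invented examples, assuming a simple moving average as a stand-in for analysis near the point of collection.

```python
# Illustrative contrast between symbolic and sub-symbolic data.
# All values and field names here are invented examples.
import math
import random

# Symbolic data: discrete, fixed digital records.
transaction = {"account": "12345", "type": "debit", "amount": 250.00}

# Sub-symbolic data: a continuously varying sensory signal, e.g. a
# noisy temperature reading sampled over time.
random.seed(0)
samples = [20.0 + 2.0 * math.sin(t / 10) + random.gauss(0, 0.3)
           for t in range(100)]

# Analysis near the point of collection: a simple moving average
# smooths the raw signal before any decision is made on it.
window = 5
smoothed = [sum(samples[i - window:i]) / window
            for i in range(window, len(samples) + 1)]

print(len(samples), len(smoothed))
```

The symbolic record can be matched exactly against a rule; the sub-symbolic stream has to be interpreted statistically, which is the kind of workload the neurosynaptic architecture targets.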
One use case IBM suggests for the technology is glasses for the visually impaired, equipped with sensors that gather and interpret data such as how many individuals are ahead of the user, the distance to an upcoming curb, the number of vehicles in a given intersection, or the height of a ceiling or length of a crosswalk. "Like a guide dog, sub-symbolic data perceived by the glasses would allow them to plot the safest pathway through a room or outdoor setting and help the user navigate the environment via embedded speakers or ear buds," IBM says. The same concept could provide sensory-based data input and on-board analytics for automobiles, medical imagers, healthcare devices, smartphones, cameras, and robots.
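The "plot the safest pathway" step can be sketched with classical code: a breadth-first search over an occupancy grid built from sensor readings. The grid, the function name, and the obstacle encoding are all assumptions made for illustration; IBM has not described its navigation algorithm.

```python
# Hypothetical sketch: shortest obstacle-avoiding path over a grid of
# sensor-detected free space (0) and obstacles (1), via breadth-first
# search. Names and data are invented for illustration.
from collections import deque

def shortest_safe_path(grid, start, goal):
    """Return a list of (row, col) cells from start to goal that avoids
    obstacle cells, or None if no such path exists."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([start])
    came_from = {start: None}  # each visited cell -> its BFS parent
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []          # walk parents back to reconstruct the path
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

# A room with a wall across the middle and one gap on the right.
room = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(shortest_safe_path(room, (0, 0), (2, 0)))
# → [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0)]
```

The appeal of the neuromorphic approach is doing the perception side (turning raw sensor data into that grid) in milliwatts on the device itself, rather than shipping the raw stream to a server.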