MoNETA: A Mind Made from Memristors

IEEE Spectrum – DARPA has funded a new memristor-based approach to AI that consists of a chip mimicking how neurons process information.

This site has discussed how memristors look very promising for building human-scale synapse networks. It was highly speculative (and still is) to think that memristors would be the key to AGI, but DARPA has now committed funding and launched a project around them. The memristor approach has not yet delivered any level of animal intelligence, but memristors seem to have a better shot at scaling to the number of synapses relevant for a rat or other animal.

Memristors also appear to have analog properties matching those of a synapse, with only a handful of memristors needed per synapse.

This approach has an architectural shot at getting brain-simulating hardware to the right level of complexity. Can it all function together and do useful things? We don't know. The projected hardware specifications look interesting. Could it fail to deliver in any number of ways? Yes.

The government, corporate, and academic effort and funding also appear to have real momentum, as opposed to AGI projects that have only one or two professors working on them.

The goal of the MOdular Neural Exploring Traveling Agent (MoNETA) project is to develop an animat that can intelligently interact with and learn to navigate a virtual world, making decisions aimed at increasing rewards while avoiding danger. The animat, a virtual agent living in a virtual environment, is designed to be modular: a whole-brain system, initially composed of fairly simple modules, will be progressively refined with more complex and adaptive modules and tested in increasingly challenging environments. The animat brain is designed in Cog Ex Machina (Cog), the software developed by HP in collaboration with Boston University for the DARPA SyNAPSE project. Cog, which can run on CPUs and GPUs and will run on memristor-based devices, makes it possible to pack the large-scale, highly interconnected, plastic, heterogeneous neural models that make up the animat brain into a low-power, high-density chip suitable for portable petascale neural computing. The current plan is for the animat to replicate a classic rat experiment, the Morris Water Maze, by February 2011, and then to progressively simulate more complex mazes used in rat experiments. Further evolution of the model will use the Iterative Evolution of Models (ItEM) project software.
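To make the idea of a reward-driven animat concrete, here is a minimal sketch in Python, not the Cog/MoNETA implementation: a tabular Q-learning agent that learns to reach a hidden goal cell in a small grid world, loosely analogous to a rat finding the submerged platform in a Morris Water Maze. The names (GridMaze, q_learning) and all parameters are illustrative assumptions.

```python
# Minimal sketch (not the Cog/MoNETA implementation): a tabular Q-learning
# "animat" that learns to reach a hidden goal cell in a small grid world,
# loosely analogous to a rat finding the platform in a Morris water maze.
# GridMaze, q_learning, and every parameter below are illustrative assumptions.
import random

class GridMaze:
    def __init__(self, size=5, goal=(4, 4)):
        self.size, self.goal = size, goal

    def reset(self):
        self.pos = (0, 0)
        return self.pos

    def step(self, action):
        # Actions: 0 = up, 1 = down, 2 = left, 3 = right
        dr, dc = [(-1, 0), (1, 0), (0, -1), (0, 1)][action]
        r = min(max(self.pos[0] + dr, 0), self.size - 1)
        c = min(max(self.pos[1] + dc, 0), self.size - 1)
        self.pos = (r, c)
        done = self.pos == self.goal
        return self.pos, (1.0 if done else -0.01), done  # reward at goal, small step cost

def q_learning(env, episodes=500, alpha=0.1, gamma=0.95, eps=0.1):
    q = {}                                  # (state, action) -> estimated value
    for _ in range(episodes):
        s, done = env.reset(), False
        for _ in range(500):                # cap episode length
            if random.random() < eps:
                a = random.randrange(4)     # explore
            else:
                a = max(range(4), key=lambda act: q.get((s, act), 0.0))  # exploit
            s2, r, done = env.step(a)
            best_next = max(q.get((s2, act), 0.0) for act in range(4))
            old = q.get((s, a), 0.0)
            q[(s, a)] = old + alpha * (r + gamma * best_next - old)
            s = s2
            if done:
                break
    return q

q_table = q_learning(GridMaze())            # the learned "policy" of the toy animat
```

This is only the behavioral outline of the task; the actual MoNETA brain is built from interconnected neural modules in Cog rather than a lookup table.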

When work on the brain-inspired microprocessor is complete, MoNETA’s first starring role will likely be in the U.S. military, standing in for irreplaceable humans in scout vehicles searching for roadside bombs or navigating hostile terrain. But we don’t expect it to spend much time confined to a niche. Within five years, powerful, brainlike systems will run on cheap and widely available hardware.

How brainlike? We’re not sure. But we expect that the changes MoNETA will foment in the electronics industry over the next couple of decades will be astounding.

In the brain-inspired system:

* Memristors will be used as analog synapses (sketched below)
* CPUs and GPUs will be used for neurons (custom chips are also possible)
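To make the memristor-as-synapse idea concrete, here is a minimal sketch under stated assumptions: a simple linear-drift-style device model, not HP's actual memristor physics. The device conductance plays the role of the synaptic weight, read-out is Ohmic, and programming pulses of either polarity nudge the conductance up or down within device limits. The class name, plasticity rule, and parameters are illustrative assumptions.

```python
# Minimal sketch (simple linear-drift-style model, not HP's device physics):
# an analog memristor used as a synapse. The conductance g is the synaptic
# weight; reads are Ohmic, and write pulses drift g within physical limits.

class MemristorSynapse:
    def __init__(self, g=0.5, g_min=0.01, g_max=1.0, rate=0.05):
        self.g, self.g_min, self.g_max, self.rate = g, g_min, g_max, rate

    def current(self, v_pre):
        # Ohmic read: synaptic current = conductance * presynaptic voltage.
        return self.g * v_pre

    def program(self, v_pulse):
        # Write: conductance drifts with the applied pulse, clipped to the
        # physical range of the device.
        self.g = min(self.g_max, max(self.g_min, self.g + self.rate * v_pulse))

# A crude pair-based plasticity rule: potentiate when pre- and postsynaptic
# neurons fire together, depress when only the presynaptic neuron fires.
syn = MemristorSynapse()
for pre, post in [(1, 1), (1, 0), (1, 1), (0, 1)]:
    if pre and post:
        syn.program(+1.0)   # one pulse polarity -> potentiation
    elif pre:
        syn.program(-1.0)   # opposite polarity -> depression
print(f"final conductance (synaptic weight): {syn.g:.2f}")
```

The appeal of the real devices is that this weight storage and update happens in the analog physics of a nanoscale element, rather than in digital memory updated by a processor.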

DARPA SyNAPSE (Systems of Neuromorphic Adaptive Plastic Scalable Electronics) goals are:
* 1 million neurons per square centimeter
* 10 billion synapses (memristors) per square centimeter
* 100 milliwatts per square centimeter
* total power 1 kilowatt

The total system would then be about 10,000 chips with a combined 100 trillion synapses and 10 billion neurons. The human brain has about 100 billion neurons and 100 trillion synapses, and at roughly 20 watts it would be about 50 times more energy efficient than the 1-kilowatt DARPA SyNAPSE target.
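A quick back-of-the-envelope check of those numbers, assuming one chip occupies one square centimeter and taking roughly 20 watts as the human brain's power budget:

```python
# Back-of-the-envelope check of the numbers above, assuming one chip is one
# square centimeter and the human brain dissipates roughly 20 watts.
chips = 10_000
neurons = chips * 1_000_000           # 1e10 neurons  (10 billion)
synapses = chips * 10_000_000_000     # 1e14 synapses (100 trillion)
power_watts = chips * 0.1             # 100 mW per chip -> 1 kW total
brain_watts = 20                      # approximate human brain power budget
print(neurons, synapses, power_watts, power_watts / brain_watts)
# 10000000000 100000000000000 1000.0 50.0
```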

Companies like Intel, Hynix, and of course HP are putting a lot of resources into finding ways to rely on these unreliable future devices. Neuromorphic computation will allow that to happen on both memristors and transistors.

It won’t be long until all multicore chips integrate a dense, low-power memory with their CMOS cores. It’s just common sense.

Our prediction? Neuromorphic chips will eventually come in as many flavors as there are brain designs in nature: fruit fly, earthworm, rat, and human. All our chips will have brains.

Brain-Inspired Computing by Massimiliano Versace

Now and in the foreseeable future, convergent advances in neural modeling, neuroinformatics, neuromorphic engineering, materials science, and computer science will enable the study and manufacture of novel computer architectures. These new architectures are promising not only for helping overcome the imminent failure of Moore's law, but also for opening the door to large-scale neural modeling research and applications. This talk focuses on memristor-based, bio-inspired computing devices and models scalable to biological levels. These devices, realized in the context of the DARPA-sponsored SyNAPSE (Systems of Neuromorphic Adaptive Plastic Scalable Electronics) project, promise to advance our understanding of how simulated and robotic agents with whole-brain systems learn to interact with their environment, and to create innovative technological applications that impact general-purpose computing and mobile robotics.

Snider, G., Amerson, R., Carter, D., Abdalla, H., Qureshi, S., Leveille, J., Versace, M., Ames, H., Patrick, S., Chandler, B., Gorchetchnikov, A., and Mingolla, E. (2010) Adaptive Computation with Memristive Memory. Submitted, IEEE Computer.

Leveille, J., Ames, H., Chandler, B., Gorchetchnikov, A., Mingolla, E., Patrick, S., and Versace, M. (2010) Learning in a distributed software architecture for large-scale neural modeling. BIONETICS10, Boston, MA, USA.

