Singularity Summit: Human Brain Emulation and Brain Emulation Roadmap

Dharmendra S. Modha's (IBM Almaden) talk on his brain emulation project was one of the highlights of the 2008 Singularity Summit. The Brain Emulation Roadmap was not presented at the Singularity Summit, but it was recently published online and relates closely to the brain emulation work.

IBM's brain emulation project is able to carry out rat-scale simulations with 55 million neurons and 442 billion synapses in near real-time on a 32,768-processor BlueGene/L machine.

The rat-scale model (55 million neurons, 442 billion synapses) is about 3.5 times bigger than our previous mouse-scale model (16 million neurons, 128 billion synapses) and about eight times bigger than earlier (almost) half-mouse-scale models (8 million neurons, 50 billion synapses).

The essence of an efficient cortical simulator, C2, is as follows:

1. For every neuron:
   a. For every clock step (say 1 ms):
      i. Update the state of each neuron.
      ii. If the neuron fires, generate an event for each synapse that the neuron is post-synaptic to and pre-synaptic to.
2. For every synapse: when it receives a pre- or post-synaptic event, update its state and, if necessary, the state of the post-synaptic neuron.
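The clock-driven loop above can be sketched as a toy in Python. Everything here is illustrative: the class names, the leaky integrate-and-fire update, and the random wiring are stand-ins, not C2's actual neuron model or data structures.

```python
import random

class Neuron:
    def __init__(self):
        self.v = 0.0              # membrane potential (arbitrary units)
        self.out_synapses = []    # synapses this neuron is pre-synaptic to
        self.in_synapses = []     # synapses this neuron is post-synaptic to

    def step(self):
        """Advance one clock step (say 1 ms); return True if the neuron fires."""
        self.v += random.gauss(0.05, 0.1)  # stand-in for input current
        self.v *= 0.9                      # leak
        if self.v > 1.0:
            self.v = 0.0                   # reset after a spike
            return True
        return False

class Synapse:
    def __init__(self, post):
        self.w = 0.1              # synaptic weight
        self.post = post

    def on_pre(self):
        # Pre-synaptic spike: deliver current to the post-synaptic neuron.
        self.post.v += self.w

    def on_post(self):
        # Post-synaptic spike: a plasticity rule (e.g. STDP) would adjust w here.
        pass

# Wire a tiny random network and run the clock-driven loop.
neurons = [Neuron() for _ in range(100)]
for n in neurons:
    for post in random.sample(neurons, 5):
        s = Synapse(post)
        n.out_synapses.append(s)
        post.in_synapses.append(s)

for t in range(1000):                 # 1 s of simulated time at 1 ms steps
    for n in neurons:
        if n.step():
            for s in n.out_synapses:  # events to synapses n is pre-synaptic to
                s.on_pre()
            for s in n.in_synapses:   # ...and post-synaptic to (for plasticity)
                s.on_post()
```

The point of the structure is that per-step work is dominated by cheap neuron updates, while the expensive synaptic work happens only on spike events — the property that makes an event-driven synapse design efficient at scale.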

Our focus is on simulating only those details that lead us towards insights into the brain's high-level computational principles. Elucidation of such high-level principles will, we hope, lead to novel cognitive systems, computing architectures, programming paradigms, and numerous practical applications.

The human cortex has about 22 billion neurons, roughly a factor of 400 more than our rat-scale model's 55 million neurons. We used a BlueGene/L with 92 TF and 8 TB to carry out rat-scale simulations in near real-time [one-tenth speed]. So, by naïve extrapolation, one would require at least a machine with a computation capacity of 36.8 PF and a memory capacity of 3.2 PB. Furthermore, assuming that there are 8,000 synapses per neuron, that neurons fire at an average rate of 1 Hz, and that each spike message can be communicated in, say, 66 bytes, one would need an aggregate communication bandwidth of ~2 PBps.
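The compute and memory extrapolation above is straightforward scaling by the neuron-count ratio; a quick sketch, using only the figures quoted in the text:

```python
# Naive extrapolation from the rat-scale run to human cortex scale,
# scaling compute and memory by the neuron-count ratio.
human_neurons = 22e9      # human cortex
rat_neurons = 55e6        # rat-scale model
scale = human_neurons / rat_neurons   # factor of 400

compute_pf = 92e12 * scale / 1e15     # BlueGene/L 92 TF -> petaflops
memory_pb = 8e12 * scale / 1e15       # 8 TB -> petabytes

print(scale, compute_pf, memory_pb)   # 400.0 36.8 3.2
```

This reproduces the 36.8 PF and 3.2 PB figures; note it is a lower bound, since the rat-scale run itself only achieved one-tenth speed.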

Scaling up the synapse count from rat brain to human brain is the main driver of the computing power needed for brain emulation. Petaflop supercomputers exist now, so if such a system were dedicated to brain emulation, a model roughly ten times larger than the rat-scale one could be simulated.

Anders Sandberg and Nick Bostrom have released a 130-page human brain emulation technology development roadmap.

The roadmap defines levels of brain emulation, and it appears that the IBM work is a level 4 brain emulation.

An informal poll among workshop attendees produced a range of estimates of where the required resolution for Whole Brain Emulation (WBE) lies. The consensus appeared to be level 4‐6. Two participants were more optimistic about high-level models, while two suggested that elements on level 8‐9 may be necessary, at least initially (but that the bulk of mature emulation, once the basics were understood, could occur on level 4‐5). To achieve emulation on this level, the consensus was that 5×5×50 nm scanning resolution would be needed. This roadmap will hence focus on level 4‐6 models, while remaining open to the possibility that deeper levels may turn out to be needed.
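To get a sense of what 5×5×50 nm scanning resolution implies for raw data volume, here is a back-of-envelope sketch. The brain volume figure (~1.4 litres) and the 1 byte/voxel assumption are mine, not from the text above:

```python
# Back-of-envelope raw data volume for whole-brain scanning at 5x5x50 nm.
# Assumptions (not from the text): brain volume ~1.4 litres, 8-bit voxels.
brain_volume_nm3 = 1.4e24         # 1.4 litres = 1.4e-3 m^3 = 1.4e24 nm^3
voxel_nm3 = 5 * 5 * 50            # one voxel at the quoted resolution
voxels = brain_volume_nm3 / voxel_nm3
bytes_per_voxel = 1               # 8-bit greyscale
raw_bytes = voxels * bytes_per_voxel

print(f"~{voxels:.2e} voxels, ~{raw_bytes / 1e21:.2f} zettabytes raw")
```

On these assumptions the raw scan runs to roughly a zettabyte, which is why the roadmap treats scanning automation and image processing as major workstreams rather than afterthoughts.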

Special hardware for WBE

It is possible that WBE can be achieved more efficiently using dedicated hardware rather than generic hardware. Dedicated neural network chips have reached up to 1.7 billion synaptic updates (and 337 million synaptic adjustments) per second for ANN models (Kondo, Koshiba et al., 1996), which is approaching current supercomputing speeds for more complex models. Recently, there has been some development of FPGAs for running complex neuron simulations, producing an order-of-magnitude faster simulation for a motoneuron than a software implementation (four times real-time, 8M compartments/s) (Weinstein and Lee, 2005). An FPGA implementation has the advantage of being programmable, not requiring special-purpose WBE hardware. Other advantages include that, as long as there is chip space, more complex models do not require more processing time, and that precision can be adjusted to suit the model and reduce space requirements. However, scaling up to large and densely interconnected networks will require developing new techniques (Weinstein and Lee, 2006). A better understanding of the neocortical architecture may serve to produce hardware architectures that fit it well (Daisy project, 2008). It has been suggested that using FPGAs could increase computational speeds in network simulations by up to two orders of magnitude, in turn providing a testing ground for developing special-purpose WBE chips (Markram, 2006).
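One can combine the chip throughput quoted above with the earlier human-cortex assumptions (8,000 synapses per neuron, 1 Hz average firing) to estimate how many such chips a human-scale synaptic event load would naively require. This is rough arithmetic on the document's own figures, ignoring interconnect and model complexity entirely:

```python
# Rough feasibility arithmetic from figures quoted in the text.
neurons = 22e9            # human cortex neurons
syn_per_neuron = 8000     # assumed synapses per neuron
rate_hz = 1.0             # assumed average firing rate

updates_per_s = neurons * syn_per_neuron * rate_hz  # synaptic events/s
chip_rate = 1.7e9         # synaptic updates/s per chip (Kondo et al., 1996)
chips_needed = updates_per_s / chip_rate

print(f"~{updates_per_s:.2e} updates/s -> ~{chips_needed:.1e} chips")
```

That works out to on the order of 10^5 such chips for the raw event rate alone, which is why the two-orders-of-magnitude speedups hoped for from FPGAs and custom silicon matter so much.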

It may also be possible to use embedded processor technology to manufacture large amounts of dedicated hardware relatively cheaply. A study of high-resolution climate modelling in the petaflop range found a 24- to 34-fold reduction of cost and about two orders of magnitude smaller power requirements using a custom variant of embedded processor chips (Wehner, Oliker et al., 2008).

Possible Brain Emulation Complications

Whole Brain Emulation (WBE) on the neuronal/synaptic level requires relatively modest increases in microscopy resolution, a less trivial development of automation for scanning and image processing, a research push on the problem of inferring functional properties of neurons and synapses, and relatively business-as-usual development of computational neuroscience models and computer hardware. This assumes that this is the appropriate level of description of the brain, and that we find ways of accurately simulating the subsystems that occur on this level. Conversely, pursuing this research agenda will also help detect whether there are low-level effects with significant influence on higher-level systems, which would require an increase in simulation and scanning resolution.
