Connectomics: Definition, History, and Progress Toward a Map of the Human Brain


Brain mapping process from a 2008 research paper: high-resolution T1-weighted and diffusion spectrum imaging (DSI) MRI is acquired; 66 cortical regions with clear anatomical landmarks are identified and then individually subdivided into small regions of interest (ROIs), resulting in 998 ROIs.

The Economist magazine has described the history and progress of connectomics. As Wikipedia defines it, connectomics is a high-throughput application of neural imaging and histological techniques intended to increase the speed, efficiency, and resolution of maps of the multitude of neural connections in a nervous system. The principal focus of such a project is the brain, and the map it produces is called a connectome.

Connectomics actually started before the word existed. In 1972 Sydney Brenner, a biologist then at Cambridge University, decided to work out the connections of every cell in the nervous system of a small nematode worm called C. elegans. He picked this animal because its body has a mere 959 cells, of which 302 are nerve cells. It is also a hermaphrodite, fertilising itself to produce clones. That means individual worms are more or less identical.

Dr Brenner and his team embedded their worms in blocks of plastic, sliced the blocks thinly and then stained each slice so its features would show up in an electron microscope. The resulting pictures allowed the path taken by each nerve cell to be traced through the worm’s body. They also revealed the connections between cells. Over the course of 14 painstaking years the team managed to map the complete nervous system of C. elegans, work for which Dr Brenner won a Nobel prize.

The scale of that work, though, hardly compares with today’s quests to map the brains of mice and fruit flies. The cerebral cortex—the part of a mammal’s brain that thinks—is composed of 2mm-long units called cortical columns. Winfried Denk of the Max Planck Institute for Medical Research in Heidelberg, Germany, estimates that it would take a graduate student (the workhorse of all academic laboratories) about 130,000 years to reconstruct the circuitry of such a column. But efforts to automate the process are gaining ground.

Dr Brenner’s method used what is known as a transmission electron microscope. In this the electrons that form the image pass through the sample, so the individual slices have to be prepared and examined one by one. Dr Denk is speeding matters up by using a scanning electron microscope instead. This takes pictures of the surface of an object. Dr Denk (or, rather, his graduate students) can thus load the machine with a chunk of plasticised brain and a slicer. Once the microscope has taken a picture of the exposed surface of the chunk, the slicer peels away a layer 25 billionths of a metre thick, revealing a new surface for the next shot. The slice itself can be discarded, so the process is much faster than using a transmission microscope.
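
For a sense of scale, a quick back-of-the-envelope calculation in Python, using only the figures quoted above, shows how many 25-nanometre slices it takes to work through the full depth of a single 2mm cortical column:

    # Slices needed to section one 2 mm cortical column at 25 nm per slice
    column_depth_m = 2e-3        # a cortical column is about 2 mm long
    slice_thickness_m = 25e-9    # each peel removes 25 billionths of a metre
    num_slices = column_depth_m / slice_thickness_m
    print(f"{num_slices:,.0f} slices")   # 80,000 slices per column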

Researchers have also devised sneaky ways to tag parts of the brain that are of special interest, so that they can be followed more easily from slice to slice. Dr Denk, for example, tracks the myriad branches of a single nerve cell using an enzyme from horseradish. This gets stuck on the cell’s surface and then reacts with a stain that is added to the sample.

It is also possible to trace neural pathways from cell to cell. Ed Callaway at the Salk Institute in La Jolla, California, does so using rabies viruses. Rabies hops between nerve cells as it races to the brain, which is why even an infected bite on the ankle will eventually drive someone mad. That ability to leap the gap between cells means the connections branching from a single cell can be mapped.

Even when the images are in, however, making a map from them is another matter. Dr Brenner’s team traced each cell by eye—matching shapes through hundreds of cross-sections. Sebastian Seung, a computational neuroscientist at the Massachusetts Institute of Technology, is working on a program that will automate this process, too. It will allow a computer to learn how to match cells from one slice to another by trial and error, as a human would, but with the infinite patience that humans lack.
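
The article does not spell out how such a matching program works, but the core problem can be sketched in a few lines of Python: for each labelled cell cross-section in one slice, link it to whichever region in the next slice it overlaps most. The data and the naive overlap rule below are invented for illustration; this is not the actual MIT software.

    import numpy as np

    def match_segments(slice_a, slice_b):
        # Greedily link each labelled region in slice_a to the label in
        # slice_b it overlaps most -- a crude stand-in for learned matching.
        links = {}
        for label in np.unique(slice_a):
            if label == 0:                       # 0 = background
                continue
            under = slice_b[slice_a == label]    # labels beneath this cell
            under = under[under != 0]
            if under.size:
                values, counts = np.unique(under, return_counts=True)
                links[int(label)] = int(values[np.argmax(counts)])
        return links

    # Two tiny labelled cross-sections (hypothetical data)
    a = np.array([[1, 1, 0], [0, 2, 2], [0, 0, 2]])
    b = np.array([[3, 3, 0], [0, 4, 4], [0, 4, 4]])
    print(match_segments(a, b))                  # {1: 3, 2: 4}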

The result of all this effort, it is hoped, will be precise circuit-diagrams of brains. The first brains to be mapped will probably have belonged to mice. Besides being cheap and disposable, a mouse brain weighs half a gram and packs a mere 16m neurons. Human brains (1.4kg and 100 billion neurons) will come later, when all the wrinkles have been ironed out in rodents, and proper methods devised to analyse the results.


The first complete high-resolution map of the human cerebral cortex, completed in 2008, identified a single network core that could be key to the workings of both hemispheres of the brain.

In 2008, researchers from Indiana University, the University of Lausanne, the Ecole Polytechnique Fédérale de Lausanne in Switzerland, and Harvard Medical School created the first complete high-resolution map of how millions of neural fibers in the human cerebral cortex, the outer layer of the brain responsible for higher-level thinking, connect and communicate. Their groundbreaking work identified a single network core, or hub, that may be key to the workings of both hemispheres of the brain.

A team of neuroimaging researchers led by Patric Hagmann used state-of-the-art diffusion MRI, a non-invasive scanning technique that estimates fiber connection trajectories from gradient maps of the diffusion of water molecules through brain tissue. A highly sensitive variant of the method, called diffusion spectrum imaging (DSI), can depict the orientation of multiple fibers that cross a single location. The study applied this technique to the entire human cortex, producing maps of millions of neural fibers running throughout this highly furrowed part of the brain.
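
In outline, deterministic fiber tracking steps repeatedly along the locally dominant diffusion direction from a seed point. The Python sketch below is a minimal illustration with an invented direction field; real DSI tractography estimates a full diffusion distribution per voxel and handles crossing fibers, which this toy version does not.

    import numpy as np

    def track_streamline(direction_at, seed, step=0.5, n_steps=100):
        # Follow the principal diffusion direction from a seed point.
        # direction_at: function mapping a 3D point to a direction vector.
        points = [np.asarray(seed, dtype=float)]
        for _ in range(n_steps):
            d = direction_at(points[-1])
            norm = np.linalg.norm(d)
            if norm == 0:                        # no coherent direction: stop
                break
            points.append(points[-1] + step * d / norm)
        return np.array(points)

    # Toy field: fibers curving gently in the x-y plane (purely illustrative)
    field = lambda p: np.array([1.0, 0.1 * np.sin(p[0]), 0.0])
    streamline = track_streamline(field, seed=[0.0, 0.0, 0.0])
    print(streamline.shape)                      # (101, 3): one fiber trajectory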

Olaf Sporns of Indiana University then carried out a computational analysis to identify regions of the brain that play a more central role in this connectivity, serving as hubs in the cortical network. Surprisingly, the analysis revealed a single, highly and densely connected structural core in the brains of all participants.
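
The shape of that hub analysis can be sketched in a few lines of Python: given a region-by-region connection matrix like the 998-ROI map described above, rank regions by how strongly they are connected. Real analyses use measures such as degree, strength and betweenness centrality on the measured matrix; this version uses random stand-in data.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy symmetric connection matrix standing in for the 998-ROI cortical map
    n_rois = 12
    upper = np.triu(rng.random((n_rois, n_rois)), 1)
    connections = upper + upper.T                # undirected, so symmetrise

    # Node "strength" = summed connection weight; the most densely connected
    # regions are the candidate hubs of the network core.
    strength = connections.sum(axis=0)
    hubs = np.argsort(strength)[::-1][:3]
    print("candidate hub ROIs:", hubs, "strengths:", strength[hubs].round(2))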

“We found that the core, the most central part of the brain, is in the medial posterior portion of the cortex, and it straddles both hemispheres,” Sporns said. “This wasn’t known before. Researchers have been interested in this part of the brain for other reasons. For example, when you’re at rest, this area uses up a lot of metabolic energy, but until now it hasn’t been clear why.”


High-Resolution Connection Matrix, Network Layout and Connectivity Backbone

FURTHER READING
In 2008, MIT Technology Review discussed connectomics and Jeff Lichtman of Harvard's work creating Brainbows.

Brainbows: genetically engineering mice so that their brain cells express different combinations of fluorescent colors reveals the brain’s complicated anatomy. In the image, round green neurons are interspersed with diffuse support cells called astrocytes. Credit: Jean Livet


The overall goal of the Connectome project [the Initiative in Innovative Computing (IIC) at Harvard includes the Connectome project as one of several computing initiatives] is to map, store, analyze and visualize the actual neural circuitry of the peripheral and central nervous systems in experimental organisms, based on a very large number of images from high-resolution microscopy. The proposing team from the Center for Brain Sciences has already demonstrated its capacity for, and expertise in, high-throughput imaging. The critical challenges are computational, as the total number of voxels needed to establish the Connectome is ~10^14. The principal challenges are to develop: (a) algorithms for efficient 3D segmentation and circuit identification; (b) the ability to transfer, store and analyze 3D images in the multi-hundred-gigabyte range; and (c) scalable database techniques to store, manage and query multi-terabyte, multi-modal datasets.
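
The quoted ~10^14 voxels give a feel for the storage problem. A rough estimate in Python, assuming one byte per voxel of greyscale electron-microscopy data (an assumption for illustration, not a figure from the proposal):

    # Back-of-the-envelope storage estimate for the Connectome image data
    voxels = 1e14              # total voxels quoted for the Connectome
    bytes_per_voxel = 1        # assumption: 8-bit greyscale microscopy
    total_tb = voxels * bytes_per_voxel / 1e12
    print(f"~{total_tb:.0f} TB of raw image data")   # ~100 TB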


4 comments

Comment 1

What I indicated was misleading was the Plan B chart and text, which in places indicate replacing all fossil fuels. It is clearer with the backup material, but the roughly 12-page summary was not clear.

In reality, supply chain issues do matter. Even if a damn-the-costs-and-resources approach were adopted, relative cost and resource use should still be a factor. In that case it does not make sense to focus only on wind, solar and geothermal. Nuclear energy already provides 20% of electricity. China is scaling up to 100 AP1000 reactors built or being built by 2020 (1.25-1.7 GW each). Plus they are making mass-producible, meltdown-proof high-temperature reactors.

How many people died from the French leak? Zero.

There is background-level radiation in the ocean, in water and throughout the environment.

I do not think that you are calculating the subsidies for wind power correctly.

Feed-in tariffs are multiple billions of dollars per year for wind and solar: http://nextbigfuture.com/2008/02/feed-in-tariffs-support-for-renewable.html

This site has looked at energy costs with externalities included, and the data does not support your claims: http://nextbigfuture.com/2008/01/energy-costs-with-externalities.html

Price-Anderson has cost the taxpayer nothing because no payouts have ever been triggered. Based on the calculations of when Price-Anderson would be triggered, the possible costs are not that high. There are no reactors without containment domes, so the high-cost Chernobyl case does not apply.

You complain about any non-private funding or guarantees for nuclear and then support a Plan B effort with massive public spending, a la a World War 2 mobilization. So is non-private funding bad? If so, then there should be no non-private funding for Plan B. If a larger public spending effort is in order, then the optimum spending plan should be considered: the plan with the fastest results and the fewest barriers to adoption in the countries that matter (China, India, the US, Europe, Canada, Japan, Brazil).

So which way is it? Big public spending for a new energy plan, or private-only funding? If it is a big public spend, then everything should be considered. If it is a new call for private-only funding, that is a change, because all energy sources get public money in every country.

Looking at materials used for construction as well: offshore wind involves a lot more effort and materials than onshore, and onshore in turn uses a lot more material than nuclear to generate the same power: http://nextbigfuture.com/2008/07/current-information-on-wind-power.html

The global wind energy report: http://www.gwec.net/fileadmin/documents/test2/gwec-08-update_FINAL.pdf

The average feed-in tariff over 20 years for turbines installed in 2007 ranged from 8.19 euro cent/kWh (‘initial tariff’) to 5.17 euro cent/kWh (‘basic tariff’). The initial tariff is reduced by 2% every year, so it will be 8.03 to 5.07 euro cent/kWh for turbines installed in 2008.

20 TWh at an average 2008 feed-in tariff rate of 6.5 euro cents/kWh is 1.3 billion euro.

By 2020, the overall German onshore capacity could be at 45,000 MW, assuming an optimal use of sites and no general height restrictions for turbines, with an additional 10,000 MW offshore. This would account for about 25% of German electricity consumption, or about 150 TWh/year.

Assume a 30% reduction in the feed-in tariff, to about 5 euro cents per kWh. That works out to 7.5 billion euro/year for the 150 TWh.

150 TWh is far less than the 806 TWh produced by thirty-year-old nuclear technology in the USA from 99 GW of nuclear capacity. 55 GW of nuclear would produce about 440 TWh, roughly three times the projected 2020 German wind output from a global wind group.
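
(Checking the arithmetic above in a few lines of Python; the US nuclear capacity factor is backed out from the quoted figures rather than taken from any outside source:)

    # Feed-in tariff cost of the projected German wind output
    print(150e9 * 0.05 / 1e9, "billion euro/year for 150 TWh at 5 cents/kWh")   # 7.5
    print(20e9 * 0.065 / 1e9, "billion euro for 20 TWh at 6.5 cents/kWh")       # 1.3

    # Capacity factor implied by 806 TWh from 99 GW of US nuclear
    cf = 806e3 / (99 * 8760)                     # GWh produced / GWh possible
    print(round(cf, 2))                          # ~0.93
    print(round(55 * 8760 * cf / 1e3), "TWh from 55 GW")   # ~448 TWh, roughly the 440 quoted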

The current situation is that air pollution and other pollutants from coal, oil and natural gas are killing millions of people every year (World Health Organization). Global warming is a potential issue, but addressing air pollution is a pressing concern for today. If the fear of global warming can be used to help address this real issue, then all the better. But those informed about energy and pollution issues should recognize that nuclear is far better placed and available to help reduce those deaths sooner, just as it already has by displacing coal and oil for 20% of electricity generation.

85% of energy across all uses now comes from fossil fuels; 8% comes from nuclear power. If nuclear power had not been used up to this point, that would have meant more coal and natural gas.

Comment 2

I could not disagree more with the title and content of this posting. A quick visit to earthpolicy.org allows you to download a blueprint of Plan B with references as well as access all data behind the calculations. For starters, the goal of Plan B is not to replace all fossil fuel use economy-wide, as this posting insinuates. The goal is to reduce net carbon emissions 80 percent by 2020. In achieving this goal, the power generation sector replaces all coal, oil and 70 percent of natural gas with renewable sources of energy such as wind, solar, geothermal, etc. Fossil fuel use in the industrial sector is maintained to build the renewable infrastructure.

Second, capacity factors are not ignored. The capacity factors applied are from the National Renewable Energy Laboratory’s Power Technologies Energy Data Book. The capacity factor used for wind is 36%. Thus, 3,000 GW of wind capacity will generate 9,461 TWh (34,059 PJ). A capacity factor of 36% in the year 2020 is well within reason since the next decade will see an explosion of offshore wind capacity, which is more reliable than onshore. Also, wind turbines being installed today have higher capacity factors than many wind turbines presently operating, due to better designs that enable the turbines to spin at slower wind speeds.
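
(As a check of that conversion, using nothing but the figures quoted: generation equals capacity times hours in a year times capacity factor.)

    # Wind generation implied by 3,000 GW of capacity at a 36% capacity factor
    capacity_gw, capacity_factor, hours = 3000, 0.36, 8760
    generation_twh = capacity_gw * hours * capacity_factor / 1e3
    print(round(generation_twh, 1), "TWh/year")   # 9460.8, i.e. the ~9,461 TWh cited
    print(round(generation_twh * 3.6), "PJ")      # ~34,059 PJ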

Thirdly, the mentality that supply chains and prices will thwart Plan B is definitely thinking inside the box. As explicitly explained, the entire economy will require retooling, much like the retooling of the economy for arms production during World War II. Realizing the Plan B economy will revitalize economies, creating jobs and lifting people out of poverty. While trillions of dollars over the next dozen years sounds like a lot, it is comparable to expected spending for fossil fuel development and military operations to protect oil supplies. With the introduction of the proposed Plan B carbon tax, the economy will efficiently shift away from fossil fuels and towards renewables. It is not a money issue. The money is already there. It is just being invested in the wrong areas.

Fourthly, the goal of Plan B to hold nuclear power generation at its current level is a sound strategy. Any cost accounting quickly exposes the exorbitant cost of nuclear power. The total amount of incentives handed out to the wind industry in its 25 year history in the United States is equal to a single year of incentives handed out to the nuclear industry during the first 40 years of nuclear development. Nuclear is also high risk and, thanks to the Price-Anderson Act, taxpayers must shoulder the burden in the event of a nuclear accident. And accidents still happen. Case in point is the nuclear leak that occurred in France last week. Besides, one of the only reasons that nuclear is competitive today in the United States is that many nuclear plants were sold to utilities at below cost. And if you think the private sector can develop nuclear power, think again. The only 100% privately-financed nuclear plant is being constructed in Finland and it is years behind schedule and billions over budget.

Lester Brown never said that achieving Plan B would be easy. When one takes the time to get the facts straight, it becomes clear that Plan B offers a solid strategy to stabilize climate and to move us towards a sustainable world economy.

Comment 3

The Wind Power numbers from that study are a fantasy.

The variability of wind power means its true utilization rates are below 20%, while most other power sources are above 90%. Add in the maintenance needs and other overhead, and it's just not competitive economically, or by any other measure, with baseload sources like coal and nukes.

Comment 4

Brian, thank you for this excellent post. Right now there is a whole lot of very bad thinking about energy. You have cut through some of the crap.