June 08, 2007

Venter creating and trying to patent minimal genome microbe

For the past few years, Craig Venter, the human genome pioneer, has been trying to build an organism from scratch. His company has a patent application which can be seen here.

They started the project by selecting a microbe, Mycoplasma genitalium, that has only 482 genes. They then introduced crippling mutations into each gene to figure out which ones are dispensable and which can't be done without. In January last year they reported that 382 were essential. In the patent, the number drops to 381. As the application explains, it would be theoretically possible to synthesize a 381-gene genome and plug it into a genome-free cell, and--voila--boot up a new organism. This artificial genome could be engineered so that it can easily accept other genes to carry out new functions--such as producing cheap hydrogen fuel.
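
The essentiality screen amounts to a simple loop: disable one gene at a time and ask whether the organism is still viable. A toy sketch in Python, where the viability test is a random stand-in for the lab assay (the real screen used transposon mutagenesis), tuned so roughly 100 of the 482 genes look dispensable:

```python
import random

random.seed(0)

# Toy model of a gene-essentiality screen: knock out each gene in turn
# and record whether the organism remains viable. The viability "assay"
# here is random and purely illustrative.
GENOME_SIZE = 482

def viable_without(gene_id):
    """Pretend lab assay: True means the organism survives the knockout."""
    return random.random() < 0.21  # ~100 of 482 genes dispensable

essential = [g for g in range(GENOME_SIZE) if not viable_without(g)]
print(f"{len(essential)} of {GENOME_SIZE} genes appear essential")
```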

There's no evidence in the patent that Venter has actually booted up a synthetic organism, but it's worth bearing in mind that the patent was filed in October and is only now coming available. So at this point, Venter is claiming a patent for something he has yet to build.

Wireless transmission of power

Researchers at MIT have created a revolutionary device that could remotely charge batteries and power household appliances.

The setup is straightforward, explains Andre Kurs, an MIT graduate student and the lead author of the paper. Two copper helices, with diameters of 60 centimeters, are separated from each other by a distance of about two meters. One is connected to a power source--effectively plugged into a wall--and the other is connected to a lightbulb waiting to be turned on. When the power from the wall is turned on, electricity from the first metal coil creates a magnetic field around that coil. The coil attached to the lightbulb picks up the magnetic field, which in turn creates a current within the second coil, turning on the bulb.

This type of energy transfer is similar to a well-known phenomenon called magnetic inductive coupling, used in power transformers. However, the MIT scheme is somewhat different because it's based on something called resonant coupling. Transformer coils can only transfer power when they are centimeters apart--any farther, and the magnetic fields don't affect each other in the same way. In order for the MIT researchers to achieve the range of two meters, explains Soljačić, they used coils that resonate at a frequency of 10 megahertz. When the electrical current flows through the first coil, it produces a 10-megahertz magnetic field; since the second coil resonates at this same frequency, it's able to pick up on the field, even from relatively far away. If the second coil resonated at a different frequency, the energy from the first coil would be ignored.

The researchers' approach, says Soljačić, also makes the energy transfer efficient. If they were to emit power from an antenna in the same way that information is wirelessly transmitted, most of the power would be wasted as it radiates away in all directions. Indeed, with the method used to transfer information, it would be difficult to send enough energy to be useful for powering gadgets. In contrast, the researchers use what's known as nonradiative energy that is bound up near the coils. In this first demonstration, they showed that the scheme can transfer power with an efficiency of 45 percent.
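
The role of resonance can be sketched with the standard coupled-resonator efficiency formula; the coupling coefficient and quality factors below are illustrative guesses, not MIT's measured coil parameters:

```python
import math

# Maximum link efficiency of two magnetically coupled resonators, in terms
# of the figure of merit U = k * sqrt(Q1 * Q2): k is the coupling
# coefficient (tiny at a 2 m range) and Q1, Q2 are the coils' quality
# factors. Parameter values below are illustrative only.
def max_efficiency(k, q1, q2):
    u = k * math.sqrt(q1 * q2)
    return u**2 / (1 + math.sqrt(1 + u**2))**2

# Weakly coupled but high-Q resonant coils still transfer power well:
print(f"high-Q resonant coils: {max_efficiency(k=0.002, q1=1000, q2=1000):.0%}")
# The same weak coupling with low-Q coils transfers almost nothing:
print(f"low-Q coils:           {max_efficiency(k=0.002, q1=10, q2=10):.3%}")
```

This is why mistuning the second coil kills the transfer: detuning collapses the effective quality factor, and with it the efficiency.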

Some more info is here on this 'WiTricity' demonstration.

Inhibiting a single protein lengthens fruit fly lifespan by a third

From physorg: fruit flies with a single blocked protein receptor saw their lives extended by a third, with no apparent side effects.

They also used a highly parallel, combinatorial method (trillions of candidate molecules tested) to find the effective binding molecule. This could become part of methodologies for intervening in and controlling human metabolism.

This method could have applications in gerontology, engineering and geriatrics.

“First, it demonstrates that a single inhibitor can dramatically alter lifespan, a very complex trait. It is remarkable that you can alter it with a single genetic change.

“We don’t really need to make fruit flies live longer, but if we understand how to do this, our approach may have direct application to higher organisms, such as ourselves.”

Secondly, Roberts said, the method used by his research group to make the inhibiting proteins “opens the possibility of developing a lot of new therapeutics [drugs].”

Receptors are proteins that transmit signals across a cell membrane. In the fruit fly, Roberts and his team manufactured short proteins that blocked a receptor involved in fruit fly aging, as previously demonstrated by co-author Seymour Benzer of Caltech.

The same blocking strategy should work in all such receptors, known as class B GPCRs (for G protein-coupled receptors). Many GPCRs figure prominently in disease as well as in normal development, Roberts said.

“It is the most targeted family of receptors” by drug manufacturers, Roberts said, estimating that a quarter of all pharmaceuticals focus on GPCRs.

“This approach should be generally applicable.”

And generally powerful, given that GPCRs are notoriously unstable and difficult to work with. The Roberts group got around the problem by cutting off the unstable part of the receptor and running experiments only on the part of the receptor that sticks out of the cell.

“Essentially, we developed a way to do PCR on proteins,” Roberts said.

The use of RNA-peptide fusions allowed the easy creation and multiplication of randomly generated peptides. Roberts termed this approach “Irrational Design.”

In the new study, Roberts and his group literally threw trillions of peptides at the receptor and saved the ones that stuck.

“We let the molecules themselves decide if they bind, rather than trying to design them rationally,” he said.

After multiple cycles, the researchers had a group of peptides that stuck to the receptor and not to any other protein.
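
The selection loop itself is simple to sketch. In this Python toy, the "binding assay" is an invented rule (tryptophan-rich peptides stick), standing in for the real receptor-binding step, and amplification is mutation-free for simplicity:

```python
import random

random.seed(1)

# Toy sketch of the selection loop: generate a huge random peptide library,
# keep what sticks to the target, amplify the survivors, repeat. The
# binding "assay" is an invented rule, not the real receptor chemistry.
ALPHABET = "ACDEFGHIKLMNPQRSTVWY"  # the 20 amino acids

def random_peptide(length=10):
    return "".join(random.choice(ALPHABET) for _ in range(length))

def binds_receptor(peptide):
    return peptide.count("W") >= 3  # invented stand-in for sticking

library = [random_peptide() for _ in range(100_000)]
for round_num in (1, 2, 3):
    binders = [p for p in library if binds_receptor(p)]
    print(f"round {round_num}: {len(binders)}/{len(library)} bind")
    # Amplify survivors back up to roughly the original library size.
    library = binders * max(1, len(library) // max(1, len(binders)))
```

Each round enriches the pool: a rare binder in round 1 dominates the library by round 2, which is the whole point of iterating.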

To reach its full potential, synthetic biology needs truly scalable design platforms

From eetimes, the last keynote speech at the Design Automation Conference (DAC) lived up to its billing of "design without borders." A tribute to the late A. Richard Newton, it focused on the application of manufacturing and design approaches developed for microelectronics to the emerging field of synthetic biology.

Some months before Newton's death, Rabaey noted, Newton gave a talk in which he declared that the "future is bio design automation (BDA)."

Rabaey quoted a definition of synthetic biology: "The creation of novel biological functions and tools by modifying or integrating well-characterized biological components into higher-order systems using mathematical modeling to direct the construction towards a desired end product." This, Rabaey said, sounds like "custom design."

What's needed to make synthetic biology successful, Rabaey said, are the same three elements that made microelectronics successful. These are a scalable, reliable manufacturing process; a scalable design methodology; and a clear understanding of a computational model. "This is not biology, this is not physics, this is hard core engineering," Rabaey said.

In electronics, photolithography provides a scalable, reliable manufacturing process for designs involving millions of elements. Biology has a long way to go. What's needed, Rabaey said, is a way to generate thousands of genes reliably in a very short time period with very few errors. The difference between what's available and what's needed is about a trillion to one.

In electronics, there's a design methodology that involves clear abstractions, standardized interfaces, a constrained design space, and availability of intellectual property (IP). The same requirements exist in biology, Rabaey said; designers need to build models, compress them for analysis, and synthesize into "substrates" such as E. coli bacteria. "The synergy with EDA is huge," he said.

Computational models allow designers to describe what's possible and interpret the capabilities of the system that's being engineered. While synthetic biology can use advanced modeling and model reduction techniques, "the lack of clear computational models worries me," Rabaey said.

Just as techniques borrowed from microelectronics can be applied to biology, the reverse is also true, Rabaey said. He showed an example of a "biological oscillator" in which a gene generates a protein that is used to switch the oscillator on and off. It uses E. coli bacteria as a substrate. Hooked up to a scope, it shows a typical oscillation pattern, although the period is in the hundreds of minutes.
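
A ring of genes, each repressing the next, is the classic way such an oscillator is built and simulated. Here is a minimal protein-only sketch in Python (repressilator-style; the parameters are illustrative, not taken from Rabaey's E. coli example):

```python
# Three genes in a ring, each repressing the next, integrated with simple
# Euler steps. Hill repression with illustrative parameters.
def simulate(steps=20000, dt=0.01, alpha=50.0, n=3.0):
    p = [1.0, 2.0, 3.0]          # protein levels for the three genes
    trace = []
    for _ in range(steps):
        nxt = []
        for i in range(3):
            repressor = p[(i - 1) % 3]               # previous gene represses this one
            production = alpha / (1.0 + repressor**n)
            nxt.append(p[i] + dt * (production - p[i]))  # unit decay rate
        p = nxt
        trace.append(p[0])
    return trace

trace = simulate()
mean = sum(trace) / len(trace)
# Sustained oscillation shows up as many crossings of the mean level.
crossings = sum(1 for a, b in zip(trace, trace[1:]) if (a - mean) * (b - mean) < 0)
print(f"{crossings} mean-crossings over the run")
```

An odd-length repression ring with a steep enough Hill response has no stable fixed point, so the protein levels cycle indefinitely, which is the scope trace Rabaey showed.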

But there are more practical ways in which approaches borrowed from the biological world can help overcome the challenges of Moore's Law, Rabaey said. For instance, consider the problem of timing synchronization — and crickets.

Today, he noted, IC designers use a crystal to distribute a clock signal to all the flip-flops on a die. It's expensive and difficult to implement. What if designers could instead employ a variety of cheap oscillating elements? Nature does that with the sound the crickets make, Rabaey said, resulting in "distributed synchronization using only local communications without precision timing elements."

Rabaey said that experimental work has demonstrated that the same principle can be applied to chip design, using simple analog and digital components with minimal power consumption.
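
The cricket idea corresponds to the Kuramoto model of coupled oscillators. A small sketch (using a mean-field coupling for brevity, where the appeal of the cricket scheme is that purely local communication suffices):

```python
import math
import random

random.seed(2)

# Kuramoto sketch of cricket-style synchronization: each cheap oscillator
# nudges its phase toward the group; no precision clock anywhere.
N, K, dt = 50, 2.0, 0.01
phases = [random.uniform(0, 2 * math.pi) for _ in range(N)]
freqs = [1.0 + random.gauss(0, 0.05) for _ in range(N)]   # mismatched cheap parts

def coherence(ph):
    """1.0 = perfectly synchronized, near 0 = incoherent."""
    re = sum(math.cos(p) for p in ph) / len(ph)
    im = sum(math.sin(p) for p in ph) / len(ph)
    return math.hypot(re, im)

for _ in range(5000):
    re = sum(math.cos(p) for p in phases) / N
    im = sum(math.sin(p) for p in phases) / N
    r, psi = math.hypot(re, im), math.atan2(im, re)
    phases = [p + dt * (w + K * r * math.sin(psi - p)) for p, w in zip(phases, freqs)]
print(f"coherence after coupling: {coherence(phases):.2f}")
```

Starting from random phases, the population locks to a common rhythm despite every component running slightly fast or slow, which is exactly the property a clock-distribution scheme would want.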

Nuclear proliferation has killed no one, fear of nuclear power has killed

Nuclear proliferation has not killed anyone. The only nuclear bombs used in war (Hiroshima, Nagasaki) were dropped by the USA. No country that has gotten them since has used them in a conflict. Proliferation has killed no one.

Fear of nuclear proliferation is used as a reason not to use more nuclear power. By not using nuclear power to help displace fossil fuel we have allowed more air pollution and water pollution to be created. This pollution causes 3-5 million deaths per year.


Air pollution (particulates, smog etc...) and water pollution (mercury, arsenic etc...) matters to me more than carbon emissions because they are clearly killing more now and for decades to come.

Those are all caused by fossil fuel usage. Air pollution kills more than 3 million each year.

Nuclear war - I know you are scared about it. Conventional war and atrocities killed 170 million in the 20th century. Nuclear war killed 214,000. Three days of firebombing Tokyo in WW2 killed 72,489. Operation Rolling Thunder in Vietnam dropped 500 times the amount of bombs as the firebombing of Tokyo.

A one-sided, all-out modern conventional war can be just as deadly as a nuclear war to the losing side. The big nuclear-power countries (US, Russia, China etc...) could firebomb and destroy the infrastructure of a target country: take out medical and emergency-response capability, roads, rail and bridges. Then poison food and water. Blockade and wait a couple of months. They could speed it up with biological weapons, which would be devastating to a place without medical infrastructure but manageable for a place with it.

Nuclear war is just a bit faster. We should stop letting fears of nuclear war cause us to continue letting 3-5 million people die each year when we could save them.

Stopping deaths now is more important than nightmare (bogeyman) scenarios which are no more likely or dangerous than other kinds of all-out conventional war.

Nuclear energy is only deadly when something goes very wrong. This has rarely occurred. Zero died at Three Mile Island. Chernobyl was an accident with a reactor design (no containment vessel) that is no longer in use. Fossil fuels are deadly all the time, even when things are going relatively right. Business-as-usual fossil fuels have been over 20 times more deadly than nuclear (weapons and power) have ever been, for every year including 1945.

From 2007 onwards, who are we worried about proliferating to? 40 countries now have nuclear material and know-how sufficient to make nuclear bombs.

It is taking Iran a long time to get nuclear weapons, and it took North Korea a long time to get theirs. Both have had the main know-how since A.Q. Khan passed it to them in the 1980s.

Plus there is deterrence. If Iran, North Korea or one of the minor nuclear powers used a weapon, by smuggling it in or otherwise, then the people in that country would die in the retaliation.

People dying from lung cancer, lung disease or heart disease caused by coal, or from a knife, gun or bomb in conventional war, are just as dead, and those deaths are far more common than deaths from nuclear weapons.

What are the real incremental risks from more nuclear reactors? We are mainly talking about more reactors in the USA, China, India, Russia, S. Korea and Japan - all places that either have nuclear weapons or can easily make them, and all places that already have nuclear reactors.

Air pollution, indoor and outdoor, kills 4.5 million people per year. Research published in 2005 suggests that 310,000 Europeans die from air pollution annually (World Health Organization stats). It is a preventable situation. Plus, if you add up the costs from sickness, death and lost productivity, then a lot of Medicare and other program costs are driven by coal pollution and fossil fuel pollution.

25% of disease is caused by preventable environmental causes (World Health Organization). Most of those environmental causes are fossil fuel usage.

People should be aware of the deaths that are happening now, each and every day and year. Less concern should be placed on how we might die in a war. If big wars start, then people will die by knife, gun, bomb and chemical. Nuclear could happen too, but the body count would not be much different than all-out conventional war.

So why should we not use nuclear as part of a major effort to save the millions who are actually dying each year, as insurance against global warming, and to get off of a dependence on a highly unstable region of the world?

Get your sense of proportion in line with reality. Get reality separated from potential fears. Get a correct risk analysis.

3-5 million dead from fossil fuel pollution - 100% this year and next year and the year after. Several times that number are also guaranteed to get sick. Each coal plant that is removed will save 50-1000 lives each year (depending upon how dirty the coal plant is) in the area of its pollution footprint.

Incremental risk of deaths from nuclear power? Compare to the status quo or the alternative with fossil fuel usage.
What is the actual chance of something happening?
How many might die in that scenario?
Incremental risk of proliferation?
Incremental chance of war? Is the chance of war actually reduced with nuclear weapons in the equation? If fossil fuels were out of the equation we would not have the motivation for oil wars. With different groups holding nuclear weapons, would wars be deterred?
How many more would die based on more nuclear weapons? Would any more die?
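
The comparison being argued can be put in expected-value terms. This sketch uses the post's own fossil-fuel figure; the probability and toll of a hypothetical nuclear incident are deliberately invented placeholders, not estimates from any study:

```python
# Expected deaths per year: certain fossil-pollution deaths versus a
# hypothetical nuclear incident with placeholder probability and toll.
fossil_deaths_per_year = 3_000_000        # low end of the post's 3-5 million range

incident_probability_per_year = 0.001     # placeholder assumption
incident_deaths = 1_000_000               # placeholder assumption

expected_nuclear_per_year = incident_probability_per_year * incident_deaths
print(f"fossil (certain):   {fossil_deaths_per_year:,}")
print(f"nuclear (expected): {expected_nuclear_per_year:,.0f}")
print(f"ratio: {fossil_deaths_per_year / expected_nuclear_per_year:,.0f}x")
```

Whatever placeholder values one plugs in, the structure of the argument is that a certain, ongoing toll should be weighed against a probability-weighted one.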

June 07, 2007

Towards cyborgs : brains in robot bodies

From the Register, Israeli boffins may be on the road to building artificial, living human brains which can function without a body to support them. Honest.

Ben-Jacob and his fellow boffins apparently mounted their artificially-cultured brain tissue on "a polymer panel studded with electrodes." (Won't be long before they start using full-size brains in jars of bubbling transparent fluid, we reckon.) The scientists then injected the hapless culture with "picrotoxin, a cocktail of gamma-aminobutyric acid (GABA)."

Apparently, "the cells on the electrode array came from the cortex, the outermost layer of the brain known for its role in memory formation," though it wasn't clear whose cortex or how they got the slime out of the donor's head.

The injection of picrotoxic gamma acid enabled the neurons, essentially, to start behaving persistently in an organised way - or to put it another way, BROUGHT A DEAD BRAIN TO LIFE.

Low power laptops 2009+

Power usage in a typical laptop is roughly evenly split between the hard drive, CPU, monitor and wifi.

By 2009, most new laptops will use solid state drives. This will reduce power usage to as little as one-tenth of what the hard drive would have used: a roughly 20% reduction in overall power usage.

There are bigger impacts from faster boot and restart times. Database and application performance also improves for applications that would have done a lot of paging to the hard drive.
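
The roughly 20% figure follows from the rough power split above. A quick check, taking the even four-way split literally:

```python
# Take the even four-way power split literally: hard drive 25% of the
# budget, and an SSD using one-tenth the drive power.
budget = {"hard drive": 0.25, "CPU": 0.25, "monitor": 0.25, "wifi": 0.25}

with_ssd = dict(budget)
with_ssd["hard drive"] = budget["hard drive"] / 10  # 10x drive-power reduction

savings = 1 - sum(with_ssd.values()) / sum(budget.values())
print(f"overall power savings: {savings:.1%}")  # ~22.5%, close to the article's 20%
```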

Totally remade laptops and handheld devices with lower power usage would need a new monitor, such as an OLED version, and redesigned CPU and wifi components.

OLED and advanced LED can get to 30 lumens/watt, about double the current standard, but very advanced versions can reach 200 lumens/watt or better.

Cameras can be made 50-100 times more efficient.

Life extension 2007: halfway from 1984 to 2030

Some dismiss the view that the world, and the technology impacting it, will be substantially different (or worse) in 2030 versus now. Even a thread on betterhumans has this discussion. The original poster is confused or purposely misinterpreting various predictions related to the Singularity and to life extension.


I believe that we will see substantial progress in many areas and in this article I will focus on life extension. I am very confident in the power of specific social and technology changes that I see developing or spreading over the next 23 years. I will review what has happened from 1984 to 2007 (the same amount of time between 2007 and 2030). I will review the state of life expectancy in other places in the world now. I will look at some "mainstream" work to improve life expectancy and health. I will discuss what I expect from SENS.

Many might look at the differences between 1984 and 2007 and think that there will only be similar differences between 2007 and 2030.

Changing life expectancy shows that life expectancy in the USA at birth in 1984 was 74.7 years overall: 78.4 for women and 71 for men. In 2007, it is estimated to be 78 years overall: 80.97 for women and 75.15 for men.

Some countries (Japan and Andorra) have a longevity advantage over the 2007 United States that is bigger than the advantage Americans living in 2007 have over Americans living in 1984.

Life expectancy at birth (years):
          Overall  Men    Women
USA 1984  74.7     71     78.4
USA 2007  78       75.15  80.97
Japan     82.02    78.67  85.56

4 to 5.5 years longer overall (for best countries versus USA 2007)
3.3 years overall (USA 2007 versus USA 1984)

3.4 to 5.5 years longer for men (best countries versus USA 2007)
4.1 years longer for men USA 2007 versus USA 1984

4.6 to 5.7 years longer for women in best countries versus USA 2007
2.6 years longer for women USA 2007 versus USA 1984 (80.97 versus 78.4)

A Harvard study shows that there are eight large demographic groups of Americans in terms of life expectancy. Asian Americans have life expectancies similar to what Japan has overall.

The primary cause of the disparities between racial and geographic groups is early death from chronic disease and injuries, an analysis of data from the Census Bureau and the National Center for Health Statistics showed.

Asian-American women living in Bergen County, NJ, enjoy the greatest life expectancy in the US, at 91 years. American Indians in South Dakota have the worst, at 58 years.

The differences were attributed to a combination of injuries and such preventable risk factors as smoking, alcohol, obesity, high blood pressure, elevated cholesterol, diet and physical inactivity -- particularly among people from 15 years to 59 years of age. They were not due to income, insurance, infant mortality, AIDS or violence, said the study's lead investigator, Christopher J.L. Murray, director of the Harvard Initiative for Global Health.

Seventh-day Adventists claim their lifestyle choices enable a 4-9 year increase in life expectancy for men and 2 to 7.5 years for women. Lifestyle and health advisors and academic studies also seem to indicate that lifestyle and diet choices can add 4-10 years to life expectancy. This is about 5-13%.

Life extension is discussed at wikipedia

I have discussed life extension in previous articles

I believe that by 2030 gene therapy will be widespread and fairly advanced. I base this on the roughly 1200 current clinical studies of gene therapy. This would help people get closer to the genetic advantages of the longest-lived groups now, and enable the mimicking of lifestyle longevity benefits using pills or other treatments.

Substantial progress against major pathological killers such as heart disease, cancer, diabetes and other diseases is occurring.

I believe that the Strategies for Engineered Negligible Senescence (SENS) is a good plan. SENS could contribute to a far greater increase in life expectancy. However, SENS success depends both on successful science and development and on the funding that it receives.

For public health, we would need to clean up environmental air and water pollution using cleaner energy sources (solar, wind, geothermal, hydroelectric, nuclear etc...). I see this being achieved with the nuclear resurgence and other energy trends. We would also need to reduce traffic deaths. I see this being achieved with robotic cars and advanced collision avoidance systems.

Achieving three times or more progress in longevity from 2007 to 2030 versus 1984 to 2007 seems very achievable. This will come from public health improvements, disease cures or treatments, lifestyle improvements (from behavior or with medical assistance) and success from direct progress against the processes of aging. This would mean going from a life expectancy increase of 0.1 to 0.2 years each year to 0.5 years each year. For individuals, one needs to look at the progress being made for the age that you are. If you are 65 years of age in 2030, then what would affect you personally is the year-after-year improvement in life expectancy for those aged 65. Progress is also being made on that front.
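
The projection is easy to run out. Using the 2007 US figure from above and the annual-gain rates mentioned:

```python
# Life expectancy projection to 2030 at the historical 0.1-0.2 years of
# gain per calendar year versus the projected 0.5 years per year.
base_2007 = 78.0       # US life expectancy at birth in 2007 (from above)
years_ahead = 23       # 2007 to 2030

def project(rate_per_year):
    return base_2007 + rate_per_year * years_ahead

for rate in (0.1, 0.2, 0.5):
    print(f"{rate} yr/yr -> about {project(rate):.1f} years in 2030")
```

At the historical rates this gives roughly 80-83 years in 2030; at 0.5 years per year it gives roughly 89-90, which is the gap the article is arguing over.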

Initial SENS success would go beyond what is described here to increase the maximum lifespan: going from life expectancies of 90-95 years up to 120-125 years, and to maximum lifespans of 150 years, with initial treatments. Continued progress would come from further advances such as nanomedicine.

The future can arrive earlier for you if you make the lifestyle adjustments now. You can give yourself a very good chance to live to 90, and the possibility of 100+, with lifestyle changes and proactive medical tests and treatments. For the really big gains, help by donating to the SENS project.

Further reading:
Drug protects against Diabetes and Atherosclerosis in mice

Half of recent gains against heart disease in the US are from lifestyle improvements and half from medical treatments

June 06, 2007

Small 40% efficient heat - sound - electricity converter

From the New Scientist, new ways of turning heat into sound waves - and then into electricity - may be the next step toward a practical new source of alternative energy.

40% of waste heat from power plants could be converted to electricity if this works and can be scaled up and installed on power plants. Wherever the heat is not already being used, as it is in cogeneration systems, extra electricity could be generated.

A team of University of Utah researchers plan to show they’ve succeeded in miniaturising and optimising [acoustic heat engines], which then turn the sound into usable electricity.

If true, the advance could open the door to super-efficient power plants, cars, and computers, as well as a new generation of solar cells.

To improve their prospects, Orest Symko and his team built smaller engines ranging from 11 to 18 centimeters long. At 40% efficient, the engines rival gasoline and diesel engines at energy conversion.
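
For context, 40% is well under the thermodynamic ceiling for realistic waste-heat temperatures. A quick Carnot check (the temperatures below are illustrative, not the Utah team's operating conditions):

```python
# Carnot limit for any heat engine between hot and cold reservoirs:
# efficiency <= 1 - Tc/Th (temperatures in kelvin).
def carnot_efficiency(t_hot_k, t_cold_k):
    return 1.0 - t_cold_k / t_hot_k

# Illustrative waste-heat case: 500 C exhaust against 25 C ambient.
limit = carnot_efficiency(773.15, 298.15)
print(f"Carnot limit: {limit:.0%}")  # ~61%, so 40% is below the ceiling
```

So the 40% claim is physically permissible at high source temperatures; the skepticism quoted below is about achieving it in practice, not about the thermodynamics.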

The team’s discoveries have also raised some eyebrows, however. "I realise anything to do with energy is really important these days," says Scott Backhaus, who studies thermoacoustics at the Los Alamos National Laboratory. "But we’re working on some applications for diesel engines, and I can tell you we’re not getting anywhere near 40% efficiency. I’m sceptical."

The Utah researchers have also built the smallest known acoustic heat engines, which at 1.8 millimeters long could produce 1 Watt of electricity per cubic centimeter when clustered together. Symko speculates that the clusters could be used as the 'cells' in a new type of solar panel.

He plans to test the devices within a year to produce electricity from waste heat at a military radar facility.

“It looks very promising, but at this point there is still much work to be done. We’re still working on an array,” he says, adding that he hopes to begin mass-production of miniature engines within the next year.

If all goes well, they could be installed on natural gas and coal-fired power plants shortly thereafter. The team will present their research on Friday at the annual meeting of the Acoustical Society of America in Salt Lake City, Utah.

Towards achieving the potential of carbon nanotubes for electricity

Carbon nanotubes can carry on the order of 1000 times the current density of copper

From physorg, researchers at Rensselaer Polytechnic Institute have developed a new method of compacting carbon nanotubes into dense bundles. These tightly packed bundles are efficient conductors and could one day replace copper as the primary interconnects used on computer chips and even hasten the transition to next-generation 3-D stacked chips.

A carbon nanotube bundle before (left) and after (right) densification. Credit: Rensselaer/Liu

The process boosts the density of these carbon nanotube bundles by five to 25 times. The higher the density, the better they can conduct electricity, Lu said. Several factors, including nanotube height, diameter, and spacing, affect the resulting density, Liu added. How the nanotubes are grown is also an important factor that impacts the resulting shape of the densified bundles.

Despite his initial successes, Lu said the density results obtained are not ideal and carbon nanotubes would have to be further compacted before they can outperform copper as a conductor. A close-up photo, taken using a scanning electron microscope, reveals there are still large empty spaces between densified nanotubes. The research team is exploring various methods to achieve ever-higher density and higher quality of carbon nanotube bundles, he said.

Lu is confident that these densified carbon nanotubes, with their high conductivity, ability to carry high current density, and resistance to electromigration, will be key to the development of 3-D computer chips. Chips used today can only shrink so much smaller, as their flat surface must have enough room to accommodate scores of different components. But the semiconductor industry and academia are looking at ways to layer chip components into a vertical stack, which could dramatically shrink the size of the overall chip.

Densified carbon nanotubes, with their ends trimmed and polished, can be the basic building blocks for interconnects that would link the stacked layers of a 3-D computer chip, Lu said.

“Carbon nanotubes are one of the most promising materials for interconnects in 3-D integration,” he said. Other potential applications of the densified nanotubes are high surface area electrodes for supercapacitors, fuel cell electrodes for hydrogen storage, heat dissipation materials for thermal conductors, and other situations that require high electrical, thermal, or mechanical performance.

The cost of carbon nanotubes would also need to be brought down thousands of times before this could be used for many bulk electrical applications.

Simple self replicating robots: 3D swiveling tetris cubes

A simple method for robots that can self-replicate is proposed: it is basically 3D cubes that are deposited like blocks in Tetris. The blocks can swivel.

Sequences from the engineers’ hand-designed robotic self-reproduction. New Molecubes are deposited from the top, and each cube has the ability to swivel. Image credit: Zykov, et al. ©IEEE 2007.

The set-up of the manual process resembles a 3D Tetris game, where a dispenser drops cubic modules, called “Molecubes,” to a machine consisting of a line of connected cubes that can pick up, hold, and drop other cubes, as well as swivel to place a cube in a different location. Ultimately, the original machine can be programmed to build a copy of itself, which in turn can copy itself, ideally through several generations.

Each Molecube is identical: a 10-cm, 625-gram cube that can swivel along its long diagonal axis. On two of the cube's sides, connectors can attach to other cubes' connectors. A servo motor within each cube drives the swiveling mechanism.

The goal of the second part of the project was to find out whether or not it is possible to artificially evolve simple modular 2D self-reproducing machines with 2D Molecubes. First, the genetic algorithm searches for a specific shape of a 2D machine. The desired shape should enable the robot to pick up available modules around itself, swivel to a target location, and, by sequentially dropping the modules off one by one in the right places, form an exact 2D replica of itself.
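
A genetic algorithm of this kind can be sketched in a few lines. The fitness function below (how far a chain of moves reaches from the origin) is invented for illustration; the real work scored candidate shapes on their ability to self-reproduce in simulation:

```python
import random

random.seed(3)

# Toy genetic algorithm: evolve an 8-move module layout scored by an
# invented fitness. Selection keeps the fittest shapes; children are
# mutated copies of random parents.
SHAPE_LEN = 8

def random_shape():
    return [random.choice("UDLR") for _ in range(SHAPE_LEN)]

def fitness(shape):
    x = shape.count("R") - shape.count("L")
    y = shape.count("U") - shape.count("D")
    return abs(x) + abs(y)   # invented proxy: reach from the origin

population = [random_shape() for _ in range(40)]
for _ in range(30):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                    # keep the fittest shapes
    children = [
        [g if random.random() > 0.1 else random.choice("UDLR")  # 10% mutation
         for g in random.choice(parents)]
        for _ in range(30)
    ]
    population = parents + children
best = max(population, key=fitness)
print(f"best fitness: {fitness(best)} of a possible {SHAPE_LEN}")
```

The search loop (score, select, mutate, repeat) is the same whether the genome encodes grid moves, as here, or Molecube arrangements.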

Life Extension Engineering

An approximately hour-long presentation by Aubrey de Grey is available. Thanks to Michael Anissimov at Accelerating Future and Fight Aging for pointing it out.

I have seen it, and a prior presentation made by Aubrey de Grey a few years ago at a Senior Associate Gathering of the Foresight Institute. Aubrey is making a slightly refined version of that presentation.

He makes the basic case for taking an engineering approach to the accumulating damage of metabolism. This is also clearly presented at the sens.org site. However, hearing Aubrey describe it, and the chain of reasoning from that point through the entire life extension plan, could convince many who believe in more incremental approaches.

Aubrey goes into an updated discussion of life extension escape velocity. He and Chris Phoenix have performed a computational analysis of how periodic improvements in our ability to reduce some of the accumulated damage from metabolism would translate into life extension. Any of the damage could eventually kill you, but we are able to tolerate a lot of damage. They show that being able to fix some of the damage, and getting better at fixing it, translates into a percentage of people who will live a very long time. The age someone has reached when the first major damage-reduction treatment arrives determines their chance of living a long time: say 10% for someone who is 80, 50% for someone who is 70, and so on.
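
The escape-velocity argument can be sketched as a toy simulation: damage accumulates yearly and kills above a threshold, while periodic therapies remove a growing fraction of it. All numbers here are illustrative, not de Grey and Phoenix's actual model:

```python
# Damage accumulates one unit per year of life and kills above a threshold;
# every decade a therapy removes a fraction of it, and the therapies
# improve over time. Parameters are purely illustrative.
def survives_to(age_at_first_therapy, horizon=120):
    damage = float(age_at_first_therapy)
    lethal_threshold = 89.0
    repair_fraction = 0.3                     # first-generation therapies
    for year in range(age_at_first_therapy, horizon):
        damage += 1.0
        if year % 10 == 0:                    # a therapy round each decade
            damage *= 1.0 - repair_fraction
            repair_fraction = min(0.9, repair_fraction + 0.1)  # therapies improve
        if damage >= lethal_threshold:
            return False
    return True

for age in (60, 70, 80, 85):
    print(f"first treated at {age}: reaches 120? {survives_to(age)}")
```

The qualitative result matches the argument: those young enough to reach the first round of therapies keep outrunning the damage as treatments improve, while those slightly older never make it to the first round.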

Aubrey then talks about the details of fixing one type of damage: the removal of lysosomal aggregates.

The project to fix this damage is LysoSENS.

Not discussed was the MitoSENS project.

There are ten other similar projects that are ready to be started when the research funds are available. You can donate to SENS research here, as well as to the Mprize, and donate air miles to help cover travel.

June 05, 2007

Geoengineering to counter climate change

A solar shield that reflects some of the Sun's radiation back into space would cool the climate within a decade and could be a quick-fix solution to climate change, researchers say.

With a solar shield, temperatures would be roughly the same as in 1900 (c), but precipitation would drop (d). Without the shield, temperatures would rise dramatically (a), and precipitation would increase in some regions and drop in others (b) (Image: PNAS/Caldeira/Matthews)

Solar shields are not a new idea - such "geoengineering" schemes to artificially cool the Earth's climate are receiving growing interest, and include proposals to inject reflective aerosols into the stratosphere, deploy space-based solar reflectors and carry out large-scale cloud seeding.

The shields are inspired by the cooling effects of large volcanic eruptions that blast sulphate particles into the stratosphere. There, the particles reflect part of the Sun's radiation back into space, reducing the amount of heat that reaches the atmosphere, and so dampening the greenhouse effect.

Ken Caldeira at the Carnegie Institution of Washington, in California, US, and Damon Matthews at Concordia University, Canada, used computer models to simulate the effects that a solar shield would have on the Earth's climate if greenhouse gas emissions continued to rise along a "business as usual" scenario.

"We have been trying to pinpoint the one really bad thing that argues against geoengineering the climate," says Caldeira. "But it is really hard to find."

His computer models simulated a gradually deployed shield that would compensate for the greenhouse effect of rising carbon dioxide concentrations. By the time CO2 levels are double those of pre-industrial times - predicted to be at the end of the 21st century - the shield would need to block 8% of the Sun's radiation.

The researchers found that a sulphur shield could act very quickly, lowering temperatures to around early 20th-century levels within a decade of being deployed.

"The trouble is, the decadal timescale works both ways," says Caldeira. A sulphate shield would need to be continuously replenished, and the models show that failing to do so would mean the Earth's climate would suddenly be hit with the full warming effect of the CO2 that had accumulated in the meantime.

And the ease with which they could work is also risky, he says: "These schemes are almost too cheap and easy. Just one fire hose spraying sulphur dioxide into the atmosphere would do the job for a century. That would cost about $100 million - nothing in comparison to the hundreds of billions it would take to transform our energy supply."

A previous article on other climate modification proposals

Caldeira has also looked at Gregory Benford's proposal

Benford has a proposal that possesses the advantages of being both one of the simplest planet-cooling technologies so far suggested and being initially testable in a local context. He suggests suspension of tiny, harmless particles (sized at one-third of a micron) at about 80,000 feet up in the stratosphere. These particles could be composed of diatomaceous earth. "That's silicon dioxide, which is chemically inert, cheap as earth, and readily crushable to the size we want," Benford says. This could initially be tested, he says, over the Arctic, where warming is already considerable and where few human beings live. Arctic atmospheric circulation patterns would mostly confine the deployed particles around the North Pole. An initial experiment could occur north of 70 degrees latitude, over the Arctic Sea and outside national boundaries. "The fact that such an experiment is reversible is just as important as the fact that it's regional," says Benford.

Is Benford's proposal realistic? According to Ken Caldeira, a leading climate scientist at Stanford University and the Carnegie Institution's Department of Global Ecology, "It appears as if any small particle would do the trick in the necessary quantities. I've done a number of computer simulations of what the climate response would be of reflecting sunlight, and all of them indicate that it would work quite well." He adds, "I wouldn't look to these geoengineering schemes as part of normal policy response, but if bad things start to happen quickly, then people will demand something be done quickly."

June 04, 2007

Increasing nuclear power in the past and in the future

I am all for more wind power. We should build more of it, and more of every power source except coal. Here I was correcting the claims of some people on oildrum.com and other sites who say that only 4 reactors per year are possible, or that it will take a long time to ramp up reactor production in the USA.

If we look at the list of nuclear plants in the USA and when they were completed, we can see that 12 nuclear plants were completed in 1974, 10 in 1973, and 8 in 1972. There were years in the eighties with 8 completions. Before 1968 only small reactors were built: only two had over 400MW, and most were less than 100MW. 1969, 1970 and 1971 had 3-4 completions each, and then 1972 had its 8 reactors. So from a relative standing start, the scale-up was rapid to the peak of 12 per year of the last build cycle. We are in a better position now because the US has rebuilt a nuclear plant and is switching on Browns Ferry 1 this year.

The nuclear industry is a global industry. So the experience developed from the 30 nuclear plants that are being completed now globally by Westinghouse, Areva, GE and other global firms will mostly be transferable to the US build-up. We can fly in some of the project managers, lead foremen, key engineers, workers, etc.


Besides building more nuclear reactors, it has been possible to increase operational performance.

Consider the rate of increase in energy delivered as electrical power per year over the period between 1993 and 2005 (12 years). The units of this calculation are thousand megawatt-hours per year.

Nnadir at dailykos compared the increase in power per year from 1993 to 2005 of non-fossil-fuel sources in the United States:

Wood (biomass): +96 thousand megawatt-hours per year.
Waste: -259 thousand megawatt-hours per year.
Geothermal: -190 thousand megawatt-hours per year.
Solar (usually everybody's favorite): +8 thousand megawatt-hours per year.
Wind (another favorite): +1,345 thousand megawatt-hours per year.

Overall, renewable energy in the United States has increased at a rate of 1,000 thousand megawatt-hours per year. The nuclear figure is 16,203 thousand megawatt-hours per year, even without building a new plant.
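The overall renewable figure is just the sum of the per-source rates above, which is easy to sanity-check using the numbers as listed:

```python
# Annual growth rates, 1993-2005, in thousand megawatt-hours per year,
# for US non-fossil sources as listed above.
rates = {
    "wood (biomass)": 96,
    "waste": -259,
    "geothermal": -190,
    "solar": 8,
    "wind": 1345,
}

total_renewable = sum(rates.values())
print(total_renewable)  # 1000, matching the overall renewable rate cited
```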

Where did all this energy come from if no new plants were built? Mostly improved operations. Nuclear reactors already run at about a 90% capacity factor, so the 782,000 thousand megawatt-hours per year could theoretically be increased by only another 75,000 thousand megawatt-hours per year without more reactors or without up-rating their power.


Recent work from MIT indicates that existing nuclear plants could be modified to safely generate 50% more energy.

This can be done by changing the shape of the fuel from solid rods to hollow cylinders and by adding nanoparticles to the water. A power uprating application takes about 18 to 24 months to be processed.

So applying the MIT work over the next 10 years would add about 390 billion kWh per year in the USA, even without new plants.
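That figure follows from applying the 50% uprate to the current US nuclear output cited above. A back-of-envelope check, taking the 782,000 thousand megawatt-hour baseline quoted earlier in this post:

```python
baseline_thousand_mwh = 782_000   # current US nuclear output, thousand MWh/year
uprate_fraction = 0.5             # MIT: existing plants could generate 50% more

added_thousand_mwh = baseline_thousand_mwh * uprate_fraction
# 1 thousand MWh = 1 million kWh, so dividing by 1,000 gives billion kWh
added_billion_kwh = added_thousand_mwh / 1000
print(added_billion_kwh)  # 391.0 billion kWh/year, i.e. roughly the 390 cited
```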

Update DNA sequencing costs

The NY Times reports that the cost of sequencing the human genome is down below $100,000, and that the 1% of the genome considered the most relevant can be sequenced for $1,000.

By the end of the summer [2007], Dr. Church’s research project promises to deliver sequences to its first 10 volunteers. Unlike Dr. Watson, whose complete genome cost $1 million, the project’s volunteers will receive the one percent of their genome currently deemed most useful at a cost of $1,000.

One start-up company, 23andme, recently announced plans to provide affordable chunks of their DNA to individual consumers, along with tools to help them keep track of and understand their genetic information.

And technology companies like Illumina, Applied Biosystems and 454 Life Sciences, which solicited Dr. Watson’s DNA to prove its abilities, say the price of a complete human genome has already dropped to $100,000. They are competing for a $10 million “X prize” to sequence 100 human genomes within 10 days.
