November 24, 2007

Economist magazine wants more spent to verify safety of nanoparticles

The Economist has a lengthy article suggesting that more money and effort should be spent to verify the safety of nanoparticles.

America already spends the most on EHS (environmental health and safety) research into nanotechnology; depending on who does the counting, the figure ranges from $11m to $60m.

Safety legislation cannot be expected to work until the products of the technology are better understood. What does it mean to regulate nanotechnology materials when you cannot even measure their release into the environment or agree on how to weigh a nanoparticle?

Had Dr Maynard's bag split wide open in Congress, scattering his carbon nanotubes into the air, would any harm have been caused? Probably not. But, as an answer, “probably” is not good enough.

Physorg reports that a survey in Nature Nanotechnology of 363 leading U.S. nanotechnology scientists and engineers shows that they are more concerned about the health and safety of nanoparticles than the general public is.

Nanowerk has a spotlight on the issue of nanotechnology safety.

Scientists are more concerned about pollution and health, but less concerned about jobs, loss of privacy, a nanotech arms race, terrorist use of nanotech, and self-replicating robots.

Superconductors plug the 'terahertz gap'

Ulrich Welp at Argonne National Laboratory and colleagues have discovered a simple way to synchronize the phase of the "intrinsic" Josephson junctions found in high-temperature superconductors so that they emit milliwatts of power at frequencies up to 0.85 THz.

Schematic of the terahertz-source, which was fabricated on the top of an atomically layered superconducting crystal. The applied current excites the fundamental cavity mode (solid half-wave) on the width w of the mesa, and high-frequency electromagnetic radiation is emitted from the side faces (red waves).

Unlike far more energetic X-rays, T-rays do not have sufficient energy to "ionize" an atom by knocking loose one of its electrons. This ionization causes the cellular damage that can lead to radiation sickness or cancer. Since T-rays are non-ionizing radiation, like radio waves or visible light, people exposed to terahertz radiation will suffer no ill effects. Furthermore, although terahertz radiation does not penetrate through metals and water, it does penetrate through many common materials, such as leather, fabric, cardboard and paper.
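The non-ionizing claim is easy to check with the photon-energy relation E = hf: a terahertz photon carries only a few thousandths of an electronvolt, while ionizing an atom takes several electronvolts or more. A quick back-of-the-envelope sketch (the frequencies are the ones discussed here):

```python
# Photon energy E = h*f for terahertz radiation, compared with the
# electronvolt-scale energies needed to ionize atoms.
PLANCK_H = 6.626e-34   # Planck constant, J*s
EV = 1.602e-19         # one electronvolt, in joules

def photon_energy_ev(freq_hz):
    """Energy of a single photon at the given frequency, in eV."""
    return PLANCK_H * freq_hz / EV

for f_thz in (0.5, 0.85, 2.0):
    print(f"{f_thz} THz -> {photon_energy_ev(f_thz * 1e12) * 1000:.2f} meV")
# Even at 2 THz a photon carries ~8 meV, over a thousand times less than
# the ~13.6 eV needed to ionize hydrogen.
```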

These qualities make terahertz devices one of the most promising new technologies for airport and national security. Unlike today's metal or X-ray detectors, which can identify only a few obviously dangerous materials, checkpoints that look instead at T-ray absorption patterns could not only detect but also identify a much wider variety of hazardous or illegal substances.

T-rays can also penetrate the human body by almost half a centimeter, and they have already begun to enable doctors to better detect and treat certain types of cancers, especially those of the skin and breast, Welp said. Dentists could also use T-rays to image their patients' teeth.

There had been a "terahertz gap" between about 0.5 and 2 terahertz that no compact device had been able to fill.

Being able to produce a beam of T-rays in that range could revolutionize airport security and medical scans. Welp and colleagues persuaded hundreds of normally independent quantum junctions to work together, creating a beam of laser-like terahertz light with 10,000 times more power than previous attempts (about half a microwatt).

The team used a high-temperature superconductor called BSCCO, which naturally contains stacks of Josephson junctions in its structure. It consists of superconducting sheets, a couple of atoms thick, separated by 1.5-nanometer insulating gaps.

"We were able to pack in a huge number of Josephson junctions" in each crystal, Welp says. In a strip of the material about one micron tall, 100 microns wide, and 300 microns long, they fitted in more than 600 junctions.
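Those two figures are mutually consistent: with roughly one intrinsic junction per crystal repeat unit (a superconducting sheet plus its ~1.5 nm insulating gap), a mesa about one micron tall stacks several hundred junctions. A quick check:

```python
# Rough junction count in a BSCCO mesa: about one intrinsic Josephson
# junction per c-axis repeat unit (sheet + ~1.5 nm insulating gap).
mesa_height_m = 1e-6        # strip about one micron tall
junction_period_m = 1.5e-9  # insulating-gap spacing quoted in the article

n_junctions = mesa_height_m / junction_period_m
print(f"~{n_junctions:.0f} junctions")  # consistent with "more than 600"
```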

"If the power output were boosted up to 1 to 10 milliwatts, it would be a very promising niche device", complementing other devices that create terahertz radiation at other frequencies, Yergens says.

The frequencies covered by the new device are some of the more useful for imaging. "You have to be slightly below one terahertz to take full advantage of such radiation," he adds.

November 23, 2007

Carnival of Space week 30

Carnival of Space 30 is up at Bad Astronomy. Featured: the latest methods to find extrasolar planets, fusion rockets if Bussard fusion works, more astronomy of planets and comets, new space finance, and Obama's position on space. Obama would cut space programs and use the funds for education.

My contribution was the view of space travel if the Bussard IEC fusion project is successful.

Mini black holes are discussed

Centauri Dreams forecasts that by 2015 we will have discovered many more extrasolar planets (with planetary transit techniques, but also robotics and better telescopes) and obtained spectrographic analyses of any atmospheres. Spectrographic analysis uses the light from an object to tell us what elements it is made of.

Neutron scatter Camera detects nuclear bombs at a distance

Neutron cameras "can detect, unambiguously, at a greater distance, and through more shielding," said Jim Lund, manager of the Rad/Nuc Detection Systems group at Sandia National Laboratories in Alameda, Calif.

The neutron scatter camera has an advantage over traditional neutron detection because it can differentiate low-energy neutrons from high-energy neutrons.

While some gamma rays can be blocked from detectors, neutrons are much more difficult to conceal. In a lab test, the camera easily detected and imaged a source placed across the hallway, through several walls and cabinets. The camera has the potential to reduce false alarm rates — a critical issue for in-transit radiation detection.

If brought within a few feet of nuclear material, current gamma-ray detectors can see through shielding, but there are too many ships and containers to scan them all up close.

A neutron scatter camera works by arraying two orthogonal detectors so that any incident neutrons can be traced to a specific trajectory in three-dimensional space--thus identifying the direction from which they came, as well as their energy level. Low-energy background neutrons, from cosmic rays and elsewhere, are ignored, while the high-energy neutrons typical of radioactive materials are imaged, albeit not in real time. Once an image is processed--taking several minutes--it reveals all the nuclear hot-spots within its field of view and at a distance (the exact distance is classified).

The extreme sensitivity of a neutron-scatter camera comes at a price, however: the liquid scintillator detectors used by the neutron camera are too bulky to be handheld, are flammable, are hazardous to humans, and require special handling and disposal procedures. A neutron-scatter camera could, however, be mounted on the back of a truck for mobile duty at a seaport, or on the deck of a Coast Guard vessel that scans incoming cargo ships.

The detectors are housed in a proton-rich liquid-filled scintillator, which fluoresces when struck by neutrons. The protons serve as the bumpers off which the neutrons bounce, scattering about (thus, "neutron scatter") like billiard balls. The impact nudges the protons to a higher energy level, but when they fall back to normal they shed a photon to get rid of the extra energy. Photomultiplier tubes are coupled to the scintillator to detect the visible light photons. Software analyzes the output from the photomultiplier and constructs a visual image that identifies the nuclear hot spots.
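The reconstruction can be sketched with non-relativistic neutron-proton scattering kinematics: because neutron and proton masses are nearly equal, a neutron arriving with energy E0 leaves energy E1 with the first recoil proton and continues with E2 = E0·cos²θ, so measuring E1 (scintillation light) and E2 (time-of-flight to the second detector plane) gives both the incident energy and the scattering-cone angle. The sketch below is my own simplified illustration of that arithmetic, not Sandia's analysis software, and the example numbers are invented:

```python
import math

NEUTRON_MASS_KG = 1.675e-27
MEV = 1.602e-13  # one MeV, in joules

def reconstruct(e1_mev, flight_m, tof_s):
    """Incident neutron energy (MeV) and scattering angle (degrees) from
    the first proton-recoil energy plus time-of-flight between planes."""
    v = flight_m / tof_s                          # scattered neutron speed
    e2_mev = 0.5 * NEUTRON_MASS_KG * v**2 / MEV   # classical kinetic energy
    e0_mev = e1_mev + e2_mev                      # elastic n-p: E0 = E1 + E2
    theta = math.degrees(math.acos(math.sqrt(e2_mev / e0_mev)))
    return e0_mev, theta

# Invented example: 1 MeV recoil, 0.3 m between planes crossed in ~21.6 ns
e0, theta = reconstruct(1.0, 0.3, 21.6e-9)
print(f"incident energy ~{e0:.2f} MeV, cone angle ~{theta:.0f} degrees")
```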

Next, the research group is going to calibrate its detectors by deploying several in the environments in which they will be used--one is already at sea, and several more will be placed around New Mexico and California. The normal background neutrons in these locations will let the units be calibrated to prevent future false alarms. After calibration, they will be tested with real concealed nuclear materials.

The researchers also say that solid scintillator materials could be engineered, but that effort will have to follow successful completion of the current calibration and testing regime.

A neutron-scatter camera and a gamma-ray detector need to be used together.

Current and future port security

DIY nuclear weapon detectors patrol SF Bay

Gamma ray telescope may be used to detect plutonium

New Sensor Technology Detects Chemical, Biological, Nuclear And Explosive Materials at a distance

Superconductors used to distinguish nuclear material

Surveillance blimp

High resolution Lidar

More technical details on D-Wave Systems' quantum computing

D-Wave senior scientist and condensed-matter physicist Mohammad Amin gave a highly technical presentation at MIT (54 slides in this PowerPoint presentation). D-Wave recently demonstrated a 28-qubit computer and predicts it will have 512-qubit and 1024-qubit quantum computer systems in 2008. If D-Wave is successful, then in 2009 it will begin to greatly accelerate the development of molecular nanotechnology, which needs better molecular modeling.

Adiabatic quantum computer (AQC) required conditions

AQC theory from 2001 requires an energy gap

AQC theory predicts energy levels

Experimental measurements show energy levels consistent with quantum noise

Experimental measurements fit the theory

What the different regimes of quantum effects, mixed effects and classical effects would be

Interpreting several measurements

The experimental results indicate that D-Wave is looking at quantum results

The point of view of D-Wave skeptic Scott Aaronson

Amin and Berkley maintained that their 16-qubit device was indeed a quantum computer and their evidence was that simulations of its behavior that took quantum mechanics into account gave, they said, a better fit to the data than simulations that didn’t. On the other hand, they said they were not able to test directly for the presence of any quantum effect such as entanglement. (They agreed that entanglement was a non-negotiable requirement for quantum computing.)

D-Wave CTO Geordie Rose replies in the comments

Finally, the variety of demos we've run (including sudoku, image matching, etc.) are not "crap". They use a novel hybrid approach to integrating QCs into classical solvers. In hindsight it is pretty obvious that to make any QC useful it needs to be integrated with the best known classical techniques, regardless of what quantum algorithm it's embodying. And while I've said this 10^87 times I'll say it again: what we're doing is explicitly heuristic and has no global optimality guarantees. While you can use the system we're building on decision problems, it is natively an optimization solver for quadratic unconstrained binary optimization problems.

From another commenter:
How then does Dwave “solve” the image feature matching problem using just 28 bits, for images that are large and have many features (such as those that Dwave used in their SC demo)? Apparently they “cheat” and break the overall problem into many small maximum common subgraph problems (of a size that can be encoded in 28 bits). Each small MCS problem is “solved” on the QC, and then the solutions are somehow combined classically.

Like the sudoku demo: solve the 3x3 squares iteratively, then combine them into a 9x9 solution.

It is not a cheat, in that they are using the quantum system to its best ability by combining it with our current best classical methods. Insisting that problems be solved entirely with quantum systems is like allowing only pencil and paper on tests when the real world has computers, Wikipedia and Google.
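Since Rose says the machine is natively a solver for quadratic unconstrained binary optimization (QUBO) problems, it is worth seeing what that problem format looks like. Here is a minimal brute-force QUBO solver for a few variables; it illustrates only the problem format, not D-Wave's hardware or its hybrid decomposition code, and the toy coefficients are my own:

```python
from itertools import product

def solve_qubo(q):
    """Brute-force minimum of sum(Q[i,j] * x[i] * x[j]) over binary x.
    q maps (i, j) index pairs to coefficients; diagonal terms are biases."""
    n = 1 + max(max(i, j) for i, j in q)
    best_x, best_e = None, float("inf")
    for x in product((0, 1), repeat=n):
        energy = sum(coef * x[i] * x[j] for (i, j), coef in q.items())
        if energy < best_e:
            best_x, best_e = x, energy
    return best_x, best_e

# Toy 3-variable instance: each bit wants to be 1, but adjacent 1s are penalized
q = {(0, 0): -1, (1, 1): -1, (2, 2): -1, (0, 1): 2, (1, 2): 2}
x, e = solve_qubo(q)
print(x, e)  # -> (1, 0, 1) -2
```

A real 28-qubit device replaces the exponential loop with hardware; the decomposition trick described above splits a big problem into sub-QUBOs of that size.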

Der Spiegel indicates radiation deaths are exaggerated

About 4,000 children were afflicted with thyroid cancer after Chernobyl. Less well known, however, is the fact that only nine of those 4,000 died -- thyroid cancers are often easy to operate on.

Officially 47 people -- members of the emergency rescue crews -- died in Chernobyl from exposure to lethal doses of radiation. This is serious enough. "But overall the amount of radiation that escaped was simply too low to claim large numbers of victims," explains Kellerer.

The iodine 131 that escaped from the reactor did end up causing severe health problems in Ukraine. It settled on meadows in the form of a fine dust, passing through the food chain, from grass to cows to milk, and eventually accumulating in the thyroid glands of children.

A lethal dose of radiation, which causes fever, changes in the composition of the blood, irreparable damage to the body and death within two weeks, is 6 Gray.

These figures come from a study of all residents of Hiroshima and Nagasaki who survived the atomic explosion within a 10-kilometer (6.2-mile) radius. Investigators questioned the residents to obtain their precise locations when the bomb exploded, and used this information to calculate a personal radiation dose for each resident. Data was collected for 86,572 people. More than 700 of them eventually died as a result of radiation received from the atomic attack:

87 died of leukemia;

440 died of tumors;

and 250 died of radiation-induced heart attacks.

In addition, 30 fetuses developed mental disabilities after they were born.

Such statistics have attracted little notice so far. The numbers cited in schoolbooks are much higher. According to Wikipedia, the online encyclopedia, 105,000 people died of the "long-term consequences of radiation."

Up to 4,000 people from Chernobyl might die prematurely.

Other reports on Chernobyl

New Scientist also discusses the UN report of 2005

Because of the difficulty of attributing specific cancers to radiation over decades, the precise number of deaths is "unlikely ever to be known", the study says.

Michael Repacholi, radiation manager for the UN's World Health Organization, notes that 25% of those affected by Chernobyl would ultimately die from spontaneous cancers anyway, and only 3% would die from cancer as a result of exposure. "Most people will be surprised that there are so few deaths," he said.

November 20, 2007

Transcript of my Foresight unconference talk is available

Fusion propulsion if Bussard IEC fusion works

Work is continuing towards Robert Bussard's vision of inertial electrostatic fusion.

UPDATE: Tom Ligon wrote the slides and gave the speech which presented this material.

Fusion R&D
Phase 1 - Validate and Review WB-6 Results
The proposed WB-7 and WB-8 devices will be constructed and tested during 2008.
1.5 - 2 years (by the end of 2008), $3-5M

Fusion R&D
Phase 2 - Design, Build and Test
Full Scale 100 MW Fusion System
5 years (2009-2013), $200M

If the full commercial system is successful, then from 2014-2029+ there would be the development of nuclear-fusion-enabled space vehicles.

Bussard fusion space plane
Two 8-gigawatt thermal fusion engines power this vehicle. The cost estimate is 100 times less than the best system available now.

Bussard fusion lunar lander
20 hours to go from low earth orbit to lunar orbit; 24 hours to go from the earth's surface to the lunar surface using these systems.

These vehicles would have three main types of fusion engines: the DFP, the CSR (controlled space radiator) and the ARC, which are shown with different specific impulse (ISP) and thrust levels. The CSR and ARC are Quiet Electric Discharge (QED) engines with ISPs in the 1,500 to 70,000 second range. They use arcjet heating of reaction mass.

DFP engines are diluted fusion product engines. They have very high ISPs, 50,000 to 1.2 million seconds, with reaction mass added directly to the fusion products from the reactor.
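The attraction of specific impulses this high follows from the Tsiolkovsky rocket equation, delta-v = Isp * g0 * ln(m0/mf): propellant mass ratios stay close to one even for enormous delta-v. The 100 km/s figure below is an arbitrary illustration, not a number from the talk:

```python
import math

G0 = 9.81  # standard gravity, m/s^2

def mass_ratio(delta_v_ms, isp_s):
    """Initial-to-final mass ratio from the Tsiolkovsky rocket equation."""
    return math.exp(delta_v_ms / (isp_s * G0))

# Arbitrary 100 km/s of delta-v, compared across engine classes
for label, isp_s in [("chemical", 450), ("QED arcjet", 70_000), ("DFP", 1_200_000)]:
    print(f"{label:10s} Isp {isp_s:>9,} s -> mass ratio {mass_ratio(1e5, isp_s):,.2f}")
```

A chemical rocket would need an absurd mass ratio for that delta-v, while the high-Isp fusion engines need only a few percent of their mass as propellant.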

QED fusion space engine types
Here are block diagrams of the QED engine types.

Long-range space vehicle sizing with the different engine types. The Bussard rockets are about one to two football fields long.

Description of the Mars Bussard fusion vehicle. 33-38 days to Mars one way.

Description of a vehicle to go to Saturn's moon Titan.

Diluted fusion product (DFP) engine schematic

Fusion rocket schematic for Titan mission. Each way would take 75-90 days.

The vehicles will enable large and inexpensive space colonies.

Lunar colony
4000 people, 25 tons each, $12.5 billion (1997 estimate)

Mars colony
1200 people, 50 tons each, $15.6 billion

Titan colony
400 people, 60 tons each, $16.2 billion

Why the colonies and vehicles are relatively cheap

If we have the relatively lightweight working fusion reactors with power ranges up to 10 gigawatts or more each, then all sorts of space propulsion becomes possible.

The fusion reactors could power massive laser arrays or high powered minimag Orion style propulsion systems.

If the Bussard inertial electrostatic fusion reactors work out, we would redesign everything: all of the navy and military vehicles, and much of our civilization. Even these first designs would give flights to the moon and orbit at about the price of old Concorde airplane tickets.

2006 space conference notes on Bussard fusion rockets

The 2007 power point presentation with most of the slides for this article

A paper describing inertial electrostatic fusion propulsion

Robert Bussard on inertial electrostatic fusion.

This project has been funded again by the US Navy.

Askmar has links to the scientific papers on the Fusion concept

M Simon has put together a great site with a lot of technical information on IEC reactors including the Bussard project

Programmable metallization cell - super computer memory follow up

Arizona State University's Center for Applied Nanoionics (CANi) claims that programmable metallization cell (PMC) memory could be 1,000 times more efficient than existing flash memory, letting devices like USB drives, digital cameras, MP3 players and laptops greatly increase their storage.

This is a follow-up to my first article on programmable metallization cell memory, from October 2007.

Best of all, the new technique can be used with existing, conventional chip materials and processes, which means that the cost will not be prohibitive.

"In using readily available materials, we've provided a way for this memory to be made at essentially zero extra cost, because the materials you need are already used in the chips—all you have to do is mix them in a slightly different way," said Kozicki.

It might not be too long before we see products incorporating the new technology: Kozicki estimates that the first commercial product could appear within 18 months. PMC has already attracted interest from several memory vendors, including Micron Technology. Samsung, Sony and IBM have also been interested in the technology.

Early experimental PMC systems were based on silver-doped germanium selenide glasses, but these materials were not able to withstand the temperatures used in standard CMOS fabs.

Work then turned to silver-doped germanium sulfide electrolytes, and then finally to the current copper-doped germanium sulfide electrolytes. Axon Technologies has been licensing the basic concept since its formation in 2001. The first licensee was Micron Technology, who started work with PMC in 2002. Infineon followed in 2004, and a number of smaller companies have since joined as well.

Flash is based on the floating-gate concept, essentially a modified field-effect transistor. A conventional transistor has three connections: the source, the drain and the gate. The gate is the essential component, controlling the resistance between the source and the drain and thereby acting as a switch. In a floating-gate transistor, the gate is attached to a layer that traps electrons, leaving the transistor switched on (or off) for extended periods of time. The floating gate can be re-written by passing a large current through the source-drain circuit.

It is this large current that is flash's primary drawback, for a number of reasons. For one, each application of the current physically degrades the cell, which eventually can no longer be written to. Write cycles on the order of 10^5 to 10^6 are typical, limiting flash to roles where constant writing is not common. The current also requires an external circuit, known as a charge pump, to generate it. The pump requires a fairly lengthy charging process, so writing is much slower than reading and requires much more power as well. Flash is thus an "asymmetrical" system, much more so than conventional RAM or hard drives.

PMC, on the other hand, writes with relatively low power and high speeds. The speed is inversely related to the power applied (to a point, there are mechanical limits), so the performance can be tuned for different roles. Additionally, the writing process is "almost infinitely reversible", making PMC much more universally applicable than Flash.

In research published in October's IEEE Transactions on Electron Devices, Kozicki and his collaborators from the Jülich Research Center in Germany describe how the PMC builds an on-demand copper bridge between two electrodes. When the technology writes a binary 1, it creates a nanowire bridge between two electrodes. When no wire is present, that state is stored as a 0.

Three companies, Micron Technology, Qimonda and Adesto (a stealth-mode startup) have licensed the technology from Arizona State's business spin-off, Axon Technologies. Kozicki says the first product containing the memory, a simple chip, is slated to come out in 18 months.

"No other technology can deliver the orders-of-magnitude improvement in power, performance and cost that this memory can," says Narbeh Derhacobian, CEO of Adesto, who previously worked at AMD's flash-memory division.

Adesto has received $6 million from Arch Venture Partners and additional funding from Harris & Harris, a venture firm specializing in nanotechnology.

Qimonda is a computer-memory company with 13,500 employees and over $4 billion per year in sales.

Nanoionics defined: The term nanoionics is applied when electrochemical effects occur in materials and devices with closely spaced interfaces (typically a few tens of nm or less), e.g., electrodes or electrochemically different material phases. In this size regime, the functionality of ionic systems is quite different from macro-scale versions, but in a highly useful manner. For example, internal electric fields and ion mobilities are relatively high in nanoionic structures, and this, combined with the short length scales, results in very fast response times. In addition, whereas deposition electrochemistry and most batteries use liquids or gels as ion transport media, nanoionics can take advantage of the fact that a variety of solid materials are excellent electrolytes, especially at the nanoscale.

Nano-ionic applications
The ability to redistribute metal mass within a structure by applying a voltage leads to a wide range of potential applications. Electrodeposition of a noble metal such as silver produces localized, persistent but reversible changes to material parameters, and these changes can be used to control system behavior.

Examples of the applications of mass transport in solid electrolytes include the following:

-Electrical resistance changes radically when an electrodeposit with a resistivity some eight orders of magnitude lower than the surrounding solid electrolyte is deposited on or in that electrolyte. This leads to a myriad of applications in solid-state electronics, including memory, storage and logic.

-Deposition of mass can be used to alter the resonant frequency of a vibrating element in a microelectromechanical system (MEMS). This has applications in tunable high-Q MEMS-based resonators in RF systems.

-The optical properties of the electrodeposits have a profound effect on the transmission and reflection of light and so optical switches become a possibility using this technique. Such elements may be used in integrated optics and optical networks.

-The morphology of a typical electrodeposit leads to a large change in the wetting of a surface, making it highly hydrophobic, and so the technique can be used in microvalves and other fluid/droplet control devices in applications ranging from lab-on-a-chip to micro fuel cells.

Making embryonic stem cells without destroying embryos

Researchers have created human embryonic stem cells without destroying embryos or using hard-to-get eggs. The technique may prove to be easier, cheaper, and more ethically appealing than an alternative approach that requires cloning.

Two separate teams of researchers say they have sidestepped the cloning method and reprogrammed mature human cells into a primordial, embryonic-like state. Those cells were then transformed into other tissue types, such as heart cells. The long-term hope is that such freshly created tissue may, for example, be used to heal a heart-attack patient. Unlike cloning, "the wonderful thing about this approach is that it's easy."

There are several limitations to the current approach. For now, both teams had to use dangerous viruses to transport the genes into the cell, which could have deadly consequences if the technique were immediately applied to humans. Dr. Yamanaka and others say they are testing other viruses in the hope of finding a non-harmful one.

And before the reprogramming technique can be applied to human patients, it needs to be tested on large animal models to ensure that it's safe and effective.

Still, the latest results are a big step up from similar breakthroughs in mice, separately reported this summer by Dr. Yamanaka's group and two other research teams in the U.S. The Kyoto team reported that embryonic-like cells developed with the new technique could even help form a new mouse -- a gold-standard test for the viability of the created tissue.

UPDATE: "I believe that these new results, while they don't end that controversy, are the beginning of the end of the controversy," said James Thomson, a cell biologist at the University of Wisconsin in Madison and leader of one of the two teams that did the work.

One first step may be to grow tissue transplants to repair a damaged heart, replace the brain cells destroyed by Parkinson's disease, or perhaps even to grow another whole organ.

But the ultimate goal is even more ambitious. "From a heart cell we don't have to go back to an embryonic stem cell," Gearhart said in a telephone interview.

"We could go back to a cardiac progenitor cell. If we knew the right combination of things ... we could be instructing our own cells to get them to do what we want them to do."

Short, mid and long term energy and transportation system overhauls

I am not convinced that we are at peak oil/liquids, and if a peak does occur, I believe it will not do severe damage to the economies of the developed countries or China.

In the event of a peak in oil, the most vulnerable countries are those with heavy oil imports, since oil-producing countries would cut back exports at a faster rate. The USA imports 12 million bpd of the roughly 20.6 million bpd it consumes. Japan imports 5.1 million bpd of 5.6 million bpd consumed. China imports 3.4 million bpd of 7.3 million bpd consumed. Germany and South Korea are next on the list.
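Those figures translate into very different import dependencies; a quick calculation from the numbers above:

```python
# Oil import dependence of the major consumers (million barrels per day),
# using the import/consumption figures quoted above.
countries = {"USA": (12.0, 20.6), "Japan": (5.1, 5.6), "China": (3.4, 7.3)}

for name, (imports_mbpd, consumption_mbpd) in countries.items():
    share = imports_mbpd / consumption_mbpd
    print(f"{name}: {share:.0%} of consumption is imported")
```

Japan is by far the most exposed, at over 90%, which is why a sharp cut in exports would hit it hardest.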

We are seeing some problems that are being made worse by the Iraq war and the potential of war with Iran. Most projections of the peak are for a substantial plateau. Any demand destruction would hit poorer countries first.

The initial step is conservation:

-Dropping speed limits back to 55 mph on highways (saving roughly 750,000 gallons per day).
-Instituting other conservation measures.
-No-drive days and one-day-a-week retail store closures.
-More government-imposed and government-assisted telecommuting.
-Satellite office programs.
-Fuel rationing.

Carpooling, transit, odd-even driving and other measures can reduce fuel usage by 8-15% right away, and several can be sustained without harming the economy. A mid-term transition would be to require and set up satellite offices and wifi buses and trains (so that people could be productive while traveling on transit).

Those steps were in the first part of my transitioning from oil article.

A 10-20% reduction with those conservation measures would be 2-4 million barrels per day for the USA.
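That range is just the conservation percentage applied to USA consumption:

```python
us_consumption_mbpd = 20.6  # million barrels per day, from the figures above

low = 0.10 * us_consumption_mbpd
high = 0.20 * us_consumption_mbpd
print(f"10-20% conservation saves {low:.1f}-{high:.1f} million bpd")
```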

I see no indications that such measures could not bridge the four years needed for ANWR and the new gulf oil to spin up. Then come shale oil, biofuels, more oilsands, and more electrification and high-efficiency vehicles.

ANWR: 800,000 bpd by 2018. Brazil's offshore oil: maybe 1 million bpd by 2015. Chevron's Gulf of Mexico finds: maybe 700,000 bpd by 2012. Shell oil shale: maybe 1.5 million bpd by 2025. Other significant deep oil is possible off the coasts of Africa and Asia, and Canada will still be exporting oilsands oil to the USA through the pipelines.

If Iraq and Iran were stable, they each could produce about 6 million bpd -- together roughly 6 million bpd more than they produce now. A desperate big country with a big military could super-surge and double down to make that happen.

I believe that China also has the means to conserve and use its $1.4 trillion in reserves to ride out a rapid transition.

The best technologies for moderating peak oil would be better oil recovery (enhanced oil recovery), such as toe-to-heel air injection (THAI).

There are other means of enhanced oil recovery; sequestering carbon into old fields, for example, also enhances oil recovery.

The material on the Petrobank website indicates that THAI is expected to recover 70% to 80% of the oil originally in place. If 10% of the oil originally in place is burned in the process, this would leave 10% to 20% of the oil originally in place in the ground.

By comparison, recovery using current steam processes is estimated to be 20% to 50% in the high-grade, homogeneous areas where steam methods can be used.
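The THAI mass balance is easy to verify: the recovered, burned and left-in-ground fractions must total 100% of the oil originally in place:

```python
# Oil-in-place accounting for THAI, from the Petrobank figures quoted above
burned_pct = 10  # percent of original oil in place consumed as in-situ fuel

for recovered_pct in (70, 80):
    remaining_pct = 100 - recovered_pct - burned_pct
    print(f"{recovered_pct}% recovered + {burned_pct}% burned -> "
          f"{remaining_pct}% left in ground")
```

This matches the quoted 10% to 20% left in the ground.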

Summarizing the near-term steps for transitioning from oil (from now out 5+ years): Conserve. Use and develop alternative liquids (biofuels, fuel from shale). Drill everywhere, like ANWR. Use enhanced oil recovery; don't fall back to coal, but go hard for more oil and a nuclear/electric switch. Use electrical transportation: China has 60 million electric cycles and scooters already (existing batteries are good enough for bikes and scooters).

Then there is the transition to far greater efficiency in the mid-term, 2010-2025.
Thermoelectrics I see as big from 2010 onward, making engines and society more efficient.

Superconducting motors for industrial efficiency and for improved power grid efficiency and reliability.
Improved industrial processes and direct-current long-haul power lines would also help.

More power: nuclear fission, wind, and maybe fusion.
If we do not get good nuclear fusion, then the world and China will build a lot of nuclear fission.
I also like the KiteGen system for wind power and think it would work and be cheaper and better. There will also be 10 MW superconducting wind generators.

I discuss the nuclear plans of China, India and Russia

China's nuclear build is accelerating, with interior provinces likely to get reactors.

China's big hydro build and more renewables for China

I go into detail about scaling up nuclear fission by a lot. I also discuss how "nuclear waste" is unburned nuclear fuel. The right reactors (which have been built before) would burn it all.

In the worst case, oil problems trigger a four-year crash-program transition and a deep recession. Weaker countries, in Africa and elsewhere, are hit the most.

My best bets for nuclear fusion: Bussard fusion, Tri Alpha Energy colliding-beam fusion, rapid-fire Z-pinch, and HiPER (laser fusion).

To summarize the overall plan: conserve, drill more, use enhanced oil recovery, switch to more efficient electrical transportation, switch to more efficient systems (thermoelectric, superconductors) and develop nuclear fission, wind and fusion.

Changing the USA energy production mix

Over 20 years, a replacement energy mix scenario for the USA (total needed power is in the range of 5,200 billion kWh for electricity, plus a similar amount again for transportation; the goal is to replace as much as possible of the roughly 80% of power produced from coal and oil):
- 160 new 2 GW nuclear plants (up-rated 1.55 GW reactors) at 16.25 billion kWh each: 2,600 billion kWh
- 600 billion kWh from up-rating existing nuclear reactors (increased from the original article because of MIT and other research on generating a lot more power from current nuclear plants)
- 400 billion kWh from wind
- 200 billion kWh from solar
- 34 billion kWh from superconducting-motor industrial efficiency
- 1,000 billion kWh from thermoelectric and other efficiency technology (new)
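Summed, the scenario above comes in a little short of the 5,200 billion kWh electricity figure; a quick tally (numbers straight from the list):

```python
# Replacement-energy scenario from the list above, billions of kWh per year.
mix = {
    "new nuclear (160 x 2 GW plants)": 160 * 16.25,  # 2600
    "up-rating existing reactors": 600,
    "wind": 400,
    "solar": 200,
    "superconducting motor efficiency": 34,
    "thermoelectric and other efficiency": 1000,
}
total = sum(mix.values())
print(f"total: {total:.0f} of ~5200 billion kWh ({total / 5200:.0%})")
```

That covers roughly 93% of current US electricity generation; the transportation side would need a similar amount again.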

There is a new IEA plan for stabilizing world CO2 at the 450 parts per million level for less global warming.

Nuclear capacity under this projection would more than double from its current capacity to 833 GW by 2030. Even if this increase were to happen, nuclear would account for only 16% of the necessary reductions in CO2 emissions worldwide. This should speak to the monstrous challenge the world faces in curbing CO2 emissions. Improved fossil-fuel efficiency would account for 27% of the reductions; end-use energy efficiency would provide 13%; biofuels for transportation, 4%; renewables for power, 19%; and CO2 capture and storage, 21%.
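The quoted reduction shares do account for the whole reduction, and the 833 GW figure implies a rough growth multiple. A quick check (the ~370 GW figure for current world nuclear capacity is my assumption, not from the article):

```python
# IEA 450 ppm scenario: shares of CO2 reductions, in percent (quoted above).
shares = {"fossil-fuel efficiency": 27, "CO2 capture and storage": 21,
          "renewables for power": 19, "nuclear": 16,
          "end-use efficiency": 13, "biofuels": 4}
print(sum(shares.values()))  # the six wedges cover 100% of reductions

current_gw, target_gw = 370, 833  # current capacity is an assumed figure
print(f"nuclear growth to 2030: {target_gw / current_gw:.2f}x")
```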

The USA, China, Japan, South Korea, Russia, Europe and Canada would be the ones needing to step up and install a lot more nuclear power.

Passage of climate bills in the US and Europe, and even faster building in the interior of China, could combine to add another 800 GW by 2030, for roughly 1.6 TW in total.

November 19, 2007

Detecting chemical reactions in a single living cell for the first time

Bioengineers at the University of California, Berkeley, have developed a technique that for the first time enables the detection of biomolecules' dynamic reactions in a single living cell.

They can determine in real time whether specific enzymes are activated or particular genes are expressed, all with unprecedented resolution within a single living cell. This could lead to a new era in molecular imaging with implications for cell-based drug discovery and biomedical diagnostics.

The researchers tackled this challenge by improving upon conventional optical absorption spectroscopy, a technique in which light is passed through a solution of molecules to determine which wavelengths are absorbed. Cytochrome c, for instance, is a protein involved in cell metabolism and cell death that has several optical absorption peaks near 550 nanometers.

The researchers came up with a novel solution to this problem by coupling biomolecules, the protein cytochrome c in this study, with tiny particles of gold measuring 20-30 nanometers long. The electrons on the surface of metal particles such as gold and silver are known to oscillate at specific frequencies in response to light, a phenomenon known as plasmon resonance. The resonant frequencies of the gold nanoparticles are much easier to detect than the weak optical signals of cytochrome c, giving the researchers an easier target.

Gold nanoparticles were chosen because they have a plasmon resonance wavelength ranging from 530 to 580 nanometers, corresponding to the absorption peak of cytochrome c.
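The matching condition described above, the nanoparticle's plasmon band overlapping the protein's absorption peak, can be sketched as a simple interval check (the function name is illustrative; the wavelengths are the ones quoted):

```python
# Does a protein's absorption peak fall inside a nanoparticle's plasmon band?
def bands_match(plasmon_band_nm, absorption_peak_nm):
    low, high = plasmon_band_nm
    return low <= absorption_peak_nm <= high

# Gold: plasmon resonance 530-580 nm; cytochrome c absorbs near 550 nm.
print(bands_match((530, 580), 550))  # True: gold suits cytochrome c
```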

The researchers repeated the experiment matching the protein hemoglobin with silver nanoparticles and achieved similar results. "Our technique kills two birds with one stone," Lee said. "We're reducing the spatial resolution required to detect the molecule at the same time we're able to obtain chemical information about molecules while they are in a living cell. In a way, these gold particles are like 'nano-stars' because they illuminate the inner life of a cellular galaxy."

Help, an army of volunteers

The inspiration for Help Hookup is actually a comic book, Global Frequency, by Warren Ellis. My brother, Alvin Wang, took the idea to Startup Weekend, and they launched it this past weekend as a way to hook up volunteers. It is similar to David Brin's concept of "empowered citizens" and Glenn Reynolds' "an army of Davids".

Hooking up skilled volunteers with great causes and events.

Global Frequency was a network of 1,001 people who handled the jobs that governments did not have the will to handle. I thought it was a great idea, and it would be more powerful with 1,000,001 or 100,000,001 people. We would have to leave out the killing that was in the comic.

Typhoons, earthquakes, and improperly funded education could all be handled. If there is a disaster, doctors could volunteer. Airlines could provide tickets. Corporations could provide supplies. Trucking companies could provide transportation. Etc. State a need, meet the need. No overhead. No waste.

The main site is here; it is a way for volunteers to hook up with causes.

The helphookup blog is tracking the progress.

The project has been covered on

There is a facebook group

Track the online reaction via technorati

Interviewed by the

I was interviewed for a podcast by the website

Some of the things I talked to them about: my view that many people are already choosing to enhance aspects of their body and mind using invasive and non-invasive approaches.

Collective individual choices are shaping future technology, and those choices are not commonly understood in the context of changing technology. We are choosing our future now.
- 3 million in the USA choose steroids despite health downsides and access restrictions; 4 out of 5 for appearance reasons
- 7 million worldwide use steroids
- 11 million (in the USA) choose cosmetic surgery despite sub-optimal results and risks to health
- dietary supplements are a $22+ billion industry
- maybe 1 in 10 students use drugs for better results on academic tests. It is not about "intelligence enhancement"; it is about business productivity and academic performance. The drugs seem to help the performance of most students, but some do worse under their effect. Kids took practice tests with and without the drugs; if they thought it helped, they took it for the real test. Some felt it gave them a 200-point boost on SAT scores.

- non-invasive enhancement can work too (Wikipedia, Google, etc.)
- cheap mind-machine interfaces for the PS3, Xbox, etc.
These can mostly be grouped under crappy beta versions of human enhancement.

Better versions of those kinds of enhancements are in the works:
Myostatin inhibitors are better and safer than steroids.
Fake myostatin inhibitors are sold now.
Millions will use them for muscle disease, to counter muscle wasting from old age, and for performance and appearance enhancement.

The effect is apparently about four times stronger than high-dose steroids.
Users need to consume more food, which is why evolution did not select for those genes, but the extra muscle burns fat and can help counter obesity. So not only are they safer, they could provide health benefits to the obese, the elderly, and those with muscle diseases. Human trials have been running since 2005.

Gene therapy and genetic engineering can provide more endurance, radiation resistance, and life extension.

Unevenness of advancement: life extension
There are already life expectancy differences of more than 30 years between different groups in the USA and around the world. Life expectancy gains 0.1 to 0.3 years every year.

Many people do not want futurists to predict anything controversial or exceptional: very rapid technological advancement, or really powerful technology (AGI, versions of nanotech, certain space technologies, certain medical advances, etc.). While exceptional technology and breakthroughs are not what commonly occur every day, it is the exceptional breakthroughs that transform society over the longer term, and we as a society need to lower the barriers to developing them.

Yes, certain choices and societal forces could cripple the development of those technologies. The space program has not advanced because the plans have not been focused on making big and meaningful things happen; their actual purpose has been political pork. My goal is to think of ways of getting around those blockages, to push for a better future, and to spot movement around blockages that is already happening. Part of the reason is that I think the current societal choice/technology mix is not sustainable, and people ignore the negatives of the current balance. 56 million dead per year is not something to be tolerated. People ignore the slaughter and the real dangers of the now. Nuclear power might kill 2,000 people over 40 years when there is a really bad reactor design, but that has to be compared to 1 million deaths per year from coal and 3 million per year from air pollution.

I think of the Tom Hanks character in Saving Private Ryan in the opening Omaha Beach sequence. Some soldiers mistakenly believed it was better to hide behind the steel crosses on the beach than to creatively attack the pillboxes that had them pinned down. I think of the difficult goals of colonizing space in a major way, conquering diseases, and making significant progress against age deterioration as pillboxes that have us pinned down on a dangerous beach. Just because the timescale has been stretched out to decades, centuries, millennia does not mean that we are not collectively on a dangerous beach. We can and should do a lot over the next 50 years and beyond.

Every year 55 million people die from all the various causes, and we are straining the ecosystem and facing growing dangers from the power of technology. Stepping back from where we are now to a "sustainable" position would be like retreating from the beach back into the sea: a bad plan, because it would cost 5.5 billion lives and only save about 1 billion. Breaking out of being pinned down on the beach does not mean that everything beyond is safe and utopia; there is still a struggle, with more risks and challenges. However, pressing forward in the most creative way with the best plans is the best course of action.

In terms of a radically better future, why not choose the best plans we can come up with? Why stick to clearly failed or flawed plans just because that is what we have been doing? If something is not working as well as it could, then there should be change. The choice for the future does not have to be perfectly safe; it just has to be better overall than the current situation and path. We can and should do a lot better.

Progress on Stronger Carbon-Nanotube Fibers

Researchers have improved techniques for spinning fibers of carbon nanotubes: by aligning the nanotubes within the fiber, they create fibers as strong as, or stronger than, materials such as the Kevlar used in bullet-proof vests. Also, unlike regular ropes, the nanotube fibers can be knotted without losing much of their strength.

Alan Windle, a professor of materials science at the University of Cambridge, in England, made and tested the new nanotube fibers along with researchers at the Natick Soldier Research Development Center, in Massachusetts. Windle and his colleagues tugged on the nanotube fibers, finding that the weaker ones snapped at stresses around one gigapascal, making them comparable to steel, gram for gram.

The better-performing carbon-nanotube fibers broke at around six gigapascals, beating the strengths that manufacturers report for materials used in bullet-proof vests, such as Kevlar. These nanotube fibers matched the highest reported strengths for a couple of the strongest commercially available fibers, Zylon and Dyneema, also used in bullet-proof vests. A lone, extremely strong nanotube fiber was off the charts, reaching nine gigapascals of stress--far beyond any other reported material--before breaking. Earlier work with carbon nanotubes has produced fibers that withstand at most three gigapascals.
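"Gram for gram" comparisons like these are about specific strength: breaking stress divided by density. A rough sketch of the comparison; the strengths are the figures quoted above, but the densities are typical handbook values I am assuming, not figures from the article (spun CNT fiber density in particular varies widely):

```python
# Specific strength = tensile strength / density, in GPa per (g/cm^3).
# Strengths from the article; densities are assumed typical values.
materials = {
    "steel wire":          (2.0, 7.8),
    "Kevlar":              (3.6, 1.44),
    "CNT fiber (weakest)": (1.0, 1.0),
    "CNT fiber (best)":    (9.0, 1.0),
}
for name, (strength_gpa, density) in materials.items():
    print(f"{name}: {strength_gpa / density:.2f} GPa per g/cm^3")
```

Dividing out the density is what lets a 1 GPa nanotube fiber compete with much denser steel wire.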

We are still waiting to hear if Superthread material is available.

New methods to make carbon nanotubes without metal catalysts could help prevent the defects that often cause nanotubes longer than one millimeter to break.

Carbon nanotube production is increasing and prices are falling

Personal DNA services for about $1000

There are several new services that will let customers analyze their DNA, creating a personalised genetic profile that tells them how likely they are to contract inherited diseases.

I think personalized medicine is the way to go. This is one piece of a larger puzzle. More analysis of this and other data is required. There needs to be individual computer models that represent our detailed physical condition. We need to be able to run simulations against that computer model to know what we should expect before we treat the physical person.

23andMe, one of whose founders is married to Google co-founder Sergey Brin, is one of a number of firms aiming to capitalise on the new market for personalised healthcare, where companies aim to provide tailored genetic information to customers. Customers will be able to take 'preventive action' in relation to their health.

Last week, DeCode Genetics, an Icelandic firm, began a similar service for North American and European customers costing $985, and another Californian company, Navigenics, is also due to enter the market soon.

Genetics experts criticised the service, saying that for the vast majority of customers it would be "scarcely of any use at all," and that 80 per cent of the information relevant to determining a customer's life expectancy, say, could be ascertained in a doctor's appointment.
