July 27, 2007

1 gigawatt wind turbine

Treehugger shows a 1 gigawatt magnetically levitated wind power generating turbine

Maglev Wind Turbine Technologies proposes this technology


It looks like it is 200 meters tall and 100 meters in diameter, based on the car and streetlights in the foreground

A regular 5 megawatt wind turbine has about a 120 meter tall tower and 126 meter diameter blades. Each supplies around 13 million kilowatt hours per year. A wind rotor 200 times larger would then proportionally generate 2.6 billion kWh.
The Chinese claimed that magnetically levitated wind turbines could be 20% more efficient. This would increase the energy generated to 3.12 billion kWh.

The average US home uses about 10,000 kWh per year

Maglev Wind Turbine Technologies is claiming 8.75 TWh would be generated. That would be an operating efficiency equal to a nuclear power plant. Even if they are 50% more efficient than a regular wind turbine (because of more consistent wind at higher altitude and a 20% gain from magnetic levitation), I do not see them doing better than about 5 billion kWh maximum. That would power about 500,000 homes. They could do better, but they would need a bigger turbine with a higher maximum power rating.
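A quick sketch of the back-of-envelope arithmetic above, using only the post's own rough figures (not manufacturer data):

```python
# Back-of-envelope check of the turbine scaling argument above.
# All figures are the post's own rough numbers, not manufacturer data.
BASE_TURBINE_KWH = 13_000_000   # annual output of one 5 MW turbine
SCALE = 200                     # a rotor 200x larger (1 GW class)
MAGLEV_BONUS = 1.20             # claimed 20% maglev efficiency gain
HOME_KWH_PER_YEAR = 10_000      # average US home

scaled_kwh = BASE_TURBINE_KWH * SCALE        # 2.6 billion kWh
with_maglev = scaled_kwh * MAGLEV_BONUS      # 3.12 billion kWh
homes_powered = with_maglev / HOME_KWH_PER_YEAR

print(f"{scaled_kwh:,.0f} kWh scaled, {with_maglev:,.0f} kWh with maglev")
print(f"~{homes_powered:,.0f} homes")
```

Even with the 20% maglev bonus, the proportional scaling gives about 3.12 billion kWh, or roughly 312,000 homes, well short of the claimed 8.75 TWh.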

A wind turbine's operating load factor is lower because of the inconsistency with which winds blow, although a very tall wind turbine would see steadier winds.
The benefits of magnetic levitation for wind turbines are discussed

Wind is more consistent at 270 meters (1000 feet). A large wind turbine that could take advantage of that could achieve an operating efficiency of 40-50%.

Offshore wind farm

A nuclear power plant with a 1 Gigawatt rating would produce 8 to 9 billion kwh per year.
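The 8-9 billion kWh figure follows from rating times hours in a year times a capacity factor; a minimal sketch (the ~90% capacity factor is an assumed typical value for nuclear plants, not from the source):

```python
# Annual output of a 1 GW plant: rating x hours per year x capacity factor.
# The ~90% capacity factor is an assumed typical value for nuclear plants.
HOURS_PER_YEAR = 8760

def annual_kwh(rated_kw, capacity_factor):
    return rated_kw * HOURS_PER_YEAR * capacity_factor

print(annual_kwh(1_000_000, 0.90))  # ~7.9 billion kWh at 90%
print(annual_kwh(1_000_000, 1.00))  # 8.76 billion kWh theoretical ceiling
```

So the high end of the 8-9 billion kWh range implies a capacity factor above 90%, versus the 40-50% hoped for from even a very tall wind turbine.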

The tallest buildings and structures in the world from wikipedia

There are only about 20-30 man-made structures that are taller than 300 meters.

Tall building profiles

New building materials, such as carbon nanotube reinforced polymers or nanograined metals, could allow the giant wind turbine to be made taller and lighter.

My other articles related to wind energy and comparisons to other energy sources

A Chinese firm ordered superconducting wire from American Superconductor (not for maglev but for overall turbine efficiency)

Shares of American Superconductor Corp. (Nasdaq: AMSC) got a jolt in pre-market trading this morning after the energy technologies company, announced that it received a follow-on $70 million order for wind turbine electrical systems from Beijing-based Sinovel Wind Corp. Ltd.

American Superconductor’s subsidiary, AMSC Windtec, will ship the customized electrical systems to Sinovel in 2008 for use in Sinovel's 1.5 megawatt wind energy systems.

AMSC Windtec turbine electrical systems provide wind turbine operation by controlling power flows, regulating voltage and controlling the pitch of wind turbine blades to maximize efficiency.

Another nuclear plant for the USA and more nuclear power

Many people think that nuclear power is not being added in the USA or that it cannot be added in time to help with climate and pollution problems. That belief is not correct. The world has 320 nuclear reactors in the construction pipeline. This is an increase of 80 reactors from January 2007, when 240 reactors were in the pipeline.

Completely new nuclear plants have started the licensing phase in the United States


I think all 320 reactors can be completed by 2030, which would double the world's energy from nuclear power, and I think more nuclear reactors will be added to the pipeline. Getting up to double the prior world peak in nuclear plant construction (24 in 1984) is possible even without radical effort and new mass-production redesigns (I think we should get redesigns ready for the 2020+ period). I think 800+ nuclear reactors can be built by 2030, and all of them can be up-rated by 50% using MIT developments of cylinder-shaped fuel and nanoparticles for higher, more efficient operating temperatures. That would be a 350% increase in nuclear power by 2030.

The US has built a new reactor and will be building more reactors and is increasing the power generated from existing reactors

Watts Bar Unit 2 is the next partially completed nuclear reactor; construction will likely restart in January 2008, with operation around 2013.

This is following the reactivation of Browns Ferry 1

A 20% power uprate at Vermont Yankee has been approved and is being implemented.

Expected new nuclear plant applications for the USA

Power uprate applications expected

Power uprate applications pending

Constructing a lot of nuclear reactors is not constrained by material

My position on energy and climate:
I think a goal of 80% green house gas reduction is a tough target for 2050.
I think we need to maximize the use of all available options to get improvements in green house gases and pollution reduction as fast as possible.
We have to use renewables, nuclear, efficiency to get where we need to be.
France's use of nuclear for electricity is an example of how much nuclear can help.
(80% of electricity generation)
Plus by switching to plug in hybrids at the same time we can shift more transportation over to clean energy sources.
If the plug in hybrids are efficient, then instead of oil we can use genetically engineered (synthetic biology) biofuels (ideally generated from engineered single cell organisms).

From 1993-2005, nuclear has been helping more as a better energy source. Average annual growth in US generation:
Wood (biomass): +96 thousand megawatt-hours per year.
Waste: -259 thousand megawatt-hours per year (a decline).
Geothermal: -190 thousand megawatt-hours per year (a decline).
Solar (usually everybody's favorite): +8 thousand megawatt-hours per year.
Wind (another favorite): +1,345 thousand megawatt-hours per year.

Overall, renewable energy in the United States has increased at a rate of 1,000 thousand megawatt-hours per year. The figure for nuclear is 16,203 thousand megawatt-hours per year, even without building a new plant.
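As a quick check using only the post's own numbers, the per-source rates above sum exactly to the overall renewables figure:

```python
# The per-source growth rates above (thousand MWh per year) sum exactly
# to the overall renewables figure of 1,000.
growth = {
    "wood (biomass)": 96,
    "waste": -259,
    "geothermal": -190,
    "solar": 8,
    "wind": 1345,
}
total = sum(growth.values())
print(total)                     # 1000
print(round(16203 / total, 1))   # nuclear grew ~16x faster, no new plants
```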

I have compared costs of all the energy source options

Carnival of Space 13 at Liftport Blog

July 26, 2007

Other work from the nanomechanical computer people

The site for the people making the nanomechanical computer describes other work they are doing

They are also working on artificial ion channels:

We apply nanostructuring techniques to the most fundamental unit of information exchange in living systems: ion channels. Ion channels regulate the flow of ions through cell membranes, thus maintaining electrostatic activity and potential in all living systems.

Mis-functioning and non-functioning ion channels are involved in many diseases, such as immune deficiency and cardiovascular diseases.

Using semiconductor processing techniques on materials such as glass, quartz and plastic, we have machined nanostructures that adapt to cell membranes. These 'lab on chip' devices monitor the flow of charge through ion channels and stimulate mechanically and electronically active ion channels.

They are designing and building 3D nanostructures, using processing techniques for standard semiconductor materials (silicon & GaAs) and others (glass & plastic).

A 4-page paper describes a suspended quantum dot: "Using the suspended quantum dot, which acts as a very sensitive bolometer, we are able to study the interaction of low-dimensional electronic and phonon systems." A bolometer is a device for measuring the energy of incident electromagnetic radiation.

Their nanomechanical shuttle is the basis of the new nanoelectromechanical computer

Richard Jones has a writeup which does help clarify the operations slightly

My original post on the nanomechanical computer project

Gaseous core nuclear design, the Liberty Ship

The Liberty Ship is a Gaseous Core Nuclear Reactor design, of the Nuclear Lightbulb subvariant. This design is from nuclearspace.com

It has a specific impulse (Isp) of 3,060 and leverages existing technology to conservatively deliver 1,000 tons to low Earth orbit, 33% of its takeoff weight.
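As a rough aside, specific impulse converts to exhaust velocity by multiplying by standard gravity (treating the quoted 3,060 as seconds is my assumption, but it matches the ~30 km/s exhaust velocity cited later in the post):

```python
# Specific impulse in seconds converts to exhaust velocity via Isp * g0.
# Treating the quoted 3,060 as seconds is an assumption; it matches the
# ~30 km/s exhaust velocity cited elsewhere in the post.
G0 = 9.80665                   # standard gravity, m/s^2
isp_s = 3060
exhaust_velocity = isp_s * G0  # m/s
print(round(exhaust_velocity / 1000, 1))  # ~30.0 km/s
```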

It flies to space with a thousand tons of cargo, and flies back using some gentle aero-braking and its thrusters with another thousand tons of cargo.

I have more on safety below. Chemical rockets are like fancy balloons filled with fuel. A nuclear rocket has more performance, so it can carry more weight, so it can be built like a flying armored tank.

I am saddened by the Scaled Composites deaths. However, why is it more scary to risk dying from a nuclear rocket than from a car accident (1.2 million deaths per year), air pollution (3 million deaths per year), or a chemical rocket (300+ deaths total, mostly civilians and ground crew)? We do not say three people died in a car accident, so let's stop driving cars. We did not say the Titanic sank or there were boating accidents, so let us not make nuclear-powered submarines or aircraft carriers. Far fewer deaths happen from all our nuclear-powered reactors and vehicles because their risks are different and we just tend to be more cautious with things nuclear. Why is it better for people to die doing mundane things as part of mundane lives than to risk dying trying to achieve something audacious like really establishing offworld colonies? The Liberty Ship does not even have to be a manned vehicle. We could just use them as the best way to send up 1,000+ tons of cargo in one go. I bet it would put fewer lives at risk to send up a fully assembled international space station than to fly 40 space shuttle flights and a dozen EVAs assembling the ISS. You could do more, and it would be safer or at least as safe.


It has safety systems, redundancy and is re-usable.

It is based on technology and ideas that are decades old. Modern technology just makes it possible to make the design more powerful while keeping it safe, but the concepts and design principles are all well established.

The nuclear waste can be dumped toward the sun during the orbital circularization burn.
In a traditional chemical rocket, the circularization burn is used to add a tiny bit more speed to the spaceship, making the orbit nicely round. In this nuclear system, we have so much power to burn that we deliberately 'overshoot' on the way up, so the circularization burn is a lot larger than normal.

Now, if you will remember, up above I mentioned that the exhaust of this nuclear spaceship shoots out at a whopping fast 30 kilometers per second. If you add this 30 kilometers per second to the 8.5 kilometers per second the whole rocket is moving while in orbit, and you point your rocket in just the right direction, you can literally shoot the exhaust right away from the planet so fast that it never comes back. You can then aim it to drop into the Sun without too much trouble.

Now, the radioactive spent fuel of this rocket is gaseous, remember? So, if we only use one of the seven big rocket engines to perform the circularization burn, it is a trivial chore to pump the gaseous waste from the other six rocket engines into the rocket chamber, heat it super hot, and shoot it into space forever.
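A sanity check on the numbers in the passage above: exhaust at 30 km/s fired from a rocket already moving 8.5 km/s comfortably exceeds Earth escape velocity from low orbit. (This only checks escape from Earth; actually dropping the waste into the Sun requires a separate heliocentric calculation not sketched here.)

```python
import math

# Sanity check: exhaust moving 30 km/s relative to a rocket already at
# 8.5 km/s orbital speed can leave Earth's gravity for good. This only
# checks escape from Earth; dropping into the Sun needs a separate
# heliocentric calculation not sketched here.
MU_EARTH = 3.986e14              # Earth's gravitational parameter, m^3/s^2
r = 6.371e6 + 400e3              # Earth radius + ~400 km LEO altitude, m

v_escape = math.sqrt(2 * MU_EARTH / r) / 1000   # km/s, escape from LEO
v_exhaust = 30.0 + 8.5                          # km/s relative to Earth

print(round(v_escape, 1))        # ~10.9 km/s
print(v_exhaust > v_escape)      # True: the exhaust never comes back
```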

Gaseous core nuclear light bulb engine schematic

External pulsed plasma rockets

A prior review of different nuclear thermal rockets

ADDED: On the question of safety:
You would launch this from the middle of the Pacific Ocean.

A nuclear rocket of this type would have on the order of 10 pounds of radioactive nuclides in it. The Ivy Mike nuclear bomb test, which took place on November 1st, 1952 (see the link to Wikipedia on Ivy Mike), released 1,023 pounds worth of radioactive nuclides. No one died or was injured.

The gas cored reactor has several potential "scram" (emergency shutdown) modes, both fast and slow, and the speed of the reaction is easily "throttled" by adding and removing fuel or by manipulating the vortex. A 'scram' is an emergency shutdown, usually done in a very fast way. For example: a gas cored reactor can be fast scrammed by using a pressurized "shotgun" behind a weak window. If the core exceeds the design parameters of the window, which are to be slightly weaker than the silica "lightbulb," then the "shotgun" blasts 150 or so kilos of boron/cadmium pellets into the uranium gas, quenching the reaction immediately. A slightly slower scram which is implemented totally differently is to vary the gas jets in the core to instill a massive disturbance into the fuel vortex. This disturbance would drastically reduce criticality in the fission gas. A third scram mode, slightly slower still, is to implement a high-speed vacuum removal of the fuel mass into the storage system. Having three separate scram modes, one of which is passively triggered, should instill plenty of safety margin in the nuclear core of each thruster.

Because we have so much performance margin we can make the nuclear core very heavy and strong so that even if it falls out of the sky it will not leak. We have the means to prevent it from exploding with the emergency shutdown modes. You launch from the Pacific Ocean so you have a lot of time to abort so that it does not hit anything. With the power and performance this thing can fly pretty much straight up.

I think with the performance margin this can be made far safer than chemical rockets. The space shuttle weighs 2,029,203 kg (4,474,574 lb) and puts 24,400 kg (53,700 lb) into LEO. A little over 1% is cargo. That is why there is so little safety margin: you had to trim weight everywhere in order to launch anything and reach orbit. The Liberty Ship is only 40% rockets and fuel; 25% of the weight beyond that is devoted to building the thing like a flying armored tank, able to resist damage and contain problems.
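The payload-fraction comparison in the paragraph above can be sketched as:

```python
# Payload fractions from the figures above.
shuttle_total_kg = 2_029_203
shuttle_cargo_kg = 24_400
liberty_fraction = 0.33          # 1,000 of ~3,000 tons reaches LEO

shuttle_fraction = shuttle_cargo_kg / shuttle_total_kg
print(f"shuttle: {shuttle_fraction:.1%} of liftoff mass is cargo")
print(f"liberty ship advantage: ~{liberty_fraction / shuttle_fraction:.0f}x")
```

Roughly a 27-fold better payload fraction is what leaves the mass budget for armoring and redundancy.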

The Space Shuttle generates about 100 gigawatts of power when it is launched, or as much as 50 big nuclear power plants. Plus, the exhaust gases left behind by those huge rockets are not very safe to breathe, either.

There is risk in everything that we do.
As of 2007, in-flight accidents had killed 18 astronauts, training accidents had claimed 11 astronauts, and launchpad accidents had killed at least 70 ground personnel.

There have been 230+ ground crew and civilian casualties.

Driving cars kills 1.2 million people each year.


Trains have 0.04 deaths for every 100 million miles traveled.
Air travel has 0.01 deaths for every 100 million miles traveled.
Automobiles have 0.94 deaths per 100 million miles.
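Put another way, from the figures above:

```python
# Relative risk per 100 million miles, from the figures above.
deaths = {"train": 0.04, "air": 0.01, "automobile": 0.94}

print(round(deaths["automobile"] / deaths["air"], 1))    # driving vs flying
print(round(deaths["automobile"] / deaths["train"], 1))  # driving vs trains
```

Driving is roughly 94 times riskier per mile than flying, yet nobody proposes banning cars.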

Computational work has been done to model the superheated gas which is confined within a vortex.

Los Alamos researchers believe they can come up with a stable flow configuration

Heat transfer to the working fluid (propellant) is by thermal radiation, mostly in the ultraviolet, given off by the fission gas at a working temperature of around 25,000°C.

In the forty years since the Rover program, hundreds of millions of dollars have been spent in plasma research and in developing powerful computational modeling capabilities. The most notable efforts in these areas were the fusion energy programs and the nuclear weapons programs. Both of these large programs relied heavily upon benchmarked computational models to examine stability, operations, and technical feasibility prior to executing expensive experiments. Similarly, the concept of a gas-core nuclear reactor can now be examined computationally before large, expensive and hazardous test facilities must be constructed.

Recently, a new, small effort was initiated to seriously assess the feasibility of the gas core concept using the computational tools and expertise at Los Alamos. By applying the knowledge developed over fifty years as part of the nuclear weapons program, the question of developing a rocket that truly opens up the solar system to manned exploration might finally be answered.

From the inception of this project, the complexities and difficulties inherent in the GCNR concept have been recognized. This is a hard problem. Initially, the research has focused on modeling the cylindrically symmetric configuration wherein an annular injection of hydrogen forms a recirculation vortex in the chamber. Once formed, the vortex is replaced with a uranium vortex which will go critical, heat up to around 5 eV, and radiatively couple to the surrounding hydrogen to produce thrust. So far, five different computer codes have been exercised to assess their capability to model vortex formation and stability in a cylindrically symmetric geometry. From the past few months we have ascertained the following for the cylindrical configuration:

1) flow through the base plate can alter the location of the vortex allowing for active control but can actually destroy the vortex if too high a mass flow is injected;

2) the strength of the vortex, the vorticity, depends almost wholly on the inlet velocity for annular injection;

3) for conditions with high levels of vorticity, no shedding or breakup of the vortex was observed;

4) fuel pellet injection and subsequent evaporation appears to be a viable concept for start-up and fuel-loss recovery;

5) "vacuuming out" the fuel back through the base plate appears to be a viable shut-down concept;

6) diffusion of the fuel throughout the propellant volume appears to occur rapidly for the cylindrical configuration, so that fuel retention is low;

As a result of these studies, we have determined that the cylindrical configuration will not scale to full size: the full-scale mass flow will be between 2 and 6 kg/s, which, for an annular injection with a radius of 0.75 to 1.0 meter, means the thickness of the annulus would be quite narrow. A narrow injection means the layer of hydrogen propellant between the uranium and the wall will be thin and relatively transparent to the emitted radiation. The result is that wall heating will be high, propellant heating will be low, and the configuration is not practicable.

During the short time this project has been underway, the team at Los Alamos has made exceptional progress (Thode 1997) in understanding the physics inherent in an open-cycle gas-core rocket, in developing the computational tools to pursue design of a stable configuration, in identifying strengths and deficiencies of those tools, in testing several computer codes against existing data, and in generating an intrinsic "feel" for what operational conditions will be required to make a gas-core rocket feasible. Eventually, we intend to examine critical issues such as shear-flow-turbulence losses of the uranium, mixing caused by displacement of the vortex due to acceleration, the need for sufficient residence time of the propellant in the chamber, fission product removal, and stability of the vortex.

As a result of our efforts so far, the team is confident that a gas core reactor can be built in a stable configuration and driven critical with substantial power generation. The questions of final performance with regards to fuel-loss rate, specific impulse, and mass will depend upon the integration of many factors into the final design.

Steve Howe reviewed the gas core work for consideration for Mars Missions

As a result of the Los Alamos effort, a new geometric configuration for a gas core rocket has been formulated. Conceptually, a high speed jet of gas is injected axially into the reaction chamber. As the jet expands across the chamber, some of the gas will exit through the nozzle but some will be recirculated along the outer wall. The recirculation creates a toroidal vortex. In a nuclear system, uranium is injected into the vortex. Driving the uranium to criticality will heat the gas along the axis to extreme temperatures providing a very high specific impulse thrust. This configuration has the advantages of having the highest heat flux be at the centerline of the hydrogen flow stream, of having a thick hydrogen barrier around the walls, and of not suffering the uranium migration loss mechanism.

There is little doubt that a gas core reactor can be built in a stable configuration and driven critical with substantial power generation. The questions of final performance with regards to fuel-loss rate, specific impulse, and mass will depend upon the integration of many factors into the final design and must be experimentally investigated. This is the next step.

Gas core reactor rockets at wikipedia

Links to other research

The old 1970's solution to protecting the side walls from the superheated gas:

The basis of the design is the injection of a cold layer of hydrogen around the walls of a spherical cavity. This layer prevented the radiation from the hot uranium in the center to reach the walls. The drawback of this design, though, rests in the fact that as the ship accelerates, the heavier uranium in the center tends to migrate backward toward the nozzle and is vented in the exhaust stream. Although the estimated loss rate of 1 kg-uranium to 400 kg-hydrogen is deemed acceptable from a mission performance point-of-view, this configuration is no longer under consideration.

A 1992 study of the nuclear gas core light bulb concept

MSNBC promotes space solar power study

MSNBC is covering the online space solar power study. I have provided several comments to the space solar power study. The MSNBC article has boosted traffic by several times at the study site.

Space-Based Solar Power study article at space.com

Memjet printer details

This article has pictures and video of Memjet prototypes.

Something that will be interesting to watch is a potential shakeup of the $100+ billion printer (photo printer) and copying markets. This is a new physical manufacturing (of printed pages) process with large disruptive potential. It will be instructive as a small preview of what the impact of molecular manufacturing and a nanofactory might look like. Memjet could also be the basis for disruptions in 3D printing and rapid manufacturing and rapid prototyping.


See a lot more video at the company's website

They plan to release products in the first half of 2008.

What Memjet does is hard to believe:

It prints letter-size output at 60 ppm—that's one page per second—with a 1,600- by 1,600-dot-per-inch (dpi) printer that Silverbrook says will be available in 2008 for maybe $200 to $300. Not only that, but the projected cost per page is less than 2 cents for a monochrome page and less than 6 cents for a color page.

The printheads are a major piece of Memjet technology, spanning the printer's page width so they can print across the entire width at once.

They consist of an array of individual microchip segments, with 6,400 nozzles in each 20mm-long chip, and as many chips as needed for the width of the particular printer. That means there are fewer chips in a dedicated photo printer, for example, than in a letter-size printer.

The second piece is a driver chip that, in the case of the letter-size model, drives 70,400 nozzles and calculates the firing of 900 million drops per second, according to Silverbrook. The remaining two pieces of the puzzle are ink that's designed to work with the printer and software that makes it easy to create drivers for the particular printer model.

If you're familiar with HP's Edgeline technology, you'll notice a similarity in concept, with a printhead that spans the width of the page, and also a similarity in some of the key numbers. HP's approach, however, is very different in detail, and far more expensive. The HP CM8060 Color MFP, introduced in early April, for example, claims a peak speed of 70 to 71 ppm, and an average speed of 60 ppm for monochrome and 50 ppm for color, with a base configuration price—if you were to buy it outright—of well over $23,530
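The printhead numbers quoted above are internally consistent; a quick sketch (letter width and 20 mm chip pitch from the figures above):

```python
import math

# Checking that the printhead numbers are internally consistent:
# 20 mm chips with 6,400 nozzles each, spanning a letter-width page.
page_width_mm = 8.5 * 25.4               # 215.9 mm letter width
chips = math.ceil(page_width_mm / 20)    # chips needed to span the page
nozzles = chips * 6400

print(chips, nozzles)                    # 11 chips, 70,400 nozzles

# At the 20 kHz head cycle rate the company quotes, the drop-rate ceiling:
print(nozzles * 20_000)                  # 1,408,000,000 drops/s maximum;
                                         # the stated 900M/s fits within it
```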

More information at Texyt on why this is not a hoax

PC magazine discusses the Memjet technology roadmap

Memjet technologies plans to increase its output sixfold in two to three years, to a theoretical output of 360 pages per minute from an ordinary printer. Memjet also has set its sights on the commercial printer market, hoping to change newspaper and magazine printing. Future plans include a commercial printer capable of an unheard-of 64,000 pages per minute.

Silverbrook (the research company whose technology Memjet Technologies sells) was founded by Kia Silverbrook, who has spent a decade perfecting the technology. The U.S. Patent Office has approved 1,452 patents with Silverbrook's name on them, more than Thomas Edison received. The third most recent? A patent for placing a printer in a cellular phone, which Silverbrook has demonstrated as a working model as well, said Bill McGlynn, the chief executive of Memjet's home and office business.

The Memjet technology depends on something else: the rate at which the ink can be squirted through the micronozzles.

And Memjet executives said they're already thinking about the future. "This is not a one-trick pony," McGlynn said.

The Memjet heads cycle at 20 KHz, enough to produce the 60 pages per minute on the A4 printer. "But that's not that fast," McGlynn said.

Other inkjets cycle at 24 KHz. Memjet's plan is to develop a 120-KHz cycle head in two to three years, increasing the print speed sixfold to 180 pages per minute at photo quality, 360 pages per minute at normal color quality, and 720 pages per minute in draft mode.
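The sixfold speed claim is just the ratio of head cycle rates:

```python
# The sixfold speed claim follows from the head cycle rate ratio.
current_ppm, current_khz, target_khz = 60, 20, 120
print(current_ppm * target_khz // current_khz)  # 360 ppm at normal quality
```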

Another thing the company could do is add more rows of nozzles. Already, the company uses 10: two each for the CMYK (Cyan, Magenta, Yellow, Black) inks, plus an additional back-ink nozzle. There's no reason why a customer couldn't "stack" the nozzles in four or five series of rows, placing more rows of inks on the paper and speeding up the process even further. One of Memjet's customers is talking about placing heads on the front and back, doubling Memjet's effective output by printing in duplex mode, Beswick said. Another is considering a black-and-white office printer, she added.

World's Largest Submillimeter Telescope being built for 2013

A consortium has been created to oversee the building of a $100 million 25-meter submillimeter telescope on a high elevation in Chile

Because submillimeter-wavelength astronomy is especially effective for imaging phenomena that do not emit much visible light, the Atacama telescope will allow observations of stars and planets forming from swirling disks of gas and dust, will make measurements to determine the composition of the molecular clouds from which the stars are born, and could even discover large numbers of galaxies undergoing huge bursts of star formation in the very distant universe.

Also, the 25-meter telescope could be used to study the origin of large-scale structure in the universe. The telescope will collect 6 to 10 times as much light as the 10.4 meter Caltech Submillimeter Observatory.
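Light gathering scales with collecting area; a quick sketch shows the diameter ratio alone accounts for nearly 6x (the rest of the quoted range presumably comes from the site and instrumentation):

```python
# Light gathering scales with mirror area, i.e. diameter squared.
d_new, d_old = 25.0, 10.4    # meters
area_ratio = (d_new / d_old) ** 2
print(round(area_ratio, 1))  # ~5.8x more collecting area
```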

July 25, 2007

More on graphene paper

Technology Review reports on graphene paper which could be mixed with polymers or metals to make materials for use in aircraft fuselages, cars, and buildings.

In theory, graphene sheets could be superior to all other materials, Ruoff says, "with the possible exception of diamond".

Researchers at Northwestern University have reassembled one-atom-thick graphene sheets that make up soft and flaky graphite crystals in order to create a tough, flexible, paperlike material. Credit: Dmitriy Dikin

Rodney Ruoff, a Northwestern nanoengineering professor who led the work, published in Nature this week, says that the methods behind making the novel graphene paper could lead to even stronger versions. Right now, water molecules hold together the individual 10-nanometer-thick graphene flakes to create the micrometers-thick graphene paper. By using other chemicals as glues, the researchers could make ultrastrong paperlike materials with various properties. "The future is particularly bright because the system is very flexible ... The chemistry is almost infinite," Ruoff says.

Ruoff's idea was to "disassemble graphite into individual layers and reassemble them in a different way than they are in graphite." The goal was to find a way to glue the graphene platelets together while reassembling them, which would create a tough and flexible material.

Since it's hard to separate the graphene sheets in graphite, the researchers first used an acid to oxidize graphite and make graphite oxide. Then they put the graphite oxide in water. Individual graphene-oxide sheets easily separated in water.

When the researchers filtered the suspension, the graphene-oxide flakes settled down on the filter, randomly overlapping with each other. Water glued the flakes together; its hydrogen atoms bonded with the carbon atoms in adjacent flakes. The result was a dark-brown, thin, flexible graphene-oxide paper. By adjusting the concentration of graphite oxide in the water, the researchers changed the thickness of the paper, ranging from 1 to 100 micrometers.

Ruoff says that he can alter graphene's chemistry in other ways to change its electrical properties and make it an insulator, a conductor, or even a semiconductor.

That electrical versatility, combined with an ultrastrong material, has some observers excited. "They haven't used any tough glue between the [graphene platelets]," Geim says. "I expect very, very tough materials if a proper glue between graphene is used."

My previous article on graphene paper

Spinal cord injury therapy developed to prevent paralysis

Researchers at the Sloan-Kettering Institute for Cancer Research studied rats with crushed spinal cords.

The scientists found that treatment soon after injury, combining radiation therapy to destroy harmful cells and microsurgery to drain excess fluids, significantly helped the body repair the injured cord. The scientists, led by Nurit Kalderon, said their findings demonstrate that conventional clinical procedures hold promise for preventing paralysis due to spinal cord injuries.

More military laser projects

Al Fin points out that Boeing has been awarded a High-Energy Laser Technology Demonstrator project to put a battlefield laser onto a truck

Al Fin also has a summary of more military laser projects

The 100 kW JHPSSL lasers won’t be ready to deliver to the battlefield. The program aims to demonstrate technology that the armed services can adapt for their weapon platforms on the ground, in the air, or at sea. The high-energy laser and the beam-control system are “the two technology drivers” for weapon systems, says William Gnacek, HEL TD program manager at the Army Space and Missile Defense Command. Once those technologies are demonstrated, a ruggedized laser and beam-control system will be integrated with power generation, thermal management, and fire control and communications systems for use on a wheeled vehicle to be tested against rockets, artillery and mortars in 2013.

Military agencies are also looking at less mature technologies. Neice is working on a joint program to develop fiber lasers, which he says have the potential to match chemical-laser efficiency. Key technical issues are raising output of single-mode lasers and developing ways to combine their output into a high-quality beam. The Defense Advanced Research Projects Agency (DARPA, Arlington, VA) has launched a program called Architecture for Diode High Energy Laser Systems to develop diode lasers with efficiency greater than 60% and high-quality, low-divergence beams delivering 10 kW.

This past article shows that I believe the incremental improvement to military capabilities from lasers is less important than the revolutionary change of successfully deploying arrays of cheap, high efficiency lasers for launching vehicles into space

The scientific and technological advancement enabled by using lasers to understand and modify matter will have far more impact than crudely using lasers to blast things in battle. More powerful weapons will come from using lasers to help unlock things like nuclear fusion.

Intel makes silicon optical modulator with 40 Gigabits per second speed

Intel has fabricated the first modulator made from silicon that can encode data onto a beam of light at a rate of 40 billion bits per second, or gigabits. Modulators are key components in using lasers to send data down fiber-optic cable.

Such speeds -- roughly 40 times faster than the most sophisticated corporate data networks -- now require expensive materials, a factor that helps push the cost of existing 40-gigabit modulators into the thousands of dollars. Intel, which boasts the biggest revenue among companies that make silicon chips, wants to use that material to create much less-expensive communication components, an effort it calls "silicon photonics."

Exactly when the company may offer such components, and how much they will cost, hasn't been determined. But Mario Paniccia, who directs Intel's silicon photonics research, says the company is committed to commercializing silicon photonics technology in some fashion by the end of the decade. He adds that such laser components need to cost in the neighborhood of $5 each to be commercially viable.

Luxtera Inc., a start-up in Carlsbad, Calif., is planning to enter the market in the fourth quarter of 2007 with chips that include the equivalent of four lasers, each of them able to send 10 gigabits of data a second. Alex Dickinson, Luxtera's chief executive, said Intel's development is interesting from a scientific point of view. But he argues that Luxtera's approach can bring practical benefits sooner, for applications such as connecting together servers to create a supercomputer.

Luxtera's goal is to bring to market a complete optical link -- two transceivers and the fiber -- for the price of a copper interconnect ($200).

Today's optical links built with discrete components cost around $700 per 10 Gbps, versus $200 for the copper version. Copper is actually fine at 10 Gbps for up to 2 meters. But once you get beyond 10 Gbps or a 2 meter reach, you need more electronic componentry and more copper to maintain the communications. Copper interconnects tend to max out at around 50 meters or so. By contrast, a single optical fiber strand can carry data up to two kilometers. The other big downside to copper is its manageability. As communication distances increase, thicker copper cabling is required, increasing its weight and bending radius. Also, power consumption becomes an issue as data rates rise and communication distances increase.

The 4 by 8 millimeter chip will be housed in a standard optical transceiver form factor so that it can be plugged into existing communication hardware. Their CMOS chip is being fabbed by Freescale Semiconductor using a 130nm SOI process.
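The copper-versus-fiber tradeoff above boils down to rate and reach cutoffs. A toy decision rule, using the figures quoted in the article (the tier names and exact cutoffs are my simplifications, not Luxtera's):

```python
def pick_interconnect(gbps, meters):
    """Toy decision rule from the reach figures quoted above:
    plain copper is fine up to 10 Gbps and ~2 m; extra signal
    conditioning stretches copper to ~50 m; a single fiber strand
    reaches ~2 km. Cutoffs are simplified assumptions."""
    if gbps <= 10 and meters <= 2:
        return "plain copper"
    if gbps <= 10 and meters <= 50:
        return "copper with extra signal conditioning"
    if meters <= 2000:
        return "optical fiber"
    return "optical fiber with repeaters"

# Server-to-server inside a rack vs. across a data center
print(pick_interconnect(10, 2))    # plain copper
print(pick_interconnect(40, 10))   # optical fiber
```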

It seems likely the Luxtera chip will be successful and move to smaller lithography processes for lower price and faster speed.

Toyota begins Plug-in Hybrid tests in California and Japan

The NY Times reports that Toyota has started testing plug-in hybrid versions of the Prius

The Mercury News has extra information on its blog and in the main SJ Mercury newspaper

* Susan Shaheen, a UC-Berkeley researcher, said she’s still figuring out the methodology for the studies into how people react to PHEVs, but that she anticipates using travel diaries, logs, surveys and other tools. It’s likely, she said, that this means real people will get some seat time — “exactly who, I don’t know,” she said.

Her 2-year, $750,000 grant will be shared with Tim Lipman, who will study more technological issues including energy usage and impact upon the environment.

* Jaycie Chitwood, a Toyota senior strategic planner, told me that the company will have a few of the prototype plug-in hybrids, too, here in the U.S. in addition to those going to Berkeley and Irvine. In Japan, they’ll have eight others. The vehicles are being built in Japan, too.


Epigenetics refers to heritable traits (over rounds of cell division and sometimes transgenerationally) that do not involve changes to the underlying DNA sequence.

Epigenetics was featured on Nova Science Now

"Epigenetics is proving we have some responsibility for the integrity of our genome," Jirtle says. "Before, genes predetermined outcomes. Now everything we do—everything we eat or smoke—can affect our gene expression and that of future generations. Epigenetics introduces the concept of free will into our idea of genetics."

With no more than a change in diet, laboratory agouti mice (left) were prompted to give birth to young (right) that differed markedly in appearance and disease susceptibility.

As Discover magazine pointed out, the new science of epigenetics rewrites the rules of disease, heredity, and identity.

Back in 2000, Randy Jirtle, a professor of radiation oncology at Duke University, and his postdoctoral student Robert Waterland designed a groundbreaking genetic experiment that was simplicity itself. They started with pairs of fat yellow mice known to scientists as agouti mice, so called because they carry a particular gene—the agouti gene—that in addition to making the rodents ravenous and yellow renders them prone to cancer and diabetes. Jirtle and Waterland set about to see if they could change the unfortunate genetic legacy of these little creatures.

Typically, when agouti mice breed, most of the offspring are identical to the parents: just as yellow, fat as pincushions, and susceptible to life-shortening disease. The parent mice in Jirtle and Waterland's experiment, however, produced a majority of offspring that looked altogether different. These young mice were slender and mousy brown. Moreover, they did not display their parents' susceptibility to cancer and diabetes and lived to a spry old age. The effects of the agouti gene had been virtually erased.

Remarkably, the researchers effected this transformation without altering a single letter of the mouse's DNA. Their approach instead was radically straightforward—they changed the moms' diet. Starting just before conception, Jirtle and Waterland fed a test group of mother mice a diet rich in methyl donors, small chemical clusters that can attach to a gene and turn it off. These molecules are common in the environment and are found in many foods, including onions, garlic, beets, and in the food supplements often given to pregnant women. After being consumed by the mothers, the methyl donors worked their way into the developing embryos' chromosomes and onto the critical agouti gene. The mothers passed along the agouti gene to their children intact, but thanks to their methyl-rich pregnancy diet, they had added to the gene a chemical switch that dimmed the gene's deleterious effects.

Epigenetics along with RNA interference and RNA activation are providing tools for the control of genetic expression. Epigenetics has been used to put certain diseases in people into remission.

A step toward quantum computing with neutral atoms

Physicists at the Commerce Department’s National Institute of Standards and Technology have induced thousands of atoms trapped by laser beams to swap “spins” with partners simultaneously.

This "quantum square dance" may be useful in quantum computing with neutral atoms. Atoms are loaded into individual sites of a 3D grid of light made with laser beams. Initially all the atoms have the same "spin," as indicated by their consistent color. Then, a radio-frequency field (shown as semi-transparent planes) is applied to flip the spins of atoms in every other site, and the sites are paired up, with one atom of each pair spin up (or 1) and the other spin down (or 0), as indicated by the two different colors. Then, all pairs are merged, which causes the atom partners to swap spins repeatedly. These oscillations have the effect of periodically "entangling" the atom pairs, a quantum phenomenon that links their properties even if they are later physically separated. Illustration: Trey Porto/NIST

The swapping process is a way of creating logical connections among data, crucial in any computer. A logic operation is the equivalent of an “if/then” statement, such as: If two qubits have opposite states, then they should exchange values. The logical connections in quantum computers are created using entanglement, which in effect allows for multiple simultaneous, correlated possibilities.

The NIST experiment was performed with about 60,000 rubidium atoms in a Bose-Einstein condensate (BEC), a special state of matter in which all atoms are in the same quantum state. They were trapped within a three-dimensional grid of light formed by three pairs of infrared laser beams. The lasers were arranged to create two horizontal lattices overlapping like two mesh screens, one twice as fine as the other in one dimension. This created many pairs of energy “wells” for trapping atoms.

The scientists attempted to place a single atom in each well, with one atom spin up (or 1) and the other down (or 0). Then, they merged all double wells to force each pair of atoms into the same well, where they could interact with each other. When two such identical atoms are forced into the same physical location, quantum mechanics imposes a specific type of symmetry (only two of four seemingly possible combinations of quantum states are allowed). Due to this restriction, the merged atoms oscillate between the condition in which one atom is 1 and the other is 0, to the opposite condition. This behavior is unique to identical particles.

As they swap spins, the atoms pass in and out of entanglement. At the “half-swap” points the spin of each atom is uncertain and, if measured, might turn out to be either up or down. But whatever the result, a measurement on the other atom, equally uncertain before the measurement, would be sure to be the opposite. This entanglement is the key feature that enables quantum computation. According to Porto, the work reported in Nature is the first time that quantum mechanical symmetry (“exchange symmetry”) has been used to perform such an entangling operation with atoms.
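The exchange oscillation for a single pair of atoms can be sketched numerically (a toy illustration of mine, not NIST's model): evolving the two-spin state |01> under the exchange interaction swaps the spins, and at the half-swap point the reduced state of either atom is maximally mixed, the signature of maximal entanglement.

```python
import numpy as np

# Two-qubit basis ordering: |00>, |01>, |10>, |11>
SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]], dtype=complex)

def exchange_unitary(t):
    """Time evolution under the exchange interaction, in units where
    t = 1 completes one full spin swap (global phase ignored)."""
    vals, vecs = np.linalg.eigh(SWAP)
    return vecs @ np.diag(np.exp(-1j * np.pi * t / 2 * vals)) @ vecs.conj().T

psi0 = np.array([0, 1, 0, 0], dtype=complex)  # one atom up, one down: |01>

# Full swap: the spins have exchanged, the state is |10> (up to phase).
full = exchange_unitary(1.0) @ psi0
print(np.round(np.abs(full)**2, 6))  # -> [0. 0. 1. 0.]

# Half swap: equal amplitude on |01> and |10>, a maximally entangled
# state, as seen from the reduced density matrix of atom 1.
half = exchange_unitary(0.5) @ psi0
rho = np.outer(half, half.conj()).reshape(2, 2, 2, 2)
rho1 = np.trace(rho, axis1=1, axis2=3)  # partial trace over atom 2
print(np.round(np.linalg.eigvalsh(rho1).real, 3))  # -> [0.5 0.5]
```

Repeated application of `exchange_unitary` at intermediate times reproduces the periodic entangling and disentangling described above.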

The current set-up is not directly scalable to an arbitrary computer architecture, Porto says, since it performs the same spin-swap in parallel for all pairs of atoms. Researchers are developing ways to address and manipulate any pair of atoms in the lattice, which should allow for scalable architectures. Furthermore, not all atoms participated in the swap process, primarily because of imperfect initial loading of the atoms in the lattice. (Some double-wells contained only one atom and had no partner to exchange with.) The scientists estimate that the swap worked for at least 65 percent of the double wells.

Graphene oxide paper created

Researchers at Northwestern University have fabricated graphene oxide paper, a remarkably stiff, strong yet lightweight material that should find use in a wide variety of applications.

"The mechanical, thermal, optical and electrical properties of graphene are exceptional," says Ruoff. "For example, the stiffness and strength of these graphene-like sheets should be superior to all other materials, with the possible exception of diamond."

To form the graphene oxide paper, the group oxidized graphite to create graphite oxide, which falls apart in water to yield well-dispersed graphene oxide sheets. After filtering the water, the team was able to fabricate pieces of graphene oxide 'paper' more than five inches in diameter and with thicknesses from about one to 100 microns, in which the individual micron-sized graphene oxide sheets are stacked on top of each other.

"I have little doubt that very large-area sheets of this paper-material could be made in the future," Ruoff notes.

In addition to their superior mechanical properties as individual sheets, the graphene oxide layers stack well, which could be key to the development of other materials.

"You can imagine that these microscale sheets may be stacked together and chemically linked, allowing us to further optimize the mechanical properties of the resulting macroscale object," Ruoff says. "This combination of excellent mechanical properties and chemical tunability should make graphene-based paper an exciting material."

Ruoff sees a wide variety of application for graphene oxide paper, including membranes with controlled permeability, and for batteries or supercapacitors for energy applications. Graphene oxide paper could also be infused to create hybrid materials containing polymers, ceramics or metals, where such composites would perform much better than existing materials as components in, for example, airplanes, cars, buildings and sporting goods products.

LA Times editorial supports deadly coal: they are wrong

The LA Times has an editorial in support of coal and natural gas power and against nuclear power

A Musing Environment points out that California has legislation that prevents the use of new coal power for electricity

Information on California's energy usage

The coal-generated electricity that California uses is produced out of state. About 40% of the air pollution still flows to adjacent states and countries.

The blog We Support Lee points out more problems with the LA Times position

I have written a lot about how coal pollution kills 30,000+ people each year in the USA and costs $160 billion in health and business costs each year

The deaths and costs from coal are not addressed in the LA Times editorial.

Particulates from coal, nitrogen oxides, mercury and other pollutants kill over 30,000 Americans every year (half of the deaths at Hiroshima, over 30 times annual US Iraq war deaths)

These figures have a lot of support in the medical journals.

LA has some of the worst air pollution in the United States and some parts have 15-40% greater risk of death because of air pollution. 1200 people die each year from air pollution from LA's ports.

The study, which will be published in the November issue of Epidemiology, found the risk of death rose by 11 to 17 percent from the cleanest parts of Los Angeles to the most polluted areas of Riverside and San Bernardino counties to the east.

The risk of fatal heart disease rose by between 25 percent and 39 percent as the concentration of fine particles in the neighborhood's air rose by a measure of 10 micrograms per cubic meter of air, the study showed.

Data from monitoring sites within Los Angeles show that the concentration of such airborne particles -- tiny specks of solids and droplets of acids and other chemicals -- rises by almost 20 micrograms per cubic meter as commuters head east from L.A.'s wealthier, westside neighborhoods.

LA has more asthma because of air pollution:
Funded by the National Institute of Environmental Health Sciences, a federal agency, the study found that children living in homes with a higher concentration of nitrogen dioxide -- a pollutant found in car exhaust -- had an 83 percent higher chance of developing asthma.

The study, based on an analysis of data on almost 23,000 people tracked by the American Cancer Society, also found that the risk of death from diabetes almost doubled in the more polluted areas of Southern California.

State of California agencies estimate that 1,200 Southern Californians die every year from the soot and smog coming from those ports.

The American Heart Association's position on air pollution:
People living in the most polluted U.S. cities could lose between 1.8 and 3.1 years because of exposure to chronic air pollution.

Nuclear "waste" is 95% unburned nuclear fuel. There are nuclear reactors (molten salt) that were built in the 1960s and 1970s which can generate electricity from the "waste".

Nuclear waste is contained in vats, pools or cans. Coal pollution including tons of Uranium and thorium goes into the air, food and lungs.

Coal makes our traffic worse. Coal costs billions in health and business costs (acid rain property damage; airline delays from smog-reduced visibility)

Land usage problem for renewable energy

The blog We Support Lee describes articles by the New Scientist and Science Daily which describe how renewable energy has a land usage problem when scaled up.

Renewable does not mean green. That is the claim of Jesse Ausubel of the Rockefeller University in New York. Writing in Inderscience's International Journal of Nuclear Governance, Economy and Ecology, Ausubel explains that building enough wind farms, damming enough rivers, and growing enough biomass to meet global energy demands will wreck the environment.

Futurepundit points out that renewables do not have to be that bad for land usage. I agree that renewables should be a fairly big part of the solution, up to 30-50%. We just have to not be stupid and build willy-nilly without thinking about mitigating the downsides. Nuclear energy should also be a big part of the mix. What we need to eliminate is coal power and then oil usage.

Nanomechanical computer project

Nanowerk reports on researchers who are working to build nanomechanical computers

The 9 page paper "A nanomechanical computer—exploring new avenues of computing" by Robert H Blick, Hua Qin, Hyun-Seok Kim and Robert Marsland is here

It has to be clearly stated that current operating speeds of nano-electromechanical single electron transistors (NEMSETs) are of the order of 1 GHz, which is not competitive with standard complementary metal oxide semiconductors (CMOS).

As we have found in recent measurements self-excitation can be exploited to generate mechanical oscillations without any ac excitation. Hence, dc voltages are sufficient to operate the NMC. Basically, a dc voltage creates an electric field to support mechanical oscillations of the nanopillars. A classical example is straightforward to construct. It has to be noted that onset of the mechanical oscillations is induced by a thermal fluctuation, which is found to be enhanced, if the electrical field is inhomogeneous.

The current work, though described as nanomechanical, will still use DC current. However, a mechanical piece, the pillar, controls the flow of current.

We propose a fully mechanical computer based on nanoelectromechanical elements. Our aim is to combine this classical approach with modern nanotechnology to build a nanomechanical computer (NMC) based on nanomechanical transistors. The main motivation behind constructing such a computer is threefold:
(i) mechanical elements are more robust to electromagnetic shocks than current dynamic random access memory (DRAM) based purely on complementary metal oxide semiconductor (CMOS) technology,
(ii) the power dissipated can be orders of magnitude below CMOS and
(iii) the operating temperature of such an NMC can be an order of magnitude above that of conventional CMOS.

They have designed all of the different types of circuits that are needed: logic elements and memory.

Eric Drexler's nanorod logic computer concepts:

Drexler's work on nanomechanical computer concepts (Chapter 12 of Nanosystems) is not mentioned. They do discuss the potential for reversible computing implementation. A summary of Nanosystems is here

There was an analysis and simulation of the Drexler Nanocomputer architecture by Bryan Wagner

The Drexler idea was based on nanoscale rod logic

Mechanism for two nanocomputer gates, initial position. One control rod with two gate knobs is seen laterally; two more rods with knobs are seen end on. Each rod with associated knobs is a single molecule

Here is an array of rod logic

NOTE: Drexler chose to model this cruder system to show that even simple and easy to define mechanical processes could have interesting performance at the nanoscale
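The blocking interaction of rod logic can be sketched as a toy boolean model (my own illustration of the principle, not Drexler's actual gate design): an output rod advances only if no input rod's knob sits in its path, which directly yields a NOR gate, and NOR alone is universal.

```python
def rod_gate(inputs):
    """Toy interlock gate: the output rod advances (returns 1) only if
    no input rod's knob blocks its path. With knobs positioned so that
    an input of 1 blocks the output rod, the gate computes NOR."""
    blocked = any(i == 1 for i in inputs)
    return 0 if blocked else 1

# Full NOR truth table over two input rods
truth_table = [rod_gate([a, b]) for a in (0, 1) for b in (0, 1)]
print(truth_table)  # -> [1, 0, 0, 0]

# A single-input rod gives NOT, so any circuit can be composed.
print(rod_gate([0]), rod_gate([1]))  # -> 1 0
```

Since NOR is functionally complete, arrays of such interlocked rods can in principle implement arbitrary logic, which is the point of the rod-logic arrays pictured above.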

Chapter 10 of Robert Freitas's Nanomedicine book describes nanomechanical (Section 10.2.1) and nanoelectronic (Section 10.2.2) computers, biocomputers (Section 10.2.3), and briefly examines the ultimate limits to computation including reversible and quantum computing (Section 10.2.4).

Atomic Layer Deposition

Atomic layer deposition (ALD) is a thin-film growth technique that offers the unique capability to coat complex, three-dimensional objects with precisely fitted layers.

The three images illustrate how a combination of anodized aluminum oxide (AAO) and atomic layer deposition (ALD) provides precisely controlled, ultra-uniform porous support for new and well-defined catalysts. Credit: ANL

The scientists expose an object to a sequence of reactive gas pulses to apply a film coating over the object's surface. The chemical reactions between the gases and the surface naturally terminate after the completion of a "monolayer" exactly one molecule thick. ALD can deposit a variety of materials, including oxides, nitrides, sulfides and metals.
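Because each pulse sequence self-terminates after one monolayer, film thickness is simply linear in the number of cycles. A quick sketch (the growth-per-cycle figure is an assumed value roughly typical of Al2O3 ALD, not a number from the article):

```python
def ald_thickness_nm(cycles, growth_per_cycle_nm=0.11):
    """Thickness after n self-limiting ALD cycles. Each reactive gas
    pulse sequence adds at most one monolayer, so growth is linear in
    cycle count. 0.11 nm/cycle is an assumed, Al2O3-like figure."""
    return cycles * growth_per_cycle_nm

# Roughly 91 cycles for a ~10 nm conformal coating at the assumed rate
print(ald_thickness_nm(91))
```

This digital, cycle-counted control over thickness is what makes ALD attractive for coating high-aspect-ratio structures like the AAO pores described above.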

Potential applications include more efficient and less costly solar cells, solid-state lighting and industrial catalysts, improved superconductors and separation membranes.

July 24, 2007

China Yuan may have a one-day 3.5% revaluation

China may strengthen the yuan by as much as 3.5 percent in a single day to cool the economy and appease U.S. lawmakers, said Glenn Maguire, chief Asia economist at Societe Generale SA.
Such a revaluation would take the yuan to about 7.3 to one US dollar.

This would be in line with my projection that China's economy will pass the USA on an exchange-rate basis by 2020

Current trends indicate that the currency is appreciating faster than I had projected, that China's economy is growing faster than I projected, and that the United States economy is growing slower than I had projected. If those trends continue, China's economy could conceivably pass the United States as early as 2016.
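The projection is just a compound-growth calculation. All the rates and starting GDP figures below are my illustrative assumptions (approximate 2007 values and round-number growth rates), not figures from the article; with these particular assumptions the crossover falls within the 2016-2020 window discussed above.

```python
def crossover_year(us_gdp, china_gdp, us_growth, china_growth,
                   appreciation, start_year=2007):
    """First year China's dollar-denominated GDP exceeds US GDP,
    holding growth and yuan-appreciation rates constant. Appreciation
    compounds with real growth when GDP is measured in dollars."""
    year = start_year
    while china_gdp < us_gdp:
        us_gdp *= 1 + us_growth
        china_gdp *= (1 + china_growth) * (1 + appreciation)
        year += 1
    return year

# Assumed: ~$13.8T US vs ~$3.3T China in 2007, 2.5% US growth,
# 10% China growth, 5% annual yuan appreciation.
print(crossover_year(13.8e12, 3.3e12, 0.025, 0.10, 0.05))
```

Doubling the assumed appreciation rate, as some of the analysts quoted below suggest could happen, pulls the crossover date several years earlier.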

``The prospect of a one-off revaluation, which the market hasn't factored in, is quite possible,'' Hong Kong-based Maguire said yesterday. ``They may move 2.5 to 3.5 percent, rather than persisting with this gradualist appreciation.''

A stronger yuan may slow the economy by making goods sent overseas more expensive, helping reduce China's record trade surplus. The appreciation also may deflect criticism from U.S. politicians, who are proposing new duties on exports from countries that use their currencies to put American companies at a disadvantage.

The yuan rose as high as 7.5615 against the dollar, the strongest since China started managing it against a basket of currencies of its trading partners on July 21, 2005. It was at 7.5638 as of 1:37 p.m. in Shanghai. The yuan is allowed to move up to 0.5 percent from a daily rate fixed by the central bank.

A revaluation would be a ``direct appeasement to overheating Washington politicians,'' Maguire wrote in a research report. He sees the yuan rising 5 percent by year-end, finishing at 7.20, the most bullish of 27 economists surveyed by Bloomberg News.

Irene Cheung, an economist at ABN Amro Bank NV, predicts gains of 2.2 percent to 7.40, saying China will use other tools to cool the economy.

Maguire says monetary tools used by China haven't worked, so they'll turn to a stronger yuan.

There are others who think there will be a faster pace of revaluation

"It's a bit out of control," said Chen Zhao, global investment strategist at BCA in Montreal, who arrived back from China yesterday. "I think they're going to raise rates and let the currency go -- 8% or 10% a year. It has been 5% a year but they'll probably double the pace."

Charles Dumas, director of Lombard Street Research in London, said, however, that the country needs a 100% revaluation, taking the yuan to under four per U.S. dollar, to fend off gross overheating, soaring inflation and a lurch towards trade protectionism led by the U.S. Congress and France.

China will revalue as fast as they can while still enabling enough growth to keep unemployment under control.

Liftport the space elevator company to keep operating

More bendable optical fiber will broaden ultrabroadband deployment

New production process for single walled carbon nanotubes

Kenji Hata and Tatsuki Hiraoka of the Nano-Carbon Materials Team, the Research Center for Advanced Carbon Materials of the National Institute of Advanced Industrial Science and Technology (AIST) and Zeon Corporation have jointly developed a technology to synthesize a large amount of single-walled carbon nanotubes directly on large area metal substrates for the first time.

Certain aspects of the process are 100 times improved over previous processes.

The research team designed and built a trial synthesis furnace that can utilize the newly developed technology jointly with Zeon Corp., and successfully synthesized single-walled carbon nanotubes with a uniform structure spanning an A4 size foil substrate. This large area synthesis is a 100-fold jump from conventional levels, and production was scaled in units of grams.

Single-walled carbon nanotubes synthesized by this newly-developed technology grow upward vertically from the metal foil, and it takes only 10 minutes to form a structure of 1 mm height. The single-walled carbon nanotubes exhibit excellent properties, including the world's highest level of purity and greatest specific surface area and length, similar to those synthesized on silicon substrates, and they are considered to be promising for various applications such as super-capacitors and actuators. The newly developed technology reduces the substrate cost to one-hundredth of the existing cost.

A previous article on carbon nanotube production

First commercial helium ion microscope

Carl Zeiss SMT said during SEMICON West in July 2007 that it had shipped the world's first 'ORION' helium ion microscope, developed by ALIS, a company it acquired in 2006, to the National Institute of Standards and Technology (NIST) in Gaithersburg, MD.

This is part of a project that calls for a new microscope for direct observation and analysis of individual nanostructures at an unprecedented resolution of 0.5 Angstrom -- approximately one-third the size of a carbon atom -- a key dimension for atomic level research.

Image scanned by a helium ion microscope

According to Carl Zeiss SMT, this new breed of microscope is expected to provide images of unrivalled ultra-high resolution surface and material contrast, unachievable with state-of-the-art technologies of today.

The microscope uses a beam of Helium ions, which can be focused into a smaller probe size and reveal a much stronger sample interaction compared to electrons typically used in scanning electron microscopes (SEM), to generate the signals to be measured and imaged.

Scanning helium ion microscope details

Solar cell efficiency record of 42.8%

Using a novel technology that adds multiple innovations to a very high-performance crystalline silicon solar cell platform, a consortium led by the University of Delaware has achieved a record-breaking combined solar cell efficiency of 42.8 percent from sunlight at standard terrestrial conditions. This beats Spectrolab's previous record of 40.7 percent.


UD researchers Christiana Honsberg and Allen Barnett

The consortium’s goal is to create solar cells that operate at 50 percent in production, Barnett said. With the fresh funding and cooperative efforts of the DuPont-UD consortium, he said it is expected that new high-efficiency solar cells could be in production by 2010.

The highly efficient VHESC solar cell uses a novel lateral optical concentrating system that splits solar light into three different energy bins of high, medium and low, and directs them onto cells of various light sensitive materials to cover the solar spectrum. The system delivers variable concentrations to the different solar cell elements. The concentrator is stationary with a wide acceptance angle optical system that captures large amounts of light and eliminates the need for complicated tracking devices.
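The spectrum-splitting idea can be illustrated with a weighted sum. Each cell material only needs to be efficient over its own slice of the spectrum, so the combined efficiency beats any single-junction cell. The bin power fractions and per-bin efficiencies below are my illustrative guesses chosen to land near the reported figure, not the consortium's actual data:

```python
# (power fraction of the split spectrum, cell efficiency for that bin)
# -- illustrative assumed numbers, not VHESC measurements
bins = {
    "high energy":   (0.30, 0.55),
    "medium energy": (0.45, 0.42),
    "low energy":    (0.25, 0.30),
}

# Overall efficiency is the power-weighted average across the bins.
combined = sum(frac * eff for frac, eff in bins.values())
print(f"{combined:.1%}")
```

Raising any per-bin efficiency, or capturing a bin that a single-junction cell would waste as heat, directly lifts the weighted total, which is the path toward the 50 percent goal.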

The VHESC would have immediate application in the high-technology military, which increasingly relies upon a variety of electronics for individual soldiers and the equipment that supports them. As well, it is hoped the solar cells will have a large number of commercial applications.

Modern solar cell systems rely on the concentration of the sun’s rays, a concept similar to youngsters using magnifying glasses to set scraps of paper on fire. Honsberg said the previous best of 40.7 percent efficiency was achieved with a high concentration device that requires sophisticated tracking optics and features a concentrating lens the size of a table and more than 30 centimeters, or about 1 foot, thick.

The UD consortium’s devices are potentially far thinner at less than 1 centimeter. “This is a major step toward our goal of 50 percent efficiency,” Barnett said. “The percentage is a record under any circumstance, but it’s particularly noteworthy because it’s at low concentration, approximately 20 times magnification. The low profile and lack of moving parts translates into portability, which means these devices easily could go on a laptop computer or a rooftop.”

Honsberg said the advance of 2 percentage points is noteworthy in a field where gains of 0.2 percent are the norm and gains of 1 percent are seen as significant breakthroughs.


Barnett's main page at the University of Delaware

The researchers provide a 6 page solar cell FAQ sheet

The US photovoltaics industry has a 16 page roadmap with a goal of providing half of all new electricity generation by 2025. They have a cost target of 3.8 cents per kwh and 200 GW of installed power

A past article on nuclear power, wind and coal

A prior article that compares the cost of all energy sources. Nuclear, hydroelectric, wind and coal power would still be less expensive than the solar industry target of 3.8 cents per kwh, but solar would be competitive in many situations.

Here is the Gen III Molecular Beam Epitaxy (MBE) system used by the team for crystal growth, which offers the ability to control material composition down to the atomic layer level.

Quantum Cascade Lasers created

Several types of lasers exist today that can emit at desired infrared wavelengths, but none of them meets the requirements for some applications because they are either too expensive, not mass-producible, too fragile or require power-hungry and inefficient cryogenic refrigeration. Applications for suitably portable laser systems include the use of infrared countermeasures to protect aircraft from heat-seeking missiles and highly sensitive chemical detectors for reliable early detection of trace explosives and other toxins at a safe distance for personnel.

A new type of semiconductor-based laser, the Quantum Cascade Laser (QCL), is compact and suitable for mass production. Manijeh Razeghi, Walter P. Murphy Professor of Electrical Engineering and Computer Science at the McCormick School of Engineering and Applied Science, has recently made great strides in laser design, material growth and laser fabrication that have greatly increased the output power and wall-plug efficiency (the ability to change electrical power into light) of QCLs.

Demonstrations have been made of individual QCL lasers, 300 of which can easily fit on a penny, emitting at wavelengths of 4.5 microns, capable of producing over 700 milli-Watts of continuous output power at room temperature and more than one Watt of output power at lower temperatures. The lasers are efficient in converting electricity to light, having a 10 percent wall-plug efficiency at room temperature and more than 18 percent wall-plug efficiency at lower temperatures.
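Wall-plug efficiency directly sets how much electrical power a QCL draws and how much waste heat must be managed, which is why the efficiency gains matter for portable systems. A quick sketch using the figures quoted above:

```python
def power_budget(optical_out_w, wall_plug_eff):
    """Electrical input and waste heat implied by a wall-plug
    efficiency, defined as optical output over electrical input."""
    electrical_in = optical_out_w / wall_plug_eff
    heat = electrical_in - optical_out_w
    return electrical_in, heat

# 700 mW output at 10% room-temperature wall-plug efficiency:
# about 7 W of electrical input, ~6.3 W of which becomes heat.
print(power_budget(0.7, 0.10))

# The 18% low-temperature figure cuts the draw to under 4 W.
print(power_budget(0.7, 0.18))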

100 megawatt pulses of ultra-fast light make non-linear effects

Researchers at the U.S. Department of Energy's Brookhaven National Laboratory have generated extremely short pulses of light that are the strongest of their type ever produced and could prove invaluable in probing the ultra-fast motion of atoms and electrons. The scientists also made the first observations of a phenomenon called cross-phase modulation with this high-intensity light - a characteristic that could be used in numerous new light source technologies.

The light pulses used were in the terahertz (THz) range of the broad electromagnetic spectrum, found between the microwave and infrared range. Scientists send tight bunches of electrons at nearly the speed of light through a magnetic field to produce THz radiation at a trillion cycles per second - the terahertz frequency that gives the light its name and that makes such pulses especially valuable for investigating biological molecules and for imaging applications ranging from tumor detection to homeland security.

By slamming an electron beam from an accelerator into an aluminum mirror, the researchers produced 100 microjoule (100 megawatt) single-cycle pulses - the highest energy ever achieved to date with THz radiation. For comparison, 100 megawatts is about the output of a utility company's electrical generator.
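The 100 microjoule to 100 megawatt conversion follows from the pulse being single-cycle: one cycle at roughly 1 THz lasts about a picosecond, and peak power is energy over duration. A quick sketch (treating duration as exactly one optical period is my simplification):

```python
def peak_power_w(pulse_energy_j, frequency_hz, n_cycles=1):
    """Approximate peak power: pulse energy divided by pulse duration,
    where an n-cycle pulse at frequency f lasts roughly n / f seconds."""
    duration_s = n_cycles / frequency_hz
    return pulse_energy_j / duration_s

# 100 microjoules in one cycle of 1 THz light (a ~1 picosecond pulse)
print(peak_power_w(100e-6, 1e12))  # about 1e8 W, i.e. 100 megawatts
```

The same energy spread over a long multi-cycle pulse would have a far lower peak power, which is why single-cycle pulses are the ones strong enough to drive nonlinear effects.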

Using this strong light, researchers can "kick" molecular processes such as catalysis or electronic switching (important for developing data storage media) into action and watch their mechanisms on a very short timescale.

The team also found something surprising: the intensity of their THz pulses is so great that they introduce so-called "nonlinear optical effects," specifically, a phenomenon known as cross-phase modulation.

"When you pull on a spring, if you pull twice as hard, it stretches twice as much," said NSLS researcher Larry Carr. "But there's a limit where if you pull twice as hard, the spring doesn't move anymore. That's when it's called nonlinear. The same thing happens in materials. You let these short pulses pass through a material, and they stress it and pull some of the charges apart so they don't act in a linear manner."

As a result, the researchers can manipulate both the ultra-fast THz pulses and the material they interact with. Some of the simplest examples include changing the color of the light or turning the material into a focusing lens.

This is the first time cross-phase modulation has been observed in single-cycle THz pulses. Learning how to control this characteristic could lead to even more light source technologies.

I had a previous article about extreme temperature and light being used to control the quantum world of matter.

California nuclear initiative

There is an initiative in California to enable a vote to repeal the ban on new nuclear plants. In order to reach CO2 targets, California needs 4-5 new reactors and must increase renewable power from 8% to 20% of energy generated.

Current California energy situation and three scenarios

Go to this link to sign up and support the initiative.

July 23, 2007

Resveratrol and SIRT1 gene can prevent neurodegenerative disorders like Alzheimers

Researchers demonstrated that activating the SIRT1 gene and injecting resveratrol, both of which have previously been associated with life-span extension in lower organisms, can also prevent cognitive problems in mice.

Electrical guideways, road trains and platooning cars

The DARPA urban driving challenge is helping to enable automated driving which can be part of a larger vision of electrical guiderails and platooned cars which can enable large efficiency gains.

Here is a powerpoint that lays out the case for electrical guideways to increase car and truck efficiency by 3 to 6 times. The cost would be comparable to building the national highway system.

BTW: dual mode means the vehicles can drive by themselves on regular roads when they choose not to attach to the guideway, so individuals will have the choice of where to drive. However, by comparison, not driving on the guideway would be like driving on a very bad dirt road: guideway driving would be faster, safer and more efficient than regular driving.


A dual mode car looks like this.

50-80% drag reduction. This can also be achieved with robotically driven vehicles or advanced adaptive cruise control.

A dual mode vehicle can have train-like contact with the guideway, which enables 3 to 6 times the rolling efficiency.

The rolling efficiency gain comes from the hard wheels.

The overall running efficiency combines the aerodynamic improvement and the rolling improvement.

The total improvement also includes the propulsion improvement.
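As a rough sketch of how the aerodynamic and rolling gains could combine, assuming (my assumption, not from the powerpoint) that highway energy use splits about 50/50 between aerodynamic drag and rolling resistance:

```python
# Hypothetical combination of the claimed gains. The 50/50 split between
# aerodynamic drag and rolling resistance at highway speed is an assumption.
aero_share, rolling_share = 0.5, 0.5
drag_reduction = 0.65   # midpoint of the claimed 50-80% drag reduction
rolling_gain = 4.5      # midpoint of the claimed 3-6x rolling efficiency

# Fraction of original energy still needed, then the overall multiplier.
new_energy = aero_share * (1 - drag_reduction) + rolling_share / rolling_gain
overall_gain = 1 / new_energy
print(f"Overall running-efficiency multiplier: ~{overall_gain:.1f}x")  # ~3.5x
```

Propulsion improvements (electric drive from the guideway) would multiply on top of this, which is how the totals claimed in the powerpoint could be reached.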

Ten times more cars per guideway. Some of this benefit can be achieved with car and truck platooning using advanced cruise control.

About 100 miles of guideway would be needed per million people.

The cost to build the guideways would be about one trillion dollars for the USA.
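Putting those two ratios together, and assuming a 2007 US population of roughly 300 million (my assumption; the document gives only the ratios), the implied scale is about 30,000 miles of guideway at around $33 million per mile:

```python
# Implied guideway scale and unit cost. The ~300 million US population
# figure is an assumption; the document only gives the per-capita ratio.
us_population_millions = 300
miles_per_million_people = 100
total_cost_usd = 1e12

guideway_miles = us_population_millions * miles_per_million_people
cost_per_mile = total_cost_usd / guideway_miles
print(f"{guideway_miles:,} miles at ~${cost_per_mile / 1e6:.0f} million per mile")
```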

However, the savings would be:

$140 billion/year in drive time (28 billion hours per year of drive time @ $5/hr)

$125 billion/year from reduced traffic delays

$100 billion/year from traffic accident reduction (against a $230 billion annual cost of car crashes)

$100 billion/year from lower military and homeland security costs

$50 billion/year from reduced car maintenance

$150 billion/year from increased longevity of cars

$70 billion/year from less truck driving, less highway maintenance, more car sharing, and reductions in other transportation subsidies

A total of more than $500 billion/year in savings would pay for the system in about 2 years.
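Summing the line items above actually gives $735 billion/year; the "$500 billion" figure is conservative, and even at that rate the $1 trillion build cost pays back in 2 years:

```python
# Summing the listed annual savings (billions of dollars per year).
savings = {
    "drive time": 140,
    "traffic delays": 125,
    "accident reduction": 100,
    "military/homeland security": 100,
    "car maintenance": 50,
    "car longevity": 150,
    "trucking, road maintenance, sharing, subsidies": 70,
}
total = sum(savings.values())   # 735 billion/year as listed
build_cost = 1000               # ~$1 trillion, in billions

# Payback at the article's conservative $500 billion/year figure.
print(f"Listed total: ${total}B/yr; payback at $500B/yr: {build_cost / 500:.0f} years")
```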

It would take a phased deployment over about 40 years to realize a complete deployment of electrical guideways and dual mode vehicles.

If nanofactories were developed then increased production capabilities could accelerate the deployment.

The author of the electric guideway document has a website.

Revisions of the long range transportation vision

An examination of road trains

A dual mode bus/truck. Notice the guiderail connection in the bottom middle.

Wikipedia has an entry for the future of the car

An article on road trains

An online powerpoint presentation on road trains

A wikipedia entry on autonomous cruise control

Startup company Ceravision has 50% efficient microwave light bulb

Startup company Ceravision Ltd. (Milton Keynes, England) has said it has invented a 50% efficient microwave-powered light bulb that is more efficient than filament (5% efficient) or fluorescent (15% efficient) lighting and that has a long, stable lamp life. The company said it has prototypes available for evaluation by lamp and electronics manufacturers.


Prototype microwave light

The Continuum 2.4 system comprises a microwave source and power amplifier, a microwave interface unit and a low-loss dielectric resonator with an interior void where noble gas is excited to produce light.

One of the breakthroughs claimed by Ceravision is the ability to prevent high levels of microwave power from being reflected back to the source and damaging it at switch-on. The microwave interface unit ensures that less than 0.5 percent of incident microwave power is reflected back, the company said.

Instead the output is launched via a metal antenna into a metal-coated low-loss dielectric resonator. The mechanical dimensions of the resonator determine the ultimate performance of the lamp system and where the microwave energy will be focused. At the focal point the resonator has a cavity into which the electrode-less "burner" is inserted.

Ceravision homepage

Article at the Economist on the Ceravision microwave light

Because the lamp has no filament, the scientists who developed it think it will last for thousands of hours of use—in other words, decades. Moreover the light it generates comes from what is almost a single point, which means that the bulbs can be used in projectors and televisions. Because of this, the light is much more directional and the lamp could thus prove more efficient than bulbs that scatter light in all directions. Its long life would make the new light ideal for places where the architecture makes changing lightbulbs a complicated and expensive job. Its small size makes it comparable to light-emitting diodes but the new lamp generates much brighter light than do those semiconductor devices.

Another environmental advantage of the system is that it does not have to use mercury. The metal is highly toxic and is found in most of the bulbs used today, including the energy-saving bulb, fluorescent tubes and the high-pressure bulbs used in projectors. Its developers reckon it should be cheap to make.

With lighting accounting for some 20% of electricity use worldwide, switching to a more efficient system could save not only energy but also on emissions of carbon.

Switching from predominantly 5% efficient lighting to 50+% efficient lighting would save 17% of electricity used worldwide.
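Under the simplifying assumption that all of today's lighting is 5% efficient, the saving works out to about 18%; the article's 17% figure presumably allows for some already efficient lighting in the mix:

```python
# Fraction of worldwide electricity saved by upgrading lighting efficiency.
# Assumes all current lighting is 5% efficient (a simplification).
lighting_share = 0.20       # lighting's share of world electricity use
eff_old, eff_new = 0.05, 0.50

# The same light output needs only eff_old/eff_new as much electricity.
saving = lighting_share * (1 - eff_old / eff_new)
print(f"Electricity saved: {saving:.0%}")   # prints: Electricity saved: 18%
```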

A past article on a 100 lumen LED light that is available in volume quantities

An article about a quantum lighting device

OLED Display and Lighting Markets to Expand to $10.9 Billion by 2012. Other projections are for $2.9 billion for the display market.

OLED TV and computer display market of over $500 million in 2007

The LED market reached $4.2 billion in 2006 and is set to emerge from its current state of low growth, according to Strategies Unlimited. Emerging applications including illumination will drive the market towards $9 billion by 2011.

A comparison of LED and compact fluorescents

Reviewing conservation difficulties (in spite of problems we still must try):
In 2005, U.S. consumers spent about $1 billion to buy about 2 billion lightbulbs, or 5.5 million every day. Just 5% of them, 100 million, were compact fluorescents.
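The quoted figures are easy to check:

```python
# Checking the US lightbulb figures quoted above.
bulbs_per_year = 2e9
cfl_sales = 100e6

bulbs_per_day = bulbs_per_year / 365      # ~5.5 million per day
cfl_share = cfl_sales / bulbs_per_year    # 5% of bulbs sold
print(f"{bulbs_per_day / 1e6:.1f} million bulbs/day; CFL share {cfl_share:.0%}")
```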

The Walmart promotion and the Al Gore push are trying to raise that to 200 million, or 10% of bulbs, in 2007. Cheap 40-watt-equivalent bulbs ($1, and I have even seen 25-cent promotions) are at California supermarket checkouts, but they are still not moving that well. There is not much sales difference between the lowest-priced bulbs and paying only your own postage for a free one.

Walmart sold 40 million of the energy efficient bulbs, compared with about 350 million incandescent bulbs. That was from August 2005 to August 2006, not in 2005. Walmart took a series of steps to promote the energy efficient bulbs.

In 2005, Title 24 was introduced to force more efficient lighting when Californians remodel their homes with a permit. But when people remodel and flip homes, most change back to incandescent because the house looks better for resale, and they get a higher price because the house shows better. When I speak to contractors and real estate agents, they say this circumvention of Title 24 is common practice.

In 2005, only 28% of Americans said they "plan to install measures to conserve energy at home before this winter," according to a survey taken at the time by the National Oilheat Research Alliance.
