August 09, 2008

Finally Diamond Mechanosynthesis Viability Experiments funded for $3.1 million

Finally, experiments have been funded to test the viability of diamond mechanosynthesis as described in detail by Robert Freitas and Ralph Merkle. This is a major step toward achieving the long-held vision of molecular nanotechnology as envisioned by Eric Drexler.

UPDATE: The cost and timeline estimates below are based on a 2007 interview with Robert Freitas.

Based on the computational chemistry work, their latest estimates suggest that an ideal research effort paced to make optimum use of available computational, experimental, and human resources would probably run at a $1-5M/yr level for the first 5 years of the program, ramp up to $20-50M/yr for the next 6 years, then finish off at a ~$100M/yr rate culminating in a simple working desktop nanofactory appliance in year 16 of a ~$900M effort.
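
As a rough cross-check of that budget (a sketch: the phase lengths and yearly rates are from the quoted estimate; reading the final ~$100M/yr phase as years 12-16 is my assumption):

```python
# Rough cumulative cost of the proposed 16-year program.
# (years, low $/yr, high $/yr) per phase; ranges from the Freitas estimate.
phases = [
    (5, 1e6, 5e6),      # years 1-5:   $1-5M/yr
    (6, 20e6, 50e6),    # years 6-11:  $20-50M/yr
    (5, 100e6, 100e6),  # years 12-16: ~$100M/yr (assumed flat)
]

total_years = sum(years for years, _, _ in phases)
low = sum(years * lo for years, lo, _ in phases)
high = sum(years * hi for years, _, hi in phases)

print(f"{total_years} years, ${low / 1e6:.0f}M to ${high / 1e6:.0f}M total")
```

That gives $625M to $825M over 16 years, the right order of magnitude for the quoted ~$900M total.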

Robert Freitas believes that early nanofactories necessarily will be extremely primitive. They will be very limited in the composition and complexity of products they can build and in the types of chemical elements and feedstocks they can handle. They will be fairly unreliable and will require significant supervision and maintenance. They will be relatively expensive to own and operate. Over a period of perhaps 10-20 years, nanofactory costs and capabilities will slowly improve and product costs will gradually drift downward toward the likely $1/kg regulatory floor, giving society some time to adjust to new threats as nanofactories become increasingly ubiquitous in our environment and economy.

The experimental proof of the nine molecular tools is the first funded step on a journey of 27 years. [If funding is substantially increased after the first few years and the effort is well coordinated, competing Manhattan Project-style efforts could achieve results in half the time.]

Products of More Mature Molecular Nanotechnology
MNT-built diamond products can be at least ten times stronger than steel, 100 times stronger than aluminum or plastic.
-desktop computers with a billion processors
-inexpensive, efficient solar energy systems
-medical devices able to destroy pathogens and repair tissues
-materials 100 times stronger than steel
-superior military systems
-exponential manufacturing [I build one factory, it builds two factories, they build four factories and so on and so on]
-additional molecular manufacturing systems
More from the Wikipedia entry:
-smart materials and nanosensors
-medical nanorobots and nanomedicine
-Utility fog
-Phased array optics
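
The "exponential manufacturing" item above is simple doubling; a minimal sketch:

```python
# "I build one factory, it builds two, they build four...": each factory
# makes a copy of itself every cycle, so the count doubles per generation.
def factories(generations, start=1):
    return start * 2 ** generations

print([factories(g) for g in range(5)])  # [1, 2, 4, 8, 16]
```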

Professor Philip Moriarty of the Nanoscience Group in the School of Physics at the University of Nottingham (U.K.) has been awarded a five-year £1.53M ($3.1M) grant by the U.K. Engineering and Physical Sciences Research Council (EPSRC) to perform a series of laboratory experiments designed to investigate the possibility of diamond mechanosynthesis (DMS). DMS is a proposed method for building diamond nanostructures, atom by atom, using the techniques of scanning probe microscopy under ultra-high vacuum conditions. Moriarty’s project, titled “Digital Matter? Towards Mechanised Mechanosynthesis,” was funded under the Leadership Fellowship program of EPSRC. Moriarty’s experiments begin in October 2008.

The five-year grant is described here

Computer-controlled chemistry at the single molecule level, a field very much in its infancy, represents arguably the most exciting and, to many, definitive example of the power and potential of nanotechnology. Recent ground-breaking work in Germany and the US has shown that it is possible to drive chemical reactions and to synthesise molecules via interactions driven by a scanning probe. In the UK, the nanoscience groups at Nottingham, Birmingham, and Oxford have demonstrated that atomic/molecular manipulation strategies pioneered at low temperatures can be extended to a room temperature environment. The focus of this fellowship application is to develop next-generation protocols for scanning probe manipulation capable of automated atom-by-atom assembly of, ultimately, three dimensional nanostructures. Our goal is to program the assembly of matter from its constituent atoms. This exceptionally challenging objective has the potential to revolutionize key areas of 21st century science including nanofabrication, materials processing, surface chemistry, and the study of low dimensional electron systems.

Moriarty is interested in testing the viability of positionally-controlled atom-by-atom fabrication of diamondoid materials as described in the Robert Freitas-Ralph Merkle minimal toolset theory paper. Moriarty’s efforts will be the first time that specific predictions of DFT in the area of mechanosynthesis will be rigorously tested by experiment. His work also directly addresses the requirement for “proof of principle” mechanosynthesis experiments requested in the 2006 National Nanotechnology Initiative (NNI) review, in the 2007 Battelle/Foresight nanotechnology roadmap, and by EPSRC’s Strategic Advisor for Nanotechnology, Richard Jones (Physics, Sheffield University, U.K.).

“We congratulate Philip for his tremendous success in securing funding for this pathbreaking effort,” said Freitas. “We look forward to working together closely with his experimental team as this exciting project goes forward over the next five years.”


Philip Moriarty

Nanoscience group

School of Physics at the University of Nottingham

U.K. Engineering and Physical Sciences Research Council (EPSRC)

Leadership Fellowship program

Robert Freitas' site

Institute for Molecular Manufacturing

Mechanosynthesis debate

Ralph Merkle's site

The nanofactory vision is described here

Nanofactory publications

Dimer tool paper 2003

Theoretical Analysis of Diamond Mechanosynthesis. Part I. Stability of C2 Mediated Growth of Nanocrystalline Diamond C(110) Surface 2004

Theoretical Analysis of Diamond Mechanosynthesis. Part III. Positional C2 Deposition on Diamond C(110) Surface Using Si/Ge/Sn-Based Dimer Placement Tools

High-level Ab Initio Studies of Hydrogen Abstraction from Prototype Hydrocarbon Systems

Horizontal Ge-Substituted Polymantane-Based C2 Dimer Placement Tooltip Motifs for Diamond Mechanosynthesis

Ab Initio Thermochemistry of the Hydrogenation of Hydrocarbon Radicals Using Silicon-, Germanium-, Tin-, and Lead-Substituted Methane and Isobutane

Atomically precise manufacturing roadmap

Is China's economy being underestimated by official statistics?

China's gross domestic product (GDP) totalled 13.0619 trillion yuan (1.9062 trillion U.S. dollars) in the first half of 2008, a 10.4 percent increase year on year, which is a $3.8 trillion annualized pace. Assuming another 5% increase in the second half of the year and the currency strengthening from 6.85 to 6.65 yuan per dollar, China's GDP would be just over $4 trillion at the end of 2008.
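
The projection in the paragraph above works out as follows (a sketch; the 5% second-half increase and the 6.65 exchange rate are the stated assumptions):

```python
# Full-year 2008 GDP projection from the half-year figure above.
h1_yuan = 13.0619e12        # first-half 2008 GDP in yuan
h2_yuan = h1_yuan * 1.05    # assume H2 runs 5% above H1
rate = 6.65                 # assumed yuan per USD at year end

full_year_usd = (h1_yuan + h2_yuan) / rate
print(f"${full_year_usd / 1e12:.2f} trillion")  # just over $4 trillion
```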

The case was made by JXie at Fool's Mountain that China's economic statistics are still being underestimated.

The main reasons [for believing that China's GDP is underestimated] are:

#1 China’s GDP deflator is larger and likely overstated compared to the US’. For instance, the 2Q08 China GDP deflator was at implicit 10.6%; and the 2Q08 US GDP deflator was at supposed 1.1%. The difference is breathtakingly extraordinary if you consider CNY was quite a bit stronger than USD between 3Q07 and 2Q08.

#2 China had a one-time 16.8% upward GDP revision in 2005, mostly readjusted for its understated service economy. Was the revision a one-time event, or is it likely to be repeated down the road? In 2007, the service economy of China was 39% of the total economy. For instance, Egypt, which has a roughly 30% lower per capita GDP, has its service economy at 54% of the total economy. Is China's service economy less developed than Egypt's, or is it simply understated by the Chinese statisticians, requiring further upward revisions down the road? I tend to believe it's the latter. For anyone who has traveled to Egypt, judged by the available restaurants, shopping malls, and the number of domestic leisure travelers, it's very hard to fathom that China's service economy isn't far more developed than Egypt's.

How fast can CNY rise?
Black Swan: the yuan replaces the US dollar as the world reserve and trade currency.
Assume the Chinese yuan followed the path of the Brazilian real between 2003 and 2008. Brazil, benefiting from rising commodity prices and sounder monetary and fiscal policies, saw its currency go from 48% undervalued against the USD on the Big Mac Index to 34% overvalued. The Big Mac fair value for the yuan is 3.5 yuan to one US dollar; 34% overvalued would be about 2.6 yuan to one US dollar. (The euro also swung from undervalued to overvalued over a 7-8 year timeframe.) If the yuan made a move to that level, China's GDP would overtake the US economy in 2013.
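
The Big Mac arithmetic above, sketched out (applying the Brazilian real's 2008 overvaluation figure to the yuan, as the scenario proposes):

```python
# A currency overvalued by 34% buys 34% more than purchasing-power parity
# implies, so the market rate is fair_value / (1 + overvaluation).
fair_value = 3.5       # yuan per USD at Big Mac parity
overvaluation = 0.34   # the Brazilian real's 2008 Big Mac position

implied_rate = fair_value / (1 + overvaluation)
print(f"{implied_rate:.1f} yuan per USD")  # about 2.6
```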

This site's analysis has been that China would overtake the USA in 2016. The GDP deflator and service-economy evidence of an underestimated economy are interesting observations and could suggest another 12-20% of the Chinese economy is underreported. However, a move to that strong a currency seems unlikely; China's economic leaders would try to fight it. It also assumes US and probably European currency weakness for another 5-8 years. If the economy were underestimated by 12-20%, then a 2015 passing of the US economy could be possible.

Carnival of Space Week 66

August 08, 2008

Memjet printers officially delayed until 2009

Memjet technology, which promises radically faster printers at 360 pages per minute, has been delayed until 2009.

Silverbrook's technology (which will be commercialized under the business name Memjet) was supposed to be released in early 2008, according to what company executives told me then. Now, a company spokeswoman says that the "A4/letter printhead and related components" will be shipped to OEMs by the end of this year, with products slated for sometime in 2009. This is consistent with "early timetables," according to the spokeswoman.

Memjet isn't going to manufacture the printers itself. Instead, it's going to sell the components to OEMs, who can put their own stamp on the technology.

Delays associated with new technology are nothing new. Still, in 2007, Memjet officials promised: a photo printer, which the company hoped to sell for less than $150 by the end of the year or early 2008; the 8.5-inch x 11-inch (A4) color inkjet, due to arrive at the end of 2008 for under $200; a label printer; and a large-format photo printer, expected to cost about $5,000, and capable of printing poster-sized prints at rapid speed.

The Memjet technology uses a series of individual MEMS-based inkjet nozzles, fabricated using conventional semiconductor manufacturing techniques. Each chip measures 20 millimeters across and contains 6,400 nozzles, with five color channels, the company said. A separate driver chip calculates 900 million picoliter-sized drops per second. For a standard A4 letter printer, the result is a total of 70,400 nozzles.
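
The quoted figures are internally consistent; a quick check (the 11-chip, 220 mm printhead width is inferred from the numbers, not stated by the company):

```python
# Chips needed for a page-wide A4 printhead from the quoted figures.
nozzles_per_chip = 6400
total_nozzles = 70400
chip_width_mm = 20

chips = total_nozzles // nozzles_per_chip
print(chips, "chips,", chips * chip_width_mm, "mm of printhead")
```

Eleven 20 mm chips give 220 mm of nozzles, just enough to span a 210 mm A4 page width.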

This site covered the Memjet printer last year on several occasions.

Memjet printers were one of the developments to watch for 2008.

More realistic 6 dimensional pictures and video

By producing "6-D" images, an MIT professor and colleagues are creating unusually realistic pictures that not only have a full three-dimensional appearance, but also respond to their environment, producing natural shadows and highlights depending on the direction and intensity of the illumination around them.

The process can also be used to create images that change over time as the illumination changes, resulting in animated pictures that move just from changes in the sun's position, with no electronics or active control.

The basic concept is similar to those inexpensive 3-D displays sometimes used on postcards and novelty items, that use an overlay of plastic that contains a series of parallel linear lenses that create a visible set of vertical lines over the image. (It is a different approach from that used to create holograms, which require laser light to create.) In addition to three-dimensional images, these are sometimes used to present a series of images that change as you view them from different angles from side to side. This can simulate simple motion, such as a car moving along a road.

By using an array of tiny square lenses instead of the linear ones, such displays can also be made to change as you change the viewing angle up or down - making a "4-D" image. This reveals different views with horizontal as well as vertical movement of the viewer. The new "lighting aware" system adds additional layers of lenses and screens to add two more dimensions of change. The image that is seen is then not only based on the position of the viewer, but also on the direction of the illumination.

The new system, still a relatively low-resolution laboratory proof-of-concept, could have applications including pictures used for training purposes, says MIT's Ramesh Raskar. In training someone to carry out industrial inspections, for example, an image of the device to be inspected would respond just like a real object when the inspector shines lights on it from different angles.

Because the system is being built by hand from custom-made parts, Raskar says, the present version costs about $30 per pixel to make. Since it takes thousands of pixels to create a recognizable image, practical devices at an affordable price will require significant further development. "It will be at least 10 years before we have any realistic practical-sized displays," he estimates.

MIT transportation plan

MIT has a transportation plan from now until 2035

For the near term (up to 15 years), we should increase our efforts to improve light-duty vehicle engines and transmissions, but all improvements must go toward increasing fuel efficiency rather than making cars bigger and faster. Also critical is reducing vehicle weight and size.

For the mid- and long-term (15-30 years, and more than 30 years), we should ramp up work on radically different technologies such as plug-in hybrids and hydrogen fuel-cell vehicles.

We must also develop and market more environmentally benign fuels based on nonpetroleum sources. In general, the use of biofuels will grow but not as fast as expected just a few years ago.

The final key is policy action. A coordinated set of regulatory and fiscal measures will be needed to push and pull improved technologies and greener alternative fuels into the market place in high volume. Measures should require auto manufacturers to make smaller, more-efficient cars, encourage consumers to choose those vehicles, and discourage everyone from driving so much.

The full MIT transportation report is here

Over the next 25 years, the fuel consumption of new vehicles could be reduced by 30-50 percent, and total U.S. fuel use for vehicles could be cut to year-2000 levels, with greenhouse gas emissions cut by almost as much.

What do you want to hear more about or see implemented?

Which solutions do you care about most?
Reducing CO2 from energy and transportation sources
Storing CO2 in cement or sequestering in the ground or ocean
Prevention of nuclear weapon usage
Reducing damage from nuclear and other weapons
Curing diseases (cancer, heart disease, Alzheimers, etc)
Effective anti-aging treatments
Poverty reduction
Starvation prevention
Peak oil solutions
Stronger economy (triple economic growth or more)
Improve humans (Safe singularity)
Other (use comments)

August 07, 2008

Terrafugia Transition has a working roadable plane

The Transition "Personal Air Vehicle" is expected to be released in late 2009, and its operational prototype has just been shown.

The estimated purchase price is $148,000. Owners will drive the car from their garage to an airport where they will then be able to fly within a range of 100 to 500 miles (800 km). It will carry two people plus luggage and will operate on a single tank of premium unleaded gas. It will have a 115 mph cruising speed.

With the Transition® development program still on schedule, Terrafugia's order book continued to grow at the show. The Transition® Proof of Concept begins its powered testing program, including drive, taxi, and flight testing, after returning to Terrafugia's Prototype Development Facility in Woburn, MA.

They responded to many doubters in May 2008.

The real market for these vehicles is solidly in the hundreds of units per year. So they are not replacing cars but light sport aircraft. They are toys for the wealthy. This is an airplane first, and not a replacement for anybody's car.

There is a market for general aviation, so the question is what can we do to make it better, to make it safer. And I believe we're doing a lot to make it safer.

They are not competing with cars yet.

They are selling a vehicle that uses super-unleaded automobile gas, and that will get about 27.5 miles per gallon flying at 115 miles per hour, which is better mileage than most cars get on the highway right now, and at nearly twice the speed. So from a fuel-economy perspective, it's actually one of the greenest planes out there. And the Transition is such a light vehicle that the mileage should be quite good on the road. We are expecting between 30 and 40 miles per gallon.

Potential Competitors are still flying models

They are flying big models, appear to be well funded, and expect to be selling in 2010. They will be selling pricey $3 million vehicles that are more maneuverable than helicopters and able to go where helicopters cannot for special jobs like building evacuations and medical evacuations. The X-Hawk can fit and fly in places too tight for helicopters, since it has no exposed rotors of the kind that make it dangerous or impossible for helicopters to maneuver in complex urban and natural environments.

X-hawk would have
* Max speed: 155 mph (248 kph)
* Max altitude: 12,000 ft
* Endurance: 2 hrs of flight time

Urban Aeronautics' X-Hawk is a VTOL aircraft which operates much like a tandem-rotor helicopter; however, it doesn't have the exposed rotors which make helicopters dangerous for personal use. This is accomplished by containing the rotors in large 'ducts' which make up most of the body of the craft; the requisite decrease in rotor size also decreases fuel efficiency. The X-Hawk is being developed by Urban Aeronautics and is being promoted for rescue and utility functions. It is expected to be available for about $3 million around 2010.

The Panda model is 1.5 meters in size.

PAL-V Europe BV: the PAL-V ONE is a hybrid of a gyrocopter and a motorcycle.
They must still raise more money; this is a concept vehicle, and they are flying models. The motorcycle converts into a plane, and it does not need to file a flight plan (it flies below 4,000 feet).
Short takeoff and landing (STOL): 165 feet to take off, 16 feet to land.

It has 3 wheels and a top speed of 200 km/h (124 mph) on land and in the air. It can run on petrol, biodiesel or bio-ethanol and will cost US$75,000. The vehicle has very short takeoff and vertical landing capability. At less than 70 decibels it is quieter than a helicopter, due to the slower rotation of the main rotor. The PAL-V ONE has one seat.

Airborne, the PAL-V ONE flies under the 4,000-feet (1,500 m) floor of commercial air space. The PAL-V ONE is highly fuel-efficient and powered by an environmentally certified car engine. It runs on petrol like a conventional car and can reach speeds of up to 200 km/h both on land and in the air. It can be driven to the nearest airfield or helipad and, because it flies below 4,000 feet, can take off without filing a flight plan. Fuel efficiency is about 30 km/liter.

Gress Aerospace is flying models but has an interesting concept.

The dual-fan VTOL can fit within the DOT maximum 8.5-foot width for ground vehicles. It does not transform; the vehicle is simply small, sized for one person. A single-seat helicopter is 20 feet wide. The Gressaero plane would have twice the speed and range of a helicopter.

A UAV that could lead to robotic human VTOL flight
The Northrop Grumman MQ-8B vertical takeoff and landing UAV will get deployed in low volume.

The U.S. Department of Defense awarded Northrop Grumman Corporation $13.6 million for the procurement of long-lead items in support of the low-rate initial production of the MQ-8B Fire Scout Vertical Takeoff and Landing Tactical Unmanned Aerial Vehicle (VTUAV). Nine should be purchased and operating in 2009.

The MQ-8B features a four-blade main rotor, in contrast to the larger-diameter three-blade rotor of the RQ-8A, to reduce noise and improve lift capacity and performance. The four-blade rotor had already been evaluated on Fire Scout prototypes. It boosts gross takeoff weight by 500 pounds to 3,150 pounds (by 225 kg to 1,430 kg), with payloads of up to 700 pounds (320 kg) for short-range missions.

The Transition is being designed to be a factory certified Light-Sport Airplane.
Two seats, side-by-side.
GTOW: 1320 lbs (600 kg)
Fuel: Super-unleaded autogas
Fuel Capacity: 20 gal (120 lbs / 54 kg)
Fuel Consumption (75% power): 4.5 gph
Engine: 100 hp Rotax 912 S (four-stroke)
Vs = 45 kts (51 mph, 83 km/hr)
Vr = 70 kts (80 mph, 130 km/hr)
Cruise Speed: 100 kts (115 mph, 185 km/hr)
Range: 400 nm (460 mi, 740 km)
Takeoff Distance over 50 ft obstacle: 1,700 ft (520 m)
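
A couple of cross-checks on the spec sheet above (a sketch; the published 27.5 mpg and 400 nm range figures presumably reflect different power settings and fuel reserves, so the simple cruise numbers come out slightly different):

```python
# Cruise fuel economy and zero-reserve still-air range implied by the specs.
cruise_mph = 115
burn_gph = 4.5   # fuel consumption at 75% power
fuel_gal = 20

mpg = cruise_mph / burn_gph                   # ~25.6 mpg in cruise
range_mi = fuel_gal / burn_gph * cruise_mph   # ~511 mi with no reserve
print(f"{mpg:.1f} mpg, {range_mi:.0f} mi")
```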

A projected rollout of electric planes

Electric planes do not have to be private (although things will start out that way.)
Electric planes (with jump-jet-type takeoff and landing) could form a virtual, callable personal-pod transportation system (such systems have been proposed for cities) but without building the rails. One of the personal-pod transportation proposals is shown here.

Current electric planes hold one or two people.

The airplane efficiency numbers (with no need for roads) are already competitive with trains. Electric planes are ten to twenty times more fuel-efficient than current small planes.

The 438 mpg equivalent electric planes were previously discussed on this site.

However, the electric plane rollout will likely begin with business-jet-style vehicles and an expansion of the current small-plane owner class, flying above car traffic and not just for recreation or between cities. The USA currently has about 250,000 planes. This will be something else for people to complain that the rich and affluent have and they do not. Perhaps 1 to 2 million personal electric planes by 2020 is possible, but business as usual without robotic controls could see the numbers at the 50,000-100,000 level in the USA and the 100,000-500,000 level worldwide by 2020. Electric planes are already in the $40,000 to $150,000 price range (the range of upper-end cars). High-volume production could reduce those prices to the $20,000 to $75,000 level, and truly visionary adoption could create the volume to get below $20,000. An optimistic projection, in which the supply chain for hybrid cars is leveraged, battery and robotic control technologies mature, and Mundane Singularity-type production takes over, would have up to 5 to 20 million personal electric planes by 2020. There is definitely the potential for the widespread vision starting in the 2020-2030 time frame.

The limited-use vision, instead of public systems that reduce commutes for everyone, would be one where we hear:

"Look at them flying above our gridlock in vehicles we were mocking when they were proposed as an everyman system. But now that only the rich have it, I want to bitch and complain that I should get it too. My lack of vision will lead me to complain about it after I see them flying overhead while I am stuck in ground traffic."

The 2008 CAFE Foundation Electric Aircraft Symposium was held to work on the technical issues of personal aviation.

The NASA vision of personal aviation (PAV): Near all-weather STOL PAVs will be able to transport people to within just a few miles of their doorstep destination at trip speeds three to four times faster than airlines or cars. NASA predicts that up to 45% of all miles traveled in the future may be in PAVs. This will relieve congestion at metropolitan hub airports and the freeways that surround them, reduce the need to build new highways and save much of the 6.8 billion gallons of fuel wasted in surface gridlock each year.

The average doorstep-to-doorstep trip speed is just 35 mph for automobiles and just 55 mph for airliners on trips under 250 miles. Recent delays caused by anti-terrorism security inspections reduce this speed even further. Traffic jams in the U.S. cost $78 billion in 2004, wasting 6.7 billion gallons of fuel. These figures, and the stress and pollution they entail, worsen each year. Building new freeway lanes or light railway lines costs about $20 million per mile and does not solve the fundamental problem.

98% of the U.S. population lives within 20 miles of at least one public use airport and yet 95% of commercial air traffic uses only 30 of our nation’s 5,000 airports. A study on airspace capacity contracted by NASA shows that our skies can accommodate at least 700 times more aircraft than are flying today.

-20,000 large jets: 14 million would be 700 times more
-200,000 small planes: 140 million would be 700 times more
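
The capacity arithmetic, spelled out:

```python
# 700x the current fleet, per the NASA-contracted airspace capacity study.
capacity_multiple = 700
fleets = {"large jets": 20_000, "small planes": 200_000}

expanded = {name: count * capacity_multiple for name, count in fleets.items()}
print(expanded)  # {'large jets': 14000000, 'small planes': 140000000}
```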

FAA and the Joint Planning and Development Office are already planning FAA’s Next Generation Air Transportation System (NGATS) to be an automated system that provides each aircraft its own traffic-free, computer-coordinated “Highway In The Sky.”

- Short runway use: walk to grandma's from small residential airfields

There are dozens of research papers that address the various issues, plus the recent conference papers.

The efficiency of public transportation has to factor in the number of people being moved. The Brad Templeton case is that with low usage (say, 4 people on a bus), the public transportation system is less efficient than a car.

More restrictions can be applied to air traffic corridors and loosened only as the air traffic technology safely permits.

An intermediate step is using the 3,400 small airports in the U.S. alone, so mostly airport-to-airport rather than building-to-building air traffic.

The SATS Project (2001-2006), conducted by NASA and partners in the National Consortium for Aviation Mobility (NCAM) proved the viability of technical capabilities in the following four areas:

* High-volume operations at airports without control towers or terminal radar facilities
* Technologies enabling safe landings at more airports in almost all weather conditions
* Integration of SATS aircraft into a higher capacity air traffic control system, with complex flows and slower aircraft
* Improved single-pilot ability to function competently in evolving, complex national airspace

Navigation related
Virtual skies navigation concept

Air transportation models help planning as volume increases

Synthetic vision for all weather flight
This site believes that the interface should be one where automated robotic flying is used for personal aviation. However, up to the level of 1 million to 3 million electric planes in the USA, it will probably be advanced flight-assist systems, with people getting sport-plane or private pilot licenses to fly the planes.

Technology News roundup: Vasimr rocket, $12 PC, $10 microscope, better biomass

This technology news roundup has imminent testing of the Vasimr plasma rocket in space, twelve-dollar personal computers, ten-dollar dime-sized microscopes, and a plant that is 250% better than corn for biofuels and twice as productive as switchgrass. (The plant has not been modified yet, and genetic modifications could vastly increase yields.)

NASA will soon be testing the Vasimr plasma rocket on the International Space Station. The 200 kW Vasimr was covered here in April 2008.

A group of people attending the MIT International Development Design Summit are working to make a $12 computer based on the Apple II, including U.S. graduate student Derek Lomas, Anuj Nanavati of India, and MIT graduate Jesse Austin-Breneman.

Imagine a microscope implanted into your body that could automatically sort out cancerous cells based on how they looked. That's the long-term promise of a lensless microscope that Caltech researchers describe this week in the journal Proceedings of the National Academy of Sciences.

Exploiting technology commonly used in consumer digital cameras, the M&M-size microscope is able to provide resolution comparable to an optical microscope at a mere fraction of the cost, perhaps as cheaply as $10 per unit.

"Microscopy is undergoing a great revolution now because of modern optics and spectroscopy," Feld said. "There are many exciting new approaches and this is one of them."

But Yang's tiny, cheap microscope could have nearly immediate applications. In the very short-term, Yang envisions a system for identifying diseases in the Third World that could cost a mere $100 and come embedded inside a cellphone or custom device for field work. "Because we can build [the microscope] very compactly, we can imagine building an entire system that is the size of an iPod," he said.

All of these applications could come into being very soon. Yang's lab is currently negotiating with semiconductor companies to mass produce his devices. Right now, it takes two days for one of his grad students to assemble one.

Once they enter manufacturing, however, they'll be able to make hundreds of the devices, and that's when high-throughput optical microscopy could become a reality. Working with image-processing software designers, they're hoping to come up with autonomous systems for finding and imaging cells.

Researchers have determined that an acre of the giant perennial grass Miscanthus x giganteus makes 2 1/2 times the amount of ethanol we can produce per acre of corn.

“One of the criticisms of using any biomass as a biofuel source is it has been claimed that plants are not very efficient – about 0.1 percent efficiency of conversion of sunlight into biomass,” Long said. “What we show here is on average Miscanthus is in fact about 1 percent efficient, so about 1 percent of sunlight ends up as biomass.”

“Keep in mind that this Miscanthus is completely unimproved, so if we were to do the sorts of things that we’ve managed to do with corn, where we’ve increased its yield threefold over the last 50 years, then it’s not unreal to think that we could use even less than 10 percent of the available agricultural land,” Long said. “And if you can actually grow it on non-cropland that would be even better.”

“Our highest productivity is actually occurring in the south, on the poorest soils in the state,” he said. “So that also shows us that this type of crop may be very good for marginal land or land that is not even being used for crop production.”

Because Miscanthus is a perennial grass, it also accumulates much more carbon in the soil than an annual crop such as corn or soybeans, Long said.

“In the context of global change, that’s important because it means that by producing a biofuel on that land you’re taking carbon out of the atmosphere and putting it into the soil.”

“One reason why Miscanthus yields more biomass than corn is that it produces green leaves about six weeks earlier in the growing season,” Long said. Miscanthus also stays green until late October in Illinois, while corn leaves wither at the end of August, he said.

Using corn or switchgrass to produce enough ethanol to offset 20 percent of gasoline use – a current White House goal – would take 25 percent of current U.S. cropland out of food production, the researchers report. Getting the same amount of ethanol from Miscanthus would require only 9.3 percent of current agricultural acreage.
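
The land-requirement arithmetic in that last paragraph checks out (a sketch; 10% vs. the reported 9.3% reflects rounding of the 2.5x yield figure):

```python
# Cropland needed to offset 20% of gasoline: corn vs. Miscanthus.
corn_share = 0.25       # fraction of current US cropland needed with corn
yield_multiple = 2.5    # Miscanthus ethanol per acre relative to corn

miscanthus_share = corn_share / yield_multiple
print(f"{miscanthus_share:.0%} of cropland with Miscanthus")  # 10% ...
```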

August 06, 2008

Cheap, recyclable ultrastrong magnets that will enable smaller, more powerful engines

Ultra-strong, high-temperature, high-performance permanent magnet compounds, such as Samarium Cobalt, are the mainstay materials for several industries that rely on high-performance motor and power generation applications, including the Department of Defense (DOD) and the automotive industry. Until now, producing Samarium Cobalt has been a difficult and expensive multi-step process. Northeastern University researchers have broken new ground with an innovative invention of a rapid, high-volume and cost-effective one-step method for producing pure Samarium Cobalt rare earth permanent magnet materials.

To create a field with the strength of 100 mT (1,000 G) at a 1 mm distance from the pole, a barium ferrite magnet must be around 25 times larger than a samarium-cobalt magnet.

Smaller and more powerful motors will make wheel motors more practical. Wheel motors can reduce losses in a car by 30-40% by eliminating transmission-to-wheel power losses.

Samarium-cobalt (SmCo) magnets are produced by pressing powdered alloys to shape and then sintering them in a furnace. This powder can also be mixed with polymer binders to form bonded magnets.

SmCo magnets exhibit excellent thermal qualities, with several grades designed specifically for use up to 570°F. Among high-energy materials, SmCo offers the best resistance to temperature. Until this development, sintered samarium-cobalt has commonly been used in stepper motors for robotics and aerospace, as well as in motors for magnetic pumps and couplings, but high costs confined it to small or thermally demanding applications. This low-cost breakthrough will enable widespread use. Engines and generators can be made smaller, lighter, more efficient and more reliable, and compact, high-power motors without field coils will become common.

Electron Energy Corporation already makes and sells samarium-cobalt magnets, some with fields over 1 Tesla and energy products of 30 MGOe. Samarium-cobalt (SmCo) can achieve a maximum energy product of about 225 kJ/m³. Samarium-cobalt magnets were used in NASA's Deep Space 1 probe, which used an ion engine.

The 2006 Progress Report for Automotive Propulsion Materials Program for the Freedomcar project explained the benefits of a strong permanent magnet

The FreedomCAR project looked at superconducting magnets, which are stronger than samarium-cobalt magnets but need cooling.

Permanent magnets are used in the traction motors of hybrid electric vehicles because of their superior magnetic properties (energy product) compared with other permanent magnets. Higher-strength magnets are desired because they would enable manufacturers to reduce the size, weight, and volume of the traction motor and thus increase the fuel efficiency of the vehicle.

A major component of the HEV is the electrical machine (traction motor) used to drive the wheels. The traction motor employs a number of permanent magnets (PMs). Energy product is directly proportional to the energy stored per unit volume of the magnet; the torque produced by a PM electric motor is approximately proportional to the energy product of the PM. Increasing the energy product of the PM will proportionally increase the torque. Therefore, increasing the energy product will reduce the weight and size of the PM required to generate the same torque. Furthermore, reducing the weight and size of the PM may reduce the size of the entire motor required to generate the same torque. This will further reduce the overall weight of the motor and increase the mileage of the HEV.

Typical performance requirements for linear drive motors are (BH)max = 40 MGOe (320 kJ/m3) and Hc = 2 Tesla (1.6 MA/m). It is the objective of this study to increase the energy product by using stronger magnetic alignment fields generated by the SCM, while maintaining the same coercive field (raising the operating point vertically in Figure 2, which plots energy product vs. coercive field for various applications). So far, NdFeB magnets show the highest values of remanence Br and energy product (BH)max, and samarium-cobalt magnets exhibit the highest coercive fields, Hc.
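Since torque is roughly proportional to the magnet's energy product (as the FreedomCAR excerpt explains), the magnet volume needed for a fixed torque scales inversely with (BH)max. A rough sketch, using the article's SmCo and 40 MGOe figures plus an assumed typical barium-ferrite value:

```python
# Sketch of the torque/energy-product relation described above: for a fixed
# torque requirement, required magnet volume scales roughly as 1/(BH)max.
# The SmCo and 40 MGOe (320 kJ/m3) figures are from the text; the ferrite
# value is a typical catalog number, assumed here for illustration.

MGOe_TO_KJ_M3 = 7.958  # 1 MGOe is about 7.96 kJ/m^3

bh_max = {  # kJ/m^3
    "barium ferrite": 30,              # assumed typical value
    "SmCo": 225,                       # from the article
    "NdFeB": 40 * MGOe_TO_KJ_M3,       # 40 MGOe, about 318 kJ/m^3
}

reference = bh_max["barium ferrite"]
for material, bh in bh_max.items():
    rel_volume = reference / bh        # volume relative to a ferrite magnet
    print(f"{material:>14}: (BH)max = {bh:6.1f} kJ/m^3, "
          f"relative volume = {rel_volume:.2f}")
```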

The direct chemical synthesis process is able to produce Samarium Cobalt rapidly and in large amounts, at a small fraction of the cost of the current industry method.

Samarium Cobalt magnets are superior to other classes of permanent magnetic materials for advanced high-temperature applications and the Northeastern invention goes beyond the currently known fabrication process of these nanostructured magnets. Unlike the traditional multi-step metallurgical techniques that provide limited control of the size and shape of the final magnetic particles, the Northeastern scientists’ one-step method produces air-stable “nanoblades” (elongated nanoparticles shaped like blades) that allow for a more efficient assembly that may ultimately result in smaller and lighter magnets without sacrificing performance.

This revolutionary invention is anticipated not only to revitalize the permanent magnet industry but also to bring major changes to several federal and commercial industries, with the potential to impact the size, weight, and performance of aircraft, ships, and land-based vehicles, as well as to contribute to more efficient computer technologies and emerging biomedical applications.

“This work represents the most promising advance in rare earth permanent magnet processing in many years,” said Laura Henderson Lewis, Professor of Chemical Engineering and Chair of the Department of Chemical Engineering at Northeastern University and a collaborator on this project. “I expect it to revitalize international interest in the development of this important class of engineering materials.”

A turbogenerator study which used permanent magnets to make parts of a generator smaller and more efficient.

Cheap, strong permanent magnets can help make more powerful in-wheel motors.

Different kinds of permanent magnet motors are analyzed and compared in this 123-page Oak Ridge National Laboratory study for the FreedomCAR project.

August 05, 2008

China has new 70GW nuclear power target for 2020

China has declared a new target: 5% of electricity from nuclear power by 2020. This will be 70 GW, about 500 TWh. The 2007 target was 60 GW, and the target before that was 40 GW.
The target for 2030 is 16% of power (about 1400 TWh, 200+ GW).

A projected total generation of 8472 TWh and an installed capacity of 1775 GW by 2030 means that China will equal the current levels of production and capacity in the USA and the European Union combined.

China is expected to have 311 GW of hydropower in place by 2020, meeting the government target, and 380 GW in 2030. Hydropower is expected to rise to over 1000 TWh in 2030, but its share of total power output will fall from 16 per cent to 12 per cent. The target for wind power is expected to be exceeded, with wind power reaching 42 GW in 2020 and 79 GW in 2030.

China has made considerable progress in the implementation of state-of-the-art coal fired generation technologies, by building larger, more efficient power plants. China added 18 GW of supercritical plant in 2006, bringing total supercritical capacity to about 30 GW. There are about 100 GW of supercritical plant on order, implying that the share of supercritical technology in new capacity will increase significantly over the next few years.

After 2015, new coal power stations will probably be as efficient as those built in the OECD. The average gross efficiency increases from 32 per cent in 2005 to 39 per cent in 2030, bringing it much closer to the OECD average of 42 per cent by 2030. Furthermore there will likely be a greater implementation of cleaner technologies such as supercritical, ultra-supercritical and integrated gasification combined-cycle plants.

Prediction: Dark Knight will top Titanic United States Box Office

This is a small prediction on a trivial topic: Dark Knight will earn more than the non-inflation-adjusted total of Titanic.

This Monday's numbers were 45% over the Dead Man's Chest numbers, which could be an indication of longer legs than Dead Man's Chest: an extra $4 million this week if the pattern holds throughout the week. An informal poll of people I know indicates more willingness to watch Dark Knight a second time than Dead Man's Chest, especially on Imax (Imax showings were sold out several days in advance early on).

If the 45% weekday edge over Dead Man's Chest is all there is, that would put DK at about $560M with a slightly longer stay in theaters, out to week 15. I would say $500M on the 38th to 45th day of release. There would need to be another wave of repeat viewings at about weeks 5-10, carrying through an even longer 20-week run, to reach $600M. Getting to $580-590M would be enough for a "take out Titanic" marketing push aimed at rabid fans. I believe DK will be showing its longer legs this week and the next four. The non-inflation-adjusted Titanic total will be (barely) taken out. DK will be well short of the inflation-adjusted total of $900M, and will also not be near Titanic's worldwide total.

What would it take ? New Nuclear electricity at less than two cents per kwh

New nuclear plants constructed in the USA can provide electricity at about 5-9 cents/kwh.

Because of the large capital costs for nuclear power, and the relatively long construction period before revenue is returned, servicing the capital costs of a nuclear power plant is the most important factor determining the economic competitiveness of nuclear energy. The investment can contribute about 70% of costs of electricity, according to one 2005 OECD/NEA study (which assumed a 10% discount rate). The discount rate chosen to cost a nuclear power plant's capital over its lifetime is arguably the most sensitive parameter to overall costs.

Industry consensus is that a 5% discount rate is appropriate for plants operating in a regulated utility environment where revenues are guaranteed by captive markets, and a 10% discount rate for a competitive deregulated or merchant plant environment. However, the independent MIT study, which used a more sophisticated finance model distinguishing equity and debt capital, had a higher 11.5% average discount rate.

Construction delays can add significantly to the cost of a plant. Because a power plant does not yield profits during construction, longer construction times translate directly into higher interest charges on borrowed construction funds. Modern nuclear power plants are planned for construction in four years or less (42 months for CANDU ACR-1000, 60 months from order to operation for an AP1000, 48 months from first concrete to operation for an EPR and 45 months for an ESBWR) as opposed to over a decade for some previous plants.

The first AP1000 has had some delays, but we will see how much delay drops with the later units, in particular the 40-100 AP1000s China is ordering. China has put together what amounts to a production line for further AP1000 builds: a 71,000 square meter (about 764,000 square feet) factory specifically designed for nuclear power plant component modules. Shandong Nuclear Power Construction Group built the facility, which has the capacity to support the construction of two AP1000s each year, in just 11 months. [They will need four more factories to build ten AP1000s each year, and nine more to build twenty per year.]

Shandong said the new 71,000 square metre factory includes a cutting workshop, a pipeline workshop, a paint shop and a workshop for containment vessels (the steel liners that lie within the overall reinforced concrete reactor containment).

Large components for the Haiyang units have already been contracted: Doosan Heavy Industries of Korea is making the reactor pressure vessels and steam generators, while Mitsubishi Heavy Industries of Japan and Harbin Boiler Works of China will supply the steam turbines. For Westinghouse's other pair of AP1000s at Sanmen the steam generators and reactor pressure vessels will be made in China by either Harbin, First Heavy Machinery Works or Shanghai Electric.

First concrete at Haiyang - the official start of construction - is expected in September 2009, with commissioning of the first unit about 36 months later.

Longer construction times add higher interest charges (about 20-40% to the cost over 4-5 years). Bringing construction time down to 2 years for China's HTR would help lower costs. The China HTR is starting at 6 cents per kwh, but it needs volume scaling to lower costs to match large-plant scaling.

$ per Megawatt-hour (10 $/MWh = 1 cent/kWh):

                                                Old Nuclear   Coal   New Nuclear est
1 Fuel                                              5.0       11.0        5.0
2 Operating & Maintenance - Labor & Materials       6.0        5.0        8.0
3 Pensions, Insurance, Taxes                        1.0        1.0        1.0
4 Regulatory Fees                                   1.0        0.1        1.0
5 Property Taxes                                    2.0        2.0        2.0
6 Capital                                           9.0        9.0       39.0
7 Decommissioning & DOE waste costs                 5.0        0.0        5.0
8 Administrative / overheads                        1.0        1.0        1.0
Total                                              30.0       29.1       60.0
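A quick check of the table's arithmetic and the $/MWh-to-cents/kWh conversion (note that the New Nuclear line items actually sum to 62, slightly above the stated total of 60):

```python
# Recomputing the cost table's totals and converting $/MWh to cents/kWh
# (10 $/MWh = 1 cent/kWh, as the table header notes). Line items are the
# table's own numbers; the New Nuclear items sum to 62, not the stated 60.

costs = {  # $/MWh, rows 1-8 from the table above
    "Old Nuclear": [5.0, 6.0, 1.0, 1.0, 2.0, 9.0, 5.0, 1.0],
    "Coal":        [11.0, 5.0, 1.0, 0.1, 2.0, 9.0, 0.0, 1.0],
    "New Nuclear": [5.0, 8.0, 1.0, 1.0, 2.0, 39.0, 5.0, 1.0],
}

for name, items in costs.items():
    total = sum(items)
    print(f"{name:>12}: {total:5.1f} $/MWh = {total / 10:.2f} cents/kWh")
```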

Pay off the capital costs (after, say, 30-40 years) and it goes down to 2 cents per kwh for the remaining 20-30+ years on old nuclear. New nuclear has inflated capital costs because of currently higher commodity prices and other inflated costs.

Waste costs can be reduced if we have new deep burn capable reactors to burn it.

Nuclear fuel costs

Uranium: 8.9 kg U3O8 x $53 = $472
Conversion: 7.5 kg U x $12 = $90
Enrichment: 7.3 SWU x $135 = $985
Fuel fabrication: per kg = $240
Total, approx: US$ 1787

At 45,000 MWd/t burn-up this gives 360,000 kWh electrical per kg, hence fuel cost: 0.50 c/kWh.
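The 0.5 c/kWh figure can be reproduced from the line items above; the ~33% thermal-to-electric efficiency is an assumed typical LWR value, not stated in the source:

```python
# Reproducing the fuel-cost arithmetic above. 45,000 MWd/t of thermal burn-up
# at an assumed ~33% thermal-to-electric efficiency gives roughly 360,000
# kWh(e) per kg of fuel.

fuel_cost_per_kg = 472 + 90 + 985 + 240  # uranium + conversion + enrichment + fab, US$

burnup_mwd_per_t = 45_000
efficiency = 0.33                        # assumed typical LWR efficiency
# 1 MWd = 24,000 kWh(th); dividing by 1,000 converts per-tonne to per-kg,
# so MWd/t * 24 = kWh(th)/kg:
kwh_thermal_per_kg = burnup_mwd_per_t * 24
kwh_electric_per_kg = kwh_thermal_per_kg * efficiency

cents_per_kwh = fuel_cost_per_kg / kwh_electric_per_kg * 100
print(f"{kwh_electric_per_kg:,.0f} kWh(e)/kg, fuel cost {cents_per_kwh:.2f} c/kWh")
```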

If burn-up (MWd per ton) is doubled, then only half the volume of fuel is needed. This can be achieved with new annular fuel and slightly higher enrichment levels: 8% instead of 5%.

Enrichment from 2012 onwards will be 3-20 times cheaper with laser enrichment.

Half the fuel at three-times-cheaper enrichment is:

Uranium $250
Conversion $45
Enrichment $165
Fuel fab $240 (halved for less fuel but doubled for more complex fabrication)
Total approx $750

0.2 c/kwh instead of 0.5 c/kwh for the fuel.
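A sketch of the baseline-versus-reduced comparison, reading the reduced line items as already scaled for half the fuel mass per kWh (the itemized reduced costs sum to $700, near the article's "approx $750"):

```python
# Baseline vs reduced fuel-cost scenario from the article. Interpretation
# (an assumption): the reduced line items are already scaled for half the
# fuel mass per kWh, so the same kWh/kg-equivalent denominator applies.

baseline = {"uranium": 472, "conversion": 90, "enrichment": 985, "fabrication": 240}
reduced  = {"uranium": 250, "conversion": 45, "enrichment": 165, "fabrication": 240}

kwh_per_kg = 360_000  # kWh(e) per kg at 45,000 MWd/t, from the article

for name, costs in (("baseline", baseline), ("reduced", reduced)):
    total = sum(costs.values())
    cents = total / kwh_per_kg * 100
    print(f"{name}: ${total}/kg -> {cents:.2f} c/kWh")
```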

The World Nuclear Association link also factors in building many more units to get learning-curve cost reductions, and the effect of a 60-year life instead of 40 (the 50% longer life reduces kwh costs by 1 cent when combined with 5-year instead of 7-year construction).

More units allow first of kind costs to be amortized.
$1200/kw capital costs, 5 year construction, 60 year life is 3.4 cents/kwh

$1000/kw capital costs, 2 year construction, 80 year life, reduced fuel costs 2 cents/kwh for new construction, plus less waste handling with deep burn.
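A hedged sketch of just the capital component behind numbers like these, using a standard capital recovery factor (annuity). The 5% discount rate and 90% capacity factor are assumptions, and the result excludes fuel, O&M, and interest during construction, so it sits well below the all-in cents/kWh figures above:

```python
# Levelized capital-cost sketch. The article gives overnight cost, build time
# and plant life; the 5% discount rate and 90% capacity factor are ASSUMED.
# Interest during construction, fuel, and O&M are excluded.

def levelized_capital(cost_per_kw, life_years, rate=0.05, capacity_factor=0.90):
    """Cents/kWh needed to recover capital via a standard annuity (CRF)."""
    crf = rate * (1 + rate) ** life_years / ((1 + rate) ** life_years - 1)
    annual_kwh_per_kw = 8760 * capacity_factor
    return cost_per_kw * crf / annual_kwh_per_kw * 100

print(f"$1200/kW, 60-yr life: {levelized_capital(1200, 60):.2f} c/kWh capital component")
print(f"$1000/kW, 80-yr life: {levelized_capital(1000, 80):.2f} c/kWh capital component")
```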

Operation and maintenance can be reduced with more automation and design efficiency.

                                        Volume-Improved Nuclear estimate
1 Fuel                                   2.0  (laser enrichment, 50% uprate)
2 Operating & Maint - Labor & Materials  2.0  (automation, improved designs)
3 Pensions, Insurance, Taxes             1.0
4 Regulatory Fees                        1.0
5 Property Taxes                         2.0
6 Capital                                7.0  (loan guarantees, factories, high volume, longer 80-year operation so more amortization)
7 Decommissioning & DOE waste costs      1.0  (deep burn the waste; more time until decommissioning with a longer life; more time for interest to build in the decommissioning fund)
8 Administrative / overheads             1.0
Total                                   17.0  equal to 1.7 cents per kwh

China could get labor costs down with lower salaries and capital costs could be reduced because of lower construction costs. China could get to 1.3-1.5 cents per kwh.

UPDATE FURTHER READING [Added after the above analysis was made]
This study shows an increased cost for current new power build.
There is a new 12-page whitepaper on current nuclear, coal and natural gas costs.

In the base case, new nuclear capacity produced a levelized cost of $83.40/megawatt-hour. Supercritical coal was at $86.50/MWh; supercritical coal with CCS at $141.90/MWh; IGCC at $92.20/MWh; IGCC with CCS at $124.50/MWh; gas combined-cycle (CC) at $76.00/MWh; and gas CC with CCS at $103.10/MWh. Although it had the highest capital cost, the new nuclear capacity produced the lowest-cost electricity, except for gas-fired CC capacity without CCS.

Natural gas was $76/MWh assuming $6.26/mmBtu.

CBO also found that new nuclear capacity could be competitive at even lower carbon dioxide charges if the price of natural gas rose above the price assumed in the reference scenario ($6.26/mmBtu) or if the construction cost reductions predicted by the reactor designers were accurate. CBO found that in a high gas price environment of $12/mmBtu (near present-day prices), natural gas would no longer be competitive with new nuclear.

Nuclear Energy Institute modeling shows that a merchant nuclear plant with an 80 percent debt/20 percent equity capital structure, supported by a federal loan guarantee, will produce electricity in the range of $64/MWh to $76/MWh. (The range reflects EPC costs from $3,500/kWe to $4,500/kWe) A high-cost ($4,500/kWe EPC cost) nuclear plant producing electricity at $76/MWh is competitive with a gas-fired combined-cycle plant burning $6-8/mmBtu gas or an SCPC plant.

Similar results are found using the same capital cost range for a regulated plant. Assuming a 50 percent debt/50 percent equity capital structure typical of a regulated electric company, and assuming the company is permitted to recover the cost of capital during construction (CWIP), NEI’s financial model shows the levelized cost of electricity from the plant ranges from $74/MWh to $88/MWh – competitive with a gas-fired combined cycle plant burning gas at $8-10/mmBtu or an IGCC plant (without carbon capture and sequestration).

First Integrated Nanowire Sensor Circuitry

Randomly oriented nanowires, on the growth substrate at left, are having a “bad hair day.” But after contact printing, the nanowires on the receiver substrate are highly aligned.

Scientists at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory and the University of California at Berkeley have created the world’s first all-integrated sensor circuit based on nanowire arrays, combining light sensors and electronics made of different crystalline materials. Their method can be used to reproduce numerous such devices with high uniformity.

The Javey group has devised two printing methods, contact and roller. The roller method involves growing nanowires on the surface of a cylinder and rolling it across the application substrate, like painting with a paint roller.

Contact printing involves growing nanowires on a flat substrate, inverting it, and pressing it onto the desired substrate. Then the nanowires are detached by sliding the growth substrate away, leaving them attached to the application substrate. Due to the lack of strong surface chemical interactions between nanowires, the process is self-limited to the transfer of only one layer of nanowires. The printed nanowires are highly aligned in the direction of the sliding.

For their integrated nanowire photosensor circuitry, the Javey group used cadmium selenide nanowires as visible-light sensors. For the electronics, nanowires with a germanium core and a silicon shell were the basis of field-effect transistors that would amplify the current produced by the photosensors in response to light by five orders of magnitude.

Results of the Javey group’s integrated nanowire circuit showed successful photoresponse in 80 percent of the circuits, with fairly small variations among them. Where circuits did fail, the causes were due to defects in fabrication of the circuit connections (10 percent), failure in photosensor printing (5 percent), or defective nanowires (5 percent). The relatively high yield of complex operational circuits proved the potential of the technology, with improvements readily achievable by optimizing nanowire synthesis and fabrication of the devices.

Artist’s impression of an integrated light sensor circuit based on nanowire arrays (Javey Group).

EEStor update from MIT Technology Review

EEStor has manufactured materials that have met all certification milestones for crystallization, chemical purity, and particle-size consistency. The results suggest that the materials can be made at a high-enough grade to meet the company's performance goals. The company also said a key component of the material can withstand the extreme voltages needed for high energy storage. Representatives of the company said in a press release that certification data show that voltage breakdown of the aluminum oxide occurs at 1,100 volts per micron, nearly three times higher than EEStor's target of 350 volts per micron.

EEStor claims that its system, called an electrical energy storage unit (EESU), will have more than three times the energy density of the top lithium-ion batteries today. The company also says that the solid-state device will be safer and longer lasting, and will have the ability to recharge in less than five minutes. Toronto-based ZENN Motor, an EEStor investor and customer, says that it's developing an EESU-powered car with a top speed of 80 miles per hour and a 250-mile range. It hopes to launch the vehicle, which the company says will be inexpensive, in the fall of 2009. At the EESU's core is a ceramic material consisting of a barium titanate powder that is coated with aluminum oxide and a type of glass material.
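The reason breakdown voltage matters so much: for an ideal linear dielectric, stored energy density is u = ½·ε₀·εᵣ·E², so it grows with the square of the field. Barium titanate is strongly nonlinear and its permittivity collapses at high field, so this is only an upper-bound sketch; the εᵣ used is an assumed illustrative value, not an EEStor number:

```python
# Ideal linear-dielectric energy density: u = 0.5 * eps0 * eps_r * E^2.
# Energy stored scales with the SQUARE of the field, which is why the
# 1,100 V/um breakdown result matters. eps_r is an ASSUMED illustrative
# value; real barium titanate is highly nonlinear at high field.

EPS0 = 8.854e-12   # vacuum permittivity, F/m
eps_r = 1000       # assumed; BaTiO3 permittivity varies enormously

for volts_per_micron in (350, 1100):       # EEStor target vs reported breakdown
    e_field = volts_per_micron * 1e6       # V/m
    u = 0.5 * EPS0 * eps_r * e_field ** 2  # J/m^3
    print(f"{volts_per_micron} V/um -> {u / 3.6e6:.0f} kWh/m^3 (ideal, linear)")

# The (1100/350)^2 ~ 9.9x headroom in field squared is the key takeaway.
```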

EEStor claims momentum is building and that they'll start coming out with information about the company's progress on a "more rapid basis." Plans are also under way for a major expansion of EEStor's production lines. "There's nothing complex in this," says company founder Dick Weir, pointing to his past engineering days at IBM. "It's nowhere near the complexity of disk-drive fabrication."

If EEStor is successful, the technology could be used to extend the range, capacity or other performance metrics of electric planes. Current electric planes have a range of about 100 miles; tripling the energy density would give a 300-mile range. Alternatively, a shorter range with more passengers is possible: four people instead of two at about a 100-150 mile range.

Hybrid electric planes can go 720 miles.

Re-inventing Civil Defense: Zero soft targets

If buildings are built better, then the blast pressure has less effect at each range from the blast. 5 PSI-resistant buildings can be made, or existing buildings upgraded, using Hurriquake nails: about $15 of material for each house. If the buildings do not collapse, then there would be fewer fires because of fewer natural gas leaks. The smaller the damage radius, the lower the population at higher risk of death.

Civil defense used to be about bomb shelters in the woods or in our backyards or state supported tunnels and shelters. Those ideas were interesting for their time in the Cold War. However, modern nuclear threats may not provide hours or minutes of warning time for people to leave homes and offices to reach shelters. If a terrorist or group were to deploy a nuclear bomb it may not be in a missile that can be detected and a warning broadcast to the civilian population. The nuclear device might be brought into a harbor or over a border undetected.

The world is changing and over the long term deterrence against the use of nuclear weapons will probably fail, if for no other reason than perfection is difficult to sustain. Nuclear non-proliferation has failed. Almost all the key knowledge is available and there are more and easier ways to produce the enriched uranium.

Civil defense needs to be integrated with our day-to-day lives.
This means applying affordable technology to make homes and offices into survivable structures with zero or limited aesthetic compromises. This was the theme of the previous article on this site about simple and affordable defenses against nuclear weapons. It also includes enabling other effective civil defense measures, such as new ways to "immunize against radiation" or treat radiation damage to prevent deaths.

If our homes and offices are our blast shelters, then 80+% of the time that is where we will be if something bad happens.

If you had an old style bomb shelter in your backyard then it is not likely you would be in it when things hit the fan.

The plan is to design society to shrink the radius of fatalities around a nuclear explosion and reduce the percentage of dead.

There is an online blast calculator with live maps. Note: for blast sizing, pure-fission nuclear bombs cap out at about 500 kilotons; megaton and multi-megaton bombs need fission-triggered fusion designs. It is a non-trivial, multi-year development effort to go from crude working fission bombs to better fission and fusion bombs, and it usually requires testing the designs (nuclear bomb tests, which would be detected).
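The blast-calculator results follow standard cube-root (Hopkinson) scaling: the distance at which a given overpressure occurs grows as yield^(1/3). A sketch with an illustrative reference radius:

```python
# Standard cube-root blast scaling: the distance at which a given overpressure
# occurs scales as yield**(1/3). The reference radius below is illustrative,
# not a value from the article.

def scaled_radius(radius_ref_miles, yield_ref_kt, yield_kt):
    """Radius for the same overpressure at a different yield (Hopkinson scaling)."""
    return radius_ref_miles * (yield_kt / yield_ref_kt) ** (1 / 3)

# If some overpressure reaches R miles for a 20 kt bomb, then for 500 kt (the
# rough ceiling for pure-fission designs noted above) and for 1 Mt it reaches:
r20 = 1.0  # miles, illustrative reference
print(f"500 kt: {scaled_radius(r20, 20, 500):.2f} miles")   # ~2.9x farther
print(f"1 Mt:   {scaled_radius(r20, 20, 1000):.2f} miles")  # ~3.7x farther
```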

Simple steps to reduce deaths from nuclear weapons, hurricanes, earthquakes by 10 to 1000 or more times.

-Require a retrofit of all old houses and new houses to use Hurriquake or better nails
-Distribute new anti-radiation drugs widely
-Use radiation-resistance-increasing gene therapy and drugs
-Use/retrofit with cellulose nanopaper (stronger than cast iron)
-Use stronger cement
-Use carbon nanotube/graphene reinforcement (add hydrogen for radiation blocking)
-Use stronger windows and doors
-Find better fire/thermal resistance modifications

Many small nuclear reactors would enable a more robust electrical grid (in the event of very rare accidents/incidents, recall that we should have radiation resistance enhancing treatments and super-effective anti-rad drugs).

Spend more money to harden the structure of hospitals and other facilities that we would especially need to survive a nuclear or other attack.

10-50 times reduction in fatalities is very do-able even without the tech getting too fancy [Hurriquake nails, anti-radiation drugs, stronger concrete and cellulose paper as strong as steel, monolithic domes for 50-100PSI resistant structures]

1000+ times reduction with better but achievable technology, which we will need because that same category of tech will boost offense (nanofactories could make a lot of nukes, so it is a good thing that MNT would also help make us close to immune to nukes as well).

Very tough buildings do not need to look like bunkers. Monolithic domes made of iCrete, which is being used in New York high-rises like the Freedom Tower, would resist 50 PSI overpressure. That means staying standing 0.3 miles from ground zero of a 1-megaton bomb: 13 times closer than buildings that fall apart at 5 PSI, or 20 times closer than 2 PSI buildings. Initially only people who want to buy a new custom house would consider monolithic domes, but they should be considered for hospitals and other key facilities.

There are several new anti-radiation drugs coming out. The most promising is one that reduced radiation damage by 5000 times in mice and is undergoing human trials now (trials should be done in one or two months; Rice University, James Tour's work). With civil defense that effective against nuclear bombs, any nuclear power plant accident would be trivial and harmless regardless of severity. (i.e., we would not need to worry about Chernobyl at all. We would still not get sloppy, but there would be zero risk of death.)

The boost to defense means a far bigger safety margin against the small fry (North Korea/Iran), the medium-small fry (Pakistan), ambitious groups (Al Qaeda), accidents, or whatever.

The level of deaths currently expected from one nuke would instead require 50 nukes or 1000 nukes (depending upon the level of defense incorporated into widespread construction).

Note: Overall survivability requires widespread adoption. If you are in a 100 PSI-resistant monolithic dome made from concrete with better steel and quartz aggregate, then great, your building and you survive, but it would still be bad for you if all your neighbors and the rest of the city were destroyed and their flimsy buildings set on fire. It is like inoculation with vaccines: the system holds up better if everyone is required to participate.

One aspect of this: if buildings are built not to fall down when a nuclear bomb goes off at some distance, and they do not get set on fire, then there is less nuclear winter. This means fewer problems for agriculture. The nuclear winter scenario is based on widespread fires from burning buildings and other material. Note that simulations have indicated the initial nuclear winter calculations overstated the problem; current simulations show it would be more like a nuclear autumn.

Also, note that any higher construction costs are offset by lower maintenance costs and lower insurance for individuals and society.

If your detection and human intelligence are such a sieve that you cannot tell when someone is bringing 50 or 1000 nukes into your cities, then there is a serious competence problem.

Doing things in advance we can massively mitigate the risks of nuclear bombs, earthquakes and hurricanes and conventional explosives. We should stop thinking that effective mitigation is not possible and aggressively pursue it. We cannot depend upon perfect avoidance of risk exclusively.

Neutron scatter cameras detect nuclear bombs

Current and future port security

DIY nuke detector patrols SF Bay

Cheap domes for blocking nuclear missiles

Detecting nuclear, biological and chemical materials

Countering bioweapons

August 04, 2008

Oil has streak of positive news

Oil supply is running on a streak of positive news after a lot of negative news pushed oil prices up.

Saudi Arabia apparently hit its target of 9.7 million barrels a day in July. There is a steady recovery in oil production from the Gulf of Mexico, where Hurricane Katrina demolished rigs and set back plans by months and in some cases years. BP's Thunder Horse platform, which was left a listing hulk by Katrina, finally began production last month and should be pumping 250,000 barrels a day by the end of next year. Thunder Horse is believed to be the largest oil find ever in the Gulf of Mexico. As oil companies return to their pre-Katrina schedules, Littell says, "we should see one or two of these (new Gulf of Mexico projects) every quarter through 2009."

Crude oil fell to a 13-week low amid speculation that Tropical Storm Edouard will miss most offshore oil facilities as it approaches the coast of Texas.

Oil mega projects for 2008 from wikipedia

Competitors for kite generated wind power

Scientists from Delft University of Technology in the Netherlands harnessed energy from the wind by flying a 10-sq metre kite tethered to a generator, producing 10 kilowatts of power.

An Italian company, Kitegen, has come up with a theoretical design for a system that could generate a gigawatt, as much power as a standard coal-fired power station. Its idea involves flying 12 sets of lines with four 500-sq metre kites on each. Kitegen has been covered several times by this site.

Researchers have plans to test a 50kW version of their invention, called Laddermill, eventually building up to a proposed version with multiple kites that they claim could generate 100 megawatts, enough for 100,000 homes.

Furey has worked out that flying kites in a figure-of-eight pattern means the air flowing over them travels even faster than the ambient wind speed. When a kite needs to be reeled in, it is angled so that it falls out of the sky like a glider, without the need for much power. Ockels's system uses these flying patterns to maximise the power the kites can generate. He is also looking at extending his basic prototype to use multiple kites that yo-yo: when one goes up, another goes down. Ockels estimates that kites could generate power at less than 4p per kilowatt-hour. Google.org, the philanthropic arm of the Californian web-search company, invested $10m (about £5m) last year in a US kite company called Makani Power Inc.

The aim of both teams is to tap into high-altitude wind, which is an energy source that is more abundant and reliable than the ground-level wind on which normal turbines depend.

Ken Caldeira, a climate scientist at Stanford University's Carnegie Institution, has estimated that the total energy contained in wind is 100 times the amount needed by everyone on the planet. But most of this energy is at high altitude.
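The attraction of altitude follows from the wind power density formula P/A = ½·ρ·v³: power grows with the cube of wind speed. A sketch with assumed illustrative speeds (air density is held constant for simplicity, though it actually falls with altitude):

```python
# Why high-altitude wind is attractive: power density in the wind is
# P/A = 0.5 * rho * v^3, so power grows with the CUBE of wind speed.
# The speeds below are ASSUMED illustrative values, not measurements, and
# rho is held constant even though density falls somewhat with altitude.

def wind_power_density(v, rho=1.2):
    """Watts per square metre of swept/kite area, before conversion losses."""
    return 0.5 * rho * v ** 3

ground, altitude = 7.0, 14.0  # m/s, assumed surface vs high-altitude speeds
print(f"ground:   {wind_power_density(ground):7.0f} W/m^2")
print(f"altitude: {wind_power_density(altitude):7.0f} W/m^2")
# Doubling wind speed gives 8x the power density.
```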

Highest 50 nanometer resolution X-ray holograms

Top: the ALS beamline 9.0.1 experiment used a uniformly redundant array (URA) 30 nanometers thick with scattering elements 44 nanometers square (left). At right is the lithograph of Da Vinci’s Vitruvian Man. The scale bar is two micrometers long. Bottom: the FLASH experiment used a URA with 162 pinholes, next to a Spiroplasma bacterium. The 150-nanometer diameter pinholes in the URA limited resolution, but computer processing improved image resolution to 75 nanometers. The scale bar is four micrometers long.

The pinhole camera, a technique known since ancient times, has inspired a futuristic technology for lensless, three-dimensional imaging. Working at both the Advanced Light Source (ALS) at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory, and at FLASH, the free-electron laser in Hamburg, Germany, an international group of scientists has produced two of the brightest, sharpest x-ray holograms of microscopic objects ever made, thousands of times more efficiently than previous x-ray-holographic methods.

The x-ray hologram made at ALS beamline 9.0.1 was of Leonardo da Vinci’s famous drawing, “Vitruvian Man,” a lithographic reproduction less than two micrometers (millionths of a meter, or microns) square, etched with an electron-beam nanowriter. The hologram required a five-second exposure and had a resolution of 50 nanometers (billionths of a meter).

The other hologram, made at FLASH, was of a single bacterium, Spiroplasma milliferum, made at 150-nanometer resolution and computer-refined to 75 nanometers, but requiring an exposure to the beam of just 15 femtoseconds (quadrillionths of a second).

The values for these two holograms are among the best ever reported for micron-sized objects. With already-established technologies, the resolution of these methods could be pushed to just a few nanometers, or, with computer refinement, even further.

The research paper at Nature Photonics

Massively parallel X-ray holography

Stefano Marchesini, Sébastien Boutet, Anne E. Sakdinawat, Michael J. Bogan, Saša Bajt, Anton Barty, Henry N. Chapman, Matthias Frank, Stefan P. Hau-Riege, Abraham Szöke, Congwu Cui, David A. Shapiro, Malcolm R. Howells, John C. H. Spence, Joshua W. Shaevitz, Joanna Y. Lee, Janos Hajdu & Marvin M. Seibert

Advances in the development of free-electron lasers offer the realistic prospect of nanoscale imaging on the timescale of atomic motions. We identify X-ray Fourier-transform holography as a promising but, so far, inefficient scheme to do this. We show that a uniformly redundant array placed next to the sample multiplies the efficiency of X-ray Fourier transform holography by more than three orders of magnitude, approaching that of a perfect lens, and provides holographic images with both amplitude- and phase-contrast information. The experiments reported here demonstrate this concept by imaging a nano-fabricated object at a synchrotron source, and a bacterial cell with a soft-X-ray free-electron laser, where illumination by a single 15-fs pulse was successfully used in producing the holographic image. As X-ray lasers move to shorter wavelengths we expect to obtain higher spatial resolution ultrafast movies of transient states of matter.
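The principle behind Fourier-transform holography can be sketched numerically. In this toy 1D example the uniformly redundant array is replaced by a single reference pinhole for simplicity (an assumption for illustration; the URA of many pinholes is exactly what multiplies the efficiency in the paper). Inverse-transforming the recorded intensity yields the field's autocorrelation, which contains an image of the sample displaced by the sample-reference separation:

```python
import numpy as np

# Toy 1D Fourier-transform holography: sample plus one reference pinhole.
# The detector records only the intensity |F|^2 of the far-field pattern;
# the inverse transform of that intensity is the field's autocorrelation.
N = 256
field = np.zeros(N)
obj = np.array([1.0, 3.0, 2.0, 5.0, 4.0])
field[100:105] = obj          # the "sample"
field[200] = 1.0              # reference pinhole, well separated

intensity = np.abs(np.fft.fft(field))**2   # what the detector measures
autocorr = np.fft.ifft(intensity).real     # holographic reconstruction

# The cross term with the reference reproduces the object at an offset of
# (object position - 200) mod N, i.e. starting at index 100 - 200 + 256 = 156.
image = autocorr[156:161]
print(np.allclose(image, obj))             # True
```

Keeping the reference well away from the sample is what separates the recovered image (and its twin) from the sample's own autocorrelation around zero offset.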

Thermoelectrics and refrigerators

Members of the Quantum Simulations Group at Lawrence Livermore National Laboratory provide an extensive discussion of how thermoelectrics could replace Freon-based refrigerators once inexpensive thermoelectric materials reach a ZT of 3.

The Livermore group has begun working on simulations [modeling material processes using quantum molecular dynamics methods] for a diverse group of technological applications. For example, nanoscale materials could improve cooling technologies in military equipment and reduce the size of gamma radiation detectors being developed for homeland security.

Thermoelectric materials convert heat into electricity and vice versa. They have no moving parts and release no pollutants into the environment. A few niche markets have used them for decades to cool electrical parts or generate power. Researchers have considered using thermoelectric-based refrigerators to replace current heat-pump-based refrigerators that compress and expand a refrigerant such as Freon.

Canted nanowires grown in the [001] direction can achieve a ZT of 3.5 but require considerable doping with either phosphorus or boron. “I doubt that the wires could be doped strongly enough for this surface to work,” says Vo. “Wires grown in the [011] direction will probably be the best compromise.”

Although the low effective mass of silicon increases electrical conductivity, it also contributes to a high thermal conductivity. Thermal conductivity must be low for a thermoelectric material to be efficient. One solution is to change the material used for the wires. Vo’s simulations indicate that a SiGe combination will reduce lattice thermal conductivity by as much as five times without affecting electrical conductivity.
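The figure of merit in question is ZT = S²σT/κ, where S is the Seebeck coefficient, σ the electrical conductivity, and κ the total (electronic plus lattice) thermal conductivity. A quick sketch of why cutting lattice conductivity matters, using purely illustrative numbers rather than the Livermore group's simulation values:

```python
# ZT = S^2 * sigma * T / (kappa_e + kappa_lattice)
# All material numbers below are illustrative assumptions, not the
# actual simulation values from the Livermore group.

def zt(seebeck_v_per_k, sigma_s_per_m, t_kelvin, kappa_e, kappa_lattice):
    power_factor = seebeck_v_per_k**2 * sigma_s_per_m
    return power_factor * t_kelvin / (kappa_e + kappa_lattice)

base = zt(2e-4, 1e5, 300, kappa_e=0.5, kappa_lattice=5.0)   # Si-like wire
sige = zt(2e-4, 1e5, 300, kappa_e=0.5, kappa_lattice=1.0)   # lattice kappa cut 5x
print(round(base, 2), round(sige, 2))   # 0.22 0.8
```

Because the electronic part of κ is tied to σ, alloying to scatter phonons (as in SiGe) attacks only the lattice term, which is why it can raise ZT severalfold without hurting electrical conductivity.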

From the Air Conditioning and Refrigeration Technology Institute:

The final report on thermoelectric technology assessment from the ACRT Institute

The energy conversion efficiency, or Coefficient of Performance (COP), of thermoelectric cooling devices is determined by the thermoelectric figure of merit, commonly denoted ZT. The highest ZT to date is reported in Bi2Te3/Sb2Te3 and PbSeTe/PbTe superlattice thin films. Coolers based on such materials typically have a COP of ~2, which is lower than the COP of 3-4 of vapor compression refrigerators. However, there is no known theoretical impediment to significant increases in thermoelectric energy conversion efficiency, and given a breakthrough in materials, thermoelectric technology might offer the possibility of a safe, efficient, and affordable alternative to fluorocarbon compression equipment.
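The link between ZT and COP can be made concrete with the standard single-stage Peltier result, COP_max = (Tc/(Th - Tc)) · (√(1+ZT) - Th/Tc)/(√(1+ZT) + 1), where ZT is evaluated at the mean temperature. A sketch with an assumed 280 K / 300 K refrigerator lift:

```python
from math import sqrt

# Maximum COP of an ideal single-stage thermoelectric cooler. The
# 280 K cold side / 300 K hot side lift is an assumed example.

def cop_max(zt, t_cold, t_hot):
    m = sqrt(1.0 + zt)
    return (t_cold / (t_hot - t_cold)) * (m - t_hot / t_cold) / (m + 1.0)

print(round(cop_max(1.0, 280, 300), 1))   # ~2.0: today's best superlattice coolers
print(round(cop_max(3.0, 280, 300), 1))   # ~4.3: competitive with vapor compression
```

The ZT=1 case reproduces the report's COP of ~2 for today's superlattice materials, and ZT=3 lands in the 3-4 range of vapor compression, which is why ZT of 3 is the usual threshold quoted for replacing compressor refrigerators.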

The use of thermoelectric devices and systems has been limited by their relatively low energy conversion efficiency. Present commercial thermoelectric devices operate at about 10% of Carnot efficiency, whereas the efficiency of a compressor-based refrigerator increases with size: a kitchen refrigerator operates at about 30% of Carnot efficiency and the largest air conditioners for big buildings operate near 90%.

Today’s thermoelectric devices are particularly useful when efficiency is less important than small size, low weight, or high reliability. For example, thermoelectric devices are suited to situations where the heat load is small (say, <25 W), the temperature lift is small (say, <10°C), or the variation of the heat load is large (e.g., a train passenger cabin). It is important to note that the COP of thermoelectric coolers increases significantly as the temperature lift decreases.

• Instead of utilizing a full-fledged thermoelectric cooling system, it is possible to use a thermoelectric heat pump to improve the performance of an existing vapor compression system, a so-called “hybrid system.” For example, a hybrid vapor-compression/thermoelectric system could use thermoelectric heat pumps to enhance the outlet subcooling of a condenser, where the thermoelectric heat pumps operate at small ΔT and high COP. Theoretical analysis predicts that the cooling capacity and COP of the hybrid system could be significantly improved.

Thermoelectric heat pumps could operate at very high COP (possibly COP > 6) under the condition of small temperature lift. They would provide a high-COP boost to conventional refrigerating systems.

The thermoelectric subcooler is modeled as an additional component that provides a given temperature lift. The important findings from the simulation studies are listed below:
• A theoretical maximum improvement of 16.2% in COP can be achieved. The corresponding increase in cooling capacity is about 20%.
• A theoretical maximum improvement of 35% in capacity can be achieved, without change in COP.
• No increase in the size of the heat exchangers in the system is required.
• The economic aspects of coupling a thermoelectric device with a conventional vapor compression system remain to be investigated.

• High reliability: Thermoelectric coolers possess high reliability. Depending on the conditions of application, the lifetime of thermoelectric coolers is in the range of 100,000 to 200,000 hours.

An older overview of thermoelectric coolers from 1996

Thermoelectric coolers are solid state heat pumps used in applications where temperature stabilization, temperature cycling, or cooling below ambient are required. There are many products using thermoelectric coolers, including CCD cameras (charge coupled device), laser diodes, microprocessors, blood analyzers and portable picnic coolers.

12 questions about thermoelectric cooling

Let's look conceptually at a typical thermoelectric system designed to cool air in an enclosure (e.g., picnic box, equipment enclosure, etc.); this is probably the most common type of TE application. Here the challenge is to 'gather' heat from the inside of the box, pump it to a heat exchanger on the outside of the box, and release the collected heat into the ambient air. Usually, this is done by employing two heat sink/fan combinations in conjunction with one or more Peltier devices. The smaller of the heat sinks is used on the inside of the enclosure; cooled to a temperature below that of the air in the box, the sink picks up heat as the air circulates between the fins. In the simplest case, the Peltier device is mounted between this 'cold side' sink and a larger sink on the 'hot side' of the system. As direct current passes through the thermoelectric device, it actively pumps heat from the cold side sink to the one on the hot side. The fan on the hot side then circulates ambient air between the sink's fins to absorb some of the collected heat. Note that the heat dissipated on the hot side not only includes what is pumped from the box, but also the heat produced within the Peltier device itself (V x I).

Let's look at this in terms of real numbers. Imagine that we have to pump 25 watts from a box to bring its temperature to 3°C (37.4°F) from a 20°C (68°F) ambient. To accomplish this, we might well have to take the temperature of the cold side sink down to 0° C (32°F). Using a Peltier device which draws 4.1 amps at 10.4 V, the hot side of the system will have to dissipate the 25 watts from the thermal load plus the 42.6 watts it takes to power the TE module (for a total of 67.6 watts). Employing a hot side sink and fan with an effective thermal resistance of 0.148 C°/W (0.266F°/W), the temperature of the hot side sink will rise approximately 10°C (18°F) above ambient. It should be noted that, to achieve the 17° C drop (30.6°F) between the box temperature and ambient, we had to create a 30° C (54°F) temperature difference across the Peltier device.
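The arithmetic above is just a heat balance: everything pumped from the box plus the module's own electrical dissipation must pass through the hot-side sink, whose thermal resistance sets the temperature rise. The numbers from the example check out:

```python
# Heat balance for the worked Peltier example above.
pumped_heat_w  = 25.0             # heat removed from the box
module_power_w = 4.1 * 10.4       # V x I dissipated in the Peltier device
total_hot_w    = pumped_heat_w + module_power_w
sink_rise_c    = total_hot_w * 0.148   # hot-side sink resistance, C/W

print(f"module dissipation: {module_power_w:.1f} W")   # 42.6 W
print(f"hot side total:     {total_hot_w:.1f} W")      # 67.6 W
print(f"sink rise:          {sink_rise_c:.1f} C")      # 10.0 C above ambient
```

This also shows why the temperature difference across the device (30°C) exceeds the box-to-ambient drop (17°C): the cold-side sink must run below the box air, and the hot-side sink above ambient, to move heat at each interface.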

More research papers from the Air Conditioning and Refrigeration Technology Institute.

More research projects of the ACRT institute.

An existing commercial thermoelectric refrigerating cooler [many other high-end applications are listed in the ACRT thermoelectric assessment report]:

The Igloo Kool Mate 56-Quart Thermoelectric Cooler sold at Walmart for $119.73

- Plugs into 12V utility outlet in your boat or car
- 56-quart (1.9 cubic feet) capacity holds up to 72 twelve-ounce cans
- Cools down to 44°F below outside temperature

This site has already reported that there is about a billion dollars in thermoelectrics research funding from the US military and the Department of Energy.

Replacing Freon in air conditioners and refrigerators would be a big part of reducing greenhouse gases. Freon refrigerant gas was banned from vehicular air conditioning systems in the mid-1990s to prevent ozone layer depletion. R134-a refrigerant gas was universally adopted as the replacement. However, R134-a has 1,300 times the global warming potential of CO2. The European Union is prohibiting use of R134-a in cars for:
• New models in 2011
• All new cars in 2017

The research goal is included on page 35 of the 55-page slide deck presentation Vehicular Thermoelectrics Applications Overview by John W. Fairbanks, Technology Development Manager for Thermoelectrics, U.S. Department of Energy, Washington, DC, presented at DEER 2007, Detroit, Michigan, August 15, 2007.
