July 25, 2008

Steam power > Combustion > Jet engines > Advanced Thermoelectric > Carnot limit

Thermoelectrics using nanostructures offer the potential of getting very close to the Carnot limit of efficiency using very lightweight systems for converting heat to electricity. Early versions of thermoelectrics have been sold for many years and are being rapidly improved. About 90 percent of the world’s power (approximately 10 TW) is generated by heat engines that convert heat to mechanical motion, which can then be converted to electricity when necessary. Such heat engines typically operate at 30-40 percent efficiency, such that roughly 15 TW of heat is lost to the environment. To be competitive with current engines and refrigerators (which reach 30-40 percent of the Carnot limit), one must develop materials with ZT > 3. For the last 50 years, the ZT of materials has increased only marginally, from about 0.6 to 1, resulting in performance less than 10 percent of the Carnot limit. There is no fundamental upper limit to ZT.

Some experts doubt that high-ZT materials can reach the levels needed to displace the best current systems. The systems shown here represent an estimate of ‘best practice,’ meaning these values are based on the actual performance of up-to-date systems. The comments for each of the ZT levels seem pessimistic: ZT 2.0 is happening now, so each of the level descriptions should be adjusted.
ZT 2 (happening now, commercial by 2011)
ZT 4 (in lab work, commercial 2012-2015)
ZT 20 (ambitious, but plausible eventually)
ZT infinity (unknown)

These are not ‘best possible’ values as each of these technologies can be expected to improve going forward. The smallest mechanical engine represented is the ‘Solar/Stirling’ machine at 25 kWe. The others are at least 9 times larger and range up to 1600 MWe for the Nuclear/Brayton+Rankine study.

Typical conversion systems become less efficient as they are scaled down to small sizes. This means there is a crossover point: below some power level, thermoelectric technology will tend to be more efficient. Increasing ZT will move the crossover point to higher power levels, increasing the range of applications where thermoelectrics compete. Thus a ZT of about 3 is needed to compete with today's best car-sized and refrigerator-sized mechanical systems.

Higher ZT scores are a way to rank materials: the higher the score, the closer the material gets to the Carnot limit (the ultimate efficiency for heat engines).
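The link between ZT and the fraction of the Carnot limit can be sketched with the standard textbook expression for the maximum efficiency of a thermoelectric generator. This is an idealized upper bound (real devices do worse), and the 600 K / 300 K temperatures below are illustrative, not from the article:

```python
import math

def carnot(t_hot, t_cold):
    """Carnot limit for any heat engine between two temperatures (kelvin)."""
    return 1.0 - t_cold / t_hot

def te_max_efficiency(zt, t_hot, t_cold):
    """Ideal upper bound on thermoelectric generator efficiency for an
    average figure of merit ZT (standard textbook formula)."""
    m = math.sqrt(1.0 + zt)
    return carnot(t_hot, t_cold) * (m - 1.0) / (m + t_cold / t_hot)

# Illustrative 600 K hot side, 300 K cold side (Carnot limit: 50%)
for zt in (1, 3, 20):
    eff = te_max_efficiency(zt, 600.0, 300.0)
    print(f"ZT={zt:>2}: {eff:.1%} absolute, {eff / carnot(600.0, 300.0):.0%} of Carnot")
```

At ZT = 3 the ideal bound works out to 40% of Carnot, matching the threshold cited above for competing with mechanical engines, and as ZT grows without limit the bound approaches the Carnot limit itself.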

Many people are aware of the importance of getting materials that are stronger and lighter, like carbon nanotubes, which are about 100 times stronger than steel by weight. Carbon nanotubes could enable the creation of a space elevator for a new age of cheap access to space. They are part of the long progression of human material technology that has defined eras: the Stone Age, the Bronze Age and the Iron Age.

The dominant power technology has similarly been important in defining human civilizations: the age of steam and the age of the internal combustion engine mark the overall timeline of heat engine technology. The steam engine has not gone away; about 50% of electrical power in the USA still comes from coal plants that heat water to make steam for modern turbines. Most nuclear power plants also use steam turbines.

Technology has been advancing by trying to achieve the theoretical efficiency that is possible with the Rankine cycle.

The heat efficiency of engines is limited by the Carnot efficiency (the theoretical maximum) and by endoreversible limits (identical to the Carnot cycle except that the two heat-transfer processes are not reversible).
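Both limits can be written down directly: the Carnot limit is 1 − Tc/Th, and the endoreversible (Curzon-Ahlborn) efficiency at maximum power is 1 − √(Tc/Th). A quick sketch with illustrative steam-plant temperatures (not figures from the article):

```python
import math

def carnot_efficiency(t_hot, t_cold):
    # Theoretical maximum for any heat engine (temperatures in kelvin).
    return 1.0 - t_cold / t_hot

def endoreversible_efficiency(t_hot, t_cold):
    # Curzon-Ahlborn limit: like the Carnot cycle except the two heat-transfer
    # processes are irreversible; gives the efficiency at maximum power output.
    return 1.0 - math.sqrt(t_cold / t_hot)

# Illustrative steam-plant numbers: 840 K hot side, 300 K environment
print(f"Carnot:         {carnot_efficiency(840.0, 300.0):.1%}")
print(f"Endoreversible: {endoreversible_efficiency(840.0, 300.0):.1%}")
```

The endoreversible figure of roughly 40% is close to the real-world efficiency of the heat engines cited above, which is why it is often a more useful practical benchmark than the raw Carnot number.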

Some of the ways of getting to higher thermal efficiency are more expensive and less practical; they involve using gases or liquids that are difficult to handle. The internal combustion engine uses the Otto cycle, diesel engines use the Diesel cycle, jet engines use the Brayton cycle, and supersonic ramjets may use the Humphrey cycle.

Thermoelectrics offer an alternative conversion of heat to electricity with simplified handling of liquids or gases, and they appear both theoretically and experimentally to offer a practical path to higher thermal efficiency. They can also be applied to all aspects of heating and cooling: power from heat as well as more efficient air conditioners and refrigerators. They can also be cheaper and lighter than current engines for converting heat; thermoelectrics have been light enough that they were already chosen for energy conversion in spacecraft, where weight was a critical factor. They can also have fewer moving parts, which means they break down less frequently and are cheaper to maintain. The actual part where heat is converted to electricity does not move; it is solid state. There are no pistons or gears, only a tube that brings hot gas or liquid past the thermoelectrics and tubes for cooling the other side.

Very high efficiency thermoelectrics would be a fundamental upgrade to the power technology of civilization, which would affect everything related to heat, power and cooling.

The first generation of thermoelectrics is already in beer coolers and car seats, with a ZT of 0.7 or less.

The second generation of thermoelectrics will be air conditioners and co-generators in cars, mostly in power ranges of 100 W to 5 kW, with ZT of 1 to 3. They will be commercial from 2009-2015. (Others had targeted a 2011-2012 start, but progress appears to be going better.)

The third generation of thermoelectrics will replace combustion engines or significantly boost the efficiency of cars, power plants and industrial processes, with ZT of 4-20. They will be commercial and having wide impact from 2015 onwards. (Others had targeted a 2020 start, but progress appears to be going better.)

Fourth generation thermoelectrics, if possible, would be able to replace large power plant thermal converters, with ZT of 20+.

Second, third and fourth generation thermoelectrics would help to increase the power output of higher temperature RTGs (radioisotope thermoelectric generators) for space vehicles. Most RTGs have been about 10% efficient; advanced thermoelectrics would boost higher temperature RTGs to 40-50% efficiency. RTGs are usually the most desirable power source for unmanned or unmaintained situations needing a few hundred watts or less of power, for durations too long for fuel cells, batteries and generators to provide economically, and in places where solar cells are not viable. Advanced thermoelectrics would expand that superior power range to tens or hundreds of kilowatts.

There was a recent announcement from Ohio State University of a material for converting heat to electricity that could enable a 10% increase in car fuel efficiency, with the prospect of soon reaching a 15% improvement. This is 3-4 years from commercialization. Almost all the recent improvements (and there have been many significant ones) to thermoelectric materials have come from decreasing their thermal conductivity. Heremans and his colleagues instead increased the voltage that the material creates.

There are a lot of existing thermoelectric successes and many promising research projects. Ultrananocrystalline dispersed doped diamond (now available in kg quantities) could achieve a ZT of 4 at 1200-2000°C, which would enable conversion of over 40% of heat to electricity. Getting over 30% heat-to-electricity conversion would make it practical to replace combustion engines with thermoelectrics.

BMW has found that re-purposing the otherwise wasted exhaust heat to power a thermoelectric generator producing up to 1 kW can reduce real-world fuel consumption by as much as 5%. The benefit comes from storing the electricity and using it to pre-heat the engine or power the air-conditioning system.

Honda’s similar work on the Rankine Cycle [non-thermoelectric form of waste heat capture], which uses exhaust gas to heat water, creating steam that spins a turbine to generate electricity, has found as much as 32kW can be generated by the method, though the weight penalty for the device reduces fuel efficiency benefits to 3.8% at a 100km/h (62mph) cruise in a 2.0L direct-injection petrol four-cylinder.

The FreedomCAR project, which includes a portion for researching thermoelectrics, is funded at $1.7 billion over 5 years. Cummins (the big diesel engine company), GM and Argonne National Laboratory are working with the DOE on developing and commercializing thermoelectrics.

Thermoelectrics can be used to make zonal air conditioning, which would use about 700 W per person instead of 3500-4000 W for the whole car; a 3000 W system for five people is expected out in 2012 (early target range of 2012-2015). Thermoelectric (TE) HVAC enables the use of distributed cooling/heating units. This approach would cool or heat the specific number of occupants rather than the whole cabin and its components. In addition to decreasing engine load and thus increasing vehicle efficiency, TE HVAC would reduce or eliminate the need for conventional air-conditioning working fluids, further reducing greenhouse gas emissions.

Michigan State University has been working with materials with a ZT of 1.6-1.8 at 650-700K since late 2007. This is better than what was announced at Ohio State University. Michigan State is targeting materials with a ZT of 2.6 at 800K.

Argonne National Laboratory is working on boron-doped diamond in kg quantities. They think they can get ZT over 4 working at 1200-2000°C. Temperature differentials that high would translate to 40+% heat-to-electricity conversion.

Waste Heat Recovery Program: Development of a 100-Watt High-Temperature Thermoelectric Generator, John LaGrandeur, BSST LLC.

Development of Thermoelectric Technology for Automotive Waste Heat Recovery at General Motors.

GM has 350 W average systems on a truck giving 3% fuel savings, and can get to 4% with better integration. The near-term target is 10% fuel savings.

Efficiency Improvement in an Over-the-Road Diesel-Powered Engine System by the Application of Advanced Thermoelectric Systems Implemented in a Hybrid Configuration Harold Schock, Michigan State University

From Feb 2008: ZT of 1.6-1.8 at 650-700K, targeting ZT 2.6 at 800K.

Research has been funded for the development of ultrananocrystalline dispersed doped diamond (now available in kg quantities).

• Stable to > 1200°C, mechanically robust
• Electrical conductivity increased by heat treatment and boron additions
• Seebeck and thermal conductivity measurements in progress
• Potential of ZT > 4
• Bulk TE produced by a surface-catalyzed reaction with hydrocarbon molecules that binds UNCD particles together covalently in an sp2-bonded nanocarbon network
• Inexpensive, non-toxic, environmentally benign

Future Directions – Long-range program
– Powder consolidation
– Atmospheres (argon initially, then methane, and hydrogen)
– Temperature (might be as high as 2000°C)
– Doping (p and n type)

Thermoelectric news

Advanced aerodynamic solutions are also being developed and should be capable of being economically mass-produced, safe, and amenable to the broad commercial truck market. Factory installed aerodynamic solutions expected to achieve a 20% reduction in trailer aerodynamic drag or 15% improvement in overall fuel economy of the tractor/trailer combination shall be available for purchase by truck fleets within 2 1/2 years from project start date, according to the DOE.

Stanford study of aging supports developmental drift model: if right, gene therapy is the main anti-aging weapon

A Stanford study of the aging of the C. elegans worm does not support the accumulated chemical damage model of aging.

If aging is not a cost of unavoidable chemistry but is instead driven by changes in regulatory genes, the aging process may not be inevitable. It is at least theoretically possible to slow down or stop developmental drift. If the Stanford study is correct and applicable to humans, then periodic (every few decades) genetic engineering could be used to slow down or stop the effects of developmental drift aging. There has been a lot of recent progress towards genetic engineering and gene silencing; adults with genetically caused blindness have had eyesight improvement from gene therapy.

It is also possible that both mechanisms have an effect on aging, but that the developmental drift model may be more dominant. Fixing developmental drift could lengthen life spans by 2-4 times but using SENS against chemical damage would be needed for even longer lifespans.

"The take-home message is that aging can be slowed and managed by manipulating signaling circuits within cells," said Marc Tatar, PhD, a professor of biology and medicine at Brown University who was not involved in the research. "This is a new and potentially powerful circuit that has just been discovered for doing that."

Key regulatory pathways optimized for youth have drifted off track in older animals. Natural selection can’t fix problems that arise late in the animals’ life spans, so the genetic pathways for aging become entrenched by mistake. Kim’s team refers to this slide as “developmental drift.”

“We found a normal developmental program that works in young animals, but becomes unbalanced as the worm gets older,” he said. “It accounts for the lion’s share of molecular differences between young and old worms.”

Kim can’t say for sure whether the same process of drift happens in humans, but said scientists can begin searching for this new aging mechanism now that it has been discovered in a model organism. And he said developmental drift makes a lot of sense as a reason why creatures get old.

“Everyone has assumed we age by rust,” Kim said. “But then how do you explain animals that don’t age?”

Some tortoises lay eggs at the age of 100, he points out. There are whales that live to be 200, and clams that make it past 400. Those species use the same building blocks for their DNA, proteins and fats as humans, mice and nematode worms. The chemistry of the wear-and-tear process, including damage from oxygen free-radicals, should be the same in all cells, which makes it hard to explain why species have dramatically different life spans.

Very high temperature gas-cooled reactor could burn 65% of uranium

The US Department of Energy (DoE) is planning to build a very high temperature gas-cooled reactor (VHTR) at Idaho National Laboratory, with the prime objective of supplying heat at about 900°C. This heat could be used to generate electricity, or for other industrial processes such as hydrogen production or water desalination at a neighbouring facility.

The current timeline to have the first VHTR completed is 2021, assuming the NGNP is fully funded from now until then.

Dan Yurman indicates that the NGNP had been getting $30 million per year since 2005, which is one third the needed level; then in Dec 2007 it got $100 million, which will help get more work done but still leaves the project slightly behind the 2021 schedule.

The vision for Idaho National Labs for the VHTR is similar to the vision of the uranium hydride reactor by Hyperion Power Generation and the Chinese 200 MW HTR-PM.

If NextGen is successful, between 15 and 20 years from now North Americans will build dozens - maybe hundreds - of comparatively small, high-temperature gas-cooled reactors to breathe heat into refineries and chemical plants, especially plants to make hydrogen. Right now on-site industrial heating plants are fueled with gas, coal or oil and most nuclear power today is generated from big light-water reactors to meet base-load electrical demand.

Light-water reactors that operate today drive turbines by attaining temperatures of 300 degrees C or so but high-temperature, gas-cooled reactors can reach 900 degrees C or higher. That means such reactors can provide more than enough heat to refine crude oil or to separate bitumen from shale or sand in the Western United States and Canada - all of which helps extend domestic energy supplies.

Even more hope is needed when you consider the project's main industrial goal, hydrogen production, which requires temperatures of 800 degrees C.

If all this is to happen by 2021, engineers and scientists must perfect heat-resistant materials and other aspects of the design, while private companies must risk billions beyond the federal tax dollars on the $3 billion to $4 billion project.

Research commissioned this week by the DoE would see two teams of scientists examine the potential of a VHTR such as NGNP for 'deep-burn' of nuclear fuel. 'Deep-burn' refers to the VHTR's ability to burn up to 65% of its initial fuel, compared to burn-up levels of around 5% in conventional light-water reactors. Instead of 95% of the uranium ending up as nuclear waste, this reactor would have only 35% become nuclear waste.

This deep-burn capability would not be as good as a molten salt nuclear reactor, which can burn 99% of its initial fuel, but the 65% deep burn would still be a 13-fold improvement over existing reactors. The higher temperature of the reactor would also allow 60% of the heat to be converted to electricity, instead of 33% for current reactors.
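The burn-up comparison can be checked with simple arithmetic (all percentages as given in the article):

```python
# Fraction of initial fuel burned, as stated in the article
lwr_burn = 0.05    # conventional light-water reactor
vhtr_burn = 0.65   # very high temperature gas-cooled reactor (deep burn)
msr_burn = 0.99    # molten salt reactor

improvement = vhtr_burn / lwr_burn   # 13x more of the fuel is burned
lwr_waste = 1 - lwr_burn             # 95% of uranium becomes waste today
vhtr_waste = 1 - vhtr_burn           # 35% would become waste

print(f"{improvement:.0f}x burn-up improvement")
print(f"waste fraction: {lwr_waste:.0%} (LWR) vs {vhtr_waste:.0%} (VHTR)")
```

The waste fraction falls from 95% to 35%, consistent with the roughly one-third high-level-waste volume mentioned below.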

The concept of deep burn relates to the US-led Global Nuclear Energy Partnership, in which advanced reactors would destroy similar wastes produced by mainstream light-water reactors of the kinds widely used today. It is projected that volumes of high-level waste could be reduced by a factor of 50, while extra electricity is generated. The reactor envisaged for GNEP, however, would be a sodium-cooled model.

The VHTR would have about one third the volume of high level waste.

Thermoelectric advance could increase car, truck, power plant and industrial fuel efficiency by 15%

Industrial waste heat is 7 quads per year in the USA, and there is more waste heat from power plants and from cars. Capturing 20% of the industrial waste heat would be 1.4 quads every year; 1.4 quads is double all of the wind energy generated in the USA from 2003 to 2006. Further down this article are diagrams and descriptions of the many ways to capture waste heat in cars and trucks (not just thermoelectrics), and the positioning of the temperature differentials is explained.
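As a quick sanity check of the scale involved (unit conversion only; the 7 quads and 20% figures are from the article):

```python
BTU_PER_QUAD = 1e15       # 1 quad = 10^15 BTU
JOULES_PER_BTU = 1055.06
JOULES_PER_TWH = 3.6e15

captured_quads = 7.0 * 0.20   # 20% of 7 quads/yr of industrial waste heat
captured_twh = captured_quads * BTU_PER_QUAD * JOULES_PER_BTU / JOULES_PER_TWH

print(f"{captured_quads} quads/yr is about {captured_twh:.0f} TWh of heat per year")
```

That is about 410 TWh of heat per year; the electricity obtainable from it depends on the conversion efficiency of the thermoelectrics applied.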

This would mean 15% more fuel-efficient cars and trucks starting in 2011-2012, and better air conditioners and refrigerators that do not need R134a gas. Applying this to our current power plants would be like adding 10-30 nuclear power plants, 150-375 coal plants and 500-1500 natural gas plants that would not use any more fuel, because the power would come from more efficient use of existing power plants.

Thermoelectric technology and the FreedomCAR project were covered at this site before.

The superior thermoelectrics can also replace air conditioner and refrigerator technology.

MIT Technology Review has coverage

The ZT 1.5 efficiency would translate into a 10 percent increase in the fuel economy of cars if the devices are used to replace alternators by generating electricity from the heat in exhaust. ZT 3.0 materials would mean a 15% increase in the fuel economy of cars and trucks. The devices could begin selling in 3 to 4 years.

NOTE: If you get up to ZT 5 or so with a cheap enough system then you can replace most of the moving parts of an engine with thermoelectrics. You would generate heat and then use thermoelectrics with no moving parts to convert the heat directly to electricity with higher efficiency.

The new material, thallium-doped lead telluride, has a ZT rating of 1.5, more than twice that of the previous commercial leader. The researchers believe they can increase it to a ZT of 3.0. ZT is a measure of how good a material is at converting heat to electricity; the higher the better. The chart below shows the differences in efficiency gains from different levels of ZT. The current commercial best is sodium-doped lead telluride at 0.71. It is not necessary to understand the science or the principles behind ZT to appreciate it or thermoelectrics.

What's more important to Heremans is that the new material is most effective between 450 and 950 degrees Fahrenheit (300-600K temperature differences are possible), a typical temperature range for power systems such as automobile engines, and usable for higher temperature utility power reactors (nuclear and fossil fuel).

The current commercial best ZT of 0.7 meant 5-10% recapture of energy from heat at 200-300K temperature differences. A ZT of 1.5 means 12-18% recapture for 300-600K temperature differences, and a ZT of 3.0 for 300-600K differences means 16-25% recapture.

If a ZT of 10 could be achieved for 300-600K differences, then 26-38% energy recapture would be possible. Being able to apply that to current or future nuclear or fossil fuel reactors would be a great boost in energy from fewer power plants.

Waste heat from cars

Waste heat is a big part of the 62% of energy lost to friction and heat in cars.

Heat2power is a company that is working on capturing the waste heat from cars and provides an overview of the work in this area.

Not just thermoelectrics can be used for capturing waste heat in cars; other options include:
• Electrical Turbo-Compounding (Caterpillar) : 3 to 10% announced fuel economy
• Mechanical Turbo-Compounding : 5 to 10% announced fuel economy
• TIGERS : Turbo-generator Integrated Gas Energy Recovery System : 6% announced fuel economy
• Thermo-electricity : 20% announced fuel economy
• Stirling Cycle in co-generation : up to 40% announced fuel economy but a too low specific power
• Rankine Cycle : Turbosteamer : 17% announced fuel economy
• Organic Rankine Cycle (ORC) : up to 60% announced fuel economy
• Thermo-acoustics : low specific power

Caterpillar has a patent for using thermoelectrics in diesel engines. The above diagram shows how they would place a system to capture a 265°C temperature differential under the truck chassis.

The ZT 1.5 material would recapture 10% of the power at a 265°C temperature difference; the ZT 3.0 material would recapture 15%.

Power flows in a car/truck from heat2power

Heat to Power method for capturing waste heat in a car/truck

The heat2power system is based on the use of one or more cylinders for the regeneration of waste heat. These cylinders can replace the combustion cylinders inside an existing engine, or serve as an add-on module that is connected to the engine by means of a gear set or a belt drive. It is also possible to have no mechanical linkage between the combustion engine and the regeneration unit if the power from the regeneration unit is taken off electrically. In general, for low cost of installation and development, heat2power recommends OEMs use an add-on system; that way the original engine remains basically unchanged.

The thermal power is extracted from the exhaust of the internal combustion engine by means of a heat exchanger. This is a gas-gas heat exchanger operating at high temperatures: up to about 950°C. Basically the heat2power system works like most other thermodynamic cycles: intake and compress a gas, then heat it up and finally let it expand. The difference between an ICE and the heat2power system is that the heat input is not from combustion inside the cylinder but from heat exchange external to the cylinder.

After the expansion stroke the air is released at low temperatures (250-300°C instead of 600-950°C). This can also be considered an advantage for military vehicles that require a low thermal profile.

The heat exchanger in the exhaust is placed after the catalyst (gasoline vehicles) or after the particle filter (diesel vehicles). In this manner the exhaust gas after-treatment remains unaffected and the combustion engine does not need to be retuned. However, heat2power recommends applying thermal insulation to the exhaust manifold and the first part of the exhaust and catalyst/DPF so that a maximum amount of heat is available for the regeneration process.

Industrial waste heat
The amount of process heat that is wasted in the glass, steel, aluminum and chemical industries exceeds 100 TBTU/yr and may be approaching a quad per year. In a typical plant, the waste heat available is on the order of 60 million BTU/hr, or 17.6 MW.

Capturing 20% of that heat is 12 million BTU/hr, or 3.5 MW.
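Those plant-level numbers check out with a BTU-to-watts conversion (the conversion constants are standard; the 60 million BTU/hr and 20% figures are from the text):

```python
JOULES_PER_BTU = 1055.06
SECONDS_PER_HOUR = 3600.0

def btu_per_hr_to_mw(btu_per_hr):
    # Thermal power in megawatts: BTU/hr -> J/hr -> J/s (W) -> MW
    return btu_per_hr * JOULES_PER_BTU / SECONDS_PER_HOUR / 1e6

plant_waste_mw = btu_per_hr_to_mw(60e6)   # "60 million BTU/hr" typical plant
captured_mw = 0.20 * plant_waste_mw       # capturing 20% of it

print(f"waste heat: {plant_waste_mw:.1f} MW, captured: {captured_mw:.2f} MW")
```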

This material appears suitable for capturing waste heat from our current electrical power plants. Some of those power plants have co-generation where the heat is already used, but many do not capture or use the waste heat. Wide deployment of 20% efficient recapture of waste heat would mean about 5-10% more electrical power from existing sources. For the USA, that is 220-440 TWh of electricity. This would be roughly equivalent to doubling all of the hydroelectric power generated in the USA or a 25-50% increase in nuclear power but without any more fuel use or new plants being built.

List of countries by electricity generation.

World electricity generation at wikipedia

Electricity generation table from 1997-2007 for every country in the world from BP

Researchers working towards the Carnot limit of energy recapture from heat

Other researchers are working to commercialize an inexpensive material with a ZT of 1.4.

Big energy picture for the USA

World energy usage statistics by BP (48 pages)

A 24 page powerpoint presentation on the industrial waste heat opportunity from the US department of energy

July 24, 2008

Apple 3G iPhone availability tracking

Here is a webpage that tracks the availability of the three different Apple 3G iPhones by Apple store, with links to the stores in each state. It is updated every 15 minutes.

There are also graphs of the percentage of stores with available Apple iPhones. It appears that around 10AM-noon each day there are bursts of availability as deliveries are made. Availability has been improving since Monday July 21 and the supply issues could be mostly resolved by Friday or early next week.

Universal allergy treatment going to Phase III clinical trials

Cytos Biotechnology has a product, CYT003-QbG10, with the potential to treat many different allergies because it does not give people tiny doses of the specific substance to which they are allergic. Instead, it works by distracting the overactive immune system, which is thought to be the cause of most allergic reactions. Current allergy shot treatments require about one hundred shots over several years with a very high total cost ($35 [often after medical plan coverage] per shot times 100, or $3500 for the overall treatment).

Study 08 with CYT003-QbG10 monotherapy included 80 patients suffering from house dust mite and /or cat allergy and investigated the safety, tolerability and efficacy of six injections of ascending doses of CYT003-QbG10 (300-900μg) or placebo. Treatment with CYT003-QbG10 was safe, very well tolerated and significantly reduced rhinoconjunctivitis symptoms in daily life compared to placebo (p=0.008). The CYT003-QbG10 treatment group mean total rhinoconjunctivitis symptom score had fallen from 9.3 points pre-treatment to 3.6 points post-treatment (-61%), whereas for the placebo group a reduction from 9.2 points pre-treatment to 6.3 points post-treatment (-32%) was observed. Also, the allergen tolerance as measured in the conjunctival provocation test was improved after CYT003-QbG10 treatment compared to placebo (borderline significant, p=0.06).

A product with an allergen-independent mechanism of action has major advantages over current immunotherapy approaches, which are all based on allergen components. By nature, allergen extracts can cause severe side effects, which may lead to potentially life-threatening conditions like anaphylaxis. This is why these treatments are contraindicated in those patients who would benefit most from them: people with severe allergies and asthma. In addition, such treatments are usually administered only by specially trained physicians.

In contrast to this, the almost placebo-like side effect profile of CYT003-QbG10 monotherapy may enable the use of this product in larger patient populations and by general practitioners. Its allergen-independent mechanism of action simplifies treatment, since a single agent can be used for the treatment of multiple allergies. Also, it may be possible to use this product in people for whom immunotherapy is currently contraindicated.

With these major advantages, CYT003-QbG10 has the potential to rejuvenate the mature allergic diseases market, which is dominated by either purely symptomatic drugs like antihistamines or corticosteroids or by immunotherapy products containing allergen components. Cytos is therefore advancing CYT003-QbG10 monotherapy as a first-in-class, disease-modifying product candidate into late-stage development, with the start of phase IIb already planned for the fourth quarter of this year.

Carnival of Space Week 64

July 23, 2008

A less secretive company trying to store CO2 in cement

From MIT Technology Review: a Canadian company, Carbon Sense Solutions (CSS), says that its curing process can store 60 tons of carbon dioxide inside 1,000 tons of precast concrete products, such as concrete blocks, while saving energy. CSS has plans for a 5,000 ton per day plant by 2011, increasing to 25,000 tons per day by 2012-2013.

This site has covered Calera which claims to be able to store 1000 tons of carbon dioxide for each 1000 tons of concrete, but has been secretive about its process.

Robert Niven, founder of Halifax-based Carbon Sense Solutions, says that his company's process would actually allow precast concrete to store carbon dioxide. The company takes advantage of a natural process; carbon dioxide is already reabsorbed in concrete products over hundreds of years from natural chemical reactions. Freshly mixed concrete is exposed to a stream of carbon-dioxide-rich flue gas, rapidly speeding up the reactions between the gas and the calcium-containing minerals in cement (which represents about 10 to 15 percent of the concrete's volume). The technology also virtually eliminates the need for heat or steam, saving energy and emissions.

Work is expected to begin on a pilot plant in the province of Nova Scotia this summer, with preliminary results expected by the end of the year. If it works and is widely adopted, it has the potential to sequester or avoid 20 percent of all cement-industry carbon-dioxide emissions, says Niven. "If the technology is commercialized as planned, it will revolutionize concrete manufacturing and mitigate hundreds of megatons of carbon dioxide each year, while providing manufacturers with a cheaper, greener, and superior product." He adds that 60 tons of carbon dioxide could be stored as solid limestone--or calcium carbonate--within every 1,000 tons of concrete produced. Further, he claims that the end product is more durable, more resistant to shrinking and cracking, and less permeable to water.

The idea of concrete carbonation has been around for decades but has never been economical as a way to strengthen or improve the finished product. In the late 1990s, researchers showed how carbon dioxide could be turned into a supercritical fluid and injected into concrete to make it stronger, but the required high pressures made the process too energy intensive. Carbon Sense Solutions claims to achieve the same goal but under atmospheric pressure and without the need for special curing chambers.

Research professor Tarun Naik, director of the University of Wisconsin-Milwaukee's Center for By-Products Utilization, says that all concrete absorbs carbon dioxide over time if left to cure naturally--but only up to a point. The gas usually penetrates the first one or two millimeters of the concrete's surface before forming a hard crust that blocks any further absorption. Naik says that something as simple as using less sand in a concrete mix can increase the porosity of the finished product and allow more ambient carbon dioxide to be absorbed into the concrete. It's simpler than Carbon Sense Solutions' accelerated curing process and can be applied to a much larger market, he says.

Other groups are taking aim at emissions from the cement-making process itself. Researchers at MIT are seeking new ingredients in cement that are less energy intensive, while companies such as Montreal's CO2 Solution have an enzymatic approach that captures carbon-dioxide emissions from cement-factory flue stacks, converts the greenhouse gas into limestone, and feeds it back into the cement-making process. Calera, backed by venture capitalist Vinod Khosla, even claims that it can remove a ton of carbon dioxide from the environment for every ton of cement it produces.

Optical lithography can go to 12 nanometers at least

From EE Times: MIT researchers have solved issues with scanning beam interference lithography, have tested at 25 nanometers, and believe they can get to 12 nanometers at least. This is a big deal because it would allow Moore's law to continue improving computers for another 15 years or more. Current lithography is in the 45-60 nanometer range, and 12 nanometer features would be 15-25 times more dense.
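The 15-25x density claim follows from squaring the linear feature-size ratio, since features pack by area. A minimal sketch of the arithmetic (the function name is mine, and the node sizes are the article's figures):

```python
# Feature density scales with the square of the linear shrink ratio.
# Node sizes are the article's figures, not measured process data.
def areal_density_gain(current_nm: float, future_nm: float) -> float:
    """How many times more features fit in the same area."""
    return (current_nm / future_nm) ** 2

low = areal_density_gain(45, 12)    # shrinking 45 nm features -> ~14x
high = areal_density_gain(60, 12)   # shrinking 60 nm features -> 25x
print(f"{low:.0f}x to {high:.0f}x more dense")
```

The low end actually rounds to about 14x, so the article's 15-25x range is slightly generous at the bottom.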

Mark Schattenburg, director of the Space Nanotechnology Laboratory, and his group did the work.

Scanning-beam interference lithography is being commercialized at Plymouth Grating Laboratory (Mark Schattenburg's company).

"Using our scanning beam interference lithography technique, optical lithography is mainly limited by the roughness of materials--and our ability to see such fine features."

"In traditional interference lithography the wafer is stationary, but in scanning beam interference lithography the wafer is constantly moving," said Schattenburg.

"We synchronize the grating image with the movement of the wafer using 100-MHz sound waves," said Schattenburg. The sound waves vibrate the laser's crystals, slightly shifting their frequency up and down as they recede from and approach toward, respectively, the desired feature being imaged. This compensation produces a stable, consistent grating image across the patterns being transfered to the wafer, according to the researchers.

Schattenburg has founded a lithography company called Plymouth Grating Laboratory (Plymouth, Mass.) which is currently considering the commercialization of the new lithography technique.

July 22, 2008

Electric powered, exclusive robotic car urban transportation zones

Invented by Alvin Wang (nextbigfuture co-author)

This is a plan to enable the safe early deployment of robotic cars, trucks and buses. Robotic-car-only zones could start small, with 10-100 cars covering roughly 10x10 blocks, and then expand as the system is proven. Public transportation would become cheaper and better, enabling a complete shift to robotically driven cars, which would be safer than current human-driven cars, and a reorganization of transportation to be cleaner, cheaper and safer without sacrificing time or convenience. 45,000 people per year die from traffic accidents in the United States, and 1.2 million per year worldwide. An effective global implementation of a revamped robotic car system would save those lives without costing time or convenience; time could be saved and the system could be more convenient than the current system of human-driven cars.

UPDATE: Brad Templeton has written an excellent series of articles on robocars for public transportation.

Computer-driven cars are being developed and tested in the DARPA Grand Challenge, but they are expected to take 20-25 years to be ready for deployment on existing roads with other human-piloted cars.

Boss, a robotized 2007 Chevy Tahoe, was the fastest of the 2007 Urban Challenge competitors by a large margin. Boss averaged about 14 miles an hour over approximately 55 miles, finishing the course about 20 minutes ahead of the second-place finisher, Stanford. It followed California driving laws as it navigated the course and operated in a safe and stable manner. So in 2-3 years it should be possible to increase the speed of safe operation to 30-40 mph, which would be facilitated by adding electronic markers to city streets to make the course easier for the robotic cars.

The Carnegie Mellon team is at this site.

Tartan Racing technology enables Boss to:

Follow rules of the road
Detect and track other vehicles at long ranges
Find a spot and park in a parking lot
Obey intersection precedence rules
Follow vehicles at a safe distance
React to dynamic conditions like blocked roads or broken-down vehicles

High-level route planning determines the best path through a road network. Motion planning requires consideration of the static and dynamic obstacles detected by perception, as well as lane and road boundary information, parking lot boundaries, stop lines, speed limits, and similar requirements.

This new plan is to have only other robotic cars as dynamic obstacles, each registered with the high-level routing system (with a known routing path), and to place electronic markers, other navigational aids, and local traffic routers to assist safe driving at faster speeds.

Electric cars are not expected to be significant in transportation for ten years or more.

City districts, and then whole cities, could create an environment that would allow for rapid, low-infrastructure deployment of all-robotic electric car driving zones.

There are sections of some cities where cars are banned. These are usually closed streets with open-air shopping and pedestrians. Larger zones could ban human-driven cars. In the robotic taxi/public transit zone, an electronics-filled box or post on every corner would provide assistance to mass-produced DARPA Grand Challenge-type robotic cars. There would be no human drivers for the robotic cars to deal with, intersections would be controlled, and all troublesome unpredictability would be removed within the no-human-car zone.

Unlike dual-mode transportation, which would require guiderails to be built, this plan would only require enough computers and sensors to assist the safe operation of the robotic cars. Some designated streets could allow bicyclists, and if the robotic cars and traffic control are up to it, there could be shared roads with separate lanes for bicyclists and for robotic cars.

All the robotic cars could be electrically powered because they would only operate within a section of city or eventually a whole city. When no one is in the vehicle the car would go to maintenance and recharging areas as needed.

Because there would be no human piloted cars, city parking within the zone would be freed up either to park the robotic cars or for increased densification of the city.

Robotic cars could have variety as needed. Some could be trucks for moving cargo. Some could have flatbeds that could even transport cars across the no-car zone.

If required, some of the cars could initially be non-electric, but ideally they would all be electric.

The robotic car network and zone would integrate with rail transit that came to the city/zone. For example, San Francisco would have robotic cars waiting at the BART and Caltrain stops. There would be mostly pickup to doorstep and doorstep to parking lot or public transit transportation with minimal or zero waiting.

Passenger vehicles arriving at the city/zone would stop at the edge of the robotic-vehicle-only zone, or drivers would go to park-and-rides and arrive via public transportation.

Robotic car waiting and pre-booking

Cities would be free to determine service level, but pre-booking and capacity planning would allow there to be virtually no wait times for robotic cars.

The dynamic pre-booking and live booking systems could show robotic car availability, similar to booking seats on airplanes.

Given the known rush hour flows, each car should be able to make several trips each rush hour. Dynamic pricing would allow those willing to ride-share, or to take some rides on buses, to be charged less.

Cellphones could be used to call for a robotic car.

The current DARPA robotic vehicle technology is good enough
Robotic vehicles have navigated a closed-course urban setting and a closed-course offroad setting. The controlled environment of the zone would match the capabilities of existing robotic vehicles.

Robotic vehicles could provide accident free transportation in the zone/city.

They would help keep operational costs lower, so more vehicles could be operated without needing vehicles large enough to justify buses and bus drivers.

If a ratio of one shared vehicle per ten people were sufficient, then 1,000,000 people would need 100,000 vehicles. Most of the vehicles might only need a top speed of 35-40 mph and could be produced for $2,500-10,000 each. Assuming a $10,000 average ($5,000 for the car and $5,000 for mass-produced control electronics), the vehicle portion of the system would be $1 billion. Add another $300 million for the on-street guidance markers and systems and the booking and reservation system. Parking would be a re-assignment of existing city resources.
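The capital estimate above can be laid out as simple arithmetic (all inputs are the post's own estimates, and the 30 percent annual upkeep is the post's maintenance assumption; variable names are mine):

```python
# Fleet capital cost, using the post's own estimates.
population = 1_000_000
people_per_vehicle = 10         # one shared vehicle per ten people
vehicle_cost = 10_000           # $5,000 car + $5,000 control electronics
infrastructure = 300_000_000    # street markers plus booking system

fleet_size = population // people_per_vehicle      # 100,000 vehicles
vehicle_capital = fleet_size * vehicle_cost        # $1.0 billion
total_capital = vehicle_capital + infrastructure   # $1.3 billion
annual_upkeep = 0.30 * total_capital               # post's 30%/yr assumption

print(fleet_size, vehicle_capital, total_capital, annual_upkeep)
```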

Bay Area transit agencies, including BART and AC Transit, received $922 million directly from the state under the Proposition 1B bond measure, rounding out a total of nearly $1.3 billion for area transit.

Add 30% each year for maintaining the vehicles and the system.

The best places to introduce the system would be cities with fairly large downtown cores (Manhattan, Chicago, San Francisco, etc.) and the entirely new cities that China is building as it increases its urban population by 1-2% each year.

The system uses technology that is ready now, already demonstrated in small or large numbers. Increasing the speed means more research and waiting. (For perspective, one hundred times the current 40 mph speeds would be 4,000 mph, which is hypersonic.)

Even using the DARPA systems involves scaling up production and getting unit costs down.

Safety is addressed because everything in the robotic zone is tailored for simplicity: no non-computer-controlled vehicles on roads with robots, and no need for large vehicles. Mixing in vehicles of any kind is not needed if it compromises safety in any way.

Everything is modularized and can be separately upgraded. Vehicle power can be upgraded. Vehicles can be upgraded one at a time. Control systems and software can be upgraded. Different zones can be separately upgraded. When better technology is ready it can be adopted, so long as overall system compatibility is maintained.

Driverless zones could cover whole cities and eventually highways as well. The buildings are not moving, and when new ones are added the digital maps are updated. The DARPA challenge vehicles have driven urban courses safely. GPS routing can lay out a course from point A to point B anywhere in the city. Then it is just a matter of making the robotic cars aware of each other and ensuring that their interactions are simplified so that no unpredictability is created.

You would have master routing and overall automated traffic control.

Computerized car people-movers would book in their "flight plan" live, with localized (intersection-by-intersection) safe and fast resolution of local routing conflicts.

Money does buy happiness, or at least satisfaction: 86% of Chinese are satisfied with the direction of China.

From the International Herald Tribune, eighty-six percent of the Chinese surveyed said they were content with the country's direction, up from 48 percent in 2002 and a full 25 percentage points higher than the next highest country, Australia. And 82 percent of Chinese were satisfied with their national economy, up from 52 percent.

China has had double-digit economic growth for over five years. It seems that a fast pace of recent (5-10 years of) personal and national economic gains is the key to broad-based satisfaction (aka happiness).

The biggest concern of Chinese - expressed by 96 percent - was rising prices. Corruption and environmental degradation also worried majorities of Chinese.

Only 23 percent of Americans surveyed said they were satisfied with the country's direction and only 20 percent said the U.S. economy was good.

Russians were the third most-satisfied people with their country's direction, at 54 percent.

43% of Iraqis are happy with the direction of their country

Except for Spain, which placed fourth at 50 percent, the peoples of major European countries were far from content. Only about 3 in 10 British, French and Germans expressed satisfaction.

Sixty-five percent of the Chinese said the government was doing a good job on the issues most important to them, though support was somewhat lower in the western and central provinces, which have not enjoyed the rapid growth of eastern regions.

The poll was based on 3,212 face-to-face interviews that were conducted in 16 dialects from March 28 to April 19 across China, though disproportionately in urban areas. The margin of sampling error is plus or minus two percentage points. Sample sizes and error margins in the other countries varied.

North Dakota Bakken oil heading for 200,000 barrels of oil per day by end of 2008

North Dakota added 20,000 barrels of oil per day between the end of 2007 and the end of May 2008. At the end of May 2008, North Dakota was producing 156,356 barrels of oil per day.

Nine other states currently are listed in the count as being "major" oil-producing states: Alaska, Arkansas, California, Colorado, Louisiana, New Mexico, Oklahoma, Texas and Wyoming. In the most recent count, North Dakota ranked seventh in terms of number of rigs but had the second-biggest increase over the year with 29 more rigs, behind only Texas with an additional 84.

Northern Oil and Gas, Inc. announced significant Bakken discoveries and Three Forks/Sanish plans, and updated its accelerated 2008 drilling schedule and capital position.

Northern holds a working interest in an additional fourteen wells that are in the drilling or completion stages and is included in nearly 70 permitted or docketed-for-permit drilling locations that are expected to drill between now and early 2009. Northern Oil controls approximately 60,000 net acres in the North Dakota Bakken play, yielding over 90 net well locations based on 640 acre drilling units. In addition, Northern controls approximately 22,000 net acres in Sheridan County, Montana and has successfully completed two wells to the Red River formation.

The recently completed Johnson 33 #1H well operated by Brigham Exploration is representative of Northern's growing success in the North Dakota Bakken play. The well flowed at an early rate of 650 BOPD, with sustained production of approximately 560 BOPD. The Johnson well is located significantly north of the Parshall field, representing an important Northern extension of prolific Bakken production. Northern participated in the Johnson 33 #1H with a 16.5% Working Interest.


An article in the Billings Gazette describes the oilwell operations in the Bakken

After a vertical well is drilled to that level, the drill goes horizontal, a technique that had been used for years. In the Bakken - named for a North Dakota farmer on whose farm the shale layers were first identified in the 1950s - horizontal drilling is paired with another technique, fracturing. That involves forcing a mixture of sand, water and gel underground at enormous pressures, thanks to those banks of 2,200-horsepower pump trucks.

As the mixture shoots out of holes in the horizontal pipe, the water opens up cracks in the rock and the sand flows in behind it, holding the fractures open so oil can ooze down into the pipe - the same one used to pump in the mixture of sand and liquid - and from there to be drawn to the surface.

"It's kind of like a mining operation, in a way," Lechner said. "You need more tunnels."

In the case of the Hartland 14X-26 well, the frac crew worked for a little more than two days, pumping 700,000 gallons of water and gel and about 800,000 pounds of sand into the ground. A typical drilling unit will have as many as three lateral wellbores per two sections of land, which is 1,280 acres. The perpendicular fractures run out about 50 feet on both sides of each lateral. The laterals have perforated pipes, called liners, that keep the wellbores from collapsing after the frac job.

Bakken oil and increased wheat and other commodity prices are creating an economic boom in Saskatchewan, Canada.

"For every square mile of land in the Bakken, there's four to five million barrels of oil in place," says Gregg Smith, Petrobank's vice-president. "We feel we can make the Bakken profitable if prices stay above $50 a barrel -- it's extremely good quality oil, it's what the refineries want the most." Today at least 65 oil rigs, many of them locally owned, are drilling and finishing wells on the Bakken, at a cost of $2-million per well.

This is a follow up to previous articles on North Dakota oil and Bakken oil

Less than ten years to develop offshore oil if regulations modified

Investor's Business Daily has an editorial by Monica Showalter proposing that offshore drilling in the United States could develop oil far faster than ten years if regulations were adjusted.

California's 10 billion barrels in offshore oil could be brought to market in as little as a year "if the moratorium were lifted," according to a recent Sanford C. Bernstein report, because the oil is under shallow water and drilling platforms already exist.

Polls show most Americans favoring opening federal lands and offshore areas to energy production. As it stands, 97% of our offshore areas and 94% of our federal lands are off limits.

To begin with, industry analysts note, much of the drilling delay is self-inflicted — a result of excessively stringent environmental and land-use regulations.

Scrap those, or modify them, and new oil can be produced in far less than 10 years.

Producing oil from new sources has three stages, which can take years, notes Marilyn Crockett, executive director of the Alaska Oil and Gas Association in Anchorage.
1. environmental impact report
2. bidding on leases
3. drilling.

1. The environmental impact statement alone can take two to three years.

The Interior Department's Minerals Management Service manages those studies for federal offshore holdings, after input from local, state and environmental groups, according to spokeswoman Robin Cacy.

2. After that comes bidding.

MMS manages the 574 million acres of offshore federal holdings, and the Bureau of Land Management directs those for federal land, such as the National Petroleum Reserve of Alaska and the Arctic National Wildlife Refuge. For instance, Cacy said MMS conducts lease bids in five-year blocks, with recent ones in the Chukchi Sea off northwest Alaska set for 2007 to 2012. Based on this, "we can't add new sales to that," she said.

Yet, in some areas, the regulatory process is largely done, so oil can come to market far sooner than 10 years, if Congress lets it.

Bureaucracy's not the only problem.

According to the Institute for Energy Research, a private think tank, citing Bureau of Land Management data, protests, appeals and lawsuits over oil development averaged 1,180 per year between 2001 and 2007, a 706% increase over 1997-2000. The IER notes, for instance, that 100% of New Mexico's 78 oil leases were protested by environmental and neighborhood groups.

In Alaska's case, stringent environmental regulations permit exploration only in winter — from December to April. "We go to extreme means to make sure we do no harm to the tundra," said Crockett. Offshore exploration has fewer environmental delays, but requires more infrastructure to bring the oil to market, said Crockett.

July 21, 2008

Spinal cord stem cells could enable new paralysis treatment

A researcher at MIT's Picower Institute for Learning and Memory has pinpointed stem cells within the spinal cord that, if persuaded to differentiate into more healing cells and fewer scarring cells following an injury, may lead to a new, non-surgical treatment for debilitating spinal-cord injuries.

Their results could lead to drugs that might restore some degree of mobility to the 30,000 people worldwide afflicted each year with spinal-cord injuries.

The researchers at MIT and the Karolinska Institute found that neural stem cells in the adult spinal cord are limited to a layer of cube- or column-shaped, cilia-covered cells called ependymal cells. These cells make up the thin membrane lining the inner-brain ventricles and the connecting central column of the spinal cord.

"We have been able to genetically mark this neural stem cell population and then follow their behavior," Meletis said. "We find that these cells proliferate upon spinal cord injury, migrate toward the injury site and differentiate over several months."

Upon injury, ependymal cells proliferate and migrate to the injured area, producing a mass of scar-forming cells, plus fewer cells called oligodendrocytes. The oligodendrocytes restore the myelin, or coating, on nerve cells' long, slender, electrical impulse-carrying projections called axons. Myelin is like the layer of plastic insulation on an electrical wire; without it, nerve cells don't function properly.

"The limited functional recovery typically associated with central nervous system injuries is in part due to the failure of severed axons to regrow and reconnect with their target cells in the peripheral nervous system that extends to our arms, hands, legs and feet," Meletis said. "The function of axons that remain intact after injury in humans is often compromised without insulating sheaths of myelin."

If scientists could genetically manipulate ependymal cells to produce more myelin and less scar tissue after a spinal cord injury, they could potentially avoid or reverse many of the debilitating effects of this type of injury, the researchers said.

George Church and the Personal Genome Project featured in Wired

George Church's lab makes the Polonator G.007 gene sequencer, offered at the low price of $150,000. Church maintains that while the Polonator isn't yet up to whole-genome reads, it runs at about one-third the cost of Applied Biosystems' machines (roughly $20,000 versus Applied Biosystems' estimated $60,000), though it needs to improve before it can handle whole-genome reads.

Between the PGP (Personal Genome Project, which will sequence the exomes of its 100,000 volunteers; the exome is the most important 1 percent of the genome), the Polonator, and the fact that the rest of the world is finally starting to understand what he's been talking about, Church's obscurity is coming to an end. He sits on the advisory board of more than 14 biotech companies, including personal genomics startup 23andMe and genetic testing pioneer DNA Direct. He has also cofounded four companies in the past four years: Codon Devices, Knome, LS9, and Joule Biosciences, which makes biofuels from engineered algae.

If the PGP were simply an exercise in breaking down 100,000 individuals into data streams, it would be ambitious enough. But the project takes one further, truly radical step: In accordance with Church's principle of openness, all the material will be accessible to any researcher (or lurker) who wants to plunder thousands of details from people's lives. Even the tissue banks will be largely accessible. After Church's lab transforms the skin into stem cells, those new cell lines — which have been in notoriously short supply despite their scientific promise — will be open to outside researchers. This is a significant divergence from most biobanks, which typically guard their materials like holy relics and severely restrict access.

For the PGP volunteers, this means they will have to sign on to a principle Church calls open consent, which acknowledges that, even though subjects' names will be removed to make the data anonymous, there's no promise of absolute confidentiality.

To Church, open consent isn't just a philosophical consideration; it's also a practical one. If the PGP were locked down, it would be far less valuable as a data source for research — and the pace of research would accordingly be much slower. By making the information open and available, Church hopes to draw curious scientists to the data to pursue their own questions and reach their own insights. The potential fields of inquiry range from medicine to genealogy, forensics, and general biology.

Church cautions, however, that keeping clinicians and patients in the dark about specific genetic information — essentially pretending the data or the technology behind it don't exist — is a farce. Even worse, it violates the principle of openness that leads to the fastest progress. "The ground is changing right underneath them," he says of the medical establishment. "Right now, there's a wall between clinical research and clinical practice. The science isn't jumping over. The PGP is what clinical practice would be like if the research actually made it to the patient."

In the not-too-distant future, Church says, hospitals and clinics could be outfitted with a genome sequencer much the way they now have x-ray machines or microscopes.

Offsetting peak oil for one to two decades: viable electric cars for less than $5000 and a $5000 electric car credit

Previously this site looked at a technology that is almost in hand for solving excess manmade CO2 (possible global warming); here is a similarly near-at-hand technology and plan for addressing peak oil. Both need to be part of an overall energy and technology development plan, but the key point is that technology and good planning can solve the issues the world is concerned about. Some may say: if we can solve peak oil and climate change with these simple technologies and painless plans, then we are done. No, we still have to provide enough clean energy so that everyone living in China, India, Africa, etc. can be rich, which will require a lot more energy per capita. Many of the negawatt plans are basically plans to make everyone poor with very little energy per person; those are bad plans that ignore what people really want. Long term, we will need to switch to clean electric power sources (nuclear fission, nuclear fusion, solar, wind, geothermal, hydroelectric and others).

UPDATE: H/T Instapundit. A description of why small cars are not unsafe on the road when hit by an SUV, although we would all be safer if there were fewer large vehicles on the road.

John McCain has proposed a $5000 tax credit for electric cars.

Inexpensive electric cars that cost less than $5,000 could be purchased as extra commuter vehicles, without waiting the decade or more it takes for people to get rid of their current cars. A model of an inflatable electric car could be priced at only $2,500 and available starting in 2010-2011. Such an inexpensive vehicle would make economic sense to drivers, who could recover the cost in fuel savings in one year or less, instead of over ten years with current expensive hybrids.
UPDATE: Addressing the possibility that the inflatable electric car does not deliver or is a hoax: there are other cheap electric car possibilities listed here, such as an electric Tata Nano. The concepts for an inflatable electric car do not sound impossible and would seem to offer valid though radical ideas worth researching, even if those behind XP Vehicles were not the ones to do it.

This would be a solution to peak oil. Offsetting peak oil for one to two decades allows time for the transitional build-up of nuclear and other energy sources. Peak oil is the problem that arises when oil production starts to decline and the price of oil increases rapidly. Current price increases are "peak lite": demand exceeding supply while supply is still increasing. If we can rapidly scale up to tens of millions and then hundreds of millions of electric cars below $5,000, with a driving range between charges of over 200 miles and room for 4 passengers, then the electric commuter car should rapidly achieve 50-80% adoption with some government support on prices and regulations. 30% of US and world fuel usage could then be displaced, which would be about 30 million barrels of oil per day. A straight-line ten-year deployment would mean fuel savings that cumulatively increase by 3 million barrels of oil per day each year. If oil production declined, those 3 million barrels per day would offset the decline. This would free up any new algae biofuel production increases and other measures to address increased demand.
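The straight-line displacement ramp works out as follows (a minimal sketch; the 30-million-barrel figure is the post's estimate, not an industry projection):

```python
# Straight-line ten-year rollout of the post's estimated displacement.
displaced_bpd = 30_000_000    # barrels/day displaced at full adoption
rollout_years = 10

yearly_ramp = displaced_bpd / rollout_years   # 3 million bpd added per year

# Cumulative displacement after each year of the rollout:
for year in range(1, rollout_years + 1):
    cumulative = year * yearly_ramp
    print(f"year {year:2d}: {cumulative / 1e6:.0f} million bpd displaced")
```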

Inflatable electric cars could have 2500 mile range using a single hot-swap XPack Multi-CoreTM Battery/Fuel Cell power plant.

40% of US oil is used for cars. There are 850 million cars in the world and 2 million of them are hybrid electric.

There are over 60,000 electric cars in the United States

There are many electric cars that are starting or will be starting production listed at wikipedia.

India's Tata Motors is rumored to be producing a compressed air car that would cost around Rs. 350,000 [US$8,200] in India, with a range of 300 km [about 186 miles], and would only need US$2 of electricity to compress the air for refueling. None of the materials for the vehicle would have supply issues for replacing 850 million cars, unlike lithium for high-performance batteries. The range of the compressed-air-only car drops to less than half when it is driven at highway speeds.

Some analysts believe the need to build an infrastructure of compressor stations and the need to comply with strict safety standards would prevent the compressed air car from being any more than a niche city-commuter vehicle.

"In North America, it's basically a nonstarter," says Rinek, admitting that there are limited niche markets. "The only potential, if any, would be for an inner-city, short-commute vehicle with an ultra-greenie owner."


In the dual-mode version [hybrid], with assistance from fuel, speeds can reach 100 miles per hour, and range expands to 900 miles on less than a gallon of fuel (although the faster one goes, the shorter the range). ZPM wants to produce a 6-seater, 75-hp model with a 1000 mile range at 96 mph, all for just $17,800.

A compressed air car would have no expensive batteries to replace

The cheapest cars in the world in March 2008 part One #10 to #6

The cheapest cars in the world in March 2008 part two #5 to #1

#1. Tata Nano: 4-door hatchback. India. $2,497. Weighs about 700 kg.
#2. Chery QQ: 4-door hatchback. China. $4,781.
#3. Suzuki Maruti 800: 5-door hatchback. India. $4,994.
#4. Geely MR: 5-door. China. $5,500.
#5. Geely HQ SRV: 5-door "tall" estate. China. $5,780.

Light cars are the best for converting into electric cars

Twelve cheapest cars in the USA

A 2007 BusinessWeek look at the race to make cars that cost less than or near $3,000.

The Speculist had noted that there appears to be a short bridge and transition through hybrid cars to electric cars. Very light electric cars (700 kg or less) would use one third or fewer of the batteries, which are the most expensive part of an electric car targeting the low-end market. Using fewer batteries to move less material not only makes the car cheaper (which increases unit sales) but also allows an easier ramp-up of the battery industry to support more electric vehicles: use one third the batteries per car, and the same total supply of batteries makes three cars instead of one. A projection of one million electric cars in one year becomes three million electric cars.

Currently, 25 million electric bikes and scooters are made each year. The bikes and scooters weigh 25-200 pounds and move one person at up to 70 mph with ranges up to 100 miles. A light electric car (including one or two passengers) might only need to move three times the weight of a loaded electric scooter (400-500 pounds). At three times the batteries per vehicle, the battery supply for 25 million of the more powerful electric scooters would equal about 8 million electric cars.
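The battery-supply arithmetic in the last two paragraphs can be checked directly (the ratios are the post's rough estimates; variable names are mine):

```python
# Redirecting the yearly e-scooter battery supply to light cars.
scooters_per_year = 25_000_000    # current annual e-bike/scooter output
car_pack_ratio = 3                # a light car needs ~3x a scooter's pack

light_cars_from_scooter_supply = scooters_per_year // car_pack_ratio
print(light_cars_from_scooter_supply)   # ~8.3 million light cars

# A light car also uses ~1/3 the batteries of a heavy electric car,
# so each heavy-car battery allocation builds three light cars instead.
heavy_car_projection = 1_000_000
light_car_projection = heavy_car_projection * 3
print(light_car_projection)             # 3 million light cars
```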

Lighter cars with lower power requirements can also make better use of integrated solar panels. Toyota is reported to be using 2-5 kW solar panels to power the air conditioning in the Prius.

5 kW is 6.7 horsepower. The Tata Nano has a 33 HP engine. An electric Tata Nano with 5 kW of solar panels could provide 1 hour of driving time for every 5 hours of charging time.
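The unit conversion behind the 1-hour-per-5-hours claim can be verified quickly (panel and engine sizes are the post's hypothetical example, treating the 33 HP engine rating as the average driving load):

```python
# kW/HP conversion for the solar-charging example above.
KW_PER_HP = 0.7457            # 1 mechanical horsepower in kilowatts

panel_kw = 5.0                # rooftop solar panel
motor_hp = 33                 # Tata Nano engine rating, used as the load
motor_kw = motor_hp * KW_PER_HP   # ~24.6 kW

print(f"panel = {panel_kw / KW_PER_HP:.1f} hp")   # ~6.7 hp
charge_hours_per_drive_hour = motor_kw / panel_kw
print(f"~{charge_hours_per_drive_hour:.0f} hours charging per hour of driving")
```

In practice the average driving load would be well below the peak engine rating, so the real ratio would likely be better than 5:1.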

The Model T had a 20 HP (15 kW) engine and a top speed of 40 mph.

The Model T weighed 540 kg.

Improving the solar cells to slightly over double the power generation would allow unlimited urban driving for an electric car.
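The charge-to-drive ratio above follows from the quoted power figures; note that full-throttle draw is an upper bound, so real urban driving would do somewhat better:

```python
# Energy-balance sketch using the article's figures.
HP_TO_KW = 0.7457

panel_kw = 5.0                   # solar panel output while parked
nano_engine_kw = 33 * HP_TO_KW   # Tata Nano's 33 HP engine, ~24.6 kW at full power

# Hours of parked charging needed per hour of full-power driving.
charge_hours_per_drive_hour = nano_engine_kw / panel_kw
print(round(charge_hours_per_drive_hour, 1))  # 4.9

# "Slightly over double" the panel output gets close to Model T territory
# (a 15 kW engine moved the 540 kg Model T at 40 mph).
doubled_panel_kw = round(2.2 * panel_kw, 1)
print(doubled_panel_kw)  # 11.0
```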

High Calibre Batman Villains

Which Batman villains are good enough, or could be adapted to be good enough, for Batman movies with the high quality of The Dark Knight?

I agree with the Speculist review that The Dark Knight is a Godfather-calibre movie.

Batman from the comics has a large collection of high-quality villains. The best villains for the Christian Bale Batman would be ones with action who also create emotional, moral and psychological challenges. The villains also need to be adapted to be believable, which is more a challenge of writing and approach than something inherent in the concept of the villain characters.

Catwoman : One of the primary love interests for Batman in the comics. An action character who is emotionally and romantically tempting.

Talia al Ghul and the remainders of the League of Assassins: Daughter of Ra's al Ghul (Batman Begins). There would be interesting plot dynamics here, but they would seem to be repeating Batman Begins. So there would need to be 2-4 movies done with other villains before going back to this well.

Hugo Strange: Strange is an insane psychologist who knows Bruce Wayne's secret identity and lusts to take the identity for himself. He is also a chemical genius who can turn people into lumbering, brutal giants.

A plot could delve deeply into the motivations of Batman and of a very close pretender to Batman.

Clayface : A cinematic face-changing look. Could be adjusted to be just a master of disguise. Probably a better secondary villain whose abilities make it more difficult for Batman to defeat a primary villain.

Hush: A childhood friend of Bruce Wayne's, Thomas "Tommy" Elliot targets both Batman and Bruce Wayne. Hush uses manipulation and guile instead of noisy "signatures".

The Riddler : Already being tipped as the primary villain in the next movie. A good movie would focus on the mind games.

Joker: He is used again and again in the comic books and can be brought back in the movies as well.

Bane: Would need to be made more psychologically interesting for a good movie. King Snake, a martial artist who becomes a mercenary, was Bane's father in the comics.

An escaped convict from an island prison in South America, Bane has abnormal strength as a result of having had experiments with a derivative of the drug Venom performed on him. He became known as "The Man Who Broke the Bat" when he broke Batman's spinal cord, forcing Bruce Wayne to give up the Batman persona while he recuperated.

Deacon Blackfire: A religious fanatic who forms an army in the sewers beneath Gotham, largely composed of the homeless. A movie with this villain could make interesting comments upon religion. There could be personality cult followers of Batman used as a counter point.

Upcoming comic based movies surveyed at Wired

The Speculist site noted the problem of maintaining the quality level in Batman sequels, which boils down to finding quality villains who allow for a quality story.

Sparse carbon nanotubes would be invisible but could support significant weight

Macroscopic invisible cables.

This is not just theoretical: DARPA/MIT has already produced one-foot-long carbon nanotubes and should have one-meter-long carbon nanotubes by the end of this year or early in 2009.

Spiders suggest to us that producing high strength over density ratio invisible cables could be of great importance. In this paper we show that such invisible cables could in principle be built, thanks to carbon nanotube bundles. Theoretical strength of ~10 MPa, Young's modulus of ~0.1 GPa and density of ~0.1 kg/m³ are estimated. The theoretical strength over density ratio is huge, i.e. that of a single carbon nanotube; the strength of a real, thus defective, invisible cable is estimated to be ~1 MPa. Finally, we demonstrate that such cables can be easily transported in their visible state (with bunched nanotubes) and that an efficient anti-bunching controllable mechanism, involving pressure of ~1 Pa, can control the visible–invisible transition, and vice versa.

New Scientist notes that, being narrower than the wavelength of light, carbon nanotubes are normally invisible - as long as they are separated by more than one wavelength.

Nicola Pugno of the Polytechnic of Turin in Italy has calculated how many nanotubes would be needed to support a person, taking into account small defects that develop in the tubes during manufacture. When held 5 micrometres apart, to keep them invisible, they would form a cable only 1 centimetre in diameter weighing a mere 10 milligrams per kilometre.
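As a quick order-of-magnitude check of the person-supporting claim, the 1 cm cable diameter can be combined with the strength figures quoted from Pugno's paper (the gravity constant is the only number below not taken from the article):

```python
import math

# Cross-section of the 1 cm diameter "invisible" cable.
diameter_m = 0.01
area_m2 = math.pi * (diameter_m / 2) ** 2  # ~7.85e-5 m^2

# Strength figures quoted from Pugno's paper.
theoretical_pa = 10e6  # ~10 MPa theoretical strength
defective_pa = 1e6     # ~1 MPa for a real, defective cable
g = 9.81               # m/s^2, standard gravity (assumed)

max_load_kg_theoretical = theoretical_pa * area_m2 / g  # ~80 kg: one person
max_load_kg_defective = defective_pa * area_m2 / g      # ~8 kg with defects

print(round(max_load_kg_theoretical))  # 80
print(round(max_load_kg_defective))    # 8
```

The theoretical strength is what matches a person's weight at this diameter; at the defect-reduced ~1 MPa figure, a 1 cm cable would hold roughly a tenth of that, so a defect-tolerant design would need a correspondingly larger cross-section.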

Applications: better live magic shows. More effective garrotes and decapitation traps. Live shows with Matrix-movie-like wire work.

These uses of invisible carbon nanotubes, while very interesting, will ultimately be cool yet trivial applications. The big applications are increasing production and lowering costs to raise the strength-to-weight ratio of materials used to build cars, planes, buildings and more. Performance and efficiency can be radically increased and transform society.
