June 22, 2007

US moves for cleaner energy and transportation

Tax incentives for renewables are not dead. A Forbes article believes a $15 billion tax package will make its way into law to go alongside the 10 mpg increase in car fuel standards by 2020. A boost in biofuel production to 36 billion gallons annually in 2022, from the current level of 6.3 billion gallons per year, is also in the new Senate bill.

Fuel efficient planes

Pratt & Whitney has a more efficient jet engine under development that would be 50% quieter and 50% less polluting and use nearly 15% less fuel, in part because the engine would be made of lighter materials than today's engines. Maintenance costs also would be 40% lower because of a simplified design using about 30% fewer parts. The new engines are unlikely to be ready before 2015.

EasyJet is one of the companies making a big push for more fuel-efficient jets

Scientific American talks about a superconductor-enabled all-electric airplane concept. However, this idea looks like it needs some more breakthroughs with superconductors and fuel cells, or in nanotechnology, before it gets off the ground.

South Korea looking to revamp education to boost innovation and growth

From BusinessWeek: as South Korea's growth has cooled to 5% annually over the past decade, from an average rate of nearly 8% during the prior 30 years, some experts are griping that Korea's educational system no longer makes the grade.

The country of 48 million now ranks 11th among the world's economies and is a top exporter of everything from steel and ships to cell phones and computer chips. It spends 7.5% of its gross domestic product on education, a bigger share than any other industrialized country, and that figure doesn't even include the $38 billion a year Korean parents shell out for after-school cram sessions.

In the meantime, Korea risks losing some of its best and brightest. The number of the country's students enrolled in foreign schools and universities rose to 214,000 in 2005, from 109,000 in 1998, according to the Korea National Statistical Office.

The call is for: "The Ministry of Education and Human Resources must be disbanded in favor of a much slimmed-down department focused on lifetime learning"

Speeding up computational chemistry

Computational chemistry, which is needed for theoretical molecular nanotechnology research, will be greatly boosted by new hardware coming out this year and in following years. By the end of the year a double-precision Nvidia Tesla could boost the speed of computational chemistry applications by 300 times. Speeds could then increase by 3 times or more per year.
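As a rough illustration, the compounding implied by those figures can be sketched (a hypothetical projection using this paragraph's assumed numbers, not measured benchmarks):

```python
# Hypothetical compounding of GPU speedups for computational chemistry,
# using this paragraph's assumed figures (not measured benchmarks):
# a ~300x boost by the end of 2007, then ~3x more each following year.
def projected_speedup(years_after_2007, base_boost=300, yearly_factor=3):
    """Speedup over early-2007 hardware after additional years of 3x gains."""
    return base_boost * yearly_factor ** years_after_2007

for y in range(3):
    print(2007 + y, f"{projected_speedup(y)}x")
```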

UPDATE: EE Times has more information on Intel's plans for more powerful, lower-energy chips

Intel is developing "adaptive circuits" within a processor that would determine the minimum amount of performance required for a task. "We have a brain in the chip," Bryan Casper, principal engineer for Intel's Circuit Research Lab in Hillsboro, said. All power associated with a task is turned down to a "just-enough" level.

A prototype of the technology was demonstrated in a PCI Express card with a chip that consumed one-tenth the power of a card with today's chip technology, or 2.7 milliwatts versus 20 to 30 milliwatts. Reducing power consumption is critical, given that using today's technology to power a PCI Express card with a bandwidth of a terabit per second would require 100 watts of energy, Casper said.

Researchers showed a prototype of a Wi-Fi card with firmware that automatically turned off the power when the card was not in use. The technology also knew when to power up to receive or transmit data packets. Such cards use from 50% to 70% less power than standard wireless cards, researchers said.

Harrison demonstrated a pager-size device that could be hooked on a belt, and sense whether the wearer was bicycling, running, standing, using a Stairmaster exercise machine, or walking. In the demo, the device's choice was displayed on a screen in the form of percentages, since power walking, for example, could be considered half walking and half running. The idea behind the research is to make computing as unobtrusive as possible in everyday life.


Intel is working on Larrabee, which will likely start out with 3 teraflops of performance and will compete with general-purpose GPUs like the Nvidia Tesla.

Package size: 49.5mm x 49.5mm
Process: 45nm
Clockspeed: 1.7-2.5GHz
Power: >150W

Intel will try to get it out in 2008

Intel will continue to release more cores on its best chips

Compilers from Intel, Microsoft and others are being adapted to a world of many cores and many threads

Some of the existing computational chemistry applications are well suited to massively parallel hardware. Others, like ACES II, will be ported to the new systems. NanoEngineer-1 and other systems will get a lot faster.

Solid state flash memory has been improving at a faster than Moore's law rate

Enterprise solid state flash drives have higher capacities and faster data connections. Availability of solid state drives at Wikipedia

SimpleTech has 256 GB drives now for $10,000, while the 512 GB version will follow in Q3 2007 for $15,000.

Basic sub-$1000 SSD flash drives as of 2007 are delivering about 5,000 to 10,000 I/O operations per second (IOPS), compared to 500 for faster hard drive arrays. More expensive systems can provide 100,000 IOPS or more.

An article argues that the limited write-cycle issue of flash drives is now a non-issue

If D-Wave quantum computers pan out, then there will be 1,000 qubits by the end of 2008. It will be 2009 before the D-Wave system is adapted for molecular simulation, probably with 4,000 qubits.

A projection of what this might mean for computational chemistry.

So by the end of 2007: $60,000 for 12 teraflops, $10,000 for a nice 8-core workstation, $15,000 for 512 GB of enterprise solid state flash drive. 100 times faster than what you could buy in early 2007.

By the end of 2008: $60,000 for double precision and 36 teraflops, $10,000 for a nice 32-core system, $20,000 for 2 terabytes of enterprise solid state flash drive. 300 times faster than what could be bought in early 2007.

By the end of 2009: $60,000 for 100 teraflops, $10,000 for a nice 80-core system, $20,000 for 4 terabytes of enterprise solid state flash, and a 4,000-qubit quantum computing molecular simulator. 900+ times faster than what could be bought in early 2007.
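Those year-end snapshots can be collected in one place; a small sketch (figures are this post's projections, not vendor quotes) also shows the implied cost per teraflop of the $60,000 system each year:

```python
# The projected year-end price/performance points above (assumed figures),
# with the implied cost per teraflop of the $60,000 system each year.
projections = {
    2007: {"teraflops_per_60k": 12, "speedup_vs_early_2007": 100},
    2008: {"teraflops_per_60k": 36, "speedup_vs_early_2007": 300},
    2009: {"teraflops_per_60k": 100, "speedup_vs_early_2007": 900},
}

for year, p in projections.items():
    cost = 60_000 / p["teraflops_per_60k"]
    print(year, f"${cost:,.0f}/teraflop,",
          f"{p['speedup_vs_early_2007']}x vs early 2007")
```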

Ontario trying to phase out coal by 2009 or 2014

Here is a pdf that describes Ontario's efforts and progress to phase out coal energy usage by 2009. This is what a realistic proposal to phase out coal energy looks like, although the Ontario government is backsliding somewhat on its commitment to eliminate coal usage. I am not aware of any other place with significant coal energy use now that is implementing a plan to phase it out.

The official policy is currently to phase out coal by 2014. There is an election in Ontario on Oct 10, 2007. The current administration did break a promise to phase out coal by 2007. The incumbent Liberal Dalton McGuinty and Conservative leader John Tory are tied in the polls.

John Tory's plan is to build more nuclear and other clean sources and to clean up the coal plants

This coal energy phase out will reduce air pollution and save at least 657 lives per year.

Some articles on the coal phase-out and the pollution reduction benefits of removing coal energy, which supplies 20% of the electricity.

Hat tip to NNadit at dailykos

Ontario's coal plants are responsible for:

36% of Ontario’s airborne mercury emissions;
28% of Ontario’s industrial smog-causing nitrogen oxides emissions;
23% of Ontario’s industrial smog-causing sulphur dioxide emissions; and
8% of Ontario’s industrial PM2.5 small particulate emissions that go deep into our lungs and cause asthma attacks, heart and lung diseases, strokes and premature mortality.

Coal Phase-Out will raise electricity bills by 34 to 53 cents per month

An Ontario coal phase-out would also save at least 657 lives per year in the province, according to an April 2005 report prepared for the Ontario Ministry of Energy.

Ontario has a population of 12.8 million people. The United States has 24 times the population and uses coal for 50% of its power. Assuming proportional coal air pollution effects, the US would have 60 times the number of deaths, or over 39,000.
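The back-of-envelope scaling behind that 39,000 figure, under the article's proportionality assumption:

```python
# Scaling Ontario's estimated coal-pollution deaths to the US, assuming
# effects proportional to population and to coal's share of electricity.
ontario_deaths_per_year = 657   # April 2005 report for Ontario
population_ratio = 24           # US population / Ontario population
coal_share_ratio = 50 / 20      # US coal share (~50%) vs Ontario (~20%)

us_deaths_per_year = ontario_deaths_per_year * population_ratio * coal_share_ratio
print(round(us_deaths_per_year))  # 39420
```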

In 2006, Ontario had 6.4 gigawatts of coal power and a total capacity of 31 gigawatts. Some of the hydroelectric and other non-coal power sources will not be available during peak summer days.

The total coal-free power on summer peak demand days is 22.14 gigawatts. Various improvements that will be made are projected to yield 27.2 gigawatts of peak summer power generation in 2009. From 2010 to 2012, they will add a bit more, to 28.3 gigawatts.

This will cover the expected peak demand of about 27 Gigawatts but not the 17% reserve margin.

So another 4 gigawatts of non-coal peak power is needed in 2010-2012.

There are various options.
The cheapest is to convert coal plants in Thunder Bay and Nanticoke to natural gas ($70 million to $156 million per GW).
Add new natural gas capacity for $600-900 million per GW.

Restart the Pickering A unit 4 nuclear plant for $2.4 billion per GW.

Other alternatives are more aggressive conservation and demand management, more aggressive renewables targets, acquiring more cogeneration resources, and another natural gas option in Mississauga.

Nvidia will have double precision floating point soon

HPCwire (High Performance Computing Wire) looks at the Nvidia Tesla and expected future competition

The Tesla S870 server board is really the big breakthrough for NVIDIA, since it represents their first product designed for the HPC datacenter. It fits in a 1U chassis, contains four GPUs, and communicates with the server host using a Gen 2 PCI Express switch. Temperature sensors and system monitoring are included to provide the level of reliability expected in datacenter hardware. The board dissipates 550 watts. Add another 10 watts for a PCI Express host adapter card. That might seem like a lot of juice for an accelerator, but for 560 watts you get over 2 teraflops of single-precision performance. MSRP for the server board is $12,000.

The Tesla server also comes in a 2-GPU version, and an 8-GPU version is in the works. The latter configuration is expected to improve upon the performance per watt ratio somewhat.

The addition of double-precision capability will open up the entire technical computing market for NVIDIA, since the inherent limitations of single precision arithmetic will be removed. So unless AMD comes out with a double precision GPU in the next few months, NVIDIA will be the vendor to pioneer 64-bit floating point in GPGPU computing. As such, it becomes a more direct competitor with ClearSpeed boards, a math co-processor offering that also targets the HPC market. Although NVIDIA has not released power or performance specs for their upcoming double-precision devices, one can surmise that ClearSpeed will be able to claim a performance per watt advantage, but perhaps not a performance per dollar advantage. Depending on how Intel's Larrabee processor development plays out, NVIDIA could eventually run into additional competition there as well.

In any case, there may be plenty of acceleration opportunities to go around. The commercial HPC market is growing rapidly -- even faster than the general IT market. According to IDC, technical computing revenues will reach $14.2 billion by 2010. Currently, the oil & gas and financial services segments represent two of the highest growth areas right now. But manufacturing, biotech and government HPC are also expanding. NVIDIA thinks its new HPC line can ride a lot of this growth as users start to figure out that Tesla-equipped workstations can replace decent sized clusters and Tesla-equipped clusters can match the raw performance in some high-end supercomputers.

The new devices will support double precision math, a basic requirement for many technical computing applications. Double precision support will make its first NVIDIA appearance at the end of Q4, 2007. At this point, it's not clear if NVIDIA's first double precision processor will be in a Quadro product or the new HPC offering.

Nvidia's latest offerings are still at the 80nm process.

The Nvidia Tesla systems might get updated with the G92 technology and 65nm processes for triple the performance and lower power usage in 2008. Annual upgrades could triple performance each year.

How much is China doing environmentally and currency-wise?

Many people are complaining about China becoming the biggest carbon emitter, though its per capita emissions are only 25% of US levels. They complain that China's currency is too low, which is "forcing" Americans to go to Walmart to buy cheaper imported Chinese goods. Let us look at what China is doing for a cleaner energy future and at the currency and economy.

China has partially floated its currency. It is now appreciating by about 5% per year.

China may allow the RMB to strengthen by 7.5% per year. At 7.5% per year, the 40% appreciation that many in the US are clamoring for is handled in about 5 years.
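A quick compounding check of that claim (the rate and the 40% target are the figures assumed above):

```python
# At an assumed 7.5% annual RMB appreciation, count the years until the
# cumulative appreciation reaches the ~40% level being demanded.
rate = 0.075
appreciation = 1.0
years = 0
while appreciation < 1.40:
    appreciation *= 1 + rate
    years += 1
print(years, f"{appreciation - 1:.1%}")  # 5 years, ~43.6% cumulative
```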

By the way: the euro rose about 66% against the US dollar over the last seven years (from US$0.82 per euro to US$1.36 per euro), meaning the dollar lost roughly 40% of its value against the euro, and almost as much against the Canadian dollar. How much pressure is there on the USA to stop depreciating its currency?

This trend would mean that China would pass the USA in overall economic size in about 2020

Here is an indicator of the financial pressure on the RMB to appreciate (created by the Milken Institute and Xinhua Finance): the RMB Pressure Indicator. This measures increasing financial pressure, not political pressure.

China is adding a lot of cleaner hydroelectric power and some nuclear and natural gas.

The dams also help them move more freight by river instead of truck (river is 10-20 times cleaner) or rail (river is 2-4 times cleaner)

China has some 64 nuclear plants in the pipeline and is currently completing about 2-4 per year.
There is talk of 300+ nuclear plants by 2050

By 2020, China should have 42% non-coal energy versus about 20% now.

According to government statistics, more than 50% of the world's mass urban rail construction projects will be under implementation in China in 2006. Right now, more than 30 cities in China have been building or plan to build their own urban rail systems. China will need over US$25 billion of investment in railway lines during the eleventh five-year plan. Before 2010, the total length of urban railway line in the three cities of Beijing, Shanghai and Guangzhou will reach 1,000 kilometers, compared with 300 kilometers now.

China is closing the smallest and dirtiest coal plants and is building somewhat cleaner, bigger coal plants. Not ideal, but has the US shut down any of its dirtiest coal plants? No. The USA has grandfathered protections in place to let the oldest and dirtiest coal plants continue to pollute.

China is making electric cars, like the Flybo

Tianjin Qingyuan Electric Vehicle Co. Ltd. (QYEV) is building a 165 million yuan (US$21 million) factory capable of producing 20,000 electric powered vehicles a year in the northern port city of Tianjin. The plant will produce cars powered by battery, hybrid power and fuel cells. It is expected to be completed at the end of 2007.

In 2006, Chinese bought 16 million to 18 million electric bicycles.

What is the US doing?
Trying to pass a 35 mpg CAFE standard.
Making biofuels.
Might get 28 nuclear plants by 2020; the first might be done by 2015.
Might clean up some coal pollution emissions (mostly not CO2, but at least saving some lives if it passes).
Coal usage is likely to go up as a percentage.

Also note $287 billion of what China makes goes to the USA. $182 billion goes to the EU.

UPDATE: An economic argument that those who think an appreciated Chinese currency will solve the trade imbalance are mistaken.

The Chinese yuan has depreciated by over 50% against the euro, while appreciating by about 7% against the USD, over the last 7 years.

If nominal exchange rates were driving trade flows as commonly alleged, then Chinese exports to the U.S. should have been growing faster than to Europe. The data show something completely different... Plotted together over that entire decade, these two series look nearly identical. This is because the same real economic forces -- e.g., China's relative abundance of less-skilled labor -- have been driving both sets of trade flows

Put it this way: In a counter-factual world where over the past decade China allowed the yuan to float against the dollar, the U.S. would still have run a large and growing trade deficit with China. The real economic forces of comparative advantage that drive trade flows operate regardless of which nominal prices central banks choose to fix.

There is also some discussion of sterilized vs non-sterilized intervention.

Gene Therapy reduces Parkinson's disease symptoms by 70%

Gene therapy appears to reduce the symptoms of Parkinson's disease by 70%. The improvement occurs 3 months after treatment starts. Parkinson's disease is the disease that actor Michael J. Fox and boxing legend Muhammad Ali have.

June 21, 2007

More specifications of the Nvidia Tesla Supercomputer

The teraflop power of the Nvidia Tesla supercomputer board, desktop and workstation versions will be available in August 2007.

Go to this link to sign up to get more information on buying one

Tesla C870 GPU specifications ($1500 add in card):
- One GPU (128 thread processors)
- 518 gigaflops (peak)
- 1.5 GB dedicated memory
- Fits in one full-length, dual slot with one open PCI Express x16 slot

The GPU is especially well-suited to address problems that can be expressed as data-parallel computations with high arithmetic intensity–in other words when the same program is executed on many data elements in parallel with a high ratio of arithmetic to memory operations.
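That data-parallel pattern can be illustrated in plain Python (a stand-in sketch, not CUDA code): the same operation is applied independently to every element, which is what maps well onto the GPU's 128 thread processors.

```python
# SAXPY (y = a*x + y), the canonical data-parallel kernel: each output
# element depends only on the inputs at the same index, so on a GPU every
# element could be computed by its own thread.
def saxpy(a, x, y):
    return [a * xi + yi for xi, yi in zip(x, y)]

print(saxpy(2.0, [1.0, 2.0, 3.0], [10.0, 20.0, 30.0]))  # [12.0, 24.0, 36.0]
```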

Here is the C programming stack for the NVIDIA supercomputer

In case you did not look at my original article after I updated it

Here is a chart of NVidia crushing Moore's law. The G92 is expected to be three times faster than the current best chip at 1 teraflop instead of 330Gflop.

Here is a link to the Nvidia 21 page technical brief

Here is a link to the developers info

CUDA (Compute Unified Device Architecture) technology gives computationally intensive applications access to the tremendous processing power of NVIDIA graphics processing units (GPUs) through a revolutionary new programming interface. Providing orders of magnitude more performance and simplifying software development by using the standard C language, CUDA technology enables developers to create innovative solutions for data-intensive problems. For advanced research and language development, CUDA includes a low level assembly language layer and driver interface.

The CUDA Toolkit is a complete software development solution for programming CUDA-enabled GPUs. The Toolkit includes standard FFT and BLAS libraries, a C-compiler for the NVIDIA GPU and a runtime driver. The CUDA runtime driver is a separate standalone driver that interoperates with OpenGL and Microsoft® DirectX® drivers from NVIDIA. CUDA technology is currently supported on the Linux and Microsoft® Windows® XP operating systems.

Google is trying to persuade ATI and Nvidia to open up their driver specs.

There are the reverse engineered Nouveau drivers

Here is an online petition to get Nvidia to open-source their drivers

Wikipedia on all Nvidia GPUs

Carnival of space Week 8 at Universe Today

June 20, 2007

Nvidia Tesla supercomputer for $1,500 to $60,000 for 2 to 12 teraflops

Here is a pdf of the announcement of Nvidia 2 to 8 teraflop Tesla desktop supercomputer

New desktop supercomputers are now available from Nvidia partners and there will be AMD Firestream competition. Four Tesla boards in one desktop machine for 4 teraflops of single precision power for less than $10,000 and 400 gigaflops of double precision.

Pricing for the Tesla GPU would start at $1,499 and the deskside computer at $7,500.

Nvidia Tesla product overview: the systems have fast memory access at 76.8 GB/sec.

The Tesla C870 one-GPU card is $1,499.

The deskside Tesla D870 supercomputer, with two 8-series GPUs, needs 550W of power, delivers 1 teraflop, and will cost $7,500 beginning in August. It has 3 gigabytes of system memory (1.5 GB per GPU). With multiple deskside systems, a standard PC or workstation is transformed into a personal supercomputer, delivering up to 8 teraflops of compute power to the desktop. Eight of the deskside systems would cost $60,000 and deliver 8 teraflops.

Future versions of the deskside system will be able to provide up to four Tesla GPUs per system, or eight Tesla GPUs in a 3U rack mount. [From the Nvidia technical briefing pdf]

The GPU computing server blade Tesla S870 will have a retail price of $12,000, four GPUs, 2 teraflops, and use 550 watts. This is a 1U GPU computing server. With four to eight GPUs in a 1U form factor, GPU computing with the highest performance per volume per watt will be possible. The 8-GPU version would be 4 teraflops and will likely be around $18,000 (estimate). Four of the 4-way S870s would be $48,000 and would total 8 teraflops in power.

A 12 Teraflop system (24 GPUs) that will cost $60,000-70,000 will be selling soon from Evolved Machines.
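Comparing the configurations above by price per single-precision teraflop (prices and throughput as quoted; the 24-GPU figure uses the $60,000 end of the stated range):

```python
# Price per single-precision teraflop for the quoted Tesla configurations.
configs = {
    "C870 add-in card": {"price": 1_499, "teraflops": 0.518},
    "D870 deskside": {"price": 7_500, "teraflops": 1.0},
    "S870 1U server": {"price": 12_000, "teraflops": 2.0},
    "24-GPU system": {"price": 60_000, "teraflops": 12.0},
}
for name, c in configs.items():
    print(f"{name}: ${c['price'] / c['teraflops']:,.0f} per teraflop")
```

Interestingly, the cheapest card is also the cheapest per teraflop; the pricier configurations buy density and packaging, not raw price/performance.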

How much can it speed up certain applications? Note: molecular dynamics runs 240 times faster.


The Tesla GPU (graphics processing unit) features 128 parallel processors and delivers up to 518 gigaflops of parallel computation. A gigaflop refers to the processing of a billion floating point operations per second. Nvidia envisions the Tesla being used in high-performance computing environments such as geosciences, molecular biology, or medical diagnostics.

Nvidia also will offer Tesla in a workstation, which it calls a Deskside Supercomputer, that includes two Tesla GPUs, attaches to a PC or workstation via a PCI-Express connection, and delivers up to 8 teraflops of processing power. A teraflop is the processing of a trillion floating point operations per second.

A Tesla Computing Server puts eight Tesla GPUs with 1,000 parallel processors into a 1U server rack.

Tesla GPU computing processor, deskside supercomputer, and GPU Computing server

As recently as 2005, the general price points were roughly $1,000 a gigaflop in common supercomputer configurations. In 2004, the DOE spent $25 million for a Cray system rated at 50 teraflops.

Wikipedia cost of computing tracking

2000, April: $1,000 per GFLOP, Bunyip, Australian National University. First sub-US$1/MFlop. Gordon Bell Prize 2000.
2000, May: $640 per GFLOPS, KLAT2, University of Kentucky
2003, August: $82 per GFLOPS, KASY0, University of Kentucky
2005: about $2.60 ($300/115 GFLOPS CPU only) per GFLOPS in the Xbox 360 in case Linux will be implemented as intended
2006, February: about $1 per GFLOPS in ATI PC add-in graphics card (X1900 architecture) - these figures are disputed as they refer to highly parallelized GPU power
2007, March: about $0.42 per GFLOPS in Ambric AM2045
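From the first and last price points in that list one can estimate how fast FLOPS per dollar has been doubling (a rough fit over two endpoints, not a careful regression):

```python
import math

# Doubling time of GFLOPS per dollar between April 2000 ($1,000/GFLOPS)
# and March 2007 ($0.42/GFLOPS), approximating each date as a year fraction.
span_years = (2007 + 3 / 12) - (2000 + 4 / 12)
improvement = 1000 / 0.42
doubling_time = span_years * math.log(2) / math.log(improvement)
print(f"{doubling_time:.2f} years per doubling")
```

That works out to a doubling roughly every seven months, well ahead of the 18-24 month cadence usually attributed to Moore's Law.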

UPDATE: This is amazing. But not all FLOPS are equal.
For graphics, and for problems similar to graphics, Moore's law is being shattered.

Graphic of Nvidia kicking Moore's Law's butt: 330 gigaflops for the 8800 and 1 teraflop for the G92, expected at Christmas 2007

Specialized processing systems like Japan's petaflop MDGrape-3 machine can be a lot faster for particular problems.
If the problems that you are interested in are accelerated, then things are getting a lot better, faster.

If the computational chemistry software packages get a big speed boost then this is big for molecular nanotechnology.

My follow up article with more specs and more links

TG Daily has some more details on this new Nvidia GPU supercomputer

Nvidia is supplying a C based programming model (CUDA) and AMD has an assembly language based system (CTM). The programming models will let developers get pretty good results right away but fine tuning by experts will speed things up by 5X or more.

These machines are inferior to regular supercomputers in memory bandwidth and some other factors. However, one could buy these systems and try to find ways to beef up the memory with enterprise versions of solid state flash hard drives at a fairly affordable price to lessen the weaknesses in this area.

It seems it would be possible to port a version of Python (CPython) onto the C programming model fairly easily

The developer documentation and samples that Nvidia provides. Available on the new GeForce 8800 graphics card and future Quadro professional graphics solutions, NVIDIA claims computing with CUDA overcomes the limitations of traditional GPU stream computing by enabling GPU processor cores to communicate, synchronize, and share data.

Open source OpenVIDIA library for GPUs

Wikipedia on the state of General Purpose GPUs

There was previous talk of 80 core Intel chips and integrating the ATI GPU with an AMD processor to get to teraflop machines in 2008-2010

Intel is demonstrating their 80-core chip now and hopes to release it in 2009

The NVIDIA Tesla computing pages

What it would take for zettaflop computing

China is building three petaflop computers by 2010

Japan is building a 10 petaflop machine by 2011

Other petaflop projects including the completed MDGrape3

Flash memory improving faster than Moore's law will accelerate larger database searches

Other things going faster than Moore's Law, which include gene sequencing costs and system integration



NanoDynamics IPO

A NanoDynamics IPO could help increase funding for various kinds of nanotechnology companies. NanoDynamics is looking at a $100 million IPO. They have nanostructured components as part of fuel cell products, and nanoparticle products.

Traveler's Dilemma in Game Theory

Scientific American reviews the Traveler's Dilemma. Basically it shows that in the real world people are willing to work together so that they can both win or tie, in order to get more return on average.

This is relevant for economics and for choices in technology and between nations.

Win-win or tie-tie outcomes with an overall win can be the more popular choice.
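For reference, the payoff rule of the Traveler's Dilemma can be sketched (the standard formulation with a reward/penalty of 2 and claims between 2 and 100):

```python
# Traveler's Dilemma payoffs: if claims are equal, both are paid the claim;
# otherwise both are paid the lower claim, with +2 to the lower claimant
# (honesty bonus) and -2 to the higher claimant (penalty).
def payoffs(claim_a, claim_b, reward=2):
    if claim_a == claim_b:
        return claim_a, claim_b
    low = min(claim_a, claim_b)
    if claim_a < claim_b:
        return low + reward, low - reward
    return low - reward, low + reward

print(payoffs(100, 100))  # (100, 100): mutual high claims pay best overall
print(payoffs(99, 100))   # (101, 97): undercutting tempts each player...
print(payoffs(2, 2))      # (2, 2): ...which unravels to the Nash equilibrium
```

Game theory predicts the (2, 2) equilibrium, but experiments show most people claim near 100, which is the point of the article.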

China's and India's growth analysis

A recent paper by Barry Bosworth and Susan Collins of the Washington-based Brookings Institution compares performance over the 1978-2004 period. After 1993, it compares India's post-1991 reforms with China's.

Accounting for Growth: Comparing China and India, Working Paper 12943, February 2007, National Bureau of Economic Research, www.nber.org

Breakdown shows close to equal service growth.

With a remarkably open economy and gross fixed investment at 43 per cent of gross domestic product last year, it is hard to identify significant constraints on China's growth in the medium term. A breakdown in the global economic and political system would presumably do it. So might domestic political or social instability. In the long term, failure to persist with reform would also be a danger.

India’s fixed investment has been far lower. But it is already close to 30 per cent of GDP. If the fiscal position continues to improve and the inflow of long-term capital from abroad to accelerate, the investment rate could rise still further. Partly because infrastructure is poor and industrial performance disappointing, the upside for Indian growth is also bigger than for China’s. But India also suffers from serious handicaps. The most important, apart from weak infrastructure and a relatively ineffective government, is the scale of mass illiteracy. Adult male literacy was only 73 per cent and female literacy a deplorable 48 per cent in 2002, against 95 and 87 per cent, respectively, in China.

Chindia is on the move. Since China’s standard of living is roughly a fifth of that of the high-income countries and India’s one-tenth, the fast growth of the giants might persist for a generation.

A detailed pdf analyzing China's purchasing power GDP

It is part of this study:
Alan Heston, Robert Summers and Bettina Aten, Penn World Table Version 6.2, Center for International Comparisons of Production, Income and Prices at the University of Pennsylvania, September 2006.

My projection that shows that China's economy is likely to become bigger than the USA's economy by around 2020. The main factors at work are the appreciation of the Chinese RMB against the US$ and a continuation of growth at over 7% per year versus the USA's 3%.

My analysis that China is urbanizing faster than official Chinese reports indicate, and that China has in the past consistently underestimated future urbanization rates

China's one child policy is eroding and a population growth rebound is likely

China's military will be proportionally big

China is building a lot of hydro power
China might make 300+ nuclear power plants by 2050

Details on Nuclear plant operating extensions

Extending the life of nuclear reactors in the United States with research from Oak Ridge National Lab
Comprehensive risk analysis provided by Oak Ridge National Laboratory researchers plays a crucial role in keeping billions of dollars of electricity generation on line – without compromising safety. In the absence of license renewal, more than 40 percent of the nation's 104 nuclear power plant licenses will expire by 2015. The replacement cost value of electricity generation capacity being submitted by industry in 2007-2008 for 20-year extensions is close to $20 billion, according to Richard Bass of ORNL's Computational Sciences and Engineering Division. The major concern is pressurized thermal shock, caused by either a rapid temperature or pressure change in the reactor vessel. This, combined with the fact that reactor vessels become embrittled over time, increases the potential for a pre-existing crack to propagate through the vessel wall, causing failure. Using advanced risk-assessment engineering technologies and high-performance computing resources, ORNL provides the technical basis for the Nuclear Regulatory Commission to set standards used in the license renewal process. This research is funded by the NRC.

ORNL is also developing stainless steels that are stronger but five times cheaper than alternatives.

A new type of stainless steel alloy developed at Oak Ridge National Laboratory could allow for significantly increased operating temperatures and corresponding increases in efficiency in future energy production systems. The new alloys offer superior oxidation resistance compared to conventional stainless steels, without significant increased cost or decreased creep resistance (sagging at high temperature). What sets this proprietary material apart from other stainless steels is its ability to form protective aluminum oxide scales instead of chromium oxide scales. The combination of creep and oxidation resistance offered by these alloys previously was available only with nickel-base alloys, which are about five times more costly than the new stainless steels. This material also has potential applications in high-temperature (up to 800 degrees Celsius) chemical and process industry applications.

June 19, 2007

Diabetes death rate down

The rate of premature death among American men with diabetes has dropped dramatically over the last few decades, but the same can't be said for women with the disease, a study by the Centers for Disease Control and Prevention has found.

Analyzing data for 20,000 people from across the United States, researchers found that annual death rates from all causes in men with diabetes fell to 24.4 per 1,000 from 42.6 per 1,000 - a 43 per cent reduction.

The death rate from cardiovascular disease (CVD), the most common cause of death in diabetics, fell for men to 12.8 per 1,000 from 26.4. These drops paralleled declining death rates among both men and women without diabetes in the U.S. population (down to 9.5 per 1,000 from 14.4) over the three-decade period.

Not only have mortality rates for women with diabetes not declined, but the difference in death rates for diabetic and non-diabetic women has actually widened over the three decades as females without diabetes started living longer.

The gender gap found in the U.S. diabetic population can't necessarily be extrapolated to Canada, said Toronto endocrinologist Dr. Lorraine Lipscombe.

In a study she co-authored as a researcher at the Institute for Clinical Evaluative Sciences, published in March, Lipscombe said death due to the complications of diabetes in Ontario fell for both men and women between 1995 and 2005.

"In our study we did not find any difference between men and women and we found an overall decline of about 25 per cent," said Lipscombe.

Because the Ontario research contained more recent data than the U.S. study, it might reflect more current medical practices, she said. "Maybe we are getting better at taking care of women with diabetes."

Cyborg moths for surveillance and weaponized insects

Quote from Death Race 2000.
Machine Gun Joe Viterbo: Frankenstein! You want Frankenstein? I'll give you Frankenstein!
[Joe opens fire into the stands]

From the Times Online, the creation of insects (moths) whose flesh grows around computer parts – known from science fiction as ‘cyborgs’ – has been described as one of the most ambitious robotics projects ever conceived by the Defense Advanced Research Projects Agency (Darpa), the research and development arm of the US Department of Defense. The moth will be capable of landing in an area without arousing suspicion, all the while beaming video and other information back to its masters via what its developers refer to as a “reliable tissue-machine interface.”


“Moths are creatures that need little food and can fly all kinds of places,” he continued. “A bunch of experiments have been done over the past couple of years where simple animals, such as rats and cockroaches, have been operated on and driven by joysticks, but this is the first time where the chip has been injected in the pupa stage and ‘grown’ inside it. Once the moth hatches, machine learning is used to control it.”

Debates such as those over stem cell research would “pale in comparison” to the increasingly blurred distinction between creatures – including humans – and machines, Mr Brooks told an audience at the University of Southampton’s School of Electronics and Computer Science.

“Biological engineering is coming. There are already more than 100,000 people with cochlear implants, which have a direct neural connection, and chips are being inserted in people’s retinas to combat macular degeneration. By the 2012 Olympics, we’re going to be dealing with systems which can aid the oxygen uptake of athletes.”

Weaponized Insects

I would note that you could also use this technology to control poisonous insects.

Weaponization of insects:
Control bugs to spread disease
Control poisonous bugs for assassinations
Control bugs and have them lay eggs to destroy crops (locusts)

Also, note you can use gene therapy, RNA interference and activation to enhance characteristics of your weaponized insect. Have your bug fly faster and farther, make its venom more deadly, attach artificial poison sacs and needles, etc.

Have your bug hitch a ride on your cyborg controlled falcon.

On the plus side you could control bees and stuff for more precise pollination for better agriculture.

This is related to the Chinese implanting a chip to control the flight of a pigeon.

A peer to peer network of cyborg moths

A point brought up by Jamais Cascio: how do the moths relay their information out?
On the electronics side, a lot of the smart dust work on low-power systems can be ported over to the hybrid insect MEMS.

I would think that they could do peer-to-peer transfers among a lot of moths.
With peer-to-peer relaying you can create redundancy, so that if one moth and its set of sensors is destroyed, the rest of the network can still operate.

Also look at the optical communication options for smart dust.

For a major node with more power and range, stick it onto a pigeon or larger bird. It could also be the master long-range instruction receiver.
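As a toy illustration of the relay idea (all node names, positions and radio ranges below are invented assumptions, not anything from the DARPA program): each moth can only reach neighbors within a short radio range, and data hops moth-to-moth toward a longer-range master node, the "bird." Losing one moth should not cut off the rest.

```python
# Hypothetical peer-to-peer relay sketch: moths forward data
# hop-by-hop to a long-range master node ("bird").

from collections import deque

RADIO_RANGE = 1.5  # assumed per-moth link range, arbitrary units

# assumed positions of the relay nodes
nodes = {
    "bird": (0.0, 0.0),
    "moth1": (1.0, 0.0),
    "moth2": (2.0, 0.0),
    "moth3": (1.0, 1.0),
    "moth4": (2.0, 1.0),
}

def in_range(a, b):
    (x1, y1), (x2, y2) = nodes[a], nodes[b]
    return ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5 <= RADIO_RANGE

def can_reach_bird(start, lost=()):
    """Breadth-first search over surviving moth-to-moth links."""
    seen, queue = {start}, deque([start])
    while queue:
        cur = queue.popleft()
        if cur == "bird":
            return True
        for other in nodes:
            if other not in seen and other not in lost and in_range(cur, other):
                seen.add(other)
                queue.append(other)
    return False

# with moth1 destroyed, moth4 can still relay via moth3
print(can_reach_bird("moth4", lost=("moth1",)))  # True
```

With enough moths, multiple independent paths exist to the master node; take out too many relays (say, both moth1 and moth3 here) and the outer moths go silent.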

Nuclear power momentum in the United States

Later this month, California's energy commission plans to tread carefully when, for the first time, it reviews new ways to handle the radioactive waste produced by nuclear energy — the biggest legal obstacle to building new plants in California.
One possible option could be to reprocess, or recycle, the waste.

I support a lot more nuclear power in the United States, China (300+ reactors by 2050), Canada, Europe, Japan, Australia, Russia, India, South Korea, South Africa and other places. I support nuclear reprocessing and high burn reactors. It will help to save the 3 million lives that are lost each year to air pollution. It will help to address any eventual lack of oil. I would rather we get off oil and coal for the pollution reasons than because we ran out.

Nobel Prize winner Steven Chu, who is also the director of Lawrence Berkeley National Laboratory, echoes the desire to rethink nuclear. He reasons that despite the fears and concerns about the energy source, nuclear power must be considered because it does not produce greenhouse gas during generation. Anything, he said, would be better than carbon-spewing coal plants.

And what of the people who don't want to consider nuclear energy in the hope that less controversial solutions like renewable energy and conservation will be enough?

"If you start thinking like that, then you doom yourself," he said.

Berkeley professor supports nuclear reprocessing:

Opponents say reprocessing would encourage nuclear proliferation, but nuclear supporters like University of California-Berkeley nuclear engineering Professor Per Peterson said such concerns need to be re-evaluated.

"The whole logic of abstaining from a technology so that others would not pick it up no longer makes sense," Peterson said.

Stanford president supports nuclear power:

"Nuclear power has to be part of the solution," Stanford University President John Hennessy said at an alternative-energy gathering in Palo Alto this spring. "Can we really understand the notion of risk? Nuclear plants versus carbon emissions — which will kill and has killed more people?"

28 nuclear plants could get authorized and the first could start in 2015
The federal Nuclear Regulatory Commission expects applications for as many as 28 new nuclear reactors during the next two years.

Dennis Spurgeon, the Bush administration's senior nuclear technology official, said new plants could be running by 2015.

The Nuclear Regulatory Commission believes there's enough momentum that some of the expected plant applications will result in construction.

"This time, we are taking it very seriously," said agency public affairs officer David McIntyre. "Our agency has been reorganized to prepare for these applications coming in. We're hiring people right and left. Congress has given us a budget increase."

California EPA is going to revisit the issue. Note: California also has the strong advocacy of California Assemblyman Chuck DeVore, who submitted a bill to lift California's ban on nuclear power.
The California Environmental Protection Agency's Dan Skopec said climate change provides the perfect opportunity to revisit the controversial power source.

"We need to have a debate on nuclear," said Skopec, who was appointed undersecretary for the agency by Gov. Arnold Schwarzenegger.

Wind, now the cheapest of renewable energies, is expected to cost 6.8 cents per kilowatt-hour by 2020, according to the federal Energy Information Administration. Natural gas, by comparison, would cost 5.6 cents per kilowatt-hour. Nuclear energy would cost 6.1 cents per kilowatt-hour. All these figures include the cost of plant construction.

Advocates argue that not including construction costs, nuclear power is the cheapest option of all. The California Energy Commission's most recent estimates put nuclear power's current cost at 1.4 cents to 1.6 cents per kilowatt-hour.
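For scale, a rough sketch of what these per-kilowatt-hour figures imply annually (the 10,000 kWh/year household consumption is my assumption, not from the article):

```python
# Annual cost comparison using the cents-per-kWh figures quoted above.

cents_per_kwh = {
    "wind (2020 projection)": 6.8,
    "natural gas": 5.6,
    "nuclear (incl. construction)": 6.1,
    "nuclear (operating cost only)": 1.5,  # midpoint of the 1.4-1.6 range
}

household_kwh = 10_000  # assumed annual household consumption

for source, cents in cents_per_kwh.items():
    dollars = cents * household_kwh / 100
    print(f"{source}: ${dollars:,.0f}/year")
```

The spread between sources is only tens of dollars per household per year on these estimates; the striking gap is between nuclear's operating cost and its all-in cost, which is why construction financing dominates the debate.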

Utilities support nuclear power:
"We don't believe that conservation and renewables combined will be sufficient to meet demand in our market for an extended period of time," said Brad Peck, spokesman for the Columbia Generating Station, a nuclear plant in Washington state that feeds a small amount of power to Northern California. "You simply can't conserve yourself into prosperity."

The leader of PG&E Corp., the parent company of Northern California's largest utility, agrees. "We need all of the options to meet this huge challenge and, therefore, nuclear ought to be on the table," said Chairman and Chief Executive Officer Peter Darbee.

June 18, 2007

Why colonize space instead of the Gobi Desert?

What made North America any better than uninhabited parts of Europe?

People who left were able to start fresh.
They were able to tap into a different resource base.

We can think of the world as your parents' house and the Gobi Desert as the unfinished basement.
What makes moving out better than living in your parents' basement? It is cheaper in the basement. It is tougher to move out: you have to do your laundry, maybe cook for yourself, and landscape your new place. You would have enough room in your parents' basement. Does your older brother have to remodel and live in the basement before you can move out?


Strike out on your own.
Make an independent life for yourself.

We could keep densifying your parents' house, and people living in the house could get weapons that could easily blow it up. If we spread out across the block and the city, we would not kill each other as easily and we would have more room to stay out of each other's way. If you have guns and grenades, or matches, in the same house, it is very easy to take out the whole family. The family's survivability is better in multiple houses. This is unrelated to how well different family members get along, or whether some people have trouble keeping a job or buying their own food.

Re: perfecting ourselves before leaving the Earth?
Is there some standard that says we should be perfect before we leave our parents' house? Do we solve all of our problems before we leave?

Does moving out mean that the parents do not need to keep working on their problems? Don't you have to keep working on your own problems? Would staying under the same roof help solve the problems any faster? Does it prove anything that you learn not to kill your parents and learn to live together with them?

In the solar system, the Earth intercepts only about one two-billionth of the total solar energy emitted by the sun.
There are a lot more resources available in the solar system that we can utilize if we crack the nut of space costs.
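As a sanity check of how small a slice of the Sun's output the Earth actually intercepts, the geometric fraction is the Earth's cross-section divided by the surface area of a sphere one AU in radius:

```python
# Fraction of the Sun's total output intercepted by the Earth:
# (pi * R_earth^2) / (4 * pi * AU^2) = R_earth^2 / (4 * AU^2)

R_EARTH = 6.371e6   # mean Earth radius, meters
AU = 1.496e11       # mean Earth-Sun distance, meters

fraction = R_EARTH**2 / (4 * AU**2)
print(f"Earth intercepts about 1 part in {1 / fraction:.2g} of the Sun's output")
```

The fraction works out to a few parts in ten billion, which is the point: essentially all of the Sun's output streams past the Earth, unused.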

Those who think no one will ever colonize space are in the same mould as someone saying "I am living in my parents' basement and I am happy and it is better economically than moving out; I cannot imagine anyone ever moving out of their parents' basement."

The "let's not colonize space until we colonize the Gobi Desert" fallacy, or the "until we have no problems" fallacy, is the same as the "let's not move out of our parents' house until our family problems are all gone and we have remodeled every square inch" fallacy.

Let me also repeat that short camping trips in the backyard do not count as serious attempts to leave home.

The easiest way to get people to want to leave home is to have abusive or overly domineering parents: "if you live under my roof then you have to live by my rules." Eventually someone will want to leave even if the parents are nice and cool.

UPDATE: Further analysis of the analogy based on a comment by Shubber Ali.

The leaving-your-parents'-basement analogy was meant to show that sometimes we leave a place even if it costs more to do so. It is a clear and common case where you go even though going costs more and is harder than staying.

Shubber brought up the issues of 1) air, 2) water, 3) food, 4) ease of access and 5) gravity. For those issues, we go back to the title of this article: you would have to find ways to bring water and food to the Gobi Desert as well, and you would have the cost of growing food. There is also the issue that China claims the Gobi Desert, so other countries trying to colonize it would need to follow the rules of the Chinese government. This matches up with the basement analogy (my house, my rules).

Air can be brought along, and in certain places (the Moon, Mars) it can be processed from local materials. The Moon and Mars have gravity, and a spinning space colony would have simulated gravity from centrifugal force.

To match the basement analogy to these situations more closely:
Leaving the basement for a house that you have to build. You have to connect to the local water line, or if you move to a place with no utilities (like space, currently) you have to dig your own well (it costs more, but it can be done), build a septic tank, and perhaps grow your own food if there is no store nearby.

Another option is to buy a big mobile home that comes with more amenities. You could start growing some indoor plants before you leave home, while the mobile home is parked in the driveway; that way it will be easier when you go to that new plot of land off your parents' property.

Those two examples I just brought up are approaches I think we should use as part of a plan. We can build or send infrastructure into space first: robotically built or inflated structures for gathering energy, and machines that pre-process lunar regolith for oxygen, or do the same with the Mars soil and atmosphere. Structures with aeroponics or hydroponics can be sent after the energy and air infrastructure has been established.

The mobile home analogy is that you make and launch something big: something as close to project Orion as public relations will let you get.

Space colonization

Charlie Stross wrote an article describing the difficult and, in his view, impossible challenges of space colonization. It is also pointed to from Centauri Dreams.

There are several problems with it:

1. The failure of a science fiction author to figure out a technological way forward is a flaw in the science fiction author, not in the technological/sociological plans.

Just because you cannot think of a good way to do something does not mean that there are no good ways. It shows only that your ways would not work and that you are approaching the problems in a faulty way.

It also shows that when we (collectively) are not trying to colonize space, and have screwed-up approaches to civilization, energy and space, then we will not go anywhere.

2. We could be doing a lot better with space colonization if we actually had plans for it and tried to carry them out. We have not tried.
Three to six people on camping trips are not attempts to colonize space.

What does a real modern space colonization effort look like?

- Build out power infrastructure first.
Using the 25-ton-to-LEO Proton M launcher at $70 million per launch,
launch a 100 megawatt solar concentration system.
Use 40.7% efficient Spectrolab solar cells.
Use magnetic inflation, as per James Powell's 2006 NIAC study.
Use the power for electric boosting to L2 or a higher orbit.
Use the power for communication and other satellites that could use cheaper power, and to remotely power other LEO stages boosting to high orbit.

- As you build up space infrastructure robotically, with magnetic inflation and formation flying, get the costs down so that eventually it is affordable to send people for colonization.

- Increase the energy and economic capacity of human civilization. 14 terawatts does not cut it.
Mass-produce nuclear power (fission or fusion). Use high-burn reactors or reprocess all nuclear material; currently only 10% of nuclear material is reprocessed. High-burn (not breeder) reactors were made in the 1960s and 1970s. See thoriumenergy.blogspot.com

- Look at other non-chemical launch systems: laser arrays and mirrors, or magnetic launch. Launch stuff (infrastructure) first, not people, so cheaper high-acceleration launch is OK for stuff.

- Look at public-relations-safe external nuclear pulse propulsion. Update project Orion with a publicly acceptable version. With project Orion the cost is 70 cents per kilogram into space. Repeatable Z-pinch, Mini-Mag Orion or colliding beam fusion (funded by Venrock and others) could get us to publicly acceptable, superior ground launch.
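The launch-cost numbers above reduce to dollars per kilogram for comparison; a quick sketch using only the figures cited:

```python
# Cost-per-kilogram comparison: a Proton M launch ($70 million for
# about 25 tons to LEO) versus the ~70 cents/kg cited for a full
# project Orion style nuclear pulse launcher.

proton_cost = 70e6        # dollars per launch
proton_payload = 25_000   # kg to LEO

proton_per_kg = proton_cost / proton_payload
orion_per_kg = 0.70       # dollars per kg, as cited for project Orion

print(f"Proton M: ${proton_per_kg:,.0f}/kg")           # $2,800/kg
print(f"Orion-style: ${orion_per_kg:.2f}/kg")
print(f"Ratio: {proton_per_kg / orion_per_kg:,.0f}x")  # about 4,000x
```

The gap is roughly four thousand to one, which is why the plan above launches infrastructure first on existing rockets while pushing toward far cheaper non-chemical systems.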

To get serious we need better technology, and we need to use the technology that we have. We can build up our economy so that, besides lowering the costs, we have more money and energy to spend.

See past articles on colonizing space

China's new rockets will have more than double the current capacity

Atom trap used to spot individual atoms

A device that can hold hundreds of atoms in a 3D array, and image each one individually, has been developed by scientists in the US.

The team [David Weiss, Karl Nelson, and Xiao Li at the Pennsylvania State University] used three lasers arranged at right angles to create a 3D lattice in which they trapped 250 atoms of cesium.

The researchers then photographed the atoms in the array, layer by layer, proving that they could see each one. "If you can't see them, it's much harder to manipulate them individually," David Weiss says. But, he adds, because they can make out individual atoms, "it's pretty clear that we should be able to manipulate them independently of each other".

The team now plans to address individual atoms with a highly focused laser beam, which will change their energy state and make nearby atoms interact, Weiss says. This should make the atoms enter the quantum physical state known as "entanglement" that is essential for doing quantum computations.

"Now that there's this huge array, you can start seriously thinking about scaling up" the approach of using neutral atoms as qubits for quantum computing, says Trey Porto of the National Institute of Standards and Technology in Gaithersburg, Maryland. "It's certainly relevant to quantum computing because you're going to need lots of qubits."
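As a loose illustration of why layer-by-layer imaging gives individual addressability, here is a sketch of a 3D lattice indexed by site coordinates. The 5 x 5 x 10 geometry is my assumption for illustration; the article only gives the 250-atom total.

```python
# Assumed 3D optical lattice: sites indexed by (x, y, z), one cesium
# atom per site. Imaging "layer by layer" means selecting all sites
# at a fixed z.

NX, NY, NZ = 5, 5, 10   # assumed lattice dimensions: 5 * 5 * 10 = 250 sites

lattice = {(x, y, z): "Cs"
           for x in range(NX)
           for y in range(NY)
           for z in range(NZ)}

def layer(z):
    """All sites in one imaging plane (fixed z)."""
    return [site for site in lattice if site[2] == z]

print(len(lattice))   # 250 atoms total
print(len(layer(0)))  # 25 sites per imaged layer
```

Because every atom has a unique (x, y, z) address and each z-plane can be resolved separately, a focused laser could in principle target any single site, which is the manipulation step the team plans next.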

Intel working on carbon nanotube enhanced chips

In a patent published last week, Intel reveals how nanotubes' strength and heat-dissipating properties can be used to reinforce the conducting copper tracks that connect millions of transistors together.

Inventor Chi-Won Hwang says depositing heat-sink nanotubes on electrically insulating layers adjacent to the copper tracks slashes the thermal stress caused by fast-pulsing electric currents.
