November 17, 2007

Printable carbon nanotube batteries

Batteries that can be printed onto a surface with "nanotube ink" have been demonstrated by US researchers, who say the technique will fit well within the growing field of printed electronics, which still relies on conventional power sources. Solar cells could also be printed with inkjet printers.

Carbon nanotube production is expanding and prices are falling.

To make the battery, a layer of nanotubes is first deposited in the form of "nanotube ink" onto a surface. This layer acts as the charge collector, which removes current from the battery.

Next, a layer of nanotube ink mixed with manganese oxide powder and an electrolyte, which carries charge within the cell, is applied on top. This layer acts as the cathode. Finally, a piece of zinc foil – the anode – is applied.

"The batteries are similar to conventional batteries," says Gruner, "with the electrically conducting nanoscale networks replacing conventional metals and electrodes."

He adds that the designs should make it possible to get more power than a conventional design would from the same materials, "an important factor for portable electronics applications."

The researchers also made supercapacitors using the inking technique and plan to combine these with batteries for applications requiring more power.

Furthermore, since both printed batteries and supercapacitors can be made entirely at room temperature, it should be possible to mass-produce them using established printing methods, Gruner says.

Gruner says his research team is working to increase power output and to demonstrate suitability of the designs for industrial production.

November 16, 2007

New method to manipulate light a million times more efficiently

Using a special hollow-core photonic crystal fibre, a team at the University of Bath, UK, has opened the door to what could prove to be a new sub-branch of photonics, the science of light guidance and trapping.

The team, led by Dr Fetah Benabid, reports on the discovery, which relates to emerging attotechnology: the ability to send out pulses of light that last only an attosecond, a billionth of a billionth of a second.

An attosecond is 1000 times shorter than a femtosecond. Short-pulse lasers are an interesting and emerging tool for science and technology: they can destroy viruses and bacteria, and they can vaporize matter without heat for precise micromachining.

These pulses are so brief that they allow researchers to more accurately measure the movement of sub-atomic particles such as the electron, the tiny negatively charged entity which moves outside the nucleus of an atom. Attosecond technology may throw light, literally, upon the strange quantum world where such particles have no definite position, only probable locations.

To make attosecond pulses, researchers create a broad spectrum of light, from visible wavelengths to x-rays, by driving an inert gas with intense laser light. This normally requires a gigawatt of power, which puts the technique beyond any commercial or industrial use.

Dr Benabid’s team used a photonic crystal fibre (pcf), the width of a human hair, which traps light and the gas together in an efficient way. Until now the spectrum produced by photonic crystal fibre has been too narrow for use in attosecond technology, but the team have now produced a broad spectrum, using what is called a Kagomé lattice, using about a millionth of the power used by non-pcf methods.

The team makes use of the fact that light can exist in different ‘modes’ without strongly interacting. This creates a situation whereby light can be trapped inside the fibre core without the need for a photonic bandgap. Physicists call these modes bound states within a continuum.

Nantero's newest NRAM promise is for 2008

Nantero is promising that NRAM could appear in consumer goods in 2008. However, they have previously promised 2005 and 2007.

The thought of introducing filthy carbon nanotubes into an ultra-sensitive fab has blocked the rise of so-called NRAM, or Nano Random Access Memory. Thanks, however, to a refined cleansing process and relentless browbeating, start-up Nantero thinks it has mainstream semiconductor players close to giving NRAM a try.

Nantero has attracted the attention of some big name tech players, including HP, LSI and ON Semiconductor. These companies are either experimenting with NRAM in their R&D labs or actually trying to produce NRAM chips.

The Feds have a lot of interest in NRAM because of its longevity and density. Flash, while glorious, does degrade over time as insulators wear down due to charge fluctuations. Memory loss on a satellite or weapons system can be a real pain.

So I think NRAM will have a radiation hardened and endurance niche.

November 15, 2007

Silicon electronics can be printed by inkjet printers

Kovio, a Sunnyvale startup, has the world's first all-printed silicon transistor capability. They will have roll-to-roll printing of computers. This is clearly a process that will improve further as the design rules shrink and as they adopt faster printing technology like Memjet. This technology also has implications for better fabbers, rapid prototyping and rapid manufacturing over the next few years.

This could foreshadow some breakthroughs in the period before nanofactories. New near-nanotech could drive costs and applications that cannot be matched by silicon CMOS technology. Ovonic quantum control devices on polymer might have a performance advantage, rather than a disadvantage, versus silicon chips. New nanopatterning systems could shrink component sizes as well.

Super-electronics based on polymer or graphene might be made from far cheaper fabs and far greater flexibility in form factors and cost reductions and even higher volumes.

Better versions or alternatives to this technology might also transform solar photovoltaics.

Kovio's "green" silicon ink yields thin-film transistors (TFTs) that achieve the performance of polysilicon transistors, but at a third of the price, consuming only 5 percent of the chemicals and 25 percent of the energy of single-crystal silicon processing. Kovio claims that radio-frequency identification tags using its silicon ink will drop in price from 15 cents today to 5 cents by 2008, when Kovio begins volume production of its inkjet-printed RFID tags.

Their thin-film silicon transistors have very high mobilities for a printed device, and they can make both p-type and n-type devices for CMOS circuits. Their current design rules are 20 microns, with 10 microns working in the lab, which is where Intel started in 1971. Intel's first microprocessor used just over two thousand transistors; similarly, Kovio's first devices for RFID tags will use fewer than about a thousand transistors when mass production begins by the end of next year [2008].

Kovio is building its own fab, which uses temperatures too high for plastic substrates (which is why Kovio uses a stainless steel foil substrate), but which does not require the expensive processing equipment and clean-room environment of single-crystal silicon fabs. Silicon ink devices can be fabricated on roll-to-roll printing equipment, which is how Kovio plans to dramatically drop the price of RFID tags and similar applications using all types of flexible electronics.

They can build a printable silicon fab for about $10 million, compared with $1 billion for a traditional silicon fab. They need only about five percent of the materials (one percent of substrate cost and three percent of the cycle time) to create new devices.

By way of comparison, single crystal silicon transistors today can achieve mobilities as high as 600 centimeters squared per volt second (sq cm/Vs), and polysilicon transistors, like those that drive LCD displays, have mobilities of about 100 sq cm/Vs. Unfortunately, there is a big gap between single-crystal silicon and the printable organic transistors that are being demonstrated at dozens of labs worldwide. Organic transistors have dismal electron mobilities of less than 1 sq cm/Vs in contrast with Kovio's silicon ink, which rivals polysilicon with its 80 sq cm/Vs electron mobilities. Most important, silicon ink can produce transistors that are fast enough for RFID and most other electronic interface protocols.
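As a rough illustration of what those mobility figures mean in practice, here is a minimal sketch. It assumes that transistor drive current (and hence switching speed at a fixed geometry and voltage) scales roughly linearly with carrier mobility, a standard first-order approximation, not a claim from the article itself.

```python
# Carrier mobilities quoted in the article, in cm^2/(V*s).
mobilities = {
    "single-crystal silicon": 600,
    "polysilicon (LCD drivers)": 100,
    "Kovio silicon ink": 80,
    "printed organic": 1,
}

# First-order approximation: drive current, and thus speed at fixed
# geometry and voltage, scales linearly with mobility.
baseline = mobilities["printed organic"]
for name, mu in sorted(mobilities.items(), key=lambda kv: -kv[1]):
    print(f"{name:28s} {mu:5d} cm^2/Vs  (~{mu / baseline:.0f}x organic)")
```

By this crude measure, silicon ink sits roughly 80 times above printed organics and within striking distance of polysilicon, which is the article's central point.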

US 9th Appeals Court tosses federal fuel-economy standards

A U.S. appeals court on Thursday threw out the government's new fuel economy standards for many sport-utility vehicles, minivans and pickup trucks in a victory for environmentalists.

The decision stemmed from a lawsuit filed by 11 states and environmental groups that argued federal regulators ignored the effects of carbon dioxide emissions when calculating fuel economy standards for light trucks.

Filed last year, the suit sought to force the National Highway Traffic Safety Administration to recalculate its mileage standards from scratch, with carbon dioxide emissions taken into account as a major factor in the agency's analysis.

Gov. Arnold Schwarzenegger said Thursday that California won't back down from the lawsuit — but will stick to its plan to put tougher standards in place by the 2009 model year despite protests from the auto industry.

Last week, California sued the U.S. Environmental Protection Agency, seeking to force the agency to decide whether California can enact the country's first emissions standards for cars and light trucks.

Nantero NRAM still not commercialized

Nantero NRAM is discussed with the CEO of Nantero at HPCwire.

Nantero's business model is to license the NRAM technology to established manufacturers and to provide intensive support in getting it up and running and integrated into products. The main challenges now include increasing yield on the technical side and signing new partnerships on the strategic side to add to the licensee base. Multiple discussions with potential licensees are underway, both in the embedded space and the standalone memory space, and depending on the level of resources partners apply to standalone memory, it might not arrive much later than embedded memory.

NRAM still has significant potential.
NRAM requires only a small number of new manufacturing steps, all of which use existing tools that are present in any production CMOS fab. So NRAM has few hurdles for integration, either as a standalone memory or as an embedded memory. NRAM's scalability, theoretically down to below 5 nm, is also unmatched among the technologies currently under development for production CMOS fabs.

However, NRAM's window of opportunity is rapidly closing, with many other new universal computer memory technologies making fast progress.

Samsung could be selling a phase-change-based flash-replacement memory within a year. Some phase change memory is 1000 times faster than current flash memory.

Others are working on nanoionic memory: Qimonda, based in Germany; Micron Technologies, based in Boise, ID; and a Bay Area stealth-mode startup. The startup is well on the way to producing its first memory devices, which Kozicki says could be available within 18 months. These first chips, however, won't rival hard drives in memory density, he says.

Copper doped computer memory could be selling in a few years

If something big does not happen with NRAM in 2008, then I think the ship will have sailed as these other technological alternatives reach implementation. Any momentum or first-mover advantage is already slipping away this year.

Making cheaper, stronger and better carbon nanotubes without metal catalysts

NASA Goddard Space Flight Center has made a major step forward in reducing the cost of manufacturing single-walled carbon nanotubes (SWCNTs).

Most manufacturing methods, which use a metal catalyst to form the tubes, have several drawbacks that have impeded development of SWCNTs’ numerous applications. NASA researchers have discovered a simple, safe, and inexpensive method to create SWCNTs without the use of a metal catalyst.

Traditional catalytic arc discharge methods produce an “as prepared” sample with a 30% to 50% SWCNT yield. NASA’s method produces SWCNTs at an average yield of 70%.

Because NASA’s process does not use a metal catalyst, no metal particles need to be removed from the final product. Eliminating the presence of metallic impurities results in the SWCNTs exhibiting higher degradation temperatures (650 °C rather than 500 °C) and eliminates damage to the SWCNTs by the purification process.

Researchers used a helium arc welding process to vaporize an amorphous carbon rod and then form nanotubes by depositing the vapor onto a water-cooled carbon cathode. Analysis showed that this process yields bundles, or “ropes,” of single-walled nanotubes at a rate of 2 grams per hour using a single setup.

This process received a Nanotech Briefs Nano 50 award.

Hat tip to Roland Piquepaille's Technology Trends, which also has other links on this topic.

Carnival of Space week 29

Reports of Dwave's latest quantum computer demo

Zdnet blogs on the latest quantum computer demo by Dwave Systems

The latest iteration of D-Wave’s chip has 28 qubits (quantum bits), according to Rose. He said they were on track to show a 512 qubit machine next year, and 1024 the year after that. The die has room for a million qubits. But first things first, says Rose. “If we can’t get to 512 qubits by the end of next year, we’re in trouble,” he admitted.

Dwave's press release on the demo and future plans is more optimistic.

Our product roadmap takes us to 512 qubits in the second quarter of 2008 and 1024 qubits by the end of 2008.

At the future of things, Rose takes a tougher stance.

Rose has responded to the criticism saying that major developments have been made in quantum computing systems in the past years. He said that the 28-qubit computer, which will be demonstrated at SC07, will be able to use Dr. Neven's image recognition algorithm to analyze a 300-image database, grouping the objects according to detected similarities. "Our image-matching demonstration, the core of which is too difficult for traditional computers, can automatically extract information from photos, recognizing whether photos contain people, places or things, and then categorizing them by visual similarity," he said.

The actual machine is a bit unwieldy at the moment. In fact, it’s about as large as D-Wave’s entire booth, so demos were run remotely via a web service back to the lab. “We’re going to work on making the refrigerator a bit smaller and self-contained,” said Rose, thinking ahead to commercial deployments.

In the picture above you can see a magnified view of the individual qubits on the chip. Each qubit is connected to three of its neighbors. Rose was asked why people were so skeptical of his work. It all comes down to the traditional way of relating discoveries through peer-reviewed journals, he explained. He promised, “We’re going to go out to some of the hotbeds of skepticism” in the coming year, with the goal of silencing the naysayers. They might even file a paper or two, but it didn’t seem to be a priority.

Apparently the US Patent and Trademark office is convinced, having granted the company dozens of patents on the technology. Dozens more are pending. “We have more [quantum computing] patents than any other company in the world,” said Rose.

Neil Gershenfeld’s keynote speech from the SC07 conference

Neil Gershenfeld challenged his audience to reconsider “obviously true” statements like “binary information is represented with two states”. In light of current and future technological trends what if we relax these statements? He listed several of these tautologies and set out to reword them to be more correct in today’s and tomorrow’s world:

Computers come in cases -> computers come in rolls, buckets
Compilers optimize programs -> optimizations program compilers
Bits are zero or one -> bits are between zero and one
Internetworking -> interdevice interworking
Programs can describe things -> programs can be things

As regular objects become computerized and interconnected at a smaller and smaller scale, we’re approaching the nano-scale of biological systems. We’re “in the moment”, he says, on the cusp of a fabrication revolution.

Desktop supercomputers

SiCortex and Scalable Servers Corporation have each packaged something like a server cluster into a single box, to produce what each hopes will be a commercially viable desktop supercomputer.

On the outside, the machines in question look like big desktop PCs. On the inside, they are rather different. Instead of one or two microprocessors (the business parts of a computer, which do the actual calculations), they have dozens—up to 72 in the case of SiCortex.

The new SiCortex SC072, code-named "Catapult", fits 72 processors into a deskside unit that starts at less than $15,000. With 72 processors, 48 GB of memory, and 3 PCI Express ports, the Catapult draws less than 200 watts of power and fits in a standard PC chassis.

Scalable Servers Corporation has its flexBLADE platform

Capable of a wide range of configurations, the versatile flexBLADE comprises a single chassis form factor with up to 5 dual-socket blades, configurable as a cluster, SMP, hybrid combination or small server farm. Supporting the full range of Next-Generation AMD Opteron processors, the platform can scale from a cool and quiet, low-power 1500 watt departmental solution up to a robustly configured 3000 watt compute powerhouse. The flexBLADE also supports scaling out beyond standard dual socket to quad-socket SMP "fat node" configurations with ample memory support (16 DIMM slots per node, or 80 total DIMM slots per chassis), storage (up to ten 2.5" and fourteen 3.5" disks), and a PCIe x16 slot per blade, which allows multiple graphics heads per platform. Built-in networking includes 10 or 20 Gigabit InfiniBand plus 10 Gigabit and 1 Gigabit Ethernet, with full system management, allowing the flexBLADE platform a wide range of configurations to match performance and cost requirements.

SiCortex also has the SC5832, which has a cluster of 972 nodes, each having six processors, for a total of 5,832 processors. Each processor draws a paltry 600 milliwatts of juice. The chassis can hold up to 8 TB of main memory, and the theoretical peak performance is 5.8 TFlops.

The whole cabinet only draws 18 kilowatts of electricity. Nodes communicate directly over a passive copper backplane. It’s air cooled, and each node runs a fairly standard Linux kernel. It comes pre-installed with a full suite of development software including MPI, TAU, Vampir, TotalView, and more.
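The SC5832 figures above can be cross-checked with simple arithmetic. The sketch below assumes the quoted 600 mW is for the processor cores alone, so the remainder of the 18 kW cabinet budget covers memory, interconnect, fans and power-supply losses; that split is my reading, not an official breakdown.

```python
# Back-of-envelope power budget for the SiCortex SC5832, using the
# figures quoted in the article: 972 nodes x 6 processors, 600 mW per
# processor core, 18 kW for the whole cabinet.
nodes = 972
procs_per_node = 6
processors = nodes * procs_per_node        # 5832 processors total

core_power_w = processors * 0.6            # cores alone: ~3.5 kW
cabinet_power_w = 18_000

# Everything else: memory, interconnect, fans, PSU losses (assumed).
other_w = cabinet_power_w - core_power_w
per_node_w = cabinet_power_w / nodes       # average per six-processor node

print(f"cores: {core_power_w / 1000:.1f} kW, everything else: {other_w / 1000:.1f} kW")
print(f"average per node: {per_node_w:.1f} W")
```

Even with the overhead included, roughly 18.5 W per six-processor node is a remarkably low figure for 2007-era cluster hardware.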

November 14, 2007

Interesting theory of everything

A very interesting and relatively simple theory of everything (including gravity). The theory should be testable with new particle colliders.

E8 polytope

All fields of the standard model and gravity are unified as an E8 principal bundle connection. A non-compact real form of the E8 Lie algebra has G2 and F4 subalgebras which break down to strong su(3), electroweak su(2) x u(1), gravitational so(3,1), the frame-Higgs, and three generations of fermions related by triality. The interactions and dynamics of these 1-form and Grassmann-valued parts of an E8 superconnection are described by the curvature and action over a four-dimensional base manifold.

Article from the Telegraph

The crucial test of Lisi's work will come only when he has made testable predictions. Lisi is now calculating the masses that the 20 new particles should have, in the hope that they may be spotted when the Large Hadron Collider starts up.

"The theory is very young, and still in development," he told the Telegraph. "Right now, I'd assign a low (but not tiny) likelihood to this prediction.

"For comparison, I think the chances are higher that LHC will see some of these particles than it is that the LHC will see superparticles, extra dimensions, or micro black holes as predicted by string theory. I hope to get more (and different) predictions, with more confidence, out of this E8 Theory over the next year, before the LHC comes online."

The E8 theory is being discussed on physics forums.

E8 theory is predictive (that is to say falsifiable) because it has no free parameters to adjust. It will say what it will say, and if that is shown to be wrong, then the theory's wrong. As development proceeds, changes might be made to the action and to the way E8 symmetry is broken, but a good many features are already locked in as unalterable predictions. Like the 18 new particles, which might serve to resolve the astrophysical dark matter problem, or might serve to trip the theory up!

E8 theory predicts what reactions are allowed for both the new and the already observed standard particles. So even though it is just taking shape the theory is already offering the prospect of something experimentalists can look for. Traditionally this is what hep-th is supposed to do.

Other discussions

and more discussion here

China taking leadership in renewable power deployment

Worldwatch indicates that China will likely achieve—and may even exceed—its target to obtain 15 percent of its energy from renewables by 2020. If China’s commitment to diversifying its energy supply and becoming a global leader in renewables manufacturing persists, renewable energy could provide over 30 percent of the nation’s energy by 2050.

China’s carbon dioxide emissions are on the rise and are expected to exceed total U.S. carbon dioxide emissions shortly, although Chinese per-capita emissions remain about one-sixth those of the United States.

More than $50 billion was invested in renewable energy worldwide in 2006, and China is expected to invest over $10 billion in new renewables capacity in 2007, second only to Germany. Wind and solar energy are expanding particularly rapidly in China, with production of wind turbines and solar cells both doubling in 2006. China is poised to pass world solar and wind manufacturing leaders in Europe, Japan, and North America in the next three years, and it already dominates the markets for solar hot water and small hydropower.

A combination of ambitious targets supported by strong government policies and the manufacturing prowess of the Chinese may enable China to “leapfrog” so-called industrialized nations in renewable technology in the years immediately ahead.

The article indicates that nuclear energy could provide about 5% of China's power needs. I think that target will be reached in 2020 (with 50-70 GW of nuclear power). I think China will exceed that percentage with around 100 more nuclear plants from 2020-2030 (up to 10-15% of electricity) and then 200-400 more nuclear plants from 2030-2050 (up to 30-40% of electricity).

32 megajoule rail gun delivered for naval testing

BAE Systems has delivered a functional, 32-megajoule Electro-Magnetic Laboratory Rail Gun (32-MJ LRG) to the U.S. Naval Surface Warfare Center in Dahlgren, Va. Installation of the laboratory launcher is currently underway, and according to BAE, this is the first step toward the Navy’s goal of developing a tactical 64-megajoule ship-mounted weapon.

Eight- and 9-megajoule rail guns have been fired before, but supplying the 3 million amps of current needed per shot has been a limitation. At 32 megajoules, this new system appears to be the most powerful rail gun ever built, and the Office of Naval Research is installing additional capacitors at the Dahlgren facility to support it. The planned 64-megajoule weapon, if it’s ever built, could require even more: a staggering 6 million amps.

The Navy’s electrically propelled DDG 1000 destroyer, Chaboki says, is a prime candidate for the final 64-megajoule system. Around 72 megawatts (MW) of the vessel’s power can be used for propulsion. But during combat, the destroyer’s speed could be brought down, freeing up energy for a rail gun. Chaboki calculates that firing the 64-megajoule weapon six times per minute would require 16 MW of power, which would be supplied by either onboard capacitors or pulsed alternators.
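Chaboki's 16 MW figure can be sanity-checked against the muzzle energy. The sketch below computes the average power actually delivered to projectiles at six 64 MJ shots per minute; the gap between that and 16 MW implies an end-to-end efficiency, which is my inference rather than a number from the article.

```python
# Average power check for the planned 64 MJ rail gun firing 6 shots/minute.
shot_energy_j = 64e6     # muzzle energy per shot, joules
shots_per_min = 6

# Average power delivered at the muzzle.
muzzle_power_w = shot_energy_j * shots_per_min / 60

# Input power quoted by Chaboki; the ratio gives the implied
# end-to-end efficiency of the capacitors/alternators plus launcher.
input_power_w = 16e6
efficiency = muzzle_power_w / input_power_w

print(f"muzzle power: {muzzle_power_w / 1e6:.1f} MW, implied efficiency: {efficiency:.0%}")
```

So 6.4 MW reaches the projectiles out of 16 MW drawn, an implied efficiency of about 40%, which is plausible for pulsed-power launch systems.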

Effective rail guns will require a major breakthrough in materials between now and 2020, to keep the guns themselves from being shredded by each high-velocity barrage.

There was a 2003 analysis of using railguns for orbital launches.

For launch to orbit, even long launchers (>1000 m) would need to operate at accelerations >1000 gees to reach the required velocities, so that it would only be possible to launch rugged payloads, such as fuel, water, and material. A railgun system concept is described here and technology development issues are identified. Estimated launch costs could be attractively low (<$600/kg) compared with the Space Shuttle (>$20 000/kg), provided that acceptable launch rates can be achieved.
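The ">1000 gees" claim follows from basic kinematics: constant acceleration over launcher length L must reach an orbital-class muzzle velocity v, so a = v^2 / (2L). The 8 km/s muzzle velocity below is an illustrative assumption on my part; a real ground launch would need somewhat more to cover drag and gravity losses.

```python
# Kinematics behind the ">1000 gees" figure for a 1000 m orbital launcher.
g = 9.81        # m/s^2, standard gravity
v = 8000.0      # m/s, roughly orbital velocity (assumed, ignores losses)
L = 1000.0      # m, a "long" launcher per the quoted study

# Constant acceleration from rest over length L: v^2 = 2*a*L.
a = v**2 / (2 * L)
print(f"required acceleration: {a:.0f} m/s^2 = {a / g:.0f} g")
```

Roughly 3,300 g for a full kilometer of launcher, which is why the study limits such systems to rugged payloads like fuel, water, and raw material.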

A European Space Agency study of rail guns for space launches

A system to launch single-stage rocket-propelled projectiles that put nano-satellites in orbit, using a 3.4 GJ railgun with a length of 180 m.

Ram accelerators would be cheaper and quicker to develop for gun launching payloads into space.

Superthread carbon nanotubes would be the kind of material needed to help reinforce the rail gun.

There is other progress being made on better materials including nanograin metals.

Staffing an expanding nuclear industry

Skilled workforce shortages are common across industries (nursing, etc.). Besides spinning up new training and increasing recruitment, the issue can also be addressed with increased automation and with design and process improvements that reduce staffing requirements. Staffing the nuclear industry is a well-known issue and is often cited as a reason that the nuclear industry cannot expand.

The number of nuclear engineers being trained is increasing, college programs are being expanded or restarted, and companies like General Electric have initiated aggressive recruitment programs. There are also significant changes via design and management processes that are reducing staffing requirements per operating plant. There are 60,000 skilled nuclear workers, and 20,000 of them could retire over the next five years; thousands more will be needed for new plants in the USA and around the world.

A comparison of staffing levels for different modern nuclear plant designs: average nuclear plant staffing levels were 1000-1200 in the mid-90s (already down from the 1970s and 1980s, when it was about 1500 people per plant). New designs with staffing levels of 440-700 would reduce staffing needs further. While there have been staff reductions, all of the safety and operational metrics have been improving for the last three decades. Average staffing levels are now at about 790-800 people.

Since staff costs typically account for more than half of a plant’s O&M cost, reducing staff should reduce O&M costs. Design concepts for new plants have focused on reducing the operations burden, which leads to staff reductions and should ultimately lower operating costs.

This study used a task-based approach to determine plant staff requirements for specific plant operation tasks. Starting with the staffing profile of a top-rated plant (North Anna), the study team reviewed the details of the new designs to determine if the advances in technology and information reporting would reduce overall staffing levels. Each task associated with plant operation was taken into account. A staff model was developed for each reactor type. This model maintains an adequate staff level to meet regulatory and best-practice requirements.

The first new plants built in the United States will rely heavily on current operational practices to ensure that the lessons learned over more than 30 years of plant operation are applied to the newest generation of plants. Therefore, for the purposes of this study, the organizational structure from the current operating philosophy was maintained. Although current staff structures differ between operating companies, they have a single overall goal: to reduce human error and equipment failure in all phases of plant operation and safety and to ensure an overall high operating capacity factor.

The staffing estimates used in this study include the onsite plant staff as well as additional staff that would be needed in the corporate office to support the additional units. These estimates also include corporate office support staff, who provide fuel design and procurement, safety analysis support, major modification development, and other more generic activities.

Overcoming the challenges of the workforce issues and knowledge maintenance.

US Nuclear industry staffing levels

Once base power rates were established through public utility commissions, opportunities for cost reductions through labor savings became available. By the mid-1980s, U.S. nuclear plant operators began looking for opportunities to reduce cost through staffing reductions. The next major adjustment in personnel levels in the U.S. began in the mid-1990s with programs to "right size" the employee workforce. While effectively improving performance in terms of capacity factor, safety performance, and reduced refuel outage durations, U.S. NPPs began to consistently reduce employee staffing levels. Since 1997 average U.S. NPP staffing levels have dropped by more than 15%. These reductions appear to have recently leveled off.

As part of the reduction of total staff, along with the technical nature and training requirements for operating NPPs, employee skill sets have become very focused. To offset this situation, most U.S. NPPs proactively encourage rotation and cross training of staff. This approach provides "bench strength" in the form of additional personnel with experience and/or training while maintaining lower overall staffing levels.

Consolidation of NPPs into operating fleets has had a beneficial impact on developing and maintaining key knowledge.

Dealing with staffing reductions.

Getting nuclear engineering enrollment to 2,000-4,000 would turn out 700-1,400 graduates per year, who would help to stabilize and eventually increase the nuclear workforce. GE and other companies could step up and offer more scholarships and incentives to further increase enrollment, and provide university endowments to create new programs. Getting enrollment up to 8,000 should produce 2,800 graduates per year; 16,000 enrolled would yield 5,600 per year, and 32,000 enrolled would yield 11,200 per year. There are 104 plants in the USA now, and at 800 people per plant the staffing level must be about 83,000. Of those, 60,000 have special industry skills. In 2017, if the increased training and recruitment programs restore the workforce to 60,000 people, staffing requirements for old plants are cut by 20%, and new plants need only 400 skilled staff, then 30 new plants could be adequately staffed. Further recruitment and training would allow for more industry growth. 300 plants by 2030 in the USA with 400 skilled staff per plant would require 120,000 people. In the 2010-2020 timeframe the number of graduates would need to increase to the 5,600-11,200 per year levels.
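The arithmetic above can be checked in a few lines. The sketch assumes graduate output is about 35% of enrollment per year, the ratio implied by the 2,000-4,000 enrolled to 700-1,400 graduates figures; the scenario numbers (104 plants at ~800 staff, 300 plants at 400 skilled staff) come straight from the text.

```python
# Graduate pipeline: output assumed to be ~35% of enrollment per year,
# the ratio implied by 2,000-4,000 enrolled -> 700-1,400 graduates.
grad_fraction = 0.35
for enrolled in (2_000, 4_000, 8_000, 16_000, 32_000):
    print(f"{enrolled:6,d} enrolled -> ~{round(enrolled * grad_fraction):6,d} graduates/yr")

# Current fleet: 104 plants at roughly 800 staff each.
plants, staff_per_plant = 104, 800
print(f"current staffing: ~{plants * staff_per_plant:,} people")

# 2030 scenario from the text: 300 plants at 400 skilled staff per plant.
print(f"2030 scenario: {300 * 400:,} skilled staff needed")
```

The 104-plant figure works out to about 83,200 people, matching the ~83,000 quoted, and the 2030 scenario's 120,000 shows why graduate output would need to rise severalfold.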

I had a prior article showing that constructing a lot of nuclear power plants is not materially constrained.

The supply of the nine nuclear plant components that have limited suppliers is being built up by Areva, South Korea and others. So there will be more suppliers of currently limited parts, and new factories to make needed components can be started.

November 13, 2007

Other progress in quantum computing

Other Quantum computer progress discussed at SC07

The quest to build fully functioning quantum hardware is active on many fronts. Atomic physicists have seen for some years that the quantum states of a single atom held at rest in a trap, manipulated by laser pulses, functions as a highly coherent quantum information carrier. The ability to perform elementary logic operations on such a qubit has been well demonstrated.

Unfortunately, atomic physicists are not skilled HPC designers. So, much work also goes on in the area of novel integrated-circuit devices, in which the necessary quantum control is harder to demonstrate, but from which a large-scale device could be more readily created than it could be with trapped-atom technology. Two of these efforts are represented by leading practitioners on our panel: Will Oliver is a specialist in superconducting electronics, in which quantum behavior results not because the circuits are atomic-scale, but because of the special physical properties of the superconducting state. He has interesting results on a potentially scalable Josephson-junction circuit. Another panelist, Eli Yablonovitch, is an expert on the creation of qubits using individual atomic impurities in semiconductors (yes, he is also the inventor of the photonic bandgap effect). The control of individual atomic impurities and individual electrons in electronic devices has been a beautiful technological feat of recent years, which has opened up many novel possibilities, quantum and otherwise, for new, ultradense integrated devices.

Big nuclear power plans from China, India and Russia

India's president is talking about an energy independence plan.

Currently, power generation capacity in India stands at 130,000 MW, but this is forecast to increase to 400,000 MW by 2030. To achieve this massive boost in output, a range of different technologies will be required, including large-scale solar farms, wind farms, nuclear power plants, solid biomass and municipal waste. Power generated from renewable energy sources is hoped to increase from five percent today to twenty-five percent by 2030.

China's projected 2020 energy mix looks far different from what many expect. China would get about 35% of its power from non-fossil-fuel sources in 2020: 270 GW of hydro, 40 GW of nuclear, and 123 GW from renewables, if targets are reached. If natural gas usage increases as projected, 42% of power would come from non-coal sources.

Some analysts say China will build 300 [nuclear plants] more by the middle of the century.

Russia has big nuclear plans.

About 15 percent of Russia's electricity comes from nuclear power. Putin wants to increase that to 25 percent or more by 2030. Russia also hopes to export as many as 60 plants in the next two decades. To facilitate the crash expansion, the Kremlin in the summer of 2007 ordered more than 30 nuclear-related companies to amalgamate into a single state-owned behemoth, which will control every stage of civil atomic engineering, from uranium mining to construction and export of power stations to fuel enrichment to decommissioning old reactors. The new nuclear giant, to be called Atomenergoprom (Atomic Energy Industry Complex), is similar to other conglomerates that the Putin government has created and now runs in branches such as aircraft production, arms exports, electricity, and gas.


One teraflop AMD R680 GPGPU due in Q1 2008

More speculation on a US-Iran war

DC-based futurists and analysts pick top 12 areas for innovation by 2025

DC-based research and consulting firm Social Technologies released a series of 12 briefs on their top 12 areas for high impact technology innovation through 2025.

1. Personalized medicine [I agree and have covered personal genomics ]
2. Distributed energy (DE) [Here I think nuclear energy has more potential]
3. Pervasive computing [Fully realizing smart phone and wireless device potential]
4. Nanomaterials [I have written about new carbon nanotube factories and a lot on "active nanosystems", all of the advanced nanotechnology for which my site is named]
5. Biomarkers for health [I have been writing about my proposals for widespread biomarker tracking and part 2 of the biomarker proposal]
6. Biofuels [A bridging technology where electrification of transportation is delayed or less suitable]
7. Advanced manufacturing [rapid prototyping, rapid manufacturing, reel to reel systems, nanofactories]
8. Universal water [I have been covering desalination]
9. Carbon management [Shifting to lower carbon technology like nuclear power and wind power would be better than sequestering]
10. Engineered agriculture [I have been covering aquaculture, genetic engineering for crops, stem cell meat factories, and high rise green houses]
11. Security and tracking [I have tracked advancing imaging technology, surveillance technology, gigapixels, terapixels, sensors, spectrum analysis, lidar, persistent monitoring, RFIDs etc...]
12. Advanced transportation [I have covered electric bikes and scooters, platooning of vehicles, dual mode transportation, electric cars, high efficiency vehicles, advanced truck and diesel systems, transitioning from oil etc...]

Part 2: Widespread use of biomarker tests

This is a follow up to my proposal for widespread monitoring and analysis of biomarker data in people to improve medical research and treatment.

There would be work to do to bring costs down and to make more comprehensive tests, which is why I was suggesting that someone like an Andy Grove (who led Intel in bringing the costs of semiconductors lower) would be an ideal person to bring this about. Andy Grove has lamented the lack of pharma progress. He is motivated.

The global market for microfluidic technologies was worth an estimated $2.9 billion in 2005. This figure should grow to $3.2 billion in 2006 and $6.2 billion by 2011. Self-monitoring of blood glucose by diabetics is a $1.2 billion business.

There are glucose-monitoring kits for diabetics that provide 51 tests for $100, with results downloadable to a PC. The next steps are to make more detailed blood analyses and to transmit the results for centralized processing.

There is work on lung cancer blood tests.

Widespread biomarker testing is a goal worth working towards that would provide a lot of benefits. There are also earlier stages where it would only be deployed to a few thousand people, like Nielsen boxes: a statistically diverse group that would help researchers make better inferences about larger populations. The recorded, mineable data would amount to instant clinical trial information, improving health and lowering costs.

I think the initial few-thousand-person Nielsen-box goal is achievable in a few years, and the larger vision within ten years. Lab-on-a-chip devices are built with lithography and MEMS. It is a matter of getting HMOs, PPOs and government-funded medical programs to see that this would help them lower costs in the long run.

In terms of costs, the system would be a lot cheaper than having doctors or nurses take blood samples and send them to labs for the tests. A really capable machine would probably initially cost several thousand to tens of thousands of dollars. It would have several chips, perform multiple tests, and be reusable (instead of a throwaway system).

Some have the attitude that "if this were cheap enough, it would already be widely utilized," but that is the wrong way to look at it. I think this would be a superior way to figure out what is really happening with people when they get treatment and in between exams.

The current presidential candidates' suggestions for healthcare all amount to putting more money into the current broken-down health care system ($110 billion/year for the Clinton plan) and getting some more insurance for some or all of the uninsured. These plans do not try to prevent people from getting sick in the first place, do not have a central goal of early detection of disease (when it is cheaper to treat and treatment is more successful), and do not take steps to get the data we need to make people healthier.

Are we trying to change the future for the better or aren't we?

$5 billion per year, about 5% of the cost of the Clinton plan, would pay for tens of thousands of machines and a research program for further development and refinement.
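A quick sanity check on that budget claim. The $5 billion/year figure, the $110 billion/year Clinton plan, and the machine price range are from the text; the 50/50 split between machine purchases and research is an assumption for illustration.

```python
# Sanity check on the proposed biomarker program budget.
annual_budget = 5_000_000_000      # $5 billion/year, from the text
clinton_plan = 110_000_000_000     # $110 billion/year Clinton plan

# Assume half the budget buys machines, priced at the high end of the
# text's "several thousand to tens of thousands of dollars" estimate.
machine_cost_high = 50_000
machines_at_high = (annual_budget / 2) / machine_cost_high   # 50,000 machines

share_of_clinton = annual_budget / clinton_plan              # ~4.5%
```

Even at the high-end machine price, half the budget buys 50,000 machines, so "tens of thousands of machines plus a research program" is consistent with the $5 billion figure.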

Many medical tests are priced so that, based on the number of tests performed, the research costs of developing the test are recovered with profit. A high degree of automation and high test volumes would allow for negotiating reduced per-test prices with the intellectual-property holders while still allowing a sufficient return on the research.

November 12, 2007

Status of air taxi services

Here is the Wikipedia list of very light jet (VLJ) operators. Many of the VLJ operators are starting "air taxi services".

DayJet, which is flying 12 aircraft in Florida, has an on-demand service.

DayJet brings affordable, on-demand regional jet service to poorly connected businesses, communities and individuals across the Southeast. Today’s service launch directly links Gainesville to an initial four Florida DayPort™ airports, including Boca Raton, Lakeland, Pensacola and Tallahassee. Gainesville business travelers can now book just the seat they need aboard DayJet’s fleet of Eclipse 500™ very light jets (VLJs); customize travel according to their time and budget requirements; fly point-to-point between designated DayPort airports; and return home in a single day. Prices start at a modest premium to full-fare economy coach airfares.

The Eclipse 500 set the NAA speed record on October 7, 2007 for a flight from New York (Westchester) to Atlanta (Peachtree-Dekalb), with a new record time of one hour, 55 minutes, and eight seconds (1:55:08), averaging 393.32 miles per hour (341.79 knots). The previous record holder, a Cessna Citation Mustang set the record on September 22, 2007, flying the same route in two hours, 23 minutes, and 44 seconds (2:23:44), averaging 318.87 miles per hour (277.09 knots). The Eclipse 500 exceeded the previous record time by 20 percent, while using approximately 25 percent less fuel.
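The record figures quoted above can be re-derived in a few lines. Nothing here is new data; this just checks the 20 percent claim and the route distances implied by the quoted average speeds.

```python
# Re-deriving the Eclipse 500 vs. Citation Mustang record comparison
# from the times and average speeds quoted in the text.

def hours(h, m, s):
    """Convert an h:m:s flight time to decimal hours."""
    return h + m / 60 + s / 3600

eclipse_time = hours(1, 55, 8)     # Eclipse 500: 1:55:08
mustang_time = hours(2, 23, 44)    # Citation Mustang: 2:23:44

# Implied route distances from the quoted average speeds (miles)
eclipse_dist = 393.32 * eclipse_time   # roughly 755 miles
mustang_dist = 318.87 * mustang_time   # roughly 764 miles

# "Exceeded the previous record time by 20 percent"
improvement = (mustang_time - eclipse_time) / mustang_time
```

The improvement works out to about 19.9 percent, matching the quoted "20 percent", and both implied distances land near the roughly 755-mile Westchester-to-Atlanta route.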

The Financial Times discusses the new air taxi business. DayJet is aiming to serve 40 airports within three years. It is also studying an eventual launch of similar services in Europe.

The US transportation department is also looking at how VLJs could provide a radical solution to the declining airline service to rural communities.

Linear Air is aiming for whole-aircraft operations when its Eclipse 500s arrive, launching from a small airport near Boston and targeting bases near New York, Washington DC and then on the West Coast. Linear Air plans to add between 100 and 300 VLJs within 5 years for its air taxi service, and plans to have a fleet of 30 VLJs within 2 years.

Pogo Air, a start-up with managers including former American Airlines chief Bob Crandall, is also eyeing the east coast corridor with Eclipse 500s.

GlobeAir is planning to have about 30 VLJs in 2008 and 2009, and is planning to fly in Austria in the summer of 2008.

ImagineAir is flying in Georgia and plans to have 25 VLJs by the summer of 2008.

There are a large number of additional VLJs that are likely to be certified from 2008-2010.

I had predicted in early 2006 that there would be jet air taxi services in 2006-2008. There are likely to be several hundred VLJs, and possibly over one thousand, flying by the end of 2008. Air taxi services will be flying in several additional states and countries by the end of 2008.

It appears that by the end of 2008, Dayjet will be servicing as many as 40 airports.

Finex steel process better for environment and lower cost

Posco, the South Korean steel company, is preparing to expand abroad and overtake Nippon Steel of Japan as the world's third-biggest steelmaker, and its groundbreaking Finex technology is central to its plans. The company is a leader in revolutionising the steel-making process, becoming the first to commercialise next-generation Finex technology, which is both cleaner and cheaper than traditional blast furnaces. Finex facilities are also about 20 per cent cheaper to build than traditional blast furnaces, and they produce steel for about 15 per cent less.

Finex steel process compared to blast furnace

At its main base in Pohang, Posco's Finex test plant is running on a commercial basis, producing 200,000 tonnes more than its 600,000-tonne annual capacity, and the main plant is capable of 1.25m tonnes a year. Posco is now looking to install Finex at plants in Vietnam, India and perhaps even China. [Global steel production is about 1.2 billion tonnes per year]

Mr Lee said: “The Finex technology is good to take abroad because it is cheaper to build Finex plants and the production cost is also cheaper as we can use low-grade iron ore and coal to produce steel with the technology.”

Energy savings from Finex processes

Its relative environmental friendliness is also an attraction for international customers, he said. Finex technology reduces air pollution by up to 99 per cent.

With a cheap iron ore source, Finex technology and fast-growing demand on its doorstep, Posco expects costs at its India project to be 30 per cent below those in Korea.

“This implies about 48 per cent operating margin for its [hot rolled steel] business,” UBS analysts said.

The Finex process for making steel is better for the environment and lower cost.

The Finex process uses less expensive power-station coal and fine iron ore which is available all over the world (approximately 80 % of iron deposits in the world). Gaseous emissions containing dust, sulfur and nitrogen oxide can be reduced by an average of 90 % with the Finex process compared to previous production methods.

The average water requirement of 155 m³ per metric ton of crude steel can now be reduced to 30 m³ per metric ton by means of water treatment and use of a water circuit. Indeed, outstanding consumption levels of 2.7 m³ of water per metric ton of steel are already being achieved.

The steel industry is responsible for 5 to 12 per cent of all CO2 emissions.

Finex powerpoint detailing energy efficiency and pollution reduction

A case of technological acceleration: faster communication makes a higher-resolution telescope

Higher-speed communication between parts of a telescope and to scientists enables larger telescopes because "more pixels generate more data, and you have to have a way to move more data around."

The boost in speed makes information processing faster among the James Webb Space Telescope's (JWST) four science instruments as they "talk" to each other with the SpaceWire network. That means the infrared telescope, NASA's next great observatory, should capture larger and higher resolution images of space.

Possible cloning of primate embryos

For the first time, scientists have created dozens of cloned embryos from adult primates. [details have not yet been published in a peer reviewed journal]

The scientists who carried out the latest primate work are believed to have tried to implant about 100 cloned embryos into the wombs of around 50 surrogate rhesus macaque mothers but have not yet succeeded with the birth of any cloned offspring.

However, one senior scientist involved in the study said that this may simply be down to bad luck – it took 277 attempts, for instance, to create Dolly the sheep, the first clone of an adult mammal.

The work was led by Shoukhrat Mitalipov, a Russian-born scientist at the Oregon National Primate Research Centre in Beaverton. Dr Mitalipov helped to pioneer a new way of handling primate eggs during the cloning process, which involved fusing each egg with a nucleus taken from a skin cell of an adult primate.

Dr Mitalipov said he was unable to comment on the study until it was published in the journal Nature. But he told colleagues at a scientific meeting this year that he had made two batches of stem cells from 20 cloned embryos and tests had shown they were true clones.

Professor Alan Trounson of Monash University in Australia said Dr Mitalipov's findings represented the long-awaited breakthrough. Despite many attempts, no one had been able to produce cloned primate embryos from adult cells, yet this had been done on dozens of other non-primate species. "This is 'proof of concept' for the primate. It has been thought by some [to be too] difficult in monkeys – and humans – but those of us who work [with] animals such as sheep and cattle thought that success rates would be much like that achieved in these species," Professor Trounson said.

"Mitalipov's data confirms this. They have the skills necessary and we can now move on to consider what might be able to be achieved in humans."

Professor Don Wolf, who led the laboratory at the Oregon National Primate Research Centre before his recent retirement, said the new procedure was based on a microscopic technique that does not use ultraviolet light and dyes, which appear to damage primate eggs.

"We're the first to do it, although it's a tainted subject because of the fraudulent research that came out of South Korea. One can never be sure but there may be some validity to what the South Koreans did. But this would now be the first documented therapeutic cloning in a primate," he added.

Two stem cell advances: one for vascular treatment, another for controlled release of cells

South Korean scientists said Monday they used human embryonic stem cells to treat mice suffering from a vascular disease, in an experiment that could lead to cures for strokes and other ailments.

The stem cells were differentiated into blood vessels that were grafted onto the animals afflicted with ischemia. Ischemia is caused by a shortage of blood to a part of the body, stemming from the constriction of blood vessels. Of the 11 mice treated, four developed new vascular cells that fully revived the damaged limb, while four suffered from a relatively mild case of necrosis. Three lost their legs due to the cut-off of blood flow. 10 other mice given alternative treatment failed to recover.

Engineers at Rensselaer Polytechnic Institute have transformed a polymer found in common brown seaweed into a device that can support the growth and release of stem cells at the site of a bodily injury or at the source of a disease.

“We have developed a scaffold for stem cell culture that can degrade in the body at a controlled rate,” said lead researcher Ravi Kane, professor of chemical and biological engineering. “With this level of control we can foster the growth of stem cells in the scaffold and direct how, when, and where we want them to be released in the body.”

Kane and his collaborators, who include the paper's author, former Rensselaer graduate student Randolph Ashton, created the device from a material known as alginate. Alginate is a complex carbohydrate found naturally in brown seaweed. When mixed with calcium, alginate gels into a rigid, three-dimensional mesh.

The device could have wide-ranging potential for use in regenerative medicine, Kane explains. For example, the scaffolds could one day be used in the human body to release stem cells directly into injured tissue. Kane and his colleagues hope that the scaffold could eventually be used for medical therapies such as releasing healthy bone stem cells right at the site of a broken bone, or releasing neural stem cells in the brain where cells have been killed by diseases such as Alzheimer’s.

In order to control the degradation of the alginate scaffold, the researchers encapsulated varying amounts of alginate lyase into microscale beads, called microspheres. The microspheres containing the alginate lyase were then encapsulated into the larger alginate scaffolds along with the stem cells. As the microspheres degraded, the alginate lyase enzyme was released into the larger alginate scaffold and slowly began to eat away at its surface, releasing the healthy stem cells in a controlled fashion.

D-Wave Systems 28-qubit system coverage

Nanowerk covers the D-Wave Systems demo of a 28-qubit quantum computer system.

"Advancing the machine to 28 qubits in such a short space of time lends credibility to our claim of having a scaleable architecture," stated Herb Martin, D-Wave's CEO. "Our product roadmap takes us to 512 qubits in the second quarter of 2008 and 1024 qubits by the end of that year. At this point we will see applications performance far superior to that available on classical digital machines."

D-Wave will demonstrate an image matching application developed with a third party collaborator. Company personnel will be available to discuss other applications involving pattern matching, constrained search and optimization, according to Martin.

D-Wave plans to deploy the machine, code named "Orion", in the last quarter of 2008 using an on-line service model and providing support for applications involving pattern matching, constrained search and optimization.
D-Wave claims that in June 2009 the on-line quantum computing service will be available for "Monte Carlo" simulation targeted at pricing and risk analysis in the banking and insurance community. This will be followed by a quantum simulation capability for chemical, material and life science applications. Users of the on-line service will come from government, military, academia, research, engineering, life sciences, manufacturing, banking and insurance, according to Martin. "Today, many applications take inordinate amounts of time to develop solutions and accuracy is often sacrificed for timeliness. Our on-line service will provide a cost-effective means to improve these applications so that more accurate solutions can be obtained in a significantly shorter time period. In addition, some potential applications are never undertaken because of the limits inherent in digital computing. D-Wave will open up satisfactory solutions to these so-called intractable problems," said Martin.

The company recently initiated a program to share some of its experimental results with scientists at chosen institutions. D-Wave's Dr. Mohammad Amin is leading this program with presentations during the next month at MIT, NRC and the Quantum Information Centre.

EETimes discusses D-Wave's collaboration with Google's expert on its forthcoming search-by-image capability, acquired by Google last year when it bought Neven Vision. D-Wave Systems Inc. (Vancouver, B.C.) will demonstrate how quantum computers can perform Neven-based image-recognition tasks at speeds rivaling those of humans.

"We have been collaborating with Hartmut Neven, founder of the image-recognition company Neven Vision, since just after Google acquired it last year," said Rose. "Neven's original algorithms had to make many compromises on how they did things, since ordinary computers can't do things the way the brain does. But we believe that our quantum computer algorithms are not all that different from the way the brain solves image-matching problems, so we were able to simplify Neven's algorithms and get superior results."

For the demonstration, the D-Wave quantum computer analyzes a 300-image database, cataloging the similarities among photos. The results of that comparison are then displayed on a two-dimensional grid, where similar objects are grouped together.

"We hope to have our commercial architecture ready by mid-2008," said Rose. "It will house enough qubits to begin solving mathematical problems that are intractable today. D-Wave's current prototypes are not amenable to scaling up to hundreds of qubits, but with the knowledge we've gained over the last year, we feel that the last remaining technical obstacles to life-size quantum computers have been removed."

CNET also has coverage

D-Wave has not had its system externally validated, said Rose, because "there is only one meaningful measure of validation for a technology like this: does it outperform the systems people are using today in a metric that they care about? We are getting very close to achieving this objective."

I have extensive coverage of quantum computers

Much of that coverage is about D-Wave Systems

Possible Disruptive technologies for supercomputers

First introduced at SC06 as the Exotic Technologies Initiative, the Disruptive Technologies activity will return to SC07. Each year, DT will serve as a forum for examining those technologies that may significantly reshape the world of HPC in the next five to fifteen years, but which are not common in today's systems.

A disruptive technology is a technological innovation or product that eventually overturns the existing dominant technology or product in the marketplace. This year's SC07 (Supercomputing 2007) showcase will feature quantum computing, optical interconnects, CMOS photonics, carbon nanotube memory (Nantero NRAM), and software for massively parallel multicore processors.

IBM Research has developed an optical printed circuit board technology consisting of chip-like optical transceivers (currently supporting 16+16 optical channels at 12.5 Gbps each) and polymer waveguides on circuit cards. The technology would be disruptive in that it would replace today's high-cost optical modules based on glass fiber with mass-manufacturable "optical printed circuit boards" for short backplane and card-level links. Although polymer-based waveguides have higher losses than glass fiber, the ability to use lithographic processes to mass-produce this technology, coupled with the use of chip-like optical components, will allow a low-cost solution for this ultra-short interconnect application. IBM is working to develop a supplier ecosystem to mature this technology in the next 5 to 7 years.

I have already been covering the D-Wave Systems quantum computer and Nantero NRAM.

Luxtera developed a breakthrough nanophotonic technology that enables manipulation of both photons and electrons on a single semiconductor CMOS die and can be produced in high-volume, low-cost mainstream CMOS processes. This breakthrough silicon photonics technology enables connection of fiberoptic cable directly to a silicon die.

ETI has developed a disruptive technology for many-core system software and logic co-verification. A complete system may contain many such chips (e.g. 160 64-bit cores on a chip, and many chips in a system, as in the IBM Cyclops-64 supercomputer).

Near term longevity and best bets for the longer term

What can be done now for longevity? And what are the best and under-funded opportunities for the future?

In terms of longevity now or very near:
1. Don't eat garbage, and exercise. Some sites cover the longevity-enhancing lifestyle changes that give a better chance at longevity.

2. There seems to be some promise with calorie restriction (though many people are unable to incorporate it into their lifestyles). There is also an alternate-day semi-fasting approach, which could be easier to adopt, and there is work on drugs and gene therapy to achieve these effects without lifestyle modification. Calorie restriction has been linked to mitochondria, one of the seven pillars of SENS.

So it seems that SENS is an avenue worth pursuing even based on current and near term work and research.

3. Take the tests needed to detect cancer and heart disease early and then make the lifestyle changes or take the necessary medical interventions. Find out your own higher risks based on family history.

4. Look at modifying the environment (air pollution) and public health to reduce chronic disease.

However, besides the near-term steps for important small gains, I also look at the high-potential and underfunded opportunities. This is where a little more effort can bring a lot more rewards. So the first two on the list are SENS and molecular manufacturing.

I also try to find overlooked ways to use old or near-term technology and processes. An example is my proposal for devices to gather a lot more mineable medical data for research and for the development of personalized medicine.
