December 08, 2006

DNA nanorobotic arm array

From Science Daily, more on Ned Seeman's DNA nanorobotic arms.

NYU chemistry professor Nadrian C. Seeman and his graduate student Baoquan Ding have developed a DNA cassette -- shown here as red L-shaped structures -- containing a nanomechanical device that can be inserted within a DNA array and function there. (Image courtesy of New York University)

The results pave the way for creating nanoscale "assembly lines" in which more complex maneuvers could be executed.

The signals that control the nanomechanical tool are DNA rather than RNA. If this were scaled up, activation with DNA could work with MEMS and NEMS systems that feed different activation DNA from thousands or millions of parallel channels. Massively parallel dip pen arrays currently have 55,000 pens, with 1,000,000 parallel pens targeted. Labs on a chip also have thousands of microchannels.

There are also light activated molecules.

The dimensions of the machine are approximately 110 x 30 x 2 nm.

How precise could the nanorobotic arms get? The more precise they are, the smaller the molecules they could manipulate.
How could they be best used to precisely catalyze desired molecular reactions?
How much could the DNA chemistry be extended?

DNA origami

DNA nanoactuator for interfacing DNA with electronics

DNA building blocks

Proposed cubic micron DNA system

DNA used to assemble carbon nanotubes

DNA chemistry with polymers: DNA nanotechnology chemistry can be extended with polymers, variant DNA, and interaction with other molecules.

Carbon nanotube muscles created 100 times stronger than natural muscle

Spinning carbon nanotubes into yarn a fraction of the width of a human hair, researchers have developed artificial muscles that exert 100 times the force, per area, of natural muscle. This is according to Ray Baughman, director of the Nanotech Institute at the University of Texas at Dallas, who presented the research in Boston last week at the Materials Research Society conference.

Artificial muscles--actuators based on such materials as certain types of metals and polymers that shrink, grow, or change shape--are useful for prosthetic limbs, microscale machines, and robots.

Several issues still must be corrected. As greater loads are applied to actuators, they can start to exhibit "creep"--that is, they do not completely return to their original state with successive cycles. Baughman says that before these actuators can be useful, creep must be eliminated. "Under load, the cycle is not reversible--you've got a little creep. In most actuator applications, you don't want any creep."

Another key issue is scaling up from thin individual threads. Although the carbon-nanotube muscles can outperform natural muscles on a per-area basis, exerting 100 times the force, natural muscles are much larger, making them stronger.

DNA logic gates designed and created

California Institute of Technology researchers successfully combined up to 12 different DNA logic gates in five cascading levels, although the process takes hours, they report in the December 8 Science.

A group of so-called logic gates performs each operation. In the A AND C operation, for example, one gate would consist of strand B intertwined with strand A', which prefers strand A to strand B. When researchers introduce A into a test tube containing this gate, A' exchanges B for A, leaving B floating free.

To yield D as an output, researchers would add strand C to the test tube along with a second logic gate that contains strand D intertwined with two other sequences. One of these sequences latches onto B, the other onto C, and D then floats free, as intended.
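The cascade logic described above can be sketched as a toy simulation. This models each gate as "release the output strand once all required input strands are free"; the strand names follow the article, but the code structure is purely illustrative, not the actual chemistry of the Caltech system.

```python
# Toy model of the strand-displacement AND gate described above.
# A gate releases its output strand only when all required inputs are free.

def run_gates(inputs, gates):
    """Repeatedly fire any gate whose required strands are all free."""
    free = set(inputs)
    fired = True
    while fired:
        fired = False
        for required, output in gates:
            if required <= free and output not in free:
                free.add(output)
                fired = True
    return free

# Gate 1: input A displaces B from the A'/B complex, freeing B.
# Gate 2: B and C together release D  ->  overall, D = A AND C.
gates = [
    (frozenset({"A"}), "B"),
    (frozenset({"B", "C"}), "D"),
]

print(run_gates({"A", "C"}, gates))  # both inputs present: D is released
print(run_gates({"A"}, gates))       # C missing: D stays bound
```

Because outputs go back into the same pool of free strands, the loop naturally captures the cascading property Winfree emphasizes: D could itself be a required input of a further gate.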

The new system can perform relatively complex sequences of operations because it allows the output strand of one operation, such as D, to serve as the input for another logical operation. "The ability to do sophisticated computations relies on the ability to build [these] networks," Winfree says. "We've opened the door to being able to build quite large and complex systems." Other approaches to DNA computing, such as a system that plays tic-tac-toe, rely on gates made from DNA- or RNA-based enzymes, which have not yet proven as capable of turning their own outputs into inputs.

A crucial part of combining so many gates is purifying noisy input signals, Winfree says. In electronic circuits a whole range of voltages, say 0 to 0.5 volt, would all represent a single input. To accomplish the same effect his group designed gates that act as thresholds, soaking up stray strands until they reach a preset concentration. Other gates amplified correct but weak signals by producing more of a given strand.

More on Nadrian Seeman DNA arms:
Nadrian Seeman and Baoquan Ding of New York University inserted into gaps specially designed DNA cassettes, each of which contains a flipper that swivels from a fixed point on the cassette. Each flipper can project from the array's surface in one of two different directions, depending on input strands of DNA that are added to the cassettes.

For now the flippers, about 100 total per array, all swivel identically in unison like windshield wipers, but in principle they could be oriented in other ways and controlled individually by specific input strands, Seeman says.

December 07, 2006

Plasmons - bridging optics and electronics

New light physics with plasmons could bridge light, matter and electronics. Plasmon computers could operate at 100 to 1,000 terahertz, or 20,000 to 200,000 times faster than mainstream computer chips. This is part of what I think is the growing trend of greater control of all information, light, energy, matter and magnetism (ILEMM).

Two-dimensional light, or plasmons, can be triggered when light strikes a patterned metallic surface. Plasmons (wikipedia) may well serve as a proxy for bridging the divide between photonics (high throughput of data but also at the relatively large circuit dimensions of one micron, or one thousandth of a millimeter) and electronics (relatively low throughput but tiny dimensions of tens of nanometers, or millionths of a millimeter).

One might be able to establish a hybrid discipline, plasmonics, in which light is first converted into plasmons, which then propagate in a metallic surface but with a wavelength smaller than the original light; the plasmons could then be processed with their own two-dimensional optical components (mirrors, waveguides, lenses, etc.), and later plasmons could be turned back into light or into electric signals.

Plasmon microscope:

Igor Smolyaninov (University of Maryland) reported that he and his colleagues were able to image tiny objects lying in a plane with spatial resolution as good as 60 nm (when mathematical tricks are applied, the resolution becomes 30 nm) using plasmons that had been excited in that plane by laser light at a wavelength of 515 nm. In other words, they achieve microscopy with a spatial resolution much better than diffraction would normally allow; furthermore, this is far-field microscopy -- the light source doesn't have to be located less than a light-wavelength away from the object.

This work is essentially a Flatland version of optics. They use 2D plasmon mirrors and lenses to help in the imaging and then conduct plasmons away by a waveguide.
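For a sense of how far beyond the normal limit this is, the reported resolution can be compared against the conventional far-field bound for 515 nm light. This sketch assumes the simple Abbe limit d = λ/2 (i.e. a numerical aperture of 1), which is my own back-of-the-envelope assumption, not a figure from the report:

```python
# Compare the conventional diffraction limit with the reported
# plasmon-assisted resolution (Abbe limit with NA = 1 assumed).

wavelength_nm = 515.0            # excitation laser wavelength from the article
abbe_limit_nm = wavelength_nm / 2
reported_nm = 60.0               # resolution reported by Smolyaninov's group

print(f"Abbe diffraction limit:      {abbe_limit_nm:.0f} nm")
print(f"Reported plasmon resolution: {reported_nm:.0f} nm")
print(f"Improvement factor:          {abbe_limit_nm / reported_nm:.1f}x")
```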

Future plasmon circuits at optical frequencies:

Nader Engheta (University of Pennsylvania) argued that nano-particles, some supporting plasmon excitations, could be configured to act as nm-sized capacitors, resistors, and inductors -- the basic elements of any electrical circuit.

The circuit in this case would be able to operate not at radio (10**10 Hz) or microwave (10**12 Hz) frequencies but at optical (10**15 Hz) frequencies. This would make possible the miniaturization and direct processing of optical signals with nano-antennas, nano-circuit-filters, nano-waveguides, nano-resonators, and may lead to possible applications in nano-computing, nano-storage, molecular signaling, and molecular-optical interfacing.

More physics papers on plasmons


New sound physics: hypersound and acoustic lasers

Radically new and powerful things are being done with sound. This is part of what I think is the growing trend of greater control of all information, light, energy, matter and magnetism (ILEMM). Molecular nanotechnology and advanced metamaterials would be milestones in the control of matter, which will also provide greater control of energy. Better material devices will also mean better capabilities to investigate and understand phenomena.

Hypersound, acoustic pulsation at 200 gigahertz frequencies, has been produced in the same kind of resonant multilayered semiconductor cavity as used in photonics.
The researchers believe that a new field, nanophononics, has been inaugurated, and that the acoustical properties of semiconductor nanodevices will become more prominent.
THz sound might also participate in the development of powerful "acoustic lasers" or in novel forms of tomography for imaging the interior of opaque solids.
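One reason semiconductor multilayers work for hypersound is that the acoustic wavelength at these frequencies shrinks to nanometer layer thicknesses. A rough estimate, assuming a longitudinal sound speed of about 5000 m/s (typical for GaAs-like semiconductors; the actual material parameters are not given in the summary above):

```python
# Wavelength of 200 GHz hypersound, assuming ~5000 m/s sound speed.
# Both numbers are order-of-magnitude assumptions for illustration.

sound_speed_m_s = 5000.0
frequency_hz = 200e9

wavelength_nm = sound_speed_m_s / frequency_hz * 1e9
print(f"Acoustic wavelength: {wavelength_nm:.0f} nm")
```

A ~25 nm wavelength is exactly the scale of epitaxially grown layers, which is why the same kind of resonant multilayer cavity used in photonics can confine these phonons.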

A new kind of acoustic laser, or SASER (sound amplification by stimulated emission of radiation), is the acoustic analog of a laser. Instead of a feedback-built potent wave of electromagnetic radiation, a saser would deliver a potent ultrasound wave.
In lasers the light buildup is maintained by a reflective optical cavity. In the U.K.-Ukraine saser, the acoustic buildup is maintained by an artful spacing of the lattice layer thicknesses in such a way that the layers act as an acoustic mirror.

Eventually the sound wave emerges from the device in a narrow angular range, as do laser pulses. The monoenergetic nature of the acoustic emission, however, has not yet been fully probed. The researchers believe their saser is the first to reach the terahertz frequency range while also using modest electrical power input.

Further reading:
More physics papers on hypersound

More physics papers on acoustic lasers

More physics papers on metamaterials


Ned Seeman makes 'nanorobotic' arm to operate within DNA sequence

This is a major development for enabling mechanical control at the molecular level. Robert Freitas (author of Nanomedicine) said that he felt this was an evolutionary step towards the creation of a factory-style 'mechanical ribosome' system that could assemble biological parts via positional assembly.

New York University chemistry professor Nadrian C. Seeman and his graduate student Baoquan Ding have developed a DNA cassette through which a nanomechanical device can be inserted and function within a DNA array, allowing for the motion of a nanorobotic arm. The results, reported in the latest issue of the journal Science, mark the first time scientists have been able to employ a functional nanotechnology device within a DNA array.

The device, developed with NYU Chemistry graduate student Shiping Liao, emulates the process by which RNA replicas of DNA sequences are translated to create protein sequences. However, the signals that control the nanomechanical tool are DNA rather than RNA. The dimensions of the machine are approximately 110 x 30 x 2 nm.

In this study, Seeman and Ding developed a framework that contains a binding site -- a cassette -- that allows insertion of the device into a specific site of a DNA array. Changing the cassette's control sequences or insertion sequences would allow the researchers to manipulate the array or insert the device at different locations. The researchers added a long arm to the framework so that they could observe the structure undergoing a half-rotation. They visualized their results by atomic force microscopy (AFM), which permits features a few billionths of a meter in size to be visualized.

2030 Energy forecast is 50% more coal usage

Coal consumption is projected to increase 50% in the AEO2007 reference case, particularly for electricity generation. It is projected to grow from 22.9 quadrillion Btu (1,128 million short tons) in 2005 to more than 34 quadrillion Btu (1,772 million short tons) in 2030, with significant additions of new coal-fired generation capacity over the last decade of the projection period, when natural gas prices are projected to rise.
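The quoted figures can be sanity-checked against each other. The tonnage growth is actually closer to 57%, and the implied average heat content per ton declines slightly, which would be consistent with a shift toward lower-rank coals in the projection (my inference, not a claim from the AEO2007 report):

```python
# Sanity check on the AEO2007 coal figures quoted above.

btu_2005, tons_2005 = 22.9e15, 1_128e6   # Btu and short tons, 2005
btu_2030, tons_2030 = 34.0e15, 1_772e6   # Btu and short tons, 2030

growth = tons_2030 / tons_2005 - 1
print(f"Tonnage growth 2005-2030: {growth:.0%}")

# Implied average heat content, in million Btu per short ton.
print(f"2005: {btu_2005 / tons_2005 / 1e6:.1f} MMBtu/ton")
print(f"2030: {btu_2030 / tons_2030 / 1e6:.1f} MMBtu/ton")
```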

Energy Prices are projected.

Energy efficiency (GDP generated per unit of energy) is expected to double.

The carbon emissions picture would keep getting worse.

Coal kills 1,000+ people per day and is a major contributor to climate change. We need a lot more nuclear power until coal and oil are eliminated. Advanced thorium reactors can eliminate the 10,000-year waste problem and the proliferation issues.

Indian President Kalam wants two thirds of nuclear fuel from Thorium

India is targeting 24 GW of nuclear power by 2020 and 50 GW by 2030, with two thirds of the nuclear fuel to come from thorium. As per the present plan of the Bhabha Atomic Research Centre (BARC) and the Nuclear Power Corporation, capacity by 2020 is expected to be increased to 24,000 MW. 'There is a need to plan right from now to increase this capacity to 50,000 MW by 2030,' said Indian President Kalam.
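The targets imply a sustained build-out rate. A simple compound-annual-growth calculation (my arithmetic, not from the article):

```python
# Implied growth rate for India's nuclear capacity targets quoted above.

capacity_2020_gw = 24.0
capacity_2030_gw = 50.0
years = 10

cagr = (capacity_2030_gw / capacity_2020_gw) ** (1 / years) - 1
print(f"Required growth 2020-2030: {cagr:.1%} per year")
```

Roughly 7-8% per year of compounding capacity growth for a decade is an aggressive but not unprecedented pace for a national reactor program.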

In a presentation to the large gathering of technocrats, innovators, industrialists and overseas delegates, Kalam said, 'Implementation of the advanced heavy water reactor (AHWR) project and development of associated fuel cycle facilities will provide industrial scale experience into the handling of thorium.'

The AHWR is to derive two-thirds of its power from thorium and one-third from plutonium generated in fast breeder reactors (FBRs).

India seems to be pro-nuclear and pro-thorium, and is one of the best candidates to eventually transition to molten salt thorium reactors.

December 06, 2006

6 Gbps wireless connection record

The CSIRO ICT Centre today announced that it has achieved over six gigabits per second over a point-to-point wireless connection with the highest efficiency (2.4 bits/s/Hz) ever achieved for such a system.

At the demonstration, the team will transmit 16 simultaneous streams of DVD quality video over a 250 metre link with no loss of quality or delays. This impressive demonstration nevertheless only utilises one quarter of the capacity of the link.

Dr Jay Guo, Director of the Wireless Technologies Laboratory at CSIRO said that this breakthrough is just a first stage towards direct connections of up to 12 gigabits per second.

"The system is suitable for situations where a high speed link is needed but it is too expensive or logistically difficult to lay fibre, such as in congested urban environments, and across valleys and rivers," Dr Guo said.

The system operates at 85 GHz in the millimetre-wave part of the electromagnetic spectrum (above 55 GHz), which offers the potential for these enormous speeds and is not yet congested by other uses.
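The announced figures imply a few other numbers worth noting. Dividing the link rate by the spectral efficiency gives the occupied bandwidth, and the "16 streams on one quarter of the link" demo pins down a per-stream rate (my arithmetic from the quoted figures):

```python
# Back-of-the-envelope numbers implied by the CSIRO demonstration figures.

link_rate_bps = 6e9          # >6 Gbps point-to-point link
efficiency = 2.4             # bits/s/Hz spectral efficiency
bandwidth_hz = link_rate_bps / efficiency
print(f"Implied channel bandwidth: {bandwidth_hz / 1e9:.1f} GHz")

# 16 video streams reportedly used one quarter of the link's capacity.
per_stream_bps = link_rate_bps / 4 / 16
print(f"Per-stream rate: {per_stream_bps / 1e6:.0f} Mbps")
```

A 2.5 GHz-wide channel is easy to find at 85 GHz, which is part of why the millimetre-wave band supports these rates.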

First molecular simulation of a long DNA strand shows unexpected flexibility

Virginia Tech researchers used novel methodology and the university's System X supercomputer to carry out what is probably the first simulation to explore the full range of motions of a DNA strand of 147 base pairs, the length required to form the fundamental unit of DNA packing in living cells -- the nucleosome.

The finding of more flexibility in DNA is interesting, but even more so is that computing power and methodology have come to the point where molecular simulations can fully probe biology on timescales very relevant to living things, such as DNA packing. Molecular simulations are also relevant to advancing molecular nanotechnology.

Faster read and write using gallium ions for Spintronics

Spintronic memory advance: New research suggests a new route for developing high density magnetic memory chips which will not lose information when the power is switched off. For the first time data will be written and read very fast using only electrical currents.

Physicists at the Universities of Bath, Bristol and Leeds have discovered a way to precisely control the pattern of magnetic fields in thin magnetic films, which can be used to store information. The key advance of the recent research has been in developing ways to use high energy beams of gallium ions to artificially control the direction of the magnetic field in regions of cobalt films just a few atoms thick. If the approach at Bath is developed commercially, this would allow the manufacture of magnetic memory chips with much higher packing densities, which can operate many times faster.

20% efficient thin film sliver solar cells from Australia

Professor Andrew Blakers, director of the Centre for Sustainable Energy Systems at the Australian National University, says 'sliver technology' could reduce the price of solar power by 60% and is over 20% efficient.

The system works by taking a standard solar cell about 1 millimetre thick and cutting it into tiny slices that are just 120 micrometres wide.

"Imagine a standard solar cell is a loaf of bread. When you put it out in the sun it generates energy based on its surface area," Blakers says.

"Now imagine you cut that loaf up into slices and lay them horizontally. You get a lot more surface area."

This technique allows researchers to use much smaller amounts of expensive silicon to generate the same amount of electricity. This can also keep manufacturing costs down, as all the processing steps normally carried out on solar cells are done while the slices are still in the 'loaf'.
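The "loaf of bread" arithmetic is simple: cutting a 1 mm thick wafer into 120-micrometre slivers and laying them flat multiplies the light-collecting area obtainable from the same silicon.

```python
# Surface-area gain from slicing a wafer into slivers, per the
# dimensions quoted above (1 mm thick cell, 120 micrometre slices).

wafer_thickness_um = 1000.0
sliver_width_um = 120.0

area_multiplier = wafer_thickness_um / sliver_width_um
print(f"Surface area gain: ~{area_multiplier:.1f}x from the same silicon")
```

An ~8x gain in collecting area per unit of silicon is the core of the claimed 60% cost reduction, since silicon dominates cell cost.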

"We're looking at major reductions in the total cost without the need for major scientific breakthroughs," Blakers says.

Further developments would be needed, such as figuring out how to cut thinner slivers, he says.

Supercomputers needed for nanometer chip design

From EETimes: nanometer chip design is becoming so compute-intensive that it needs supercomputer-like capability, according to Mentor Graphics Corp. and Mercury Computer Systems Inc. In particular, optical proximity correction (OPC) can best be performed by hardware acceleration based on the Cell Broadband Engine processor, the companies claim.

Mentor and Mercury had looked at a number of possible ways to speed OPC computations, including FPGAs, DSPs and general-purpose processors. The Cell emerged as the best choice, Skalabrin said. "There's a very significant gain. Some algorithms are 50 to 100 times faster."

Even at 65 nanometers, according to Sawicki, some customers are using 1,000 processor nodes to run OPC--and taking days to do it. Some are talking about needing 2,000 nodes for 45 nm, "an unacceptable explosion in the cost of ownership," Sawicki said.

The Cell's strength is rapid image processing. Compared with an Opteron processor, the Cell can run the fast Fourier transforms (FFTs) used to speed OPC simulation much faster, he said.

The Dual Cell-Based Blade offers peak performance of 400 gigaflops, features two 3.2-GHz Cell BE processors and includes 512 Mbytes of XDR DRAM memory per Cell BE processor. Mercury has mapped key signal-processing algorithms onto the blade.
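The reason FFT hardware matters for OPC is the asymptotic gap between direct and FFT-based convolution: simulating the lithographic image involves large 2-D convolutions, where direct evaluation scales as O(N^4) but the FFT route scales as roughly O(N^2 log N). A rough operation-count comparison (an illustrative scaling argument with an assumed constant factor, not Mentor/Mercury's actual workload):

```python
import math

# Rough operation counts for convolving an N x N image with an N x N
# kernel: direct is O(N^4); FFT-based is O(N^2 log N) with some constant.

def direct_ops(n):
    return n ** 4

def fft_ops(n):
    return n ** 2 * math.log2(n) * 10   # constant factor of ~10 assumed

for n in (256, 1024, 4096):
    print(f"N={n}: direct/FFT op ratio ~ {direct_ops(n) / fft_ops(n):,.0f}")
```

At the tile sizes OPC works with, the ratio runs into the thousands, which is why a processor built for fast FFTs can shrink thousand-node CPU farms.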

Newsweek reviews Resveratrol

The latest Newsweek magazine has an article about life extension, Resveratrol and Sirtuins by David Sinclair, Ph.D. and Anthony L. Komaroff, M.D.

At the end, the article makes the case for a crash research program for life extension. A permanent 1% reduction in cancer mortality would be worth nearly $500 billion. This is the life extension dividend. The SENS program would be a good starting point for a crash program for life extension.

Some people shudder at the thought of a treatment to extend human life, imagining that the added years would be ones of frailty and of failing intellect and strength. However, the animals that get added time from resveratrol treatment are, by all measures, remarkably vital until the end. It has been estimated that drugs that maintain health and vitality could save the U.S. economy tens of trillions of dollars. For example, a permanent 1 percent reduction in mortality from cancer would have a value to current and future generations of Americans of nearly $500 billion. Many scientists are encouraging Congress to increase funding for aging research, to launch the equivalent of the Apollo program. Only a few humans made it to the moon. In the future, millions may live a century or more, and remain vital and productive during those added years.

Traffic accidents

Freeman Dyson talks about World War 2 and includes some information about traffic accidents.

His point about traffic accidents (which kill 1.2 million people per year worldwide and about 44,000 per year in the USA):
Smeed (Dyson's boss in WW2) also had a fatalistic view of traffic accidents. He collected statistics on traffic deaths from many countries, all the way back to the invention of the automobile. He found that under an enormous range of conditions, the number of deaths in a country per year is given by a simple formula: number of deaths equals .0003 times the two-thirds power of the number of people times the one-third power of the number of cars. This formula is known as Smeed's Law. He published it in 1949, and it is still valid 57 years later. It is, of course, not exact, but it holds within a factor of two for almost all countries at almost all times. It is remarkable that the number of deaths does not depend strongly on the size of the country, the quality of the roads, the rules and regulations governing traffic, or the safety equipment installed in cars. Smeed interpreted his law as a law of human nature. The number of deaths is determined mainly by psychological factors that are independent of material circumstances. People will drive recklessly until the number of deaths reaches the maximum they can tolerate. When the number exceeds that limit, they drive more carefully. Smeed's Law merely defines the number of deaths that we find psychologically tolerable.
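The formula is easy to check. A quick sketch using rough mid-2000s US figures (about 300 million people and 240 million registered vehicles are my approximations, not numbers from Dyson's piece):

```python
# Smeed's Law as stated above: deaths = 0.0003 * P^(2/3) * C^(1/3),
# where P is population and C is the number of cars.

def smeed_deaths(population, cars):
    return 0.0003 * population ** (2 / 3) * cars ** (1 / 3)

predicted = smeed_deaths(300e6, 240e6)   # rough mid-2000s US figures
print(f"Predicted annual deaths: {predicted:,.0f}")
# vs. the ~44,000 actual US deaths per year quoted above -- the
# prediction lands within the "factor of two" accuracy Dyson describes.
```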

If this is true, then more regulations and technology need to be introduced to force safer driving behavior. The best would be safe, completely automated driving systems.

Honda is looking to introduce wing mirror cameras and to network them into a comprehensive traffic surveillance system. This could be used as part of a system to enforce safe driving.

Fuel efficient driving behaviors could also be enforced in autodriving systems.

World Wealth - how to win

An article from the New York Times about the UN report on global wealth distribution

A lot of focus has been on how much is owned by the top tier (40% by the top 1%, 50% by the top 2%, and 85% by the top 10%).

I think it is more important to note how India and China are developing and helping many of their people move up. Poorer nations face many obstacles to amassing wealth, including sketchy property rights and land tenure systems, and underdeveloped financial markets. So it is possible for countries to fix their systems and get onto the development track.

People can also study how individuals become more successful within countries, such as the millionaires and billionaires in the United States. Someone in a developed country can change what they are doing to improve their situation.

Someone in Africa either has to become a successful warlord or has to leave and get to someplace where they can do better. Note: even within developed nations and regions, some places might be a better match to ease your own path to wealth. Location, location, location is not just advice for real estate but for ideally locating yourself to optimize your chances and situation. The report is telling you that some countries are loser countries. Some regions are loser regions. In a lot of places that will not change. Winners will keep on winning. A few places can change, but it will be apparent years in advance who is turning it around.

Someone who is in the United States (or anywhere else) has to look at moving beyond low-skill salary work and master advantages for a successful business or some form of investment. This advice is advantages, advantages, advantages. If you really want to do well, then you must figure out ways to do what you do better than your competitors. Better means having lower costs, being able to sell at higher prices, being able to add value, being able to complete transactions faster, etc. The report is also saying that some people and personal strategies are winners. If you are not winning, then you will stay a loser unless you change.

Comprehensive energy policies

A former secretary of the UK cabinet discusses the need for, and shift to, more comprehensive energy policies that take into account all costs.

A good modified market energy portfolio should take into account the volatility of the availability and price of different fuels.

Natural gas, as the world has witnessed, can fluctuate enormously. In the U.K., the spot price of natural gas doubled between 2004 and 2006. Even more damaging were two price spikes, in which U.K. gas prices briefly rose about 400 percent. Importing nations, in particular, have little recourse if suppliers raise prices suddenly (as Russia’s Gazprom has done) or supplies approach a natural peak (as has been predicted for oil). Other fuels are relatively stable; once reactors are built, the price of nuclear power remains relatively constant. Nuclear power can therefore take the role that bonds play in a pension fund: not necessarily the highest-yielding asset, but one that reduces volatility.
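The "nuclear power as bonds" argument is a portfolio-variance point: blending a price-stable source with a volatile one damps swings in the overall cost of energy. A toy illustration with made-up yearly price series (the numbers are invented for the sketch, not UK data):

```python
import statistics

# Toy illustration of diversification: mixing a price-stable energy
# source with a volatile one reduces volatility of the blended cost.

gas_cost = [40, 55, 90, 160, 70, 45]      # volatile, spiky (like UK gas)
nuclear_cost = [50, 51, 50, 52, 51, 50]   # stable once reactors are built

blend = [0.5 * g + 0.5 * n for g, n in zip(gas_cost, nuclear_cost)]

print(f"Gas-only std dev: {statistics.pstdev(gas_cost):.1f}")
print(f"50/50 blend std:  {statistics.pstdev(blend):.1f}")
```

Because the stable asset barely moves, a 50/50 blend roughly halves the standard deviation, which is exactly the role bonds play in a pension fund.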

Recent analysis conducted by the U.K. government shows that nuclear power would be viable over a wide range of scenarios. It would struggle to compete only if gas prices and the shadow price of carbon were both low. That combination is inherently implausible, however; it would almost certainly lead to a higher shadow price for carbon, bringing nuclear power back into contention.

A consensus is building in Europe and North America with respect to global climate change and energy security, and it is coupled with a growing sense of urgency. We now have a moment of opportunity to create a framework that enables the essential energy choices to be made — not by dictating them, but by providing open competition and building all the relevant factors into the marketplace where choices are made.

December 05, 2006

French physicist critiques gate model quantum computers

From New Scientist, an article reporting that Michel Dyakonov of the University of Montpellier in France believes large scale quantum computing is akin to achieving perpetual motion.

The pdf of the paper is here

He is referring to gate model quantum computers with 1000 to 1,000,000 qubits.

Others think Dyakonov has got it wrong. "It is true that the quantum computing community should be cautiously optimistic, rather than confident," says Andrew Steane of the University of Oxford. "But his arguments are largely misleading."

He is also not talking about adiabatic quantum computers like the one being worked on by Dwave.

December 04, 2006

Military nanotechnology book

Jürgen Altmann has written a book "Military Nanotechnology: Potential Applications and Preventive Arms Control" which has an abstract at this link.

From the abstract:
Military R&D of NT is beginning to expand, with the USA far in the lead. Arguments for such R&D stress the increased military capabilities expected from NT; risks from military applications – to international security, to civilian societies – are rarely taken into account. This work provides a first assessment of potential military applications of NT with a view towards preventive arms control.

Military R&D of NT in the USA spans the full range from electronics via materials to biology. While much of this is still at the fundamental level, efforts are being made to bring applications to the armed forces soon. With above $200 million per year, one quarter to one third of the Federal funding for NT goes to military R&D, and the USA outspends the rest of the world by a factor of 4 to 10.

For assessing and containing the risks of new military technologies, the concept of preventive arms control is used. Considerations about limitation should start whenever a special problem becomes obvious using criteria covering international law, stability and humans/environment/society. By balancing benefits, risks and costs, including considerations of verification, recommendations for limitations are to be derived.

In the first criteria group, new conventional, chemical and biological weapons would jeopardise existing arms-control treaties; armed autonomous systems would endanger the law of warfare. Secondly, stability could decrease with small distributed battlefield sensors and in particular with armed autonomous systems. Arms racing and proliferation have to be feared with all applications. In the third criteria group, the strongest dangers to humans would ensue from armed mini-/micro-robots and new chemical/biological weapons used by terrorists. Negative effects on society could follow indirectly if body manipulation were applied in the military before a thorough societal debate on benefits, risks and regulation.

To contain these risks, preventive limits are recommended in seven areas. They do not focus on NT as such, but include NT applications in a broader, mission-oriented approach. Distributed sensors below several cm size should be banned. Metal-free small arms and munitions should not be developed, the Treaty on Conventional Armed Forces should be kept and updated as new weapons systems would arrive. A moratorium of ten years for non-medical body manipulation should be agreed upon. Armed autonomous systems should optimally be banned, with limits on unarmed ones; if the former is not achievable, at least for the decision on weapon release a human should remain in the loop. Mobile systems below 0.2-0.5 m size should be banned in general, with very few exceptions. A general ban on space weapons should be concluded, with exceptions for non-weapons uses of small satellites. The Chemical and Biological Weapons Conventions should be upheld and strengthened.

The ban on mobile systems below 0.2-0.5 meters in size does not seem possible or realistic to me. Attempting to get an agreement for a ban on space weapons will probably work for a decade, but when technology and access to space improve, I think any ban will be violated. Still, it is worth discussing and trying to get some agreements to try to keep things orderly.

Other reading:
My essay on military nanotechnology

Eric Drexler comments on first NNI report on Molecular Manufacturing

From Eric Drexler's website: Eric's comment on the first report from the National Academies on Molecular Manufacturing (pdf through link).

I had previously commented in September 2006 about the release of the NNI report.

Eric Drexler's Comment on the report

In its conclusion, the committee notes that it is difficult to analyze complex systems intended to build intricate, atomically precise, large-scale products, stating that “the eventually attainable range of chemical reaction cycles, error rates, speed of operation, and thermodynamic efficiencies of such bottom-up manufacturing systems cannot be reliably predicted at this time”, and that “the eventually attainable perfection and complexity of manufactured products, while they can be calculated in theory, cannot be predicted with confidence.”

To advance research from theoretical models to concrete accomplishments, the committee calls for “defining and focusing on basic experimental steps that are critical to advancing long-term goals” and for funding "experimental demonstrations that link to abstract models and guide long-term vision”.

This report, prepared in response to a congressional request, represents the first open, high-level, science-based evaluation of the concept of molecular manufacturing. Not surprisingly, this first evaluation led to the first recommendation that research be supported. For a decade or more, researchers eager to pursue this work have faced a closed door. That door now seems to be opening.

Hopefully the door is opening but the funds are still scarce for site-specific chemistry/molecular manufacturing.

The $3 million brainstorming project from the UK for the Software Control of Matter at the Atomic or Molecular Scale is one of the few funded projects.

Leaders in molecular manufacturing research such as Robert Freitas have trouble raising funds for their nanofactory collaboration work.

The good news is that there is progress in protein engineering, DNA nanotechnology, synthetic biology, molecular manipulation tools and computing power. The advancing wave of technological improvement is making it easier and cheaper to work towards molecular manufacturing. It is just the actual targeted work which still has difficulty getting funding. It is similar to anti-aging research where most of the money must be raised for some other disease related benefit even if it might also treat or research the sources of aging.

December 03, 2006

Successful Quantum computer business impact

Dwave Systems plans to demo a 16-qubit (4×4 grid) superconducting AQC (adiabatic quantum computer) in the first quarter of 2007. Each qubit is connected to its nearest and next-nearest neighbors using tunable couplers. The chip is programmed by setting the values of the biases on each qubit (16 total) and the values of each of the 42 couplers.
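The coupler count follows from the grid geometry: on a 4×4 lattice, nearest-neighbor plus next-nearest (diagonal) pairs come to exactly 42. A quick check (my enumeration of the stated topology, not Dwave's netlist):

```python
from itertools import product

# Count couplers on a 4x4 qubit grid where each qubit couples to its
# nearest and next-nearest (diagonal) neighbors, i.e. all pairs at
# Chebyshev distance 1.

n = 4
qubits = list(product(range(n), range(n)))

couplers = {
    frozenset({a, b})
    for a in qubits for b in qubits
    if a != b and max(abs(a[0] - b[0]), abs(a[1] - b[1])) == 1
}
print(f"Qubits: {len(qubits)}, couplers: {len(couplers)}")
```

The 42 splits into 24 horizontal/vertical links plus 18 diagonals, matching the 16 qubit biases plus 42 coupler values used to program the chip.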

The demo will focus on running two applications on the hardware. One is a planning/scheduling application and the other is a pattern match application for small molecules.

Since Dwave was considering starting out with a 64-qubit machine, I believe that if they hit the Q1 2007 16-qubit target, then by Q4 2007 they will have the 64-qubit machine.

My paid public prediction at Long Bets is 100+ qubits by Dec 31, 2010. I remain confident that Dwave Systems will be the company that makes that prediction come true. I think 100+ qubits will happen in 2008.

Logistics-dependent companies such as airlines, bus, trucking and package delivery companies will benefit from more efficient routing. Stocks in and related to the Dow Jones Transportation index could benefit.

Biotechnology and nanotechnology companies will benefit from the improved quantum simulation capabilities. The Nasdaq biotechnology index lists prominent biotech companies

The routing and infrastructure of information delivery for computer networks could also be optimized. There is also optimization potential in the layout of computer chips.

I think there will be significant efficiency gains that will be captured in the 2008-2010 timeframe with 64-1024 qubit quantum supercomputers.

Computer security companies and computer security consultants will benefit from new work to make computer security quantum-computer resistant. Banks, governments and financial institutions will bear the expense of converting their systems to the new algorithms.
