July 22, 2006

Several big 30-meter to 60-meter telescopes are funded

These are welcome developments, as we currently cannot see well outside of our solar system.

As I have noted in my predictions, if we can combine hypertelescope concepts (very large space arrays), New Worlds Imager ideas (satellites that null the light from stars so that their planets can be seen) and magnetic inflation of really big, cheap space structures, then we could make telescopes with nano-arcsecond resolution (from the hypertelescope) and kilometer-size mirror elements (from magnetic inflation). This can be further enhanced with molecular nanotechnology (a massive increase in production capability and a reduction in the cost of getting into space) and metamaterials (we are developing superlenses able to focus to 1/20th the wavelength of light). I think an advanced realization of that grand project is possible in the 20-30 year range, but some very useful precursors could be started immediately.
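As a rough sanity check on the nano-arcsecond claim, the simple diffraction relation θ ≈ λ/B gives the interferometer baseline B a hypertelescope array would need (the 500 nm wavelength here is an illustrative assumption, not a figure from the hypertelescope papers):

```python
import math

def baseline_for_resolution(wavelength_m, resolution_arcsec):
    """Interferometer baseline needed for a given angular resolution,
    using the simple diffraction relation theta ~ wavelength / baseline."""
    theta_rad = math.radians(resolution_arcsec / 3600.0)
    return wavelength_m / theta_rad

# Visible light (500 nm, an assumed wavelength) at one nano-arcsecond:
b = baseline_for_resolution(500e-9, 1e-9)
print(f"{b / 1000:.0f} km")  # on the order of 100,000 km of array spread
```

That baseline is far beyond any single mirror, which is why the hypertelescope concept relies on a sparse array of free-flying elements.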

A giant telescope with a mirror up to 60 metres wide is being planned by the European Southern Observatory. The telescope would be able to detect Earth-like planets around other stars and spot the universe's first galaxies. ESO aims to put the telescope, dubbed the European Extremely Large Telescope (E-ELT), on a fast track. Their goal is to build it for €750 million ($950 million) and have it ready to observe by 2015.

Two other groups are also pushing forward with plans for giant telescopes. A US-Australian consortium is planning a 24.5-metre instrument called the Giant Magellan Telescope (GMT) to be built by 2016 (see World's largest telescope begins with a spin). And a US-Canadian group is planning the Thirty Meter Telescope (TMT), also to be built by 2016.

While space-based telescopes enjoy crystal-clear views, ground-based telescopes have their own advantages. "For a given budget, ground-based telescopes can be much larger than telescopes in space," says GMT project manager Matt Johns of the Carnegie Institution of Washington's Observatories in Pasadena, California, US.

The giant ground-based telescopes will have vision roughly four times sharper than the James Webb Space Telescope (JWST) that will replace Hubble.

As supporting technology such as adaptive optics improves, the new telescopes could eventually analyse the gases in the atmospheres of other Earth-like planets.
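The "roughly four times sharper" figure can be checked against the Rayleigh diffraction limit; the comparison below uses the 24.5 m GMT versus the 6.5 m JWST at an assumed near-infrared wavelength of 1.6 µm (a typical adaptive-optics band), which are my illustrative choices:

```python
import math

def diffraction_limit_arcsec(wavelength_m, diameter_m):
    """Rayleigh diffraction limit theta = 1.22 * lambda / D, in arcseconds."""
    theta_rad = 1.22 * wavelength_m / diameter_m
    return math.degrees(theta_rad) * 3600.0

# Comparing both telescopes at the same (assumed) wavelength of 1.6 micrometers:
gmt = diffraction_limit_arcsec(1.6e-6, 24.5)  # Giant Magellan Telescope
jwst = diffraction_limit_arcsec(1.6e-6, 6.5)  # James Webb Space Telescope
print(jwst / gmt)  # ~3.8: roughly four times sharper, as claimed
```

The ratio depends only on the mirror diameters, so the same factor holds at any wavelength both telescopes observe.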

Related Articles:
Magnetic inflation could make really big space based telescopes cheaper

Other interesting space ideas from NIAC (NASA Institute for Advanced Concepts)

Blimp based telescopes planned

Promising space technology

Article arguing there is no Fermi paradox; partway through it links to the hypertelescope concept and the New Worlds Imager

July 21, 2006

Rapid prototyping of cheap UAVs, and maybe a precursor to precise manufacturing

Rapid-prototyping laser printing seems like an early precursor of the kind of production we would want from a nanofactory.

This technique could be pushed from both the top down and the bottom up.

Top down: to 2 nanometers or less, with metamaterial superlenses able to focus to 1/20th of a wavelength and maybe less.
Bottom up: with directed self-assembly and other techniques for making 2-nanometer nanoblocks.

In rapid prototyping, a three-dimensional design for a part - a wing strut, say - is fed from a computer-aided design (CAD) system to a microwave-oven-sized chamber dubbed a 3D printer. Inside the chamber, a computer steers two finely focussed, powerful laser beams at a polymer or metal powder, sintering it and fusing it layer by layer to form complex, solid 3D shapes.

The technique is widely used in industry to make prototype parts - to see if, for instance, they are the right shape and thickness for the job in hand. Now the strength of parts printed this way has improved so much that they can be used as working components.
"The big advantage over conventional, large-scale aircraft production programmes is the cost saving in tooling as well as the order-of-magnitude reductions in fabrication and assembly time."

By mixing composite polymers with radar-absorbing metals, it is thought that the aircraft can be built with a certain amount of stealth characteristics already built in.

The flexibility lent by 3D printing allowed Mauro's team to design and build the Polecat in only 18 months. "Today's sophisticated UAVs are approaching the cost of equivalent manned aircraft. Polecat's approach is a way to break this trend and demonstrate affordable UAV systems that can be rapidly developed," says Mauro.

other tech: a more common mechanism for transferring life to other solar systems

Magnetic fields and magnetospheric plasmoids could move electrically charged bacteria from one planet to another, and even to another solar system. The idea that microbes could be electrically levitated into the upper atmosphere was first suggested in 1908 by chemist Svante Arrhenius, but until recently there had been no direct measurements of the strength of electric fields high in the atmosphere to show whether the mechanism would work to propel microbes away from the planet.
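A quick force-balance sketch shows why the field strength matters; all the numbers below (spore size, density, charge count) are illustrative assumptions, not Dehel's figures:

```python
import math

E_CHARGE = 1.602e-19  # elementary charge, coulombs
G = 9.81              # gravitational acceleration, m/s^2

def field_to_levitate(radius_m, density_kg_m3, n_charges):
    """Electric field (V/m) at which the electric force q*E balances
    gravity m*g on a small charged sphere."""
    mass = density_kg_m3 * (4.0 / 3.0) * math.pi * radius_m ** 3
    return mass * G / (n_charges * E_CHARGE)

# Assumed values: 1-micron spore, water-like density, ~100 elementary charges
E_needed = field_to_levitate(0.5e-6, 1000.0, 100)
print(E_needed)  # a few hundred V/m, comparable to fair-weather atmospheric fields
```

With those assumptions the required field is in the same range as real atmospheric electric fields, which is why the direct measurements matter.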

Other researchers have already demonstrated that some bacterial spores can survive in conditions thought to exist in interplanetary space, and then be revived. So the possibility of interplanetary spread of life is plausible and deserves further investigation, Dehel believes.

Charged microbes could also be propelled outwards from a planet at high speed by “magnetospheric plasmoids” - independent structures of plasma and magnetic fields that can be swept away from the Earth’s magnetosphere. Hitching rides on these structures could accelerate microbes to speeds capable of taking them out of the solar system and on to the planets of other stars.

And because of the potential for a steady outflow of the particles pushed by the electric fields, a single life-bearing world might seed an entire galaxy with life, claims Dehel.

other tech: Metamaterials bending infrared light

Soukoulis and his co-workers from the University of Karlsruhe, Germany, published in the May 12, 2006, issue of Science that they have fabricated for the first time a metamaterial with a negative index of refraction at 1.5 micrometers (1500 nm). These wavelengths are microscopic and can be used in telecommunications. Soukoulis' success moves metamaterials into the near-infrared region of the electromagnetic spectrum, very close to visible light (400 to 700 nm, although some people may be able to perceive wavelengths from 380 to 780 nm), and brings superior resolution and a wealth of potential applications within reach.

In addition, Soukoulis and his University of Karlsruhe colleagues have shown that both the velocity of the individual wavelengths, called phase velocity, and the velocity of the wave packets, called group velocity, are negative, which Soukoulis said accounts for the ability of negatively refracted light to seemingly defy Einstein's theory of relativity and move backwards faster than the speed of light.

Elaborating, Soukoulis said, “When we have a metamaterial with a negative index of refraction at 1.5 micrometers that can disperse, or separate a wave into spectral components with different wavelengths, we can tune our lasers to play a lot of games with light. We can have a wavepacket hit a slab of negative index material, appear on the right-hand side of the material and begin to flow backward before the original pulse enters the negative index medium.” Continuing, he explained that the pulse flowing backward also releases a forward pulse out the end of the medium, a situation that causes the pulse entering the front of the material to appear to move out the back almost instantly.

He predicted, “Snell’s law on the refraction of light is going to be different; a number of other laws will be different.”
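What "Snell's law is going to be different" means can be sketched numerically: with a negative index the refraction angle comes out negative, so the ray bends to the same side of the normal it arrived on. The indices below are just illustrative values:

```python
import math

def refraction_angle_deg(n1, n2, incidence_deg):
    """Snell's law n1*sin(t1) = n2*sin(t2), solved for the refraction angle."""
    s = n1 * math.sin(math.radians(incidence_deg)) / n2
    return math.degrees(math.asin(s))

print(refraction_angle_deg(1.0, 1.5, 30.0))   # ordinary glass: about +19.5 degrees
print(refraction_angle_deg(1.0, -1.5, 30.0))  # negative-index slab: about -19.5 degrees
```

The sign flip is the whole story: the same formula applies, but the negative index sends the refracted ray to the "wrong" side.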

See movies that show what they are doing.

Related articles:
Metamaterials and superlens

July 20, 2006

Using data mining of possible crystal structures

Data mining all known crystal structures allows scientists to identify a short list of candidate structures for any mixture of elements whose structure is unknown. They then use quantum mechanical calculations to identify the right one. The process is now 30 to 300 times faster.

Using a technique called data mining, the MIT team preloaded the entire body of historical knowledge of crystal structures into a computer algorithm, or program, which they had designed to make correlations among the data based on the underlying rules of physics.

Ceder's team of computational modelers can already determine, in the space of just a few days, atomic structures that might take months or even years to elucidate in the lab. In testing on known structures of just two elements, Ceder's group found the new algorithm could select five structures from 3,000-4,000 possibilities with a 90 percent chance of having the true structure among the five.

"It's all about probability and correlations," Ceder said. "Our algorithm gives us the crystal structure with a certain probability. The key was realizing we didn't need more than that. With a short list of candidate structures, I can solve the problem precisely with quantum mechanics."
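The shortlisting step Ceder describes can be sketched as a toy example; the structure names and probabilities below are made up for illustration, where the real algorithm derives them from data-mined correlations:

```python
def shortlist(candidates, k=5):
    """Rank candidate crystal structures by a correlation-derived probability
    and keep the top k for expensive quantum-mechanical verification."""
    ranked = sorted(candidates.items(), key=lambda kv: kv[1], reverse=True)
    top = ranked[:k]
    return [name for name, _ in top], sum(p for _, p in top)

# Hypothetical probabilities for a made-up binary compound:
probs = {"rocksalt": 0.45, "zincblende": 0.25, "wurtzite": 0.12,
         "CsCl": 0.06, "NiAs": 0.04, "other-1": 0.05, "other-2": 0.03}
names, coverage = shortlist(probs)
print(names, coverage)  # the top five cover ~93% of the probability mass
```

The point of the design is exactly what Ceder says: the data mining doesn't need to be certain, only good enough that the true structure is almost always in the short list handed to quantum mechanics.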

other tech: Making and modifying organisms for cheap ethanol

Processing ethanol from cellulose -- wheat and rice straw, switchgrass, paper pulp, agricultural waste products like corn cobs and leaves -- has the potential to squeeze at least twice as much fuel from the same area of land, because so much more biomass is available per acre. Moreover, such an approach would use feedstocks that are otherwise essentially worthless.

Converting cellulose to ethanol involves two fundamental steps: breaking the long chains of cellulose molecules into glucose and other sugars, and fermenting those sugars into ethanol. In nature, these processes are performed by different organisms: fungi and bacteria that use enzymes (cellulases) to "free" the sugar in cellulose, and other microbes, primarily yeasts, that ferment sugars into alcohol.

The ideal organism would do it all -- break down cellulose like a bacterium, ferment sugar like a yeast, tolerate high concentrations of ethanol, and devote most of its metabolic resources to producing just ethanol. There are two strategies for creating such an all-purpose bug. One is to modify an existing microbe by adding desired genetic pathways from other organisms and "knocking out" undesirable ones; the other is to start with the clean slate of a stripped-down synthetic cell and build a custom genome almost from scratch. Synthetic Genomics, founded by Craig Venter, is in hot pursuit of a bacterium. There is progress in both strategies.

other tech: White blood cells from cancer-resistant mice cure cancers in ordinary mice - might work in people

White blood cells from a strain of cancer-resistant mice cured advanced cancers in ordinary laboratory mice, researchers at Wake Forest University School of Medicine reported.

"Even highly aggressive forms of malignancy with extremely large tumors were eradicated," Zheng Cui, M.D., Ph.D., and colleagues reported in this week's on-line edition of Proceedings of the National Academy of Sciences.

The transplanted white blood cells not only killed existing cancers, but also protected normal mice from what should have been lethal doses of highly aggressive new cancers.

"This is the very first time that this exceptionally aggressive type of cancer was treated successfully," said Cui. "Never before has this been done with any other therapy."

The original studies on the cancer-resistant mice -- reported in 2003 -- showed that such resistance could be inherited, which had implications for inheritance of resistance in humans, said Mark C. Willingham, M.D., a pathologist and co-investigator.

Moreover, preliminary studies show that the white blood cells also kill "endogenous" cancers -- cancers that spring up naturally in the body's own cells.

Cui and Willingham said the research produced many other surprises. For one thing, if a virulent tumor was planted in a normal mouse's back, and the transplanted white blood cells were injected into the mouse's abdomen, the cells still found the cancer without harming normal cells. The kind of cancer didn't seem to matter.

A single injection of cancer-resistant macrophages offered long-term protection for the entire lifespan of the recipient mouse, something very unexpected, they said.

"The potency and selectivity for cancer cells are so high that, if we learned the mechanism, it would give us hope that this would work in humans," said Cui. "This would suggest that cancer cells send out a signal, but normal white blood cells can't find them."

Cui said the findings "suggest a cancer-host relationship that may point in a new therapeutic direction in which adverse side effects of treatment are minimal."

Discover magazine had a follow-up in its August issue. Researchers are finding that some people are cancer resistant, and they plan to test whether transplanted white blood cells from cancer-resistant people can help those with cancer. They already have a blood test that identifies cancer-resistant mice without trying to give them cancer; the plan is to use such a blood test to find cancer-resistant people.

Cui points out that it could take years to find the gene, and many more to develop and test drugs that target it. In the meantime, his team has begun to test blood samples from healthy people and has found a wide range of cancer-killing activity in humans. Cui says he would like to pursue both the conventional and unconventional approaches. "We think there might actually be a possibility we could do it without knowing the mechanism," he says, "but of course by knowing the mechanism you could devise many other options, so if one thing doesn't work then you can also find different ways using the same concept. So we think both directions are important."

If the cell-donation approach were to work in people, it would not need to go through a long FDA approval process. "All the delivery mechanisms are already in place and all the ethical regulations for that direction are already in place. So if we can identify cancer-resistant humans then they could start treating them tomorrow if someone wants to pay for it."

New Graphene based Composite Materials

Northwestern University researchers have developed a process that promises to lead to the creation of a new class of composite materials -- “graphene-based materials.”

The method uses graphite to produce individual graphene-based sheets with exceptional physical, chemical and barrier properties that could be mixed into materials such as polymers, glasses and ceramics. The Northwestern team's approach to its challenge was based on chemically treating and thereby “exfoliating” graphite to individual layers.

Better mixing of nanotubes in polymers will make better nanomaterials

The amount of force applied while mixing carbon nanotube suspensions influences the way the tiny cylinders ultimately disperse and orient themselves. In turn, the final arrangement of the nanotubes largely dictates the properties of the resultant materials.

In an elegantly simple result, NIST researchers Erik Hobbie and Dan Fry found that networks of carbon nanotubes respond predictably to externally applied force. The networks also showed behavior reminiscent of more conventional materials that align spontaneously under the forces of Brownian motion--the random motion of particles in a fluid famously described mathematically by Einstein.

The response was so predictable that the scientists mapped out the relationship in the form of a phase diagram, the materials science equivalent of a recipe. Using their "phase diagram of sticky nanotube suspensions," other researchers can estimate the order that will result when applying a certain amount of force when mixing a polymer fluid with a particular concentration of nanotubes. The recipe can be used to prevent entanglement and to help achieve the nanotube arrangement and orientation associated with a desired set of properties.

This is a recipe for better nanotube/polymer composites.

Carnegie Mellon University's top down approach to nanobots

Carnegie Mellon University in Pittsburgh is making biomimetic robots: based on biological principles, with bacterial motors attached to their near-invisible bodies, they can slither through water canals and probe deep into blood vessels to stop disease and administer medicine.

The nanorobotics lab at Carnegie Mellon University is doing interesting work

Nanomanipulation project goal: to develop a robotic manipulation system that can autonomously construct complex three-dimensional micro- and nano-structures from micro- and nano-parts, and a large-scale assembly system to mass-assemble micro- and nano-devices. Currently, a micromanipulation system has been developed that can autonomously construct two-dimensional micro-arrangements of spheres from 3 µm to 20 µm in diameter. They are looking to go smaller using atomic force microscopes.

nanoManipulation Modeling project goal: Develop a continuum physical model of the dynamics of nano-scale particle manipulation. Complete computer simulations have been devised based on preliminary modeling of the physics of nano-scale dynamics. These simulations will be compared with the experimental Atomic Force Microscope (AFM) probe based nanoparticle pushing data.

Integrated NanoTool Carrier project goal: develop an autonomous mobile robot equipped with various exchangeable nano tools (e.g., drills, shears, saws, buckets, and grippers), applying a novel ultra-precise positioning strategy to improve the flexibility and versatility of existing nano imaging and manipulation facilities, and to perform assigned nano missions cooperatively and efficiently with a colony of robots.

Walker prototype

Telenano project goal: Augmented Reality User Interface for Atomic Force Microscopes (afm)

Project to make 3D nanofibers and nanofiber networks from polymers

Foresight's Nanodot reports that at a NASA nanotech meeting in August 2004, Prof. Sitti gave his timing projections: 5-10 years for nanoassembly, nanomanufacturing, and hybrid biotic/abiotic robots; after 10 years, atomic- and molecular-scale manufacturing. He explained that complexity, in controlling and programming, will be a challenge. So that is an estimate of 2014 for the start of molecular manufacturing, from someone making interesting things now.

The Center for Responsible Nanotechnology has coverage on this

Nanoparticles self-assemble through chemical lithography

More advances in self assembly of nanoparticles and making more robust end products. One of the newest types of nanoparticle self-assembly – developed by scientists K. Prabhakaran and team from institutions in the U.S., Japan and Germany – is called “chemical lithography.” The process, the scientists demonstrate, can effectively form periodic arrays of very stable nanoparticles that don’t succumb to many of the defects and limitations of previous lithography techniques (which include, for example, atomic force microscope dip-pen lithography, laser lithography, electron beam lithography, embossing, etc.). Chemical lithography, instead, is a combination of techniques, where particle arrangement is controlled by differences in reactivity – a characteristic determined by exposing particles and surfaces to an assortment of chemical treatments.

The scientists believe that this approach can be universal and extendable to more specific functionalities. The method has the potential to deliver tailored functional nanoarchitectures, which will play a major role in realizing completely self-assembled nanodevices. Prabhakaran explained that precise “nanoarchitecturing” is possible for many sorts of applications, with the ability to assemble particles that have a particular wavelength or magnetic property by selectively activating or sensing such properties.

The scientists used luminescent yttrium aluminum garnet (YAG) nanoparticles assembled on a silicon wafer, synthesizing the particles through doping and crystallization to determine their shape and composition. Before placing the particles on silicon wafers, the scientists pre-patterned the wafers using etching techniques based on a phenomenon called “atomic step movement.” Because atom-high steps innately exist on silicon surfaces, the scientists could move these steps during high-temperature treatments to fabricate a desired pattern. Chemical reactions (between the silicon, nitrogen and oxygen) caused very thin nitride linings to form in accordance with the atomic step boundaries, thereby pre-patterning the wafers.

Then, in order to align the particles along the pattern on the wafer, the scientists annealed the samples in an ultra-high vacuum chamber for several hours. After heating at roughly 930 to 1,560 degrees Fahrenheit (500-850 degrees Celsius), the scientists not only found precisely aligned particles (see figures), but also uncovered another advantage. Measuring the nanoparticles’ alignment after the annealing process revealed the robustness of particles made this way. Many nanoparticles suffer from photobleaching, damage caused by exposure to high-intensity light; these particles, on the other hand, remained intact through the prolonged illumination of the scientists’ fluorescent imaging measurements.

Rice scientists make nanoeggs that focus 5 times more wavelengths than nanoshells

Like nanoshells, nanoeggs are about 20 times smaller than a red blood cell, and they can be tuned to focus light on small regions of space. But each nanoegg interacts with more light – about five times the number of wavelengths – than their nanoshell cousins, and their asymmetric structure also allows them to focus more energy on a particular spot.

A cousin of the versatile nanoshell, nanoeggs are asymmetric specks of matter whose striking optical properties can be harnessed for molecular imaging, medical diagnostics, chemical sensing and more. They are part of a rapidly growing family of optical nanoparticles emerging from increased understanding of the interaction between light and matter in this critical size regime.

nano-etched cavity makes LEDs 7 times brighter

Semiconductor LEDs are used increasingly in displays and many other applications, in part because they can efficiently produce light across a broad spectrum, from near-infrared to the ultraviolet. However, they typically emit only about two percent of the light in the desired direction: perpendicular to the diode surface. Far more light skims uselessly below the surface of the LED, because of the extreme mismatch in refraction between air and the semiconductor. The NIST nanostructured cavity boosts useful LED emission to about 41 percent and may be cheaper and more effective for some applications than conventional post-processing LED shaping and packaging methods that attempt to redirect light.
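The "about two percent" figure follows from total internal reflection at the semiconductor-air interface. A small sketch, where the refractive index of 3.5 is an assumed, GaAs-like value rather than one given in the article:

```python
import math

def escape_fraction(n_semiconductor, n_outside=1.0):
    """Fraction of isotropically emitted light inside one surface's escape cone:
    f = (1 - cos(theta_c)) / 2, with sin(theta_c) = n_out / n_semi.
    Fresnel reflection losses are ignored, so this is an upper bound."""
    theta_c = math.asin(n_outside / n_semiconductor)
    return (1.0 - math.cos(theta_c)) / 2.0

f = escape_fraction(3.5)  # assumed high-index semiconductor, n ~ 3.5
print(f)  # ~0.02, matching the "about two percent" in the text
```

The nanostructured cavity works around this limit by redirecting light into the narrow escape cone instead of letting it skim below the surface.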

The researchers experimented with different numbers and dimensions of grooves. The brightest output was attained with 10 grooves, each about 240 nanometers (nm) wide and 150 nm deep, and spaced 40 nm apart. The team spent several years developing the design principles and perfecting the manufacturing technique. The principles of the method are transferable to other LED materials and emission wavelengths, as well as other processing techniques, such as commercial photolithography, according to lead author Mark Su.

July 19, 2006

Good/Bad AIs, accelerating returns and a lot of abundance

Several topics are often analyzed in isolation with regard to projected advanced technology: AI, accelerating technology, and abundance from technology and the resources of space.

There are various papers that talk about achieving abundance from advanced technology like molecular nanotechnology.

There is the analysis by Ray Kurzweil that technology is providing accelerating returns.

There is also the concern about the need for friendly Artificial Intelligence (AI). This matters because the technological Singularity is mainly about the development of intelligences that are far greater than human and how that will cause an explosion of technological capability.

People can get a sense of the immense resources of energy and materials in space from the Kardashev scale of civilizations.

People fear that an AI vastly more intelligent than people will rapidly become very powerful and dangerous to us. But if an AI is vastly superior in intelligence and able to rapidly develop and extend technological capability, then it should rapidly be able to tap the resources of space: trillions of times more than what is available on Earth. The AI can make itself mobile, leave, and do whatever it wants. I have difficulty seeing the motivation, good or bad, for the AI to decide to kill people on Earth. The AI would outclass any human that is not completely augmented; it would be like Bill Gates' parents being concerned that he might plot to kill them for his allowance. Even if the AI is very greedy or expansionist, what we have developed so far should be irrelevant to its aims. Maybe a bad AI won't help us out and will just leave. But why would it fumigate the old house on the way out?

The superior AI rapidly moves itself onto an entirely different level. Tiger Woods does not need to dominate the miniature golf courses.

There is also the discussion about whether or not to upgrade people, and the concern that non-upgraded, and therefore weaker, people would be at the mercy of those who upgrade. But the real issue is not whether the non-upgraded will be killed; again, abundance and accelerating returns from technology mean that those who do not upgrade simply become irrelevant.

Accelerating returns mean that we will not experience a mere 100 years of progress in the 21st century; it will be more like 20,000 years of progress (at today's rate). 200 years of progress will be more than 4,000,000 years of progress (at today's rate).
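The arithmetic behind figures like these can be sketched by integrating a progress rate that doubles on a fixed schedule. The 10-year doubling period below is my assumption; Kurzweil's published figures come from a more detailed model, so this only reproduces the right order of magnitude:

```python
import math

def equivalent_years(calendar_years, doubling_period=10.0):
    """Years of progress 'at today's rate' if the rate of progress doubles
    every doubling_period years: integral of 2**(t/d) dt from 0 to T,
    which equals d/ln(2) * (2**(T/d) - 1)."""
    d = doubling_period
    return d / math.log(2) * (2 ** (calendar_years / d) - 1)

y = equivalent_years(100)
print(y)  # roughly 15,000 "today-years" in one calendar century
```

The exponent dominates: shaving the doubling period from ten years to eight pushes the century's total well past the 20,000-year figure.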

In about 20 years, those who have not upgraded will be like the Amish, a few hundred years of technology behind: a quaint curiosity, barely connected to the advanced economy.

In about 100 years, they will be like cavemen, utterly removed from and unable to understand the advances being made.

In 200 years, they will be like chimpanzees. The choice not to adopt the best technology is like choosing not to evolve.

For those who choose to advance and become transhuman, being generous to those who did not becomes very easy with abundance and the resources of space. The cost becomes an increasingly small fraction: initially like foreign aid (1-2%), then like setting aside nature preserves and reservations, then like setting up city zoos, then like keeping potted plants and ant colonies.

So one thing to remember is that abundant is really abundant. Not a little abundant.

And AIs and radically augmented people can move themselves onto an entirely different level of operation. Fear not the bad AI, but the meticulously cruel AI.

Don't upgrade and rapidly become irrelevant

July 18, 2006

Ion trap quantum computers could scale to thousands of qubits

The universal quantum computer, or universal quantum Turing machine (UQTM), is a theoretical machine that combines the Church-Turing model of computation with quantum principles.

A UQTM would be able to do everything that our current computers can do, plus the funky quantum capabilities.

The number of classical states encoded in a quantum register grows exponentially with the number of qubits: n entangled and coherent qubits can address 2**n states. So 12 qubits could look at 2**12 = 4096 states, 50 qubits at over a quadrillion states, and 100 qubits at a quadrillion times a quadrillion states. For n=300, this is roughly 10**90, more numbers than there are atoms in the known universe.
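The state counts above are just powers of two:

```python
def classical_states(n_qubits):
    """Number of classical basis states an n-qubit register spans: 2**n."""
    return 2 ** n_qubits

print(classical_states(12))              # 4096
print(classical_states(50))              # 1125899906842624, over a quadrillion
print(classical_states(300) > 10 ** 90)  # True: more than atoms in the known universe
```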

Note that roughly ten physical qubits are needed to form one logical qubit, and the physical qubits need to be able to perform multiple logic operations. We are currently counting physical qubits; the 2007 goal is to form one logical qubit.

Another use of quantum computers is cracking the security used for finance and secrets. If a number has n bits (is n digits long when written in the binary numeral system), then a quantum computer with just over 2n qubits can use Shor's algorithm to find its factors. The key length for a secure RSA transmission is typically 1024 bits; 512 bits is no longer considered secure. For more security, or if you are paranoid, use 2048 or even 4096 bits. Read more about cryptography and its importance here.
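The "just over 2n qubits" rule of thumb makes the key-length arithmetic easy. The +3 additive term below follows one published circuit construction and should be treated as an assumption; these are logical, error-corrected qubits:

```python
def qubits_for_shor(key_bits):
    """Rough logical-qubit count for factoring an n-bit number with Shor's
    algorithm: just over 2n (here 2n + 3, one published circuit's figure)."""
    return 2 * key_bits + 3

for bits in (512, 1024, 2048, 4096):
    print(bits, qubits_for_shor(bits))  # e.g. 1024-bit RSA -> 2051 logical qubits
```

At ten physical qubits per logical qubit, even 512-bit RSA would need on the order of ten thousand physical qubits, which is why today's dozen-qubit machines are no threat to cryptography yet.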

From the wikipedia quantum computer (QC) entry:

Problems and practicality issues [to get to quantum computers]

There are a number of practical difficulties in building a quantum computer, and thus far quantum computers have only solved trivial problems. David DiVincenzo, of IBM, listed the following requirements for a practical quantum computer:

* scalable physically to increase the number of qubits
* qubits can be initialized to arbitrary values
* quantum gates faster than decoherence time
* Turing-complete gate set
* qubits can be read easily

To summarize the problem from the perspective of an engineer, one needs to solve the challenge of building a system which is isolated from everything except the measurement and manipulation mechanism. Furthermore, one needs to be able to turn off the coupling of the qubits to the measurement so as to not decohere the qubits while performing operations on them.

Reviewing some of the leading QC approaches:
The scaling to useful numbers of qubits is progressing among several architectures.
Superconducting versions, like D-Wave Systems in Vancouver: it seems they should get a lot of qubits, but they are somewhat limited in their range of capabilities. They can do searches, but they are not computers that can be generally sold. Their solution is finicky and will need to be coddled by engineers and PhDs at big facilities, who will feed it questions submitted for answers. Could be offered as a service starting in 2007 with 50+ and maybe 100 qubits.

Trapped ion quantum computers
are a leading approach to quantum computing. They appear to be very scalable and are usually built with semiconductors. This 2006 presentation by Carl Williams touts the benefits of trapped ions and describes a proposed architecture scalable to hundreds of qubits. Recently, 2D semiconductor ion traps have been developed. They need to be adjusted to hold ions that are better for manipulation.

This table from the 2004 Quantum computing roadmap gives a sense of the state of each approach. Ion traps seem to have filled their gaps on items 1 and 3.

Green= a potentially viable approach has achieved sufficient proof of principle
Orange= a potentially viable approach has been proposed, but there has not been sufficient proof of principle
Red= no viable approach is known

The column numbers correspond to the following QC criteria:
#1. A scalable physical system with well-characterized qubits.
#2. The ability to initialize the state of the qubits to a simple fiducial state.
#3. Long (relative) decoherence times, much longer than the gate-operation time.
#4. A universal set of quantum gates.
#5. A qubit-specific measurement capability.
#6. The ability to interconvert stationary and flying qubits.
#7. The ability to faithfully transmit flying qubits between specified locations.

Magnetic liquid. Read by sensitive MRI. Up to about 12 qubits. The current qubit leader, but with limitations on scaling.

Magnetic bubbles could scale to hundreds of qubits and might be able to get going quickly, but have not actually delivered anything yet.

Scaling is also possible by transmitting quantum effects via optical fiber, which can carry quantum states of ions and molecules (the quantum teleportation work). Quantum-encrypted communication already exists (military).

As we get smaller semiconductor features, better superconductors, and metamaterials down to 10 nanometers or less at higher operating temperatures, I think the decoherence and robustness issues will be vastly improved.

It seems progress is going quite well on all of these factors and useful machines for larger quantum simulations (hundreds of qubits) and narrow sets of search and decryption purposes should be with us far sooner than the universal machines.

If it is 10 years to universal quantum computers and 1 or 2 years to useful but finicky quantum lab simulators offered as a commercial service, then probably in the 3-6 year range there will be fairly widespread, cheap, and very powerful quantum simulators useful for pushing molecular manufacturing development. Our understanding and mastery of the quantum world will vastly improve over this time.

Trends that are working for this: constantly shrinking lithography and better control of nanoscale materials, a better understanding of quantum physics, better superconductors, and better lasers.

I expect a lot of progress over the next ten years and beyond. A lot of the progress is virtually assured based on progressing enabling capabilities. But with the interacting effects, there will at some point be sudden leaps in capability, when a bunch of things all come together at once past particular thresholds.

Related articles
Magnetic bubbles could scale to 100's of qubits

Error checking for superconducting QC

Quick summary of the state of quantum computers

Further Reading

2004 quantum computing roadmap

Quantum pontiff

Economist magazine on quantum computers

Berkeley Labs article on quantum computers

other tech: single-crystal semiconductor films can be moved to other surfaces

A team led by electrical and computer engineer Zhenqiang (Jack) Ma and materials scientist Max Lagally has developed a process to remove a single-crystal film of semiconductor from the substrate on which it is built. This thin layer (only a couple of hundred nanometers thick) can be transferred to glass, plastic or other flexible materials, opening a wide range of possibilities for flexible electronics.

In addition, the semiconductor film can be flipped as it is transferred to its new substrate, making its other side available for more components. This doubles the possible number of devices that can be placed on the film.

By repeating the process, layers of double-sided, thin-film semiconductors can be stacked together, creating powerful, low-power, three-dimensional electronic devices.

These are single-crystal films of strained silicon or silicon germanium. Strain is introduced in the way they form the membrane. Introducing strain changes the arrangement of atoms in the crystal such that they can achieve much faster device speed while consuming less power.

"By including the germanium without destroying the quality of the material, we can achieve devices with two to three orders of magnitude more sensitivity."

That increased sensitivity could be applied to create superior low-light cameras, or smaller cameras with greater resolution.

Nano Lube Could Make Possible Ultra-Dense Memory

A new way to reduce friction at the nanoscale could enable the commercialization of nano mechanical devices, including ones for data storage 10 to 100 times denser than current memory. Now physicists at the University of Basel in Switzerland have developed a dry "lubrication" method that uses tiny vibrations to keep parts from wearing out.

The method, described in the current issue of Science, could be particularly useful for a new class of memory devices, pioneered by IBM with its Millipede technology, which uses thousands of atomic force microscope tips to physically "write" bits to a surface by making divots in a polymer substrate and later reading them. The "nano lube" could also find uses with tiny rotating mirrors that might serve as optical routers in communications and mechanical switches, replacing transistors in computer processors, so cutting power consumption.

Devices based on NEMS and MEMS are some of the most promising new nanotechnologies. Yet the commercialization of applications such as Millipede -- which could store well over 25 DVDs in an area the size of a postage stamp -- has been held up in part by wear caused by friction. Indeed, friction is a particular problem in micro- or nanodevices, where contacts between surfaces are tiny points that can do a lot of damage.

In their experiments, the Swiss researchers moved an atomic force microscope tip made of silicon across a test material of sodium chloride or potassium bromide. Ordinarily, the ultra-sharp tip would travel in a "stick-and-slip" fashion, as friction repeatedly builds up until the tip suddenly breaks free. (The same physical mechanism accounts for squeaky door hinges.) The researchers solved the sticky-tip problem by oscillating the tips using changing voltages. The vibrations, which are so small that the tip stays in continuous contact with the material, keep energy from building up and being suddenly released. As a result, friction decreases 100-fold.
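The stick-and-slip mechanism and its suppression by vibration can be illustrated with a toy overdamped Prandtl-Tomlinson-style model: a tip dragged by a spring across a sinusoidal surface potential, with the corrugation amplitude oscillated rapidly to mimic the actuated tip. This is only a qualitative sketch with arbitrary parameters, not the Basel group's analysis:

```python
import math

def mean_friction(modulate, T=100.0, dt=0.001, v=0.1, k=1.0,
                  V0=1.0, gamma=1.0, omega=60.0):
    """Overdamped tip pulled by a spring (stiffness k) from a support
    moving at speed v, over a sinusoidal potential of amplitude V0.
    If `modulate` is True, the corrugation is oscillated rapidly,
    standing in for the vibration that suppresses stick-slip."""
    x, t = 0.0, 0.0
    forces = []
    steps = int(T / dt)
    for i in range(steps):
        amp = V0 * 0.5 * (1 + math.sin(omega * t)) if modulate else V0
        spring = k * (v * t - x)                       # pull from the support
        surface = -2 * math.pi * amp * math.sin(2 * math.pi * x)
        x += dt * (spring + surface) / gamma           # overdamped dynamics
        t += dt
        if i > steps // 2:                             # discard the transient
            forces.append(spring)
    return sum(forces) / len(forces)

static = mean_friction(modulate=False)
vibrated = mean_friction(modulate=True)
print(f"mean friction without vibration: {static:.2f}")
print(f"mean friction with vibration:    {vibrated:.2f}")
```

Without modulation the spring force traces the sawtooth of stick-slip; with fast modulation the tip repeatedly unpins before much energy builds up, so the average friction drops sharply, in the spirit of the 100-fold reduction reported.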

July 17, 2006

Sandia looking to create secure energy grids

Sandia is looking to prove secure energy microgrids next year at a military base. Energy systems with high levels of energy surety must be safe -- safely supplying energy to end users; secure -- using diversified energy sources; reliable -- maintaining power when and where needed; sustainable -- being able to be maintained indefinitely ("indefinite" is based on the American Indian definition of seven generations or 200 years); and cost-effective -- producing energy at an acceptable (and preferably lowest) cost.

The Sandia team believes the solution is what they are researching for Army bases across the country -- a microgrid that reduces the single points of failure by cutting down the number of transmission lines.

In looking at the five criteria of an energy surety approach, the microgrid meets all. It is safe -- it's not introducing any new dangers. It's secure because it uses a diverse mix of fuels -- solar, wind, and oil. It's reliable because it uses a variety of types of generators. There is a redundancy of generation and storage. It's sustainable because it is using renewable energies. And, it is cost-effective because it uses energy sources that are readily available and appropriate for the site. (An example is that solar could be used in the Southwest and wind along the nation's coastlines.)
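The redundancy argument can be made concrete with a toy availability estimate: if outages of the diverse sources are independent, the chance that all fail at once is the product of their individual outage probabilities. The numbers below are hypothetical, and real outages (e.g. a storm taking out solar and wind together) are correlated:

```python
# Toy availability estimate for diverse, redundant generation.
# Outage probabilities are hypothetical, and independence is assumed.
outage = {"solar": 0.30, "wind": 0.25, "diesel": 0.05}

p_all_down = 1.0
for p in outage.values():
    p_all_down *= p                    # all sources out simultaneously

availability = 1 - p_all_down
print(f"chance all sources are out at once: {p_all_down:.5f}")
print(f"availability with diverse sources:  {availability:.5f}")
```

Even with individually unreliable sources, the combined system is out only 0.375% of the time in this sketch, which is the intuition behind mixing solar, wind and conventional generation.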

Free space optical communications from high altitude balloons and airships could exceed 1.25 Gbps

They performed high-altitude balloon communication trials in Sweden in 2005, supporting data rates of 11 Mbit/s and throughputs up to 4 Mbit/s, using WiFi (IEEE 802.11b), at distances ranging up to 60 km. Dr David Grace, the project's principal scientific officer, said: "Proving the ability to operate a high data rate link from a moving stratospheric balloon is a critical step in moving towards the longer term aim of providing data rates of 120Mbits/s." DLR, a German partner, performed the first known optical 1.25 Gbit/s downlink from the stratosphere to an optical receiver on the ground over a link distance of up to 64 km. The very high data rates offered by free space optical communications will be used for future inter-platform and platform-to-satellite backhaul links.
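For a sense of why a 64 km optical downlink is hard, a crude geometric sketch helps: the beam spreads with distance, and the receiver captures only the fraction of the spot that its aperture covers. The divergence and aperture values below are hypothetical, and atmospheric and pointing losses are ignored; this is not a real link budget:

```python
import math

def geometric_capture(distance_m, divergence_rad, rx_aperture_m):
    """Fraction of transmitted optical power landing on the receiver,
    assuming a uniformly filled cone of half-angle `divergence_rad`.
    A rough geometric sketch only."""
    beam_radius = distance_m * divergence_rad      # spot radius at range
    rx_radius = rx_aperture_m / 2
    if rx_radius >= beam_radius:
        return 1.0
    return (rx_radius / beam_radius) ** 2          # area ratio

# Hypothetical numbers for a 64 km stratospheric downlink:
frac = geometric_capture(64_000, 100e-6, 0.4)      # 100 urad beam, 40 cm optics
print(f"captured power fraction: {frac:.2e}")
```

Even with a tight 100-microradian beam, only about a thousandth of the light reaches a 40 cm receiver at that range, which is why precise pointing and sensitive detectors matter so much for these links.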

The CAPANINA project, which uses balloons, airships or unmanned solar-powered planes as high-altitude platforms (HAPs) to relay wireless and optical communications, is due to finish its main research at the end of October. The consortium, drawn from Europe and Japan, has demonstrated how the system could bring low-cost broadband connections to remote areas and even to high-speed trains. It promises data rates 2,000 times faster than via a traditional modem and 100 times faster than today's 'wired' ADSL broadband.

Gene sequencing 100 times faster than traditional methods

454 pyrosequencing, which has emerged in the past year, uses real-time, light-based observations of gene synthesis to reveal genomic information. It produces genomic information up to 100 times faster than the old technology.

Here is more on the 454 method: the analysis of some 25 million bases of DNA sequence using 454’s “sequencing by synthesis” nanotechnology approach allowed a team of more than 50 researchers to assemble almost the complete genome of Mycoplasma genitalium – some 580,000 nucleotides, or bases – with greater than 99 percent accuracy in about 4 hours.
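Taking the 25 million bases and the roughly 4-hour run at face value, the implied throughput is easy to check, along with what the claimed ~100x speedup says about the older technology:

```python
# Back-of-the-envelope throughput from the figures above.
bases = 25_000_000        # bases analyzed in the M. genitalium run
hours = 4                 # approximate run time
rate = bases / hours      # bases per hour

print(f"~{rate:,.0f} bases/hour with 454 pyrosequencing")
# At the claimed ~100x speedup, the older approach would manage roughly:
print(f"~{rate / 100:,.0f} bases/hour with traditional sequencing")
```

That works out to about 6.25 million bases per hour, versus on the order of tens of thousands per hour for the old technology.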

The polony sequencing approach of George Church seems to be as fast or faster, but cheaper.

Nanopore DNA sequencing approach

Gene sequencing at wikipedia

Other tech: Surveillance airship test

other tech: microwaving rocks for better mining

Advances in short term cryonic suspension

Cryonic suspension of pigs for a few hours could be used in humans for better trauma surgery. The surgeons drain the blood and connect tubes to the aorta and other vessels, filling the circulatory system with chilled organ-preservation fluid – a nearly frozen daiquiri of salts, sugars, and free-radical scavengers. Cryogenic suspension may be just two years away from clinical trials on humans. They have suspended 200 pigs for an hour each, and although the experimental protocol calls for different levels of care for each pig, the ones that got optimal treatment all survived. Trauma surgeons currently face time limits in saving people with serious wounds, like gunshots. It is a race against the effects of blood loss. When blood flow drops, toxins accumulate; just five minutes of low oxygen levels causes brain death.

More about long term cryonics is here at the Alcor site and here at the wikipedia article on cryonics

July 16, 2006

Laser tweezers sort atoms in work towards a quantum computer

Physicists of the University of Bonn have taken one more important hurdle on the path to what is known as a quantum computer: by using 'laser tweezers' they have succeeded in sorting up to seven atoms and lining them up. In the experiment the research team headed by Dr. Arno Rauschenbeutel and Professor Dieter Meschede decelerated several caesium atoms for a period of several seconds so that they were hardly moving, then loaded them onto a 'conveyor belt' consisting of lasers. This conveyor belt is made up of a standing light wave composed of many peaks and troughs – possibly comparable to a piece of corrugated iron. 'Unfortunately it cannot be predicted which trough precisely the atoms will land in,' Arno Rauschenbeutel explains. 'It's rather like pouring several eggs from a big dish into an egg carton – which section each egg rolls into is a matter of chance.'

However, anyone wishing to calculate with atoms must be able to place them exactly. 'All the atoms on the conveyor belt have to have the same distance from each other,' is how Arno Rauschenbeutel sketches the challenge. 'Only then can we get them to interact in a controlled way in what is called a quantum gate.' By lining up gate operations like these it would already be possible to carry out simple quantum calculations.
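The egg-carton placement problem can be sketched in code: atoms land in random troughs of the standing wave, and the goal is to shuttle them to equally spaced sites. This is only an illustration of the sorting task, not the Bonn group's actual optical control scheme; matching the sorted atom positions to sorted target sites keeps the total shuttling distance small:

```python
# Sketch of the atom-sorting task: atoms in random troughs must be
# moved to equally spaced target troughs. Hypothetical trough indices.
import random

def sort_atoms(occupied, spacing, start=0):
    """Map randomly occupied trough indices to equally spaced targets,
    pairing them in sorted order so atoms keep their relative order."""
    atoms = sorted(occupied)
    targets = [start + i * spacing for i in range(len(atoms))]
    return list(zip(atoms, targets))

random.seed(1)
occupied = random.sample(range(50), 7)       # 7 atoms land in 50 troughs
for src, dst in sort_atoms(occupied, spacing=5):
    print(f"move atom from trough {src} to trough {dst}")
```

With seven atoms this yields exactly the kind of evenly spaced string the Bonn experiment produced, ready for controlled interactions in a quantum gate.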

The next aim of the Bonn physicists is to construct a quantum gate. For this purpose they want to 'write' quantum information onto two caesium atoms and then place them between two tiny mirrors. The intention is that they should interact there with each other, i.e. exchange information by emitting and absorbing fluorescent light. If this is successful, it will be the next milestone for the Bonn researchers on their way to the quantum computer.

DNA Used To Direct Nanowire Assembly And Growth

A research team led by Brown University engineers has harnessed the coding power of DNA to create zinc oxide nanowires on top of carbon nanotube tips. The feat, detailed in the journal Nanotechnology, marks the first time that DNA has been used to direct the assembly and growth of complex nanowires. The tiny new structures can create and detect light and, with mechanical pressure, generate electricity. The wires’ optical and electrical properties would allow for a range of applications, from medical diagnostics and security sensors to fiber optical networks and computer circuits.

“The use of DNA to assemble nanomaterials is one of the first steps toward using biological molecules as a manufacturing tool,” said Adam Lazareck, a graduate student in Brown’s Division of Engineering. “If you want to make something, turn to Mother Nature. From skin to sea shells, remarkable structures are engineered using DNA.”

Engineers in the lab of Jimmy Xu used DNA to grow zinc oxide nanowires like this one on the tips of carbon nanotubes. The zinc oxide wires created in the lab measured between 100 and 200 nanometers long.

The Xu lab is the first in the world to make uniform arrays of carbon nanotubes. Lazareck and his collaborators at Brown and Boston College built on this platform to make their structures. They started with arrays of billions of carbon nanotubes of the same diameter and height evenly spaced on a base of aluminum oxide film. On the tips of the tubes, they introduced a tiny DNA snippet.

This synthetic snippet of DNA carries a sequence of 15 “letters” of genetic code. It was chosen because it attracts only one complement – another sequence made up of a different string of 15 “letters” of genetic code. This second sequence was coupled with a gold nanoparticle, which acted as a chemical delivery system of sorts, bringing the complementary sequences of DNA together. To make the wires, the team put the arrays in a furnace set at 600° C and added zinc arsenide. What grew: Zinc oxide wires measuring about 100-200 nanometers in length.

The team conducted control experiments – introducing gold nanoparticles into the array with no DNA attached or using nanotubes with no DNA at the tips in the nanotube array – and found that very few DNA sequences stuck. And no wires could be made. Lazareck said the key is DNA hybridization, the process of bringing single, complementary strands of DNA together to re-form the double helices that DNA is famous for.

“DNA provides an unparalleled instruction manual because it is so specific,” Lazareck said. “Strands of DNA only join together with their complements. So with this biological specificity, you get manufacturing precision. The functional materials that result have attractive properties that can be applied in many ways.”
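The specificity Lazareck describes is easy to express in code: a strand pairs only with its reverse complement, and a 15-letter probe is one of 4^15 (about a billion) possible sequences. The probe sequence below is hypothetical, not the one used in the experiment:

```python
# DNA hybridization specificity in miniature: a strand binds only
# its reverse complement. The 15-mer probe here is hypothetical.
COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def reverse_complement(strand):
    """Return the strand that will hybridize with `strand`."""
    return "".join(COMPLEMENT[base] for base in reversed(strand))

probe = "GATTACAGATTACAG"                         # hypothetical 15-mer
print(reverse_complement(probe))                  # CTGTAATCTGTAATC
print(f"{4**15:,} possible 15-letter sequences")  # 1,073,741,824
```

With over a billion distinct 15-mers, the chance of an unintended match is tiny, which is what turns biological specificity into manufacturing precision.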

“We’re seeing the beginning of the next generation of nanomaterials,” said Xu, senior author of the article. “Many labs are experimenting with self-assembly. And they are making beautiful, but simple, structures. What’s been missing is a way to convey information – the instruction code – to make complex materials.”

Bacteria that make electricity could be leveraged with synthetic biology for biology based electronics

Microbiologist Discovers Our Planet Is Hard-wired With Electricity-producing Bacteria, which could be leveraged using synthetic biology. When Yuri Gorby discovered that a microbe which transforms toxic metals can sprout tiny electrically conductive wires from its cell membrane, he reasoned that this anatomical oddity and its metal-changing physiology must be related. It now turns out that not only are the wires and their ability to alter metal connected – but that many other bacteria, including species involved in fermentation and photosynthesis, can also form wires under a variety of environmental conditions.

Paint-on Semiconductor Outperforms Chips

Researchers at the University of Toronto have created a semiconductor device that outperforms today’s conventional chips — and they made it simply by painting a liquid onto a piece of glass. This is the first time a so-called “wet” semiconductor device has bested traditional, more costly grown-crystal semiconductor devices. The Toronto team instead cooked up semiconductor particles in a flask containing extra-pure oleic acid, the main ingredient in olive oil. The particles are just a few nanometres (one billionth of a metre) across. The team then placed a drop of solution on a glass slide patterned with gold electrodes and forced the drop to spread out into a smooth, continuous semiconductor film using a process called spin-coating. They then gave their film a two-hour bath in methanol. Once the solvent evaporated, it left an 800 nanometre-thick layer of the light-sensitive nanoparticles.
At room temperature, the paint-on photodetectors were about ten times more sensitive to infrared rays than the sensors that are currently used in military night-vision and biomedical imaging. “These are exquisitely sensitive detectors of light,” says Sargent, who holds a Canada Research Chair in Nanotechnology. “It’s now clear that solution-processed electronics can combine outstanding performance with low cost.”

Development of one atom thick coating should lead to far better microscopes

One-atom-thick coating for tool tips created. The National Institute for Nanotechnology at the U of Alberta used a unique process to make the sharpest tip ever known and opened the door to a range of possibilities. Technically speaking, they were able to coat the peripheral atoms near the peak with nitrogen, making it a one-atom-thick, tough protective paint job. "That coating has the effect of binding the little pyramid of tungsten atoms in place," said Dr. Robert Wolkow, a physics professor at the U of A and co-author on the research paper published in the Journal of Chemical Physics. "Such a pointy pyramid of metal atoms would normally just smudge away spontaneously. It's like a sand pile--you know you can't make it arbitrarily pointy. If you try to pile on more sand, it flows down and makes a more blunt pile. Metal atoms will do the same thing."

These sharp tips are needed for making contact with metals or semiconductors, as well as for the manipulation and examination of atoms, molecules and small particles. Ultrafine tips are demanded for future experiments where the results depend directly on the shape of the tip.

The tips made by Wolkow and the research team--made up of Moh'd Rezeq and Jason Pitters from NINT--are so stable they withstand about 900 degrees Celsius. They are so sharp they appear so far to serve as excellent emitters of electron beams. "The lenses in an electron microscope work more perfectly if the electron beam comes from a really small point," said Wolkow. "Since we have the smallest point source of electrons, we think we will be able to make the best electron microscopes. This is speculation, but based on pretty conventional thinking."
