February 16, 2007

Climate reengineering

The new age of climate re-engineering and Richard Branson's $25 million bounty are discussed at Open the Future. It notes that re-engineering efforts would have the best chance of succeeding if we also adjust technology and behavior to stop making the problems worse.

Gregory Benford's proposal for climate re-engineering is discussed at FuturePundit. The Benford proposal has the advantages of being one of the simplest planet-cooling technologies suggested so far and of being initially testable in a local context. He suggests suspending tiny, harmless particles (sized at one-third of a micron) at about 80,000 feet up in the stratosphere. These particles could be composed of diatomaceous earth.

Benford says treating the Arctic would cost only $100 million per year. You could do the whole planet for a couple of billion.

Anyone go to the Vernor Vinge non-singularity talk?

If anyone has heard the Vernor Vinge talk on non-singularity futures, please provide information in the comments here. He spoke in San Francisco on Feb 15, 2007.

Update: A podcast or other recording of the talk is or will be posted on the Long Now site.

Update: The text and slides from the speech are online
He lays out the first stage, where we fail to achieve a singularity [the Age of Failed Dreams / bad software], and then three scenarios [a nuclear war scenario, the Golden Age and the Wheel of Time].

Vinge advocates longevity research and space research as vital for positive future scenarios. [I agree]

“What If the Singularity Does NOT Happen?” Vernor Vinge, Cowell Theater, Fort Mason, San Francisco, Thursday, February 15. Doors open at 7pm; the lecture starts promptly at 7:30pm. Admission is free (a $10 donation is always welcome, not required).

Science fiction writer Vernor Vinge invented the concept that dominates thinking about technology these days. He called it “the Singularity”— the idea that technology (computer tech, biotech, nanotech) is now accelerating so exponentially that it will lead to a massive, irreversible, and profoundly unpredictable transformation of humanity by mid-century.

This Thursday evening Vinge will challenge his own idea for the first time: “I have some plausible, non-singularity scenarios that get us into a human-scale world with long time horizons. I’ll describe the near-term peculiarities I see for such scenarios and then discuss what such a world might be like across ten or twenty thousand years. Finally, I’d like to talk about dangers and defenses related to these scenarios.”

Alberta using oil revenue to buy a high tech future

Alberta is using its financial clout to scour the globe for scientific "superstars" who will be offered $20-million each to conduct research in the province for the next decade.

Broad areas of research that will be considered include maternal, fetal and child health, mental health, system sustainability and technology, remote and rural care, injury and disease prevention, infectious diseases and food. Flush with energy revenue in 2005, the Alberta government pumped $500-million into the foundation, which over 27 years has distributed about $850-million to 600 researchers.

This is relevant to molecular nanotechnology because Alberta also funds and is interested in that field. I lived in Calgary for four years; it is a nice city. Alberta is using its oil and gas revenues from the oilsands to build a corporate and technological economy.

Next-Generation Retinal Implant

Scientists plan to test an implanted chip with four times the resolution of the previous version in people blinded by retinal degeneration. It now has 60 pixels instead of 16, and because it is smaller, the operation to implant it is much less traumatic, taking just 90 minutes instead of eight hours. Prof Humayun predicts that it will cost around $30,000 (£15,400).

Both Chichilnisky and the USC researchers are working with Second Sight Medical Products, the company based in Sylmar, CA, that is manufacturing the devices, on the next version of the implant. The third-generation device will have 500 electrodes, boosting resolution by a factor of almost 10.

But increasing the number of electrodes won't be the only hurdle in developing implants that can give blind people truly useful vision. Scientists also need to figure out how to electrically stimulate the retina in a way that the brain can interpret with high spatial resolution, says Joseph Rizzo, an ophthalmologist at the Massachusetts Eye and Ear Infirmary and codirector of the Boston Retinal Implant Project. A ray of light, for example, stimulates retinal cells in a more precise and refined way than does the electric current coming from an electrode. "It doesn't matter if you have 10 or 1,000 electrodes," he says. "If you don't know how to use them, it doesn't matter."

A goal is 1000-pixel systems for facial recognition ability

A tiny implant on the surface of the eye receives wireless signals from an external camera, which the patient wears on a pair of glasses. The implant transmits signals to an array of electrodes surgically implanted on the retina. The array delivers electrical signals to the nerve cells in the eye, mimicking the role of light-sensitive cells lost in degenerative retinal disease.
Credit: Courtesy of Doheny Eye Institute

How the retinal implant works. Photograph: University of Southern California

Increasing the electrode count is also a goal of improved cochlear implants, which are discussed at Wikipedia.

Biological approaches that regenerate or regrow the necessary visual or hearing systems are also a possible path to future treatments.

February 15, 2007

Nuclear so-called waste analysis

NNadir at DailyKos has an analysis of radionuclides and what we really should and should not be worried about.

To keep the nuclear risks in perspective, remember that coal causes over 1 million deaths per year from particulates causing heart disease, lung disease and cancer, plus about 10,000 deaths per year from coal mining. Some 27,000 of the deaths per year are in the USA, which is 25 times the number of US deaths in Iraq.

Tritium kills about 13 people per year from increased cancer risk

According to the link on tritium cancer risk, every time a person drinks 1 picocurie of tritium (one trillionth of a curie), his or her risk of cancer increases by 4.4 one hundred trillionths. (See the box on the bottom.) It can be shown that 1 "tritium unit" is the equivalent of 3.2 pCi per liter. Multiplying 3.2 pCi per liter by the roughly 5,000 tritium units present in 1963 and by the cancer risk per picocurie, we find that the risk from tritium of drinking a liter of water in 1963 was about 1 in 1.4 billion. World population in 1963 was about 3.2 billion. If we assume that the average person drank 2 liters of water per day each day of the year, it is easy to estimate that the number of cancers induced amounted to about 1600 people.

Today the number of tritium units found is about 20, which corresponds to a risk per liter of around 1 in 360 billion per liter of water. World population is much higher, about 6.6 billion. It follows that the number of people who die this year from tritium, again assuming two liters per day of water, will be a little over 13.
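The arithmetic above can be checked directly. This is just a sketch of the estimate, using the risk coefficient and consumption assumptions quoted in the source:

```python
RISK_PER_PCI = 4.4e-14       # lifetime cancer risk per picocurie ingested (per source)
PCI_PER_TU = 3.2             # 1 tritium unit = 3.2 pCi per liter
LITERS_PER_YEAR = 2 * 365    # assume 2 liters of water per day

def annual_cancers(tritium_units, population):
    # risk per liter of water, times total liters drunk worldwide in a year
    risk_per_liter = tritium_units * PCI_PER_TU * RISK_PER_PCI
    return population * LITERS_PER_YEAR * risk_per_liter

print(annual_cancers(5000, 3.2e9))  # 1963 peak: ~1600
print(annual_cancers(20, 6.6e9))    # today: ~13.6
```

Both figures reproduce the numbers in the text.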




Adiabatic Quantum Computing and Dwave

After seeing the Dwave demo and hearing more about how it works, a paper by Seth Lloyd now makes more sense to me.

Seth says:
Adiabatic quantum computation is a recently proposed, general approach to solving NP-hard combinatorial minimization problems. It consists of constructing a set of qubits with a time-dependent Hamiltonian Ĥ(t) whose starting point Ĥs has a ground state that is quickly reachable simply by cooling and whose final point Ĥp has couplings that encode the cost scheme of a desired minimization problem. The name “adiabatic” comes from the fact that if the qubits are initialized in the ground state of Ĥs and if Ĥ(t) is varied slowly enough, then the qubits will overwhelmingly be in the ground state of Ĥ(t) at all times t, thus in principle completely bypassing the usual concern about local minima in Ĥp confounding the search for the problem’s solution.

Later in the paper (pg 6) it indicates how AQC is resistant to decoherence and how results are still good so long as the quantum coherence is dominant (i.e., not there all of the time, but most of the time).

Dwave has a four-by-four grid of superconducting loops with connections/couplings between them. The loops are the vertices of the grid.
They set the initial state (the variables of the problem) using induction from wires near the superconducting loops and couplings. The electrical feed to those wires needs to be precisely controlled, and the whole system needs to be kept as free from interference as possible. Thus it sits in a shielded room, and the electrical feeds pass through heavy-duty filtering to keep the electrical and magnetic signals precise and free from corruption.
Cooling causes the system to settle at or near the answer state. The answer is then read out.
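The kind of problem the hardware settles into is an Ising minimization. A classical simulated-annealing sketch of the same search gives the flavor; the biases h and couplings J below are arbitrary illustrative values, not anything from the actual chip:

```python
import math
import random

# Find spins s_i in {-1, +1} minimizing the Ising energy
# E = sum_i h_i*s_i + sum_{i<j} J_ij*s_i*s_j  (classical analogy only)
random.seed(1)
n = 16
h = [random.uniform(-1, 1) for _ in range(n)]
J = {(i, j): random.uniform(-1, 1) for i in range(n) for j in range(i + 1, n)}

def energy(s):
    return (sum(h[i] * s[i] for i in range(n)) +
            sum(Jij * s[i] * s[j] for (i, j), Jij in J.items()))

s = [random.choice([-1, 1]) for _ in range(n)]
e = energy(s)
T = 2.0
while T > 0.01:                     # slowly lower the "temperature"
    i = random.randrange(n)
    s[i] = -s[i]                    # propose flipping one spin
    e_new = energy(s)
    # Metropolis rule: always accept downhill moves, sometimes uphill ones
    if e_new <= e or random.random() < math.exp((e - e_new) / T):
        e = e_new
    else:
        s[i] = -s[i]                # reject: flip the spin back
    T *= 0.999
print(e)                            # energy of the (near-)ground state found
```

The quantum adiabatic version aims to reach the ground state directly by slow evolution rather than by thermal hopping.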

So the Adiabatic Quantum Computer (AQC), as noted by Seth Lloyd, is naturally resistant to decoherence and dephasing.

Dwave's specialized qubits (it is still an open question how much quantumness there is and whether it stays dominant as the system scales up) avoid the error correction problem by running the same problem 100 times and polling the results. The most popular answer is taken, and in their experience it has been the right answer.
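The polling approach is simple to sketch. Here `noisy_run` is a toy stand-in for one hardware run that returns the right bitstring most of the time; the repetition count and error rate are illustrative assumptions:

```python
import random
from collections import Counter

def solve_by_polling(run_problem, repetitions=100):
    # Run the same problem many times and keep the most common answer
    counts = Counter(run_problem() for _ in range(repetitions))
    answer, _ = counts.most_common(1)[0]
    return answer

random.seed(0)
# toy stand-in: correct answer 90% of the time, a wrong one otherwise
noisy_run = lambda: "0110" if random.random() < 0.9 else "1111"
print(solve_by_polling(noisy_run))  # majority answer: "0110"
```

As long as each run is right more often than wrong, the majority answer is overwhelmingly likely to be correct.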

The AQC's need to let the system settle "slowly" to an answer seems to mean that instead of the exponential speedup that "better" qubits could achieve in some cases, they get a quadratic speedup.

So the system is 100 times slower because of the multiple runs and 100 times slower because of setup times than an optimal QC system would be.

However, if the quantum computer is a million or billion times faster for a particular problem than a classical computer then Dwave will still have a performance advantage. Plus they could refine their process to reduce the time devoted to setup and the setup time will be a smaller fraction as they scale up and have bigger systems. Later versions might get better on the error run redundancy.

2009 is the timeframe for adjustments to the Dwave qubits to allow simulations that will help develop molecular nanotech.

In 2008 it will be apparent whether the speed-up in problem solving clearly develops (when they have 512 and 1024 qubit systems).

If someone else were to get one of the other approaches to quantum computers working with universal qubits, then that system would probably be superior to a Dwave system with the same number of qubits. However, other solutions are still taking far longer to build and seem likely to lag in the number of qubits. Dwave, as it refines its qubits, could close that gap during its multi-year head start. Plus, early Dwave quantum computers could be used to design better quantum computers.

David Deutsch talks about Dwave, Quantum Computers and nanotechnology

Wired magazine has an interview with David Deutsch, the father of quantum computing. I agree with almost everything he says about Dwave, quantum computers and nanotechnology. I have pulled my highlights from the interview.

On quantum computing gaining mind share with the Dwave announcement.
He replied: the field [quantum computing] doesn't need acceptability. The idea will either be valid, or not. The claim [Dwave's] will either be true, or not.

On the most important uses of quantum computers:
The most important application of quantum computing in the future is likely to be a computer simulation of quantum systems, because that's an application where we know for sure that quantum systems in general cannot be efficiently simulated on a classical computer. This is an application where the quantum computer is ideally suited.

Perhaps in the long run, as nanotechnology becomes quantum technology, that will be a very important generic application.

He also thinks that fully operational universal quantum computers will force the acceptance of the many-universes theory.

Deutsch: I think the watershed moment with quantum computer technology will be when a quantum computer -- a universal quantum computer -- exceeds about 100 to 200 qubits... What I mean here is a qubit which is capable of being in any quantum state, and is capable of undergoing any kind of entanglement with another qubit of the same technology, and all those conditions are actually necessary to make a fully fledged quantum computer... When I said you need 100 to 200, that probably means several hundred, or perhaps 1,000 or more, physical qubits.

He also said that only psychological barriers stand in the way of a lot more nanotechnology progress; we could and should be doing much more to make molecular nanotechnology happen.
Deutsch: Nanotechnology has the potential of making a huge change. But the only involvement of quantum computers is that it will make it easier to design nanotechnological devices. Apart from that I don't think it's a big technological revolution.

What it is though, philosophically, is taking a quantum world view. That is rather a revolution, but that could happen today and the only reason it has been sluggish in happening is psychological, and maybe quantum computers will help with this psychological process. That's a very indirect phenomenon.

More on Dwave from the mainstream press and the new Dwave site

Better Nanoscale membranes

This is the 1000th article posted on this site.

Three new membranes with nanoscale dimensions have been created. One is a silicon membrane that could filter proteins and be used for dialysis; another is a carbon nanotube membrane that can control water flow; a third is a ceramic nanomesh that could filter HIV from blood. Being able to sort and purify molecules will be an important part of molecular nanotechnology systems. These could be bootstrapping technologies, and they are interesting and have other potential as well.

MIT Technology Review and Physorg have articles about a new silicon membrane with nanoscale holes that can act roughly 10 times faster than current membranes used for blood dialysis, the artificial purification of blood.

A silicon wafer with 160 nanoporous silicon membranes. Each 15-nanometer-thick, 200-by-200-micrometers-square membrane is at the center of the 160 squares patterned into the wafer. Credit: University of Rochester

Current molecular-level filters use a polymer-based design that is a jumble of varying holes and tunnels. The sizes of holes in the polymer model vary greatly, and since its "holes" are really convoluted tunnels through the material, they require much more time for proteins to pass through, and they are prone to clogging.

The new membrane is 15 nanometers thick, so it filters faster without trapping the molecules that pass through it, which is important if researchers want to retain both the larger and smaller proteins. "Once a molecule gets to the membrane, it takes one step, and it's on the back side," McGrath says.

To make the membranes, the researchers employ tools that are used to create integrated circuit chips. This should make the filters easy to integrate into silicon-based microfluidic devices that are used for protein research, where they would be useful if scientists wanted to separate a particular protein of interest from a biological fluid sample.

The researchers made the membranes by first depositing a stack of three thin layers--an amorphous silicon layer sandwiched between two silicon-dioxide layers--on a silicon wafer. Exposing the wafer to temperatures higher than 700 ºC crystallizes the amorphous silicon, and it forms pores. Then the researchers etch the wafer and silicon-dioxide layers to expose small squares of the nanoporous membrane that are 200 micrometers on each side. The temperature controls the pore diameter, allowing the researchers to fine-tune the membranes: at 715 ºC the membrane has an average pore size of 7 nanometers, while at 729 ºC the average is about 14 nanometers.

By fusing wet and dry nanotechnologies, researchers at Rensselaer Polytechnic Institute have found a way to control the flow of water through carbon nanotube membranes with an unprecedented level of precision. The research, which will be described in the March 14, 2007 issue of the journal Nano Letters, could inspire technologies designed to transform salt water into pure drinking water almost instantly, or to immediately separate a specific strand of DNA from the biological jumble.

Precise control of water transport through a nanotube membrane is demonstrated by a novel electro-chemical approach

HIV may one day be able to be filtered from human blood saving the lives of millions of people, thanks to a world-first nano-membrane innovation by Queensland University of Technology scientists. QUT scientists have developed specially designed ceramic membranes for nanofiltration, which are so advanced they have the potential to remove viruses from water, air and blood. Preliminary research had proved it successful in removing viruses from water.

Quantum hall effect observed at room temperature

The quantum Hall effect has been observed at room temperature, using strong magnetic fields and observations of graphene.

The quantum Hall effect was previously believed to only be observable at temperatures close to absolute zero (minus 459 degrees Fahrenheit). But when scientists at the National High Magnetic Field Laboratory in the U.S. and at the High Field Magnet Laboratory in the Netherlands put a recently developed new form of carbon called graphene in very high magnetic fields, scientists were surprised by what they saw.

"At room temperature, these electron waves are usually destroyed by the jiggling atoms and the quantum effects are destroyed," said Nobel Prize winner Horst Stormer, physics professor at Columbia University and one of the paper's authors. "Only on rare occasions does this shimmering quantum world survive to the temperature scale of us humans.

That opinion began to change, however, with the ability to create very high magnetic fields and with the discovery of graphene, a single atomic sheet of atoms about as strong as diamond. Together, these two things have allowed scientists to push this fragile quantum effect all the way to room temperature. Now there is a way to see curious and often surprising quantum effects, such as frictionless current flow and resistances as accurate as a few parts per billion, even at room temperature.

The room temperature quantum Hall effect was discovered independently in the two high field labs, in the 45-tesla Hybrid magnet in Tallahassee and in a 33-tesla resistive magnet in Nijmegen. Both research groups agreed that a common announcement on both sides of the Atlantic was the right thing to do.

Motorized climber for emergency crews

Ball's Atlas Powered Rope Ascender can pull a firefighter loaded down with 80 to 100 pounds of equipment up a 30-story building in 30 seconds. Trudging up the stairs weighed down with equipment that heavy can take six to eight minutes.

The Atlas works as follows: A rope is fixed to the roof of a building or other surface where a firefighter or paramedic wants to go. (The Atlas thus is designed for the second and third waves of help.) Down below, the rope is woven through a series of specially configured rollers on top of a turning spindle on the Atlas. As the battery-powered spindle rotates, it pulls the rope through the device and hoists the person.

Like a boat anchor, the Atlas exploits the capstan effect, which lets the rope grip tighter each time it wraps around a cylinder. As the grip tightens, more weight can be applied to the line. The key is that the Atlas also has a system that prevents the rope from overlapping or winding up on itself on the internal cylinder, thereby ensuring continuous movement, said Ball.
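The capstan effect has a simple closed form: the ratio of load-side to holding-side tension grows exponentially with the wrap angle, T_load / T_hold = e^(mu * theta). The friction coefficient and wrap count below are illustrative guesses, not Atlas specifications:

```python
import math

def capstan_ratio(mu, wraps):
    # Capstan equation: T_load / T_hold = e^(mu * theta)
    theta = 2 * math.pi * wraps      # total wrap angle in radians
    return math.exp(mu * theta)

# With mu = 0.3, three wraps multiply the holding force almost 300-fold
print(capstan_ratio(0.3, 3))
```

This is why a few extra wraps let a small motor hold a heavy load: grip rises exponentially, not linearly.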

The battery inside the Atlas comes from A123 Systems, a notable lithium-ion battery start-up that is working with General Motors and General Electric.

The Atlas grew out of the 2004 Soldier Design Competition at MIT. Contestants were asked to create a device that could hoist 250 pounds of weight 50 feet into the air in five seconds. The contest rules also specified that the device had to weigh less than 25 pounds, which meant it would have to deliver five horsepower.
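The five-horsepower figure follows directly from the contest requirements:

```python
# Hoisting 250 pounds 50 feet in 5 seconds
weight_lb, height_ft, time_s = 250, 50, 5
power = weight_lb * height_ft / time_s   # 2500 ft-lb/s
hp = power / 550                         # 1 horsepower = 550 ft-lb/s
print(hp)                                # ~4.5 hp, hence the ~5 hp requirement
```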

A similar device is the Powerquick Ascender

There are also wall climbing systems from Germany

February 14, 2007

Dwave quantum computer demo part 4

AP reports on the Dwave Quantum Computer

China shutting smallest and worst coal plants

Worldwatch indicates that China has a program for shutting down its smallest, oldest, most polluting and least efficient coal plants. It is a small step in the right direction.

The State Council, China's cabinet, recently endorsed a plan to accelerate closure of the nation's smaller coal-fired power plants. Coal-powered plants with capacity under 50 megawatts (MW) will be ordered to close by 2010, as will 100 MW generators that have been in operation for 20 years or more.

According to Li Junhong, a power expert in Nanjing, generators under 50,000 kilowatts consume 200 grams more coal per kilowatt-hour of electricity generated than those above 300,000 kilowatts. China’s larger “ultra-supercritical” thermal power generators, with over 1 million kilowatts of generating capacity, consume roughly 290 grams of coal per kilowatt-hour, while some smaller generators use around 1,000 grams per kilowatt-hour. The coal used to produce only 1 kilowatt-hour of electricity in small plants will generate as many as 2–3 kilowatt-hours in larger ones.
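The quoted consumption figures imply the efficiency gap directly:

```python
# Grams of coal burned per kilowatt-hour of electricity generated,
# using the figures quoted above
small_plant_g_per_kwh = 1000          # some generators under 50 MW
ultra_supercritical_g_per_kwh = 290   # plants over 1 GW
ratio = small_plant_g_per_kwh / ultra_supercritical_g_per_kwh
print(ratio)   # ~3.4: the same coal yields roughly 3x the electricity
```

This matches the claim that coal producing 1 kWh in a small plant would produce 2-3 kWh in a large one.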

Statistics also reveal that small plants emit 20 times more particulate matter and smog-forming pollutants than larger ones, and three times the sulfur dioxide. In 2006, coal burning was responsible for 90 percent of China’s sulfur dioxide discharges and 70 percent of its emissions of particulate matter and other smog-forming pollutants, according to World Watch magazine.

Dwave quantum computer demo part 3

Some more reactions and comments from blogs about the event.

Scott Aaronson has his comments and quotes Lawrence Ip, who was at the live event

Dave Bacon, the quantum pontiff, has collected some links and comments on the news

As noted in the Lawrence Ip summary, the goal and plan that Geordie Rose described is to use quick-and-dirty shortcuts to get this to work and then refine the qubits later. They hope to get a quadratic speedup for some classes of problems.

Here is some Wikipedia information on the Ising model; Dwave has implemented a solver for 2D Ising models.

Dwave quantum computer demo part 2

Photos of the event by Steve Jurvetson

Geordie Rose on stage at the Computer History Museum as data streams into the quantum computer…

Article about the event from the Register

IEEE Spectrum also has an article and reaction from Lieven Vandersypen, an associate professor at Delft University. Vandersypen has a wait-and-see attitude. Vandersypen is part of Isaac Chuang's MIT quantum computer group, who are working on NMR (nuclear magnetic resonance) quantum computers. NMR systems use a big magnet to control the magnetic properties of a liquid.

Tiny plasma particle accelerator smashes record

A metre-long plasma-powered particle accelerator can boost electrons' energy to the same degree as a conventional machine 3 kilometres long, experiments show.

Mark Hogan at the Stanford Linear Accelerator Center in California, US, is developing an alternative. Together with colleagues at SLAC and at the University of California in Los Angeles, US, Hogan has created a much more compact plasma-powered accelerator.

"Taking the beam from a standard accelerator, we've been able to double the energy [from 42 gigaelectronvolts to 84 GeV]," Hogan says.

"This is an incredible breakthrough," says Harry Weertz of the Argonne National Laboratory in Illinois, US. "Now they have to work on the details," he adds, so that plasma accelerators could be used for real experiments.

Weerts also points out that it has not been possible to pass electrons through a string of plasma accelerators, to repeatedly boost their speed. As it stands, a huge boost from plasma is a one-shot deal.

Another downside, as in all these plasma accelerators, is that the incoming electron beam loses a lot of intensity. In this case, only about 1% of the electrons in the beam made it up to the highest energy.

IBM Reveals Breakthrough eDRAM Memory Technology

IBM eDRAM test chip. IBM announced a major breakthrough in microchip design that will more than triple the amount of memory contained on a single high-end chip. With the advent of multi-core chips, memory has become an increasingly critical aspect of microprocessor performance. This prototype eDRAM (Embedded Dynamic Random Access Memory) contains over 12 million bits and high-performance logic. It will be available in IBM products beginning next year. Credit: IBM

IBM revealed a first-of-its-kind, on-chip memory technology that features the fastest access times ever recorded in eDRAM (Embedded Dynamic Random Access Memory). IBM's new microchip technology will more than triple the amount of memory stored on chips and double the performance of computer processors. It will be available in 2008.

Research on Smart cell chips

Smart cells will be reconfigurable chips with faster speeds and lower power usage.

A new class of computer chip is being developed. Huang's reconfigurable computing device, called the smart cell, will combine the advantages of ASICs and FPGAs. It will incorporate more than a thousand individual processors wired onto a silicon substrate. Each processor will be responsible for performing a single operation, such as addition or multiplication, as data flows through the chip. Using a type of parallel computing called stream processing, the chip will complete hundreds of calculations simultaneously, enabling it to perform up to 300 times faster than microprocessors and about 15 times faster than FPGAs.

As with FPGAs, the smart cells will be programmed by software, enabling their functions to be updated continually as conditions change. But since the individual processors will be optimally designed to perform specific functions, the chips will approach the power efficiency of ASICs. The architecture should scale easily, making it possible to build more powerful chips just by adding more processors.

To create the new architecture, Huang must find a way to integrate hundreds of individual processors in a single chip, something that has never been attempted before. An even more daunting task is developing a way to connect the processors to each other.

February 13, 2007

The Dwave Quantum computer demo event, part 1

Today Dwave demoed their 16 qubit quantum computer in public. I went to the event at the Computer History Museum in Mountain View. There were a few hundred people in attendance.

There were some slides that discussed markets that Dwave Systems had identified:
Operations Research $0.5 billion
Bioscience $1.2 billion
CAD/CAE (place and route) $1.3 billion
Finance & Economics

More was revealed about the timeline that Dwave has set for itself.
32 qubits by Q4 2007 with an improved I/O system
512 qubits for Q1 2008
1024 qubits for Q3 2008

The problem solution times for the system are currently dominated by setup times.
They have to configure and load the parameters into the system.
But they could run many problem runs per second.
So if the actual problem run times were microseconds, then even with setup taking 100 times longer they could still run multiple problems multiple times.
To avoid the issues of error correction, they take a probabilistic approach and run the same problem say 100 times. The most common answer is chosen.
They are able to run problems that are larger than the 16 qubits by breaking some problems up, running local solutions and then putting them together, like a 9x9 sudoku problem.

There is still some question of how much quantum behavior the system is exploiting. They have run experiments to confirm it, but the real answer will come at the end of this year or early next year, when the larger systems either do or do not demonstrate the speed advantage.

If they double the qubits every year after 2008 then they will get to 1 million qubits by 2018. They will be iteratively improving the quality of the qubits and system over time.
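The doubling arithmetic behind the 1-million-qubit projection:

```python
# Start at 1024 qubits in 2008, doubling every year
qubits, year = 1024, 2008
while qubits < 1_000_000:
    year += 1
    qubits *= 2
print(year, qubits)   # 2018 1048576
```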

February 12, 2007

Material suitable for 30 tesla magnet found

A bismuth compound has been identified as the material for the new wires needed to one day build the most powerful superconducting magnet in the world, a 30-tesla magnet.

Using MR techniques at the National High Magnetic Field Laboratory in Tallahassee, Fla., Halperin and his team studied Bi-2212, one of the "darlings" of superconductivity. To measure its properties, they put the rare isotope oxygen-17 into a crystal of Bi-2212, with the isotope acting as a probe, much like a fluorescent dye. They then determined the phase diagram of the material where superconductivity is stable, which showed high temperature and high magnetic field could not be achieved together.

How to make the bismuth compound into the appropriate wires and configure them into a magnet still has to be determined.
