Quantum Computing: Against and in Defense of D-Wave Systems

Scott Aaronson, an assistant professor in quantum computing who has published some highly public criticisms of D-Wave Systems (the adiabatic quantum computer company) in places like Scientific American, has posted a blog tirade against D-Wave Systems and against people asking him about D-Wave Systems.

It is amusing that Scott Aaronson is whining about being asked to react to every new D-Wave-related announcement when he has so vocally set himself up as Dr. Anti-D-Wave.

David Bacon, an outsider to the company but someone with a somewhat more open mind, provides a technically knowledgeable and up-to-date, half-hearted semi-defense of D-Wave Systems.

Scott: While some of your employees are authoring or coauthoring perfectly reasonable papers on various QC topics, those papers still bear essentially zero relation to your marketing hype?

Dave: I hate the term “reasonable papers.” Sorry. It sounds like the quantum computing gestapo to me. But beyond that, what hype are you talking about in press releases? Their news section has absolutely zero about their latest NIPS demo (which is apparently what set you off, Dr. Optimizer: Google working with D-Wave on a binary classifier of images). If anything, I think your beef has to be with the science journalists who are producing articles on the recent paper, or with Hartmut Neven, whose blog post on the Google research blog has more meat to argue about (the last lines are classic).

Actually, if you read the NIPS demo paper you would see that there is some interesting new stuff. In particular, you would note that they believe they have 52 of their 128 “qubits” functioning. Independent of whether this thing quantum computes or represents a viable technology, getting 52 such flux qubits to operate in a controllable manner, such that they can read out the ground state of the combinatorial problem at all, is, in my opinion, an impressive feat. The fact that they thought they would be at 128 qubits about a year ago is also a warning to me that this shit is hard. The paper also gives a nice list of the “problems” they are encountering; in particular, they acknowledge the difficulties arising from finite temperature and from parameter variability. You’d also read that their classifier doesn’t outperform the one they compare against on false positives.
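To make concrete what reading out “the ground state of the combinatorial problem” means for the classifier demo, here is a minimal sketch in the spirit of the QBoost-style formulation the NIPS paper describes: pick binary weights for a pool of weak classifiers by minimizing a quadratic binary cost (a QUBO), whose minimizer an annealer would return as its ground state. The toy data, the weak learners, and the brute-force enumeration standing in for the annealer are illustrative assumptions, not D-Wave’s code.

```python
# Minimal sketch: posing weak-classifier selection as a QUBO, in the
# spirit of the QBoost-style formulation. Toy data and a brute-force
# "annealer" are stand-ins for illustration only.
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Toy data: labels y in {-1, +1}; N weak classifiers, each ~65% accurate,
# with outputs scaled to +/- 1/N so their sum stays in [-1, 1].
S, N = 40, 8                       # samples, weak classifiers
y = rng.choice([-1, 1], size=S).astype(float)
h = np.where(rng.random((N, S)) < 0.65, y, -y) / N

lam = 0.5                          # sparsity penalty on selected classifiers

# QUBO: minimize sum_s (sum_i w_i h_i(s) - y_s)^2 + lam * sum_i w_i over
# binary weights w_i in {0, 1}. Expanding the square (and dropping the
# constant sum_s y_s^2) gives couplings Q[i, j], with the linear terms
# folded onto the diagonal since w_i^2 = w_i for binary variables.
Q = h @ h.T
np.fill_diagonal(Q, np.diag(Q) - 2 * (h @ y) + lam)

def energy(w):
    return w @ Q @ w

# A quantum annealer would return the (approximate) ground state of Q;
# with 8 variables we can simply enumerate all 2^8 candidate bitstrings.
best = min((np.array(w) for w in itertools.product([0, 1], repeat=N)),
           key=energy)
pred = np.sign(h.T @ best)
print("selected classifiers:", best)
print("training accuracy:", (pred == y).mean())
```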

Ars Technica discusses the issue:

As it happens, what D-Wave is making looks remarkably like an implementation of a computational technique called simulated annealing. While this doesn’t require a quantum computer, it is still a very good way of solving certain classes of problems.

From Google’s perspective, D-Wave has perfected a simulated annealing system that is well suited to searching images for well-defined objects, something that is too computationally expensive to be offered to the public otherwise. If these co-processors can do the task fast enough, then common searches can be pre-calculated and the results stored in databases for fast retrieval. This would then offer a vast improvement over the current implementation of Google’s image search capability.
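For readers unfamiliar with the classical technique Ars Technica refers to, here is a minimal, self-contained simulated annealing loop: it minimizes a random Ising-style cost function by flipping one spin at a time, accepting uphill moves with probability e^(-dE/T), and slowly cooling. Problem size, couplings, and schedule are all illustrative choices.

```python
# Minimal sketch of classical simulated annealing on a random Ising-style
# cost function (illustrative only; no quantum effects involved).
import math
import random

random.seed(1)
n = 20                                    # number of spins
J = [[random.uniform(-1, 1) for _ in range(n)] for _ in range(n)]

def cost(s):
    # E(s) = -sum_{i<j} J[i][j] * s_i * s_j, spins s_i in {-1, +1}
    return -sum(J[i][j] * s[i] * s[j]
                for i in range(n) for j in range(i + 1, n))

s = [random.choice([-1, 1]) for _ in range(n)]
E = cost(s)
T = 2.0                                   # initial temperature
for step in range(20000):
    i = random.randrange(n)
    s[i] = -s[i]                          # propose a single spin flip
    dE = cost(s) - E
    if dE <= 0 or random.random() < math.exp(-dE / T):
        E += dE                           # accept: downhill always, uphill sometimes
    else:
        s[i] = -s[i]                      # reject: undo the flip
    T *= 0.9995                           # cool slowly toward a frozen state
print("final energy:", round(E, 3))
```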

Quantum Annealing has been discussed here before.

A research paper reports results comparing the time needed by a quantum annealer and by classical simulated annealing to reach the same level of accuracy. They obtain times of 10 milliseconds for the quantum annealer versus 10 hours of simulated annealing time: a speed-up of 3,600,000 times, or more than six orders of magnitude.
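The quoted numbers are self-consistent: 10 hours is 36,000,000 milliseconds, so against 10 milliseconds the ratio is 3,600,000, a bit over six orders of magnitude. A few lines confirm it:

```python
# Check the quoted speed-up: 10 hours of simulated annealing versus
# 10 milliseconds on the quantum annealer.
import math

sa_ms = 10 * 60 * 60 * 1000   # 10 hours in milliseconds
qa_ms = 10                    # quantum annealer runtime in milliseconds
ratio = sa_ms / qa_ms
print(ratio)                  # 3600000.0
print(math.log10(ratio))      # ~6.56, i.e. more than six orders of magnitude
```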

Scott – I actually agree that D-Wave seems to have improved over the last couple of years in certain respects. In particular, it’s no longer that hard to imagine them stumbling into something profitable—presumably, something classical-optimization-related that had nothing to do with quantum—in spite of the multiple gaping holes in their stated business plan (which of course is my main concern).

Geordie Rose (CTO of D-Wave, from comments at Dave’s post, talking about entangled superconducting qubits) – But let’s say the following hold: (a) you make the best QM model of a multi-qubit circuit you can; (b) you use this model to predict the allowed energies of the multi-qubit system; (c) you measure the energy levels (using, e.g., spectroscopy) and you find quantitative agreement with the predictions of the QM model; (d) the eigenstates of the QM model are entangled for some realized experimental parameters; (e) the temperature is less than the gap between the ground and first excited states. This type of spectroscopy most definitely is evidence of entanglement.
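Geordie’s criteria (a)–(e) can be illustrated in a few lines: write down a small QM model of two coupled qubits, diagonalize it to get the energy levels one would compare against spectroscopy, and check whether its ground state is entangled and whether the temperature sits below the gap. The Hamiltonian form and all parameter values here are illustrative assumptions, not D-Wave’s actual circuit model.

```python
# Hedged sketch of Geordie's argument: a two-qubit QM model whose
# spectrum one would compare against spectroscopy, plus an entanglement
# check on its ground state. Parameters are illustrative, not D-Wave's.
import numpy as np

I2 = np.eye(2)
sx = np.array([[0., 1.], [1., 0.]])
sz = np.array([[1., 0.], [0., -1.]])

# (a) model: H = D1*X1 + D2*X2 + J*Z1Z2, energies in GHz (illustrative)
D1, D2, J = 1.0, 1.2, 0.8
H = D1 * np.kron(sx, I2) + D2 * np.kron(I2, sx) + J * np.kron(sz, sz)

# (b)/(c) predicted levels, to be checked against measured spectroscopy
evals, evecs = np.linalg.eigh(H)
gap = evals[1] - evals[0]

# (d) is the model's ground state entangled? The entropy of the reduced
# one-qubit state is zero for a product state, positive if entangled.
g = evecs[:, 0].reshape(2, 2)   # ground state as a 2x2 coefficient matrix
rho1 = g @ g.T                  # partial trace over qubit 2 (real amplitudes)
p = np.linalg.eigvalsh(rho1)
p = p[p > 1e-12]
entropy = float(-np.sum(p * np.log2(p)))

print("predicted levels (GHz):", np.round(evals, 3))
print("gap (GHz):", round(gap, 3), "| ground-state entanglement:",
      round(entropy, 3), "bits")
# (e) finally one needs k_B*T below the gap: e.g. T ~ 20 mK is ~0.4 GHz,
# comfortably below the ~1.5 GHz gap of this toy model.
```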

Scott Aaronson and David Bacon have a proposal for a national quantum computing initiative (3-page PDF), Version 6: December 12, 2008.

Quantum Computing and the Ultimate Limits of Computation: The Case for a National Investment

Building a quantum computer is a daunting challenge. Current progress toward this goal is best described as piecemeal. However, we are currently approaching a tipping point for quantum computers: the most promising technologies have demonstrated all the necessary building blocks of a quantum computer. A large effort to carry these technologies forward and put these blocks together is now prudent, and, we believe, likely to produce future technological spin-offs.

Recently Singapore invested over $100 million in quantum computing research. The Canadian government has contributed over $50 million to the University of Waterloo’s Institute for Quantum Computing and the Perimeter Institute for Theoretical Physics, both of which have become world leaders in quantum computing and information. European spending on quantum computing is comparable to that of the US. In short, while the US has funded quantum computing research, it has done so only at a level barely sufficient to keep up with the rest of the world. In some areas of quantum computing, for instance the theory of these computers, the US is being eclipsed by the rest of the world.

We suggest that the NSF fund several large centers, based upon existing experimental efforts around the United States, which are specialized to the major approaches being taken toward building a quantum computer. These centers should have sufficient resources to fund their own internal effort as well as to support external efforts to develop the technologies needed for quantum computation. A goal should be set for these centers of building a quantum computer which outperforms today’s classical computers at quantum simulation tasks within the next decade. We also believe it is essential that a significant NSF program be established to deal with computer security in a post-quantum-computing world. Such a program would focus on cryptography that is resistant to quantum attacks, the capabilities and limits of quantum computers, and the limits of feasible computation more generally. Finally, to support these efforts, we recommend that the NSF’s existing modest investment in the theoretical foundations of computer science be enhanced.