D-Wave Systems will commercially release a new 1,152-qubit quantum annealing system in March 2015

Dr. Colin P. Williams [CPW], Director of Business Development and Strategic Partnerships for D-Wave Systems, provided answers in an email interview with Nextbigfuture.

1. How is D-Wave doing with the 2,048-qubit system?

[CPW] D-Wave is making fantastic progress in fabricating ever-larger processors. In fact, we will be releasing our new 1,152-qubit “Washington” processor in March of this year. So we’re all very excited about that. However, size (i.e., qubit-count) is not the only aspect of the processor that has been improved. We have also lowered the noise and stretched the energy scale of the qubits (making them inherently more quantum mechanical), and we have strengthened our ability to create chains of qubits (making it easier to program the processor by locking qubits together to change the effective topology of the chip). Our initial performance tests have gone really well, and we are seeing some very exciting performance from the new processor. We are now perfecting new benchmark problems and new performance metrics that more clearly showcase the innate capabilities of the Washington system. These studies, and more, will be rolling out later in the year. So stay tuned for that.

Previously, Nextbigfuture reported that D-Wave had shown a chip with 2,048 physical qubits. The Washington processor is that same chip, but only 1,152 of its qubits will be active.
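The interview answer above mentions strengthening chains of qubits that are locked together so several physical qubits act as one logical qubit. Here is a minimal Python sketch of the underlying idea; the h/J dictionary format and the chain_qubits helper are illustrative assumptions, not D-Wave's actual programming API.

# Chaining physical qubits in an Ising problem: a strong ferromagnetic
# coupling (negative J, with energy counted as +J * s_a * s_b)
# penalizes samples in which chained qubits disagree, so the chain
# behaves as one logical qubit with a different effective topology.
def chain_qubits(J, qubits, strength=-2.0):
    for a, b in zip(qubits, qubits[1:]):
        J[(a, b)] = strength  # strongly favor s_a == s_b
    return J

h = {0: 1.0, 3: -1.0}           # biases on the logical endpoints
J = {(0, 1): 0.5}               # an ordinary problem coupling
J = chain_qubits(J, [1, 2, 3])  # physical qubits 1, 2, 3 now act as one
print(J)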

2. Any comment on Ivan Deutsch? He is at the University of New Mexico's Center for Quantum Information and Control, working on scaling up binary quantum bits into base-16 digits (qudits).

Question to Ivan Deutsch in Quanta Magazine: Is the D-Wave machine a quantum simulator?

Ivan Deutsch's answer: The D-Wave prototype is not a universal quantum computer. It is not digital, nor error-correcting, nor fault tolerant. It is a purely analog machine designed to solve a particular optimization problem. It is unclear if it qualifies as a quantum device.

[CPW] Ivan is partly correct. The D-Wave system is indeed a non-universal quantum computer designed to perform quantum annealing – a process that is naturally suited to solving a wide variety of discrete optimization and sampling problems. Non-universality doesn’t diminish our machine’s importance, though, since the classes of problems it can address have widespread commercial and scientific applicability. However, Ivan is incorrect to imply that the D-Wave machine cannot support quantum error correction. In fact, a quantum error correction method tailored to quantum annealing has been developed and demonstrated by a team of quantum computer scientists from USC (see, e.g., arXiv:1408.4382v1). Moreover, while it is true the system is not “fault-tolerant” in the sense quantum computer scientists use the term, the appeal of quantum annealing is precisely that it is inherently more resilient to errors than other approaches to quantum computing. Lastly, I think it is a bit unfair to suggest the question of whether the device is quantum is still “unclear”. On the contrary, there is now abundant evidence that the D-Wave system is a quantum computing device. In particular, it has now been shown that:

i) D-Wave’s Vesuvius processor generates significant entanglement throughout the critical stages of quantum annealing (see Lanting et al., “Entanglement in a Quantum Annealing Processor,” Phys. Rev. X 4, 021041 (2014));

ii) None of the classical models so far proposed to describe the D-Wave processor match its dynamics, whereas a quantum model describes it perfectly across a wide range of effective temperatures without any parameter fine tuning (see Albash et al. “Distinguishing Classical and Quantum Models for the D-Wave Device,” arXiv:1403.4228v3); and

iii) Not only are quantum effects present in the D-Wave processor but they play a functional role in the computations it performs (see “Computational Role of Collective Tunneling in a Quantum Annealer”, arXiv:1411.4036).

Thus, at this point, the evidence is squarely in D-Wave’s favor that the device is indeed quantum mechanical in nature, and that the quantum effects present in the chip play a functional role in the computations it performs.

3. Any comment on Canada funding the University of Waterloo-based Institute for Quantum Computing with $15 million to carry out and commercialize research into quantum technologies?

[CPW] Yes, it is gratifying to see that the Canadian Federal government has been a staunch supporter of IQC. This latest $15M is on top of some $74M provided previously. Moreover, there are reports that the total monies raised for quantum technology in the Waterloo area is an impressive $750M (see e.g., http://www.canadianbusiness.com/ceo-insider/quantum-computing-leadership-forum/ ). That’s a spectacular commitment, and it is a wonderful thing that Canadian researchers have such forward thinking benefactors. In comparison, D-Wave’s fundraising efforts have been more modest. With a total of only about $170M raised mostly from private investors, D-Wave has developed by far the world’s most sophisticated quantum computing device. That such a device can compete with conventional CPUs and GPUs that cost literally billions upon billions of dollars to develop is quite remarkable. So on a dollar for dollar basis, I think D-Wave can be very proud of what it has accomplished thus far. Moreover, our most recent infusion of new funding announced this week will allow us to accelerate development of our next generation design.

4. Do you find any other recent developments in quantum computing science noteworthy?

[CPW] Yes, I would say it is noteworthy that other groups, including Google, have recently started projects aimed at developing quantum computers based on quantum annealing. I think this validates D-Wave’s overall approach, makes our IP portfolio more valuable, and brings talented people with new ideas to the table. Eventually, I see this as creating a population of users experienced in expressing their problems in a form amenable to quantum annealing. I also expect them to come to have a greater appreciation for the technical advances D-Wave has made in this area.

From a prior D-Wave article

One of the interesting things D-Wave is playing with now is the following idea (it starts at around 22:30 of the presentation linked below). Instead of measuring the time to find the ground state of a problem with some probability, measure the difference between the ground state energy and the median energy of the samples returned, as a function of time and problem size. Doing this, we find that the median distance from the ground state scales like sqrt(|E| + N), where N is the number of qubits and |E| is the number of couplers in the instance (proportional to N for the current generation). More importantly, the scaling with time flattens out and becomes nearly constant. This is consistent with the main error mechanism being mis-specification of problem parameters in the Hamiltonian (what we call ICE, or Intrinsic Control Errors).

In other words, the first sample from the processor (i.e., in constant time) will, with high probability, be no further than O(sqrt(N)) in energy from the ground state. That’s pretty cool. A toy sketch of this metric follows.
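As a concrete illustration of that metric, here is a toy classical sketch in Python: it generates small random ring Ising instances, computes the exact ground-state energy by brute force, draws cheap samples (a single greedy sweep standing in for an anneal), and prints the median gap alongside sqrt(|E| + N). The instance generator and sampler are assumptions for illustration only, not D-Wave's hardware or actual benchmark.

import itertools
import math
import random
import statistics

def random_ring_ising(n, seed):
    # Random Ising instance on a ring: h_i and J_ij drawn from {-1, +1}.
    rng = random.Random(seed)
    h = [rng.choice([-1, 1]) for _ in range(n)]
    J = {(i, (i + 1) % n): rng.choice([-1, 1]) for i in range(n)}
    return h, J

def energy(s, h, J):
    # E(s) = sum_i h_i * s_i + sum_(a,b) J_ab * s_a * s_b
    return (sum(hi * si for hi, si in zip(h, s))
            + sum(Jab * s[a] * s[b] for (a, b), Jab in J.items()))

def greedy_sample(n, h, J, rng):
    # One cheap "anneal": random start plus a single greedy sweep.
    s = [rng.choice([-1, 1]) for _ in range(n)]
    for i in range(n):
        field = h[i]  # effective local field on spin i
        for (a, b), Jab in J.items():
            if a == i:
                field += Jab * s[b]
            elif b == i:
                field += Jab * s[a]
        s[i] = -1 if field > 0 else 1  # set spin i against its field
    return s

rng = random.Random(42)
for n in (8, 12, 16):
    h, J = random_ring_ising(n, seed=n)
    e0 = min(energy(s, h, J)
             for s in itertools.product([-1, 1], repeat=n))  # exact ground energy
    gaps = [energy(greedy_sample(n, h, J, rng), h, J) - e0 for _ in range(200)]
    print(n, statistics.median(gaps), round(math.sqrt(len(J) + n), 2))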

Nextbigfuture primer on potential impacts and what is expected from D-Wave and quantum computing systems

Nextbigfuture has 344 articles on quantum computers, about half of which cover quantum annealing and D-Wave Systems. I went to the introductory event where D-Wave announced their 16-qubit system. In 2006, I, Brian Wang, predicted that there would be a quantum computer with over 100 qubits of processing capability, sold either as a hardware system or with its use made available as a commercial service, by December 31, 2010. The $10 million sale of a 128-qubit system concluded in November 2010.

Also, to address a common misunderstanding: there are adiabatic factoring algorithms, so an adiabatic machine does not need to implement Shor’s algorithm to factor numbers. Academic sites like the Quantum Algorithm Zoo, and many papers, describe quantum algorithms.
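The idea can be sketched with a toy Python example: the factors of N become the global minimum of a cost function, C(p, q) = (N - p*q)^2, which is the kind of discrete optimization an adiabatic machine anneals. The brute-force scan below is a purely classical stand-in for the annealer, not an actual adiabatic algorithm.

# Factoring recast as optimization: the global minimum of
# cost(p, q) = (N - p*q)^2 is zero exactly at the factors.
N = 143  # = 11 * 13

def cost(p, q):
    return (N - p * q) ** 2

# A brute-force scan stands in for the annealer in this toy sketch.
best = min(((p, q) for p in range(2, N) for q in range(2, N)),
           key=lambda pq: cost(*pq))
print(best, cost(*best))  # (11, 13) 0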

Wikipedia also has an overview of quantum algorithms, and it is possible to recast most quantum algorithms in adiabatic form.

A Google search on quantum adiabatic algorithms gives a sense of the field and the sources on this topic.

Nextbigfuture recognizes that Scott Aaronson understands theoretical quantum computing. However, Scott has been wrong about D-Wave’s potential from the beginning; he does not understand business or what is useful to companies and people. D-Wave is solving protein folding problems, machine learning problems for Google, and problems for Lockheed Martin, US intelligence agencies, and other customers. Academic groups take decades to produce an elegant one-, two-, or four-qubit system. While Scott Aaronson was appearing in magazines and mainstream reports as the vocal skeptic, Nextbigfuture correctly identified the potential and the speed with which it would be realized; recall the paragraph above about calling the commercial sale years before it happened. Yet people who think they are being clever about D-Wave keep quoting Aaronson, who was repeatedly wrong.

There are dozens of ways to build quantum computing systems; qubits made from quantum dots, for example, could scale to many trillions. The early D-Wave systems could be the Wright Flyer of computing, beating the hot air balloons and blimps of the computing world.

If bigger optimization problems are solved, the answers can be hard-coded into classical systems to deliver the speedup: software would embed or look up the solved answers, much as programs refer to precomputed digits of pi. D-Wave has already built software that lets a quantum computer be accessed over the cloud. This works like a function call to another system, except the other system is a remote quantum computer. A minimal sketch of that pattern follows.
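Below is a minimal Python sketch of the remote-quantum-solver-as-a-function-call pattern, with a cache standing in for hard-coding previously solved answers into the classical system. The endpoint URL and JSON problem format are hypothetical assumptions for illustration, not D-Wave's actual cloud API.

import functools
import json
import urllib.request

SOLVER_URL = "https://quantum-solver.example.com/solve"  # hypothetical endpoint

@functools.lru_cache(maxsize=None)
def solve_remote(problem_json):
    # POST the problem to the remote solver; cached answers are reused,
    # standing in for hard-coded solutions to already-solved problems.
    req = urllib.request.Request(
        SOLVER_URL,
        data=problem_json.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")

# To a caller this is just another function call:
# answer = json.loads(solve_remote(json.dumps({"h": [1, -1], "J": {"0,1": 0.5}})))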

Venture capitalist Steve Jurvetson (his firm, Draper Fisher Jurvetson, is an investor in D-Wave Systems) describes the speedup for D-Wave Systems’ adiabatic quantum computers.

At 2,000 to 4,000 qubits, D-Wave Systems’ adiabatic quantum computer should become faster than classical computers for discrete optimization problems, and D-Wave should reach that qubit count by about 2015. The Washington system has 2,048 physical qubits with 1,152 active; the full architecting and testing needed to enable all of the qubits is non-trivial.

D-Wave is on track for ten thousand qubits by about 2017, and they would be disappointed not to reach millions of qubits by 2024.

The potential is vast, but knowing what is possible, and what must be done to advance performance, requires confirming theories against actual physics. The work is generating many scientific papers because it is breaking new scientific ground. Despite its experimental nature, there is steady progress toward applications to real-world problems.

* Google has applied the systems to improving machine learning, image recognition, and classification.
Classical systems have trouble with outlier examples in image training sets; an algorithm run on D-Wave systems eliminates the outliers, which speeds up automated machine learning.

* FedEx and some US government agencies have massive organizations that work to squeeze optimization improvements out of logistics problems or large systems of simultaneous linear equations, with many mathematicians working on exact or very good approximate solutions. Lockheed Martin is trying to determine how to use D-Wave systems to computationally validate optimizations for systems like the F-35 jet and other complex systems and projects. Smaller optimizations matter too: consider doctors looking to minimize the radiation a patient is exposed to during cancer treatment. A small clinic uses off-the-shelf software on a personal computer, and using a D-Wave system remotely can produce better results than that software. D-Wave was already ten thousand times faster than unoptimized off-the-shelf software, and classical (regular) computers needed tuned software from a research team to match D-Wave’s 500-qubit system. A cancer doctor does not have a team of academic optimization experts working on the radiation treatment plan he needs to use next week. Therefore, there is already commercial utility for D-Wave’s systems in real-world situations.

The indications are that scaling limits are not being encountered.

Every month, D-Wave reworks the chip designs at the hardware level and has a lithography partner fabricate new ones.

It is not just more qubits; physical improvements are constantly being made, as discussed in the first answer of the interview.

Cracking complex but foundational optimization problems could mean a 20-30% boost to global GDP over time from better logistics.
Improvements to machine learning could be the key to better-than-human-level AI, not just for the processing of problems but also for the automated organization of information.

Going beyond what we can do now means understanding the utility of what we do not yet know; how much the improvements actually help will determine the full potential.

More than one kind of quantum computer will prove successful and useful. D-Wave is targeting what can be built now, and what is most useful now, for the most valuable problems.

Steve Jurvetson Describes How the Scaling Works if the Early Data Extrapolates

If we suspend disbelief for a moment, and use D-Wave’s early data on processing power scaling (see below), then the very near future should be the watershed moment, where quantum computers surpass conventional computers and never look back. Moore’s Law cannot catch up. A year later, it outperforms all computers on Earth combined. Double qubits again the following year, and it outperforms the universe. What the???? you may ask… Meaning, it could solve certain problems that could not be solved by any non-quantum computer, even if the entire mass and energy of the universe was at its disposal and molded into the best possible computer.

It is a completely different way to compute — as David Deutsch posits — harnessing the refractive echoes of many trillions of parallel universes to perform a computation.

First the caveat (the text in white letters on the graph). D-Wave has not built a general-purpose quantum computer. Think of it as an application-specific processor, tuned to perform one task — solving discrete optimization problems. This happens to map to many real world applications, from finance to molecular modeling to machine learning, but it is not going to change our current personal computing tasks. In the near term, assume it will apply to scientific supercomputing tasks and commercial optimization tasks where a heuristic may suffice today, and perhaps it will be lurking in the shadows of an Internet giant’s data center improving image recognition and other forms of near-AI magic. In most cases, the quantum computer would be an accelerating coprocessor to a classical compute cluster.

Second, the assumptions. There is a lot of room for surprises in the next three years. Do they hit a scaling wall or discover a heretofore unknown fracturing of the physics… perhaps finding local entanglement, noise, or some other technical hitch that might not loom large at small scales, but grows exponentially as a problem just as the theoretical performance grows exponentially with scale. I think the risk is less likely to lie in the steady qubit march, which has held true for a decade now, but in the relationship of qubit count to performance.

Background

Quantum Annealing Correction for Random Ising Problems (arXiv:1408.4382, 17 pages) demonstrates quantum error correction tailored to quantum annealing.

Entanglement in a Quantum Annealing Processor, Physical Review X 4, 021041 (2014) (14 pages)

Computational Role of Collective Tunneling in a Quantum Annealer (arXiv:1411.4036, 35 pages)

SOURCES: D-Wave Systems, arXiv, Physical Review X, IT World Canada, Canadian Business, Wikipedia