Current and near-term quantum computers

The (noisy) 50-100 qubit quantum computer is coming soon. (NISQ = noisy intermediate-scale quantum computer).

NISQ devices cannot be simulated by brute force using the most powerful currently existing supercomputers.
NISQ will be an interesting tool for exploring physics. It might also have useful applications. But we’re not sure about that.
NISQ will not change the world by itself. Rather it is a step toward more powerful quantum technologies of the future.
Potentially transformative scalable quantum computers may still be decades away. We’re not sure how long it will take.

Qubit “quality”

The number of qubits is an important metric, but it is not the only thing that matters. The quality of the qubits, and of the “quantum gates” that process the qubits, is also very important. All quantum gates today are noisy, but some are better than others. Qubit measurements are also noisy.

For today’s best hardware (superconducting circuits or trapped ions), the probability of error per (two-qubit) gate is about 1 per 1000, and the probability of error per measurement is about 1 per 100 (or better for trapped ions). We don’t yet know whether systems with many qubits will perform that well.

Naively, we cannot execute many more than 1000 gates (and perhaps not even that many) without being overwhelmed by the noise. That estimate may be too naïve, but in any case the noise limits the computational power of NISQ technology.
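
To make the arithmetic concrete, here is a rough back-of-the-envelope sketch in Python (illustrative only, assuming independent gate errors at the ~1-per-1000 rate quoted above):

    # If each two-qubit gate fails independently with probability p,
    # a circuit of G gates runs error-free with probability ~ (1 - p)^G.
    p = 1e-3                      # assumed two-qubit gate error rate
    for G in (100, 1000, 5000):
        print(G, "gates -> success probability ~", round((1 - p) ** G, 3))
    # Roughly 0.90 at 100 gates, 0.37 at 1000 gates, 0.007 at 5000 gates:
    # the signal is washed out once the circuit size approaches ~1/p gates.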

Eventually we’ll do much better, either by using quantum error correction to improve (logical) gate accuracy (at a hefty overhead cost), or by building much more accurate physical gates, or both. But that probably won’t happen very soon.

Other important features: the time needed to execute a gate (or a measurement). E.g., the two-qubit gate time is about 40 ns for superconducting qubits versus about 100 µs for trapped ions, a significant difference. Also qubit connectivity, fabrication yield, …
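
To illustrate with the numbers above: a 1000-gate circuit would take roughly 1000 × 40 ns ≈ 40 µs on superconducting hardware versus roughly 1000 × 100 µs ≈ 0.1 s on trapped ions (ignoring measurement and other overheads).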

Quantum Speedups?

When will quantum computers solve important problems that are beyond the reach of the most powerful classical supercomputers?
We should compare with post-exascale classical hardware (over 10^18 FLOPS), e.g. as it will exist 10 years or more from now.
We should compare with the best classical algorithms for the same tasks.
Note that, for problems outside NP (e.g. typical quantum simulation tasks), validating the performance of the quantum computer may be difficult.

Even if classical supercomputers can compete, the quantum computer might have advantages, e.g. lower cost and/or lower power consumption.

Quantum optimizers

Eddie Farhi: “Try it and see if it works!”

We don’t expect a quantum computer to solve worst case instances of NP-hard problems, but it might find better approximate solutions, or find them faster.

Hybrid quantum/classical algorithms.

Combine quantum evaluation of an expectation value with a classical feedback loop for seeking a quantum state with a lower value.
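
As a minimal sketch of this loop (illustrative only; the toy Hamiltonian, the single-parameter ansatz, and the exact NumPy simulation standing in for the quantum device are all assumptions, not any particular group's algorithm):

    import numpy as np

    H = np.array([[1.0, 0.5],
                  [0.5, -1.0]])        # toy Hermitian "Hamiltonian" (assumed example)

    def expectation(theta):
        # Quantum step: prepare |psi(theta)> = Ry(theta)|0> and estimate <H>.
        # Simulated exactly here; on hardware this would be a sampled estimate.
        psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])
        return psi @ H @ psi

    # Classical feedback loop: finite-difference gradient descent on the parameter.
    theta, step, eps = 0.3, 0.2, 1e-3
    for _ in range(100):
        grad = (expectation(theta + eps) - expectation(theta - eps)) / (2 * eps)
        theta -= step * grad

    print("optimized theta:", theta, "energy:", expectation(theta))

On a real device the expectation value would be a noisy, sampled estimate, so in practice gradient-free or noise-tolerant classical optimizers are often used instead of plain gradient descent.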

Quantum approximate optimization algorithm (QAOA).

In effect, seek low-energy states of a classical spin glass.
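
In standard notation (a reminder, not specific to any particular implementation): the cost function is a classical Ising/spin-glass energy

    C(z) = Σ_{i<j} J_ij z_i z_j + Σ_i h_i z_i,   z_i = ±1,

and the level-p QAOA circuit alternates the cost Hamiltonian C with a mixing Hamiltonian B = Σ_i X_i,

    |γ, β⟩ = e^{-i β_p B} e^{-i γ_p C} ··· e^{-i β_1 B} e^{-i γ_1 C} |+⟩^⊗n,

with the 2p angles (γ, β) tuned by the classical outer loop to minimize ⟨γ, β| C |γ, β⟩.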

Variational quantum eigensolvers (VQE).

Seek low-energy states of a quantum many-body system with a local Hamiltonian H. (Much easier than algorithms that require simulation of time evolution governed by H.)
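
Concretely, in the standard VQE setup (generic notation): write the local Hamiltonian as a sum of Pauli terms, H = Σ_a c_a P_a, prepare a parametrized trial state |ψ(θ)⟩ on the device, and estimate

    E(θ) = ⟨ψ(θ)| H |ψ(θ)⟩ = Σ_a c_a ⟨ψ(θ)| P_a |ψ(θ)⟩

by measuring each Pauli term separately; the classical optimizer then updates θ to push E(θ) down. Only state preparation and measurement are needed, not simulation of e^{-iHt}.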

Classical optimization algorithms (for both classical and quantum problems) are sophisticated and well-honed after decades of hard work. Will NISQ be able to do better?

How quantum testbeds might help

Peter Shor: “You don’t need them [testbeds] to be big enough to solve useful problems, just big enough to tell whether you can solve useful problems.”

Classical examples:
Simplex method for linear programming: experiments showed it’s fast in practice long before theorists could explain why.

Metropolis algorithm: experiments showed it’s useful for solving statistical physics problems before theory established criteria for rapid convergence.

Deep learning. Mostly tinkering so far, without much theory input.

Possible quantum examples:

Quantum annealers, approximate optimizers, variational eigensolvers, … playing around may give us new ideas.

But in the NISQ era, imperfect gates will place severe limits on circuit size. In the long run, quantum error correction will be needed for scalability. In the near term, better gates might help a lot!

What can we do with, say, fewer than 100 qubits and circuit depth less than 100? We need a dialog between quantum algorithm experts and application users.

Quantum annealing

The D-Wave machine is a (very noisy) 2000-qubit quantum annealer (QA), which solves optimization problems. It might be useful. But we have no convincing theoretical argument that QAs are useful, nor have QA speedups been demonstrated experimentally.

QA is a noisy version of adiabatic quantum computing (AQC), and we believe AQC is powerful. Any problem that can be solved efficiently by noiseless quantum computers can also be solved efficiently by noiseless AQC, using a “circuit-to-Hamiltonian map.”
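
In the standard formulation (generic notation): the system starts in the easily prepared ground state of a simple Hamiltonian H_0 and is slowly interpolated to the problem Hamiltonian H_1,

    H(s) = (1 − s) H_0 + s H_1,   s: 0 → 1,

and by the adiabatic theorem, if the run time is long compared to (roughly) the inverse square of the minimum spectral gap along the path, the system ends near the ground state of H_1, which encodes the answer.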

But in contrast to the quantum circuit model, we don’t know whether noisy AQC is scalable. Furthermore, the circuit-to-Hamiltonian map has high overhead: Many more qubits are needed by the (noiseless) AQC algorithm than by the corresponding quantum circuit algorithm which solves the same problem.

Theorists are more hopeful that a QA can achieve speedups if the Hamiltonian has a “sign problem” (is “non-stoquastic”). Present-day QAs are stoquastic, but non-stoquastic versions are coming soon.
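
(Recall the standard definition: a Hamiltonian is stoquastic if all of its off-diagonal matrix elements in the computational basis are real and non-positive, ⟨x|H|y⟩ ≤ 0 for x ≠ y. Such Hamiltonians have no sign problem and are relatively amenable to classical quantum Monte Carlo methods, which is one reason non-stoquastic terms are thought to be needed for a quantum speedup.)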

Assessing the performance of QA may already be beyond the reach of classical simulation, and theoretical analysis has not made much progress. Further experimentation should clarify whether QAs actually achieve speedups relative to the best classical algorithms.

QAs can also be used for solving quantum simulation problems rather than classical optimization problems (D-Wave, unpublished).

Quantum speedups in the NISQ era and beyond

Can noisy intermediate-scale quantum computing (NISQ) surpass exascale classical hardware running the best classical algorithms?

Near-term quantum advantage for useful applications is possible, but not guaranteed.

Hybrid quantum/classical algorithms (like QAOA and VQE) can be tested.

Near-term algorithms should be designed with noise resilience in mind.

Quantum dynamics of highly entangled systems is especially hard to simulate, and is therefore an especially promising arena for quantum advantage.

Experimentation with quantum testbeds may hasten progress and inspire new algorithms.

NISQ will not change the world by itself. Realistically, the goal for near-term quantum platforms should be to pave the way for bigger payoffs using future devices.

Lower quantum gate error rates will lower the overhead cost of quantum error correction, and also extend the reach of quantum algorithms which do not use error correction.

Truly transformative quantum computing technology may need to be fault tolerant, and so may still be far off. But we don’t know for sure how long it will take. Progress toward fault-tolerant QC must continue to be a high priority for quantum technologists.

Quantum hardware: state of the art

IBM Quantum Experience in the cloud: now 16 qubits (superconducting circuit). 20 qubits by end of 2017, 50-qubit device “built and measured.”

Google 22-qubit device (superconducting circuit), 49 qubits next year.

Harvard 51-qubit quantum simulator (Rydberg atoms in optical tweezers). Dynamical phase transition in Ising-like systems; puzzles in defect (domain wall) density.

UMd 53-qubit quantum simulator (trapped ions). Dynamical phase transition in Ising-like systems; high-efficiency single-shot readout of many-body correlators.

IonQ: 32-qubit processor planned (trapped ions), with all-to-all connectivity.

Microsoft: is 2018 the year of the Majorana qubit?

And many other interesting platforms … spin qubits, defects in diamond (and other materials), photonic systems, …

There are other important metrics besides the number of qubits; in particular, the two-qubit gate error rate (currently above 1 per 1000) determines how large a quantum circuit can be executed with a reasonable signal-to-noise ratio.