Here is the Quantum Supremacy Using a Programmable Superconducting Processor Paper

Nextbigfuture has a copy of the 12-page Google-NASA paper on Quantum Supremacy.

The paper is called Quantum Supremacy Using a Programmable Superconducting Processor (2019). It is by the Google AI Quantum team and collaborators, including Eleanor G. Rieffel of NASA Ames Research Center. NOTE: This paper has not yet completed peer review. Other scientists still have to check the work performed by NASA and Google.

Jim Clarke, director of quantum hardware at Intel Labs, said:
Google’s recent update on the achievement of quantum supremacy is a notable mile marker as we continue to advance the potential of quantum computing. Achieving a commercially viable quantum computer will require advancements across a number of pillars of the technology stack.

We, along with the industry, are working to quickly advance all of those areas to realize the true potential of quantum computing. And while development is still at mile one of this marathon, we strongly believe in the potential of this technology.

Chad Rigetti, founder of the quantum computing company Rigetti Computing, said that the Google-NASA work is “profound research.”

Google NASA Research Details

The tantalizing promise of quantum computers is that certain computational tasks might be executed exponentially faster on a quantum processor than on a classical processor. A fundamental challenge is to build a high-fidelity processor capable of running quantum algorithms in an exponentially large computational space. Here, we report using a processor with programmable superconducting qubits to create quantum states on 53 qubits, occupying a state-space of 2^53 ≈ 10^16. Measurements from repeated experiments sample the corresponding probability distribution, which we verify using classical simulations. While our processor takes about 200 seconds to sample one instance of the quantum circuit 1 million times, a state-of-the-art supercomputer would require approximately 10,000 years to perform the equivalent task. This dramatic speedup relative to all known classical algorithms provides an experimental realization of quantum supremacy on a computational task and heralds the advent of a much-anticipated computing paradigm.
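
To make the sampling task concrete, here is a minimal state-vector sketch of random circuit sampling in plain NumPy. The qubit count, depth, and gate set are illustrative assumptions, not the Sycamore circuit; the point is only that a random circuit defines an output probability distribution, and the experiment samples that distribution by measurement.

```python
# Toy random-circuit sampler (illustrative only; not the Sycamore protocol).
# Simulates a small circuit exactly, then draws bitstrings from the output
# distribution. At 53 qubits this dense simulation would need ~2^53 amplitudes.
import numpy as np

rng = np.random.default_rng(0)
n_qubits = 10                      # small on purpose; 2^10 amplitudes
dim = 2 ** n_qubits
state = np.zeros(dim, dtype=complex)
state[0] = 1.0                     # start in |00...0>

def apply_single_qubit_gate(state, gate, q):
    """Apply a 2x2 unitary to qubit q of the state vector."""
    psi = state.reshape([2] * n_qubits)
    psi = np.moveaxis(psi, q, 0)
    psi = np.tensordot(gate, psi, axes=1)
    psi = np.moveaxis(psi, 0, q)
    return psi.reshape(dim)

def apply_cz(state, q1, q2):
    """Controlled-Z between qubits q1 and q2: flip the sign of |..1..1..>."""
    psi = state.reshape([2] * n_qubits).copy()
    idx = [slice(None)] * n_qubits
    idx[q1], idx[q2] = 1, 1
    psi[tuple(idx)] *= -1.0
    return psi.reshape(dim)

for layer in range(20):            # alternate random rotations and CZ layers
    for q in range(n_qubits):
        theta = rng.uniform(0, 2 * np.pi)
        gate = np.array([[np.cos(theta), -np.sin(theta)],
                         [np.sin(theta),  np.cos(theta)]], dtype=complex)
        state = apply_single_qubit_gate(state, gate, q)
    for q in range(layer % 2, n_qubits - 1, 2):
        state = apply_cz(state, q, q + 1)

probs = np.abs(state) ** 2         # Born-rule output distribution
samples = rng.choice(dim, size=1000, p=probs)  # "measure" 1000 bitstrings
```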

* They designed a quantum processor named “Sycamore” which consists of a two-dimensional array of 54 transmon qubits, where each qubit is tunably coupled to four nearest neighbors in a rectangular lattice. The connectivity was chosen to be forward compatible with error correction using the surface code. A key systems-engineering advance of this device is achieving high-fidelity single- and two-qubit operations, not just in isolation but also while performing a realistic computation with simultaneous gate operations.

Achievements of the Google-NASA AI Quantum Team

They show quantum speedup is achievable in a real-world system and is not precluded by any hidden physical laws. Quantum supremacy also heralds the era of Noisy Intermediate-Scale Quantum (NISQ) technologies.

* They made technical advances that pave the way to error correction. (NBF: error-corrected quantum computing will be huge, and it is currently believed that error correction will require roughly a million times more qubits.)
* They developed fast, high-fidelity gates that can be executed simultaneously across a two-dimensional qubit array.
* They calibrated and benchmarked the processor at both the component and system level using a powerful new tool: cross-entropy benchmarking (XEB). (A toy sketch of the XEB fidelity estimator follows this list.)
* Finally, they used component-level fidelities to accurately predict the performance of the whole system, further showing that quantum information behaves as expected when scaling to large systems.
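
For a sense of how XEB works: the paper's linear XEB fidelity compares bitstrings sampled from the processor against output probabilities computed by a classical simulation of the same circuit, via F_XEB = 2^n * mean(P_ideal(sampled bitstring)) - 1. Below is a toy sketch of that estimator; the "device" data here is synthetic.

```python
# Minimal sketch of linear cross-entropy benchmarking (XEB).
# A perfect device sampling the ideal distribution scores ~1;
# a fully depolarized device (uniform bitstrings) scores ~0.
import numpy as np

def linear_xeb_fidelity(ideal_probs, samples, n_qubits):
    """ideal_probs: length-2^n array from a classical simulation.
    samples: integer bitstrings observed on the (noisy) processor."""
    mean_p = np.mean(ideal_probs[samples])
    return (2 ** n_qubits) * mean_p - 1.0

rng = np.random.default_rng(1)
n = 12
dim = 2 ** n
probs = rng.exponential(size=dim)  # Porter-Thomas-like speckle of a random circuit
probs /= probs.sum()

good = rng.choice(dim, size=50_000, p=probs)   # "perfect" device samples
noise = rng.integers(0, dim, size=50_000)      # fully depolarized device
print(linear_xeb_fidelity(probs, good, n))     # ~1
print(linear_xeb_fidelity(probs, noise, n))    # ~0
```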

The benchmark task we demonstrate has an immediate application in generating certifiable random numbers; other initial uses for this new computational capability may include optimization, machine learning, materials science, and chemistry. However, realizing the full promise of quantum computing (e.g. Shor’s algorithm for factoring) still requires technical leaps to engineer fault-tolerant logical qubits.

The coupler design allows them to quickly tune the qubit-qubit coupling from completely off to 40 MHz. Since one qubit did not function properly, the device uses 53 qubits and 86 couplers.

The processor uses transmon qubits, which can be thought of as nonlinear superconducting resonators at 5 to 7 GHz.
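
For readers who want the standard model behind that one-line description, the textbook transmon Hamiltonian (our summary, not quoted from the paper) is:

```latex
% A transmon is a Cooper-pair box operated in the E_J >> E_C limit, where it
% behaves as a weakly anharmonic (Duffing) oscillator; the anharmonicity is
% what lets the lowest two levels serve as a qubit at a few GHz.
H = 4 E_C \hat{n}^2 - E_J \cos\hat{\varphi}
  \;\approx\; \hbar\omega_0\, \hat{a}^\dagger \hat{a}
  - \frac{E_C}{2}\, \hat{a}^\dagger \hat{a}^\dagger \hat{a} \hat{a},
\qquad \hbar\omega_0 \approx \sqrt{8 E_J E_C} - E_C .
```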

To be successfully described by a digitized error model, a system should be low in correlated errors. We achieve this in our experiment by choosing circuits that randomize and decorrelate errors, by optimizing control to minimize systematic errors and leakage, and by designing gates that operate much faster than correlated noise sources, such as 1/f flux noise [37]. Demonstrating a predictive uncorrelated error model up to a Hilbert space of size 2^53 shows that we can build a system where quantum resources, such as entanglement, are not prohibitively fragile.
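
A back-of-envelope consequence of such a digitized, uncorrelated error model is that whole-circuit fidelity is simply the product of the component fidelities. The error rates and gate counts in this sketch are illustrative assumptions, not the paper's measured values:

```python
# If errors are uncorrelated, predicted circuit fidelity is a simple product:
#     F ~ prod(1 - e_gate) * prod(1 - e_measure)
# All numbers below are made up for illustration.
e1, e2, em = 1e-3, 6e-3, 4e-2       # single-qubit, two-qubit, readout errors
n_qubits = 53
n_1q_gates = 1000                    # assumed gate totals for a deep circuit
n_2q_gates = 400

fidelity = ((1 - e1) ** n_1q_gates
            * (1 - e2) ** n_2q_gates
            * (1 - em) ** n_qubits)
print(f"predicted circuit fidelity F ~ {fidelity:.4f}")  # small but nonzero
```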

Quantum Future

Quantum processors have thus reached the regime of quantum supremacy. Google and NASA expect their computational power will continue to grow at a double exponential rate: the classical cost of simulating a quantum circuit increases exponentially with computational volume, and hardware improvements will likely follow a quantum-processor equivalent of Moore’s law, doubling this computational volume every few years. To sustain the double exponential growth rate, and to eventually offer the computational volume needed to run well-known quantum algorithms such as the Shor or Grover algorithms, the engineering of quantum error correction will have to become a focus of attention.
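
A toy illustration of why this claim is doubly exponential: if the computational volume doubles every couple of years (the Moore's-law-style assumption above) while classical simulation cost scales like 2^volume, the classical cost of keeping up compounds on top of an exponential. The numbers here are purely illustrative:

```python
# Double exponential in action: volume doubles per step, classical cost ~ 2^V.
V = 53                         # today's computational volume (roughly, qubits)
for year in range(0, 9, 2):    # assume a doubling every ~2 years
    print(f"year {year}: volume ~ {V}, classical simulation cost ~ 2^{V}")
    V *= 2
```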

The “Extended Church-Turing Thesis” formulated by Bernstein and Vazirani asserts that any “reasonable” model of computation can be efficiently simulated by a Turing machine. Our experiment suggests that a model of computation may now be available that violates this assertion.

Background on the Sampling Problem

In 2018, Adam Bouland, Bill Fefferman, Chinmay Nirkhe, and Umesh Vazirani of the University of California, Berkeley, and the Joint Center for Quantum Information and Computer Science (QuICS) at the University of Maryland/NIST analyzed the hardness of the problem that has now been solved by Google and NASA.

Arxiv – Quantum Supremacy and the Complexity of Random Circuit Sampling (2018)

Abstract
A critical milestone on the path to useful quantum computers is quantum supremacy – a demonstration of a quantum computation that is prohibitively hard for classical computers. A leading near-term candidate, put forth by the Google/UCSB team, is sampling from the probability distributions of randomly chosen quantum circuits, which we call Random Circuit Sampling (RCS). In this paper we study both the hardness and verification of RCS. While RCS was defined with experimental realization in mind, we show complexity theoretic evidence of hardness that is on par with the strongest theoretical proposals for supremacy. Specifically, we show that RCS satisfies an average-case hardness condition – computing output probabilities of typical quantum circuits is as hard as computing them in the worst-case, and therefore #P-hard. Our reduction exploits the polynomial structure in the output amplitudes of random quantum circuits, enabled by the Feynman path integral. In addition, it follows from known results that RCS satisfies an anti-concentration property, making it the first supremacy proposal with both average-case hardness and anti-concentration.
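
The “Feynman path integral” the abstract refers to is the identity that an output amplitude of a layered circuit equals a sum over all intermediate basis-state paths. A tiny brute-force sketch (toy dimensions and random unitaries of our choosing) checks that identity:

```python
# Feynman path-sum view of a circuit amplitude:
#   <y| U_d ... U_1 |x> = sum over paths (s_1..s_{d-1}) of prod_k U_k[s_k, s_{k-1}]
# Exhaustive enumeration is exponential in depth; sizes here are tiny on purpose.
import itertools
import numpy as np

rng = np.random.default_rng(2)
dim, depth = 4, 3                  # a 2-qubit toy circuit, 3 layers
layers = []
for _ in range(depth):             # Haar-ish random unitaries via QR
    g = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, _ = np.linalg.qr(g)
    layers.append(q)

x, y = 0, 3                        # input and output basis states
amp = 0.0 + 0.0j
for path in itertools.product(range(dim), repeat=depth - 1):
    states = (x, *path, y)
    term = 1.0 + 0.0j
    for k in range(depth):
        term *= layers[k][states[k + 1], states[k]]
    amp += term

direct = (layers[2] @ layers[1] @ layers[0])[y, x]
print(np.allclose(amp, direct))    # True: path sum equals matrix product
```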

SOURCES- Google, NASA, Quantum Supremacy Using a Programmable Superconducting Processor, Intel, Rigetti
Written by Brian Wang, Nextbigfuture.com
