UT Austin quantum computing professor Scott Aaronson blogged about the quantum supremacy work. Aaronson essentially confirms that updated versions of the Google quantum supremacy paper exist and that the quantum research community already knew about the Google work. Peer review is well underway and looks likely to confirm the result. The 53 qubits of the Google machine will also be able to generate truly random numbers.
Google is indeed preparing a big announcement about quantum supremacy, to coincide with the publication of its research paper in a high-profile journal. This will probably happen within a month.
Running Shor’s algorithm to break the RSA cryptosystem would require several thousand logical qubits. With known error-correction methods, that could easily translate into millions of physical qubits, and those probably of a higher quality than any that exist today.
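To make the role of those logical qubits concrete: Shor's algorithm uses the quantum computer only for period finding, and the factors then fall out classically. A minimal Python sketch, with a brute-force loop standing in for the quantum period-finding step (the base `a = 7` and modulus `N = 15` are illustrative choices, not from the source):

```python
from math import gcd

def find_period(a, n):
    # Classical brute-force stand-in for the quantum period-finding
    # step; assumes gcd(a, n) == 1 so a period exists.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_part(n, a):
    # Given a coprime base a, recover factors of n from the period of a^x mod n.
    r = find_period(a, n)
    if r % 2 == 1:
        return None  # odd period: retry with another base
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None  # trivial square root: retry with another base
    return gcd(y - 1, n), gcd(y + 1, n)

print(shor_classical_part(15, 7))  # → (3, 5)
```

The quantum hardware is only needed because `find_period` takes exponential time classically once `n` has thousands of bits, which is where the thousands of logical qubits come in.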
There are even candidate public-key cryptosystems (some based on lattices) that are believed to be impervious to quantum computing. Researchers have tried for more than 20 years to find quantum attacks against these quantum-resistant encryption systems, without success.
The Google group's 53-qubit system can be used to demonstrate Aaronson's protocol for generating certified random bits.
Aaronson’s Proposals for Next Steps
1. A programmable QC with 50-100 qubits could be used for a useful quantum simulation (maybe a condensed-matter system) much faster than any known classical method.
2. Demonstrate the use of quantum error-correction, to keep an encoded qubit alive for longer than the underlying physical qubits remain alive.
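Point 2 can be illustrated with the simplest error-correcting code, the three-bit repetition code, which recovers from a single bit-flip by majority vote. A Monte Carlo sketch in Python (an assumption for illustration: this toy handles classical bit-flips only, whereas real quantum codes such as the surface code must also correct phase errors):

```python
import random

def logical_error_rate(p, trials=100_000, seed=0):
    # Encode one logical bit as three physical bits; each flips
    # independently with probability p; decoding by majority vote
    # fails only when two or more bits flip.
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        flips = sum(rng.random() < p for _ in range(3))
        if flips >= 2:
            failures += 1
    return failures / trials

p = 0.05
# Expected logical rate is 3p^2 - 2p^3 ≈ 0.0073, well below the physical rate p.
print(p, logical_error_rate(p))
```

The encoded bit survives errors that would kill a single physical bit, which is exactly the property Aaronson wants demonstrated for an encoded qubit.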
Quantum Speedup Over Classical Is Possible
How useful will it be?
The fundamental principles of computing are ultimately grounded in physics.
Quantum mechanics upholds the Church-Turing Thesis, but violates the Extended Church-Turing Thesis.
The extended Church-Turing thesis is a foundational principle in computer science. It asserts that any "reasonable" model of computation can be efficiently simulated on a standard model such as a Turing machine, a random-access machine, or a cellular automaton. This thesis underpins complexity theory, for example by ensuring that the class P (polynomial time) is well defined. But what do we mean by "reasonable"? In this context, reasonable means "physically realizable in principle".
Can we exploit the exponential size of quantum state spaces to solve certain problems faster?
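That exponentiality is concrete: a general state of n qubits needs 2^n complex amplitudes to describe classically, so simulation memory doubles with every added qubit. A quick illustration in Python (assuming 16 bytes per double-precision complex amplitude):

```python
def statevector_bytes(n_qubits, bytes_per_amplitude=16):
    # A general n-qubit state vector has 2**n complex amplitudes.
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (10, 30, 53):
    print(n, statevector_bytes(n))
# 10 qubits fit in 16 KB, 30 qubits need ~17 GB,
# and 53 qubits (Google's machine) would need ~144 petabytes.
```

This is why a 53-qubit device already strains the largest classical supercomputers that try to simulate it directly.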
* We will learn more about physics and quantumness, improving science in both physics and chemistry.
* We will have important tools for getting better, and possibly optimal, answers to computationally hard questions that can be properly posed in mathematics and physics.
* There will be gains from quantum simulation for chemistry and drug discovery.
* There will be gains from improved aspects of machine learning.
* There will be gains from the optimization of large routing and supply chain problems like those for the US military and FedEx.
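To see why routing problems like these are hard classically, consider the exact brute-force approach to the traveling salesman problem, which checks all (n-1)! tours. A toy sketch in Python (the 4-city distance matrix is an invented example):

```python
from itertools import permutations

def shortest_tour(dist):
    # Exact TSP by brute force: fix city 0 as the start and
    # try all (n-1)! orderings of the remaining cities.
    n = len(dist)
    best_len, best_tour = float("inf"), None
    for perm in permutations(range(1, n)):
        tour = (0,) + perm + (0,)
        length = sum(dist[tour[i]][tour[i + 1]] for i in range(n))
        if length < best_len:
            best_len, best_tour = length, tour
    return best_len, best_tour

dist = [
    [0, 1, 4, 2],
    [1, 0, 1, 5],
    [4, 1, 0, 1],
    [2, 5, 1, 0],
]
print(shortest_tour(dist))  # → (5, (0, 1, 2, 3, 0))
```

The factorial blow-up makes exhaustive search hopeless past a few dozen cities. A caveat worth hedging: known quantum algorithms (e.g., Grover-style search) offer at most polynomial speedups for such NP-hard problems, so the gains here are expected to be incremental rather than exponential.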
We do not know how quickly quantum computers can be scaled up, but the expectation is that qubit counts will double every 6 to 18 months.
Very large quantum computing systems will need error correction, which will likely require 1,000 to 1,000,000 times more physical qubits than uncorrected systems. If the 6-to-18-month doubling holds, that overhead translates into an error-correction lag of 5 to 30 years.
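The 5-to-30-year figure follows directly from the doubling assumption: a 1,000x overhead is about 10 doublings (2^10 ≈ 1,024) and a 1,000,000x overhead about 20 doublings (2^20 ≈ 1,048,576). A quick check in Python:

```python
import math

def years_to_scale(overhead, months_per_doubling):
    # Doublings needed to grow qubit counts by the given overhead
    # factor, converted into calendar years.
    doublings = math.log2(overhead)
    return doublings * months_per_doubling / 12

print(years_to_scale(1_000, 6))        # ≈ 5 years (fast doubling, low overhead)
print(years_to_scale(1_000_000, 18))   # ≈ 30 years (slow doubling, high overhead)
```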