BCG estimates the following business market sizes for quantum computing applications:
* an addressable market in pharmaceuticals of up to $20 billion by 2030
* up to $7 billion from chemicals, materials science, and other materials-intensive industries
* $20 billion in search and machine learning applications by 2030
BCG estimated that physical qubits would double roughly every 24 months.
Nextbigfuture finds a faster doubling rate for noisy qubits and the possibility of useful noisy-qubit systems in 2 to 4 years instead of 10 to 12 years.
From November 2017 to March 2018, IBM announced a 50-qubit prototype, Intel a 49-qubit test chip, and Google a 72-qubit processor. These processors had error rates ranging from about 10% down to 1%. In 2017, D-Wave Systems made its 2000-qubit quantum annealing system commercially available.
Rigetti Computing has said it will have a 128-qubit chip by August 2019. A recent article indicates that the 128-qubit chip is built on a new form factor that lends itself to rapid scaling. This suggests Rigetti will be able to keep scaling at the faster end of the 7-to-16-month doubling range after the 128-qubit chip is released.
All of the competitors will also be working to reduce error rates to 1 in 1,000 or 1 in 10,000.
Hybrid Algorithms Could Vastly Improve the Usefulness of Noisy Quantum Systems
Rigetti Computing researchers believe their new hybrid algorithms can make near-term quantum computers useful for machine learning.
Noisy intermediate-scale quantum computing devices are an exciting platform for the exploration of the power of near-term quantum applications. Performing nontrivial tasks in such a framework requires a fundamentally different approach than what would be used on an error-corrected quantum computer. One such approach is to use hybrid algorithms, where problems are reduced to a parameterized quantum circuit that is often optimized in a classical feedback loop. Here we describe one such hybrid algorithm for machine learning tasks by building upon the classical algorithm known as random kitchen sinks. Our technique, called quantum kitchen sinks, uses quantum circuits to nonlinearly transform classical inputs into features that can then be used in a number of machine learning algorithms. We demonstrate the power and flexibility of this proposal by using it to solve binary classification problems for synthetic datasets as well as handwritten digits from the MNIST database. We show, in particular, that small quantum circuits provide significant performance lift over standard linear classical algorithms, reducing classification error rates from 50% to less than 0.1%, and from 4.1% to 1.4% in these two examples, respectively.
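The classical feedback loop mentioned in the abstract can be illustrated with a minimal sketch. Below, a simulated one-qubit circuit RX(theta) applied to |0> has expectation value cos(theta), and a classical gradient-descent loop tunes the circuit parameter using the parameter-shift rule. All names and numbers here are illustrative, not Rigetti's code; on real hardware the expectation would come from repeated circuit executions rather than a formula.

```python
import math

def expectation_z(theta):
    """Simulated quantum measurement: <Z> after RX(theta)|0> is cos(theta).
    On real hardware this value would be estimated from many shots."""
    return math.cos(theta)

def parameter_shift_grad(theta):
    """Parameter-shift rule: the exact gradient of <Z> with respect to
    theta, obtained from two extra circuit evaluations."""
    return 0.5 * (expectation_z(theta + math.pi / 2)
                  - expectation_z(theta - math.pi / 2))

# Classical feedback loop: gradient descent on the circuit parameter.
theta, lr = 0.1, 0.2
for _ in range(200):
    theta -= lr * parameter_shift_grad(theta)

# The loop drives <Z> toward its minimum of -1 (theta approaches pi).
print(theta, expectation_z(theta))
```

The quantum device only evaluates the circuit; all optimization logic stays on the classical side, which is what makes this style of algorithm viable on noisy near-term hardware.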
Random quantum circuits can be used to transform classical data in a highly nonlinear yet flexible manner, similar to the random kitchen sinks technique from classical machine learning. These transformations, which Rigetti calls quantum kitchen sinks, can be used to enhance classical machine learning algorithms.
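A toy version of this idea can be simulated classically. The sketch below is my own illustrative code, not Rigetti's: each "episode" draws a random linear map of a 2-D input into a rotation angle theta, a single qubit prepared with RX(theta) would measure |1> with probability sin^2(theta/2), and those probabilities (used directly here instead of sampled shots, a simplification) become features for a plain linear classifier. The nonlinear features separate concentric classes that defeat a linear model on the raw inputs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "circles" data: inner disk (class 0) vs. outer ring (class 1).
n = 200
ang = rng.uniform(0, 2 * np.pi, 2 * n)
rad = np.concatenate([rng.uniform(0.0, 0.8, n), rng.uniform(1.6, 2.3, n)])
X = np.column_stack([rad * np.cos(ang), rad * np.sin(ang)])
y = np.concatenate([np.zeros(n), np.ones(n)])

perm = rng.permutation(2 * n)
X, y = X[perm], y[perm]
Xtr, Xte, ytr, yte = X[:300], X[300:], y[:300], y[300:]

def qks_features(X, episodes=100, seed=1):
    """Quantum-kitchen-sinks-style features: each episode maps the input
    through random weights and a random phase into an angle theta; the
    qubit's |1> probability sin^2(theta / 2) is the feature value."""
    r = np.random.default_rng(seed)
    omega = r.normal(0, 2.0, (X.shape[1], episodes))  # random encoding weights
    beta = r.uniform(0, 2 * np.pi, episodes)          # random phase offsets
    theta = X @ omega + beta
    return np.sin(theta / 2) ** 2

def ridge_fit_predict(Ftr, ytr, Fte, lam=1e-3):
    """Linear readout: ridge regression to {0, 1}, threshold at 0.5."""
    A = np.column_stack([Ftr, np.ones(len(Ftr))])
    B = np.column_stack([Fte, np.ones(len(Fte))])
    w = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ ytr)
    return (B @ w > 0.5).astype(float)

linear_acc = np.mean(ridge_fit_predict(Xtr, ytr, Xte) == yte)
qks_acc = np.mean(ridge_fit_predict(qks_features(Xtr), ytr,
                                    qks_features(Xte)) == yte)
print(f"linear: {linear_acc:.2f}  kitchen-sink features: {qks_acc:.2f}")
```

The linear model stays near chance on the concentric classes, while the same linear readout on the random nonlinear features classifies them accurately; this mirrors the lift the paper reports from small quantum circuits over standard linear classifiers.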
Future Rigetti work will focus on exploring different circuit Ansätze and developing a better understanding of the performance of this technique.
Nextbigfuture’s Rough Timeline of Noisy Quantum Computers from Google, Rigetti, IBM, Intel and Others
100-150 qubit quantum computers in second half of 2018
200-300 qubit computers in first half of 2019
400-600 qubit computers late in 2019
800-1600 qubit computers in 2020
1600-4000 qubit computers in 2021
3000-10000 qubit computers in 2022
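The timeline above is roughly what a 7-to-16-month doubling rate predicts when extrapolating from a 72-qubit processor in mid-2018. The quick sanity check below uses that baseline and those dates as my own reading of the article, not an official forecast:

```python
def projected_qubits(start_qubits, months_elapsed, doubling_months):
    """Qubit count after doubling every `doubling_months` months."""
    return start_qubits * 2 ** (months_elapsed / doubling_months)

# Extrapolate from a 72-qubit chip in mid-2018 (illustrative baseline)
# to the end of each year in the timeline.
for year, months in [(2019, 18), (2020, 30), (2021, 42), (2022, 54)]:
    slow = projected_qubits(72, months, 16)  # slow end: double every 16 months
    fast = projected_qubits(72, months, 7)   # fast end: double every 7 months
    print(f"{year}: {slow:,.0f} to {fast:,.0f} qubits")
```

The fast end of the range tracks the timeline's late-2019 and 2020 entries (roughly 430 and 1,400 qubits), while the 2022 entry of 3,000 to 10,000 qubits falls between the two extrapolations.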
D-Wave Systems could get funding to convert their 5000 qubit quantum annealing system to low error rate qubits. They would try to get this working in 2020-2021 if the funding is provided.
The peak of this age of noisy quantum computers could be quantum computers with 1000 qubits and two-qubit error rates less than 1 in 1,000. This is Google’s near-term goal, which might be reached in 2020.
There could be utility in pushing to 10,000 qubits with two-qubit error rates less than 1 in 10,000. These could arrive around 2022.