Google's Hartmut Neven predicts that within 10 years there will be only quantum machine learning, with no machine learning done on classical computers

Google is working on error-corrected adiabatic (analog) quantum computer designs. These machines work less like a conventional computer and are less well understood theoretically, and they would still need a way to deal with errors. But the burden of error correction should be much smaller, so it should be much easier to demonstrate the power of a quantum computer this way.

The team used the analog quantum computing approach to program a superconducting quantum chip to simulate nine atoms interacting magnetically. That was made possible by drawing on some of the error correction techniques developed in earlier work on the harder-to-scale digital approach to quantum computing.

The chip used had nine of the basic building blocks of a quantum computer, known as qubits. It would take an analog quantum computer with 40 or more qubits to demonstrate what researchers charmingly call "quantum supremacy"—a system that can conclusively do things impossible for a conventional computer.
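As rough intuition for why roughly 40 qubits is cited as the threshold: a classical simulator must track one complex amplitude per basis state, so memory grows as 2^n. The sketch below works this out, assuming 16 bytes per amplitude (double-precision complex); the function name is illustrative, not from the source.

```python
# Memory needed to hold the full state vector of an n-qubit system
# on a classical machine: 2**n amplitudes, ~16 bytes each (complex128).
def state_vector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    return (2 ** n_qubits) * bytes_per_amplitude

print(state_vector_bytes(9) / 1024)      # 9 qubits: 8 KiB -- trivial today
print(state_vector_bytes(40) / 1024**4)  # 40 qubits: 16 TiB -- beyond ordinary machines
```

Each added qubit doubles the memory requirement, which is why the gap between 9 and 40 qubits is the gap between a laptop exercise and a task that strains even large classical clusters.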

Google says it can scale up to that point relatively quickly, and other researchers in the field say that claim is credible.
It would likely take scaling up a little further to do useful work with an analog quantum computer. If and when Google or some other company does that, the devices could be used to crack tough chemistry problems in health or energy by simulating atoms to a level of realism impossible today.

Google also believes that quantum supremacy could advance its research in machine-learning and artificial-intelligence technology, which underpins CEO Sundar Pichai’s claim that the company has entered an “AI first” era.

Hartmut Neven, who leads Google's work on figuring out what to do with quantum computers once they arrive, optimistically told Technology Review last year that the power of quantum-enhanced artificial intelligence could sweep away today's technology. "I would predict that in 10 years there's nothing but quantum machine learning—you don't do the conventional way anymore," he said.

Google expects to go from 9 error-corrected adiabatic qubits today to 40 by 2018.

SOURCES- Technology Review, Nature, Google