D-Wave Systems might enable breakthroughs in optimization, machine learning and artificial intelligence

Nextbigfuture has had nearly 100 articles following D-Wave Systems and another fifty on other quantum computers and quantum computer science. D-Wave has a 512-qubit commercial adiabatic computer system. They are close to releasing an 1152-qubit processor that has 2000 physical qubits; 1152 qubits on each chip will have been tested and certified for commercial use. Their systems sell for about $10 million. Companies are exploring whether they can make breakthroughs in optimization calculations, machine learning, deep learning and artificial intelligence with D-Wave’s processors.

EETimes has an update

D-Wave Systems uses a different model of computation than a universal computer: the adiabatic model (adiabatic: occurring without loss or gain of heat), instead of the approach taken by everyone working toward a universal quantum computer, the gate-based model, in which qubits are processed in the quantum computer in a manner similar to conventional computers.

Those working toward a universal quantum computer today are obsessed with error correction methods—using up to thousands of qubits just to ensure that the superposition of values in a quantum state (part 0 and part 1) is maintained accurately throughout all of its calculations. With the adiabatic method, Hilton claimed, you don’t need error correction because the qubits naturally relax into their lowest energy state.

“Our qubits go from excited level to a relaxed level, they don’t need error correction at this point,” Hilton told us. “But with gate-model of a universal quantum computer you need error correction to get anything to work at all.”
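The relaxation Hilton describes is an energy-minimization process: the machine is programmed with per-qubit biases and pairwise couplings, and the hardware settles into the spin configuration of lowest energy. D-Wave does this with quantum hardware; the same idea can be illustrated with a classical simulated-annealing sketch on a tiny Ising model (the bias and coupling values below are made up for illustration, and none of this is D-Wave’s actual API):

```python
import math
import random

# Tiny Ising model: spins s_i in {-1, +1}, with energy
#   E(s) = sum_i h_i * s_i  +  sum_(i,j) J_ij * s_i * s_j
# The annealer's job is to settle into the lowest-energy spin assignment.
h = [1.0, -1.0, 0.5]                 # per-spin biases (hypothetical values)
J = {(0, 1): -2.0, (1, 2): 1.5}      # pairwise couplings (hypothetical values)

def energy(spins):
    e = sum(hi * s for hi, s in zip(h, spins))
    e += sum(c * spins[i] * spins[j] for (i, j), c in J.items())
    return e

def anneal(steps=5000, t_hot=5.0, t_cold=0.01, seed=0):
    """Classical simulated annealing: accept uphill spin flips with
    probability exp(-dE/T) while the temperature T gradually cools."""
    rng = random.Random(seed)
    spins = [rng.choice((-1, 1)) for _ in h]
    e = energy(spins)
    for step in range(steps):
        t = t_hot * (t_cold / t_hot) ** (step / steps)  # geometric cooling
        i = rng.randrange(len(spins))
        spins[i] *= -1                                  # propose a flip
        e_new = energy(spins)
        if e_new <= e or rng.random() < math.exp((e - e_new) / t):
            e = e_new                                   # accept the flip
        else:
            spins[i] *= -1                              # reject: flip back
    return spins, e

def anneal_best(restarts=8):
    """Best result over a few independent annealing runs."""
    return min((anneal(seed=k) for k in range(restarts)), key=lambda r: r[1])

spins, e = anneal_best()
print(spins, e)  # the true ground state of this instance has energy -4.0
```

The point of the sketch is the absence of error correction: a spin flipped by noise simply raises the energy, and continued relaxation pushes the system back down, which is the behavior Hilton contrasts with the gate model.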

D-Wave is pioneering more than just quantum computing; it is also accumulating experience with new computing hardware paradigms, like superconductivity, that could keep Moore’s Law going.

“I know testing of the D-Wave hardware has been mixed, but I understand why large companies are investing in it anyway,” Mike Battista, senior manager and analyst of Infrastructure at Info-Tech Research Group, told EETimes. “If there is even a small chance that this is the next foundational technology that underlies computing for the next few decades, the investments will be worth it. Companies that get a head start in developing algorithms and finding problems that are amenable to quantum computing will be at a huge advantage if/when viable hardware emerges.”

D-Wave gets about 100 quantum computer chips per wafer (two shown here), which it mounts on a super-cooled mounting (middle below). (Source: D-Wave)

D-Wave has its own interface tool to create “quantum machine code” for its computers, but also has APIs/compilers for MATLAB, C++ and Python, with Fortran and Mathematica in the works. Currently it is getting stable qubits 98-to-99 percent of the time, and has automatic redundancy and recovery modes that map around bad qubits.

According to analyst Ed Maguire of CLSA (Credit Lyonnais Securities Asia, Hong Kong) researchers have been developing applications on D-Wave’s quantum computer “for protein folding, image detection, video compression, sentiment analysis and many others,” said Maguire in his report ‘2020 Draws Nearer’. “According to an interview with Lockheed’s CTO Ray Johnson in the New York Times, it could be possible to tell instantly how the millions of lines of software running a network of satellites might react to a solar burst or a pulse from a nuclear explosion–a calculation that currently would take weeks or more to determine [on a conventional computer].”

For its next generation, to be announced later in 2015, it is using ANSYS, Inc.’s engineering simulation software to further refine the magnetic vacuum that is essential to keep its qubits from failing. The new D-Wave quantum computer will have 1000 qubits (not 1024; they’ve dropped the binary progression in favor of base-10) with full reconfigurability and redundancy serving in place of the error correction used in “universal” quantum computers. The company is also working on more advanced algorithms and applications to real-world problems, such as encoding qubits to emulate neural networks so as to accelerate the new field of deep learning and similar artificial intelligence endeavors.
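Applications like the ones Maguire lists are programmed by encoding the problem as an energy function over binary variables (a QUBO) whose minimum corresponds to the answer. A minimal sketch using a toy max-cut instance (the graph, its weights and the function names are illustrative assumptions, not D-Wave’s API; exhaustive search stands in for the annealer on this tiny example):

```python
from itertools import product

# Max-cut as a QUBO: for each edge (u, v) with weight w, cutting it
# (x_u != x_v) contributes -w to the objective. Minimizing
#   E(x) = sum over edges of w * (2*x_u*x_v - x_u - x_v)
# over x in {0, 1}^n therefore maximizes the total cut weight.
edges = {(0, 1): 1.0, (1, 2): 1.0, (0, 2): 1.0, (2, 3): 2.0}  # toy graph

def qubo_energy(x):
    return sum(w * (2 * x[u] * x[v] - x[u] - x[v])
               for (u, v), w in edges.items())

# Brute force over all 2^4 assignments (an annealer would sample
# low-energy states instead of enumerating).
n = 4
best = min(product((0, 1), repeat=n), key=qubo_energy)
cut = -qubo_energy(best)  # cut weight is the negated minimum energy
print(best, cut)          # maximum cut weight for this graph is 4.0
```

The same reduction pattern, minimize an energy whose ground state encodes the solution, is how optimization, satellite-network simulation and machine-learning workloads are mapped onto annealing hardware.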

SOURCES – EETimes, D-Wave Systems, YouTube