The Washington Post interviewed Jeremy Hilton, D-Wave's vice president of processor development. Jeremy has been at the company since 2000.
Right now, we [D-Wave Systems] have a 1,000-qubit processor in our lab. D-Wave plans to release it later in 2014. The major thing that's changing, aside from some of the design details, is the scale of problem you can represent, going from a 500-variable graph to a 1,000-variable graph. The complexity grows tremendously: [it leads to an] unimaginable exponential blowup in the number of candidate solutions. Problems at that scale become that much harder for classical algorithms to solve.
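To make the "exponential blowup" concrete, here is a minimal sketch (my own illustration, not D-Wave's code) of how the candidate-solution space grows with the number of binary variables. Doubling the variable count from 500 to 1,000 does not double the search space; it squares it:

```python
def solution_space_size(num_variables: int) -> int:
    """Number of candidate assignments for num_variables binary variables.

    Each qubit in an annealer of this kind stands for one binary
    variable, so an n-variable problem has 2**n candidate solutions.
    """
    return 2 ** num_variables

# Going from 500 to 1000 variables multiplies the space by 2**500 --
# itself an astronomically large number (~3.3e150).
ratio = solution_space_size(1000) // solution_space_size(500)
print(f"500-variable space:  ~{solution_space_size(500):.3e}")
print(f"1000-variable space: ~{solution_space_size(1000):.3e}")
print(f"growth factor:       ~{ratio:.3e}")
```

This is why exhaustive classical search stops being an option long before 500 variables, let alone 1,000.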
[After that] we're planning to release a 2,000-qubit processor design. That pushes into territory where we're tackling problems that are very difficult for people to solve [with conventional methods]. Meanwhile, the rest of the quantum computing community is still working to get a handful of qubits operating at the scale they're aiming for.
D-Wave saw that between the 128-qubit and 512-qubit processors, there was a 300,000x improvement in performance. That kind of performance gain is unprecedented.
There are algorithms related to the factoring problem that can be run on D-Wave hardware, and the company has done some basic work along those lines, but factoring is simply not a particularly interesting market segment for a business. D-Wave has instead focused on problems that connect to machine learning, financial modeling, logistics and scheduling, which relate clearly to major business challenges.
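The problems named above are typically posed for this kind of hardware as QUBO (quadratic unconstrained binary optimization) instances. The following is a minimal sketch of that formulation (my own illustration, not D-Wave's API), brute-forcing a tiny 3-variable instance; the annealer's value lies in handling hundreds of variables, where enumeration is infeasible:

```python
from itertools import product

# QUBO: minimize sum over (i, j) of Q[(i, j)] * x[i] * x[j],
# where each x[i] is 0 or 1. Diagonal entries (i, i) are linear terms.
# The coefficients below are arbitrary example values.
Q = {(0, 0): -1.0, (1, 1): -1.0, (2, 2): 2.0,
     (0, 1): 2.0, (1, 2): -3.0}

def qubo_energy(x, Q):
    """Energy of binary assignment x under QUBO coefficients Q."""
    return sum(coef * x[i] * x[j] for (i, j), coef in Q.items())

# Brute force: enumerate all 2**3 assignments and keep the lowest-energy one.
best = min(product((0, 1), repeat=3), key=lambda x: qubo_energy(x, Q))
print(best, qubo_energy(best, Q))
```

Scheduling conflicts, portfolio constraints, and machine-learning training objectives can all be encoded as penalty terms in such a Q matrix, which is what makes this one formulation cover such different-looking business problems.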
Google and Lockheed Martin have purchased D-Wave systems. D-Wave has received about $130 million in investment, much of it from Draper Fisher Jurvetson (a major venture capital firm), along with some from the CIA's investment arm.
[Figure: a scaling projection of a particular problem, extrapolated from lower qubit counts]