With Quantum SpeedUp Proven Will There Be Quantum Manhattan Projects?

Google will likely announce that quantum speedup and quantum supremacy have been demonstrated using the random circuit sampling problem. It is pretty clear that physics allows quantum speedup.

Will there be Quantum Manhattan Projects?

Phase 1a – Scale existing superconducting technology to roughly 1,000 non-error-corrected qubits.
Phase 1b – Develop better qubits and couplers, with lower error rates and improved speed, for 10,000+ non-error-corrected qubits.
Phase 2 – Get to error-corrected qubits in the million-to-trillion qubit range as fast as possible.

State of Quantum Computing

There are already hundreds of millions of dollars going into quantum computing projects.

China has talked about putting tens of billions of dollars into quantum technology. This includes quantum radar, not just quantum computing.

Current competitors like Intel, Google, IBM, Facebook and Microsoft are each spending tens of millions of dollars on quantum computing projects.

Will tens of billions of dollars get committed for focused quantum computing projects?

Billions would be needed for more basic research and for developing more options to determine the best ways to scale the technology. However, superconducting quantum processors share much of the lithography technology of regular semiconductor manufacturing. Most quantum computer projects have used older lithography equipment.

arXiv – Superconducting Qubits: Current State of Play (2019)

Using current techniques – despite the challenges outlined below – it seems possible to scale to on the order of ~1000 qubits. However, beyond this (rough) number, a new set of techniques will be needed. Examples include co-location of control and readout electronics inside the dilution refrigerator, as well as on-the-fly decoders for quantum error correction procedures.

Near-Term Challenges

◦ Control and high coherence in medium-scale devices: For medium- and large-scale devices, the individual qubit coherences are not necessarily the same as those in simpler few-qubit devices. Maintaining high coherence and high-fidelity control across a large chip is a key challenge.
◦ Scalable calibration techniques: Advanced software strategies are also needed to calibrate medium-to-large scale quantum processors, due to the large number of non-trivial cross-calibration terms, while simultaneously finding optimal operating parameters.
◦ Verification and validation: As the number of qubits increases, efficiently determining the fidelity of quantum operations across the entire chip using e.g. Clifford randomized benchmarking becomes infeasible, and new techniques for validation and verification will be needed. Techniques such as ‘cross entropy benchmarking’ and ‘direct benchmarking’ have recently been proposed and implemented (see the sketch after this list).
◦ Improving qubit connectivity: While impressive progress has been made in three-dimensional integration of superconducting circuits, nonplanar connectivity of high-fidelity qubits has yet to be demonstrated.
◦ Improved gate fidelity: Continued improvements to gate fidelities will be an important step towards bringing down the overhead of physical qubits needed to encode a single logical qubit, as well as for demonstrating the efficacy of NISQ algorithms.
◦ Robust & reproducible fabrication: The fabrication of medium-to-large scale superconducting circuits will need to be consistent with continued improvements to qubit coherence and 3D integration techniques.
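
To make the cross-entropy benchmarking item above concrete, here is a minimal, illustrative Python sketch of how a linear XEB fidelity is typically estimated from measured bitstrings and classically simulated ideal probabilities. The function name, qubit count, and toy distributions below are assumptions for illustration only; they are not taken from the paper or from any specific experiment.

```python
# Minimal sketch: linear cross-entropy benchmarking (XEB) fidelity estimate.
# Assumes you already have (a) bitstrings sampled from the hardware and
# (b) the ideal output probabilities from a classical simulation of the
# same random circuit. Both inputs here are illustrative placeholders.
import numpy as np

def linear_xeb_fidelity(ideal_probs, sampled_bitstrings, num_qubits):
    """F_XEB = 2^n * <p_ideal(x)> - 1, averaged over measured bitstrings x.

    Roughly 1 for a device sampling the ideal distribution of a deep random
    circuit, and roughly 0 for uniformly random (fully depolarized) output.
    """
    mean_prob = np.mean([ideal_probs[b] for b in sampled_bitstrings])
    return (2 ** num_qubits) * mean_prob - 1.0

# Toy example with 5 qubits (32 basis states).
rng = np.random.default_rng(0)
n = 5
ideal = rng.dirichlet(np.ones(2 ** n))           # stand-in for simulated circuit output
good = rng.choice(2 ** n, size=20_000, p=ideal)  # "hardware" samples that track the ideal
noise = rng.integers(0, 2 ** n, size=20_000)     # uniformly random samples
print(linear_xeb_fidelity(ideal, good, n))       # clearly positive
print(linear_xeb_fidelity(ideal, noise, n))      # close to 0
```

In practice the expensive step is the classical computation of the ideal probabilities, which is exactly what becomes intractable as the random circuits grow, and why sampling from such circuits is used as a supremacy benchmark.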

Alternative to Superconducting Qubits

The transmon qubit modality has shown tremendous progress over the last decade, but it has certain limitations.

A different strategy, which still relies on the transmon qubit modality, replaces the local flux control used in the tunable transmon qubits with local voltage control, by using superconductor-semiconductor-superconductor Josephson junctions. In such systems, a local electrostatic gate is used to tune the carrier density in the semiconductor region, resulting in a modified EJ. Such devices were first demonstrated in InAs nanowires proximitized by epitaxially-grown aluminum, forming the transmon qubit element in a cQED setup. Subsequently, improved coherence times as well as compatibility with large external magnetic fields were demonstrated. However, the need to individually place nanowires makes the path to larger devices within this scheme potentially difficult. Alternative demonstrations of such hybrid superconducting qubit systems have therefore used two-dimensional electron gases amenable to top-down fabrication, as well as graphene flakes proximitized by evaporated aluminum. The absence of local currents results in a decrease of the power that needs to be delivered onto the qubit chip, but at the cost of reintroducing some charge noise susceptibility through the gate.
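
As a rough illustration of why gate-tuning EJ moves the qubit, the sketch below uses the standard transmon approximation f01 ≈ (sqrt(8·EJ·EC) − EC)/h. The EJ and EC values are hypothetical, illustrative numbers, not parameters of any of the devices mentioned above.

```python
# Minimal sketch: how a gate-tuned Josephson energy EJ shifts a transmon-style
# qubit frequency, using the standard approximation f01 ~ sqrt(8*EJ*EC) - EC
# (energies expressed directly in GHz, i.e. E/h). The EJ values below are
# hypothetical gate-voltage settings, not measured device parameters.
import math

def transmon_f01_ghz(ej_ghz, ec_ghz=0.25):
    """Approximate 0->1 transition frequency of a transmon-like qubit, in GHz."""
    return math.sqrt(8.0 * ej_ghz * ec_ghz) - ec_ghz

# Depleting the semiconductor weak link with the gate lowers EJ,
# which pulls the qubit frequency down:
for ej in (20.0, 15.0, 10.0):  # GHz
    print(f"EJ = {ej:4.1f} GHz  ->  f01 ≈ {transmon_f01_ghz(ej):.2f} GHz")
```

The point is only the direction of the dependence: changing the gate voltage changes EJ, which changes the qubit frequency, without any local current flowing on the chip.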

Quantum Error Correction with Superconducting Qubits

While tremendous progress on coherence, gate operations, and readout fidelity has been achieved with superconducting qubits, quantum error correction (QEC) will still be needed to realize truly large-scale quantum computers. Most QEC schemes utilize some form of redundancy (typically, multiple qubits) to encode so-called logical qubits. A prescription for performing the encoding and for correcting errors in the encoding is referred to as an error correcting code. The threshold theorem then guarantees that for a QEC code, if the operational error rate on the physical qubits is below a certain value, and the code is implemented in a fault-tolerant manner, then errors can be suppressed to arbitrary precision. The two-dimensional surface code is perhaps the most promising, experimentally feasible QEC code in the near term, due to its particularly lenient error rate required to satisfy the threshold theorem (error rate ≲ 1%), and because it only requires weight-four parity measurements using nearest-neighbour coupling to four qubits.
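
To put a number on “suppressed to arbitrary precision,” the sketch below uses the commonly quoted heuristic scaling p_L ≈ A·(p/p_th)^((d+1)/2) for a distance-d surface code, together with the roughly 2d² − 1 physical qubits per logical qubit of the rotated surface code. The prefactor A = 0.1 and threshold p_th = 1% are rough, illustrative assumptions rather than results from any specific device.

```python
# Minimal sketch: heuristic logical-error suppression for the surface code,
# p_L ~ A * (p / p_th)^((d + 1) / 2) for code distance d. The prefactor and
# threshold below are rough, illustrative values only.

def logical_error_rate(p_phys, distance, p_th=1e-2, prefactor=0.1):
    """Heuristic per-round logical error rate for a distance-d surface code."""
    return prefactor * (p_phys / p_th) ** ((distance + 1) / 2)

# Rotated surface code uses roughly 2*d^2 - 1 physical qubits per logical qubit.
for d in (3, 11, 25):
    p_l = logical_error_rate(p_phys=1e-3, distance=d)  # physical error rate 0.1%
    print(f"d={d:2d}  physical qubits ≈ {2 * d * d - 1:4d}  p_L ≈ {p_l:.1e}")
```

Under these assumptions, a physical error rate of 0.1% already buys many orders of magnitude of suppression, but each logical qubit still costs hundreds to thousands of physical qubits, which is why the error-corrected machines of Phase 2 above are discussed in terms of millions of physical qubits.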

Other Papers on Next Generation Quantum Hardware

Improving Quantum Hardware: Building New Superconducting Qubits and Couplers

Journal of Superconductivity and Novel Magnetism – Superconductor Electronics: Status and Outlook

Purdue University and Microsoft – a semiconductor-superconductor combination creates a state of “topological superconductivity,” which would protect against even slight changes in a qubit’s environment that interfere with its quantum nature, a well-known problem called “decoherence.” The device is potentially scalable because of its flat “planar” surface – a platform that industry already uses in the form of silicon wafers for building classical microprocessors.

arXiv – The electronic interface for quantum processors

A cryogenic electronic interface appears to be the viable solution for enabling large-scale quantum computers able to address world-changing computational problems.

10 thoughts on “With Quantum SpeedUp Proven Will There Be Quantum Manhattan Projects?”

  1. Why do qubits decohere? We make them cold to keep them coherent with their “other,” so as to keep them from doing their own thing (becoming unentangled). The reasoning used is that anything, even a bit of motion from plain heat, entering their environment will jostle those kinds of bits out of the pristine environment required for entanglement to happen and continue.
    OR… the bits are just ordinary bits that happen, on pure chance, to be acting in what seems to be unison. That can also be interpreted as acting in a coherent manner, until at some point they just happen to act as if going their own way, as any unentangled bit will do anyway.
    The question is whether we are interpreting those kinds of bits as being entangled, instead of as randomly acting the same as if entangled, over a long streak of getting into such a state. It is the same as when a pair of thrown dice exhibit related values over a large number of throws: they could also be interpreted as being entangled, then kept at their last displayed related values just by not allowing them to be thrown any more; they just lie there, like bits in a very cold environment, and we point at them and say, “entangled dice!”
    Which is why entangled bits do decohere when one or both are acted on by heat: they act like those still dice getting thrown just one more time, breaking what was, up to that point, a long and merely “lucky” streak of related values. By that logic, dice could also be used to make a dice-based quantum computer.

  2. Wouldn’t speeding up such progress also speed up the inexorable march of the makeup of past winners diverging from those of future winners?

  3. As far as I know, the headline’s underlying story was a bit of a kludge for what constitutes “quantum supremacy.” Basically, the act of measuring the qubits is itself a bit of a test of an operation, so the machine Google built, which simply returns the state of the qubits, is difficult to simulate outside of a quantum machine. No actual work or problem has been solved here other than “read memory.” The present fidelity of the 53-qubit machine is 0.2%, so it is 99.8% likely you will get an unentangled and invalid result.

    Hardly at a place where a Manhattan Project is needed. Invent a qubit that can remain entangled for a non-trivial amount of time (say 10 minutes) and maybe… just maybe, you’ll have enough quantum compute there to compete against a larger mainframe on an actual simulatable problem. Any lab ought to be able to build a 2-qubit system with something exceeding 0.2% fidelity of state.

  4. I don’t believe that a free market always yields the best results.

    A famous example is the 2011 Samsung vs Apple patent war over the shape of a mobile phone, which resulted in a hundred million dollars paid to Apple by Samsung. Whether that was a free market is another topic.

    I will put forward here again a simple idea: the only thing government must do is ensure that no one uses intellectual property without paying its authors a fair share of its cost, and that the authors do not exaggerate those costs. A consumer’s fair share of IP costs is the ratio of the quantity of products made using that IP for that consumer to the quantity of all products made using that IP (for everyone). And I was wrong before: interest (15-20% annually) must be included in the R&D costs.

  5. Most of the quantum computing projects around the world are sponsored by governments, including in the U.S. Is that a free market? I don’t know.
    But there are a lot of private companies working in quantum computing in China for profit, like Alibaba, Baidu and others, so don’t let complacency cloud your mind, because even state companies in China have profit incentives.

  6. First of all, we need to find out if it makes economic sense to have one. China is investing billions in quantum computing, but our companies that work in a free market environment are beating China in this area because there is a clear economic drive to move in this direction. This is a good example of a free market working. And no, I don’t believe that a free market always yields the best results, but I don’t ignore the times that it does. Maybe people who come from a big communist state will always look for a reason to start the next Manhattan Project.

  7. To get a Manhattan Project, you need a world war to drive it.

    So we certainly hope there won’t be any Manhattan projects.

    An Apollo project is a different matter.
