With Quantum Speedup Proven, Will There Be Quantum Manhattan Projects?

Google will likely announce that quantum speedup and quantum supremacy have been demonstrated using the random circuit sampling problem. It is pretty clear that physics allows quantum speedup.
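The random circuit sampling task can be illustrated with a toy brute-force statevector simulation. The sketch below is purely illustrative (4 qubits, a simplified gate parameterization, pure Python) and is not Google's actual circuit family; the point is that a classical simulator must track all 2^n amplitudes, which is what makes sampling from large random circuits classically hard.

```python
import cmath, math, random

random.seed(0)

def apply_1q(state, gate, q):
    """Apply a 2x2 gate to qubit q of a dense statevector."""
    new = state[:]
    step = 1 << q
    for i in range(len(state)):
        if not i & step:
            j = i | step
            a0, a1 = state[i], state[j]
            new[i] = gate[0][0] * a0 + gate[0][1] * a1
            new[j] = gate[1][0] * a0 + gate[1][1] * a1
    return new

def apply_cz(state, q1, q2):
    """Controlled-Z: flip the sign where both qubits are 1."""
    m1, m2 = 1 << q1, 1 << q2
    return [-a if (i & m1 and i & m2) else a for i, a in enumerate(state)]

def random_1q_gate():
    """A random single-qubit rotation (simplified parameterization)."""
    th, ph, lm = (random.uniform(0, 2 * math.pi) for _ in range(3))
    return [[math.cos(th / 2), -cmath.exp(1j * lm) * math.sin(th / 2)],
            [cmath.exp(1j * ph) * math.sin(th / 2),
             cmath.exp(1j * (ph + lm)) * math.cos(th / 2)]]

n, depth = 4, 8
state = [0j] * (1 << n)
state[0] = 1 + 0j                    # start in |0000>
for _ in range(depth):
    for q in range(n):               # random single-qubit layer
        state = apply_1q(state, random_1q_gate(), q)
    for q in range(0, n - 1, 2):     # entangling CZ layer
        state = apply_cz(state, q, q + 1)

probs = [abs(a) ** 2 for a in state]  # Born-rule output distribution
samples = random.choices(range(1 << n), weights=probs, k=5)
```

At 4 qubits this is trivial; at 53 qubits the state has about 9 × 10^15 amplitudes, which is the regime where the quantum device pulls ahead.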

Will there be Quantum Manhattan Projects?

Phase 1a – Scale existing superconducting technology to 1,000 or so non-error-corrected qubits.
Phase 1b – Try to get better qubits and couplers for lower error rates and improved speed, scaling to 10,000+ non-error-corrected qubits.
Phase 2 – Get as fast as possible to error-corrected qubits in the million to trillion qubit range.

State of Quantum Computing

There are already hundreds of millions of dollars going into quantum computing projects.

China has talked about putting tens of billions of dollars into quantum technology. This includes quantum radar, not just quantum computing.

Current competitors like Intel, Google, IBM, Facebook and Microsoft are each spending tens of millions of dollars on quantum computing projects.

Will tens of billions of dollars get committed for focused quantum computing projects?

Billions would be needed for more basic research and development of more options to determine the best ways to scale the technology. However, superconducting quantum processors share much of the lithography technology of regular semiconductor manufacturing. Most quantum computer projects have used older lithography equipment.

Arxiv – Superconducting Qubits: Current State of Play (2019)

Using current techniques – despite the challenges outlined below – it seems possible to scale to on the order of ∼1000 qubits. However, beyond this (rough) number, a new set of techniques will be needed. Examples include co-locating control and readout electronics inside the dilution refrigerator, as well as on-the-fly decoders for quantum error correction procedures.
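The "on-the-fly decoders" mentioned above must map measured parity checks (syndromes) to corrections in real time. Real surface-code decoders such as minimum-weight perfect matching are far more involved; the minimal sketch below uses a 3-qubit repetition code with majority voting just to show the syndrome-then-decode pattern (all values illustrative).

```python
def syndrome(bits):
    """Parity checks between neighboring data qubits of a repetition code."""
    return [bits[i] ^ bits[i + 1] for i in range(len(bits) - 1)]

def decode_majority(bits):
    """Majority vote: corrects any single bit-flip on the encoded block."""
    return int(sum(bits) > len(bits) // 2)

encoded = [1, 1, 1]   # logical 1 in a 3-qubit repetition code
noisy = [1, 0, 1]     # a single bit-flip error on the middle qubit
checks = syndrome(noisy)          # [1, 1] -> both checks fire: middle qubit flipped
logical = decode_majority(noisy)  # 1 -> logical value recovered
```

The hard part at scale is doing this for millions of syndrome bits per second with cryogenic-compatible latency, which is why decoder hardware is listed as a post-1000-qubit requirement.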

Near-Term Challenges

◦ Control and high coherence in medium-scale devices: For medium- and large-scale devices, the individual qubit coherences are not necessarily the same as those in simpler few-qubit devices. Maintaining high coherence and high-fidelity control across a large chip is a key challenge.
◦ Scalable calibration techniques: Advanced software strategies are also needed to calibrate medium-to-large-scale quantum processors, due to the large number of non-trivial cross-calibration terms and the need to find simultaneously optimal operating parameters.
◦ Verification and validation: As the number of qubits increases, efficiently determining the fidelity of quantum operations across the entire chip using e.g. Clifford randomized benchmarking becomes infeasible and new techniques for validation and verification will be needed. Techniques such as ‘cross entropy benchmarking’ and ‘direct benchmarking’ have recently been proposed and implemented.
◦ Improving qubit connectivity: While impressive progress has been made in three-dimensional integration of superconducting circuits, nonplanar connectivity of high-fidelity qubits has yet to be demonstrated.
◦ Improved gate fidelity: Continued improvements to gate fidelities will be an important step towards bringing down the overhead of physical qubits needed to encode a single logical qubit as well as important for demonstrating the efficacy of NISQ algorithms.
◦ Robust & reproducible fabrication: The fabrication of medium-to-large scale superconducting circuits will need to be consistent with continued improvements to qubit coherence and 3D integration techniques.
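Linear cross-entropy benchmarking (XEB), cited in the verification bullet above, scores a device by how often it outputs bitstrings to which the ideal circuit assigns high probability. A minimal sketch, assuming the standard linear XEB estimator F = 2^n · ⟨p_ideal(x)⟩ − 1 and a synthetic Porter-Thomas-like ideal distribution (in a real experiment, the ideal probabilities come from classically simulating the actual circuit):

```python
import random

def linear_xeb(n, ideal_probs, samples):
    """F_XEB = 2^n * (mean ideal probability of observed bitstrings) - 1."""
    d = 1 << n
    return d * sum(ideal_probs[s] for s in samples) / len(samples) - 1

random.seed(1)
n = 10
d = 1 << n
# Synthetic "ideal" distribution with Porter-Thomas-like (exponential) weights.
raw = [random.expovariate(1.0) for _ in range(d)]
total = sum(raw)
ideal = [w / total for w in raw]

perfect = random.choices(range(d), weights=ideal, k=20000)  # ideal device
depolarized = [random.randrange(d) for _ in range(20000)]   # pure noise

f_perfect = linear_xeb(n, ideal, perfect)          # close to 1
f_depolarized = linear_xeb(n, ideal, depolarized)  # close to 0
```

A faithful device scores near 1, a fully depolarized one near 0, so the statistic doubles as a whole-chip fidelity estimate without per-gate randomized benchmarking.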

Alternative to Superconducting Qubits

The transmon qubit modality has shown tremendous progress over the last decade, but it has certain limitations.

A different strategy, which still relies on the transmon qubit modality, replaces the local flux control used in the tunable transmon qubits with local voltage control, by using superconductor-semiconductor-superconductor Josephson junctions. In such systems, a local electrostatic gate is used to tune the carrier density in the semiconductor region, resulting in a modified Josephson energy E_J. Such devices were first demonstrated in InAs nanowires proximitized by epitaxially-grown aluminum, forming the transmon qubit element in a cQED setup. Subsequently, improved coherence times as well as compatibility with large external magnetic fields were demonstrated. However, the need to individually place nanowires makes the path to larger devices within this scheme potentially difficult. Alternative demonstrations of such hybrid superconducting qubit systems have therefore used two-dimensional electron gases amenable to top-down fabrication, as well as graphene flakes proximitized by evaporated aluminum. The absence of local currents results in a decrease of the power that needs to be delivered onto the qubit chip, but at the cost of reintroducing some charge noise susceptibility through the gate.
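Gate-voltage tuning of E_J shifts the qubit transition frequency through the standard transmon approximation f01 ≈ √(8·E_J·E_C) − E_C. A quick numerical sketch; the E_J and E_C values below are illustrative orders of magnitude, not taken from any specific device:

```python
import math

def transmon_f01_ghz(ej, ec):
    """Qubit frequency from the transmon approximation; energies in GHz."""
    return math.sqrt(8 * ej * ec) - ec

ec = 0.25                      # charging energy E_C, GHz (typical order)
for ej in (10.0, 15.0, 20.0):  # gate-tuned Josephson energies, illustrative
    print(f"E_J = {ej:4.1f} GHz  ->  f01 = {transmon_f01_ghz(ej, ec):.2f} GHz")
```

Because the gate draws essentially no static current, this tuning knob avoids the on-chip dissipation of flux-bias lines, which is the power advantage the paragraph above describes.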

Quantum Error Correction with Superconducting Qubits

While there has been tremendous progress on coherence, gate operations, and readout fidelity with superconducting qubits, quantum error correction (QEC) will still be needed to realize truly large-scale quantum computers. Most QEC schemes utilize some form of redundancy (typically, multiple qubits) to encode so-called logical qubits. A prescription for performing the encoding and for correcting errors in the encoding is referred to as an error correcting code. The threshold theorem then guarantees that for a QEC code, if the operational error rate on the physical qubits is below a certain value, and the code is implemented in a fault-tolerant manner, then errors can be suppressed to arbitrary precision. The two-dimensional surface code is perhaps the most promising, experimentally feasible QEC code in the near term, due to its particularly lenient error-rate threshold (error rate ≲ 1%), and because it only requires weight-four parity measurements using nearest-neighbor coupling to four qubits.
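The physical-qubit overhead behind the "million to trillion qubit" Phase 2 target can be estimated from the surface code. Below is a rough sketch using the common scaling model p_logical ≈ 0.1·(p_phys/p_th)^((d+1)/2) and the d² data plus (d² − 1) measurement qubits of a distance-d surface code; the physical error rate and logical error target are assumptions for illustration, not measured values.

```python
def distance_needed(p_phys, p_th, p_logical_target):
    """Smallest (odd) surface-code distance d reaching the logical error
    target, using the rough model p_log ~ 0.1 * (p_phys/p_th)**((d+1)/2)."""
    d = 3
    while 0.1 * (p_phys / p_th) ** ((d + 1) / 2) > p_logical_target:
        d += 2  # surface-code distances are odd
    return d

p_phys = 5e-4      # assumed physical gate error rate
p_th = 1e-2        # surface-code threshold, roughly 1%
d = distance_needed(p_phys, p_th, 1e-12)
phys_per_logical = 2 * d * d - 1  # d^2 data qubits + (d^2 - 1) measure qubits
total = 1000 * phys_per_logical   # e.g. a 1000-logical-qubit machine
print(d, phys_per_logical, total)
```

Even with error rates well below threshold, hundreds of physical qubits back each logical qubit, which is why a useful error-corrected machine lands in the million-qubit range.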

Other Papers on Next Generation Quantum Hardware

Improving Quantum Hardware: Building New Superconducting Qubits and Couplers

Journal of Superconductivity and Novel Magnetism – Superconductor Electronics: Status and Outlook

Purdue University and Microsoft – a semiconductor-superconductor combination creates a state of “topological superconductivity,” which would protect against even slight changes in a qubit’s environment that interfere with its quantum nature, a well-known problem called “decoherence.” The device is potentially scalable because of its flat “planar” surface – a platform industry already uses in the form of silicon wafers for building classical microprocessors.

Arxiv- The electronic interface for quantum processors

A cryogenic electronic interface appears to be the most viable solution for enabling large-scale quantum computers able to address world-changing computational problems.