Quantum Volume is Not Over 9000 Yet

Honeywell has released access to its trapped-ion quantum computer with a quantum volume of 64. The previous best quantum computers, from IBM and Google, were at a quantum volume of 32. Honeywell claims to have the most powerful quantum computer now and says it will improve the machine to a quantum volume of 640,000 by 2025, while IBM has been talking about QV 100,000 by 2030.

The Japanese anime Dragon Ball Z used a Power Level concept that denotes the combat strength of a warrior. This ran through a few dozen episodes where the big enemy was Frieza. Near the beginning, it was a big deal to have a power level over 8,000 or over 9,000. The main enemy Frieza starts at over 1 million and ultimately powers up to about 120 million.

Most people have no sense of what the quantum volume number means. For most people, quantum volume is as meaningless a metric as Power Level in Dragon Ball Z.

Even for people willing to dig into quantum volume, there is a lot of confusion. It seems that algorithm and hardware improvements will let classical simulations of quantum circuits reach an equivalent QV of 4096 and possibly far beyond. This means non-quantum systems can continue to compete against quantum computers. There is no hard line where regular computers get beaten by quantum systems. There is a significant race to improve the science and algorithms in all areas.

Quantum volume measures the number of qubits, the stability of the qubits, their connectivity and several other characteristics that affect what can be solved.
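In rough terms, the benchmark works like this: find the largest n for which the machine can reliably run a randomly generated circuit that is n qubits wide and n gate layers deep, and the quantum volume is then 2^n. Below is a minimal sketch of that bookkeeping in Python; the helper run_square_circuit_successfully is a hypothetical stand-in for the full IBM protocol (random two-qubit-gate layers, heavy-output probability above 2/3, and so on), which is omitted.

```python
# Minimal sketch of the quantum volume bookkeeping (not the full IBM protocol).
# run_square_circuit_successfully() is a hypothetical stand-in for the real test:
# random two-qubit-gate layers, heavy-output probability > 2/3 with high confidence, etc.

def run_square_circuit_successfully(n_qubits: int) -> bool:
    """Placeholder: True if the device passes the QV test at width = depth = n_qubits."""
    raise NotImplementedError

def quantum_volume(max_width: int) -> int:
    largest_passing = 0
    for n in range(1, max_width + 1):
        if run_square_circuit_successfully(n):   # n qubits, n layers of gates
            largest_passing = n
    return 2 ** largest_passing                  # QV = 2^n, so 5x5 -> 32, 6x6 -> 64, 7x7 -> 128
```

That exponent is why a jump from QV 32 to QV 64 only means going from a 5×5 test circuit to a 6×6 one.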

The goal of creating quantum volume was to have something that matches up with the LINPACK benchmark for supercomputers. The LINPACK benchmark is a measure of a system's floating-point computing power: it measures how fast a classical computer solves a dense n-by-n system of linear equations Ax = b, which is a common task in engineering.
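For comparison, the classical task behind LINPACK is easy to write down. Here is a sketch of that core operation in Python with NumPy; it is just the dense solve being timed, not the official HPL benchmark code, and the problem size n is arbitrary.

```python
import numpy as np

# The core LINPACK operation: solve a dense n-by-n system Ax = b and
# count floating-point operations to estimate FLOPS.
n = 4096
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

x = np.linalg.solve(A, b)        # LU factorization plus triangular solves
flops = (2 / 3) * n ** 3         # standard flop count for a dense LU solve
print(flops, np.allclose(A @ x, b))
```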

Different researchers compared NASA’s Electra supercomputer, which is primarily powered by Intel Skylake CPUs, with ORNL’s Summit supercomputer, which is primarily powered by NVIDIA Volta GPUs. Those tens-of-petaflops classical computers simulated a 7×7 circuit, which corresponds to a quantum volume of 128.

However, a lot of work is being done to improve how well and how efficiently regular computers can simulate and solve quantum problems. Improved algorithms have simulated an 8×8 circuit, which is like achieving a quantum volume of 256.

In 2018, Alibaba and the University of Michigan published “Classical Simulation of Intermediate-Size Quantum Circuits”.

Computing a single amplitude of an 8×8 qubit circuit with depth 40 was previously beyond the reach of supercomputers. Their algorithm can compute this within 2 minutes using a small portion (≈14% of the nodes) of their cluster.

They successfully simulated quantum supremacy circuits of sizes
9×9×40 (QV 512),
10×10×35 (QV 1024),
11×11×31 (QV 2048), and
12×12×27 (QV 4096).

They did that using 131,072 processors and 1 petabyte of memory. They give evidence that noisy random circuits with realistic physical parameters may be simulated classically. This suggests that either harder circuits or error correction may be vital for achieving quantum supremacy from random circuit sampling.

Members of the same group can solve the Google quantum supremacy problem in days or even minutes. The Google team claimed the problem would take a regular supercomputer 10,000 years to solve, but the Sycamore quantum processor solved it in only 200 seconds. The Alibaba-Michigan team used a tensor-network-based classical simulation algorithm. Using a Summit-comparable cluster, they estimate that their simulator can perform this task in less than 20 days. On moderately sized instances, they reduce the runtime from years to minutes, running several times faster than Sycamore itself.
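The core trick behind those tensor-network simulators can be shown at toy scale: treat each gate as a small tensor and contract the network to get one output amplitude, instead of storing the full 2^n state vector. The sketch below does this for a two-qubit circuit with NumPy; it only illustrates the idea, not the Alibaba-Michigan algorithm, which handles thousands of tensors with carefully chosen contraction orderings.

```python
import numpy as np

# Toy tensor-network amplitude calculation: H on qubit 0, then CNOT(control=0, target=1),
# starting from |00>. Each gate is a tensor; contracting the whole network yields a single
# output amplitude without ever materializing the full state vector.
ket0 = np.array([1.0, 0.0])
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float).reshape(2, 2, 2, 2)  # [out0, out1, in0, in1]
bra1 = np.array([0.0, 1.0])  # project each output wire onto |1>

# <11| CNOT (H x I) |00>, contracted in one pass
amp = np.einsum('i,j,ai,cdaj,c,d->', ket0, ket0, H, CNOT, bra1, bra1)
print(amp)  # ~0.707, the |11> amplitude of a Bell state
```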

The Quantum Computing Report explains the problems with quantum volume:

1. The test is based entirely on a square circuit configuration, but very few quantum programs really have a square configuration. Some of the algorithms being developed for NISQ computers, such as VQE and QAOA, are wide and shallow: they use a larger number of qubits but only a few levels of gate depth. Other algorithms may use a much larger number of gate operations relative to the number of qubits. For example, Shor’s algorithm to factor a 2048-bit number uses about 4,100 logical qubits but needs 8.6 billion gate operations. (Note that this is based on logical, error-free qubits and not physical qubits.)

2. The measurement distorts how fast power is increasing. If the square circuits were solving, say, an office-space problem, the QV 64 Honeywell machine with its 6×6 circuit would be about 44% more valuable (36/25) than a QV 32 (5×5) machine, not 100% more, as the short calculation after this list shows.

3. The focus for anyone developing a quantum computer should be on how to achieve quantum advantage and solve problems better than a classical computer. Since classical computers are effectively error-free, the equivalent quantum volume for a quantum program running on a quantum simulator on a classical computer can be very high. In 2019, Google ran a quantum benchmark on the Summit supercomputer at Oak Ridge National Lab that successfully calculated the results of a 49×40 circuit. So the equivalent QV for Summit would be 2^40, or about 1.1 trillion.
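A quick back-of-the-envelope check of points 2 and 3, using only the figures already quoted above:

```python
# Point 2: circuit "area" grows much more slowly than the headline QV number.
qv64_area = 6 * 6                      # QV 64 means passing a 6x6 square circuit
qv32_area = 5 * 5                      # QV 32 means passing a 5x5 square circuit
print(qv64_area / qv32_area)           # 1.44 -> about 44% more circuit, not 2x

# Point 3: applying the square-circuit rule to Summit's 49x40 simulation.
summit_equivalent_qv = 2 ** min(49, 40)
print(summit_equivalent_qv)            # 1,099,511,627,776 -> roughly 1.1 trillion
```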

Qubits for Quantum Supremacy

Researchers analyzed what an exaflop supercomputer might be able to simulate.

They concluded that producing samples from the output distributions of the following circuits, up to constant multiplicative error, is intractable on current exaflop-class supercomputer technology:
Instantaneous Quantum Polynomial-time (IQP) circuits with 208 qubits and 500 gates,
Quantum Approximate Optimization Algorithm (QAOA) circuits with 420 qubits and 500 constraints, and
boson sampling circuits (i.e. linear optical networks) with 98 photons and 500 optical elements.

They additionally rule out simulations with constant additive error for IQP and QAOA circuits of the same size. Without the assumption of linearly increasing simulation time, they can make analogous statements for circuits with slightly fewer qubits but requiring ten thousand to ten million gates.

Researchers at the University of Chicago and Argonne National Laboratory used data compression techniques to fit a 61-qubit simulation of Grover’s quantum search algorithm on a large supercomputer with 0.4 percent error. Other quantum algorithms were also simulated with substantially more qubits and quantum gates than previous efforts.
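To see why compression is needed at all: a full state vector for n qubits holds 2^n complex amplitudes, so storage explodes quickly. The quick calculation below assumes 16 bytes per amplitude (double-precision complex numbers), which is an assumption for illustration rather than a figure from the paper.

```python
# Memory needed for an uncompressed n-qubit state vector, assuming 16 bytes per amplitude.
def state_vector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> float:
    return (2 ** n_qubits) * bytes_per_amplitude

print(state_vector_bytes(45) / 1e15)   # ~0.56 petabytes, near the limit of big machines
print(state_vector_bytes(61) / 1e18)   # ~37 exabytes, far beyond any machine, hence compression
```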

The University of Chicago has $10 million in funding from the National Science Foundation’s Expeditions in Computing program for the Enabling Practical-Scale Quantum Computing (EPiQC) project. EPiQC will explore the co-design of hardware and software to help scientists realize the potential of quantum computing more rapidly. They are developing new algorithms, software and hardware designs tailored to the key properties of quantum technologies capable of 100 to 1,000 quantum bits.

The Quantum Computing Report has a scorecard table that lists the qubit counts of all hardware implementations and software simulations.

In 2018, a classical simulation extracted a large number of measurement outcomes in a short time, achieving a 64-qubit simulation of a universal random circuit of depth 22 using a 128-node cluster, and 56- and 42-qubit circuits on a single PC. The researchers estimated that a 72-qubit circuit of depth 23 could be simulated in about 16 hours on a supercomputer.

What is a Virtual Quantum Computer?
Blackbrane has a virtual quantum computer able to simulate 128 qubits.

Simulated quantum computers – Every matrix multiplication is explicitly modeled
Emulated quantum computers – An abstraction of quantum computers with some level of implied approximation
Virtual quantum computers – The quantum circuit is transformed into an efficiently calculable data structure
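Of the three, the first category is the easiest to show concretely. Below is a minimal state-vector simulator in Python where every gate application is an explicit matrix multiplication on the stored state. It is a sketch for illustration only, not Blackbrane's method, and it would run out of memory long before 128 qubits.

```python
import numpy as np

# Minimal "simulated quantum computer": store the full 2^n state vector and apply
# every single-qubit gate as an explicit matrix multiplication on that vector.
def apply_single_qubit_gate(state, gate, target, n):
    psi = state.reshape([2] * n)                          # one tensor axis per qubit
    psi = np.tensordot(gate, psi, axes=([1], [target]))   # multiply gate into the target axis
    psi = np.moveaxis(psi, 0, target)                     # restore the original axis order
    return psi.reshape(-1)

n = 3
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                                            # start in |000>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
state = apply_single_qubit_gate(state, H, target=0, n=n)
print(np.round(state, 3))                                 # equal weight on |000> and |100>
```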

SOURCES- Quantum Computing Report, Arxiv, Honeywell
Written By Brian Wang, Nextbigfuture.com
