IonQ’s quantum computers are now powerful enough to demonstrate a state-of-the-art quantum algorithm from Goldman Sachs and QC Ware that could one day speed up Monte Carlo simulations. These simulations are key for problem solving in many industries, including finance, telecommunications, robotics, climate science, and drug discovery.
Arxiv – Low depth amplitude estimation on a trapped ion quantum computer (2021)
Amplitude estimation is a fundamental quantum algorithmic primitive that enables quantum computers to achieve quadratic speedups for a large class of statistical estimation problems, including Monte Carlo methods. The main drawback from the perspective of near-term hardware implementations is that the amplitude estimation algorithm requires very deep quantum circuits. Recent works have somewhat reduced the necessary resources for such algorithms by trading off part of the speedup for lower-depth circuits, but high-quality qubits are still needed to demonstrate them.
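The depth requirement comes from the measurement model behind amplitude estimation. A minimal sketch (illustrative only, not IonQ's implementation): the quantity to estimate is an amplitude a = sin²(θ), and applying the amplitude-amplification iterate k times before measuring yields outcome 1 with probability sin²((2k+1)θ), so deeper circuits oscillate faster in θ and carry more information per shot.

```python
import math

# Toy measurement model for amplitude estimation (a sketch, not the
# paper's circuits): after k amplification rounds, the probability of
# measuring 1 is sin^2((2k + 1) * theta).  Larger k means a deeper
# circuit but a faster-oscillating, more informative signal.

def hit_probability(theta: float, k: int) -> float:
    """Probability of measuring outcome 1 after k amplification rounds."""
    return math.sin((2 * k + 1) * theta) ** 2

theta = 0.3                      # hypothetical true angle
for k in [0, 1, 3, 7]:           # circuit depth grows with k
    print(k, round(hit_probability(theta, k), 4))
```

This is why the depth/quality tradeoff matters: the quadratic speedup comes from running at large k, which is exactly where noisy hardware struggles.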
Here, we report the results of an experimental demonstration of amplitude estimation on a state-of-the-art trapped ion quantum computer. The amplitude estimation algorithms were used to estimate the inner product of randomly chosen four-dimensional unit vectors, and were based on the maximum likelihood estimation (MLE) and the Chinese remainder theorem (CRT) techniques. Significant improvements in accuracy were observed for the MLE-based approach when deeper quantum circuits were taken into account, including circuits with more than 90 two-qubit gates and depth 60, achieving a mean additive estimation error on the order of 10^−2. The CRT-based approach provided accurate estimates for many of the data points but was less robust against noise on average. Lastly, we analyze two more amplitude estimation algorithms that take the specifics of the hardware noise into account to further improve the results.
The high fidelity of the quantum hardware allowed us to run oracle circuits with depths ranging up to seven, which translates to four-qubit circuits with more than 90 two-qubit gates and depth 60. Similar experiments on cloud-accessible quantum hardware yielded considerably worse results.
The MLE-based algorithms showed significant improvements in accuracy when higher-depth samples were taken into account, reaching errors below 0.014 at depth six, whereas the error saturates at 0.053 when only samples from the evaluation oracle are used. We also developed a more sophisticated version of maximum-likelihood amplitude estimation based on a power-law schedule.
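The multi-depth MLE idea can be sketched in a few lines (hypothetical shot counts and depths, chosen for illustration, not the paper's exact schedule): collect shots at several circuit depths k, then pick the angle θ that maximizes the joint likelihood across all depths.

```python
import math, random

# Minimal maximum-likelihood amplitude estimation sketch.  Shots at
# depth k see outcome 1 with probability sin^2((2k + 1) * theta); the
# joint likelihood over all depths pins theta down far more tightly
# than depth-0 samples alone.  Parameters here are illustrative.

def simulate_counts(theta, depths, shots, rng):
    counts = {}
    for k in depths:
        p = math.sin((2 * k + 1) * theta) ** 2
        counts[k] = sum(rng.random() < p for _ in range(shots))
    return counts

def log_likelihood(theta, counts, shots):
    ll = 0.0
    for k, ones in counts.items():
        p = min(max(math.sin((2 * k + 1) * theta) ** 2, 1e-12), 1 - 1e-12)
        ll += ones * math.log(p) + (shots - ones) * math.log(1 - p)
    return ll

rng = random.Random(7)
true_theta = 0.42
depths, shots = [0, 1, 2, 3, 4, 5, 6], 500
counts = simulate_counts(true_theta, depths, shots, rng)

# Grid search over theta in (0, pi/2) is fine for a demo.
grid = [i * (math.pi / 2) / 20000 for i in range(1, 20000)]
est = max(grid, key=lambda t: log_likelihood(t, counts, shots))
print(abs(est - true_theta))     # small estimation error
```

Including the depth-0 samples resolves the aliasing of the fast-oscillating high-depth signal, which is why the combination is so effective.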
This introduced two improvements. First, the asymptotic precision improved, since the power-law algorithm incorporates a noise model, albeit an imperfect one. Second, this noise floor is reached much faster in terms of oracle calls, since the optimal power-law schedules spend fewer shots at the costly higher depths. Note that all maximum-likelihood methods can naturally accommodate any probabilistic noise model in the definition of the likelihoods.
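Both ideas can be sketched together (the decay rate and schedule exponent below are hypothetical, for illustration only). Under a depolarizing-style noise model, the signal contracts toward 1/2 as depth grows, p_k = 1/2 − (1/2)·e^(−λ(2k+1))·cos(2(2k+1)θ), which the likelihood can absorb directly; a power-law schedule then allots fewer shots to the noisier high depths.

```python
import math

# Sketch of a noise-aware likelihood model plus a power-law shot
# schedule.  lam is a hypothetical depolarizing decay rate; with
# lam = 0 this reduces to the noiseless sin^2((2k + 1) * theta).

def noisy_hit_probability(theta, k, lam):
    m = 2 * k + 1
    return 0.5 - 0.5 * math.exp(-lam * m) * math.cos(2 * m * theta)

def power_law_shots(total, depths, beta=0.8):
    """Allot shots proportionally to (k + 1)^(-beta): fewer at high depth."""
    weights = [(k + 1) ** (-beta) for k in depths]
    norm = sum(weights)
    return [round(total * w / norm) for w in weights]

depths = [0, 1, 2, 3, 4, 5, 6]
print(power_law_shots(3500, depths))              # most shots at low depth
print(round(noisy_hit_probability(0.42, 6, 0.05), 4))
```

Since deep circuits are both the most expensive and the most noise-contracted, front-loading shots at low depth reaches the noise floor with fewer total oracle calls.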
The CRT-based algorithm is more sensitive to noise, and it was affected by the hardware noise as well as by correlated errors across experiments. It achieved a minimum mean error of 0.024 at depth three, following its design precision curve before departing from it at larger depths. A hybrid algorithm that combines small-depth MLE estimates with CRT estimates achieved a minimum mean error of 0.017, an improvement over the depth-two MLE estimates, which have an average error of 0.018. With improvements in hardware fidelity and calibration, the CRT-based and hybrid algorithms will become competitive with the MLE-based approach.
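The noise sensitivity follows from how the Chinese remainder theorem combines information. A toy illustration (not the paper's algorithm): if an unknown integer t is known modulo pairwise-coprime moduli, CRT reconstructs t exactly, but a single corrupted residue derails the whole reconstruction, which mirrors the observed fragility.

```python
# Toy Chinese remainder theorem reconstruction.  In CRT-based amplitude
# estimation, coprime circuit depths play a role analogous to the
# moduli; an error in any one residue corrupts the combined estimate.

def crt(residues, moduli):
    """Recover t mod (product of moduli) from its residues."""
    M = 1
    for m in moduli:
        M *= m
    t = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        t += r * Mi * pow(Mi, -1, m)   # modular inverse (Python 3.8+)
    return t % M

moduli = [3, 5, 7]
t_true = 52
residues = [t_true % m for m in moduli]
print(crt(residues, moduli))            # recovers 52

# A single corrupted residue throws the estimate far off:
bad = [residues[0], (residues[1] + 1) % 5, residues[2]]
print(crt(bad, moduli))                 # 73, far from 52
```

This all-or-nothing behavior is why the hybrid approach anchors the CRT estimates with robust small-depth MLE estimates.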
Note that we restricted the experiments to four qubits, because our main goal was to probe the regime where the evaluation oracle is invoked a large number of times in a noisy setting, achieving up to fifteen sequential oracle invocations while still obtaining excellent results. A next step would be to establish the tradeoffs between circuit depth and number of oracle calls in an experimental setting, as theoretically proved in another paper; this may soon become feasible with further improvements in hardware.
The quantum algorithm theorized by QC Ware and Goldman Sachs for Monte Carlo simulations has now been demonstrated in practice on the latest IonQ quantum computer. Together, the teams are designing quantum algorithms intended to let firms evaluate risk and simulate prices for a variety of financial instruments at far greater speeds than today, which, if successful, could transform the way financial markets worldwide operate.
“This is a demonstration of how the combination of insightful algorithms that reduce hardware requirements and more powerful near-term quantum computers has now made it possible to start running Monte Carlo simulations,” said Iordanis Kerenidis, Head of Quantum Algorithms – International, QC Ware. “While QC Ware has designed novel practical quantum algorithms and software for enterprise implementation, IonQ has built unique hardware with quantum gates of high enough quality to run these algorithms.”
This experiment was performed on the newest generation IonQ quantum processing unit (QPU), which features an order of magnitude better performance in terms of fidelity and greatly enhanced throughput compared to previous generations. This allows for deeper circuits with many shots to be run over a significantly shorter period of time than previously possible. The combination of these features makes it possible for the first time to run algorithms of this nature. Technical details are outlined in a recently released research paper.
“To get to useful solutions in quantum computing today, we must bring together state-of-the-art quantum hardware and best-in-class quantum algorithms,” said Peter Chapman, CEO and President of IonQ. “Most people are tracking quantum hardware progress, but they often miss that quantum software is accelerating at similarly breakneck speeds. The convergence of hardware and software will enable a quantum future sooner than most think, and our work with Goldman Sachs and QC Ware is a great example of that.”
The news follows on the heels of a number of notable developments from IonQ. The company recently announced a partnership with the University of Maryland to create the National Quantum Lab at Maryland (Q-Lab), the nation’s first user facility that enables the scientific community to pursue world-leading research through hands-on access to a commercial-grade quantum computer. IonQ also debuted two breakthroughs in quantum computing that lay the foundation for increasing qubit counts into the triple digits on a single chip. Finally, IonQ anticipates becoming the first publicly traded, pure-play quantum computing company via a merger with dMY Technology Group III (NYSE: DMYI).