IonQ Has the Most Powerful Quantum Computers With 79 Trapped Ion Qubits and 160 Stored Qubits

IonQ just presented two new trapped ion quantum computers with 160 stored and 79 processing qubits. This is more qubits than the best noisy superconducting quantum computers, the current leader being the Google 72-qubit Bristlecone processor.

The IonQ website is at https://ionq.co/

The presentation was given at the Q2B conference, which was organized by QCware.

Highlights From the Nextbigfuture Interview with Stewart Allen

* IonQ systems are at room temperature
* IonQ manipulates ions with magnets and lasers and has software control running mostly on FPGA chips
* IonQ qubits are like atomic clocks; they do not have time-limiting decoherence
* IonQ can invent and make any kind of quantum gate. It is a matter of software and tuning laser pulses.
* There are no errors from the fabrication of the qubits as the qubits are ions and not Josephson junctions.
* There are no idle errors, no readout errors and no qubit lifetime problem
* They could make 100-200 qubit modules and link them with optical interconnects.
* They can have modular scaling, which is not possible or practical with superconducting systems. Superconducting quantum systems need extreme cooling, and this cooling limits the size of the systems.
* Superconducting systems have proposed or are already using remote access, with problems submitted over the cloud.
* IonQ will have quantum systems at specialized data centers and then at regular data centers. The reason for data centers is that the quantum systems generate so much data. You get more benefit by pairing them with a lot of classical computers to process that data.
* IonQ has a path to optical networking.
* IonQ has a confident path to scaling to thousands of qubits. They believe the error rates will let them get to thousands of qubits before error correction is needed.
* Error correction requires roughly a thousand physical qubits for every useful error-corrected qubit. You would need, say, one billion physical qubits to get one million error-corrected qubits (see the quick sketch after this list).
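A minimal sketch of the arithmetic behind that last bullet, assuming the roughly 1000-to-1 physical-to-logical overhead quoted in the interview (the constant and function name are illustrative, not from IonQ):

```python
# Error-correction overhead, using the ~1000x physical-per-logical ratio quoted above.
PHYSICAL_PER_LOGICAL = 1000  # assumed overhead factor from the interview, not a derived constant

def physical_qubits_needed(logical_qubits: int) -> int:
    """Physical qubits required to realize a given number of error-corrected qubits."""
    return logical_qubits * PHYSICAL_PER_LOGICAL

print(f"{physical_qubits_needed(1_000_000):,}")  # 1,000,000,000 -> about one billion
```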

IonQ is improving at a very fast rate and is making revolutionary performance advances every day. Their strategy is not to release a commercial product now. They will keep ramping up improvement and release only when their lead and financing are such that they can support a product without slowing development.

They have done all this with only $22 million in funding from GV (Google Ventures), NEA (New Enterprise Associates) and Amazon AWS.

Nextbigfuture First Pass Impressions

I believe they may have the most powerful computer of any kind in the world. They are not taking the time to prove it. I think they could quite rapidly reach many hundreds or thousands of qubits.

They have another system already in development. All quantum computer projects have what they can announce and what is still in development.

In the rapidly moving area of quantum computing, this has emerged as the most promising project.

They have not been able to fully test the error rates of their systems. The more qubits you have, the longer it takes to test and prove the error rate. I believe that, based upon the nature of their technology, their error rates are very low. They have eliminated entire categories of possible error.

The nature of their rapid development is promising.

The limitations they have are mostly in software, not in unresolved physics research.

The room temperature aspect is huge.

The future will head toward one or more quantum computer systems per data center. This will be great.

Quantum Supremacy without error correction looks really good.

IonQ does not want to commit to any dates but they are very confident. Any forecasts or predictions are from Nextbigfuture and not IonQ.

They have Google and Amazon funding. Quantum computers are strategic in the global technology competition. IonQ’s promising technology will not be limited by a lack of money.

Highlights From IonQ Talk

On measures of capacity, accuracy and other key benchmarks, IonQ’s system has surpassed all the other quantum computers in the market. It has stored 160 qubits and performed operations on 79 qubits, a record. Its gate fidelity—a measure of the accuracy of logical operations—is greater than 98% for both one-qubit and two-qubit operations on average in a 13-qubit configuration. This means it can handle longer calculations than other commercial quantum computers.
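To illustrate why higher gate fidelity means longer usable calculations, here is a minimal sketch assuming a simple multiplicative error model, where a circuit of N gates succeeds with probability roughly fidelity^N. The 98% figure comes from the talk; the other fidelities and the 50% cutoff are illustrative, and real devices also have crosstalk and readout errors that this model ignores.

```python
# Sketch: how per-gate fidelity limits usable circuit depth under a simple
# multiplicative (independent-error) model: overall fidelity ~ gate_fidelity ** num_gates.

def circuit_fidelity(gate_fidelity: float, num_gates: int) -> float:
    return gate_fidelity ** num_gates

for gate_fidelity in (0.98, 0.99, 0.999):  # 0.98 is from the talk; the others are illustrative
    # Largest circuit that still keeps at least 50% overall fidelity.
    depth = 0
    while circuit_fidelity(gate_fidelity, depth + 1) >= 0.5:
        depth += 1
    print(f"gate fidelity {gate_fidelity:.3f} -> ~{depth} gates before fidelity drops below 50%")
```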

The potential of the IonQ system, and the flexibility of a trapped-ion architecture, can be seen in benchmark results using the Bernstein-Vazirani Algorithm. This application tests the ability of a computer to determine an encoded number—called an oracle—when allowed only a single yes/no question. For a 10-bit oracle (a number between 0 and 1023), a conventional computer would succeed 0.2% of the time. The IonQ system ran the algorithm on all possible 10-bit oracles and observed an average success rate of 73%. That’s a better result on a more complex version of the calculation than any result yet published for a quantum computer.
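For reference, here is a minimal noiseless simulation of the textbook Bernstein-Vazirani algorithm (not IonQ's implementation) for a 10-bit hidden string. On ideal hardware the single quantum query recovers the string every time, which is why a measured 73% average success rate across all 10-bit oracles on real hardware is a meaningful benchmark.

```python
import numpy as np

def bernstein_vazirani(secret: int, n_bits: int) -> int:
    """Ideal (noiseless) statevector simulation of Bernstein-Vazirani with a phase oracle."""
    dim = 2 ** n_bits
    # Hadamards on |0...0> produce the uniform superposition.
    state = np.full(dim, 1.0 / np.sqrt(dim))
    # Single oracle query: flip the phase of every basis state x where secret . x is odd.
    parities = np.array([bin(secret & x).count("1") % 2 for x in range(dim)])
    state *= (-1.0) ** parities
    # Final Hadamard layer (Walsh-Hadamard transform) concentrates all amplitude on `secret`.
    h2 = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)
    hadamard = np.array([[1.0]])
    for _ in range(n_bits):
        hadamard = np.kron(hadamard, h2)
    state = hadamard @ state
    return int(np.argmax(state ** 2))

secret = 0b1011001110  # an arbitrary 10-bit value between 0 and 1023
assert bernstein_vazirani(secret, n_bits=10) == secret
print(f"recovered {bernstein_vazirani(secret, n_bits=10):010b} from a single oracle query")
```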

However, this is just the beginning of what sounds like a path to vastly more powerful quantum computers in far shorter time than even Nextbigfuture had predicted.

Nextbigfuture interviewed Stewart Allen who is the IonQ Chief Product Officer.

IonQ Has No Wires. They Can Upgrade With Software Improvements

IonQ feels they have no more questions in physics for their systems. They believe it is all engineering and scaling up.

They have no wires. This advantage was not emphasized in the presentation.

Their controls do not need deep learning or AI. They have very simple control systems because their physics is not complex.

They can roll out software improvements every day.

They will be able to work on two hardware systems at the same time for rapid development.


First Applications Will be Chemistry Simulations and Optimization Problems

They will work with strategic partners, academic partners and national labs. They will work on improving the system and they will implement solutions.

The goals are to rapidly advance and build a huge lead in technology and applications.

There was a 2016 article in Scientific American on the IonQ system.

They had a picture of the chip module at that time.

Other Trapped Ion Competitors

The University of Chicago has a $15 million grant and is working toward 100 qubits in four years.
Tsinghua University also has a project.
China is devoting $10 billion or more to quantum technology, but this includes quantum radar for anti-stealth applications.

IonQ previously ran a small Grover search algorithm on an earlier system. There is no reason why they cannot apply other powerful algorithms to their current systems.

Old Nextbigfuture Prediction

A prediction that Nextbigfuture had made for quantum computers was:
100-150 qubit quantum computers in second half of 2018
200-300 qubit computers in first half of 2019
400-600 qubit computers late in 2019
800-1600 qubit computers in 2020
1600-4000 qubit computers in 2021
3000-10000 qubit computers in 2022

This could turn out to be an under-prediction on the low side. Previously, I was concerned I might have been too confident.

I will wait for IonQ to decide on some statements and do some more work. I will then update and detail where I think this will go.

Doubling Rate of Superconducting Quantum Computers

Nextbigfuture notes that IBM and others working on quantum computers have a qubit-count doubling rate of 7 to 16 months.

From November 2017 to March 2018 there were announcements of IBM’s 50-qubit prototype, Intel’s 49-qubit test chip and Google’s 72-qubit processor. These processors had error rates from 10% down to as low as 1%. In 2017, D-Wave Systems made its 2000-qubit quantum annealing system commercially available.
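As a rough illustration of what a 7 to 16 month doubling rate implies, here is a sketch that extrapolates from the 72-qubit Bristlecone figure. The starting point and doubling times come from the text above; the projection itself is illustrative rather than a forecast.

```python
# Sketch: projecting qubit counts under a constant doubling time.
# Assumptions: 72 qubits at the start of 2019 (Bristlecone, per the text above)
# and doubling times at the fast (7 months) and slow (16 months) ends of the range.

START_QUBITS = 72
START_YEAR = 2019.0

def projected_qubits(year: float, doubling_months: float) -> int:
    months_elapsed = (year - START_YEAR) * 12.0
    return round(START_QUBITS * 2.0 ** (months_elapsed / doubling_months))

for year in (2020, 2021, 2022):
    fast = projected_qubits(year, doubling_months=7)
    slow = projected_qubits(year, doubling_months=16)
    print(f"{year}: roughly {slow} to {fast} qubits")
```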

Rigetti Computing has said they will have a 128-qubit chip by August 2019. A recent article indicates that the 128-qubit chip is built on a new form factor that lends itself to rapid scaling. This seems to indicate that Rigetti will be able to keep scaling at the faster end of the 7 to 16 month doubling rate after the 128-qubit chip is released.

All of the competitors will also be working to reduce error rates to 1 in 1,000 or 1 in 10,000.

Hybrid Algorithms could vastly improve the usefulness of Noisy Quantum Systems

Rigetti Computing researchers believe their new hybrid algorithms will be very useful for making near-term quantum computers useful for machine learning.

Arxiv: Quantum Kitchen Sinks: An algorithm for machine learning on near-term quantum computers

Noisy intermediate-scale quantum computing devices are an exciting platform for the exploration of the power of near-term quantum applications. Performing nontrivial tasks in such a framework requires a fundamentally different approach than what would be used on an error-corrected quantum computer. One such approach is to use hybrid algorithms, where problems are reduced to a parameterized quantum circuit that is often optimized in a classical feedback loop. Here we describe one such hybrid algorithm for machine learning tasks by building upon the classical algorithm known as random kitchen sinks. Our technique, called quantum kitchen sinks, uses quantum circuits to nonlinearly transform classical inputs into features that can then be used in a number of machine learning algorithms. We demonstrate the power and flexibility of this proposal by using it to solve binary classification problems for synthetic datasets as well as handwritten digits from the MNIST database. We can show, in particular, that small quantum circuits provide significant performance lift over standard linear classical algorithms, reducing classification error rates from 50% to less than 0.1%, and from 4.1% to 1.4% in these two examples, respectively.

Random quantum circuits can be used to transform classical data in a highly nonlinear yet flexible manner, similar to the random kitchen sinks technique from classical machine learning. These transformations, which Rigetti calls quantum kitchen sinks, can be used to enhance classical machine learning algorithms.
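The classical half of the idea is straightforward to sketch. Below is a minimal illustration of the random kitchen sinks pattern the paper builds on, with random cosine features standing in for the measured outputs of the parameterized quantum circuits; the dataset, feature count and function names are illustrative and not taken from the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic binary classification problem a plain linear classifier struggles with:
# label = 1 when the point falls inside a ring around the origin.
X = rng.normal(size=(2000, 2))
radius = np.linalg.norm(X, axis=1)
y = ((radius > 0.8) & (radius < 1.6)).astype(int)

# Classical "random kitchen sinks": push inputs through fixed random linear maps,
# then apply a nonlinearity. In quantum kitchen sinks, the nonlinearity would instead
# be the measured output of a small parameterized quantum circuit fed these projections.
n_features = 200
W = rng.normal(scale=2.0, size=(2, n_features))
b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)

def random_features(X: np.ndarray) -> np.ndarray:
    # Stand-in for the quantum circuit evaluations (here: random Fourier features).
    return np.cos(X @ W + b)

linear = LogisticRegression(max_iter=1000).fit(X, y)
kitchen = LogisticRegression(max_iter=1000).fit(random_features(X), y)

print(f"linear classifier accuracy:         {linear.score(X, y):.2f}")
print(f"random-feature classifier accuracy: {kitchen.score(random_features(X), y):.2f}")
```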

Future Rigetti work will focus on exploring different circuits and developing a better understanding of the performance of this technique.

Nextbigfuture’s Rough Timeline of noisy quantum computers from Google, Rigetti, IBM, Intel and others

100-150 qubit quantum computers in second half of 2018
200-300 qubit computers in first half of 2019
400-600 qubit computers late in 2019
800-1600 qubit computers in 2020
1600-4000 qubit computers in 2021
3000-10000 qubit computers in 2022

D-Wave Systems could get funding to convert their 5000-qubit quantum annealing system to low-error-rate qubits. They would try to get this working in 2020-2021 if the funding is provided.

The peak of this age of noisy quantum computers could be quantum computers with 1000 qubits and two-qubit error rates of less than 1 in 1000. This is Google’s near-term goal, which might be reached in 2020.

There could be utility in pushing to 10,000 qubits with two-qubit error rates less than 1 in 10000. These could arrive around 2022.

The noisy quantum computers might be better than classical computers for quantum simulation, quantum chemistry or machine learning. We will only know if they are better after they are built and tested. Scientific refinement of the qubits could be needed to get to lower error rates.

In 2025-2030, there will be fully error-corrected quantum computers with 100,000 to millions of physical qubits, but only about 1 in 1000 will be used for calculations. The rest will be needed for error correction.

13 thoughts on “IonQ Has the Most Powerful Quantum Computers With 79 Trapped Ion Qubits and 160 Stored Qubits”

  1. I will share your doubt with them and see how they respond. Obviously I do not have the means to peer review their conference presentation. Also, claims can be made multiple times even in peer-reviewed articles without conclusive certainty. See the years of peer-reviewed papers from D-Wave and the continued controversy.

  2. As someone who worked closely with similar projects and people, I would say the 98% fidelity has been stuck for years. Although scaling the ion number from 13 to a higher number sounds trivial, the actual situation is more complicated, and not necessarily something that can be solved purely by engineering. I don’t think they got 70+ qubits, just 70+ ions, which is relatively trivial. As for the optical link, it is still a fairy tale for now.

  3. So I have originally sourced interviews. I am generating original material along with some analysis of other information. Reddit futurology says they want to “support” original material but they clearly do not.

  4. So I get no respect or value from Reddit. They do not have an ecosystem which benefits the sites and articles that they link to. Each year they become more pure in how they parasitize the rest of the internet.

    However, I can talk and interview people at most any technology startup. CEOs, CTOs, top scientists. I can get calls and invites from conferences. I can provide insight to USC 5th year space design. At space design, Buzz Aldrin spoke. I am working on an article about what Buzz said. I am trying to get presentation materials and other writeups. Despite his age, Buzz is still intellectually active.

    I am getting more involved in helping startups and entrepreneurs.

    I am working on more comprehensive industry and market studies of certain emerging tech areas.

    Industry reports pretty much do not have complete citations. They interview all the companies and write it up. It is not an academic thing.

    The report has business value; it will help you decide what to do investment-wise. BCG (Boston Consulting Group) purposely does not say who gave them the info. They do the work and then position themselves as the go-between; you must pass through their consulting to get the scoop.

  5. I do not want to brainstorm on ways futurology subreddit can circumvent my site.

    I supported futurology and reddit years ago when they started. They built themselves up on the material from sites like mine, then they decided to skip over all of those sites. This is their right. But they let users slander me and they ban me from submitting, commenting or defending myself. It is annoying to me. It is their business model.

    They used material from my site a lot to build up their popularity. Then they decided that too many articles were getting popular from a few aggregation points and did not add enough value, so they tried to go to direct sourcing: research papers, etc. They hoped some redditors would parse them out for others.

    I had articles years ago that made the front page of all of reddit over 5 times. I had articles that were the most popular or top three on futurology a few dozen times. If what I was writing had no value, or if there was no value in my finding the information, then why did this happen?

    I think it is clear that this article has the content and value to have a decent chance of making the front page of reddit or becoming a top article on futurology. People will have to come here, subscribe, or find it via Google News.

  6. There is no flood of traffic from reddit for me. Even without bans it does not translate into money. The discussions are inferior in terms of technical debate. The fact that Reddit has rules that do not allow valid novel information from first source interviews with the executives of companies… not my problem.

    Reddit’s new system is walling off their discussion even more. You can see a good article linked on reddit, you can see the URL. Reddit is no longer helping the average internet user to search the other domain for other good stuff.

    With any link on a site, the chance of someone clicking through is in the 1-5% range. People rarely look at more than one article. Maybe two, and even less often three. The number of articles people look at drops off like the number of kids in a family.

    I know that the futurology readers find material on my site and then skip over to what I link to, which gives me no benefit. This happens to most if not all tech sites too.

    End vent.
    Nextbigfuture is not banned on other subreddits. The last time I checked, a few years ago, Nextbigfuture was only considered a “bad” source in the futurology rating system. I do not follow what they do because it is of no value to me.

  7. I interviewed an IonQ executive on the phone. It was directly sourced.

    allow me to vent:
    Reddit are hypocritical ripoffs, in particular futurology. They have the rules and the bans but they are still using links. No analysis or original content is added except in discussion. With new reddit you cannot even search by domain. You have to go to old reddit to see all the articles from nextbigfuture that went to the spacex subreddit or economics or whatever.

    One of my articles can go up to, say, the spacex subreddit and get a few hundred upvotes. Many people can like and appreciate the article. Then a few people can slander my site. I cannot go on to refute it because of the ban.

    How did my original ban happen? I wrote an article supporting LPP Fusion crowdfunding and posted it. I got shadow banned for raising money.
    Also, for my site, some of my fans upvoted several of my articles and the system accused me of gaming reddit voting. I do not have time to proofread my articles and I am supposedly going over to reddit to manipulate votes?
    Also, about reddit traffic: I could get 100,000 views from reddit when I made the front page back in the day. This made me less money than 5,000 to 10,000 views from Yahoo would. Cool reddit readers have ad block; people using Yahoo don’t.

  8. You are the only news source for this information. A Google search of “news about IonQ” with settings for the past week shows only a story from Next Big Future. There are no other search results apart from that.

    Now here is the problem. I wanted to post this to my futurology sub-reddit but as of now next big future is banned from that subreddit. So I have taken to looking in next big future articles to see if there is original material. The ban is that we can’t post “rehosted” material. We must support original sources. Which is stupid, but that is how it is I guess. I had been using NBF for years to post articles and there had never been any problems up to about 4 months ago I’d say.

    When I searched your article I did not find any recent activity from them in an original source. All sources were at least one year old. So the conclusion is that you have published an unvetted news piece.  If you have any further information please advise!

  9. They are verifying. They have benchmarked. They are writing research papers in the top journals. They have tested gate fidelity, a measure of the accuracy of logical operations. Gate fidelity is 99.97 percent for one-qubit operations and greater than 99% for two-qubit operations. They have implemented Grover’s algorithm. You did see that they have large funding from Amazon and Google. There is a difference between not slowing the ramp-up and scaling, and performing the needed levels of verification in parallel.

    They verify to confirm they are on the right track and they do heavy confirmation for the journal papers. Trapped ion has been around for decades. It was being done before the superconducting quantum computers. There are many research labs that have and still work on it.

    Trapped ion uses an actual atomic ion, which inherently should have higher fidelity. The thinking was that trapped ion had science and engineering challenges that would make ramping slower than superconducting systems.

    IonQ has apparently solved the ramping and scaling problems and now works on engineering.

    IonQ says the number of qubits fabbed on a chip is not an [accurate and complete proxy] for performance or capability. IonQ said the power of quantum systems is only realized when you can achieve full entanglement. To do that, at a bare minimum you need n^2 entangling gates, where n is the # of qubits. The corollary to that is that your effective qubit count is, at best, the square root of the number of 2-qubit entangling gates you can perform.

  10. From their front page, they are currently sitting at 11 entangled qubits, which is the most important value to look at when talking about quantum computers, in my opinion.
