**Quantum Supremacy vs Quantum Advantage**

In October 2019, Google announced that its processor had computed in seconds what would take the largest and most advanced supercomputers thousands of years, thereby claiming the milestone of “**quantum supremacy**” for the first time. The processor, named “Sycamore,” contained 54 programmable superconducting qubits, of which 53 operated correctly, corresponding to a computational state space of dimension 2^{53} (roughly 10^{16}, or about ten million billion). The qubits were transmons arranged in a two-dimensional array, each tunably coupled to its four nearest neighbors, and each with two controls: a microwave drive to excite the qubit and a magnetic flux control to tune its frequency. The claim was widely regarded as a “Wright Brothers at Kitty Hawk” type of achievement.
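The scale of that state space is easy to check for yourself: n qubits span 2^n complex amplitudes, so 53 qubits give a dimension just under 10^16.

```python
# Dimension of the state space spanned by n qubits is 2**n.
n_qubits = 53
dim = 2 ** n_qubits
print(dim)  # 9007199254740992, i.e. roughly 9 x 10^15
```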

Then, in December 2020, researchers at the University of Science and Technology of China (“USTC”) announced that they too had achieved quantum supremacy, using a Quantum Computer named “Jiuzhang” that manipulates photons via a complex array of optical devices: light sources, hundreds of beam splitters, dozens of mirrors, and 100 photon detectors. They claimed that their device performed calculations in 20 seconds that would take a supercomputer 600 million years. Both Google and USTC have increased their qubit counts since these breakthroughs, and several other companies have now successfully operated Quantum Computers with dozens of qubits, a couple with 100 or more.

Let’s review some semantics regarding the measurement of Quantum Computing performance. In 2012, John Preskill, a professor of theoretical physics at Caltech, coined the term “quantum supremacy” to “describe the point where quantum computers can do things that classical computers can’t, regardless of whether those tasks are useful.” He coined the term before any large-scale Quantum Computers had been built. At the time, Preskill was wondering, in his words, “whether controlling large-scale quantum systems was merely *really, really hard* or whether it was *ridiculously hard*. In the former case we might succeed in building large-scale quantum computers after a few decades. In the latter case we might not succeed for centuries.” In this sense, and based on Preskill’s original intent, the announcement by Google is a bona fide example of Quantum Supremacy, and it indicated that “a plethora of quantum technologies are likely in the next decade or so” [Preskill, 2019].

So, although the Google Sycamore quantum supremacy claim was discounted by some (most notably IBM and researchers in China), and despite it being an admittedly highly contrived and not very useful calculation, it was a ground-breaking achievement.

Before I get into the semantics of how we measure Quantum Computing power, here is what the quantum community generally means regarding quantum progress:

**Quantum Supremacy**: This term retains Preskill’s original meaning and marks the first major step in proving that quantum computing is feasible. Specifically, it means “demonstrating that a programmable quantum device can solve a problem that no classical computing device can solve in a feasible amount of time, irrespective of the usefulness of the problem.” By that definition, the threshold was passed in October 2019, and it has since been demonstrated by several companies beyond Google, which is why I refer to the current hurdles as engineering challenges rather than theoretical ones.

**Quantum Advantage**: Refers to a demonstrated, measured success in solving a **real-world** problem faster on a Quantum Computer than on a classical computer. While it is generally accepted that we have achieved quantum supremacy, quantum advantage is anticipated to still be some years away.

**How do we Measure Quantum Computing Performance?**

At the end of a prior post regarding Qubits, I alluded to the challenge of measurement metrics for Quantum Computing, noting that the count of operating qubits alone is not an appropriate yardstick. Imagine shopping for a new car with “horsepower” as the only available metric: it would be very difficult to decide which car to buy, because horsepower is only one measure of car performance. It does not capture actual acceleration, fuel efficiency, ride comfort, handling, noise levels, legroom, sleekness, color/trim/style, etc. The same holds for computers; focusing only on clock speed, for example, would not provide enough information to make an informed purchase decision. Quantum Computers are at a very early stage, and simply quoting a particular calculation speed or the number of qubits used is not enough to describe their actual performance capabilities accurately.

Researchers at IBM have proposed “Quantum Volume” as a way to measure Quantum Computing performance systematically. It is a metric that captures both the capabilities and the error rates of a Quantum Computer by finding the maximum size of square quantum circuits (equal width and depth) that can be implemented successfully. While the details are a bit esoteric, the intent is to produce one number, or score, that can be used to compare incremental technology, configuration, and design changes, and to compare the relative power of one Quantum Computer to another.
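As a sketch of the idea (not IBM’s full benchmarking protocol): quantum volume is reported as 2^n, where n is the largest width for which a random square circuit (depth equal to width) still passes the benchmark test. Given pass/fail results per width, the score falls out directly. The `results` data below are hypothetical:

```python
def quantum_volume(square_circuit_results):
    """Toy scoring: maps circuit width n (= depth) to whether the
    n x n random-circuit benchmark passed, and returns 2**n for the
    largest contiguous n that passed -- the "square circuit" idea."""
    n = 0
    while square_circuit_results.get(n + 1, False):
        n += 1
    return 2 ** n

# Hypothetical benchmark outcomes: widths 1-6 pass, width 7 fails.
results = {1: True, 2: True, 3: True, 4: True, 5: True, 6: True, 7: False}
print(quantum_volume(results))  # 64
```

The appeal of the single score is that a machine with many noisy qubits and a machine with few clean qubits can be compared on the same axis.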

In fact, the performance of a quantum computer involves many factors as shown below:

*Source: IBM and Forbes as adapted by Riccardo Silvestri*

Since quantum volume is not yet an industry term of art, I won’t use it as the definitive measurement tool. However, the concept of looking at characteristics beyond just the “number of qubits” is crucial, and I will discuss the relative performance characteristics of competing Quantum Computers beyond a mere qubit count.

While many of the balloons in the above graphic may be unfamiliar, there are three key metrics for measuring quantum computing performance:

**Scale**: The number of qubits that the computer can process simultaneously. It is important to distinguish between physical and logical qubits, with logical qubits being the key element (as I’ll show below, many designs add physical qubits as error-correction overhead).

**Quality**: The quality of the circuits, which factors in both how long the qubits remain in superposition and entangled before they decohere, and the number of qubits that can entangle with each other.

**Speed**: Typically measured in circuit layer operations per second (CLOPS): how many circuit layers a Quantum Computer can run in a given amount of time. While this is a strong and objective measurement, it is not yet widely reported.
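To make the speed metric concrete, here is a toy calculation loosely following the published CLOPS formula, (M × K × S × D) / total time, where M is the number of circuit templates, K the parameter updates per template, S the shots, and D the circuit layers. The parameter values below are illustrative, not measurements from any real device:

```python
# Toy CLOPS-style calculation; all figures are hypothetical.
M = 100    # circuit templates run
K = 10     # parameter updates per template
S = 100    # shots per circuit
D = 10     # layers of gates per circuit
total_seconds = 500.0

clops = (M * K * S * D) / total_seconds
print(clops)  # 2000.0 layer operations per second
```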

Another reason the “number of qubits” is not useful for comparing performance is that we are currently operating in the NISQ era (recall the “N” stands for noisy). Accordingly, many designs dedicate certain qubits to error correction rather than to additional computation. IBM has a useful graphic highlighting the tradeoff between physical and logical qubits as a function of error rates:
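To get a feel for the size of that overhead, here is an illustrative back-of-the-envelope sketch assuming a surface-code-like scheme, where each logical qubit consumes roughly 2d² physical qubits (data plus ancilla) at code distance d. The exact overhead depends on the code and the underlying error rates, so treat the numbers as hypothetical:

```python
def physical_qubits_needed(logical_qubits, code_distance):
    """Rough surface-code-style overhead: ~2*d**2 physical qubits
    (data plus ancilla) per logical qubit.  Illustrative only."""
    per_logical = 2 * code_distance ** 2
    return logical_qubits * per_logical

# e.g. 100 logical qubits at code distance 11
print(physical_qubits_needed(100, 11))  # 24200
```

The point is the multiplier: even modest code distances turn every logical qubit into hundreds of physical ones, which is why raw qubit counts overstate usable capacity.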

**Quantum Computing Milestones**

While the semantics and the various yardsticks used to describe Quantum Computer performance are confusing, evolving, and not yet universally agreed upon, real progress is being made no matter which metric is showcased. Here are a few recent advances in early working Quantum Computers; not all report the same metrics, so it is difficult to compare them directly:

In addition to these Quantum Computers, Intel has a 49-qubit QC, Xanadu has a 24-qubit QC, and MIT has a 100-qubit QC; however, the other performance metrics noted in the table are not readily available for these.

It is worth noting that USTC recently claimed that Zuchongzhi 2.1 is a million times more powerful than Google’s Sycamore, and that it is 10 million to 100 trillion times faster than the world’s fastest supercomputer. While these claims are difficult to substantiate, given China’s enormous focus on Quantum Computing, a China-US space race of sorts is certainly afoot. Also, the Quantinuum achievement on H1, announced only very recently, is worth watching closely given its high quantum volume and long decoherence times.

Semantics and yardsticks aside, it is fascinating to see the increasing number of companies creating working Quantum Computers with ever-improving performance metrics, confirming that it is merely “really, really hard” to build these devices and not “ridiculously hard”. It seems like we are seeing new press releases each week showcasing quantum performance achievements by these and others in the field. Stay tuned as we track the performance.

References:

arXiv:1203.5813, “Quantum Computing and the entanglement frontier”, Preskill, John, March 26, 2012

Quanta Magazine, “Why I Called It ‘Quantum Supremacy’,” Preskill, John, October 2, 2019

Nature, “Quantum supremacy using a programmable superconducting processor,” Arute, Arya, Babbush, et al., October 23, 2019

The Independent – UK, “China builds world’s fastest programmable quantum computers that outperform ‘classical’ computers,” Sankaran, Vishwam, October 31, 2021

Scorecards – *Quantum Computing Report*, Retrieved December 2021

Silvestri, Riccardo. Masters Degree Thesis: “Business Value of Quantum Computers: analyzing its business potentials and identifying needed capabilities for the healthcare industry.” August 2020