In a prior post about Superposition and Entanglement (click here to re-read), we learned that superposition allows a qubit to hold not just a “0” or a “1” but a combination of both states at the same time, enabling a form of simultaneous computation. Entanglement allows qubits to share their states with one another, so that information or processing capacity doubles with each entangled qubit added. These two features of Quantum Computing, embodied in “qubits,” enable it to perform certain types of calculations substantially faster than existing computers, and underlie the vast potential of Quantum Computing. In this post I will describe how qubits are currently made and controlled.
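To make the math behind those two ideas concrete, here is a minimal sketch of superposition and entanglement using plain Python lists as quantum state vectors. The function names and the tiny two-qubit setup are my own illustrative choices, not any particular library's API; real simulators work the same way but at much larger scale.

```python
from math import sqrt

# A single-qubit state is a 2-entry amplitude list: [amplitude of |0>, amplitude of |1>]
ket0 = [1.0, 0.0]

# A Hadamard gate puts |0> into an equal superposition of |0> and |1>
def hadamard(state):
    a, b = state
    return [(a + b) / sqrt(2), (a - b) / sqrt(2)]

superposed = hadamard(ket0)  # amplitudes [1/sqrt(2), 1/sqrt(2)]

# Combine two qubits via the tensor product: amplitudes for |00>, |01>, |10>, |11>
def kron(s1, s2):
    return [x * y for x in s1 for y in s2]

# CNOT flips the second qubit only when the first is |1>: it swaps the |10> and |11> amplitudes
def cnot(state):
    a00, a01, a10, a11 = state
    return [a00, a01, a11, a10]

# Entangle the superposed qubit with a fresh |0> qubit: the Bell state (|00> + |11>)/sqrt(2)
bell = cnot(kron(superposed, ket0))

# Born rule: measurement probabilities are the squared amplitude magnitudes
probs = [abs(a) ** 2 for a in bell]  # 50% |00>, 50% |11>, never |01> or |10>
```

The final probabilities show the signature of entanglement: measuring either qubit instantly tells you the value of the other, because only the |00> and |11> outcomes are possible.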
There are competing requirements at play that make qubit construction and manipulation exceedingly difficult, although not impossible. On the one hand, for qubits to be as stable as possible, they need to be isolated from external influences such as temperature changes, electromagnetic radiation, and vibrations, so that they stay in their “state” until we need to use them. On the other hand, that very isolation makes them difficult to manipulate. In addition, qubits operate based on quantum mechanics, the physics of incredibly small objects such as individual electrons (often characterized by their spin, which is either “spin up” or “spin down”) or photons (characterized by their polarization, which is either horizontal or vertical). Controlling an individual electron or photon adds another layer of difficulty due to the extremely small scale involved.
When the bits used in classical computing were first developed, several different transistor designs competed before the industry settled on the MOSFET (metal-oxide-semiconductor field-effect transistor). Similarly, today there are many ways to create a qubit. The following is a brief overview of some of the more common types:
Superconducting Qubits: Some leading Quantum Computing firms, including Google and IBM, use superconducting transmons (an abbreviation derived from “transmission line shunted plasma oscillation qubit”) as qubits. At the core of a transmon is a Josephson Junction, a pair of superconducting metal strips separated by a gap of just one nanometer (less than the width of a DNA molecule). The superconducting state, achieved at near absolute-zero temperatures, allows current to oscillate resistance-free around a circuit loop. A microwave resonator then excites the current into a superposition state, and the quantum effects arise from how the electrons tunnel across this gap. Superconducting qubits have been used for many years, so there is abundant experimental knowledge, and they appear to be quite scalable. However, the requirement to operate near absolute zero adds a layer of complexity and makes some of the measurement instrumentation difficult to engineer in such a low-temperature environment.
Trapped Ions: Another common qubit construct exploits the electrical charge that certain elemental ions carry. Ions are ordinary atoms that have gained or lost electrons, thus acquiring an electrical charge. Such charged atoms can be held in place by electric fields, and the energy states of their outer electrons can be manipulated using lasers to excite or cool the target electron. These target electrons move or “leap” (the origin of the term “quantum leap”) between outer orbits as they absorb or emit single photons. The photons are measured using photomultiplier tubes (PMTs) or charge-coupled device (CCD) cameras. Trapped ions are highly accurate and stable, although they are slow to react and require the coordinated control of many lasers.
Photonic Qubits: Photons have no mass or charge and therefore do not interact with each other, making them ideal candidates for quantum information processing. Photons are manipulated using phase shifters and beam splitters and are sent through a maze of optical channels on a specially designed chip, where they are measured by their horizontal or vertical polarization.
Semiconductor/Silicon Dots: A quantum dot is a nanoparticle created from a semiconductor material such as cadmium sulfide or germanium, but most often from silicon (owing to the deep expertise derived from decades of silicon chip manufacturing in the semiconductor industry). An “artificial atom” is created by adding an electron to pure silicon and holding it in place with electric fields. The spin of the electron is then controlled and measured via microwaves.
Diamond Vacancies: There is a well-known defect that can be manufactured into artificial diamonds, which leaves a nitrogen vacancy inside the diamond occupied by a single electron. The spin of this electron can then be manipulated and measured with laser light. This technology can operate at room temperature and ambient pressure, which are extremely positive attributes, although it has so far proven very difficult to scale to large numbers of qubits.
Topological Qubits: Quasiparticles can be observed in the behavior of electrons channeled through semiconductor structures. Braided paths can encode quantum information via electron fractionalization and/or ground-state degeneracy, which can be manipulated via magnetic fields. While this form of qubit remains theoretical at this point, it is being pursued by some large players, including Microsoft.
There are a few other approaches, including Neutral Atoms, Nuclear Magnetic Resonance (which seems more experimental and very difficult to scale) and Quantum Annealing (used by D-Wave, one of the first firms to offer commercial “Quantum” computers, although annealing is not a true gate-capable construct), and it is likely that more methodologies will be developed. Hopefully this provides a high-level flavor for the various types of qubits. The good news is that many entities have created, manipulated and measured qubits, and there has often been success in controlling them into superpositions and in entangling a limited but growing number of qubits at a time.
The following table summarizes some of the benefits and challenges along with selected current proponents of key qubit technologies currently in use:
The “Noisy Intermediate-Scale Quantum” (or “NISQ”) Environment
In prior posts I have covered how qubits use superposition and entanglement to enable massive processing speed for certain applications. However, the technical and manufacturing challenges noted above for the various qubit types have so far prevented the construction of a very large, error-free system. There are many competing strategies for creating qubits, each with a different set of advantages and challenges.
In order to have a Quantum Computer that can demonstrate supremacy over a classical computer, it is estimated that we need at least ~100 “logical” qubits, meaning 100 qubits that maintain their fidelity and coherence for as long as needed to perform a desired analysis. However, as noted above, qubits are unstable, are easily affected by environmental factors, and are difficult to keep entangled. These challenges are generally referred to as “noise”, hence the “N” in NISQ. One way to address this “noise” is to allocate additional qubits to monitor or correct each target qubit. Currently, it is thought that as many as 1,000 “physical” qubits are required to ensure stable utilization of one “logical” qubit, and many firms are focusing exclusively on quantum error correction schemes to address this challenge. Therefore, to create a Quantum Computer with 100 logical qubits in the NISQ phase of quantum computing, 100,000 – 1,000,000 physical qubits are being targeted. To date, the largest reported entangled-qubit counts are still measured in the hundreds, so there is a long way to go. That said, this is now an engineering challenge more than a theoretical one, and many of the companies noted in this blog have announced product roadmaps to reach 1,000,000 active physical qubits in the next five years or so.
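The overhead arithmetic above is worth spelling out. Here is a back-of-the-envelope calculation using the ~1,000:1 physical-to-logical ratio cited in this post; the actual ratio depends on the error-correction code and the hardware's error rates, so treat these as illustrative placeholders.

```python
# Rough NISQ-era overhead: how many physical qubits for a useful machine?
logical_qubits_needed = 100     # rough threshold discussed above for useful work
physical_per_logical = 1_000    # error-correction overhead estimate (varies by scheme)

physical_qubits_needed = logical_qubits_needed * physical_per_logical
print(physical_qubits_needed)   # 100,000 at the 1,000:1 ratio (the low end of the range)
```

At today's reported entanglement counts in the hundreds, the gap between current hardware and this target is two to three orders of magnitude, which is why error-correction overhead dominates so many product roadmaps.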
An alternative or competing framework is to create error-correcting qubits. Today’s transistors have error correction built in, so they operate at extremely high accuracy rates. The hope is that a method of qubit construction can be devised that self-corrects, obviating the massive error-correction overhead implied by the 1,000:1 ratio of physical to logical qubits noted above.
How do we measure Qubit Performance?
Unfortunately, there is no common, agreed-upon set of metrics to allow apples-to-apples comparisons among various Quantum Computing configurations. A few important measurement factors include the number of operations that can be performed before an error occurs, gate fidelities, and gate speeds. IBM has proposed a “Quantum Volume” construct, intended to provide a single-number metric that factors in several key items to quantify the largest random circuit of equal width and depth that a Quantum Computer can successfully implement. While the idea of a single, agreed-upon metric has broad interest, not everyone agrees with the IBM methodology, so a universal standard is still not available. In the meantime, two great resources for tracking and comparing qubit/Quantum Computer performance metrics include: Quantum Computing Report and Fact Based Insight. I’ve provided hyperlinks to their qubit dashboards.
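To give a feel for how Quantum Volume compresses circuit size into a single number: the reported value is 2 raised to the width (equal to the depth) of the largest random square circuit the machine runs successfully under IBM's statistical test. The sketch below assumes that success/failure judgment has already been made; it only shows the final arithmetic, not the heavy-output sampling procedure itself.

```python
# Quantum Volume, per IBM's published definition: QV = 2**n, where n is the
# largest width (= depth) of a random square circuit the machine implements
# "successfully" (heavy-output probability above 2/3, with confidence).
def quantum_volume(largest_square_circuit: int) -> int:
    return 2 ** largest_square_circuit

# Example: a machine that reliably runs 6-qubit, 6-layer random circuits
qv = quantum_volume(6)  # QV = 64
```

Note that the exponential form means each additional qubit of verified square-circuit capability doubles the reported Quantum Volume, which is why headline QV numbers grow so quickly relative to qubit counts.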
So, in order to make objective assessments of the various Quantum Computer performance metrics, it is important to acknowledge the various attributes desired and take a holistic approach to such performance announcements.
Images from Science, C. Bickel, December 2016 and New Journal of Physics, Lianghui, Yong, Zhengo-Wei, Guang-Can and Xingxiang, June 2010
“What Happens When ‘If’ Turns to ‘When’ in Quantum Computing?”, Bobier, Langione, Tao and Gourevitch, BCG, July 2021
Fact Based Insight, Accessed December 2021
7 Primary Qubit Technologies for Quantum Computing, Dr. Amit Ray, December 10, 2018
Inside the race to build the best quantum computer on Earth, Gideon Lichfield, MIT Technology Review, February 26, 2020