A Quantum Computing Glossary

Hopefully many of you have been following this blog since it began and are familiar with the terms highlighted below.  For some of you, a refresher may be helpful.  For others, this may all be very overwhelming and confusing, so I hope this guide will clarify things for you.  I’ve curated this list to provide a broad set of definitions that help frame the potential of Quantum Computing (QC), and to serve as an easy reference when you come across a term for which a definitional reminder would be helpful.  In the first post in this series, I introduced QC with the following word-cloud graphic:

While not every word in this cloud bears defining in this post, I hope many of these definitions help you in your efforts to understand and appreciate QC, and I have grouped them into silos to add context (although some may naturally apply to more than one silo).  This is not intended to be a complete list, and more definitions will likely need to be added over time, but it should provide a good grounding in the general nomenclature and principles.

Quantum Concepts

  • Entanglement: Quantum entanglement is a physical phenomenon that occurs when a group of particles are created, interact, or share proximity in such a way that the particles are “connected,” even if they are subsequently separated by large distances.  Qubits are made of things such as electrons (which spin in one of two directions) or photons (which are polarized in one of two directions), and when they become “entangled,” their spin or polarization becomes perfectly correlated.
  • Superposition: classical computers use a binary system, meaning each processing unit, or bit, is either a “1” or a “0” (“on” or “off”) whereas Quantum Computers use qubits which can be either “1” or “0” (typically “spin up” or “spin down”) or both at the same time, a state referred to as being in a superposition.
  • Dirac Notation: Symbolic representation of quantum states via linear algebra, also called bra-ket notation.  The bra portion represents a row vector and the ket portion represents a column vector.  While a general understanding of QC does not necessarily require familiarity with linear algebra or these notations, it is fundamental to a deeper working knowledge.
  • Quantum Supremacy: Demonstrating that a programmable quantum device can solve a problem that no classical computing device can solve in a feasible amount of time, irrespective of the usefulness of the problem.  Based on this definition, the threshold was passed in October 2019, when Google announced that its Sycamore processor had completed such a task.
  • Quantum Advantage: Refers to the demonstrated and measured success in processing a real-world problem faster on a Quantum Computer than on a classical computer.  While it is generally accepted that we have achieved quantum supremacy, it is anticipated that quantum advantage is still some years away.
  • Collapse: The phenomenon that occurs upon measurement of a quantum system where the system reverts to a single observable state.  Said another way, after a qubit is put into a superposition, upon measurement it collapses to either a 1 or 0.
  • Bloch Sphere: a geometrical representation of the state space of a qubit, named after the physicist Felix Bloch.  The Bloch Sphere provides the following interpretation: the poles represent classical bits, and we use the notation |0⟩ and |1⟩. However, while these are the only possible states for the classical bit representation, quantum bits cover the whole sphere. Thus, there is much more information involved in the quantum bits, and the Bloch sphere depicts this.
  • Schrodinger’s Cat: A quantum mechanics construct or thought experiment that illustrates the paradox of superposition wherein the cat may be considered both alive and dead (until the box is opened and its status is then known for certain).  This “both alive and dead” concept often confuses early students of quantum mechanics.
  • Heisenberg Uncertainty: (also known as Heisenberg’s uncertainty principle) is any of a variety of mathematical inequalities asserting a fundamental limit to the accuracy with which the position and momentum of a particle can be known based on its starting parameters.  Generally, the more precise the position location is, the less precise the momentum can be described, and vice versa.  This also confuses early students of quantum mechanics who are used to typical physics where speed and position are usually well known by observation.
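Several of the concepts above (superposition, Dirac notation, measurement collapse) can be made concrete with a few lines of code.  The sketch below is a classical simulation in plain Python, not a real quantum device: a qubit is represented as a pair of real amplitudes (a, b) for the state a|0⟩ + b|1⟩, and all function and variable names are my own illustrative choices.

```python
import math
import random

# A qubit state is a pair of amplitudes (a, b) for a|0> + b|1>.
KET_0 = (1.0, 0.0)          # the |0> pole of the Bloch sphere
KET_1 = (0.0, 1.0)          # the |1> pole

def hadamard(state):
    """Put a state into an even superposition (the Hadamard operation)."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def measure(state, rng=random):
    """Collapse: measurement yields 0 or 1 with probability |amplitude|^2."""
    a, _ = state
    return 0 if rng.random() < a * a else 1

plus = hadamard(KET_0)              # roughly (0.707, 0.707)
probability_of_0 = plus[0] ** 2     # 0.5 -- an even superposition

# Repeated measurements collapse to 0 about half the time and 1 the rest.
counts = [measure(plus) for _ in range(10000)]
```

Note how `measure` returns only a classical 0 or 1: the superposition carries more information than any single measurement can reveal, which is the "collapse" described above.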

Hardware/Physical Components

  • Qubit: Also known as a quantum bit, a qubit is the basic building block of a quantum computer. In addition to the conventional—binary—states of 0 or 1, it can also assume a superposition of the two values.  There are several different ways that qubits can be created with no clear candidate emerging as the definitive method.
  • Auxiliary Qubit: Unfortunately, there is no such thing as quantum RAM, so it is difficult for QCs to store information for extended periods of time.  An Auxiliary Qubit serves as a temporary memory for a quantum computer and is allocated and de-allocated as needed (also referred to as an ancilla).
  • Cryogenics: Operating at extremely cold temperatures, generally meant to be less than -153 Celsius; superconducting quantum processors are chilled far colder, to within a fraction of a degree of absolute zero (-273 Celsius).  Cryogenics are of particular interest for QC because at such temperatures the processor circuits become superconducting (i.e., the electrons flow with no loss to resistance).
  • Dilution Refrigerator: Used with superconducting qubits and often with quantum dots, whereby a series of physical stages (typically 7) are sequentially chilled down to the lowest stage, where the qubits operate.
  • High-Performance Computer (HPC): Sometimes also referred to as a “supercomputer,” generally meant to represent any ultra-high-performing classical computer.  Powerful gaming PCs operate at 3 GHz (i.e., 3 billion cycles per second) while HPCs operate at quadrillions of calculations per second.  Despite this blazing speed, there are many problems that HPCs cannot solve in a reasonable amount of time, but which theoretically can be done with a QC in a very short amount of time.
  • Quantum Annealer: In metallurgy, annealing is used to harden iron: the temperature is raised so that molecular motion increases and strong bonds form, then the iron is slowly cooled, which reinforces those new bonds.  Quantum annealing works in a similar way, with energy playing the role of temperature: the system settles into its lowest energy state, the global minimum.  It is a quantum computing method used to find the optimal solution to problems involving many candidate solutions, by taking advantage of properties specific to quantum physics.  Since there are no “Gates,” the mechanics of annealing are less daunting than those of full-blown gate-based QC, although the outputs are less refined and precise than they would be under a full gate-based QC.
  • Quantum Dot: Quantum dots are effectively “artificial atoms.” They are nanocrystals of semiconductor wherein an electron-hole pair can be trapped. The nanometer size is comparable to the wavelength of light and so, just like in an atom, the electron can occupy discrete energy levels. The dots can be confined in a photonic crystal cavity, where they can be probed with laser light.
  • Quantum Sensor: Quantum sensing has a broad variety of use cases, including enhanced imaging, radar, and navigation where GPS is unavailable.  Probes with highly precise measurements of time, acceleration, and changes in magnetic, electric, or gravitational fields can provide precise tracking of movement.  If a starting point is known, the position at any later time can then be computed without external GPS signals, and without any ability for an adversary to jam or interfere with the signals, so this is of particular interest to the military.  Another application of quantum sensing involves ghost imaging and quantum illumination.  Ghost imaging uses quantum properties to detect distant objects using very weak illumination beams that are difficult for the target to detect, and which can penetrate smoke and clouds.  Quantum illumination is similar and can be used in quantum radar.

Computing Operations

  • Gate: A basic operation on quantum bits and the quantum analogue to a conventional logic gate. Unlike conventional logic gates, quantum gates are reversible. Quantum algorithms are constructed from sequences of quantum gates.
  • Hadamard Gate: The Hadamard operation acts on a single qubit and puts it in an even superposition (i.e., turns and spins the qubit so the poles face left and right instead of up and down).  It is a universal gate operation which establishes superposition.
  • Fault Tolerance: Technical noise in electronics, lasers, and other components of quantum computers leads to small imperfections in every single computing operation, and these small errors ultimately lead to erroneous computation results.  Such errors can be countered by encoding one logical qubit redundantly into multiple physical qubits.  The required number of redundant physical qubits depends on the amount of technical noise in the system.  For superconducting qubits, experts expect that about 1,000 physical qubits are required to encode one logical qubit.  For trapped ions, due to their lower noise levels, only a few dozen physical qubits are required.  Systems in which these errors are corrected are fault tolerant.
  • Measurement: the act of observing a quantum state. This observation will yield classical information, but the measurement process will change the quantum state. For instance, if the state is in superposition, this measurement will ‘collapse’ it into a classical state of 1 or 0. Before a measurement is done, there is no way of knowing what the outcome will be.
  • NISQ: Noisy intermediate-scale quantum, a term coined by John Preskill in 2017 to describe the current state of QC, in which devices contain roughly 50–100 qubits (the “intermediate-scale” portion of the definition) that suffer from noise and rapid decoherence and lack full error correction.  Achieving that many error-corrected logical qubits would require on the order of 100,000–1,000,000 physical qubits, with the balance of the qubits dedicated to noise reduction.
  • Noise: In QC, noise is anything which impacts a qubit in an undesirable way, namely electromagnetic charges, gravity or temperature fluctuations, mechanical vibrations, voltage changes, scattered photons, etc.  Because of the precise nature of qubits, such noise is nearly impossible to prevent and requires substantial error-correction (to correct for the noise) in order to allow the qubits to perform desired calculations.
  • Quantum Algorithm: An algorithm is a collection of instructions that allows you to compute a function, for instance the square of a number. A quantum algorithm is exactly the same thing, but the instructions also allow superpositions to be made and entanglement to be created. This allows quantum algorithms to do certain things that cannot be done efficiently with regular algorithms.
  • Quantum Development Kit (QDK): A number of providers offer different types of QDKs, including some that are proprietary and others that are open source.  A QDK generally contains a programming language for quantum computing along with various libraries, samples, and tutorials.  QDKs are available from the following companies (with their QDK name in parentheses): D-Wave (Ocean), Rigetti (Forest), IBM (Qiskit), Google (Cirq), Microsoft (Microsoft QDK), Zapata (Orquestra), 1Qbit (1Qbit SDK), Amazon (Braket), ETH Zurich (ProjectQ), Xanadu (Strawberry Fields) and Riverlane (Anian).
  • Quantum Error Correction: The environment can disturb the computational state of qubits, thereby causing information loss. Quantum error correction combats this loss by taking the computational state of the system and spreading it out over an entangled state using many qubits. This entanglement allows observers to identify and remedy disturbances without observing the computational state itself, which would collapse it.  However, many hundreds or thousands of error-correcting physical qubits are required for each logical qubit.
  • Speedup: The improvement in speed for a problem solved by a quantum algorithm compared to running the same problem through a conventional algorithm on conventional hardware.
  • Coherence/Decoherence: Coherence is the ability of a qubit to maintain its state over time.  Decoherence generally occurs when the quantum system exchanges energy with its environment, typically from gravity, electromagnetism, temperature fluctuation or other physical inputs (see “Noise”).  Longer coherence times generally enable more computations and therefore more computational power for QC.
  • No Cloning Theorem: The no-cloning principle is a fundamental property of quantum mechanics which states that, given a quantum state, there is no reliable way of producing extra copies of that state. This means that information encoded in quantum states is unique. This is sometimes annoying, such as when we want to protect quantum information from outside influences, but it is also sometimes especially useful, such as when we want to communicate securely with someone else.
  • Oracle: A subroutine that provides data-dependent information to a quantum algorithm at runtime.  It is often used in the context of “how many questions must be asked before an answer can be given” in order to confirm or establish quantum advantage.
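The redundancy idea behind Fault Tolerance and Quantum Error Correction can be illustrated with its classical ancestor, the three-bit repetition code.  Real quantum codes are far more subtle, because qubits cannot be copied or directly observed, but the majority-vote intuition carries over.  The following is a hedged classical sketch in plain Python, not a quantum implementation, and the function names are my own:

```python
import random

def encode(bit):
    """Encode one logical bit redundantly into three physical bits."""
    return [bit, bit, bit]

def apply_noise(physical, flip_probability, rng=random):
    """Each physical bit is independently flipped with some probability."""
    return [b ^ (1 if rng.random() < flip_probability else 0) for b in physical]

def correct(physical):
    """Majority vote recovers the logical bit if at most one bit flipped."""
    return 1 if sum(physical) >= 2 else 0

# A single error is corrected:
noisy = encode(1)
noisy[0] ^= 1                 # flip one physical bit
assert correct(noisy) == 1    # the logical bit survives
```

The cost is the same one the glossary describes: three physical bits per logical bit here, and hundreds or thousands of physical qubits per logical qubit in the quantum case.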


  • Quantum Cloud: Access to Quantum Computers via a cloud-based provider.  Some prominent firms currently offering such access include IBM, Amazon, Google, and Microsoft, among others.  Two benefits of such QC access include lower up-front costs (users do not need to buy any hardware) and futureproofing (i.e., as the QC makers create more powerful machines, cloud access can be directed to the newer machines without any added investment required by the users).
  • Quantum Communication: A method of communication that leverages certain features of quantum mechanics to ensure security.  Specifically, once a given qubit is “observed” or measured, it collapses to either a “1” or a “0”. Therefore, if anyone intercepts or reads a secure quantum message, the code will have changed such that the sender and receiver can see the impact of the breach.  QKD or quantum key distribution is an existing technology that is already in use over fiber optics, certain line-of-sight transmissions, and recently by China via a special satellite link between Beijing and Vienna, Austria.
  • Shor’s Algorithm:  An integer factorization algorithm devised in 1994 by American mathematician Peter Shor.  The algorithm is public and open-source implementations are available to anyone; run on a powerful enough QC, it could break RSA encryption or other protocols relying on the difficulty of factoring large numbers.  For this reason, it is often cited as a clear example of the need and desire for a sufficiently powerful QC.  No QCs are yet powerful enough to use this algorithm to circumvent RSA or related encryption, but that will likely change at some point in the coming years.  “Post-Quantum” encryption generally means a protocol that would not be vulnerable to Shor’s Algorithm.
  • Grover’s Algorithm:  Another published algorithm, intended for search optimization.  In most current computer searches, the target samples must either be processed one at a time until the desired result is found, or the data must be organized (i.e., put in numerical or alphabetical order) to be searched more efficiently.  Grover’s algorithm effectively examines the entire search field at once (to a degree depending on the power of the QC) and therefore finds results much faster.  Shor’s and Grover’s algorithms are often the first two algorithms cited when discussing quantum supremacy and are elegant examples of the speedup that QCs can provide.
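To see where Grover’s speedup comes from, the algorithm can be simulated classically with a state vector: the oracle flips the sign of the marked item’s amplitude and a “diffusion” step reflects every amplitude about the mean, so the marked amplitude grows with each round.  After roughly √N rounds (versus ~N/2 checks for linear search) the marked item dominates.  This is a plain-Python sketch with my own illustrative function names, not quantum hardware:

```python
import math

def grover_search(n_qubits, marked, iterations=None):
    """Classically simulate Grover's algorithm on a 2**n_qubits state vector.

    Each iteration applies the oracle (sign flip on the marked amplitude)
    and the diffusion operator (inversion about the mean).  About
    (pi/4) * sqrt(N) iterations maximizes the marked probability.
    """
    n = 2 ** n_qubits
    if iterations is None:
        iterations = int(round(math.pi / 4 * math.sqrt(n)))
    amp = [1 / math.sqrt(n)] * n          # uniform superposition
    for _ in range(iterations):
        amp[marked] = -amp[marked]        # oracle: mark the answer
        mean = sum(amp) / n
        amp = [2 * mean - a for a in amp] # diffusion: invert about the mean
    return [a * a for a in amp]           # measurement probabilities

probs = grover_search(3, marked=5)        # search 8 items for index 5
best = max(range(8), key=lambda i: probs[i])
```

With 8 items, only 2 oracle calls concentrate over 90% of the measurement probability on the marked index; for large N the √N scaling is the whole advantage.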

I hope this glossary is a useful companion for your journey in understanding and appreciating Quantum Computing.  Feedback is always invited.

Disclosure: I have no beneficial positions in stocks discussed in this review, nor do I have any business relationship with any company mentioned in this post.  I wrote this article myself and express it as my own opinion.



If you enjoyed this post, please visit my website and enter your email to receive future posts and updates: http://quantumtech.blog

Russ Fein is a venture investor with deep interests in Quantum Computing (QC).  For more of his thoughts about QC please visit the link to the left.  For more information about his firm, please visit Corporate Fuel.  Russ can be reached at russ@quantumtech.blog.

Ten Fundamental Facts About Quantum Computing

The Quantum Leap

January 17, 2022

I’ve covered some of the key aspects of Quantum Computing in prior posts, including details about things like qubits, superposition, and entanglement.  I thought it would be helpful to readers to now synthesize and consolidate some of the fundamental properties of Quantum Computing in order to provide a bigger picture of the promise and potential of the industry. 

However, I want to be on alert for overblown claims or statements disconnecting fact from reality.  Some speak of a “Quantum Winter” in which the hype gets overblown and people get fed up with the unfulfilled promise and divert their attention (and resources) elsewhere, as has been the case with nuclear fusion as a power source.  So, I will be careful to be as fact-based as possible.  As with all these posts, I hope that readers without any formal physics or computer science training can still appreciate and understand the information presented.  Feedback is always welcomed and encouraged.

  1. What is a Quantum Computer?

Quantum Computers (QCs) use incredibly tiny particles (e.g., atoms, ions, or photons) to process information.  The physics that govern the behavior of particles at this minute size scale is quite different from the physics we experience in our much larger “people-scale”.  QCs control and manipulate the individual particles as “qubits” which hold and process information analogous to how “bits” control our computers and electronic devices.  However, the quantum mechanics at work at this scale allow QCs to process more information much more quickly than ordinary computers.  Also, because of the different physics at play, different questions can be processed, and physical systems can be more accurately modeled, suggesting significant new advances as the machines continue to scale in size and power.  The following table highlights some of the differences between existing digital/classical computers and QCs:

Think about these differences as enabling a Quantum Computer to do more per step, which is another way of saying it can process information faster than a classical computer. As it turns out, this speed advantage is phenomenal, which is why there is such enormous potential for Quantum Computers.  See here for a prior post with additional details.

  2. How are Qubits made?

There are several different ways people are creating and manipulating qubits, each with an array of strengths and weaknesses. The overarching challenge for each method is the need to maintain a constant, undisturbed environment for the qubit, shielding it from light, electromagnetism, temperature fluctuations, etc., while at the same time maintaining exquisite control of the qubit.  Any tiny disturbance in the environment can throw off the qubit and create “noise” in the calculations.  On top of this is the challenge of achieving precise control of such tiny elements, often in a cryogenic environment.  The power of the qubits resides in the ability to manipulate or rotate them very precisely.  This is a difficult engineering requirement that is increasingly being met by the players in the industry.  While there are a growing number of methods of creating and controlling qubits, here are some of the most common:

  • Superconducting Qubits:  Some leading QC players, including Google and IBM, are using superconductivity, achieved at near absolute-zero temperatures, to control and measure electrons.  While there are a few different ways these qubits are created (charge, flux, or phase qubits), the approach generally utilizes a microwave resonator to excite an electron as it oscillates around a loop containing a tiny gap, and measures how the electron crosses that gap.  Superconducting qubits have been used for many years, so there is abundant experimental knowledge, and they appear to be quite scalable.  However, the requirement to operate near absolute-zero temperatures adds a layer of complexity and makes some of the measurement instrumentation difficult to engineer.
  • Trapped Ions:  Ions are normal atoms that have gained or lost electrons, thus acquiring an electrical charge.  Such charged atoms can be held in place via electric fields, and the energy states of the outer electrons can be manipulated using lasers to excite or cool the target electron. These target electrons move or “leap” (the origin of the term “quantum leap”) between outer orbits as they absorb or emit single photons.  Trapped ions are highly accurate and stable, although they are slow to react and need the coordinated control of many lasers.
  • Photonic Qubits:  Photons do not have mass or charge and therefore do not interact with each other, making them ideal candidates for quantum information processing.  Photons are manipulated using phase shifters and beam splitters and are sent through a maze of optical channels on a specially designed chip, where they are measured by their horizontal or vertical polarity.
  • Semiconductor/Silicon Dots: A quantum dot is a nanoparticle created from a semiconductor material such as cadmium sulfide, germanium, or similar materials, but most often from silicon (due to the large amount of knowledge derived from decades of silicon chip manufacturing in the semiconductor industry). Artificial atoms are created by adding an extra electron to pure silicon and holding it in place using electrical fields. The spin of the electron is then controlled and measured via microwaves.

The following table highlights some of the features of these strategies along with companies currently working on QCs with these qubits.  See here for a prior post which provides added details.

  3. What is Superposition and Entanglement?

Nearly every introduction to Quantum Computing includes an explanation of Superposition and Entanglement, because these are the properties that enable qubits to contain and process so much more information than digital computing bits and enable the phenomenal speed-up in calculations.  While these are profound properties that are difficult to conceptualize with our common frame-of-reference on the macro-scale world, they are well established quantum physical properties. 

  • Superposition: classical computers use a binary system, meaning each processing unit, or bit, is either a “1” or a “0” (“on” or “off”) whereas Quantum Computers use qubits which can be either “1” or “0” (typically “spin up” or “spin down”) or both at the same time, a state referred to as being in a superposition.  This is a bit more subtle than it sounds because to use qubits for computational purposes, they need to be measured, and whenever you measure a qubit you will find it collapsing into either the 1 or 0 state.  But between measurements, a qubit can be in a superposition of both at the same time, which imparts more information per processing unit than a classical bit.
  • Entanglement: Quantum entanglement is a physical phenomenon that occurs when a group of particles are created, interact, or share proximity in such a way that the particles are “connected,” even if they are subsequently separated by large distances.  Qubits are made of things such as electrons (which spin in one of two directions) or photons (which are polarized in one of two directions), and when they become “entangled,” their spin or polarization becomes perfectly correlated.  It is this feature of quantum mechanics that largely underpins the awesome power of Quantum Computers because it enables the information processing to scale by an exponential factor (n qubits correspond to 2^n classical bits).  The following table showcases this feature:
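The 2^n scaling is easy to verify for yourself: simulating n qubits on a classical machine means tracking 2^n complex amplitudes.  A quick back-of-the-envelope sketch (the 16 bytes per amplitude, one complex double, is my assumption for illustration):

```python
def statevector_bytes(n_qubits, bytes_per_amplitude=16):
    """Memory a classical computer needs to hold a full n-qubit state:
    2**n complex amplitudes at an assumed 16 bytes each."""
    return (2 ** n_qubits) * bytes_per_amplitude

# Each added qubit doubles the requirement:
assert statevector_bytes(31) == 2 * statevector_bytes(30)

gigabytes_30 = statevector_bytes(30) / 1e9   # ~17 GB: a well-equipped PC
petabytes_50 = statevector_bytes(50) / 1e15  # ~18 PB: beyond any single machine
```

Twenty extra qubits turn a laptop-sized problem into one no classical computer can hold in memory, which is exactly the exponential wall the table above depicts.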

To give this some context, 100 qubits is the equivalent of an Exabyte of classical computing RAM which is a million trillion bytes (18 zeros). It would take a powerful classical computer nearly the lifetime of the universe to process that amount of data! The corollary is that quantum computers can perform certain complex calculations phenomenally faster than classical computers, and this concept of entanglement is a key to the performance superiority.  See here for more details on superposition and entanglement.

However, the sobering reality is that this chart assumes the qubits can be perfectly controlled for the duration of the calculations and that all the qubits can entangle with each other.  We are still quite far away from being able to achieve these parameters at a meaningful scale, although progress and advances are being made continuously.  The other key to understanding and appreciating this is to distinguish between the “logical qubits” which this table describes and “physical qubits”.  You may hear of companies using quantum computers with over 1,000 qubits, but in the current NISQ (noisy intermediate-scale quantum) environment, many of the physical qubits are dedicated to error-correction as opposed to logic/calculations, and often the qubits lose their superposition or entanglement properties (decoherence) very quickly, before the algorithms can be completed.  So, discussions about the number of qubits in a given quantum computer need the proper context to understand the computing power implications.
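The logical-versus-physical distinction can be put in rough numbers using the overheads cited in the glossary above (~1,000 physical qubits per logical qubit for superconducting devices, a few dozen for trapped ions; both are expert estimates, not measured constants):

```python
def physical_budget(logical_qubits, physical_per_logical):
    """Rough count of physical qubits needed to realize a given number of
    error-corrected logical qubits at a given redundancy overhead."""
    return logical_qubits * physical_per_logical

# Budget for 100 logical qubits under each cited overhead estimate:
superconducting = physical_budget(100, 1000)  # 100,000 physical qubits
trapped_ion = physical_budget(100, 50)        # 5,000 physical qubits
```

So a headline qubit count, on its own, says little: a 1,000-physical-qubit superconducting machine may represent only a single logical qubit's worth of error-corrected capacity.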

  4. Is the Power of Quantum Computers Magical?

You may be hearing claims of phenomenal powers of Quantum Computers (including from yours truly) along with descriptions of “quantum” as doing things surreal or supernatural (e.g., Schrodinger’s cat being both alive and dead).  Features of Superposition and Entanglement are very difficult for a lay person to understand or appreciate, let alone believe can be used for computing purposes.  Some even describe quantum mechanics as “magical.”  Most people, when they think of magic, conjure up parlor tricks or optical illusions, so it would be natural to doubt the veracity of the claims of QC, especially combined with the fact that nobody has (yet) created a QC that can perform real-world useful computations that can’t be performed on a classical computer.  However, while the underlying mathematics are advanced, there is clear and agreed science concerning the construction and performance of Quantum Computers.  The mathematical principles of manipulating qubits and using them to create logic gates are based on well-established linear algebra and trigonometry.  Innumerable quantum algorithms are being written and will perform useful and important calculations once quantum machines scale to match the required power.  At this point, it is difficult to predict precisely when such scale will be achieved, but those in the field will confirm that this is an engineering challenge, not a theoretical challenge.

  5. I Hear True Quantum Computing may be Decades Away.  Is that True?

This is very difficult to answer with precision.  My first “computer”, bought in 1980, was a Sinclair ZX80 with only 8K of memory, a puny amount compared to today’s PCs.  It certainly could not perform any applications or calculations that were of practical use at the time, although I was able to write some very basic code (fittingly, “BASIC” was also the programming language it used).  But I could truthfully and accurately say in 1980 that I was using a personal computer to execute commands.  A similar statement can currently be made by users of existing QCs, and many people are using cloud-based Quantum Computers today to run simple algorithms. While they are not yet capable of performing calculations that ordinary computers can’t perform, it is a dynamic and evolving situation.

At the same time, companies like D-Wave have “quantum” computers that use annealing, which leverages certain aspects of quantum mechanics but cannot perform typical gate functions.  They have many customers performing useful optimization calculations today, although these machines are not full-fledged QCs in the typical sense.

While there are no crystal balls, there are several high-profile quantum computing companies publishing their development timelines, which generally suggest a large-scale product (i.e., more than 100 logical qubits) before the end of this decade.   See below for IBM, Honeywell (Quantinuum) and IonQ versions: 

Many predict consistent Quantum Advantage (when Quantum Computers can consistently perform real-world calculations) in the next 5-10 years.  The key thing to follow as the industry advances, will be to monitor which players are successful in meeting their timeline milestones.  As more and more companies achieve important stated milestones, this timeline should become more precise.

  6. Can We Measure Quantum Computing Power?

Unfortunately, there is no universally recognized measurement standard for the power of a Quantum Computer.  Several characteristics are important, including the number of qubits, the fidelity of the qubits, the length of time entanglement can be sustained, the number of gates that can be utilized, the number of connections between qubits that can be controlled, etc.  Recently, IBM proposed a metric called “quantum volume” which is intended to consolidate many of these features, although not all players are utilizing this standard.  Absent an established metric, be careful to understand and appreciate the claims made by Quantum Computing companies, realizing that the power of the computer is not necessarily directly correlated to the number of qubits it uses.   See here for a prior post which covered performance measurement.

  7. Are People Really Using Quantum Computers?

This is a bit of a trick question.  The truth is that dozens of providers have made actual working Quantum Computers available for use via the Cloud.  Some basic machines are available for no charge, some are available free for academic use, and some can be utilized for a modest cost.  You could finish reading this article and, assuming you were familiar with basic Python programming, download a development kit from IBM (Qiskit), Microsoft (Q#), Google (Cirq), Amazon (Braket) or others and begin writing quantum algorithms.  You could then establish an account with one of the QC cloud providers and either wait in the queue for your turn on a given machine or acquire time to have the algorithm run on one of dozens of machines available remotely.

A recent study by Zapata Computing revealed that many companies are also using or planning to use QC in their businesses.  Specifically, the study indicates that “69% of enterprises across the globe reveal they have adopted or are planning to adopt QC in the next year,” with those already having adopted some form of QC amounting to 29% of their survey respondents.  In addition, you may read of many companies using Quantum Computers today to begin various optimization analyses.  The following highlights some of the companies currently exploring QCs for various business applications:

  1. Where will Quantum Computing Provide Early Impact?

The superposition and entanglement of qubits enable QCs to evaluate many dataset items simultaneously rather than sequentially, hence the tremendous speed-up in processing.  One area where QCs can use this speedup to provide a quantum advantage is in processing currently unmanageable combinatorial problems (simulation and/or optimization).  To visualize this, consider that a simple seating chart for 16 people involves over 20 trillion possible configurations [see here for a prior post describing this in more detail].  Imagine the complexity of trying to design new chemicals, materials, medicines or optimized financial portfolios.  The number of atoms, chemical bonds, or securities involved makes computer simulation practically impossible with existing classical computers, and the trial-and-error of experimentation is costly and time consuming.  Problems involving combinatorics are therefore the likely first uses of QCs.  The following table highlights some of these use cases:
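The seating-chart figure is straightforward to verify: 16 people in 16 ordered seats gives 16! possible arrangements.

```python
import math

# Number of ways to seat 16 people in 16 ordered seats: 16!
arrangements = math.factorial(16)
print(f"{arrangements:,}")  # -> 20,922,789,888,000 -- over 20 trillion
```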

  1. Are My Bitcoin Portfolio and Encrypted Bank Transactions Vulnerable to Quantum Attack?

The short answer is: not really.  While it is theoretically true that a powerful enough Quantum Computer could mine all remaining cryptocurrency and break standard RSA encryption (used for most secure messages and transactions communicated over the Web), this is a well-known issue that is receiving substantial remedial attention.  NIST (the National Institute of Standards and Technology), a government entity which oversees certain standards and measurements, is in the final round of approving candidates for a post-quantum cryptography standard.  There are four Round 3 finalists for Public-Key Encryption and Key-Establishment Algorithms and three Round 3 finalists for Digital Signature Algorithms, so new approved protocols which are “quantum safe” are imminent.  In addition, there are other ways to secure on-line transactions besides RSA encryption, such as two-factor authentication, so more and more users are establishing enhanced protections.  As for bitcoin, that is a bit more nuanced.  Since most cryptocurrencies rely on increasingly complex mathematics for the mining of new coins, there is a finite number of bitcoins that can be created, and with existing computing power it is anticipated that the discovery, or mining, of new coins will take longer and longer until the final amount is reached (estimated at ~100 years at the current pace).  So, if quantum computers are built which can mine faster, this end date may be accelerated, but the total number of possible bitcoins won’t change.

  1.  How can I Learn More?

There are many excellent resources available including articles, papers, on-line tutorials, books, and other resources.  Please sign up to receive this blog as new posts are written and/or visit this section of the Quantum Leap blog for links to some additional resources.

Disclosure: I have no beneficial positions in stocks discussed in this review, nor do I have any business relationship with any company mentioned in this post.  I wrote this article myself and express it as my own opinion.


IBM’s roadmap for scaling quantum technology | IBM Research Blog, retrieved January 16, 2022.

Scaling IonQ’s Quantum Computers: The Roadmap, retrieved January 16, 2022.

Jean-Francois Bobier, Matt Langione, Edward Tao and Antoine Gourevitch, “What Happens When “If” Turns to “When” in Quantum Computing”, Boston Consulting Group, July 2021.

Harnessing the Power of Quantum Computing | Honeywell Beyond 2021, accessed January 9, 2022.

“Starting the Quantum Incubation Journey with Business Experiments”, Digitale Welt Magazine, accessed January 16, 2022.

The First Annual Report on Enterprise Quantum Computing Adoption, Zapata Computing, July 5, 2022.

If you enjoyed this post, please visit my website and enter your email to receive future posts and updates:
Russ Fein is a venture investor with deep interests in Quantum Computing (QC).  For more of his thoughts about QC please visit the link to the left.  For more information about his firm, please visit
Corporate Fuel.  Russ can be reached at russ@quantumtech.blog.

Quantinuum – Company Evaluation

The Quantum Leap

January 9, 2022

When I established this blog in November of last year, I noted that I would present posts regarding details underlying Quantum Computers (QC), the immense potential they hold, and advances being made.  I hope you have enjoyed those posts which will continue (see here for a link to prior posts), but I also stated an intention to reflect on current events, companies and breakthroughs.  I thought it fitting that Quantinuum be the first company profile presented in this series. 


In June of 2021, after a series of successful collaborations, Cambridge Quantum Computing (CQC) reached an agreement to merge with Honeywell Quantum Solutions (HQS), a division of Honeywell. Then, in November of 2021, Honeywell spun the combined businesses out into a new stand-alone company called “Quantinuum”. In addition, Honeywell invested $300m into Quantinuum, which is now 54% owned by Honeywell and 46% by CQC shareholders.

CQC, founded in 2014, is a global Quantum Computing software company which designs enterprise applications in the areas of quantum chemistry, machine learning and cybersecurity, among others.  Honeywell is a Fortune 100 multinational conglomerate with operations in aerospace, building technologies, performance materials, and safety and productivity solutions.  Its diverse industrial footprint included expertise in cryogenic systems, ultra-high vacuum systems, photonics, RF (radio-frequency) magnetic systems, and ultra-high precision control systems, all of which turned out to be extremely well suited for building a quantum computer.  In ~2010, Honeywell Quantum Solutions was secretly formed; it reached some critical technical milestones in 2015 and was publicly disclosed in 2018. By 2020, HQS had released the “Model H1”, a modest 10-qubit trapped-ion QC, and it has been on an aggressive timetable for scaling up its QC portfolio, recently showcasing a quantum volume of 2,048 using its 12-qubit Model H1-2, a 10x increase in quantum volume in less than one year.

Details on Honeywell Quantum Solutions

Leveraging its 130 years of innovation including strengths in science, engineering and research, Honeywell has developed trapped-ion quantum computers using individual, charged atoms (ions) to hold quantum information. Their system uses electromagnetic fields to hold (trap) each ion so it can be manipulated and encoded using microwave signals and lasers.  These trapped-ion qubits can be uniformly manufactured and controlled more easily compared to alternative qubit technologies that do not directly use atoms, and the approach does not require cryogenic cooling (although an ultra-high vacuum environment is required).

In October of 2020, HQS introduced its first quantum computer, the System Model H1, which featured 10 fully connected qubits and a quantum volume of 128, which was the highest reported at the time (surpassing IBM’s prior record of 64).  By this past December, the Model H1-2 successfully passed the quantum volume benchmark of 2,048, a new global record and consistent with the Company’s stated timeline of annual 10x increases in quantum volume.  The hardware roadmap includes four key milestones to be achieved before the end of the current decade:

  1. Model H1: Creation of a linear device with 10 computational qubits [achieved], eventually scaling to 40 qubits.
  2. Model H2: Using the same lasers to perform operations on two sides of a racetrack configuration.  Once achieved, quantum volume should exceed that possible with classical computers (i.e., will not be able to be simulated on classical machines).
  3. Model H3: Geometries change to a grid, which will be much more scalable than linear or racetrack configurations.
  4. Model H4: Aim to integrate optics via photonic devices that allow laser sources to be an integrated circuit. 
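One way to see why the Model H2 milestone would put a machine beyond classical simulation: simulating n qubits exactly requires storing 2^n complex amplitudes, which outgrows any realistic memory.  A rough sketch, assuming double-precision complex amplitudes (16 bytes each):

```python
def statevector_bytes(n_qubits: int) -> int:
    """Memory for a full statevector: 2^n amplitudes x 16 bytes each."""
    return (2 ** n_qubits) * 16

print(f"{statevector_bytes(30) / 2**30:.0f} GiB")  # 16 GiB -- a laptop's limit
print(f"{statevector_bytes(50) / 2**50:.0f} PiB")  # 16 PiB -- beyond any supercomputer
```

Each added qubit doubles the memory required, so somewhere in the range of roughly 50 fully entangled, high-fidelity qubits, brute-force classical simulation becomes infeasible.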

The following chart showcases the planned roadmap:

Source: Honeywell

Details on Cambridge Quantum Computing

The team at CQC has been developing the theoretical foundations of Quantum Computing for over 25 years.  They design, engineer and deploy algorithms and enterprise level applications leveraging TKET, their hardware-agnostic software development platform, along with other technologies.  They have developed application specific quantum software across a number of fields including quantum chemistry, quantum artificial intelligence and quantum cybersecurity.  Here is a brief overview of their products and solutions:

TKET: is a leading open-source development toolkit that enables optimization and manipulation of quantum circuits for current quantum computers.  As a platform-agnostic tool, TKET can integrate with most commercially available quantum hardware platforms including IBM, Honeywell, Google, IonQ and others, as well as third-party quantum programming tools including Cirq, Qiskit and Pennylane.

Quantum Origin: is an industry-defining, cryptographic key generation platform that employs quantum computers to generate quantum-enhanced cryptographic keys.  Using the Quantum Origin platform, both classical algorithms (e.g., RSA or AES) and post-quantum algorithms (e.g., CRYSTALS-Dilithium and Falcon) can be seeded to provide cryptographic keys that offer superior protection against even the most powerful of adversaries (see more on Quantum Origin below in the Strangeworks Collaboration section).

QNLP: The rapidly emerging field of Quantum Natural Language Processing, and its underlying theoretical foundations, has been pioneered by the team at CQC. lambeq is the world’s first software toolkit for QNLP capable of converting sentences into a quantum circuit. It is designed to accelerate the development of practical, real-world QNLP applications, such as automated dialogue, text mining, language translation, text-to-speech, language generation and bioinformatics. Their structural approach takes advantage of mathematical analogies between theoretical linguistics and quantum theory to design “quantum native” NLP pipelines. Combined with advances in Quantum Machine Learning (QML), CQC has successfully trained quantum computers to perform elementary text classification and question-answering tasks, paving the way for more scalable intelligence systems.

Quantum Artificial Intelligence: (QAI) is one of the most promising and broadly impactful application areas of quantum computing. CQC is simultaneously pioneering the highly interconnected areas of quantum machine learning, quantum natural language processing, quantum deep learning, combinatorial optimization and sampling (i.e., Monte Carlo simulations) to build intelligence systems of the future.

QA: The Quantum Algorithms division is seeking to realize definitive and unequivocal quantum computational advantage as soon as possible. Although ultimately interested in all quantum algorithms, at present, the focus is on three problems which show promise for early quantum advantage, including Monte Carlo estimation, optimization and solving Partial Differential Equations (PDEs).

QML: The Quantum Machine Learning division, in collaboration with industrial, academic and governmental partners, designs and engineers novel, application-motivated Quantum Machine Learning algorithms across industries such as finance, healthcare, pharma, energy and logistics.

EUMEN: Currently in advanced beta testing, EUMEN is an enterprise-grade quantum computational chemistry package and development ecosystem, enabling a new era of molecular and materials simulations. Developed in close collaboration with Fortune 500 partners, EUMEN’s modular workflow enables both computational chemists and quantum algorithm developers to easily mix and match the latest quantum algorithms with advanced subroutines and error mitigation techniques to obtain best-in-class results. Current applications in development with clients include new material discovery for carbon sequestration, drug design and discovery, and hydrogen storage.

The Combined Companies as Quantinuum

Quantinuum has the benefit of CQC’s software and algorithm expertise combined with HQS’s hardware expertise, creating the largest full-stack dedicated quantum computer company.  Quantinuum has about 400 employees in 7 offices in the US, UK and Japan.  On the hardware side, the Model H series of quantum computers are available via the cloud, facilitating broad access and ensuring it is “future-proof” for customers as the product evolves and advances.  On the software side, the open-source platform-agnostic approach will continue, ensuring customers always have access to the best tools for the target application and will not be dependent on a single company’s machines.

The predecessor companies had a long history of collaboration.  In fact, CQC was the first external user to run a quantum circuit on the System Model H0, Honeywell’s inaugural commercial system.  No organization outside of Honeywell had used the H-Series hardware more than CQC, so the formal combination of the businesses seems like a natural extension of their legacy collaborations.  Now that the business has been spun out into a stand-alone company, you can expect to see a Quantinuum IPO sometime this year.

Strangeworks Collaboration

“Quantum Origin” is the first commercially available product based on verifiable quantum randomness, a capability essential to strengthening existing security software and protecting enterprise systems from threats posed by quantum computing-based attacks.  Just this past week, Strangeworks, a global quantum computing software company, announced a collaboration to implement Quantinuum’s quantum-enhanced cryptographic keys into the Strangeworks ecosystem.  By implementing Quantum Origin, Strangeworks will be the first to offer a seamless path to quantum-generated cryptographic keys, and it expects to expand the relationship between the parties, enabling rapid adoption, insights and continued development.

Select Customer Usage Cases

Quantinuum has listed a few case studies on their website, including the following:

Nippon Steel: Has collaborated with the Company to optimize scheduling.  As the recent global supply-chain disruptions have highlighted, the complexities of managing manufacturing and supply often require companies to juggle resources.  Nippon Steel produces over 50 million metric tons of steel annually and has been using an algorithm co-developed with Quantinuum, run on a System Model H1, to schedule the intermediate products it uses.  Having the right balance of raw materials and intermediate products is essential, and it is a delicate balancing act facilitated by Quantinuum.

Samsung: The electronics giant teamed up with Imperial College London to investigate new battery materials using a System Model H1. The team created a simulation of the dynamics of an interacting spin model to examine changes and effects of magnetism.  They were able to run deep circuits and use as many as 100 two-qubit gates to support the calculations, confirming the Model H1 can handle complex algorithms with a high degree of accuracy.

BMW: Entropica Labs, a Singapore-based quantum software startup, and the BMW Tech Office teamed up to develop and run a Recursive Quantum Approximate Optimization Algorithm (R-QAOA) to benchmark logistics and supply-chain optimization via number partitioning, a classic combinatorial problem that is an entry point to many logistics challenges.  More complex versions of R-QAOA are now being explored.

This is just a small sampling of current projects and customers, with more than 750 overall collaborations currently underway, suggesting substantial customer uptake and potential.


Cambridge Quantum Computing and Honeywell Quantum Solutions were each already formidable players in the evolving QC space and have been generating meaningful revenues from this nascent field. CQC was a reputable and well-established quantum software and algorithm provider, and HQS has created advanced QC devices which continue to scale and surpass performance records.  Assuming they can achieve synergies as a combined company, the upward trajectory should accelerate.  That said, the QC industry is still quite immature, and many players are dedicating substantial resources, so any early market lead will remain vulnerable to new technologies or competitive advances.  If Quantinuum can successfully combine the broad client portfolio and industrial legacy of Honeywell with the substantial history and success of CQC, it should remain a leader in this growing field.  The following table highlights some of the key attributes of Quantinuum:


Apropos of the probabilistic nature of quantum algorithms, I wanted to leverage the nomenclature to create a company rating system and assign a scale to my overall assessment of a company’s potential.  Accordingly, I will use the formula below when reviewing companies, whereby the “alpha” coefficient correlates with “positivity” (and the formula adheres to the Born rule).  Given my overall assessment of Quantinuum, including its strong position as a full-stack player, the strengths of the legacy businesses and the potential synergies, I am assigning the highest rating to Quantinuum at this time, with an Alpha of 0.95, which equates to “Exceptional performance expected”.
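In bra-ket terms, the rating scheme amounts to the following (the notation and outcome labels here are illustrative, not a reproduction of the exact formula):

```latex
% Rating "state": alpha is the positivity amplitude, normalized per the Born rule
\lvert \text{rating} \rangle = \alpha \,\lvert \text{positive} \rangle
  + \beta \,\lvert \text{negative} \rangle ,
\qquad \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1
% An alpha of 0.95 implies a "positive" probability of 0.95^2 \approx 0.90
```

Per the Born rule, the probability of an outcome is the squared magnitude of its amplitude, so an Alpha of 0.95 corresponds to roughly a 90% weight on the positive outcome.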

Disclosure: I have no beneficial positions in stocks discussed in this review, nor do I have any business relationship with any company mentioned in this post.  I wrote this article myself and express it as my own opinion.


Our Technology – Cambridge Quantum, retrieved January 8, 2022.

Strangeworks and Quantinuum partner to integrate world’s first quantum-enhanced cryptographic key service – Strangeworks, retrieved January 8, 2022.

TQD Exclusive: Interview with Tony Uttley, President of Honeywell Quantum Solutions, Kirmia, Andrew, May 3, 2021.

Cambridge Quantum Computing, Pitchbook profile, accessed August 2, 2021

Next Few Months Will Demonstrate Quantum Cybersecurity Value of the New Quantum Computing Company Quantinuum, The Qubit Report, December 3, 2021

If you enjoyed this post, please visit my website and enter your email to receive future posts and updates:
Russ Fein is a private equity/venture investor with deep interests in Quantum Computing (QC).  For more of his thoughts about QC please visit the link to the left.  For more information about his firm, please visit
Corporate Fuel.  Russ can be reached at russ@quantumtech.blog.

Quantum Supremacy vs. Quantum Advantage – and how do we measure these things? 

Quantum Supremacy vs Quantum Advantage 

In October of 2019, Google announced that they had demonstrated the ability to compute in seconds what would take the largest and most advanced supercomputers thousands of years, thereby achieving a milestone referred to as “quantum supremacy” for the first time. They used a processor named “Sycamore” with 54 programmable superconducting qubits to create quantum states on 53 qubits (one did not operate), corresponding to a computational state-space of 2^53 (about 10^16, or over ten million-billion, states).  They achieved this using a two-dimensional array of 54 transmon qubits, where each qubit is tunably coupled to four nearest neighbors. Each transmon has two controls: a microwave drive to excite the qubit, and a magnetic flux control to tune the frequency.  The claim was generally considered by many to be a “Wright Brothers at Kitty Hawk” type of achievement.
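The size of that state-space is easy to check:

```python
# 53 qubits span 2^53 computational basis states
states = 2 ** 53
print(states)            # -> 9007199254740992
print(f"{states:.2e}")   # -> 9.01e+15, i.e. roughly 10^16
```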

Then, in late 2020, researchers at the University of Science and Technology of China (“USTC”) announced that they had also achieved quantum supremacy, utilizing a Quantum Computer named “Jiuzhang” which manipulates photons via a complex array of optical devices including light sources, hundreds of beam splitters, dozens of mirrors and 100 photon detectors.  They claimed that their device performed calculations in 20 seconds that would take a supercomputer 600 million years. Both Google and USTC have increased their qubit utilization since these breakthroughs, and several other companies have now successfully operated Quantum Computers with dozens of qubits, and a couple with 100 or more.

Let’s review some semantics regarding the measurement of Quantum Computing performance.  In 2012, John Preskill, a professor of theoretical physics at Caltech and a leading quantum mechanics researcher, coined the term “quantum supremacy” to “describe the point where quantum computers can do things that classical computers can’t, regardless of whether those tasks are useful.”  He coined the term before any actual Quantum Computers had been built.  At the time, Preskill was wondering, in his words, “whether controlling large-scale quantum systems was merely really, really hard or whether it was ridiculously hard.  In the former case we might succeed in building large-scale quantum computers after a few decades.  In the latter case we might not succeed for centuries.”  In this sense, and based on Preskill’s original intent, the announcement by Google is a bona fide example of quantum supremacy, indicating that “a plethora of quantum technologies are likely in the next decade or so” [Preskill, 2019].

So, although the Google Sycamore quantum supremacy claim was discounted by some (most notably IBM and researchers in China), and despite the calculation being admittedly highly contrived and not very useful, it was a ground-breaking achievement.

Before I get into the semantics of how we measure Quantum Computing power, here is what the quantum community generally means regarding quantum progress: 

Quantum Supremacy: This term still retains Preskill’s original context and is considered the first major step toward proving quantum computing is feasible.  Specifically, it means “demonstrating that a programmable quantum device can solve a problem that no classical computing device can solve in a feasible amount of time, irrespective of the usefulness of the problem.”  By this definition, the threshold was passed in October 2019; in fact, it has since been demonstrated by several companies beyond Google, which is why I refer to the current hurdles as engineering challenges rather than theoretical ones.

Quantum Advantage: Refers to the demonstrated and measured success in processing a real-world problem faster on a Quantum Computer than on a classical computer.  While it is generally accepted that we have achieved quantum supremacy, it is anticipated that quantum advantage is still some years away.  

How do we Measure Quantum Computing Performance? 

At the end of a prior post regarding qubits, I alluded to the challenge of measurement metrics for Quantum Computing, highlighting that the count of operating qubits is not an appropriate yardstick.  Imagine you were shopping for a new car.  If the only metric available was “horsepower”, it would be very difficult to decide which car to buy.  By itself, horsepower is only one measure of car performance; it does not factor in acceleration, fuel efficiency, ride comfort, handling, noise levels, legroom, sleekness, color/trim/style, etc.  Even for computers, focusing just on clock speed, for example, would not provide enough information to make an informed purchase decision.  While Quantum Computers are in their very early stages, simply measuring a particular calculation speed or the number of qubits used is not enough to accurately describe actual performance capabilities.  Researchers at IBM have proposed the term “Quantum Volume” to enable the systematic measurement of Quantum Computing performance.  It is a metric that captures the capabilities and error rates of a Quantum Computer by calculating the maximum size of square quantum circuits that can be implemented successfully.  While the details are a bit esoteric, it is intended to provide one number, or score, that can be used to compare incremental technology, configuration and design changes, and to compare the relative power of one Quantum Computer to another.
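Concretely, quantum volume is reported as 2^n, where n is the size of the largest “square” circuit (n qubits, n layers of gates) the machine runs successfully; the pass/fail criterion itself (heavy-output sampling) is omitted in this sketch:

```python
def quantum_volume(n: int) -> int:
    """QV = 2^n, where n is the largest n-qubit, depth-n circuit passed."""
    return 2 ** n

# The Honeywell/Quantinuum figures cited in this blog fit this scheme:
print(quantum_volume(7))   # 128  -- Model H1 at launch (10 physical qubits)
print(quantum_volume(11))  # 2048 -- Model H1-2's record (12 physical qubits)
# Note that n can be smaller than the physical qubit count: noise limits
# the largest square circuit that actually passes the benchmark.
```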

In fact, the performance of a quantum computer involves many factors as shown below: 

Source: IBM and Forbes as adapted by Riccardo Silvestri 

Since quantum volume is not quite an industry term-of-art at this point, I won’t use it as the definitive measurement tool.  However, the concept of focusing on characteristics beyond just the “number of qubits” is crucial, and I will discuss the relative performance characteristics of competing Quantum Computers beyond just a mention of the number of qubits. 

While many of the balloons in the above graphic may be unfamiliar, there are three key metrics for measuring quantum computing performance: 

  1. Scale: The number of qubits the computer can simultaneously process.  It is important to distinguish between physical and logical qubits, with logical qubits being the key element (as I’ll show below, many constructs add physical qubits for error-correction overhead). 
  2. Quality: The quality of the circuits, which factors in both the time that the qubits remain in superposition and entangled before they decohere, and the number of qubits that can entangle with each other. 
  3. Speed: Typically measured in circuit layer operations per second (CLOPS), or how many circuits or gates can run on a Quantum Computer in a given time.  While this is a strong and objective measurement, it is not generally reported at this time. 

Another reason that the “number of qubits” is not useful for comparing performance is that we are currently operating in the NISQ environment (recall the “N” is for noisy).  Accordingly, many constructs are being proposed where certain qubits are dedicated to error correction rather than added entanglement.  IBM has a useful graphic to highlight the tradeoff between physical and logical qubits based on error rates: 

Quantum Computing Milestones 

While the semantics and various yardsticks used to describe Quantum Computer performance are confusing, evolving and not yet universally agreed upon, real progress is being made no matter which metric is showcased.  Here are a few recent advances in early working Quantum Computers, although not all report the same metrics, so it is difficult to compare them to each other: 

In addition to these Quantum Computers, Intel has a 49-qubit QC, Xanadu has a 24-qubit QC, and MIT has a 100-qubit QC; however, the other performance metrics noted in the table are not readily available for these. 

It is worth noting that USTC recently claimed that Zuchongzhi 2.1 is a million times more powerful than Google’s Sycamore, and that it is 10 million to 100 trillion times faster than the world’s fastest supercomputer.  While it is difficult to substantiate these claims, given China’s enormous focus on Quantum Computing, a China-US space race of sorts is certainly afoot.  Also, the Quantinuum achievement on H1, only very recently announced, is worth paying close attention to given its high quantum volume and long decoherence times. 

Semantics and yardsticks aside, it is fascinating to see the increasing number of companies creating working Quantum Computers with ever-increasing performance metrics, confirming that building these devices is merely “really, really hard” and not “ridiculously hard”.  It seems like we are seeing new press releases each week showcasing quantum performance achievements by these and others in the field.  Stay tuned as we track the performance. 


arXiv:1203.5813, “Quantum Computing and the entanglement frontier”, Preskill, John, March 26, 2012 

Quanta Magazine, “Why I called it ‘Quantum Supremacy”, Preskill, John, October 2, 2019 

Nature, “Quantum supremacy using a programmable superconducting processor,” Arute, Arya, Babbush, et. al., October 23, 2019 

The Independent – UK, “China builds world’s fastest programmable quantum computers that outperform ‘classical’ computers,” Sankaran, Vishwam, October 31, 2021 

Scorecards – Quantum Computing Report, Retrieved December 2021 

Silvestri, Riccardo. Masters Degree Thesis: “Business Value of Quantum Computers: analyzing its business potentials and identifying needed capabilities for the healthcare industry.” August 2020 

The Evolving Quantum Computing Ecosystem

In the past few blog posts I have described what a Quantum Computer is and how it can be so powerful and transformative, covered basic features of qubits, and highlighted some of the major players in Quantum Computing (“QC”).  But just like the evolution of personal computing, there are many participants in the QC ecosystem beyond the makers of the actual machines.  You likely use a PC today, manufactured by one of a number of hardware makers.  However, your machine’s core operating memory is made by a different company, and it is built upon an operating system (likely Windows, owned by Microsoft) as well as various software applications.  You may also use external data drives, a mouse, a screen, a printer, various cables and other physical devices.  You likely also access the internet and cloud services, utilize a virus protection program, and use other related services.  There are likely dozens if not hundreds of companies whose technologies you use daily to operate your computing device. 

Companies like Oracle ($280 billion market cap; database management), Ingram Micro ($5.8 billion market cap; distributor of technology equipment), Cisco ($250 billion market cap; interconnecting equipment and services), Symantec ($15 billion market cap; antivirus protection), Adobe ($311 billion market cap; document and process software) and Salesforce ($262 billion market cap; productivity platform) have created enormous value despite not actually making any computers.  Quantum Computers will likely spur many similar players in their ecosystem; in fact, there are already hundreds of players engaged in this space.  Some of these participants may also carve out significant market positions and value.  To give a sense of the breadth and depth of players needed, you can visualize the basic inner workings of a Quantum Computer as follows:

As this graphic shows, there are various aspects of the physical creation and manipulation of qubits (the bottom section of the graphic) along with the software needed to control the logical layer.  Also, as covered in a prior post, there are various ways to create qubits, often requiring cryogenic temperatures and/or detailed laser or radio-frequency controls. 

Here is another graphic to help visualize the complexities of building a quantum computer:

Source: IBM

You’ll note the various wiring, amplifying, microwave-generation and shielding components, all requiring highly specialized design and control.  In order to describe the various QC players, it is helpful to segregate them into functional categories, or buckets, as follows:

Hardware: Companies seeking to build a fully-functional Quantum Computer.  Many are also creating software and are integrating access to the cloud.  As discussed in a prior post, there are a few competing technologies underlying the creation of a working Quantum Computer including superconducting loops and Quantum Dots (which require cryogenics), or Ion traps and Photonics (which require sophisticated optics/laser controls), among others.

Circuits/Qubits: Some companies are focused on qubits and their interoperability for entanglement rather than attempting to build complete systems.

Cryogenics: Superconducting loops and quantum dots require temperatures approaching “absolute zero” (~negative 460 degrees Fahrenheit).  Many of the pictures you may see of Quantum Computers (like the graphic above) depict a 7-tiered structure whereby the temperature is lowered at each layer, and there are companies that specialize in this temperature control.

Wiring/Controllers: Operating near absolute zero, using lasers to control individual atoms or manipulating and controlling individual photons all require specialized and sophisticated devices and connections.  Some players are focused just on these types of challenges.

Error Correction: Due to the current NISQ (noisy intermediate-scale quantum) landscape and the need for enormous computing “overhead” to correct for the noise in today’s qubits, some companies are concentrating on error correction strategies.

Photonics:  Lasers and/or photons are being utilized in various QC constructs and some companies are providing this specialization.

Software: Many of the major companies have developed quantum software to control and manipulate the qubits and the gates formed to perform quantum algorithms.  Some of these are creating open-source platforms while others are working on proprietary languages.

Applications:  Although this is still a somewhat immature portion of the market, as Quantum Computers continue to become more and more robust, I expect to see many more businesses develop applications and various related consulting services.

I will describe some of the players in this ecosystem, although the list is vast and growing, so this is not meant to be a definitive roster, rather a sampling to highlight the broad set of players and opportunities in Quantum Computing.  For a more complete list of players, I encourage you to visit this Quantum Computing Report listing.

In a prior post I noted that some of the largest players in the technology space have already dedicated large departments or divisions to Quantum Computing, as highlighted below: 

Each of these firms is making a major push in Quantum Computing, although their “valuation” is more driven by their other activities.  In any case, they are worth following and I expect their QC activities will make up an increasing portion of their values.

For the balance of this post I want to focus more on the players who are dedicated to QC or who have major operating divisions participating in the space, segregated by the categories described above:

Xanadu: Operator of a quantum photonic platform, which it will combine with advanced artificial intelligence, integrating quantum silicon photonic chips into existing hardware to create full-stack quantum computers.

IonQ: IonQ is a quantum computing hardware and software company developing a general-purpose trapped ion quantum computer and software to generate, optimize, and execute quantum circuits. It is the only company with its quantum systems available through the cloud on Amazon Braket, Microsoft Azure, and Google Cloud, as well as through direct API access, and it was the first pure-play public QC company.

Atom Computing:  Developer of quantum computers built using individually controlled atoms, creating nuclear-spin qubits made from neutral atoms that can be controlled, scaled, and stabilized optically.

PsiQuantum: PsiQuantum was founded on the premise that a useful quantum computer requires fault tolerance and error correction, and therefore ~1,000,000 physical qubits, to address commercially useful quantum computing applications.

Rigetti: Developer of quantum computing integrated circuits packaged and deployed in cryogenic environments and integrated into cloud infrastructure using pre-configured software.   The company also develops a cloud platform called Forest that enables programmers to write quantum algorithms.

EeroQ: Developer of a quantum cloud platform based on trapping and controlling individual electrons floating in a vacuum above superfluid helium. The electrons form the qubits, and the purity of the superfluid protects the intrinsic quantum properties of each electron, allowing users to get seamless delivery of computing power.

ColdQuanta: Developer of quantum sensing technologies with a focus on improving the positioning and navigation systems as well as providing cold atom experimentation, quantum simulation, quantum information processing, atomic clocks, and inertial sensing products, enabling users to explore their own quantum matter innovations for sensing and other applications.

Quantum Circuits: The company’s computers are superconducting devices that include a quantum circuit model for quantum computation with an error correction system, enabling clients to make error-free computation through solid-state quantum bits.

D-Wave: Developer of quantum computing technologies offering annealing algorithms to solve optimization problems for commercial use in logistics, bioinformatics, life, and physical sciences, quantitative finance, and electronic design automation.

Oxford Instruments: Designs and manufactures tools and systems for industry and research. Their Quantum Technologies division helps companies with cryogenics, sensing photons and fabricating novel quantum materials.

Silicon Quantum Computing: SQC is currently developing a 10-qubit quantum integrated circuit in silicon to be delivered in 2023, and has the ultimate goal of delivering useful commercial quantum computing solutions.

Oxford Ionics: Manufacturer of computational electronic systems intended to create the most powerful, accurate, and reliable quantum computers. The company’s systems combine high-quality trapped-ion qubits with noiseless electronic control technology to create high-performance quantum computers.

Teledyne e2V: The engineering groups of Teledyne draw on a portfolio of leading-edge technology, unique expertise and decades of experience in sensing, signal generation and processing for the development and commercialisation of quantum technologies.

Quantum Brilliance: Using synthetic diamonds to develop quantum computers that can operate at room temperature, without the cryogenics or complex infrastructure, enabling disruptive quantum computing applications.

Chronos: Chronos Technology specializes in time, timing, phase, and monitoring solutions and services, including highly accurate atomic clocks and clock synchronization.

BraneCell: Developer of a new quantum processing unit that can function at ambient temperatures. The company offers decentralized quantum computing hardware.

Quantum Machines: Designing quantum controllers that translate quantum algorithms into pulse sequences, enabling organizations to run complex quantum algorithms and experiments in a smooth, intuitive way.

Alpine Quantum Technologies: Developer of ion trap quantum computer technology where single, charged atoms are trapped inside vacuum chambers.  Each qubit is manipulated and measured by precisely timed laser pulses.

Bluefors: Developer of a cryogen-free dilution refrigeration system designed to deliver easy-to-operate refrigerators. The company’s system provides custom unit connection components for different specifications including dilution units, control systems and gas handling units.

kiutra: Developer of a cooling technology intended to offer cryogen-free cooling service. The company’s technology offers sub-Kelvin temperatures for basic research, material science, quantum technology, high-performance electronics, and detector applications.

Toptica: Manufacturer of and distributor of high-end laser systems designed for scientific and industrial applications including for qubit control.

M-Squared: Developer of photonics and quantum technology used specifically for quantum research, biophotonics and chemical sensing application. The company’s laser based systems offer lasers and photonic optical instruments for applications in remote sensing, frontier science, bio-photonics, defence, microscopy, spectroscopy and metrology.

Montana Instruments: Delivers best-in-class cryostats that are simple to set up, use, and grow with our partners in your journeys over time. Since 2009, Montana Instruments has worked with hundreds of category pioneers to build cryostats with purposeful modularity.

Single Quantum: Developer of single-photon detectors designed to detect particles of light. The company’s detectors are based on superconducting nanotechnology.

Sparrow Quantum: Spun out of the Niels Bohr Institute, a developer of a photonic quantum technology based on self-assembled quantum dots coupled to a slow-light photonic-crystal waveguide, enabling nanophotonics researchers to increase light-matter interaction and enhance chip out-coupling.

Quantum Motion: Developer of quantum computer architectures designed to solve the problem of fault tolerance. The company’s architectures leverage CMOS processing to achieve high-density qubits which can scale up to large numbers and tackle practical quantum computing problems, enabling users to help reduce errors and thereby improve quality.

QDevil: Developer of electronics and specialized components for quantum electronics research.  The QFilter is a cryogenic filter for reducing electron temperatures below 100 mK. The product portfolio also includes the QDAC, a 24-channel ultra-stable low noise Digital-Analogue-Converter, the QBoard, a fast-exchange chip carrier system, and the QBox, a 24-channel breakout box.

SeeQC: The company’s technologies are developed and commercialized for quantum information processing applications including scalable fault-tolerant quantum computers and simulators, quantum communications, and quantum sensors, enabling businesses to get access to a full suite of electronic circuit design tools for integrated circuit design including PSCAN2, XIC, WR Spice and InductEx.

Delft Circuits: Manufacturer of cryogenic circuit technologies intended to perform scientific instrumentation, quantum computing, and astronomy. The company’s technology offers custom-engineered superconducting circuits and cryogenic instrumentation which have ultra-low thermal conductance and scalable cryogenic cabling, enabling users to conduct their research with cryogenic circuit packaging as per their need.

Q-CTRL: Developer of quantum control infrastructure software designed to perform quantum calculations to identify the potential for errors. The company’s platform uses quantum sensors to visualize noise and decoherence and then deploy controls to defeat the errors, enabling R&D professionals and quantum computing end users to improve the efficiency and performance of standoff detection as well as precision navigation and timing for defense and aerospace.

TMD Technologies: Manufacturer of professional microwave and radio frequency products primarily focused on the defense and communications markets, also providing compact and precise atomic clocks and new gravimetric and magnetic sensors used in quantum computers.

Terra Quantum: Developer of a hybrid quantum algorithm intended to solve a linear system of equations with exponential speedup that utilizes quantum phase estimation.

QxBranch: Developer of algorithms and software intended to provide predictive analytics, forecasting and optimization for quantum and classical computers.

Zapata: Spun out from Harvard in 2017, developer of a quantum software and algorithms to compose quantum workflows and orchestrate their execution across classical and quantum technologies. The company’s platform provides artificial intelligence, machine learning and quantum autoencoder to deliver an end-to-end, workflow-based toolset for quantum computing that advances computational power.

Cambridge Quantum Computing: Quantum computing software company building tools for commercialization of quantum technologies. The company designs software combining enterprise applications in the area of quantum chemistry, quantum machine learning and augmented cybersecurity in a variety of corporate and government use cases.

RiverLane: Developer of quantum computing software using an ultra-low latency quantum operating system that accelerates quantum-classical hybrid algorithms to facilitate hardware research and development and also develops algorithms to make optimal use of the full quantum computing stack, enabling hardware partners to focus on the physics and build better full-stack solutions.

QCWare: Developer of enterprise software designed to perform quantum computing. The company’s software simplifies QC programming and provides access to QC machines while improving risk-adjusted returns and monitoring networks, enabling clients to integrate quantum computing power into any existing application and remove performance bottlenecks.

StrangeWorks: Strangeworks QC™ is used by thousands of researchers, developers, and companies around the world to learn, teach, create, and collaborate on quantum computing projects, enabling clients to overcome the risks of vendor lock-in and architectural uncertainties.

1Qbit: 1QB Information Technologies is a quantum computing software company in hardware partnerships with Microsoft, IBM, Fujitsu and D-Wave Systems. 1QBit develops general purpose algorithms focused on computational finance, materials science, quantum chemistry, and the life sciences.

Quantum Computing Inc.: Quantum Computing Inc is focused on providing software tools and applications for quantum computers. Its products include the Qatalyst, Qatalyst Core, and Quantum Application Accelerator. Qatalyst enables developers to create and execute quantum-ready applications on conventional computers while being ready to run on quantum computers where those systems achieve performance advantage.

Quintessence Labs: Developer of quantum-cybersecurity applications designed to implement robust security strategies to protect data. The company’s cybersecurity technologies are used for cryptographic purposes to centralize the management and control of data-security policy and harness quantum science properties, thereby enabling businesses to increase returns on investment from existing assets and reduce data-security complexities.

MagiQ: A research and development company offering quantum cryptography systems. The company’s offering includes optical sensing applications for RF interference cancellation, quantum cryptography, and optical surveillance for advanced energy exploration, enabling customers to better communicate, safeguard and secure their worlds.

Quantinuum: A Honeywell spin-out, the company provides an open-access, architecture-independent quantum software stack and a development platform, enabling researchers and developers to work seamlessly across multiple platforms and tackle some of the most intriguing problems in chemistry, material science, finance, and optimization.

Nu Quantum: Developer of cryptography systems designed to be more secure and time-efficient. The company’s system created a portfolio of patented ground-breaking single-photon components fundamental to the realization of commercially viable photonic technologies by combining novel materials and semiconductor technology, enabling clients to secure exchange of cryptographic keys worldwide for the ultra-sensitive detection of light.

ID Quantique: Provider of quantum-safe crypto services designed to protect data for the long-term future. The company offers quantum-safe network encryption, secure quantum key generation, and quantum key distribution, enabling financial clients, enterprises, and government organizations to solve problems by exploiting the potential of quantum physics.

Some of these companies are now publicly traded or about to go public, others are private but well-funded by preeminent venture firms or other institutions.  Many are independent and working hard to establish a strong position in the ecosystem.  Stay tuned to this blog for future reports which will showcase some of the individual players and investment opportunities.


Nature, “Building logical qubits in a superconducting quantum computing system,” Gambetta, Chow and Steffen, January 13, 2017

AI Multiple, “QC Companies of 2021: Guide based on 4 ecosystem maps,” Dilmegani, Cem, January 1, 2021

Fact Based Insight, Accessed December 2021

Qubits: A Primer

In a prior post about Superposition and Entanglement (click here to re-read), we learned that superposition allows a qubit to have a value of not just “0” or “1” but both states at the same time, enabling simultaneous computation.  Entanglement enables one qubit to share its state with other qubits enabling the information or processing capability to double with each entangled qubit.   These two features of Quantum Computing, embodied by “qubits,” enable it to perform certain types of calculations substantially faster than existing computers, and underlie the vast potential of Quantum Computing.   In this post I will describe how qubits are currently made and controlled.
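Superposition and entanglement can be made concrete with a few lines of linear algebra. The sketch below (a minimal illustration using numpy, not any particular vendor's SDK) prepares the classic Bell state: a Hadamard gate puts one qubit into an equal superposition, and a CNOT gate entangles it with a second qubit, leaving only the perfectly correlated outcomes “00” and “11” with nonzero amplitude:

```python
import numpy as np

# Basis states |0> and |1> as column vectors.
zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)

# Hadamard gate: puts a single qubit into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# CNOT gate: flips the second qubit only when the first qubit is |1>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Start in |00>, apply H to the first qubit, then entangle with CNOT.
state = np.kron(zero, zero)
state = np.kron(H, np.eye(2)) @ state
bell = CNOT @ state  # (|00> + |11>) / sqrt(2)

print(np.round(bell, 3))  # amplitudes for |00>, |01>, |10>, |11>
```

Measuring this state yields “00” or “11” each with 50% probability, and never “01” or “10”: the two qubits’ outcomes are perfectly correlated, which is exactly the entanglement described above. Note also that the state vector for n qubits has 2^n amplitudes, which is why each added qubit doubles the information capacity.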

There are competing imperatives at play, which make qubit construction and manipulation exceedingly difficult, although not impossible.  On the one hand, in order for qubits to be as stable as possible, they need to be immune to external forces such as temperature changes, electromagnetic radiation and vibrations, so that they stay in their “state” until we need to use them.  However, this same isolation makes them very difficult to manipulate.  In addition, qubits operate based on quantum mechanics, which is the physics of incredibly small objects such as individual electrons (often measured by their spin, which is either “spin up” or “spin down”) or photons (measured by their polarization, which is either horizontal or vertical).  Controlling an individual electron or photon adds another layer of difficulty to the mix due to their extremely small scale.

When bits for classical computing were first created, several different transistor designs were developed before the industry settled on the MOSFET (metal-oxide-semiconductor field-effect transistor).  Similarly, today there are many ways to create a qubit.  The following is a brief overview of some of the more common types:

Superconducting Qubits: Some leading Quantum Computing firms, including Google and IBM, are using superconducting transmons (an abbreviation derived from “transmission line shunted plasma oscillation qubit”) as qubits. The core of a transmon is a Josephson junction, which consists of a pair of superconducting metal strips separated by a tiny gap of just one nanometer (less than the width of a DNA molecule).  The superconducting state, achieved at near absolute-zero temperatures, allows a resistance-free oscillation back and forth around a circuit loop.  A microwave resonator then excites the current into a superposition state, and the quantum effects are a result of how the electrons cross this gap.  Superconducting qubits have been used for many years, so there is abundant experimental knowledge, and they appear to be quite scalable.  However, the requirement to operate near absolute zero adds a layer of complexity and makes some of the measurement instrumentation difficult to engineer due to the low temperature environment.
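To put “near absolute-zero” in perspective, here is a quick unit-conversion sketch; the ~15 millikelvin operating point used below is a typical published figure for the coldest stage of a dilution refrigerator, not a value specific to any one machine:

```python
def kelvin_to_fahrenheit(k):
    """Convert a temperature in kelvin to degrees Fahrenheit."""
    return (k - 273.15) * 9 / 5 + 32

# The coldest stage of a dilution refrigerator sits around 10-15 millikelvin,
# just a hair above absolute zero (0 K = -459.67 °F).
print(round(kelvin_to_fahrenheit(0.015), 1))  # -> -459.6
```

This is why the “~negative 460 degrees Fahrenheit” figure appears throughout discussions of superconducting hardware: the qubits operate within thousandths of a degree of absolute zero.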

Trapped Ions: Another common qubit construct utilizes the differential in charge that certain elemental ions exhibit.  Ions are normal atoms that have gained or lost electrons, thus acquiring an electrical charge.  Such charged atoms can be held in place via electric fields, and the energy states of the outer electrons can be manipulated using lasers to excite or cool the target electron. These target electrons move or “leap” (the origin of the term “quantum leap”) between outer orbits as they absorb or emit single photons.  These photons are measured using photo-multiplier tubes (PMTs) or charge-coupled device (CCD) cameras.  Trapped ions are highly accurate and stable, although they are slow to react and require the coordinated control of many lasers.

Photonic Qubits: Photons do not have mass or charge and therefore do not interact with each other, making them ideal candidates for quantum information processing.  Photons are manipulated using phase shifters and beam splitters and are sent through a maze of optical channels on a specially designed chip, where they are measured by their horizontal or vertical polarization.

Semiconductor/Silicon Dots: A quantum dot is a nanoparticle created from a semiconductor material such as cadmium sulfide or germanium, but most often from silicon (due to the large amount of knowledge derived from decades of silicon chip manufacturing in the semiconductor industry). An artificial atom is created by adding an electron to a pure silicon atom, which is held in place using electrical fields. The spin of the electron is then controlled and measured via microwaves.

Diamond Vacancies: There is a well-known defect that can be manufactured into artificial diamonds, which leaves a nitrogen-vacancy center inside the diamond that is filled by a single electron.  The spin of this electron can then be manipulated and measured with laser light.  This technology can operate at room temperature and ambient pressure, which are extremely positive attributes, although it has so far proven very difficult to scale to large numbers of qubits.

Topological Qubits: Quasiparticles can be observed in the behavior of electrons channeled through semiconductor structures.  Braided paths can encode quantum information via electron fractionalization and/or ground-state degeneracy, which can be manipulated via magnetic fields.  While this form of qubit is only theoretical at this point, it is being pursued by some large players including Microsoft.

There are a few others, including Neutral Atoms, Nuclear Magnetic Resonance (which seems more experimental and very difficult to scale) and Quantum Annealing (used by D-Wave, one of the first firms to offer commercial “Quantum” computers, although annealing is not a true gate-capable construct), and it is likely that more methodologies will be developed.  Hopefully this provides a high-level flavor for the various types of qubits.  The good news is that many entities have created, manipulated and measured qubits, and there has often been success in controlling them into superpositions and in entangling a limited but growing number of qubits at a time.

The following table summarizes some of the benefits and challenges along with selected current proponents of key qubit technologies currently in use:

The “Noisy Intermediate-Scale Quantum” (or “NISQ”) Environment

In prior posts I have covered how qubits use superposition and entanglement to empower massive processing speed for certain applications.  However, the technical and manufacturing challenges noted above for the various qubit types have so far prevented the construction of a very large, non-error-prone system.  There are many competing strategies for creating qubits, each with a different set of advantages and challenges.

In order to have a Quantum Computer that can exhibit supremacy over a classical computer, it is estimated that we need at least ~100 “logical” qubits, meaning 100 qubits that maintain their fidelity and coherence for as long as needed to perform a desired analysis.  However, as noted above, qubits are unstable, are easily affected by environmental factors, and are difficult to keep entangled.  These challenges are generally referred to as “noise”, hence the “N” in NISQ.  One way to solve for this noise is to allocate additional qubits to check on or correct the target qubit.  Currently, it is thought that as many as 1,000 “physical” qubits are required to ensure stable utilization of 1 “logical” qubit, and many firms are focusing exclusively on quantum error correction schemes to address this challenge.  Therefore, in order to create a Quantum Computer with 100 logical qubits in the NISQ phase of quantum computing, 100,000 – 1,000,000 physical qubits are being targeted.  To date, the most entangled qubits reported are still measured in the hundreds, so there is a long way to go.  That said, this is now an engineering challenge more than a theoretical one, and many of the companies noted in this blog have announced product roadmaps to reach 1,000,000 active physical qubits in the next five years or so.
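The error-correction overhead arithmetic above is simple enough to sketch directly. Note that the 1,000:1 ratio is the rough current estimate cited in this post, not a fixed property of any particular machine or code:

```python
def physical_qubits_needed(logical_qubits, overhead_ratio=1000):
    """Physical qubits required to support a given number of logical qubits,
    at an assumed error-correction overhead ratio (~1,000:1 today)."""
    return logical_qubits * overhead_ratio

# 1 logical qubit at today's estimated overhead:
print(physical_qubits_needed(1))    # -> 1000
# The ~100 logical qubits thought necessary for quantum supremacy:
print(physical_qubits_needed(100))  # -> 100000
```

As the ratio improves (or self-correcting qubits arrive), the same 100 logical qubits would require far fewer physical ones, which is why error correction is such a heavily funded sub-sector.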

An alternative or competing framework is to create error-correcting qubits.  Today’s transistors have error correction built in, so they operate at extremely high accuracy rates.  The hope is that a method of qubit construction can be created that can self-correct, obviating the need for the massive error-correction overhead noted above in the 1,000:1 ratio of physical to logical qubits.

How do we measure Qubit Performance?

Unfortunately, there is no common and agreed-upon set of metrics to allow apples-to-apples comparisons among various Quantum Computing configurations.  A few important measurement factors include the number of operations that can be performed before error, the gate fidelities, and the gate speeds.  IBM has proposed a “Quantum Volume” construct, intended to provide a single-number metric which factors in several key items in order to quantify the largest random circuit of equal width and depth that the Quantum Computer successfully implements.  While the approach of creating a single, agreed-upon metric has broad interest, not everyone agrees with the IBM methodology, so a universal standard is still not available. In the meantime, two great resources for tracking and comparing qubit/Quantum Computer performance metrics include the Quantum Computing Report and Fact Based Insight. I’ve provided hyperlinks to their qubit dashboards.
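Quantum Volume boils down to a single number: 2^n, where n is the size of the largest “square” random circuit (width equal to depth) that the machine runs successfully, per a statistical “heavy output” test. A toy sketch of that final calculation, using hypothetical benchmark results rather than real data from any machine:

```python
def quantum_volume(passed_sizes):
    """Quantum Volume = 2**n, where n is the largest square-circuit size
    (width == depth == n) that passed the heavy-output benchmark."""
    if not passed_sizes:
        return 1
    return 2 ** max(passed_sizes)

# Hypothetical results: square circuits of sizes 2 through 5 passed, 6 failed.
print(quantum_volume([2, 3, 4, 5]))  # -> 32
```

The exponential in the definition means each additional “usable” qubit-plus-depth doubles the reported volume, which is why vendor announcements of doubled Quantum Volume represent incremental, not revolutionary, progress.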

So, in order to make objective assessments of various Quantum Computer performance metrics, it is important to acknowledge the various attributes desired and take a holistic approach to such performance announcements.


Images from Science, C. Bickel, December 2016 and New Journal of Physics, Lianghui, Yong, Zhengo-Wei, Guang-Can and Xingxiang, June 2010

What Happens When ‘If’ Turns to ‘When’ in Quantum Computing?”, Bobier, Langione, Tao and Gourevitch, BCG, July 2021

Fact Based Insight, Accessed December 2021

7 Primary Qubit Technologies for Quantum Computing, Dr. Amit Ray, December 10, 2018

Inside the race to build the best quantum computer on Earth, Gideon Lichfield, MIT Technology Review, February 26, 2020

Follow the Money…the Quantum Computing Goldrush

You’re likely thinking to yourself, “OK, I see there is some potential in Quantum Computers, and some theoretically important use cases, but nobody has created a robust working Quantum Computer…existing qubits only stay coherent for milliseconds at best, so isn’t this all just hype?”

While no one can say for sure, my suggestion, paraphrasing Deep Throat’s instructions to Bob Woodward, is to “follow the money.”

The amount of funding being dedicated to Quantum Computing on a global basis is staggering.  Governments, private companies, venture firms and academic institutions are all committing huge sums of money and resources to this field.  While investment flows are no guarantee of future value, there is a broad common theme to push the development of Quantum Computers, and the equivalent of the modern “space race” is garnering growing attention in the media.  Given the awesome power, potential and disruption that Quantum Computers can deliver, these trends should not be surprising.

The industry is at an interesting crossroads, having evolved from an esoteric theoretical construct to many dozens of firms and academic institutions creating actual working (albeit still not very powerful) Quantum Computers. The challenge now is an engineering one, not a theoretical one. And with the growing pool of resources, it should be expected that engineering challenges will be overcome and developments will accelerate. When integrated circuits were still being created in the 1950s, very few people could have imagined the boom they would create. Things like personal computers, cellular phones and the Internet were not yet contemplated. Even when PCs were made available in the early ’80s, many were skeptical that there was an actual market for such an esoteric device. In fact, here is a reprint of an editorial by William F. Buckley Jr. as printed in the Lancaster New Era on July 19, 1982, where he muses that he cannot fathom any possible way a personal computer could be useful in the home:

Not surprisingly, his point-of-view was strictly in the context of the written word, since he was a writer, so his myopia makes contextual sense. Given that Quantum Computers are based on a completely different set of physics, logic gates and architecture, I am confident that the use cases will expand well beyond any currently contemplated uses and that current skeptics should try to maintain an open mind.

Government Directed Quantum Computing Investments

As can be seen in the chart below, the top ten countries focused on Quantum Computing technology have recently invested or committed over $21 billion towards this field:

The breadth and depth of these commitments are catalyzing the industry and I expect these trends to continue, so even excluding private company investment, there will be significant advancements achieved at the national level.

Major Current Players

Some of the largest players in the technology space have already dedicated large departments or divisions to Quantum Computing, and lead the push to broad adoption, as highlighted below:

Many are already offering their own quantum software platforms and providing early access to prototype machines over the web. For example, anyone can download the IBM Qiskit open-source Quantum Software Development Kit (SDK), create programs and run them on an IBM quantum emulator. Similarly, you can use Google’s Cirq, Microsoft’s Azure Quantum, Alibaba’s Aliyun platform, and others. These firms are leveraging their broad infrastructure, technological resources and established web-based platforms to advance the access to, and utilization of, evolving Quantum Computing resources. In addition, in June Honeywell agreed to invest $300 million into its Quantum Computing unit after it merged with Cambridge Quantum Computing.

Venture Investment in Quantum Computing

In addition to the large government programs and major push by leading technology firms, there is a growing and accelerating focus on Quantum Computing among venture investors. According to the Quantum Computing Report, there have been more than 450 venture investments in Quantum Computing companies made by more than 300 different venture investment firms.  Echoing the growth of Silicon Valley companies funded by legendary Sand Hill Road venture investors, current venture investors are making increasingly large and diverse bets on many parts of the Quantum Computing ecosystem.  The following chart showcases aggregate venture investments in each of the past three years (with more than a month still left in 2021):

A few venture firms have focused on Quantum Computing investments, with 17 firms making 3 or more such investments and with two (Quantonation and DCVC) making 10 or more each, as highlighted in the following table:

Not only has the playing field for Quantum Computing investments been growing, but there have been some very significant investments made. The following highlights some of the larger announced venture investments:

Sources: PitchBook, Boston Consulting Group

Of these companies, IonQ became the first-ever pure-play Quantum Computing company to go public, debuting on the NYSE on October 1, 2021 and as of Nov. 23rd had a market capitalization of $4.8 BILLION. Rigetti Computing also recently announced it would be going public in an expected $1.5 billion reverse merger with a SPAC. The latest PsiQuantum investment was announced this past summer and included a $450 million investment at a valuation exceeding $3 billion, with ambitious plans to build a commercially viable Quantum Computer by 2025.

University Focus on Quantum Computing

Quantum computing and quantum information theory have gone from being fringe subjects to a full complement of classes in well-funded programs at quantum centers and institutes at leading universities.  Some world-class universities offering dedicated Quantum Computing classes and research efforts include:

  • University of Waterloo – Institute for Quantum Computing
  • University of Oxford
  • Harvard University – Harvard Quantum Initiative
  • MIT – Center for Theoretical Physics
  • National University of Singapore and Nanyang Technological University – Centre for Quantum Technologies
  • University of California Berkeley – Berkeley Center for Quantum Information and Computation
  • University of Maryland – Joint Quantum Institute
  • University of Science and Technology of China – Division of Quantum Physics and Quantum Information
  • University of Chicago – Chicago Quantum Exchange
  • University of Sydney, Australia
  • Ludwig Maximilian University of Munich – Quantum Applications and Research Lab
  • University of Innsbruck – Quantum Information & Computation

These Colleges and Universities, as well as many others, continue to add courses and departments dedicated to Quantum Computing.

We are witnessing an unprecedented concentration of money and resources focused on Quantum Computing, including substantial government initiatives, major commitments from industrial players, accelerating venture investment and evolving university programs. While not every investment will pay off, and the landscape continues to evolve, serious, smart money is backing this trend. The clear message is that this focus of resources will lead to engineering breakthroughs and immense value creation. There are now hundreds of companies jockeying for position in this evolving field. Stay tuned to this blog as we watch for the winners and losers.


Jean-Francois Bobier, Matt Langione, Edward Tao and Antoine Gourevitch, “What Happens When ‘If’ Turns to ‘When’ in Quantum Computing”, Boston Consulting Group, July 2021.

Hajjar, Alamira Jouman, 33+ Public & Private Quantum Computing Stocks, AI Multiple, May 2, 2021

Inside Quantum Technology News, Government Investments in Quantum Computing Around the Globe, May 31, 2021.

Pitchbook Database, Retrieved November 2021

Universities With Research Groups — Quantum Computing Report, Retrieved November 2021

Venture Capital Organizations — Quantum Computing Report, Retrieved November 2021

Quantum Quantum Everywhere

Quantum Mechanics in Everyday Life, Near Term Use and Future Quantum Computing Applications

In prior posts, I conveyed some of the underlying reasons why Quantum Computers can do things that existing digital computers cannot do, or would take prohibitively long to do. In this post I will cover some of the near-term use cases for Quantum Computing. But first I want to cover how "Quantum", or, specifically, the quantum mechanics underlying the power of Quantum Computing, is already used in our daily lives; then some near-term applications where quantum effects are providing powerful new capabilities; and finally, where the power of Quantum Computing will likely have the most impact.

Quantum Mechanics in Everyday Life

Anyone trying to learn about Quantum Computing or quantum mechanics is likely baffled by how to picture it in a relatable way. Because quantum mechanics occurs on such a small scale and the physics is wholly unfamiliar, it is an intimidating field and very difficult to visualize. However, we use and benefit from quantum mechanics every day without understanding the underlying physics. Here are some examples (Choudhury, 2019):

  1. Electronic Appliances: The heating elements in your toaster glow red because electrical power is converted to heat and emitted as light, a form of thermal radiation whose explanation was one of the founding problems of quantum mechanics.
  2. Computers (transistors and microchips): The core transistors in every computer (and in the chips used in many other modern products) work via semiconductors, where the electrons behave like waves, a core principle of quantum physics.
  3. LEDs: Like transistors, LEDs are made of two semiconductor layers joined at a junction; when current is applied, electrons crossing the junction release their energy as photons, again a quantum physical action.
  4. Lasers: Lasers produce their monochromatic light via a form of optical amplification based on the stimulated emission of photons, another quantum physical process.
  5. MRIs: Magnetic Resonance Imaging works by flipping the spins in the nuclei of hydrogen atoms.
  6. GPS: The ubiquitous Global Positioning System, whose satellites use atomic clocks and principles of quantum theory and relativity to measure time and distance.
  7. Incandescent Bulbs: As with the toaster noted above, current passes through a thin filament and makes it hot, causing it to glow and create visible light, all quantum mechanical processes.
  8. Sensors: Nearly all of us have digital cameras or use the cameras in our phones.  These cameras use a lens to collect and convey photons, which the sensor, a form of semiconductor, converts to a digital image.

Hopefully, these examples give you the confidence to appreciate that quantum physics impacts your everyday life without any need to understand the underlying physics.  Let’s use that baseline to now explore applications of quantum physics in quantum sensing, quantum communications and, finally, Quantum Computing.

Quantum Sensing

Quantum sensing has a broad variety of use cases including enhanced imaging, radar, and navigation where GPS is unavailable.  None of these uses requires entanglement, so they are much nearer to actual utilization than robust Quantum Computers. 

Probes with highly precise measurements of time, acceleration, and changes in magnetic, electric or gravitational fields can provide precise tracking of movement.  In this case, if a starting point is known, subsequent positions can be computed from the measured motion, without the need for external GPS signals and without any signals for an adversary to jam or interfere with, so this is of particular interest to the military.

Another application of quantum sensing involves ghost imaging and quantum illumination.  Ghost imaging uses quantum properties to detect distant objects using very weak illumination beams that are difficult for the target to detect, and which can penetrate smoke and clouds (Shapiro, 2008).  Quantum illumination is similar and can be used in quantum radar. 

Tabletop prototypes of these quantum sensing applications have already been demonstrated and have the nearest-term commercial potential (Palmer, 2017).

Quantum Communication

The primary near-term application of quantum mechanics in communications involves quantum key distribution (QKD).  QKD is a form of encryption (more on encryption below) used between two communicating parties who encode their messages in transmitted photons.  Due to the quantum nature of photons, any eavesdropper who intercepts a message encoded with QKD will leave a telltale sign that the data stream was read, since the act of measuring a photon alters it (a fundamental principle of quantum mechanics).  For this reason, quantum-secure communication is referred to as "unhackable".  This principle has already been shown over fiber optics and across line-of-sight towers (both of which have limitations on distance) and has recently been demonstrated by China via satellite.  China launched the Mozi satellite in 2016 and beamed a completely secure QKD-encrypted message between China and Austria (Liao et al., 2018).  And this past month CAPSat, a quantum communication satellite collaboration between the University of Illinois Urbana-Champaign and the University of Waterloo, was deployed into orbit from the ISS and is designed to test unhackable quantum communications.  So long-range quantum communication is already becoming a reality (Schwink, 2021). 
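The eavesdropper-detection idea can be sketched in a toy simulation. The following is a loose, classical sketch of a BB84-style exchange (not a real protocol implementation; the function name and structure are illustrative): the sender encodes random bits in random bases, an eavesdropper who measures in the wrong basis disturbs the photon, and that disturbance shows up as errors when sender and receiver compare a sample of their sifted bits.

```python
import random

def bb84_error_rate(n_photons, eavesdrop, rng):
    """Toy sketch of a BB84-style exchange: returns the error rate in the
    sifted key (bits where sender's and receiver's bases matched)."""
    errors = kept = 0
    for _ in range(n_photons):
        bit = rng.randint(0, 1)            # sender's random bit
        basis = rng.randint(0, 1)          # 0 = rectilinear, 1 = diagonal
        photon_bit, photon_basis = bit, basis
        if eavesdrop:
            eve_basis = rng.randint(0, 1)  # Eve guesses a basis
            if eve_basis != photon_basis:
                photon_bit = rng.randint(0, 1)  # wrong basis randomizes the bit
            photon_basis = eve_basis            # photon is re-sent in Eve's basis
        recv_basis = rng.randint(0, 1)
        if recv_basis == photon_basis:
            measured = photon_bit
        else:
            measured = rng.randint(0, 1)   # wrong basis gives a random result
        if recv_basis == basis:            # bases compared publicly; bit is kept
            kept += 1
            if measured != bit:
                errors += 1
    return errors / kept

rng = random.Random(42)
print(bb84_error_rate(20000, eavesdrop=False, rng=rng))  # 0.0: no disturbance
print(bb84_error_rate(20000, eavesdrop=True, rng=rng))   # ~0.25: Eve leaves a trace
```

The point of the sketch is the asymmetry: without an eavesdropper the sifted bits agree perfectly, while any intercept-and-resend attack injects roughly 25% errors, which the communicating parties can detect.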

Quantum Computing

So far in this post I have shown you how quantum physics already impacts your everyday life as well as some new applications that are already in use or have shown success via prototypes, so will be utilized near-term.  The least commercially developed feature of quantum physics, but the most profoundly beneficial, involves the superposition and entanglement of qubits in Quantum Computing [covered in detail in the prior post].

I want to make clear that "Quantum Computers" are not all-powerful supercomputers that will replace existing binary-based computers.  An essential feature of Quantum Computing lies in the exponential increase in computing power as you increase the number of entangled qubits, which distinguishes it from digital computing for certain types of calculations or problems.  The most fundamental area where this exponential speedup is valuable is known as combinatorics.  Let me provide an example to set the stage for this discussion.

Assume you manage a networking group, and you are planning the seating chart for this month’s meeting where eight members are going to attend. You want to arrange the seating so that you help optimize the networking opportunities as well as respect seniority by having certain members sit facing the door, etc. (the reasons are not important, just assume that the seating chart has many nuances). You may think this is an easy exercise – for example, put Alice and Bob next to each other, but not next to Charlie since they already know each other.  Put Sam closest to the door, etc.  However, it turns out that there are more than 40,000 different seating arrangements with just 8 people (for those trying to decipher the math, it is 8! or 8 factorial, meaning place any of the 8 attendees in the first seat, then any of the 7 remaining attendees in the next seat, etc., or 8 x 7 x 6 x 5 x 4 x 3 x 2 x 1 = 40,320 different seating combinations).  This may seem more complicated than you expected, but intuitively you may feel that you could work it out if you had to.

However, imagine that at the next month's meeting 16 members attend and you want to be equally diligent in the seating arrangement.  For this meeting there are now 20,922,789,888,000 different seating arrangements possible, or more than 20 trillion, with just 16 people (16 x 15 x 14 x …).  This defies intuition but is simple factorial math.  Now, I am not suggesting we need Quantum Computers to help with seating charts, but a seating chart represents a typical "optimization" challenge. For certain problems, as you increase the number of inputs, the potential combinations become unmanageable very quickly, hence the reference to combinatorics.
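The factorial growth described above is easy to verify with a couple of lines of Python:

```python
from math import factorial

# n attendees can be seated in n! distinct orders
for n in (8, 16):
    print(f"{n} attendees: {factorial(n):,} arrangements")
# 8 attendees: 40,320 arrangements
# 16 attendees: 20,922,789,888,000 arrangements (over 20 trillion)
```

Doubling the guest list did not double the search space; it multiplied it by more than 500 million, which is exactly the kind of explosion combinatorics describes.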

Where will Quantum Computers Provide Near-Term Results?

The superposition and entanglement of qubits enables Quantum Computers to consider many combinations simultaneously instead of linearly, hence the tremendous speed-up in processing.   Let’s now dig into two areas where Quantum Computers can use these speedup features to provide a “quantum advantage” in the ability to process currently unmanageable combinatorial problems, namely simulation/optimization and cryptography. 

Simulation and Optimization

For optimization, you can imagine our networking seating problem as analogous to molecular modeling for things such as drug development or materials science.


In these cases, as you tweak the atoms or molecules or proteins you are studying, the number of different alignments or configurations increases quickly, as shown with the seating chart example.  A powerful Quantum Computer could simulate and evaluate many potential configurations simultaneously and could dramatically accelerate advances in these fields.  Here are some examples where Quantum Computers can accelerate computational problems:

  • Simulation: Simulating processes that occur in nature and are difficult or impossible to characterize and understand with classical computers, which has the potential to accelerate advances in drug discovery, battery design, fertilizer design, fluid dynamics, weather forecasting and derivatives pricing, among others.
  • Optimization: Using quantum algorithms to identify the best solution among a set of feasible options, such as in supply chain logistics, portfolio optimization, energy grid management or traffic control.

The table below highlights additional examples of fields where Quantum Computing speedup will manifest:

Here are examples regarding a few of these applications along with some of the companies already deploying early quantum computing programs:

  • Today, most new drugs are formulated by trial and error, and the time between finding a new drug molecule and getting it into the clinic averages 13 years and costs up to $2 billion. If we can use Quantum Computers to model various drugs in silico, instead of through the trial and error of lab experiments, we could shorten this timeline and decrease the overall costs. Recently, healthcare giant Roche announced a partnership with Cambridge Quantum Computing to support research efforts tackling Alzheimer's disease. And synthetic biology company Menten AI has partnered with quantum annealing company D-Wave to explore how quantum algorithms could help design new proteins with therapeutic applications.
  • Fertilizers are crucial to feeding the world's growing population because they allow food crops to grow stronger, bigger and faster. More than half of the world's food production relies on synthetic ammonia fertilizer, which is created by the Haber-Bosch process converting hydrogen and nitrogen to ammonia. However, this process has an enormous carbon footprint, including the energy needed to perform the conversion (some estimate this to be 2%-5% of ALL global energy production) as well as the huge amount of carbon-dioxide by-product it emits. Scientists believe that, using a Quantum Computer, they could map the chemistry used by certain bacteria that naturally create fertilizers and uncover an alternative to the current synthetic fertilizers created by the Haber-Bosch process. In fact, Microsoft has already demonstrated how Quantum Computers could be used to study the chemistry behind fertilizer production and has created a Quantum Chemistry Library to facilitate such research.
  • There is a global push to expand battery powered automobiles in a transition to a greener economy, but existing car batteries have limited capacity/range and long charge times. Searching for materials with better properties is another molecular simulation problem that can be better handled by Quantum Computers. That is why German car maker Daimler has partnered with IBM to assess how Quantum Computers could help simulate the behavior of sulphur molecules in different environments, with the end-goal of building lithium-sulphur batteries that are longer-lasting, better performing and less expensive than existing lithium-ion batteries.
  • The "traveling salesman problem" generally describes the challenge of optimizing routing for businesses, another area where combinatorics makes the problems exponentially difficult to resolve as inputs are added. For example, a fleet of more than 50,000 merchant ships carrying 200,000 containers, with a total value of $14 trillion, is actively in motion each day. Energy giant ExxonMobil has teamed up with IBM to find out if Quantum Computers could do a better job optimizing these routes and related logistics.
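To make the combinatorial explosion in routing concrete, here is a brute-force sketch of a tiny traveling-salesman instance (the distance numbers are invented purely for illustration). Even with the starting port fixed, it must examine every one of the (n-1)! orderings of the remaining stops:

```python
from itertools import permutations

# Hypothetical symmetric distances between 5 ports (illustrative numbers)
dist = [
    [0, 2, 9, 10, 7],
    [2, 0, 6, 4, 3],
    [9, 6, 0, 8, 5],
    [10, 4, 8, 0, 6],
    [7, 3, 5, 6, 0],
]

def shortest_round_trip(dist):
    """Brute-force traveling salesman: try every ordering of the other stops."""
    n = len(dist)
    best_route, best_len = None, float("inf")
    for perm in permutations(range(1, n)):       # start is fixed at port 0
        route = (0,) + perm + (0,)
        length = sum(dist[a][b] for a, b in zip(route, route[1:]))
        if length < best_len:
            best_route, best_len = route, length
    return best_route, best_len

route, length = shortest_round_trip(dist)
print(route, length)   # examines (5-1)! = 24 routes
```

With 5 ports there are only 24 routes to check; with 16 there are over a trillion, and with 50,000 the count is astronomically beyond brute force, which is why better optimization methods are so valuable.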

In the next blog I will cover additional details on the players currently working with Quantum Computers for these and similar applications.


Cryptography

Another field where Quantum Computers will have a profound impact is encryption.  Nearly every time you log into a site on your computer, perform on-line banking transactions, or when governments send confidential communications between entities, such activity is "on the web", meaning accessible to others.  It is protected by an encryption protocol developed by Ron Rivest, Adi Shamir and Leonard Adleman in 1977 and known as RSA public-key encryption.   

In a very truncated description, the foundation of the RSA encryption lies in the fact that it uses two very large prime numbers to create a “factoring problem”.    Here is an over-simplified explanation:

  1. The Receiver generates two very large prime numbers and multiplies them together.  This product (together with a chosen exponent) becomes the Public Key, which is published for anyone to see.
  2. A sender uses this Public Key to encrypt, or encipher, a message.
  3. The encoded message is sent over the Internet (in theory, anyone can see/read it, along with the Public Key).
  4. The Receiver uses its Private Key, derived from the two prime factors, to decrypt, or decipher, the message.  The Private Key never needs to be transmitted.

The encoded message cannot be decoded without knowing this Private Key. Said another way, finding the two prime factors of a very large number is exceedingly difficult, so if the RSA key is based on a sufficiently large number (i.e., 2048 bits, which is over 600 digits long), it is practically impossible with current computers to find the two prime factors. However, in 1994, mathematician Peter Shor proposed an algorithm that could factor large numbers into their primes in polynomial time. In fact, implementations of his algorithm are open source and available on the Internet for anyone to download (for example, a GitHub implementation of Shor's algorithm written in Python calling Q# for the quantum part). Existing Quantum Computers only have the power to factor fairly small numbers, but the code is readily available for whoever creates a powerful enough Quantum Computer to use it to break existing RSA encryption.
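To make the factoring foundation concrete, here is a toy sketch of RSA with tiny primes (real keys use primes hundreds of digits long, and production RSA adds padding and other safeguards omitted here):

```python
# Toy RSA with tiny primes, for illustration only (trivially breakable)
p, q = 61, 53
n = p * q                      # 3233: the public modulus
phi = (p - 1) * (q - 1)        # 3120: only computable if you know p and q
e = 17                         # public exponent (part of the Public Key)
d = pow(e, -1, phi)            # 2753: private exponent (the Private Key)

message = 65
ciphertext = pow(message, e, n)    # anyone can encrypt with (n, e)
decrypted = pow(ciphertext, d, n)  # only the Private Key holder can decrypt
print(ciphertext, decrypted)       # 2790 65

# An attacker who can factor n = 3233 back into 61 x 53 can recompute d.
# That is easy here, but practically impossible for a 2048-bit n on
# classical hardware, and exactly what Shor's algorithm threatens.
```

The security rests entirely on the gap between multiplying two primes (easy) and recovering them from their product (hard, classically).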

Cryptocurrency mining and wallets are also areas which could be vulnerable to Quantum Computers. Bitcoin and other cryptocurrencies are "mined" by computers that race to solve increasingly difficult cryptographic puzzles, the solutions of which result in the creation of new bitcoins (and which is why mining consumes increasing amounts of power). As more coins are mined, the difficulty of uncovering the next round of coins increases. By some estimates, under the current bitcoin protocols it will take another 120 years to mine the remaining coins, so once Quantum Computers are powerful enough they could potentially mine the remaining coins much faster. In addition, the wallets that most people use to hold their cryptocurrency rely on public-key cryptography with vulnerabilities similar to those described above regarding encryption.

I hope this post helps you appreciate how quantum mechanics already affects your everyday life and to begin to appreciate areas where Quantum Computers will have a profound impact.   Stay tuned for a deeper dive into this subject.


Jean-Francois Bobier, Matt Langione, Edward Tao and Antoine Gourevitch, “What Happens When ‘If’ Turns to ‘When’ in Quantum Computing”, Boston Consulting Group, July 2021.

Bodur, Hüseyin and Kara, Resul, "Secure SMS Encryption Using RSA Encryption Algorithm on Android Message Application," 2015.

Bova, Francesco, Goldfarb, Avi & Melko, Roger G., “Commercial applications of quantum computing,” EPJ Quantum Technology, 2021.

Cavicchioli, Marco, “How fast can quantum computers mine bitcoin?” The Cryptonomist, May 12, 2020.

Choudhury, Ambika, 8 Ways You Didn’t Know Quantum Technology Is Used In Everyday Lives (analyticsindiamag.com), October 7, 2019.

Leprince-Ringuet, Daphne, “Quantum computers: Eight ways quantum computing is going to change the world,” ZDNet, November 1, 2021.

Liao, Sheng-Kai, Cai, Wen-Qi, Pan, Jian-Wei, “Satellite-to-ground quantum key distribution,” Nature, August 9, 2017.

Palmer, Jason, “Here, There and Everywhere: Quantum Technology Is Beginning to Come into Its Own,” The Economist, 2017.

Parker, Edward, “Commercial and Military Applications and Timelines for Quantum Technology” Rand Corporation, July, 2020.

Schwink, Siv, “Self-annealing photon detector brings global quantum internet one step closer to feasibility,” University of Illinois Urbana-Champaign Grainger College of Engineering, October 13, 2021.

Quantum Superposition and Entanglement

In prior posts I emphasized the excitement and potential of Quantum Computing without any reference to the underlying quantum mechanics, but I would like to introduce you to some unique quantum properties in this post. While understanding the nature of Quantum Computing is complex and involves many new concepts, a basic understanding of "Superposition" and "Entanglement" is fundamental to grasping why this new computing methodology is so novel, powerful and exciting.  I am going to try to describe these concepts in a way that does not require any math, although I will use some math references to highlight how these concepts manifest.


As noted in the prior post, one of the fundamental differences between Quantum Computers and classical computers lies in the core components used to process information.  We know that classical computers use a binary system, meaning each processing unit, or bit, is either a "1" or a "0" ("on" or "off"), whereas Quantum Computers use qubits, which can be "1" or "0" (typically "spin up" or "spin down") or both at the same time, a state referred to as being in a superposition.  This is a bit more subtle than it sounds because, in order to use qubits for computational purposes, they need to be measured, and whenever you measure a qubit you will find it collapsing into either the 1 or 0 state.  But between measurements, a quantum system can be in a superposition of both at the same time.  While this seems counter-intuitive, and somewhat supernatural, it is well proven, so please try to accept it at face value in order to get the gist of the other concepts covered in this post.  For a deeper dive into superposition from a particle physics perspective (light is both a particle and a wave), you can investigate wave–particle duality. [Fun fact: Einstein did not receive the Nobel prize for his famous E = mc² relativity equation but rather for his work on the photoelectric effect, which is fundamental to quantum mechanics. There he postulated the existence of photons, or "quanta" of light energy, a concept that underpins much of the power behind Quantum Computing.]

Without getting into the physics or explaining complex numbers, Superposition can be mathematically depicted as:

                |ψ⟩ = α|0⟩ + β|1⟩                    |α|² + |β|² = 1

Please bear with me here; I have promised not to overwhelm you with complex math. I only want to highlight how to think about Superposition in a way that will help you appreciate its power for computing, and to share the nomenclature generally used in the field.  Don't focus on the Greek characters (psi, alpha and beta) or the linear algebra notation (the |ket⟩ portions).  Simply note that the equation above on the left is the mathematical representation of a qubit (the psi symbol), stating that the qubit is a blend of "0" and "1", where alpha and beta are "amplitudes" whose squares give the probability of measuring a "0" or a "1" respectively.  The equation on the right, known as the Born rule, simply states that these two probabilities add to 100%.  Let me reframe that in a simpler manner. Before a qubit is actually measured, it is both a "1" and a "0" at the same time, and the relative odds of it being one or the other are encoded in the qubit equation. In practice this means that using Quantum Computers to solve problems becomes a probabilistic analysis, and if a computation is run enough times, the measured results converge on the answer.

You may recall in a prior post that qubits are described as 3-dimensional, as shown below in blue and red. The line drawing version with the funny symbols, shows how this is used for calculations when put into a superposition (the math and symbols are helpful for those comfortable with them, but not essential for a general understanding):

In this depiction, if the North pole is "0" and the South pole is "1", and the qubit is tilted to the side, the degree of its tilt translates generally into these probabilities.   In the image with the blue arrow pointing to psi, you will notice that the psi symbol looks like it is leaning between north and south, in this case closer to north.  For example, this might be represented as 0.8|0⟩ + 0.6|1⟩, meaning it is leaning closer to "0" and would therefore have a higher probability of being 0 when measured. [You will also note that if you square each term, you get 0.64 + 0.36, which equals 1 and therefore follows the Born rule, meaning the odds of this qubit being measured as a 0 are 64% and the odds of it being a 1 are 36%.]
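The arithmetic above can be checked numerically. This sketch uses the same illustrative amplitudes, 0.8 and 0.6, and simulates repeated measurements to show the Born rule probabilities emerging:

```python
import random

alpha, beta = 0.8, 0.6               # example amplitudes for |0> and |1>
p0, p1 = alpha ** 2, beta ** 2       # Born rule: probability = amplitude squared
assert abs((p0 + p1) - 1.0) < 1e-12  # a valid qubit state is normalized

# Simulate many measurements: each collapses to 0 with probability p0
rng = random.Random(0)
shots = 100_000
zeros = sum(rng.random() < p0 for _ in range(shots))
print(round(p0, 2), round(p1, 2))    # 0.64 0.36
print(zeros / shots)                 # close to 0.64 over many runs
```

A single measurement tells you almost nothing; only the statistics over many runs reveal the 64/36 weighting, which is exactly the probabilistic character described above.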

The important part to take away is that since a qubit can represent an input with various weightings of 1 and 0, it contains much more information than a simple binary bit.


If the above explanation of superposition seems a bit unintuitive, I’m afraid that entanglement might seem even more bizarre [don’t worry, you’re not alone, even Einstein struggled with it], but I will do my best to explain it.  I will ask you to accept that, like Superposition, this is an actual phenomenon which is well proven, even if you likely won’t be able to picture it in your mind in a way that will be satisfying.  And it is this feature of quantum mechanics that largely underpins the awesome power of Quantum Computers because it enables the information processing to scale by an exponential factor (more on that below).

Quantum entanglement is a physical phenomenon that occurs when a group of particles are created, interact, or share proximity in such a way that the particles are "connected," even if they are subsequently separated by large distances.  Qubits are made of things such as electrons (which spin in one of two directions) or photons (which are polarized in one of two directions), and when they become "entangled", their spin or polarization becomes perfectly correlated.  In fact, this concept is much older than Quantum Computers and was first described in 1935 by Einstein, along with Boris Podolsky and Nathan Rosen, and became known as the EPR paradox (after their initials), which Einstein famously referred to as "spooky action at a distance".  What it means, simply, is that qubits can be made to entangle, which then enables them to correlate with each other. Quantum Computers use microwaves or lasers to nudge the qubits into a state/alignment where they become entangled.

Now let's walk through how this entanglement can manifest in an exponential increase in computing power.  If we consider two classical bits, we know that each can be either 1 or 0, so together they can take on the following values:

                0, 0

                1, 0

                0, 1

                1, 1

However, two entangled qubits can take on all four of those values at once, so in this case 2 qubits can represent the values of 4 bit-patterns.  If we consider three classical bits, they can be any of the following combined entries:

                0, 0, 0                    1, 0, 0

                0, 1, 0                    1, 1, 0

                0, 0, 1                    1, 0, 1

                0, 1, 1                    1, 1, 1

So, in this case there are 8 combinations, but this can be fully described using 3 qubits (again because they are entangled).  Mathematically, the number of classical bits required to match the computing power of n qubits is 2^n, an exponential relationship.  For now, don't try to picture how entangled qubits do this, just know that they do.  The purpose of this line of analysis is to give some numerical context as to why this entanglement makes Quantum Computers (phenomenally) more powerful than classical computers.
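That 2^n relationship can be tabulated with a couple of lines of Python:

```python
# n entangled qubits span 2**n classical bit-patterns
for n in (1, 2, 3, 10, 13):
    print(f"{n:>2} qubits ~ {2 ** n:>6,} classical bits")

# 13 qubits already match a kilobyte: 2**13 = 8,192 bits = 1,024 bytes
assert 2 ** 13 == 8 * 1024
```

Each added qubit doubles the count, which is why the growth outruns any linear improvement in classical hardware.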

If we continue the logic above for increasing numbers of bits/qubits we get the following:

So it only takes 13 qubits to have the equivalent classical computing power of a kilobyte (KB), since 2^13 = 8,192 bits = 1,024 bytes. Now let's see how that manifests in computing power/speed.

Let's assume we have a pretty powerful current classical computer processor, which might have a clock speed of 4 GHz, meaning it can execute 4 billion cycles per second, which sounds (and is) phenomenally fast.  High-end current gaming PCs generally operate at this speed and provide an excellent performance experience.  Let's now use this baseline processing speed, and scale up the prior table, to see the profound impact of exponential computing power on processing time:
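To get a feel for these timescales, here is a rough sketch that assumes, purely for illustration, one clock cycle per classical bit-pattern on that 4 GHz processor:

```python
SECONDS_PER_YEAR = 3.156e7   # about 365.25 days
CLOCK_HZ = 4e9               # 4 GHz: 4 billion cycles per second

# Assume (for illustration) one cycle per classical bit-pattern
for n in (30, 60, 100):
    seconds = 2 ** n / CLOCK_HZ
    print(f"{n:>3} qubits -> {seconds:.3g} s ({seconds / SECONDS_PER_YEAR:.3g} years)")
# 30 qubits: a fraction of a second; 60 qubits: years;
# 100 qubits: on the order of 10**13 years, far beyond the
# roughly 1.4 * 10**10-year age of the universe.
```

The exact per-cycle assumption is crude, but no reasonable refinement changes the conclusion: exhaustively stepping through 2^100 states classically is hopeless.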

To give this some context, 100 qubits correspond to 2^100, or roughly 10^30, classical bits, vastly more than even an exabyte (a million trillion bytes, 18 zeros). It would take a powerful classical computer far longer than the lifetime of the universe to step through that many values! The corollary is that quantum computers can perform certain complex calculations phenomenally faster than classical computers, and this concept of entanglement is a key to the performance superiority.

While this analysis assumes certain types of computations, the point is to show how Quantum Computers can process information with an unprecedented speedup.

The key takeaway from this post is that Quantum Computers, using qubits that can be both in superposition and entangled, allow these machines to process inputs much faster than is possible with classical computing architecture.  Now all we need is a reliable working Quantum Computer with just 100 qubits, which would not only enable massive speedup for certain problems but would open the door to all sorts of new questions and analyses. Many experts predict this will be achieved within 5 years (IonQ has recently showcased a Quantum Computer with 32 entangled qubits, so real progress is being made).  Some general categories of problems where this phenomenal speedup will have a profound impact include simulation, optimization and encryption. In the next post I will provide more insights into what types of problems can be solved with the exponential speedup that can be provided by Quantum Computers.




What is a Computer? – Analog vs Digital vs Quantum

Before we can get into the inner workings of a Quantum Computer, we should make sure we are in alignment on what a computer actually is.  At its core, a computer is a machine designed to perform prescribed mathematical and logical operations at high speed and display the results of these operations.  Generally speaking, mankind has been using "computers", in the form of the abacus, since circa 2700 BC.  Fast forward a few millennia, and we see the first "programmable computer" designed (though never completed) by Charles Babbage in 1833.  It then took another 100+ years for the first working "electromechanical programmable digital computer", the Z3, to be invented by Konrad Zuse in 1941.

During World War II, a flurry of advances occurred, including the usage of vacuum tubes and digital electronic circuits, and the development of the famously depicted Bombe, which was used to break the encryption of German military communications produced by the Enigma machine. This was soon followed in 1944 by Colossus, the first "electronic digital programmable computer", which was also used for military advantage. The Bombe and Colossus were built at Bletchley Park in the UK, while ENIAC, built between 1943 and 1945, was the first such device constructed in the US. It weighed 30 tons and contained over 18,000 vacuum tubes, 1,500 relays, and hundreds of thousands of resistors, capacitors and inductors, but could add or subtract 5,000 times per second, a thousand times faster than any prior device, and could also handle multiplication, division and square roots. In many ways, the tangle of cables and electronics in photos of ENIAC (below left) seems eerily similar to photos of today's Quantum Computers (below right):

The next big advance in computers came with the invention of the integrated circuit in the 1950's.  By 1968, the first silicon-gate integrated circuits had been developed at Fairchild Semiconductor, built on the MOSFET (short for metal-oxide-semiconductor field-effect transistor), which is the core technology underpinning most current "digital computers". 

Moore’s Law

The first commercial MOSFET-based processors, built in 1971, had process nodes that were 10 microns in size, a fraction of the width of a human hair (which is about 50-70 microns).  Gordon Moore was a co-founder of Fairchild Semiconductor, and in 1975 he postulated that the number of transistors that could fit on an integrated circuit would double every two years, implicitly suggesting that cost per transistor would thereby decrease by a factor of two.  This log-linear relationship was estimated, at the time, to continue for ten years but has amazingly held fairly consistently through today, meaning for nearly 50 years.  However, for this rule/law to remain in effect, the size of the process nodes needed to continue to shrink.  In fact, today's generation of MOSFETs includes 5 nm nodes ("nm", or nanometer, is one-billionth of a meter), which is 1/2,000th the size of the first MOSFET nodes.  Ironically, as these scales continue to shrink, they approach the "quantum scale", where the electrons used in the processors begin to exhibit quantum behaviors, such as quantum tunneling, that reduce their effectiveness at processing in traditional digital devices.

While Moore’s Law has been amazingly prescient and consistent for these many decades, there is a theoretical minimum size below which transistors cannot be efficiently utilized, largely because of these scale/quantum limitations.  While the 5 nm process is the current working minimum for semiconductors, and 3 nm and even 2 nm transistor scales are in development, it appears that an end is in sight, likely due to this quantum tunneling challenge at such scales.  The graphic below[1] shows the uncanny straight line (dark blue) of transistor scale.  However, the light blue and brown lines show some recent plateauing of maximum clock speed and thermal power utilization, indicating declining efficiency as scale shrinks.

Analogue vs. Digital vs. Quantum

Readers who lived through the 90s are likely familiar with the transition from “analogue” to “digital”.  This manifested most notably in the music industry, with the replacement of analogue phonograph records by digital compact discs and streamed digitized music. I won’t get into the audiophile arguments about which sound was purer, but I highlight this to emphasize the “digitization” of things during our lifetime.

In the prior blog post I noted that computers use digital gates to process logic (i.e., AND, NOT and OR gates).   However, each of these gates can also be performed by analogue means, and can even be simulated using billiard balls, an idea proposed in 1982 by Edward Fredkin and Tommaso Toffoli.  While this is a highly theoretical construct that assumes no friction and perfect elasticity between balls, I point it out because it shows that although current digital computation is amazing, efficient and powerful, it is just a sophisticated extension of basic analogue (i.e., particle) movements.  Let me briefly walk you through one example to emphasize this point.

Picture two billiard balls entering a specially constructed wooden box. When a single billiard ball arrives at the gate through an input (0-in or 1-in), it passes through the device unobstructed and exits via 0-out or 1-out. However, if a 0-in billiard ball arrives simultaneously with a 1-in billiard ball, they collide with each other in the upper-left-hand corner of the device and redirect each other to collide again in the lower-right-hand corner of the device, forcing one ball to exit via 1-out and the other ball to exit via the lower AND-output. Thus, the presence of a ball being emitted from the AND-output is logically consistent with the output of an AND gate that takes the presence of a ball at 0-in and 1-in as inputs.[2]
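The gate’s behavior can be captured in a few lines of code. This is my own toy model of the device described above (the function and output names are illustrative, not from Fredkin and Toffoli’s paper): each input is True when a ball is present, and the function reports which exits a ball leaves through.

```python
def billiard_and_gate(ball_0_in, ball_1_in):
    """Toy model of the billiard-ball AND gate: True means a ball is present."""
    if ball_0_in and ball_1_in:
        # The two collisions redirect one ball to 1-out
        # and the other to the AND-output.
        return {"0_out": False, "1_out": True, "and_out": True}
    # A lone ball (or no ball) passes through unobstructed.
    return {"0_out": ball_0_in, "1_out": ball_1_in, "and_out": False}

# A ball leaves the AND-output only when both inputs held a ball,
# matching the truth table of a logical AND.
for a in (False, True):
    for b in (False, True):
        assert billiard_and_gate(a, b)["and_out"] == (a and b)
```

Note that no ball is ever created or destroyed: the gate merely reroutes them, which is what makes the analogue construction possible.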

Similar physical gates and billiard balls could be constructed to replicate the OR and NOT gates.  As you may recall from the prior blog, all Boolean logic operators can be created using combinations of these three gates, so a theoretical computer constructed entirely of wood and billiard balls could replicate the results of any existing computer. 
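To illustrate that completeness claim, here is a minimal sketch showing how two further operators, NAND and XOR, can be composed purely from AND, OR and NOT. The decompositions are standard Boolean identities; the function names are just illustrative.

```python
# The three primitive gates from the prior blog post.
def AND(a, b): return a and b
def OR(a, b):  return a or b
def NOT(a):    return not a

# NAND is simply NOT applied to AND.
def NAND(a, b):
    return NOT(AND(a, b))

# XOR: true when exactly one input is true,
# i.e. (a OR b) AND NOT (a AND b).
def XOR(a, b):
    return AND(OR(a, b), NOT(AND(a, b)))

# The composed gates reproduce the expected truth tables.
for a in (False, True):
    for b in (False, True):
        assert XOR(a, b) == (a != b)
        assert NAND(a, b) == (not (a and b))
```

In principle, each composed gate corresponds to wiring several billiard-ball boxes together, which is why the wooden computer could, in theory, match any digital one.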

Admittedly, this is a theoretical construct, but I cite it to point out that while our current digital computers are amazingly powerful and fast, and have led to countless advances and improvements in our daily lives, they are, at their essence, somewhat simplistic.  The “digitization” vastly improves speed and the ability to stack gates for interoperability, thereby tackling increasingly complex processes, but there are certain limits to their capabilities (I will cover some specifics on speed-up and complexity in subsequent posts).

Quantum Computers, and the gates possible using qubits, are a very different animal. The underlying mechanics and processes cannot be replicated using standard analogue materials because they operate under different laws of physics. Therefore, it is not really appropriate to compare the performance of a Quantum Computer with that of a digital computer and suggest the quantum version is simply more powerful or faster; it is an “apples to oranges” comparison. Stated another way, it would be like calling a light-emitting diode (LED) a more powerful candle. The LED is, in fact, an entirely different way of creating light, and comparisons between the two are therefore not useful.

In summary, mankind has been using different forms of “computing devices” for thousands of years, and Quantum Computers are in some ways a natural extension of computing progress.  However, different laws of physics are involved, and Quantum Computers are therefore a new category of computing device, with the potential to create new approaches to problems and novel solutions.

In the next few posts I will dig into where this new computing approach will provide the most benefit, and how “Superposition” and “Entanglement” are used to massively increase the computing power of Quantum Computers.

[1] The Economist Technology Quarterly, published March 12, 2016

[2] Wikipedia contributors. (2021, May 4). Billiard-ball computer. In Wikipedia, The Free Encyclopedia. Retrieved 15:51, October 25, 2021, from https://en.wikipedia.org/w/index.php?title=Billiard-ball_computer&oldid=1021387675