Quantum Supremacy vs. Quantum Advantage – and how do we measure these things? 

Quantum Supremacy vs Quantum Advantage 

In October 2019, Google announced that it had computed in seconds what would take the largest and most advanced supercomputers thousands of years, thereby achieving a milestone referred to as “quantum supremacy” for the first time. Google used a processor named “Sycamore” with 54 programmable superconducting qubits to create quantum states on 53 qubits (one did not operate), corresponding to a computational state-space of 2^53 (about 10^16, or over ten million-billion, states).  The device is a two-dimensional array of transmon qubits, where each qubit is tunably coupled to four nearest neighbors. Each transmon has two controls: a microwave drive to excite the qubit, and a magnetic flux control to tune its frequency.  The claim was widely considered a “Wright Brothers at Kitty Hawk” type of achievement.

Then, in December 2020, researchers at the University of Science and Technology of China (“USTC”) announced that they too had achieved quantum supremacy, utilizing a Quantum Computer named “Jiuzhang,” which manipulates photons via a complex array of optical devices including light sources, hundreds of beam splitters, dozens of mirrors and 100 photon detectors.  They claimed that their device performed calculations in 20 seconds that would take a supercomputer 600 million years. Both Google and USTC have increased their qubit utilization since these breakthroughs, and now several other companies have successfully operated Quantum Computers with dozens of qubits, and a couple with 100 or more. 

Let’s review some semantics regarding the measurement of Quantum Computing performance.  In 2012, John Preskill, a professor of theoretical physics at Caltech and a leading quantum mechanics researcher, coined the term “quantum supremacy” to “describe the point where quantum computers can do things that classical computers can’t, regardless of whether those tasks are useful.”   He coined the term before any actual Quantum Computers had been built.  At the time, Preskill was wondering, in his words, “whether controlling large-scale quantum systems was merely really, really hard or whether it was ridiculously hard.  In the former case we might succeed in building large-scale quantum computers after a few decades.  In the latter case we might not succeed for centuries.”  In this sense, and based on Preskill’s original intent, the announcement by Google is a bona fide example of Quantum Supremacy and indicates that “a plethora of quantum technologies are likely in the next decade or so” [Preskill, 2019]. 

So, although the Google Sycamore quantum supremacy claim was discounted by some (most notably IBM and researchers in China), and despite it being an admittedly highly contrived and not very useful calculation, it was a ground-breaking achievement.     

Before I get into the semantics of how we measure Quantum Computing power, here is what the quantum community generally means regarding quantum progress: 

Quantum Supremacy: This term still retains Preskill’s original context and is considered the first major step in proving that quantum computing is feasible.  Specifically, it means “demonstrating that a programmable quantum device can solve a problem that no classical computing device can solve in a feasible amount of time, irrespective of the usefulness of the problem.”  By this definition, the threshold has been passed since October 2019; in fact, it has now been demonstrated by several companies beyond Google, which is why I refer to the current hurdles as engineering challenges rather than theoretical ones.

Quantum Advantage: Refers to the demonstrated and measured success in processing a real-world problem faster on a Quantum Computer than on a classical computer.  While it is generally accepted that we have achieved quantum supremacy, it is anticipated that quantum advantage is still some years away.  

How do we Measure Quantum Computing Performance? 

At the end of a prior post regarding Qubits, I alluded to the challenge of measurement metrics for Quantum Computing, highlighting that the count of operating qubits is not an appropriate yardstick.  Imagine you were shopping for a new car.  If the only metric available were “horsepower,” it would be very difficult to decide which car to buy.  By itself, horsepower is only one measure of car performance.  It does not factor in acceleration, fuel efficiency, ride comfort, handling, noise levels, legroom, sleekness, color/trim/style, etc.   Even with computers, focusing on clock speed alone, for example, would not provide enough breadth of information to make an informed purchase decision.  While Quantum Computers are in their very early stages, simply measuring a particular calculation speed or the number of qubits used is not enough to describe the actual performance capabilities accurately.   Researchers at IBM have proposed the term “Quantum Volume” to enable the systematic measurement of Quantum Computing performance.  It is a metric that captures both the capabilities and the error rates of a Quantum Computer by calculating the maximum size of square quantum circuits (equal width and depth) that can be implemented successfully.  While the details are a bit esoteric, it is intended to provide one number, or score, that can be used to compare incremental technology, configuration and design changes, and to compare the relative power of one Quantum Computer to another.   
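To give a flavor of how the score works, here is a minimal sketch. The real IBM benchmark runs many random n-qubit, n-layer (“square”) circuits and requires the “heavy” outputs to appear more than two-thirds of the time; the function name below is my own, and only the final scoring step is modeled:

```python
# Illustrative sketch of how a Quantum Volume score is derived. The actual
# protocol validates each square circuit size via heavy-output sampling;
# here we only show how the score follows from the largest passing size.

def quantum_volume(max_square_size: int) -> int:
    """Quantum Volume is 2 raised to the size of the largest square circuit
    (width n = depth n) the machine executes successfully."""
    return 2 ** max_square_size

# A machine that reliably runs 6-qubit, 6-layer random circuits scores:
print(quantum_volume(6))  # -> 64
```

Note that the score grows exponentially: passing just one more qubit/layer of square circuit doubles the Quantum Volume.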

In fact, the performance of a quantum computer involves many factors as shown below: 

Source: IBM and Forbes as adapted by Riccardo Silvestri 

Since quantum volume is not quite an industry term-of-art at this point, I won’t use it as the definitive measurement tool.  However, the concept of focusing on characteristics beyond just the “number of qubits” is crucial, and I will discuss the relative performance characteristics of competing Quantum Computers beyond just a mention of the number of qubits. 

While many of the balloons in the above graphic may be unfamiliar, there are three key metrics for measuring quantum computing performance: 

  1. Scale: The number of qubits which the computer can simultaneously process.  It is important to distinguish between physical and logical qubits, with logical qubits being the key element (as I’ll show below, many constructs add physical qubits as error-correction overhead). 
  2. Quality: The quality of the circuits, which factors in both how long the qubits remain in superposition and entangled before they decohere, and how many qubits can entangle with each other. 
  3. Speed: Typically measured in circuit layer operations per second (CLOPS), or how many circuit layers can run on a Quantum Computer in a given time.  While this is a strong and objective measurement, it is not generally reported at this time. 

Another reason that the “number of qubits” is not useful for comparing performance is that we are currently operating in the NISQ environment (recall the “N” is for noisy).  Accordingly, many constructs are being proposed in which certain physical qubits are dedicated to error correction rather than to additional computation.  IBM has a useful graphic to highlight the tradeoff between physical and logical qubits based on error rates: 

Quantum Computing Milestones 

While the semantics and various yardsticks used to describe Quantum Computer performance are confusing, evolving and not yet universally agreed upon, real progress is being made no matter which metric is showcased.   Here are a few recent advances in early working Quantum Computers, although not all report the same metrics, so it is difficult to compare them to each other: 

In addition to these Quantum Computers, Intel has a 49-qubit QC, Xanadu has a 24-qubit QC, and MIT has a 100-qubit QC; however, the other performance metrics noted in the table are not readily available for these. 

It is worth noting that USTC recently claimed that Zuchongzhi 2.1 is a million times more powerful than Google’s Sycamore, and that it is 10 million to 100 trillion times faster than the world’s fastest supercomputer.  While it is difficult to substantiate these claims, given China’s enormous focus on Quantum Computing, a China-US space race of sorts is certainly afoot.  Also, the recently announced Quantinuum achievement on H1 is worth paying close attention to, given its high quantum volume and long decoherence times. 

Semantics and yardsticks aside, it is fascinating to see the increasing number of companies creating working Quantum Computers with ever-increasing performance metrics, confirming that it is merely “really, really hard” to build these devices and not “ridiculously hard.”  It seems like we see new press releases each week showcasing quantum performance achievements by these and others in the field.  Stay tuned as we track the performance. 


arXiv:1203.5813, “Quantum Computing and the entanglement frontier”, Preskill, John, March 26, 2012 

Quanta Magazine, “Why I Called It ‘Quantum Supremacy’,” Preskill, John, October 2, 2019 

Nature, “Quantum supremacy using a programmable superconducting processor,” Arute, Arya, Babbush, et al., October 23, 2019 

The Independent – UK, “China builds world’s fastest programmable quantum computers that outperform ‘classical’ computers,” Sankaran, Vishwam, October 31, 2021 

Scorecards – Quantum Computing Report, Retrieved December 2021 

Silvestri, Riccardo. Masters Degree Thesis: “Business Value of Quantum Computers: analyzing its business potentials and identifying needed capabilities for the healthcare industry.” August 2020 

The Evolving Quantum Computing Ecosystem

In the past few blogs I have described what a Quantum Computer is and how it can be so powerful and transformative, covered basic features of qubits, and highlighted some of the major players in Quantum Computing (“QC”).  But just like the evolution of personal computing, there are many participants in the QC ecosystem beyond the makers of the actual machines.  You likely use a PC today, manufactured by one of a number of hardware makers.  However, your machine’s core operating memory is made by a different company, and it runs an operating system (likely Windows, from Microsoft) as well as various software applications.  You may also use external data drives, a mouse, a screen, a printer, various cables and other physical devices.  You likely also access the internet and cloud services, and utilize a virus protection program and other related services.  There are likely dozens if not hundreds of companies whose technologies you use daily to operate your computing device. 

Companies like Oracle ($280 billion market cap; database management), Ingram Micro ($5.8 billion market cap; distributor of technology equipment), Cisco ($250 billion market cap; interconnecting equipment and services), Symantec ($15 billion market cap; antivirus protection), Adobe ($311 billion market cap; document and process software) and Salesforce ($262 billion market cap; productivity platform) have created enormous value despite not actually making any computers.    Quantum Computing will likely spur many similar players in its ecosystem; in fact, there are already hundreds of players engaged in this space, and some of them may carve out significant market positions and value.  To give a sense of the breadth and depth of players needed, you can visualize the basic inner workings of a Quantum Computer as follows:

As this graphic shows, there are various aspects to the physical creation and manipulation of qubits (the bottom section of the graphic), along with the software needed to control the logical layer.  Also, as covered in a prior post, there are various ways to create qubits, often requiring cryogenic temperatures and/or detailed laser or radio-frequency controls. 

Here is another graphic to help visualize the complexities of building a quantum computer:

Source: IBM

You’ll note the various wiring, amplification, microwave-generation and cooling components, all requiring highly specialized design and control.  In order to describe the various QC players, it is helpful to segregate them into functional categories, or buckets, as follows:

Hardware: Companies seeking to build a fully-functional Quantum Computer.  Many are also creating software and are integrating access to the cloud.  As discussed in a prior post, there are a few competing technologies underlying the creation of a working Quantum Computer including superconducting loops and Quantum Dots (which require cryogenics), or Ion traps and Photonics (which require sophisticated optics/laser controls), among others.

Circuits/Qubits: Some companies focus on qubits and their interoperability for entanglement rather than attempting to build complete systems.

Cryogenics: Superconducting loops and quantum dots require temperatures that approach “absolute zero” (~negative 460 degrees Fahrenheit).  Many of the pictures you may see of Quantum Computers (like the graphic above) generally depict a 7-tiered structure, whereby the temperature is lowered in each of the layers, and there are companies that specialize in temperature control.

Wiring/Controllers: Operating near absolute zero, using lasers to control individual atoms or manipulating and controlling individual photons all require specialized and sophisticated devices and connections.  Some players are focused just on these types of challenges.

Error Correction: Due to the current NISQ (noisy intermediate-scale quantum) landscape and the need for enormous computing “overhead” to correct for the noise in today’s qubits, some companies are concentrating on error correction strategies.

Photonics:  Lasers and/or photons are being utilized in various QC constructs and some companies are providing this specialization.

Software: Many of the major companies have developed quantum software to control and manipulate the qubits and the gates formed to perform quantum algorithms.  Some of these are creating open-source platforms while others are working on proprietary languages.

Applications:  Although this is still a somewhat immature portion of the market, as Quantum Computers continue to become more and more robust, I expect to see many more businesses develop applications and various related consulting services.

I will describe some of the players in this ecosystem, although the list is vast and growing, so this is not meant to be a definitive roster, rather a sampling to highlight the broad set of players and opportunities in Quantum Computing.  For a more complete list of players, I encourage you to visit this Quantum Computing Report listing.

In a prior post I noted that some of the largest players in the technology space have already dedicated large departments or divisions to Quantum Computing, as highlighted below: 

Each of these firms is making a major push in Quantum Computing, although their “valuation” is more driven by their other activities.  In any case, they are worth following and I expect their QC activities will make up an increasing portion of their values.

For the balance of this post I want to focus more on the players who are dedicated to QC or who have major operating divisions participating in the space, segregated by the categories described above:

Xanadu: Operator of a quantum photonic platform, which it will combine with advanced artificial intelligence, integrating quantum silicon photonic chips into existing hardware to create a full-stack quantum computer.

IonQ: IonQ is a quantum computing hardware and software company developing a general-purpose trapped ion quantum computer and software to generate, optimize, and execute quantum circuits. It is the only company with its quantum systems available through the cloud on Amazon Braket, Microsoft Azure, and Google Cloud, as well as through direct API access and was the first pure-play public QC company.

Atom Computing:  Developer of quantum computers built using individually controlled atoms, creating nuclear-spin qubits made from neutral atoms that can be controlled, scaled, and stabilized optically.

PsiQuantum: PsiQuantum was founded on the premise that if you want a useful quantum computer, you need fault tolerance and error correction, and therefore roughly 1,000,000 physical qubits, to address commercially useful quantum computing applications.

Rigetti: Developer of quantum computing integrated circuits packaged and deployed in cryogenic environments and integrated into cloud infrastructure using pre-configured software.   The company also develops a cloud platform called Forest that enables programmers to write quantum algorithms.

EeroQ: Developer of a quantum cloud platform based on trapping and controlling individual electrons floating in a vacuum above superfluid helium.  The electrons form the qubits, and the purity of the superfluid protects the intrinsic quantum properties of each electron, allowing users to get seamless delivery of computing power.

ColdQuanta: Developer of quantum sensing technologies with a focus on improving the positioning and navigation systems as well as providing cold atom experimentation, quantum simulation, quantum information processing, atomic clocks, and inertial sensing products, enabling users to explore their own quantum matter innovations for sensing and other applications.

Quantum Circuits: The company’s computers are superconducting devices that implement a quantum circuit model for quantum computation with an error correction system, enabling clients to perform error-corrected computation with solid-state quantum bits.

D-Wave: Developer of quantum computing technologies offering annealing algorithms to solve optimization problems for commercial use in logistics, bioinformatics, life, and physical sciences, quantitative finance, and electronic design automation.

Oxford Instruments: Designs and manufactures tools and systems for industry and research. Their Quantum Technologies division helps companies with cryogenics, sensing photons and fabricating novel quantum materials.

Silicon Quantum Computing: SQC is currently developing a 10-qubit quantum integrated circuit in silicon to be delivered in 2023, and has the ultimate goal of delivering useful commercial quantum computing solutions.

Oxford Ionics: Manufacturer of computational electronic systems intended to create the most powerful, accurate, and reliable quantum computers. The company’s systems combine high-quality trapped-ion qubits with its noiseless electronic qubit-control technology to create high-performance quantum computers. 

Teledyne e2V: The engineering groups of Teledyne draw on a portfolio of leading-edge technology, unique expertise and decades of experience in sensing, signal generation and processing for the development and commercialisation of quantum technologies.

Quantum Brilliance: Using synthetic diamonds to develop quantum computers that can operate at room temperature, without the cryogenics or complex infrastructure, enabling disruptive quantum computing applications.

Chronos: Chronos Technology specializes in time, timing, phase, and monitoring solutions and services, including highly accurate atomic clocks and clock synchronization.

BraneCell: Developer of a new quantum processing unit that can function at ambient temperatures. The company offers decentralized quantum computing hardware.

Quantum Machines: Designing quantum controllers that translate quantum algorithms into pulse sequences, enabling organizations to run complex quantum algorithms and experiments in a smooth, intuitive way.

Alpine Quantum Technologies: Developer of ion trap quantum computer technology where single, charged atoms are trapped inside vacuum chambers.  Each qubit is manipulated and measured by precisely timed laser pulses.

Bluefors: Developer of a cryogen-free dilution refrigeration system designed to deliver easy-to-operate refrigerators. The company’s system provides custom unit connection components for different specifications including dilution units, control systems and gas handling units.

kiutra: Developer of a cooling technology intended to offer cryogen-free cooling service. The company’s technology offers sub-Kelvin temperatures for basic research, material science, quantum technology, high-performance electronics, and detector applications.

Toptica: Manufacturer of and distributor of high-end laser systems designed for scientific and industrial applications including for qubit control.

M-Squared: Developer of photonics and quantum technology used specifically for quantum research, biophotonics and chemical sensing applications. The company’s laser-based systems offer lasers and photonic optical instruments for applications in remote sensing, frontier science, bio-photonics, defence, microscopy, spectroscopy and metrology.

Montana Instruments: Delivers best-in-class cryostats that are simple to set up, use, and grow with over time. Since 2009, Montana Instruments has worked with hundreds of category pioneers to build cryostats with purposeful modularity.

Single Quantum: Developer of single-photon detectors designed to detect particles of light. The company’s detectors are based on superconducting nanotechnology.

Sparrow Quantum: Spun out of the Niels Bohr Institute, a developer of a photonic quantum technology based on self-assembled quantum dots coupled to a slow-light photonic-crystal waveguide, enabling nanophotonics researchers to increase light-matter interaction and enhance chip out-coupling.

Quantum Motion: Developer of quantum computer architectures designed to solve the problem of fault tolerance. The company’s architectures leverage CMOS processing to achieve high-density qubits which can scale up to large numbers and tackle practical quantum computing problems, enabling users to help reduce errors and thereby improve quality.

QDevil: Developer of electronics and specialized components for quantum electronics research.  The QFilter is a cryogenic filter for reducing electron temperatures below 100 mK. The product portfolio also includes the QDAC, a 24-channel ultra-stable low noise Digital-Analogue-Converter, the QBoard, a fast-exchange chip carrier system, and the QBox, a 24-channel breakout box.

SeeQC: The company’s technologies are developed and commercialized for quantum information processing applications including scalable fault-tolerant quantum computers and simulators, quantum communications, and quantum sensors, enabling businesses to get access to a full suite of electronic circuit design tools for integrated circuit design including PSCAN2, XIC, WR Spice and InductEx.

Delft Circuits: Manufacturer of cryogenic circuit technologies intended to perform scientific instrumentation, quantum computing, and astronomy. The company’s technology offers custom-engineered superconducting circuits and cryogenic instrumentation which have ultra-low thermal conductance and scalable cryogenic cabling, enabling users to conduct their research with cryogenic circuit packaging as per their need.

Q-CTRL: Developer of quantum control infrastructure software designed to perform quantum calculations to identify the potential for errors. The company’s platform uses quantum sensors to visualize noise and decoherence and then deploy controls to defeat the errors, enabling R&D professionals and quantum computing end users to improve the efficiency and performance of standoff detection as well as precision navigation and timing for defense and aerospace.

TMD Technologies: Manufacturer of professional microwave and radio frequency products primarily focused on the defense and communications markets, as well as providing compact and precise atomic clocks and new gravimetric and magnetic sensors used in quantum computers. 

Terra Quantum: Developer of a hybrid quantum algorithm intended to solve a linear system of equations with exponential speedup that utilizes quantum phase estimation.

QxBranch: Developer of algorithms and software intended to provide predictive analytics, forecasting and optimization for quantum and classical computers.

Zapata: Spun out from Harvard in 2017, developer of a quantum software and algorithms to compose quantum workflows and orchestrate their execution across classical and quantum technologies. The company’s platform provides artificial intelligence, machine learning and quantum autoencoder to deliver an end-to-end, workflow-based toolset for quantum computing that advances computational power.

Cambridge Quantum Computing: Quantum computing software company building tools for commercialization of quantum technologies. The company designs software combining enterprise applications in the area of quantum chemistry, quantum machine learning and augmented cybersecurity in a variety of corporate and government use cases.

RiverLane: Developer of quantum computing software using an ultra-low latency quantum operating system that accelerates quantum-classical hybrid algorithms to facilitate hardware research and development and also develops algorithms to make optimal use of the full quantum computing stack, enabling hardware partners to focus on the physics and build better full-stack solutions.

QCWare: Developer of enterprise software designed to perform quantum computing. The company’s software simplifies QC programming and provides access to QC machines while improving risk-adjusted returns and monitoring networks, enabling clients to integrate quantum computing power into any existing application and remove performance bottlenecks.

StrangeWorks: Strangeworks QC™ is used by thousands of researchers, developers, and companies around the world to learn, teach, create, and collaborate on quantum computing projects, enabling clients to overcome the risks of vendor lock-in and architectural uncertainties. 

1Qbit: 1QB Information Technologies is a quantum computing software company in hardware partnerships with Microsoft, IBM, Fujitsu and D-Wave Systems. 1QBit develops general purpose algorithms focused on computational finance, materials science, quantum chemistry, and the life sciences.

Quantum Computing Inc.: Quantum Computing Inc is focused on providing software tools and applications for quantum computers. Its products include the Qatalyst, Qatalyst Core, and Quantum Application Accelerator. Qatalyst enables developers to create and execute quantum-ready applications on conventional computers while being ready to run on quantum computers where those systems achieve performance advantage.

Quintessence Labs: Developer of quantum-cybersecurity applications designed to implement robust security strategies to protect data. The company’s cybersecurity technologies are used for cryptographic purposes to centralize the management and control of data-security policy and harness quantum science properties, thereby enabling businesses to increase returns on investment from existing assets and reduce data-security complexities.

MagiQ: A research and development company offering quantum cryptography systems. The company’s offerings include optical sensing applications for RF interference cancellation, quantum cryptography, and optical surveillance for advanced energy exploration, enabling customers to better communicate, safeguard and secure their worlds.

Quantinuum: A Honeywell spin-out, the company provides an open-access, architecture-independent quantum software stack and a development platform, enabling researchers and developers to work seamlessly across multiple platforms and tackle some of the most intriguing problems in chemistry, material science, finance, and optimization.

Nu Quantum: Developer of cryptography systems designed to be more secure and time-efficient. The company has created a portfolio of patented, ground-breaking single-photon components fundamental to the realization of commercially viable photonic technologies by combining novel materials and semiconductor technology, enabling clients to securely exchange cryptographic keys worldwide and perform ultra-sensitive detection of light.

ID Quantique: Provider of quantum-safe crypto services designed to protect data for the long-term future. The company offers quantum-safe network encryption, secure quantum key generation, and quantum key distribution, enabling financial clients, enterprises, and government organizations to solve problems by exploiting the potential of quantum physics.

Some of these companies are now publicly traded or about to go public, others are private but well-funded by preeminent venture firms or other institutions.  Many are independent and working hard to establish a strong position in the ecosystem.  Stay tuned to this blog for future reports which will showcase some of the individual players and investment opportunities.


npj Quantum Information, “Building logical qubits in a superconducting quantum computing system,” Gambetta, Chow and Steffen, January 13, 2017

AI Multiple, “QC Companies of 2021: Guide based on 4 ecosystem maps,” Dilmegani, Cem, January 1, 2021

Fact Based Insight, Accessed December 2021

Qubits: A Primer

In a prior post about Superposition and Entanglement (click here to re-read), we learned that superposition allows a qubit to have a value of not just “0” or “1” but both states at the same time, enabling simultaneous computation.  Entanglement enables one qubit to share its state with other qubits enabling the information or processing capability to double with each entangled qubit.   These two features of Quantum Computing, embodied by “qubits,” enable it to perform certain types of calculations substantially faster than existing computers, and underlie the vast potential of Quantum Computing.   In this post I will describe how qubits are currently made and controlled.
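As a quick illustration of that doubling, in plain Python (the function name is my own, and each qubit here is prepared in an equal superposition for simplicity):

```python
import math

def equal_superposition(n: int) -> list:
    """Amplitudes of n qubits each in an equal superposition: the state
    vector has 2**n entries, so it doubles with every qubit added."""
    amp = (1 / math.sqrt(2)) ** n
    return [amp] * (2 ** n)

for n in (1, 2, 3, 10):
    print(n, "qubits ->", len(equal_superposition(n)), "amplitudes")

# Sanity check: the squared amplitudes are probabilities and must sum to one.
probs = sum(a * a for a in equal_superposition(10))
print(round(probs, 6))  # -> 1.0
```

At 53 qubits (Sycamore's scale) the classical description would already need 2^53, or roughly 9 quadrillion, amplitudes, which is exactly why these machines are so hard to simulate classically.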

There are opposing requirements at play which make qubit construction and manipulation exceedingly difficult, although not impossible.  On the one hand, for qubits to be as stable as possible, they need to be immune to external influences such as temperature changes, electromagnetic radiation and vibrations, so that they stay in their “state” until we need to use them.  However, this same isolation makes them very difficult to manipulate.  In addition, qubits operate based on quantum mechanics, which is the physics of incredibly small objects such as individual electrons (often measured by their spin, which is either “spin up” or “spin down”) or photons (measured by their polarization, which is either horizontal or vertical).  Controlling an individual electron or photon adds another layer of difficulty due to their extremely small scale.

When classical computing bits were first developed, several different transistor designs were tried before the industry settled on the MOSFET (metal-oxide-semiconductor field-effect transistor).  Similarly, today there are many ways to create a qubit.  The following is a brief overview of some of the more common types:

Superconducting Qubits: Some leading Quantum Computing firms, including Google and IBM, are using superconducting transmons (an abbreviation derived from “transmission line shunted plasma oscillation qubit”) as qubits.  The core of a transmon is a Josephson junction, which consists of a pair of superconducting metal strips separated by a tiny gap of just one nanometer (less than the width of a DNA molecule).  The superconducting state, achieved at near absolute-zero temperatures, allows a resistance-free oscillation back and forth around a circuit loop.  A microwave resonator then excites the current into a superposition state, and the quantum effects are a result of how the electrons cross this gap.  Superconducting qubits have been used for many years, so there is abundant experimental knowledge, and they appear to be quite scalable.  However, the requirement to operate near absolute zero adds a layer of complexity and makes some of the measurement instrumentation difficult to engineer.

Trapped Ions: Another common qubit construct exploits the electrical charge that ions carry.  Ions are normal atoms that have gained or lost electrons, thus acquiring a charge.  Such charged atoms can be held in place with electric fields, and the energy states of their outer electrons can be manipulated using lasers to excite or cool the target electron.  These target electrons move or “leap” (the origin of the term “quantum leap”) between outer orbits as they absorb or emit single photons.  These photons are measured using photomultiplier tubes (PMTs) or charge-coupled device (CCD) cameras.  Trapped ions are highly accurate and stable, although they are slow to react and require the coordinated control of many lasers.

Photonic Qubits: Photons have no mass or charge and therefore do not readily interact with each other, making them ideal candidates for quantum information processing.  Photons are manipulated using phase shifters and beam splitters and are sent through a maze of optical channels on a specially designed chip, where they are measured by their horizontal or vertical polarization.

Semiconductor/Silicon Dots: A quantum dot is a nanoparticle created from a semiconductor material such as cadmium sulfide or germanium, but most often from silicon (thanks to the deep manufacturing knowledge accumulated over decades in the semiconductor industry). An “artificial atom” is created by confining an extra electron in pure silicon, held in place using electric fields. The spin of that electron is then controlled and measured via microwaves.

Diamond Vacancies: There is a well-known defect that can be manufactured into artificial diamonds, in which a nitrogen atom sits next to a vacancy in the carbon lattice, trapping a single electron.  The spin of this electron can then be manipulated and measured with laser light.  This technology can operate at room temperature and ambient pressure, which are extremely positive attributes, although it has so far proven very difficult to scale to large numbers of qubits.

Topological Qubits: Quasiparticles can be observed in the behavior of electrons channeled through semiconductor structures.  Braided paths can encode quantum information via electron fractionalization and/or ground-state degeneracy, which can be manipulated via magnetic fields.  While this form of qubit is still theoretical, it is being pursued by some large players, including Microsoft.

There are a few other approaches, including Neutral Atoms, Nuclear Magnetic Resonance (which remains experimental and very difficult to scale) and Quantum Annealing (used by D-Wave, one of the first firms to offer commercial “Quantum” computers, although annealing is not a true gate-based construct), and more methodologies are likely to be developed.  Hopefully this provides a high-level flavor for the various types of qubits.  The good news is that many entities have created, manipulated and measured qubits, and there has been repeated success in controlling them into superpositions and in entangling a limited but growing number of qubits at a time.

The following table summarizes some of the benefits and challenges along with selected current proponents of key qubit technologies currently in use:

The “Noisy Intermediate-Scale Quantum” (or “NISQ”) Environment

In prior posts I have covered how qubits use superposition and entanglement to enable massive processing speed for certain applications.  However, the technical and manufacturing challenges noted above for the various qubit types have so far prevented the construction of a very large, error-free system.  There are many competing strategies for creating qubits, each with a different set of advantages and challenges.

In order to have a Quantum Computer that exhibits supremacy over a classical computer, it is estimated that we need at least ~100 “logical” qubits, meaning 100 qubits that maintain their fidelity and coherence for as long as needed to perform a desired analysis.  However, as noted above, qubits are unstable, are easily affected by environmental factors, and are difficult to keep entangled.  These disturbances are generally referred to as “noise”, hence the “N” in NISQ.  One way to combat this noise is to allocate additional qubits to check on or correct each target qubit.  Currently, it is thought that as many as 1,000 “physical” qubits are required to ensure stable utilization of one “logical” qubit, and many firms are focusing exclusively on quantum error correction schemes to address this challenge.  Therefore, in order to create a Quantum Computer with 100 logical qubits in the NISQ phase of quantum computing, 100,000 to 1,000,000 physical qubits are being targeted.  To date, the largest entangled-qubit counts reported are still measured in the hundreds, so there is a long way to go.  That said, this is now an engineering challenge more than a theoretical one, and many of the companies noted in this blog have announced product roadmaps to reach 1,000,000 active physical qubits in the next five years or so.
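To make the overhead concrete, here is a back-of-the-envelope sketch in Python using the roughly 1,000-to-1 physical-to-logical ratio cited above (the ratio itself is an assumption; real overheads depend on hardware error rates and the error-correcting code chosen):

```python
# Illustrative error-correction overhead, assuming ~1,000 physical
# qubits are needed to sustain each "logical" qubit. Real overheads
# depend on gate error rates and the correction scheme used.

def physical_qubits_needed(logical_qubits, physical_per_logical=1_000):
    """Estimate total physical qubits for a target logical-qubit count."""
    return logical_qubits * physical_per_logical

for logical in (1, 100, 1_000):
    print(f"{logical:>5} logical -> {physical_qubits_needed(logical):>9,} physical")
```

At the 1,000:1 ratio, the 100 logical qubits needed for a useful machine land at the bottom of the 100,000 to 1,000,000 physical-qubit range noted above.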

An alternative or competing framework is to create error-correcting qubits.  Today’s transistors have error correction built in, so they operate at extremely high accuracy rates.  The hope is that a method of qubit construction can be created that can self-correct, obviating the need for the massive error-correction overhead noted above in the 1,000:1 ratio of physical to logical qubits.

How do we measure Qubit Performance?

Unfortunately, there is no commonly agreed-upon set of metrics to allow apples-to-apples comparisons among various Quantum Computing configurations.  A few important factors include the number of operations that can be performed before an error occurs, gate fidelities, and gate speeds.  IBM has proposed a “Quantum Volume” construct, intended to provide a single-number metric that factors in several key attributes by quantifying the largest random circuit of equal width and depth that a Quantum Computer can successfully implement.  While the idea of a single, agreed-upon metric has broad appeal, not everyone agrees with IBM’s methodology, so a universal standard is still not available. In the meantime, two great resources for tracking and comparing qubit/Quantum Computer performance metrics include Quantum Computing Report and Fact Based Insight. I’ve provided hyperlinks to their qubit dashboards.
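As a rough sketch of how IBM’s single-number metric behaves: the convention is that a machine whose largest successfully executed “square” random circuit has width and depth n reports a Quantum Volume of 2^n (the statistical “heavy output” test that defines success is simplified away here):

```python
# Hedged sketch of the Quantum Volume convention: if the largest
# "square" random circuit (equal width and depth, size n) that a
# machine executes successfully has size n, the reported Quantum
# Volume is 2**n. The "heavy output" pass criterion that defines
# success is simplified away in this illustration.

def quantum_volume(largest_square_circuit_size):
    return 2 ** largest_square_circuit_size

print(quantum_volume(5))  # -> 32
print(quantum_volume(6))  # -> 64: each extra qubit of circuit doubles QV
```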

So, in order to make objective assessments of Quantum Computer performance claims, it is important to acknowledge the various attributes desired and take a holistic approach to such performance announcements.


Images from Science, C. Bickel, December 2016 and New Journal of Physics, Lianghui, Yong, Zhengo-Wei, Guang-Can and Xingxiang, June 2010

What Happens When ‘If’ Turns to ‘When’ in Quantum Computing?”, Bobier, Langione, Tao and Gourevitch, BCG, July 2021

Fact Based Insight, Accessed December 2021

7 Primary Qubit Technologies for Quantum Computing, Dr. Amit Ray, December 10, 2018

Inside the race to build the best quantum computer on Earth, Gideon Lichfield, MIT Technology Review, February 26, 2020

Follow the Money…the Quantum Computing Goldrush

You’re likely thinking to yourself, “OK, I see there is some potential in Quantum Computers, and some theoretically important use cases, but nobody has created a robust working Quantum Computer…existing qubits only stay coherent for milliseconds at best, so isn’t this all just hype?”

While no one can say for sure, my suggestion, paraphrasing Deep Throat’s instructions to Bob Woodward, is to “follow the money.”

The amount of funding being dedicated to Quantum Computing on a global basis is staggering.  Governments, private companies, venture firms and academic institutions are all committing huge sums of money and resources to this field.  While investment flows are no guarantee of future value, there is a broad common theme to push the development of Quantum Computers, and the equivalent of the modern “space race” is garnering growing attention in the media.  Given the awesome power, potential and disruption that Quantum Computers can deliver, these trends should not be surprising.

The industry is at an interesting crossroads, having evolved from an esoteric theoretical construct to many dozens of firms and academic institutions building actual working (albeit still not very powerful) Quantum Computers. The challenge now is an engineering one, not a theoretical one. And with the growing pool of resources, it should be expected that engineering challenges will be overcome and development will accelerate. When integrated circuits were first being created in the 1950s, very few people could have imagined the boon they would become. Things like personal computers, cellular phones and the Internet were not yet contemplated. Even when PCs became available in the early 1980s, many were skeptical that there was a real market for such an esoteric device. In fact, here is a reprint of an editorial by William F. Buckley Jr., as printed in the Lancaster New Era on July 19, 1982, in which he muses that he cannot fathom any possible way a personal computer could be useful in the home:

Not surprisingly, his point-of-view was strictly in the context of the written word, since he was a writer, so his myopia makes contextual sense. Given that Quantum Computers are based on a completely different set of physics, logic gates and architecture, I am confident that the use cases will expand well beyond any currently contemplated uses and that current skeptics should try to maintain an open mind.

Government Directed Quantum Computing Investments

As can be seen in the chart below, the top ten countries focused on Quantum Computing technology have recently invested or committed over $21 billion towards this field:

The breadth and depth of these commitments are catalyzing the industry and I expect these trends to continue, so even excluding private company investment, there will be significant advancements achieved at the national level.

Major Current Players

Some of the largest players in the technology space have already dedicated large departments or divisions to Quantum Computing, and lead the push to broad adoption, as highlighted below:

Many are already offering their own quantum software platforms and providing early access to prototype machines over the web. For example, anyone can download IBM’s Qiskit open-source Quantum Software Development Kit (SDK), create programs and run them on an IBM quantum emulator. Similarly, you can work with Google’s Cirq, Microsoft’s Q# via Azure Quantum, Alibaba’s Aliyun quantum offering, and others. These firms are leveraging their broad infrastructure, technological resources and established web-based platforms to expand access to, and utilization of, evolving Quantum Computing resources. In addition, in June Honeywell agreed to invest $300 million in its Quantum Computing unit after merging it with Cambridge Quantum Computing.

Venture Investment in Quantum Computing

In addition to the large government programs and the major push by leading technology firms, there is a growing and accelerating focus on Quantum Computing among venture investors. According to the Quantum Computing Report, there have been more than 450 venture investments in Quantum Computing companies made by more than 300 different venture investment firms.  Echoing the growth of Silicon Valley companies funded by legendary Sand Hill Road venture investors, current venture investors are making increasingly large and diverse bets on many parts of the Quantum Computing ecosystem.  The following chart showcases aggregate venture investments in each of the past three years (with more than a month still left in 2021):

A few venture firms have focused on Quantum Computing investments, with 17 firms making 3 or more such investments and with two (Quantonation and DCVC) making 10 or more each, as highlighted in the following table:

Not only has the playing field for Quantum Computing investments been growing, but there have been some very significant investments made. The following highlights some of the larger announced venture investments:

Sources: PitchBook, Boston Consulting Group

Of these companies, IonQ became the first-ever pure-play Quantum Computing company to go public, debuting on the NYSE on October 1, 2021 and as of Nov. 23rd had a market capitalization of $4.8 BILLION. Rigetti Computing also recently announced it would be going public in an expected $1.5 billion reverse merger with a SPAC. The latest PsiQuantum investment was announced this past summer and included a $450 million investment at a valuation exceeding $3 billion, with ambitious plans to build a commercially viable Quantum Computer by 2025.

University Focus on Quantum Computing

Quantum computing and quantum information theory have gone from being fringe subjects to a full complement of classes in well-funded programs at quantum centers and institutes at leading universities.  Some world-class universities offering dedicated Quantum Computing classes and research efforts include:

  • University of Waterloo – Institute for Quantum Computing
  • University of Oxford
  • Harvard University – Harvard Quantum Initiative
  • MIT – Center for Theoretical Physics
  • National University of Singapore and Nanyang Technological University – Centre for Quantum Technologies
  • University of California Berkeley – Berkeley Center for Quantum Information and Computation
  • University of Maryland – Joint Quantum Institute
  • University of Science and Technology of China – Division of Quantum Physics and Quantum Information
  • University of Chicago – Chicago Quantum Exchange
  • University of Sydney, Australia
  • Ludwig Maximilian University of Munich – Quantum Applications and Research Lab
  • University of Innsbruck – Quantum Information & Computation

These Colleges and Universities, as well as many others, continue to add courses and departments dedicated to Quantum Computing.

We are witnessing an unprecedented concentration of money and resources focused on Quantum Computing, including substantial government initiatives, major commitments from industrial players, accelerating venture investment and expanding university programs. While not every investment will pay off, and the landscape continues to evolve, serious, smart money is backing this trend. The clear message is that this focus of resources will lead to engineering breakthroughs and immense value creation. There are now hundreds of companies jockeying for position in this evolving field. Stay tuned to this blog as we watch for the winners and losers.


Jean-Francois Bobier, Matt Langione, Edward Tao and Antoine Gourevitch, “What Happens When ‘If’ Turns to ‘When’ in Quantum Computing”, Boston Consulting Group, July 2021.

Hajjar, Alamira Jouman, 33+ Public & Private Quantum Computing Stocks, AI Multiple, May 2, 2021

Inside Quantum Technology News, Government Investments in Quantum Computing Around the Globe, May 31, 2021.

Pitchbook Database, Retrieved November 2021

Universities With Research Groups — Quantum Computing Report, Retrieved November 2021

Venture Capital Organizations — Quantum Computing Report, Retrieved November 2021

Quantum Quantum Everywhere

Quantum Mechanics in Everyday Life, Near Term Use and Future Quantum Computing Applications

In prior posts, I conveyed some of the underlying reasons why Quantum Computers can do things that existing digital computers cannot do, or would take prohibitively long to do. In this post I will cover some of the near-term use cases for Quantum Computing. But first I want to cover how “Quantum”, or, specifically, the quantum mechanics underlying the power of Quantum Computing, is already used in our daily lives, then some near-term applications where quantum effects are providing powerful new capabilities, and finally the areas where the power of Quantum Computing will likely have the most impact.

Quantum Mechanics in Everyday Life

Anyone trying to learn about Quantum Computing or quantum mechanics is likely baffled by how to picture it in a relatable way. Because quantum mechanics occurs on such a small scale and its physics is wholly unfamiliar, it is an intimidating field and very difficult to visualize. However, we use and benefit from quantum mechanics every day without understanding the underlying physics. Here are some examples (Choudhury, 2019):

  1. Electronic Appliances: The heating elements in your toaster glow red because electrical power is being converted into heat and light, a quantum mechanical process.
  2. Computers (transistors and microchips): The core transistors in every computer (and in the chips used in many other modern products) work via semiconductors, where the electrons behave like waves, which is a core principle of quantum physics.
  3. LEDs: Like transistors, LEDs are made of two layers of semiconductor that meet at a junction and release the energy supplied by the power source as light, again a quantum physical action.
  4. Lasers: Lasers produce their monochromatic light via a form of optical amplification based on the stimulated emissions of photons, another quantum physical process.
  5. MRI’s: Magnetic Resonance Imaging works by flipping the spins in the nuclei of hydrogen atoms.
  6. GPS: the ubiquitous Global Positioning System, whose interconnected satellites use atomic clocks and principles of quantum theory and relativity to measure time and distance.
  7. Incandescent Bulbs: Like with the toaster noted above, current passes through a thin filament and makes it hot, which causes it to glow, which creates visible light – all quantum mechanical processes.
  8. Sensors: Nearly all of us have digital cameras or use the cameras in our phones.  These cameras use a lens to collect and convey photons, which the sensor, a form of semiconductor, converts to a digital image.

Hopefully, these examples give you the confidence to appreciate that quantum physics impacts your everyday life without any need to understand the underlying physics.  Let’s use that baseline to now explore applications of quantum physics in quantum sensing, quantum communications and, finally, Quantum Computing.

Quantum Sensing

Quantum sensing has a broad variety of use cases including enhanced imaging, radar and for navigation where GPS is unavailable.   None of these uses require entanglement, so these are much nearer to actual utilization than robust Quantum Computers. 

Probes that take highly precise measurements of time, acceleration, and changes in magnetic, electric or gravitational fields can provide precise tracking of movement.  If the starting point is known, every subsequent position can be computed from those measurements, without the need for external GPS signals and without any signals for an adversary to jam or interfere with, which makes this of particular interest to the military.
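The idea behind such inertial navigation can be sketched with a toy dead-reckoning loop: integrate measured acceleration twice to track position with no external signal at all (the readings and timestep below are invented for illustration):

```python
# Toy dead-reckoning loop: twice-integrate a stream of acceleration
# readings to track position without any external positioning signal.
# The readings and timestep here are illustrative only.

def dead_reckon(accelerations, dt, position=0.0, velocity=0.0):
    """Return final position given a known start and measured accelerations."""
    for a in accelerations:
        velocity += a * dt         # integrate acceleration into velocity
        position += velocity * dt  # integrate velocity into position
    return position

# Constant 1 m/s^2 for ten 1-second steps: velocity climbs to 10 m/s
# and the accumulated position is 1 + 2 + ... + 10 = 55 metres.
print(dead_reckon([1.0] * 10, dt=1.0))  # -> 55.0
```

The catch, of course, is sensor drift: tiny measurement errors compound through the double integration, which is exactly why ultra-precise quantum sensors are so attractive here.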

Another application of quantum sensing involves ghost imaging and quantum illumination.  Ghost imaging uses quantum properties to detect distant objects using very weak illumination beams that are difficult for the target to detect, and which can penetrate smoke and clouds (Shapiro, 2008).  Quantum illumination is similar and can be used in quantum radar. 

Tabletop prototypes of these quantum sensing applications have already been demonstrated and have the nearest-term commercial potential (Palmer, 2017).

Quantum Communication

The primary near-term application of quantum mechanics in communications involves quantum key distribution (QKD).  QKD is a form of encryption (more on encryption below) used between two communicating parties who encode their messages in transmitted photons.  Due to the quantum nature of photons, any eavesdropper who intercepts a message encoded with QKD will leave a telltale sign that the data stream was read, since the act of viewing a photon alters it (a fundamental principle of quantum mechanics).  For this reason, quantum-secure communication is referred to as “unhackable”.  This principle has already been demonstrated over fiber optics and across line-of-sight towers (both of which have limits on distance) and, more recently, via satellite.  China launched the Mozi satellite in 2016 and beamed a completely secure QKD-encrypted message between China and Austria (Liao et al., 2018).  And this past month CAPSat, a quantum communication satellite built in collaboration between the University of Illinois Urbana-Champaign and the University of Waterloo, was deployed into orbit from the ISS and is designed to test unhackable quantum communications.  So long-range quantum communication is already becoming a reality (Schwink, 2021).

Quantum Computing

So far in this post I have shown you how quantum physics already impacts your everyday life as well as some new applications that are already in use or have shown success via prototypes, so will be utilized near-term.  The least commercially developed feature of quantum physics, but the most profoundly beneficial, involves the superposition and entanglement of qubits in Quantum Computing [covered in detail in the prior post].

I want to make clear that “Quantum Computers” are not all-powerful supercomputers that will replace existing binary-based computers.  An essential feature of Quantum Computing is the exponential increase in computing power as the number of entangled qubits grows, which distinguishes it from digital computing for certain types of calculations or problems.  The most fundamental area where this exponential speedup is valuable is known as combinatorics.  Let me provide an example to set the stage for this discussion.

Assume you manage a networking group, and you are planning the seating chart for this month’s meeting where eight members are going to attend. You want to arrange the seating so that you help optimize the networking opportunities as well as respect seniority by having certain members sit facing the door, etc. (the reasons are not important, just assume that the seating chart has many nuances). You may think this is an easy exercise – for example, put Alice and Bob next to each other, but not next to Charlie since they already know each other.  Put Sam closest to the door, etc.  However, it turns out that there are more than 40,000 different seating arrangements with just 8 people (for those trying to decipher the math, it is 8! or 8 factorial, meaning place any of the 8 attendees in the first seat, then any of the 7 remaining attendees in the next seat, etc., or 8 x 7 x 6 x 5 x 4 x 3 x 2 x 1 = 40,320 different seating combinations).  This may seem more complicated than you expected, but intuitively you may feel that you could work it out if you had to.

However, imagine that at the next month’s meeting 16 members attend and you want to be equally diligent about the seating arrangement.  For this meeting there are now 20,922,789,888,000 possible seating arrangements, or more than 20 trillion, with just 16 people (16 × 15 × 14 × …).  This defies intuition but is simple factorial math.  Now, I am not suggesting we need Quantum Computers to help with seating charts, but a seating chart represents a typical “optimization” challenge: for certain problems, as you increase the number of inputs, the potential combinations become unmanageable very quickly, hence the reference to combinatorics.
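The arithmetic above is just the factorial function, which Python can confirm directly:

```python
import math

# Seating arrangements for n guests: n! (n factorial).
def seatings(n):
    return math.factorial(n)

print(seatings(8))   # -> 40320 (the 8-person meeting)
print(seatings(16))  # -> 20922789888000 (over 20 trillion for 16 people)
```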

Where will Quantum Computers Provide Near-Term Results?

The superposition and entanglement of qubits enables Quantum Computers to consider many combinations simultaneously instead of linearly, hence the tremendous speed-up in processing.   Let’s now dig into two areas where Quantum Computers can use these speedup features to provide a “quantum advantage” in the ability to process currently unmanageable combinatorial problems, namely simulation/optimization and cryptography. 

Simulation and Optimization

For optimization, you can imagine our networking seating problem as analogous to molecular modeling for things such as drug development or materials science.

PASIEKA/Getty Images

In these cases, as you tweak the atoms, molecules or proteins you are studying, the number of different alignments or configurations increases quickly, as in the seating chart example.  A powerful Quantum Computer could simulate and evaluate many potential configurations simultaneously and could dramatically accelerate advances in these fields.  Here are some examples where Quantum Computers can accelerate computational problems:

  • Simulation: Simulating processes that occur in nature and are difficult or impossible to characterize and understand with classical computers, which has the potential to accelerate advances in drug discovery, battery design, fertilizer design, fluid dynamics, weather forecasting and derivatives pricing, among others.
  • Optimization: Using quantum algorithms to identify the best solution among a set of feasible options, such as in supply chain logistics, portfolio optimization, energy grid management or traffic control.

The table below highlights additional examples of fields where Quantum Computing speedup will manifest:

Here are examples regarding a few of these applications along with some of the companies already deploying early quantum computing programs:

  • Today, most new drugs are formulated by trial and error, and the time between finding a new drug molecule and getting it into the clinic averages 13 years and costs up to $2 billion. If we can use Quantum Computers to model various drugs in silico, instead of through the trial and error of lab experiments, we could shorten this timeline and decrease the overall costs. Recently, healthcare giant Roche announced a partnership with Cambridge Quantum Computing to support research efforts tackling Alzheimer’s disease. And synthetic biology company Menten AI has partnered with quantum annealing company D-Wave to explore how quantum algorithms could help design new proteins with therapeutic applications.
  • Fertilizers are crucial to feeding the world’s growing population because they allow food crops to grow stronger, bigger and faster. More than half of the world’s food production relies on synthetic ammonia fertilizer created by the Haber-Bosch process, which converts hydrogen and nitrogen to ammonia. However, this process has an enormous carbon footprint, including the energy needed to perform the conversion (some estimate this at 2%-5% of ALL global energy production) as well as the huge amount of carbon-dioxide by-product it emits. Scientists believe that using a Quantum Computer, they could map the chemistry used by certain bacteria that naturally create fertilizers and uncover an alternative to the current Haber-Bosch process. In fact, Microsoft has already demonstrated how Quantum Computers could be used to model this chemistry and has created a Quantum Chemistry Library to facilitate such research.
  • There is a global push to expand battery powered automobiles in a transition to a greener economy, but existing car batteries have limited capacity/range and long charge times. Searching for materials with better properties is another molecular simulation problem that can be better handled by Quantum Computers. That is why German car maker Daimler has partnered with IBM to assess how Quantum Computers could help simulate the behavior of sulphur molecules in different environments, with the end-goal of building lithium-sulphur batteries that are longer-lasting, better performing and less expensive than existing lithium-ion batteries.
  • The “traveling salesman problem” generally describes the challenge of optimizing the routing for businesses, another area where combinatorics makes the problems exponentially difficult to resolve as inputs are added. For example, a fleet of more than 50,000 merchant ships carrying 200,000 containers each, with a total value of $14 trillion dollars, is actively in motion each day. Energy giant ExxonMobil has teamed up with IBM to find out if Quantum Computers could do a better job optimizing these routes and related logistics.
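A tiny brute-force version of the traveling salesman problem shows why this matters: checking every route is easy for a handful of stops, but the number of routes grows factorially, just like the seating chart (the coordinates below are made up for illustration):

```python
import itertools
import math

# Brute-force traveling-salesman sketch. The stops are invented for
# illustration; a fixed start plus n-1 free stops means (n-1)!
# candidate routes, which blows up factorially as stops are added.

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def shortest_route(stops):
    """Check every ordering of the stops after the fixed starting point."""
    start, rest = stops[0], stops[1:]
    best_length, best_route = None, None
    for perm in itertools.permutations(rest):
        route = (start,) + perm
        length = sum(dist(route[i], route[i + 1]) for i in range(len(route) - 1))
        if best_length is None or length < best_length:
            best_length, best_route = length, route
    return best_length, best_route

length, route = shortest_route([(0, 0), (3, 0), (3, 4), (0, 4)])
print(length)  # -> 10.0 (around the rectangle, not across the diagonals)
```

With 10 stops there are already 362,880 orderings to check, and each added stop multiplies the count again, which is why classical brute force hits a wall so quickly.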

In the next blog I will cover additional details on the players currently working with Quantum Computers for these and similar applications.


Another field where Quantum Computers will have a profound impact is encryption.  Nearly every time you log into a website, perform online banking transactions, or when governments send confidential communications between entities, that activity is “on the web”, meaning potentially accessible to others.  It is protected by an encryption protocol developed by Ron Rivest, Adi Shamir and Leonard Adleman in 1977 and known as RSA public-key encryption.

In a very truncated description, the foundation of RSA encryption is a “factoring problem” built from two very large prime numbers.  Here is an over-simplified explanation:

  1. A Receiver picks two very large prime numbers and multiplies them together.  The resulting product forms the basis of the Public Key, which is published openly.
  2. A Sender uses the Public Key to encrypt or encipher a message and sends it over the Internet (in theory, anyone can see/read it).
  3. The Receiver keeps the two prime factors secret; together they form the Private Key, which is never transmitted.
  4. The Receiver uses its Private Key to decrypt or decipher the message.

The encoded message cannot be decoded without knowing this Private Key. Said another way, finding the two prime factors of a very large number is exceedingly difficult, so if the RSA key is based on a sufficiently large number (i.e., 2048 bits, which is over 600 digits long), it is practically impossible with current computers to find the two prime factors. However, in 1994, mathematician Peter Shor proposed a quantum algorithm that could factor large numbers into their primes in polynomial time. Open-source implementations of his algorithm are available on the Internet for anyone to download [for those interested in seeing actual code, you can visit this GitHub implementation of Shor’s algorithm written in Python calling Q# for the quantum part]. Existing Quantum Computers only have the power to factor fairly small numbers, but the code is readily available for whoever creates a Quantum Computer powerful enough to break existing RSA encryption.
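To make the public/private-key relationship concrete, here is a toy RSA round-trip with deliberately tiny primes (real RSA moduli are hundreds of digits long; Python’s built-in pow handles both the modular exponentiation and the modular inverse):

```python
# Toy RSA round-trip with tiny, insecure primes, mirroring the
# numbered steps above. Never use numbers this small in practice.

p, q = 61, 53                  # the Receiver's two secret primes
n = p * q                      # 3233, the public modulus
phi = (p - 1) * (q - 1)        # 3120
e = 17                         # public exponent, coprime with phi
d = pow(e, -1, phi)            # private exponent: modular inverse of e

message = 65
ciphertext = pow(message, e, n)    # anyone holding (e, n) can encrypt
decrypted = pow(ciphertext, d, n)  # only the holder of d can decrypt
print(message, ciphertext, decrypted)  # -> 65 2790 65

# Breaking this toy key means factoring n: trivial here by trial
# division, but infeasible classically for 2048-bit moduli. Shor's
# algorithm would make this factoring step efficient on a quantum
# computer.
factor = next(f for f in range(2, n) if n % f == 0)
print(factor)  # -> 53
```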

Cryptocurrency mining and wallets are also areas that could be vulnerable to Quantum Computers. Bitcoin and other cryptocurrencies are “mined” by computers that race to solve cryptographic puzzles, and those puzzles are automatically made harder over time (which is why mining consumes increasing amounts of power). By some estimates, the current bitcoin protocol will take another 120 years to mine the remaining coins, so once Quantum Computers are powerful enough, they could mine the remaining coins much faster. In addition, the wallets most people use to hold their cryptocurrency rely on public-key cryptography with vulnerabilities similar to those described above regarding encryption.

I hope this post helps you appreciate how quantum mechanics already affects your everyday life and to begin to appreciate areas where Quantum Computers will have a profound impact.   Stay tuned for a deeper dive into this subject.


Bobier, Jean-Francois; Langione, Matt; Tao, Edward; and Gourevitch, Antoine, “What Happens When ‘If’ Turns to ‘When’ in Quantum Computing,” Boston Consulting Group, July 2021.

Bodur, Hüseyin and Kara, Resul, “Secure SMS Encryption Using RSA Encryption Algorithm on Android Message Application,” 2015.

Bova, Francesco; Goldfarb, Avi; and Melko, Roger G., “Commercial applications of quantum computing,” EPJ Quantum Technology, 2021.

Cavicchioli, Marco, “How fast can quantum computers mine bitcoin?,” The Cryptonomist, May 12, 2020.

Choudhury, Ambika, “8 Ways You Didn’t Know Quantum Technology Is Used In Everyday Lives,” Analytics India Magazine, October 7, 2019.

Leprince-Ringuet, Daphne, “Quantum computers: Eight ways quantum computing is going to change the world,” ZDNet, November 1, 2021.

Liao, Sheng-Kai; Cai, Wen-Qi; and Pan, Jian-Wei, “Satellite-to-ground quantum key distribution,” Nature, August 9, 2017.

Palmer, Jason, “Here, There and Everywhere: Quantum Technology Is Beginning to Come into Its Own,” The Economist, 2017.

Parker, Edward, “Commercial and Military Applications and Timelines for Quantum Technology,” RAND Corporation, July 2020.

Schwink, Siv, “Self-annealing photon detector brings global quantum internet one step closer to feasibility,” University of Illinois Urbana-Champaign Grainger College of Engineering, October 13, 2021.

Quantum Superposition and Entanglement

In prior posts I emphasized the excitement and potential of Quantum Computing without any reference to the underlying quantum mechanics, but I would like to introduce you to some unique quantum properties in this post. While Quantum Computing is complex and involves many new concepts, a basic understanding of “Superposition” and “Entanglement” is fundamental to grasping why this new computing methodology is so novel, powerful and exciting.  I am going to try to describe these concepts in a way that does not require any math, although I will use some math references to highlight how these concepts manifest.


As noted in the prior post, one of the fundamental differences between Quantum Computers and classical computers lies in the core components used to process information.  We know that classical computers use a binary system, meaning each processing unit, or bit, is either a “1” or a “0” (“on” or “off”), whereas Quantum Computers use qubits, which can be “1” or “0” (typically “spin up” or “spin down”) or both at the same time, a state referred to as being in a superposition.  This is a bit more subtle than it sounds, because in order to use qubits for computational purposes they need to be measured, and whenever you measure a qubit it collapses into either the 1 or 0 state.  But between measurements, a quantum system can be in a superposition of both at the same time.  While this seems counter-intuitive, and somewhat supernatural, it is well proven, so please try to accept it at face value in order to get the gist of the other concepts covered in this post.  For a deeper dive into superposition from a particle physics perspective (light is both a particle and a wave), you can investigate wave-particle duality. [Fun Fact: Einstein did not receive the Nobel Prize for his famous E=mc² relativity equation but rather for his work on the photoelectric effect, which is fundamental to quantum mechanics. There he postulated the existence of photons, or “quanta” of light energy, a concept that underpins much of the power behind Quantum Computing.]

Without getting into the physics or explaining complex numbers, Superposition can be mathematically depicted as:

                |ψ⟩ = α|0⟩ + β|1⟩                    |α|² + |β|² = 1

Please bear with me here; I have promised not to overwhelm you with complex math, and I only want to highlight how to think about Superposition in a way that will help you appreciate its power for computing, and to share the nomenclature generally used in the field.  Don’t focus on the Greek characters (psi, alpha and beta) or the linear algebra notation (the |Ket> notations).  Simply note that the equation on the left is the mathematical representation of a qubit (the psi, or trident-looking, symbol), stating that there is an amplitude for the qubit to be measured as “0” (the alpha symbol) and an amplitude for it to be measured as “1” (the beta symbol); squaring these amplitudes gives the probability of each outcome.  The equation on the right, known as the Born rule, simply states that the two probabilities add to 100%.  Let me reframe that in a simpler manner: before a qubit is actually measured, it is both a “1” and a “0” at the same time, and the relative odds of it being one or the other are encoded in the qubit equation. In practice this means that using Quantum Computers to solve problems becomes a probabilistic analysis, and if the computation is run enough times, the results converge to the answer.
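To make that probabilistic behavior concrete, here is a toy Python sketch (the function name is my own) that simulates repeatedly measuring a qubit with amplitudes α and β. Run enough shots, and the observed frequencies converge to the squared amplitudes, exactly as the Born rule predicts:

```python
import random

def measure_qubit(alpha: float, beta: float, shots: int = 100_000):
    """Simulate repeated measurement of the state alpha|0> + beta|1>.

    The Born rule says P(0) = |alpha|^2 and P(1) = |beta|^2; each
    measurement collapses the superposition to a definite 0 or 1.
    """
    assert abs(alpha**2 + beta**2 - 1) < 1e-9, "amplitudes must satisfy the Born rule"
    zeros = sum(1 for _ in range(shots) if random.random() < alpha**2)
    return zeros / shots, (shots - zeros) / shots

p0, p1 = measure_qubit(0.8, 0.6)
print(p0, p1)  # close to 0.64 and 0.36 -- the squared amplitudes
```

Note that any single measurement yields only a 0 or a 1; the amplitudes reveal themselves only in the statistics across many runs, which is why quantum algorithms are typically executed many times.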

You may recall from a prior post that qubits are described as three-dimensional, as shown below in blue and red. The line drawing version with the funny symbols shows how this is used for calculations when put into a superposition (the math and symbols are helpful for those comfortable with them, but not essential for a general understanding):

In this depiction, if the north pole is “0” and the south pole is “1”, and the qubit is tilted to the side, the degree of its tilt translates generally into these probabilities.   In the image with the blue arrow pointing to psi, you will notice that the psi symbol looks like it is leaning between north and south, in this case closer to north.  For example, this might be represented as “0.8 |0> + 0.6 |1>”, meaning it is leaning closer to “0” and would therefore have a higher probability of being 0 when measured. [You will also note that if you square each term, you get 0.64 + 0.36, which equals 1 and therefore follows the Born rule; roughly, the odds of this qubit being a 0 are 64% and the odds of it being a 1 are 36%.]
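For those comfortable with a little code, the tilt-to-probability mapping can be sketched as follows, assuming the standard Bloch-sphere convention in which a state at polar angle θ from the north pole is cos(θ/2)|0> + sin(θ/2)|1> (the function name is illustrative):

```python
from math import cos, sin, radians

def tilt_to_probabilities(theta_degrees: float):
    """Map a qubit's 'tilt' from the north pole (|0>) to measurement odds.

    On the Bloch sphere, a state at polar angle theta is
    cos(theta/2)|0> + sin(theta/2)|1>, so the probabilities are the squares.
    """
    half = radians(theta_degrees) / 2
    alpha, beta = cos(half), sin(half)
    return round(alpha**2, 4), round(beta**2, 4)

print(tilt_to_probabilities(0))    # → (1.0, 0.0)  pointing straight at |0>
print(tilt_to_probabilities(180))  # → (0.0, 1.0)  pointing straight at |1>
print(tilt_to_probabilities(90))   # → (0.5, 0.5)  the equator: a 50/50 coin flip
```

The half-angle (θ/2) is why a qubit pointing at the equator, seemingly “halfway” to |1>, gives a perfect 50/50 split.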

The important part to take away is that since a qubit can represent an input with various weightings of 1 and 0, it contains much more information than a simple binary bit.


If the above explanation of superposition seems a bit unintuitive, I’m afraid that entanglement might seem even more bizarre [don’t worry, you’re not alone, even Einstein struggled with it], but I will do my best to explain it.  I will ask you to accept that, like Superposition, this is an actual phenomenon which is well proven, even if you likely won’t be able to picture it in your mind in a way that will be satisfying.  And it is this feature of quantum mechanics that largely underpins the awesome power of Quantum Computers because it enables the information processing to scale by an exponential factor (more on that below).

Quantum entanglement is a physical phenomenon that occurs when a group of particles are created, interact, or share proximity in such a way that the particles become “connected,” even if they are subsequently separated by large distances.  Qubits are made of things such as electrons (which spin in one of two directions) or photons (which are polarized in one of two directions), and when they become “entangled,” their spin or polarization becomes perfectly correlated.  In fact, this concept is much older than Quantum Computers; it was first described in 1935 by Einstein, along with Boris Podolsky and Nathan Rosen, and became known as the EPR (their initials) paradox, an effect which Einstein famously referred to as “spooky action at a distance.”  What it means, simply, is that qubits can be made to entangle, which then enables them to correlate with each other. Quantum Computers use microwaves or lasers to nudge the qubits into a state/alignment where they can be correlated, or entangled.
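Entanglement itself cannot be reproduced classically, but the perfect correlation of the simplest entangled pair, the Bell state (|00> + |11>)/√2, can be sketched in a few lines of Python (a toy illustration of the statistics only, not a real quantum simulation; the function name is my own):

```python
import random

def measure_bell_pair(shots: int = 10_000):
    """Simulate measuring both qubits of the entangled state (|00> + |11>)/sqrt(2).

    Each shot yields 00 or 11 with equal probability -- the individual
    results are random, but the two qubits always agree.
    """
    outcomes = []
    for _ in range(shots):
        bit = random.randint(0, 1)   # the shared random outcome
        outcomes.append((bit, bit))  # both qubits collapse together
    return outcomes

results = measure_bell_pair()
assert all(a == b for a, b in results)  # perfectly correlated, every time
```

Each qubit on its own looks like a fair coin flip, yet the two always match, no matter how far apart they are when measured; that is the correlation Einstein found so “spooky.”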

Now let’s walk through how this entanglement can manifest in an exponential increase in computing power.  If we consider two classical bits, we know that they can be either 1 or 0, so together they can take on the following values:

                0, 0

                1, 0

                0, 1

                1, 1

However, two entangled qubits can take all of those values at once, because of the entanglement, so in this case 2 qubits can take the value of 4 bits.  If we consider three classical bits, they can be any of the following combined entries:

                0, 0, 0                    1, 0, 0

                0, 1, 0                    1, 1, 0

                0, 0, 1                    1, 0, 1

                0, 1, 1                    1, 1, 1

So, in this case there are 8 combinations, but this can be fully described using 3 qubits (again, because they are entangled).  Mathematically, the number of classical bits required to match the computing power of n qubits is 2^n, an exponential relationship.  For now, don’t try to picture how entangled qubits do this, just know that they do.  The purpose of this line of analysis is to give some numerical context as to why this entanglement makes Quantum Computers (phenomenally) more powerful than classical computers.
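To make the 2^n relationship concrete, here is a minimal Python sketch (function name illustrative) that enumerates the classical bit-combinations that n qubits can hold in superposition:

```python
from itertools import product

def classical_states(n: int):
    """List every n-bit value; n qubits can hold a superposition of all of them."""
    return list(product([0, 1], repeat=n))

for n in (1, 2, 3, 10):
    print(n, "qubits span", len(classical_states(n)), "basis states")  # 2**n each
```

Notice that the classical list doubles with every added bit, which is exactly why simulating even modest numbers of qubits overwhelms classical machines.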

If we continue the logic above for increasing numbers of bits/qubits, we get the following:

                Qubits (n)          Equivalent classical bits (2^n)
                1                   2
                3                   8
                5                   32
                10                  1,024
                13                  8,192 (= 1 kilobyte)

So it only takes 13 qubits to have the equivalent classical computing power as a kilobyte (KB). Now let’s see how that manifests in computer power/speed.

Let’s assume we have a pretty powerful current classical computer processor, which might have a clock speed of 4 GHz, meaning it can execute 4 BILLION cycles per second, which sounds (and is) phenomenally fast.  High-end gaming PCs generally operate at this speed and provide an excellent performance experience.  Let’s now use this baseline processing speed, and scale up the prior table, to see the profound impact of exponential computing power on processing time:
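The scaling can be sketched with simple arithmetic, under the generous simplifying assumption that the classical machine examines one state per 4 GHz clock cycle (the function name is my own):

```python
def classical_enumeration_time(n_qubits: int, clock_hz: float = 4e9) -> float:
    """Seconds for a classical processor to step through all 2^n states,
    assuming (very generously) one state per clock cycle."""
    return 2**n_qubits / clock_hz

for n in (30, 50, 80, 100):
    seconds = classical_enumeration_time(n)
    years = seconds / (365.25 * 24 * 3600)
    print(f"{n} qubits -> {seconds:.3g} seconds (~{years:.3g} years)")
```

At 30 qubits the job takes well under a second; by 100 qubits it takes on the order of 10^13 years, far longer than the age of the universe, and no realistic hardware improvement changes that conclusion.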

To give this some context, 100 qubits corresponds to 2^100, or roughly 10^30, classical states, vastly more information than even the largest classical memories can hold (an exabyte, a million trillion bytes, is “only” 10^18). It would take a powerful classical computer longer than the lifetime of the universe to step through that amount of data! The corollary is that quantum computers can perform certain complex calculations phenomenally faster than classical computers, and this concept of entanglement is a key to the performance superiority.

While this analysis assumes certain types of computations, the point is to show that Quantum Computers can process information with an unprecedented speedup.

The key takeaway from this post is that Quantum Computers, using qubits that can be both in superposition and entangled, allow these machines to process inputs much faster than is possible with classical computing architecture.  Now all we need is a reliable working Quantum Computer with just 100 qubits, which would not only enable massive speedup for certain problems but would open the door to all sorts of new questions and analyses. Many experts predict this will be achieved within 5 years (IonQ has recently showcased a Quantum Computer with 32 entangled qubits, so real progress is being made).  Some general categories of problems where this phenomenal speedup will have a profound impact include simulation, optimization and encryption. In the next post I will provide more insights into what types of problems can be solved with the exponential speedup that can be provided by Quantum Computers.




What is a Computer? – Analog vs Digital vs Quantum

Before we can get into the inner workings of a Quantum Computer, we should make sure we are in alignment on what a computer actually is.  At its core, a computer is a machine designed to perform prescribed mathematical and logical operations at high speed and display the results of these operations.  Generally speaking, mankind has been using “computers,” in the form of the abacus, since circa 2700 BC.  Fast forward a few millennia, and we see the first “programmable computer” designed (though never completed) by Charles Babbage in the 1830s.  It then took another 100+ years for the first working “electromechanical programmable digital computer,” the Z3, to be built by Konrad Zuse in 1941.

During World War II, a flurry of advances occurred, including the use of vacuum tubes and digital electronic circuits, and the development of the famously depicted Bombe, the electromechanical device used to break the Enigma encryption of German military communications. This was soon followed by Colossus in 1944, the first “electronic digital programmable computer,” which was also used for military advantage. The Bombe and Colossus were built at Bletchley Park in the UK, while ENIAC was the first such device built in the US, constructed between 1943 and 1945. It weighed 30 tons and contained over 18,000 vacuum tubes, 1,500 relays, and hundreds of thousands of resistors, capacitors and inductors, but could add or subtract 5,000 times per second, which was a thousand times faster than any prior device, and it could also handle multiplication, division and square roots. In many ways, the tangle of cables and electronics in photos of ENIAC (below left) seems eerily similar to photos of today’s Quantum Computers (below right):

The next big advance in computers came with the invention of the integrated circuit in the 1950s.  By 1968, the first silicon-gate semiconductor integrated circuit was developed at Fairchild Semiconductor, built around the MOSFET (short for metal-oxide-semiconductor field-effect transistor), the core technology underpinning most current “digital computers”.

Moore’s Law

The first commercial MOSFET processors, built in 1971, had process nodes that were 10 microns in size, a fraction of the width of a human hair (which is about 50-70 microns).  Gordon Moore was a co-founder of Fairchild Semiconductor, and in 1975 he postulated that the number of transistors that could fit on an integrated circuit would double every two years, implicitly suggesting that costs would thereby decrease by a factor of two.  This log-linear relationship was estimated, at the time, to continue for ten years, but it has amazingly held fairly consistently through today, meaning nearly 50 years.  However, for this rule/law to stay in effect, the size of the process nodes needed to continue to shrink.  In fact, today’s generation of MOSFET includes 5 nm nodes (“nm,” or nanometer, is one-billionth of a meter), which is 1/2,000th the size of the first MOSFET nodes.  Ironically, as these scales continue to shrink they approach the “quantum scale,” where the electrons being used in the processors begin to exhibit quantum behaviors, notably quantum tunneling, that reduce their effectiveness at processing in traditional digital devices.

While Moore’s Law has been amazingly prescient and consistent for these many decades, there is a practical minimum size below which transistors cannot be efficiently utilized, largely because of these scale/quantum limitations.  While the 5 nm process is the current working minimum for semiconductors, and 3 nm and even 2 nm transistor scales are in development, some end appears to be in sight, likely due to the quantum tunneling challenge at such scales.  The graphic below[1] shows the uncanny straight line (dark blue) of transistor scale.  However, the light blue and brown lines show some recent plateauing of maximum clock speed and thermal power utilization, indicating declining efficiency as scale reduces.

Analog vs Digital vs Quantum

Readers who lived through the 90s are likely familiar with the transition from “analogue” to “digital”.  This manifested most notably in the music industry, with the replacement of analogue phonograph records by digital discs and streamed digitized music. I won’t get into the audiophile arguments about which sound was purer, but I highlight this item to emphasize the “digitization” of things during our lifetime.

In the prior blog post I noted that computers use digital gates to process logic (i.e., AND, NOT and OR gates).   However, each of these gates can be performed by analogue methods and can even be simulated using billiard balls, as proposed in 1982 by Edward Fredkin and Tommaso Toffoli.  While this is a highly theoretical construct that assumes no friction and perfect elasticity between balls, I point it out because it shows that although current digital computation is amazing, efficient and powerful, it is just a sophisticated extension of basic analog (i.e., particle) movements.  Let me briefly walk you through one example to emphasize this point.

Picture two billiard balls entering a specially constructed wooden box. When a single billiard ball arrives at the gate through an input (0-in or 1-in), it passes through the device unobstructed and exits via 0-out or 1-out. However, if a 0-in billiard ball arrives simultaneously as a 1-in billiard ball, they collide with each other in the upper-left-hand corner of the device and redirect each other to collide again in the lower-right-hand corner of the device forcing one ball to exit via 1-out and the other ball to exit via the lower AND-output. Thus, the presence of a ball being emitted from the AND-output is logically consistent with the output of an AND gate that takes the presence of a ball at 0-in and 1-in as inputs.[2]

Similar physical gates and billiard balls could be constructed to replicate the OR and NOT gates.  As you may recall from the prior blog, all Boolean logic operators can be created using combinations of these three gates, so a theoretical computer constructed entirely of wood and billiard balls could replicate the results of any existing computer.
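As a small illustration of how the three core gates compose (in Python rather than billiard balls; the helper names are my own), XOR, a gate not in the core set, can be built entirely from AND, OR and NOT:

```python
def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

# XOR ("exclusive or"), assembled only from the three core gates:
def XOR(a, b): return AND(OR(a, b), NOT(AND(a, b)))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", XOR(a, b))  # matches a ^ b for every input
```

The same stacking trick extends to adders, multipliers and, ultimately, every circuit in a classical computer.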

Admittedly, this is a theoretical construct, but I cite it to point out that while our current digital computers are amazingly powerful and fast and have led to countless advances and improvements in our daily lives, today’s digital computers are, at their essence, somewhat simplistic.  “Digitization” vastly improves speed and the ability to stack gates for interoperability, enabling increasingly complex processes to be tackled, but there are certain limits to these capabilities (I will cover some specifics on speed-up and complexity in subsequent posts).

Quantum Computers, and the gates possible using qubits, are a very different animal. The underlying mechanics and processes cannot be replicated using standard analogue materials because they operate using different laws of physics. Therefore, it is not really appropriate to compare the performance of a Quantum Computer with that of a digital computer, to suggest the quantum version is more powerful or faster – it is an “apples to oranges” comparison. Stated another way, it would be like saying a light-emitting diode (LED) is a more powerful candle. It is, in fact, an entirely different form of creating light and comparisons between the two are therefore not useful.

In summary, mankind has been using different forms of “computing devices” for thousands of years, and Quantum Computers are in some ways a natural extension of computing progress.  However, different laws of physics are involved, and therefore Quantum Computers are in a new category of computing devices that have the potential to create new approaches to problems and novel solutions.

In the next few posts I will dig in on where this new computing approach will provide the most benefit, and how “Superposition” and “Entanglement” are used to massively increase the computing power of Quantum Computers.

[1] The Economist Technology Quarterly, published March 12, 2016

[2] Wikipedia contributors. (2021, May 4). Billiard-ball computer. In Wikipedia, The Free Encyclopedia. Retrieved 15:51, October 25, 2021, from https://en.wikipedia.org/w/index.php?title=Billiard-ball_computer&oldid=1021387675

Why is Quantum Computing so Exciting and How Can It Be So Powerful?

If you are reading this blog, you are likely already familiar with some of the essential features of Quantum Computing, like superposition and entanglement.  Even if you are not, I will cover those details in future posts.  For now, I want to begin without any physics or linear algebra and instead give you some layman’s observations about Quantum Computing, to establish a fundamental understanding of the potential it holds.  For those of you who are more familiar with some of these underlying principles, please allow me to take some liberties as I describe them.  I am more interested at this point in conveying a general sense of why quantum computing has so much more power than classical computing (at least for certain problems), so I will generalize some things in ways that may not be explicitly or literally correct.

If you are early in your quantum journey, you are likely finding it to be a lot like buying a 1,000 piece, two-sided puzzle without edges and dumping the pieces on the table.

There is a jumble of new phrases and concepts, but for now let’s dispense with the peculiarities of why Schrödinger’s cat can be both alive and dead, why Heisenberg talks about uncertainty, and why Einstein famously said that “God does not play dice with the universe.”  And, with powerful deference to Richard Feynman, the modern godfather of quantum theory, you don’t actually need to understand the specifics of quantum mechanics to grasp the enormous potential of Quantum Computing.

I think we all appreciate the awesome power of the microchip and the advances in technology (and creature comforts it has provided) without understanding the actual engineering of the silicon chips or transistors or integrated circuits that underlie current computers. 

So if you’ll indulge me for a bit here, let me try and convey the potential of the powerful new computing methodology of quantum, without reference to superposition or Josephson Junctions or coherence or error correction (I’ll cover those in subsequent posts).  Let’s start with some fundamental features of classical computer “bits” versus quantum computer “qubits”.  As you may know, classical computers use “bits” to establish a binary yes/no or on/off state which is then interpreted in algorithms to perform calculations.  These bits are combined into bytes to represent letters and numbers, among other things.  Let’s picture a bit as a one-dimensional creature (this takes some liberties, since a one-dimensional object is a line, but bear with me here).  The first fundamental difference between classical computing and quantum computing is that the qubit is a three-dimensional structure.  The following graphic showcases this difference[1]:

[1] Image from “The Need, Promise, and reality of Quantum Computing” by Jason Roell, published on Medium 2/1/18.

As the graphic shows, the Classical Bit is in one of two states (here shown as 0 or 1).  For the Qubit, if “up” is considered “0” and “down” is considered “1”, then until it is measured it can be pointing in any direction on the sphere, so think of that as holding lots of additional information besides just up or down. To think about it another way, a Qubit has more dimensions than a Bit, so it can process more info per step and therefore speed up the processing.

So, the first major difference between classical and quantum computing lies in having three dimensions for the underlying data bits vs just one dimension for classical computers.   This by itself suggests enormous added potential of quantum computing.

Next, let’s dig a bit further into how these bits/qubits are processed. Computers use the information contained in the bits to compute according to their program, or computer code. So, an original input is entered into a processor, and it spits out an output based on some rules contained in the processor. Each step or operation is generally referred to as a “gate,” or basic computing rule. It turns out there are three gates from which ALL current classical computing processes derive their output. While not essential to this discussion, these three gates are “AND”, “OR” and “NOT”. Stacking and using combinations of these rules, or gates, can perform every Boolean logic operation that exists in classical computers [2]. Generally, you should take this to mean that ALL classical computing can currently be done with only 3 operators, which conversely suggests that the universe of possible computational abilities is somewhat limited to only three rules. Consider it like a chess pawn, which can only move to one of three squares when being played. For individual quantum “gates,” again without getting into the details, know that there are at least 6 (not that it is needed for this train of thought, but they include the three Pauli X, Y and Z gates, the Hadamard gate, the SWAP gate and the CNOT, or Controlled-NOT, gate). There may be others, and entanglement allows gates to be conjoined, if you will, but let’s stick with 6 for this discussion. So, coming back to the chess analogy, this is like the movement potential of the queen; if we consider a contest between a classical computer, which only has pawns as its pieces, and a quantum computer, which has all queens, it is clear which would win the chess match.

[2] Technically, either the NOR (Not-OR) or NAND (Not-AND) gate is considered “universal,” meaning either one can be used to reproduce the functions of all the other gates; so, technically, current digital computing logic can be done with ONE gate. But the analysis herein, assuming the three core gates, still holds vis-à-vis the greater number of core gates applicable to quantum computing. For further details, see the Wikipedia article on logic gates.
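The footnote’s universality claim is easy to verify in a few lines of Python (helper names are my own): every one of the three core gates can be rebuilt from NAND alone.

```python
def NAND(a, b): return 1 - (a & b)

# Every core gate rebuilt from NAND alone:
def NOT(a):    return NAND(a, a)
def AND(a, b): return NOT(NAND(a, b))
def OR(a, b):  return NAND(NOT(a), NOT(b))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "-> NOT:", NOT(a), " AND:", AND(a, b), " OR:", OR(a, b))
```

Checking the printed truth tables against the standard definitions confirms that one gate really does suffice for all of classical Boolean logic.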

Hence, the second major difference between classical and quantum computing is a result of the essential gates that can be utilized to perform operations.  Classical uses three and quantum uses six. Think about this as enabling a Qubit to do more per process, which translates into being able to process things faster.

The final major difference between classical computers and quantum computers is rooted in computing-logic “efficiency”. We are all familiar with the physical noise our current computers emit, which is the sound of the fan. We also know how hot a computer can get with usage. This is because classical processing “loses” information whenever a calculation is performed, and that lost information is dissipated as heat. Classical computer gates are one-directional: because energy is expended by the gate and information is lost, if you perform a calculation with a classical gate, you cannot run the output backward through the gate to recover the original input. Quantum gates, however, are bi-directional, meaning that any output can be run back through the gate to reveal the original input.
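The contrast can be sketched with classical bits (illustrative Python, my own function names): the quantum CNOT gate, acting on definite 0/1 inputs, is its own inverse, while AND destroys information.

```python
def cnot(control: int, target: int):
    """CNOT on definite basis states: flips the target iff the control is 1.
    Unlike AND, it is reversible -- applying it twice restores the input."""
    return control, target ^ control

for state in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    once = cnot(*state)
    twice = cnot(*once)
    assert twice == state  # CNOT is its own inverse: no information is lost

# AND is not reversible: an output of 0 could have come from three different inputs
inputs_giving_zero = [(a, b) for a in (0, 1) for b in (0, 1) if a & b == 0]
print(inputs_giving_zero)  # → [(0, 0), (0, 1), (1, 0)]
```

Seeing the AND output 0 tells you nothing about which of the three inputs produced it; that erased distinction is exactly the information a one-directional gate loses (and radiates as heat), while the reversible CNOT preserves it.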

The final major difference between classical and quantum computing is that classical computing operators are one-directional and not reversible, while quantum operators are bi-directional. This bi-directionality suggests more processing information per operation, which also translates into being able to operate faster.

In summary we have the following:

                        Classical Computer      Quantum Computer
Bit Dimensions          1                       3
Core Operators          3                       6
Logic Direction         One-directional         Bi-directional

Think about these differences as enabling a Quantum Computer to do more per step, which is another way of saying it is able to process things faster than a classical computer. As it turns out, this speed advantage is phenomenal, which is why there is such enormous potential for Quantum Computers. In future posts, I will cover how this phenomenal speed advantage manifests and why it will allow us to solve problems currently unsolvable by the most powerful supercomputers.

I hope this helps convey the potential of Quantum Computers over classical computers, without any details around superposition or entanglement or the underlying non-intuitive features of quantum mechanics. Naturally, a greater knowledge of those concepts is fundamental to the actual inner workings of a Quantum Computer, and I will get into some of that in future posts.

A little about me and this website

I remember being intrigued by “computers” as a high school student in the late 70’s. Personal Computers (PC’s) were not yet available, but I bought a Sinclair ZX80 at Radio Shack in 1980 and tried to teach myself how to program it. I was awed by the potential, and fascinated by the details, and wrote a few simple bits of code.

In college I took some basic “business” computing (FORTRAN and COBOL) but wasn’t particularly drawn to programming and those languages were kludgy and certainly were not user friendly (I remember spending hours searching for missing punctuation marks…but I digress).

During my senior year in college I convinced my father to get me my first computer so I could write my senior thesis; it was a great tool for basic text editing but couldn’t readily do much more. By the time I was graduating from college (1985), PCs were beginning to become more accessible, although they were still not very user-friendly. This was before GUI (graphical user interface) or WYSIWYG (what-you-see-is-what-you-get) functionality, so it took lots of trial and error just to print out a page. Neither my Apple IIc nor my computer at my first job (investment banking in 1987) even had a hard drive. At that first job, I put the 5 1/4″ Lotus 1-2-3 version 1.0 disc in the A-drive and a blank disc in the B-drive, where the spreadsheet files were saved.

Fast forward just a few years, and there were AOL discs available EVERYWHERE. I was an early adopter of email and general on-line access (over a 1,200 baud modem for much of the early period), so I was participating in the computing revolution, although more as a consumer than a creator.

However, despite being an “investment professional” I missed the opportunities to buy Microsoft or Google or Facebook. I have often regretted not doing more with my Sinclair ZX80 or my Apple IIc or my first few “IBM clones” so that I could appreciate the potential of personal computing and the Internet, which presumably would’ve helped me become an early investor. Too bad because if I had invested $5,000 into Microsoft in 1986, it would be worth $10.5m today and would be yielding $150,000 per year in dividends!

In addition to that general proximity to the computer/Internet wave, I have always been fascinated by theoretical physics. I was intrigued by relativity and quantum theory and read dozens of books on those subjects. (It helped that when in graduate school at the University of Chicago, the student across the hall from my dorm room was getting his doctorate in theoretical physics and indulged me on countless evenings, explaining yet again how length and time shrank as speed increased…) I read nearly everything written by Stephen Hawking and Brian Greene, fascinated by astrophysics, string theory and quantum dynamics.

As I began to see my theoretical physics and computing science interests merging…and started to learn about Quantum Computing, I pledged to myself not to miss out on the investment opportunities this time. So I have been on a personal journey to satisfy my cravings for learning about quantum theory and its applications to computing, while at the same time focusing on where the commercial opportunities may be.

Website Rules of Engagement

I intend to start out with some broad posts about the details underlying Quantum Computers, the immense potential they hold and some advances being made. Eventually I aim to focus more on current events, companies and breakthroughs, with an aim to helping find investment opportunities. I welcome constructive feedback and engagement. Understanding quantum theory or how Quantum Computers work is immensely difficult and challenging, so if you have evolved some proficiency and aptitude, it’s okay to pound your chest a bit. But everyone starts from the beginning, and it is generally a long, non-linear journey, so I ask anyone reading or reacting to posts to do so with some humility. There are no dumb questions and no taboo topics, so please be respectful and constructive in commenting. If I, or someone responding, make a mistake, it’s okay to point that out if it helps the broader analysis, but it’s less helpful if it’s just to prove a point or convey superior knowledge for the sake of it. My general hope is that most concepts described in this blog are readily understood by laymen and deep practitioners alike and that we can all engage in spirited discussions that help expand our collective understanding and learnings in this rapidly evolving field.

This website and these blog posts reflect some of these learnings. I’ve had success in my career in synthesizing very difficult topics into “layman” terms, so I aim to do that with Quantum via these posts. I hope they are helpful and informative and welcome feedback and discussions. Thanks for visiting and I look forward to taking this exciting “quantum leap” together.

Russ Fein, August 2021