How to Invest in Quantum Computing

If you have been following these posts, or have been learning elsewhere about the power and potential of Quantum Computing, you may be wondering how to invest in this emerging opportunity.  Unfortunately, there are not many ways for individual investors to participate, although that is an evolving situation.  I will cover some of the ways you can make direct investments, some options for indirect investments, and a few situations where publicly traded securities should be available later this year.  For this post, I will not be expressing any investment opinion; rather, I want to showcase the various avenues for making investments today (or in the near future).  In future posts, I will cover the investment strengths and weaknesses of some of the companies noted below.

Quantum Focused Public Companies

IonQ: Today, there is only one significant pure-play Quantum Computing company that is publicly traded, and that is IonQ ($IONQ), the College Park, MD-based firm founded in 2015.  The company was launched with seed funding from New Enterprise Associates, a pre-eminent venture investor, and a license to core technology from Duke University and the University of Maryland.  IonQ has built working ion-trap-based Quantum Computers, which can be accessed directly or through cloud partnerships with Microsoft, Amazon and Google.  In October 2021, IonQ began trading on the NYSE, and as of 2/11/22 it had a market capitalization of $3 billion.  The stock has had some recent gyrations and will likely be dragged down a bit near-term as other players go public (see Rigetti and D-Wave below) and investors re-allocate some of their QC exposure from IONQ to those other firms, but it is an essential component of any long-term QC portfolio.

Rigetti: While Rigetti Computing is not quite public, it has signed a definitive agreement to merge with a SPAC called Supernova Partners Acquisition Company II ($SNII), which values Rigetti's equity at approximately $1.5 billion and will provide over $450 million in cash proceeds to Rigetti.  Rigetti is another full-stack Quantum Computing provider, but it uses superconducting circuits for its qubits.  While the closing date has not been announced, a shareholder vote is scheduled for February 28, 2022, and the merger should be completed shortly thereafter.  Investors hoping to get in early can buy SNII today, or watch for it to trade post-merger, at which point its symbol will be RGTI.

D-Wave:  Similar to Rigetti, D-Wave has signed an agreement to merge with a SPAC, this one called DPCM Capital ($XPOA).  In this transaction, D-Wave's equity is valued at $1.2 billion, and the deal will provide $300 million in cash proceeds.  D-Wave is a different type of Quantum Computing company in that it offers quantum annealing as opposed to gate-based computation.  While annealing is less powerful than gate-based systems, it is easier to operate and scale, and D-Wave has 25 customers from among the Forbes Global 2000, making it one of the few current Quantum Computing companies with meaningful revenues.  Investors hoping to get in early can buy XPOA today or watch for it to trade post-merger as QBTS.

Quantinuum: In June of 2021, after a series of successful collaborations, Cambridge Quantum Computing (CQC) reached an agreement to be acquired by Honeywell.  Honeywell merged CQC with its Honeywell Quantum Solutions (HQS) division and, in November of 2021, spun out the combined businesses into a new stand-alone company called “Quantinuum,” which is owned 54% by Honeywell and 46% by Cambridge Quantum shareholders.  Separately, Honeywell invested $300 million in Quantinuum.  The new company combines CQC’s software and algorithm expertise with HQS’s hardware expertise, creating the largest dedicated full-stack quantum computing company.  Company executives have been quoted as confirming a targeted 2022 IPO, although there has been no official company announcement.  See here for a prior post showcasing Quantinuum.

PsiQuantum:  While currently private and without any publicized plans to go public, PsiQuantum is the most heavily venture-funded QC company in the US.  To date it has raised nearly $750 million, most recently at a $3.15 billion post-money valuation.  While the company is not in immediate need of liquidity, nor has it announced a desire to go public, the broad investor base and recent completion of a Series D round hint at an IPO some time in the not-too-distant future.

Among the five companies noted in this section, only one ($IONQ) is a currently traded pure-play quantum investment.  Two have committed to going public via SPAC some time this year, one has announced plans to go public but has not taken formal steps, and the fifth is not necessarily going public this year but is worth watching for an IPO announcement in the future.  Interestingly, should all five become public, they would represent a broad bet on qubit construction (a mix of superconducting (Rigetti), ion traps (IonQ and Quantinuum) and photonics (PsiQuantum)), enabling diversification among these leading QC hardware strategies.

Exchange Traded Funds/Mutual Funds

In addition to these five pure-play companies, there are professionally managed, publicly traded funds with a focus on Quantum Computing and/or advanced computing.  Many of these have portfolios with considerable overlap, so the best strategy here would be to select one of these funds as your “advanced computing” vehicle to provide diversified exposure to QC.

Defiance Quantum ETF:  Defiance Quantum ($QTUM) is an exchange-traded fund with a portfolio of investments in advanced technology companies that operate in Quantum Computing as well as artificial intelligence and machine learning.  While not purely “quantum,” the companies in its portfolio should all benefit from increasing commercialization of Quantum Computing.  The fund trades at or near its net asset value; in other words, it is a relatively efficient way to own a diversified portfolio of about 70 companies. Holdings of QTUM include Teradata, Lockheed Martin, Airbus, HP, IBM, IonQ and others.

Fidelity Select Technology Portfolio ($SFPTX): This non-diversified fund invests primarily in equity securities, especially common stocks of companies that are engaged in offering, using, or developing products, processes, or services that will provide or will benefit significantly from technological advances and improvements. Some of the fund’s top quantum holdings include Google, Nvidia, Microsoft and Micron Technology.

Fidelity Select Software & IT Services Portfolio ($FSCSX): This non-diversified fund invests a majority of its assets in common stocks of companies engaged in research, design, production or distribution of products or processes that relate to software or information-based services.  Some of the fund’s top quantum computing holdings are Microsoft, Google and International Business Machines.

T. Rowe Price Global Technology Fund ($PRGTX): aims for long-term capital growth. This non-diversified fund invests most assets in the common stocks of companies that will generate a majority of revenues from the development, advancement and use of technology. Some of the fund’s top quantum computing positions are Alibaba, Advanced Micro Devices, Micron Technology and NXP Semiconductors.

Franklin DynaTech Fund Class A ($FKDNX): The fund invests primarily in common stocks with a focus on companies that are leaders in innovation, take advantage of new technologies, have superior management, and benefit from new industry conditions. Some of the fund’s top quantum computing investments are Google, Nvidia, Microsoft and Alibaba.

Technology Select Sector SPDR Fund ($XLK): Seeks to provide exposure to companies from technology hardware, storage, and peripherals; software; communications equipment; semiconductors and semiconductor equipment; IT services; and electronic equipment, instruments and components.  Top holdings include Apple, Microsoft, NVIDIA, Broadcom and Cisco.

Public Companies with Quantum Initiatives

None of the following publicly traded companies are pure-play quantum investments, but each has major Quantum Computing initiatives and a varying level of reliance on successful penetration of the QC market.

International Business Machines ($IBM): As a leading legacy company focused on computing hardware, IBM seems like a natural company to lead QC efforts.  In fact, it has created the IBM Q Experience, which gives more than 100 customers cloud-based access to IBM’s quantum resources.  In addition, IBM has developed Qiskit, one of the more popular open-source quantum SDKs (software development kits).   Its latest 127-qubit Eagle quantum processor is one of the more robust QCs available, and it is being utilized by major firms including Goldman Sachs, Samsung, JPMorgan Chase, ExxonMobil, and Boeing, among others.  IBM features its quantum initiatives prominently in its corporate materials, so I expect QC to be an ever-increasing part of its value.

Microsoft ($MSFT):   As a leading software company, it makes sense that MSFT would be working on quantum software.  Specifically, it has developed a widely used quantum programming language called Q# (pronounced “Q sharp”) with an accompanying SDK, and it offers access to the quantum hardware systems of Honeywell, IonQ and QCI via its Azure Quantum cloud platform.  And, via its M12 corporate venture arm, Microsoft is an investor in PsiQuantum.   By remaining fairly agnostic to the quantum hardware used, and by developing an open-source SDK, MSFT is well positioned to benefit from the growing usage of, and need for access to, QCs regardless of which hardware technologies ultimately gain the most traction.  However, despite Microsoft’s clear commitment to Quantum Computing through its Azure Quantum platform and Q# SDK, its latest 10-K annual report (as of June 30, 2021) contains no mention of “quantum” or “Q#,” so it may be difficult in the near term for MSFT’s quantum efforts to move its equity value.

Honeywell International ($HON): As noted above, Honeywell spun its Honeywell Quantum Solutions (HQS) division out into Quantinuum, which is expected to go public.  However, until that IPO occurs, it is possible to obtain QC exposure via a direct investment in Honeywell.  Even once Quantinuum goes public, Honeywell is expected to retain significant ownership in Quantinuum, so acquiring shares of HON now is an early way to get in on Quantinuum's possible upside.

Alphabet ($GOOG, $GOOGL): Alphabet/Google has been a major quantum headline grabber over the past couple of years, especially after it published the breakthrough paper in Nature describing how its Sycamore quantum processor was the first QC able to achieve “quantum supremacy.”  In addition to the Sycamore claims, Google maintains a robust quantum offering, including its Cirq SDK, cloud-based QC access and various libraries of quantum resources and algorithms.  However, like other large companies included in this section, Alphabet is a huge, diversified conglomerate, so the relative contribution of QC to the broader Alphabet valuation is likely modest.

Intel ($INTC): Intel has been a leading player in computing hardware since it was founded by Gordon Moore and Robert Noyce in 1968, so it is another corporate candidate for meaningful quantum exposure.  Additionally, as “Moore’s Law” begins to bump up against physical constraints, Quantum Computing seems like a natural extension of its technology in order to continue producing ever more powerful computing chips.  In fact, in December 2019 Intel announced Horse Ridge, a cryogenic control chip designed to speed the development of full-stack QC systems.  Intel is hoping to leverage this chip, along with its legacy expertise in interconnect technologies, to become a major player in the QC realm.

Amazon ($AMZN): Similar to Microsoft, Amazon has a broad cloud-based quantum platform within its Amazon Web Services (AWS) offering, known as Braket.  It provides access to systems from D-Wave, Rigetti and IonQ.  They also have an AWS Center for Quantum Computing in partnership with the California Institute of Technology among others.  However, Amazon is a massive business with many interests and “quantum” is not often featured in its corporate description materials nor was it mentioned in their 2020 annual report, so its overall equity exposure to QC may not be very significant.

Nvidia ($NVDA): Founded in 1993 with a focus on advanced gaming, Nvidia’s GPUs (graphics processing units) are now also being utilized for deep learning, parallel processing and artificial intelligence, making the company an important player in advanced computing.  Its newly announced cuQuantum SDK enables large quantum circuits to be simulated dramatically faster, allowing quantum researchers to study a broader space of algorithms and applications. Developers can simulate areas such as near-term variational quantum algorithms for molecules and error-correction algorithms for fault tolerance, as well as accelerate popular quantum simulators from Google and IBM.  Given its success in becoming a significant player in advanced computing generally, it seems likely Nvidia will succeed in leveraging these assets for Quantum Computing.  Currently, “quantum” is a very modest focus within Nvidia’s press or shareholder reports, so it is unlikely to have a major near-term impact on its stock value, but it may be worth taking a modest, long-term position.

Summary

For those of you anxious to invest in the evolving Quantum Computing industry, there are a few publicly available options.  Some will provide a direct, pure-play investment, while others should enjoy enhanced returns based on their QC exposure.    The following table summarizes the public company investments (and stock symbols) that would provide decent portfolio exposure to Quantum Computing upside:

Those seeking meaningful investment exposure to QC should certainly maintain positions in IONQ, RGTI and QBTS, and likely at least one of the funds noted.  For added exposure to a broader advanced computing portfolio that also adds QC exposure, you may consider adding some or all of MSFT, IBM, HON, GOOG, GOOGL, AMZN, INTC and/or NVDA.

Disclosure: I maintain personal long positions in IONQ, SNII, QTUM and XLK, but do not have any business relationship with any company mentioned in this post.  I wrote this article myself and express it as my own opinion.


References:

Nvidia Press Release, “Introducing cuQuantum: Accelerating State Vector and Tensor Network-based Quantum Circuit Simulation,” November 2021.

Zacks Equity Research, 4 Funds to Shine as Quantum Computing Comes Into Play, July 8, 2021

Intel Corporation Press Release, “Intel Introduces ‘Horse Ridge’ to Enable Commercially Viable Quantum Computers,” December 9, 2019.

Taulli, Tom, InvestorPlace, “These 7 Quantum Computing Stocks Are Futuristic Buys,” June 15, 2020.

Gecgil, Tezcan, InvestorPlace, “The 7 Best Quantum Computing Stocks to Buy for February 2022,” February 4, 2022.

Hajjar, Alamira J., AI Multiple,  “33+ Public & Private Quantum Computing Stocks in 2022”, published May 5, 2021 and updated Jan 11, 2022.

If you enjoyed this post, please visit my website and enter your email to receive future posts and updates: http://quantumtech.blog

Russ Fein is a venture investor with deep interests in Quantum Computing (QC).  For more of his thoughts about QC please visit the link to the left.  For more information about his firm, please visit Corporate Fuel.  Russ can be reached at russ@quantumtech.blog.

Quantum Advantage is Closer than you Think

I recently had the pleasure of speaking with Anisha Musti, a delightful and empowering 16-year-old CEO and Co-Founder of Q-munity, a 501(c)(3) non-profit that is introducing and teaching young individuals about Quantum Computing (I encourage you to check out Anisha and her project(s) at the Q-Munity website).  She hopes to expose her peers to QC so that they will consider careers in the field, or “if they learn about it from us but choose not to pursue it, at least they will be making an educated assessment.”  Anisha’s poise and wisdom belie her age.

The fact that a 16-year-old, along with a few of her friends and fellow students, has established a robust and constructive free resource is one of the topics I highlight below.  But I am starting this post with this conversation because it was an interesting multi-generational dialogue.  She asked a bit about my QC journey, and I began explaining my first computer courses in college (COBOL and FORTRAN), where we used “punch cards” to store and retrieve commands.  The conversation went along generally as follows:

Me: I started in computers when we still used punch cards to record the commands.

Anisha: Huh?

Me: You know, that was before we even had floppy discs.

Anisha: I have no idea what a floppy disc is.

Which certainly made me chuckle.  I reflected on the amazing advances we’ve seen in just my lifetime.  Born in the 60’s, I entered college before personal computers (or GPS or cell phones or the Internet, etc.) and have witnessed amazing technological progress ever since.  Sometimes, when I consider the power contained in my iPhone, I am awed by it and it feels like we can’t possibly need any more technological advances…I can do almost anything, virtually instantly, in the palm of my hand.

But time and technology invariably move forward.  And in fact, we appear to be on the cusp of even more profound technological capabilities in the form of working, powerful Quantum Computers.  Using the growth in power and capacity of some electronics over the past 20 years, the following table provides a level of growth-speed context:

You may notice that the growth rate of PC processor speed, while substantial at 4.3x, is a tiny fraction of the rate of growth in cellular data speeds.  This is a nuance of these sorts of growth rates, which are more explosive earlier in the life cycle but eventually slow as physical limits become more difficult to overcome.  There is also a relative utility factor, in that PCs created in 2002 were pretty good at basic office tasks (email, word documents, spreadsheets, etc.), so the utility of speed increases was less valuable.  Compare that to gaming consoles.  While the graphics of Grand Theft Auto: Vice City (the #2 videogame of 2002) may have made your mother cringe, they are a far cry from the realism experienced in today’s FIFA 22.  In other words, the consumer utility of increased speed and capacity still follows a steep demand curve for certain technologies, especially those with substantial headroom for progress.

Given the utility of improved Quantum Computing, it is my opinion that the rate of growth will continue to accelerate at a phenomenal rate.  We are already seeing 10x/year increases in quantum volume (albeit over a short window of time) and I expect that pace to remain or accelerate in the near term, as I’ll explain below.  While there has been much written on this topic, and many billions of dollars invested, many still speak of a “quantum winter” where the hype overshoots the reality.  Readers of my posts know that I am mindful not to contribute to the hype, but I truly believe that useful, practical Quantum Computing applications are imminent (i.e., by the end of this decade or sooner).  Let me explain a few reasons why.

  1. The Quantum Evolution is Quite Mature

In 1879, electricity was first harnessed for home use to power Edison’s electric light bulbs.  During the period of 1920-1935 the US went on an electrification campaign bringing power to 70% of US homes.  So, in about 50 years, a profound new technology became ubiquitous.  Nobody could have imagined the impact electricity would have on daily life in those early years.  Yet today we take for granted that we can plug a cord into any wall in our home and have instant, nearly free power.  Personal Computers and the Internet have had similar, profound impacts on our daily lives, generally over shorter and shorter spans of time.

Quantum Computing has the potential to be a next profound disruptor.  Many authors, including me, have covered the power and potential of QC, so that is not the focus of this post.  Rather, the concept to keep in mind, is that while “Quantum Computing” is relatively new, the utilization of quantum physics/mechanics has been progressing for the past 130 years.  We have had great success utilizing the dual wave-particle nature of electrons and photons for a variety of purposes including MRI’s, lasers and GPS (which I covered in a prior post entitled “Quantum Quantum Everywhere”), among many others.  As that prior post noted, today we are already using quantum mechanics in Quantum Sensing for precise measurement probes (even where GPS is unavailable), ghost imaging and quantum illumination.  It is also being used today for certain applications of Quantum Communication.  And yes, while current Quantum Computers are not as powerful as we’d like, there are dozens of companies offering access to their working Quantum Computers today, with the power of the machines increasing quite rapidly.  While it is difficult to get consensus over exactly when the QC’s will become powerful enough to surpass classical computers for real-world problems, nearly everyone in the field will confirm it is just a matter of “when” not a matter of “if”.

  2. Cutting-Edge Quantum Processors are Available in the Cloud

As noted above and in a prior post, there are a variety of QC companies offering their latest QCs via cloud-based access.  This is important because it “socializes” access to QCs.  Today, anybody with some basic computing chops can access actual, working QCs for modest or, in some cases, no cost.  Quantum algorithms are being written and run every day.  Furthermore, because many QC makers are providing their latest QCs via the Cloud, commercial users do not have to deal with a large up-front CapEx (capital expenditure) cost, nor do they have to worry about obsolescence.  When mainframe computers became available to commercial users in the middle of the 20th century, they were extremely expensive, difficult to operate, and subject to being outdated relatively quickly.  The same was generally true of desktop computers, which often were made obsolete by advancing software well before they stopped “working.”

By utilizing QCs over the cloud, this cycle of CapEx → Obsolescence → CapEx can be eliminated, which should spur greater utilization and adoption of QC than might otherwise occur.

  3. Open Source is the Default

I mean this in a broader sense than you might expect.  On the one hand, most of the existing QDKs (quantum development kits) are both open source (i.e., free to use) and cross-platform compatible.  What this means from a practical perspective is that the learning curve for QC proficiency is much less steep, because whatever skills are acquired can be used across many different platforms.  In addition, someone who creates a QC algorithm to access via a cloud provider such as Amazon’s Braket or Microsoft’s Azure Quantum can have the same algorithm run over a variety of QC hardware provider platforms, as the short sketch below illustrates.  Contrast this with early PC access, where PCs did not speak to Macs or Linux boxes.  In addition, they required competing software, input devices and, in many cases, physical plugs.  All of that “confusion” made it difficult for the industry to scale at the pace it might have if all power users spoke the same language and used fully compatible hardware.
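For example, here is a minimal sketch using the Amazon Braket Python SDK of how the same circuit object can be pointed at different hardware back-ends (the device ARNs shown are illustrative placeholders, and Azure Quantum offers an analogous pattern):

```python
from braket.circuits import Circuit
from braket.aws import AwsDevice
from braket.devices import LocalSimulator

# One circuit definition: a two-qubit Bell state.
bell = Circuit().h(0).cnot(0, 1)

# Debug it for free on the local simulator...
print(LocalSimulator().run(bell, shots=100).result().measurement_counts)

# ...then run the identical circuit on different hardware providers
# simply by swapping the device ARN (ARNs below are illustrative).
ionq = AwsDevice("arn:aws:braket:::device/qpu/ionq/ionQdevice")
rigetti = AwsDevice("arn:aws:braket:::device/qpu/rigetti/Aspen-11")
task = rigetti.run(bell, shots=1000)  # same code path, different qubits
```

The point is not the specific syntax but the portability: the skills and the circuit carry over when the underlying hardware changes.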

Even more profound and telling in the current QC environment is the “open” nature of so many of the participants.  Access to the programs offered by Anisha’s Q-Munity, noted in the opening paragraph, is free.  Many authors have published complete textbooks on Quantum Computing for free (Thomas G. Wong’s Introduction to Classical and Quantum Computing and Brian Siegelwax’s Dungeons-n-Qubits are but two examples).  And there are innumerable first-rate online courses and programs about Quantum Computing, also free.  In addition to all the free resources, I have found that the players and participants in the industry are generally open, friendly, and eager to help folks on their quantum journeys.  This spirit of community and cooperation is refreshing, especially in an industry with such tremendous commercial potential.  Perhaps this openness will be less pervasive once the industry matures (and companies compete more vigorously for QC customers), but the essence of this post is to suggest that that point will arrive quickly, and the current state of openness certainly accelerates access to, and development of, quantum technologies.

  4. QC is Leveraging Adjacent Technologies

In addition to leveraging the historical progress in taming quantum mechanics for commercial use, recent advances in machine learning, artificial intelligence and big data are quite complementary to Quantum Computing.  Many advances and breakthroughs in these industries can be accelerated or improved by applying QC technology, so the pool of experienced advanced-computing talent is already quite large, even in the relatively early stages of QC evolution.  Similarly, we see certain quantum hardware strategies leveraging existing advances in semiconductor technology (i.e., quantum dots) and optics (photonic qubits) to create QCs.  As the hardware advances and applications continue to evolve, I expect many to also converge.

  5. Quantum Advantage is a Continuum, not a Milestone

As a refresher, while there is no definitive guide to QC definitions, “quantum supremacy” generally refers to a QC tackling a problem, even one without real-life application, that a classical computer could not solve in a feasible amount of time.  This was achieved by Google in 2019 and has been repeated by others since.   “Quantum advantage,” on the other hand, denotes when QCs can out-perform classical computers in actual, useful applications.  The QC world is anxiously awaiting this quantum advantage threshold without clear consensus on when it might occur.  However, as those who study quantum effects well know, things are never so binary!  It is more constructive to think about QC progress as a continuum, not a specific threshold to be achieved.

I am not the first to suggest this perspective.  In a recent Harvard Business Review podcast, host Azeem Azhar interviewed Rigetti Computing founder and CEO Chad Rigetti.  In it, Rigetti noted select instances where a QC offered a very slight performance advantage on a small part of a broader problem.  He discussed how this happened with Rigetti’s attempt to improve weather forecasting.  While this is certainly not quantum advantage, it is a real-world example, today, of QC contributing to real analysis.  Chad elaborated on some of his thoughts around “narrow” versus “broad” quantum advantage, which I found very compelling.  Specifically, he referred to “narrow advantage,” where a specific use case might benefit from QC, such as the pricing of derivatives.  Any small advantage could produce outsized financial benefits in portfolio allocation or timing of trades, and could occur well before “broad advantage” is achieved.  While financial markets are just one example, the finance industry is already computationally sophisticated and its underlying data is already in digital form, so this sort of narrow quantum advantage could be quite close.  Broader quantum advantage, where QCs can generally outperform classical computers, is more difficult and therefore further away, but I imagine we will see many steps up a spectrum of advances on the way to full quantum advantage.

  6. Calling Dr. Evil…

The final point I want to make concerns the enormous economic impact that a powerful QC will have.  The heading of this section, a tongue-in-cheek reference to the Austin Powers movies, is meant to evoke the massive commercial gains that could be made with a powerful QC.  Much has been written about using Shor’s algorithm to break current encryption protocols and about the “HNDL” (hack now, decrypt later) movement, which unfortunately is a real thing.  A bad actor or nation-state could command enormous power if it were the first to create a powerful QC.  It could break most encryption, mine remaining bitcoin and other cryptocurrency, and skim untold profits from financial systems by front-running traders, just to name a few powers.  In fact, the US and China are currently engaged in a ferocious race to develop the most powerful QC capabilities, each fearing the other’s ability to get there first and each establishing nationally supported quantum initiatives.

Naturally, I hope and expect that the “good guys” will have the most powerful QCs and will focus their power on good uses such as better medicines, more efficient car batteries and optimized logistics, among other things.  Certainly, the rewards and upside for constructive use of QCs are enormous, and smart people are busy at work protecting us from those bad actors.  The point is that the massive financial upside from access to powerful QCs will spur accelerated development.

So, can we say with any certainty what a QC timeline looks like?  Unfortunately not.  But as this post points out, the foot is on the accelerator, billions of dollars are being invested, and very smart people are working on creative solutions to the factors currently limiting the rate of progress.  For these reasons and those enumerated above, I am confident that we will begin seeing quantum advantage show up in our daily lives more and more over the next few years.

Disclosure: I have no beneficial positions in stocks discussed in this review, nor do I have any business relationship with any company mentioned in this post.  I wrote this article myself and express it as my own opinion.


References:

Azhar, Azeem, Host, “How Quantum Computing Will Change Everything (with Chad Rigetti)”, Season 6, Episode 11, Harvard Business Review, December 2021.

Intel® Microprocessor Quick Reference Guide – Year, accessed February 5, 2022

The astounding evolution of the hard drive (pcworld.com), accessed February 5, 2022             

A Brief and Abbreviated History of Gaming Storage – Techbytes (umass.edu), accessed February 5, 2022


Cloud Based Quantum Computer Access – Available Today

If you have been following this blog, hopefully you have some broad appreciation for the promise and potential of Quantum Computing (QC).  This is a rapidly evolving field and generally, the hype has been front-running the actual capabilities.  While a narrow “quantum supremacy” has been achieved by Google and others, general “quantum advantage” (where a quantum computer can out-perform a classical computer for a given real-world problem) is still out of reach (for now). 

That said, the purpose of this post is to highlight and showcase the fact that people are using actual working Quantum Computers every day.  Amazon and Microsoft each offer cloud access to several QC hardware systems, while players like Google, IBM, IonQ, Rigetti, Honeywell and others offer access to their systems via their own web-based interfaces.  I’m going to spell out some of the modes of access these firms have made available, not to be a definitive catalogue of all QCaaS (Quantum Computing as a Service) providers, but to emphasize two facts:

  1. Many people and companies are already using quantum computers to run real-world quantum algorithms.  Results are generally less robust than can be achieved on existing classical computers, but routines are being run and, occasionally, the results surpass those of classical computing.
  2. The industry is moving towards a largely open-source software environment for programming and accessing quantum processors. Many quantum hardware manufacturers are offering cloud-based access to their systems, obviating the need to purchase physical quantum hardware.  This substantially lowers the barriers to entry for companies seeking to begin exploring how QC can benefit their businesses, and “future-proofs” the investment since the providers continually upgrade the QC machines they provide via the cloud.

Working QCs are currently available to anyone (and as you’ll see below, the costs of operating QCs can be quite modest).  In fact, a recent study reviewed cloud usage of IBM’s Quantum systems over a two-year period and found over 6,000 jobs containing over 600,000 quantum circuit executions and almost 10 billion “shots” (a shot is a single execution of a quantum algorithm on a QPU, or quantum processing unit, as further described below).  IBM notes on its website that it has run over 1 trillion circuits to date, which is clearly a non-trivial amount.  And that is just IBM.

The following tables highlight some aspects of the current state-of-play in using actual quantum computers to run algorithms via cloud-based access:

Note: Other providers include Alibaba Quantum Lab (China), Alpine Quantum Technologies (Austria), and Origin Quantum (China).

[1] Quantum annealing is a different protocol from typical gate-based Quantum Computing, so qubit counts are not directly comparable.

Quantum Computing Power Available Via the Cloud Today

Before I get into details about specific methods for accessing working Quantum Computers, I want to review a few facts about the state of the industry vis-à-vis QC power.  The current environment has been referred to as “NISQ,” or noisy intermediate-scale quantum.  Generally, this means that existing quantum computers operate with a lot of noise that interferes with qubit control and coherence, and that working quantum computers have a somewhat limited number of qubits.  QC power can be increased both with the addition of more qubits and with the successful implementation of error correction.  Generally, a QC with about 50-60 working logical qubits (representing around a petabyte of processing power) should begin to achieve consistent quantum advantage.  Some expect this will require as many as 1,000 physical qubits per logical qubit to handle the error-correcting overhead, although as control and error correction improve, this number should decrease.  In any case, today’s working QCs provide tens of working qubits, not hundreds or thousands, but they are working, accessible machines nonetheless and are beginning to yield significant computing power.
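To put the “petabyte” figure in perspective, here is a rough back-of-the-envelope sketch (my own arithmetic, assuming a full state-vector representation at 16 bytes per complex amplitude; estimates vary with the representation assumed):

```python
# Memory needed to hold an n-qubit state vector on a classical machine,
# assuming 16 bytes (one double-precision complex number) per amplitude.
def state_vector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    return (2 ** n_qubits) * bytes_per_amplitude

print(state_vector_bytes(50) / 1e15)  # ~18 petabytes for 50 qubits
print(state_vector_bytes(60) / 1e18)  # ~18 exabytes for 60 qubits
```

Each additional qubit doubles the requirement, which is why a few dozen high-quality qubits is such a meaningful threshold for classical simulation.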

To emphasize how existing Quantum Computers are already showing real-world promise, Rigetti Computing recently used its 32-qubit QC to augment a portion of GSWR (Global Synthetic Weather Radar) analysis and, in select instances, was able to modestly outperform results achieved using only classical computing power.  Similar select improvements over classical computing have also been noted in certain portfolio/security valuation algorithms.  So real-world benefits are beginning to appear, even in this early NISQ environment.

What does it Cost to run Quantum Algorithms via the Cloud?

In order to provide an example of how you can begin accessing Quantum Computers and running quantum algorithms, the following describes access via Amazon Braket:

You can use an account with Amazon Braket to access the Quantum Computers provided by IonQ, Rigetti or D-Wave.  Once you construct a quantum algorithm, it is recommended that you test and debug it on a simulator, which is generally available at no cost.   When you are ready to run the algorithm on a bona fide quantum machine, there are some cost factors to keep in mind.  There are generally two pricing components when using a quantum computer or quantum processing unit (QPU) via the cloud: a “per-shot” fee and a “per-task” fee.

As you may recall from prior posts, quantum algorithms are “probabilistic,” not deterministic.  A single run does not produce one definitive result; rather, the outputs of many runs are aggregated and analyzed to determine the answer.   For this reason, algorithms are usually run many, many times (10,000 runs is a standard number).  A “shot” is a single execution of a quantum algorithm on a QPU; for example, a shot is a single pass through each stage of a complete quantum circuit on a gate-based QPU.  The per-shot price depends on the type of QPU used but is not affected by the number or type of gates used in a quantum circuit or by the number of variables used in a quantum annealing problem.

A task is a sequence of repeated shots based on the same circuit design or annealing problem. You define how many shots you want included in a task when you submit the task to Amazon Braket.  The current pricing to run algorithms via Amazon Braket are as follows:

  • D-Wave 2000Q: $0.30/task  + $0.00019/shot
  • D-Wave Advantage: $0.30/task + $0.00019/shot
  • IonQ: $0.30/task + $0.01/shot
  • Rigetti: $0.30/task + $0.00035/shot

For example, a scientist runs a quantum algorithm on the Rigetti Aspen-11 quantum computer in the AWS US West (N. California) Region. This task includes 10,000 repeated shots of the same circuit design. The cost to run this task includes a per-task charge of $0.30, plus 10,000 shots at a per-shot price of $0.00035.

So, the cost to run this algorithm:
Task charges: 1 task x $0.30 / task = $0.30
Shots charges: 10,000 shots x $0.00035 / shot = $3.50
Total charges: $3.80
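To make this concrete, here is a minimal sketch of what submitting such a task looks like with the Amazon Braket Python SDK (the example circuit, the device ARN string, and the hard-coded rates are illustrative assumptions based on the figures above, not official AWS sample code):

```python
from braket.aws import AwsDevice
from braket.circuits import Circuit

# Build a small example circuit (a two-qubit Bell state).
circuit = Circuit().h(0).cnot(0, 1)

# Select the QPU; the ARN shown is illustrative and may differ by region.
device = AwsDevice("arn:aws:braket:::device/qpu/rigetti/Aspen-11")

# Submit one task of 10,000 shots, matching the pricing example above.
task = device.run(circuit, shots=10000)
print(task.result().measurement_counts)  # blocks until the task completes

# Rough cost estimate using the per-task and per-shot rates quoted above.
estimated_cost = 0.30 + 10000 * 0.00035
print(f"Estimated cost: ${estimated_cost:.2f}")  # $3.80
```

Pointing the same code at a different provider is largely a matter of changing the device ARN, with the per-shot rate in the estimate adjusted accordingly.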

Competing quantum cloud providers have similar pricing constructs or charge a fixed amount for a certain level of access/time.  Naturally, there is no guarantee that your circuit or algorithm will produce the desired, or even useful, results, but the table stakes to begin testing QC for your business are quite modest.  These costs may increase in the future, but it is a very low bar considering the potential upside, and it is certainly less expensive (and less risky) than purchasing a dedicated Quantum Computing machine today.

Conclusion

By providing access to actual Quantum Computers via the cloud, a handful of QC hardware makers have opened QC access to virtually anyone.  With a basic working knowledge of Python (a common programming language, particularly good at connecting various components), a user can investigate many free open-source resources and QDKs (quantum development kits), begin composing quantum algorithms, and test/debug them for free on any number of cloud-based simulators.  Once ready to run on an actual, working Quantum Computer, you can sign up directly with some QC providers or via Amazon’s or Microsoft’s web platforms (or others) and have the algorithm run on an actual QC for a very modest cost.  Such access eliminates the capital and technology risks of purchasing a Quantum Computer.

This is already happening, with literally trillions of circuits run to date.  While the power of the machines currently accessible is modest relative to high-performance classical computers, real-world achievements are becoming increasingly possible.  As more users are provided with broader access to ever larger QCs, and as advances in error correction and control continue, it is only a matter of time before consistent quantum advantage is available to nearly anyone.


References:

Enos, Graham; Reagor, Matthew; Henderson, Maxwell; Young, Christina; Horton, Kyle; Birch, Mandy; and Rigetti, Chad, “Synthetic Weather Radar Using Hybrid Quantum-Classical Machine Learning,” November 30, 2021

The Quantum Insider, QCaaS write-up, accessed January 26, 2022

Dilmegani, Cem, Quantum Software Development Kits in 2022, https://research.aimultiple.com/quantum-sdk/amp/, accessed January 22, 2022

Ravi, Gokul Subramanian; Smith, Kaitlin; Gokhale, Pranav; Chong, Frederic, Quantum Computing in the Cloud: Analyzing job and machine characteristics, University of Chicago papers, November 1, 2021

Shaw, David, “Quantum Software Outlook 2022”, Fact Based Insight, January 19, 2022


A Quantum Computing Glossary

Hopefully many of you have been following this blog since it began and are familiar with the terms highlighted below.  For some of you, a refresher for reference may be helpful.  For others, this may all be very overwhelming and confusing so I hope this guide will clarify things for you.  I’ve curated this list to provide a broad set of definitions that should help frame the Quantum Computing (QC) potential, and for ease of reference as you come across terms where a definitional reminder would be helpful.  In the first post in this series, I introduced QC with the following word-cloud graphic:


While not every word in this cloud bears defining in this post, I hope many of these definitions help you in your efforts to understand and appreciate QC, and I have grouped them into silos to add context (although some may naturally apply to more than one silo).  This is not intended to be a complete list, and it’s likely that more definitions will need to be added over time, but this should provide a good grounding in the general nomenclature and principles.

Quantum Concepts

  • Entanglement: Quantum entanglement is a physical phenomenon that occurs when a group of particles are created, interact, or share proximity in such a way that the particles are “connected,” even if they are subsequently separated by large distances.  Qubits are made of things such as electrons (which spin in one of two directions) or photons (which are polarized in one of two directions), and when they become “entangled,” their spin or polarization becomes perfectly correlated.
  • Superposition: Classical computers use a binary system, meaning each processing unit, or bit, is either a “1” or a ”0” (“on” or “off”), whereas Quantum Computers use qubits, which can be either “1” or “0” (typically “spin up” or “spin down”) or both at the same time, a state referred to as being in a superposition.
  • Dirac Notation: Symbolic representation of quantum states via linear algebra, also called bra-ket notation (a one-line example appears after this list).  The bra portion represents a row vector and the ket portion represents a column vector.  While a general understanding of QC does not necessarily require familiarity with linear algebra or these notations, it is fundamental to a deeper working knowledge.
  • Quantum Supremacy: Demonstrating that a programmable quantum device can solve a problem that no classical computing device can solve in a feasible amount of time, irrespective of the usefulness of the problem.  Based on this definition, the threshold was passed in October 2019.
  • Quantum Advantage: Refers to the demonstrated and measured success in processing a real-world problem faster on a Quantum Computer than on a classical computer.  While it is generally accepted that we have achieved quantum supremacy, it is anticipated that quantum advantage is still some years away.
  • Collapse: The phenomenon that occurs upon measurement of a quantum system where the system reverts to a single observable state.  Said another way, after a qubit is put into a superposition, upon measurement it collapses to either a 1 or 0.
  • Bloch Sphere: a geometrical representation of the state space of a qubit, named after the physicist Felix Bloch.  The Bloch Sphere provides the following interpretation: the poles represent classical bits, and we use the notation |0⟩ and |1⟩. However, while these are the only possible states for the classical bit representation, quantum bits cover the whole sphere. Thus, there is much more information involved in the quantum bits, and the Bloch sphere depicts this.
  • Schrodinger’s Cat: A quantum mechanics construct or thought experiment that illustrates the paradox of superposition wherein the cat may be considered both alive and dead (until the box is opened and its status is then known for certain).  This “both alive and dead” concept often confuses early students of quantum mechanics.
  • Heisenberg Uncertainty: (also known as Heisenberg’s uncertainty principle) is any of a variety of mathematical inequalities asserting a fundamental limit to the precision with which the position and momentum of a particle can be known simultaneously.  Generally, the more precisely the position is known, the less precisely the momentum can be described, and vice versa.  This also confuses early students of quantum mechanics, who are used to classical physics where speed and position can both be determined by observation.
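For readers comfortable with a little notation, the Superposition and Dirac Notation entries above can be summarized in one standard textbook line (not specific to any vendor or hardware):

```latex
% A single qubit in bra-ket notation: a superposition of the basis
% states |0> and |1>, with complex amplitudes alpha and beta.
\[
  |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
  \qquad |\alpha|^{2} + |\beta|^{2} = 1
\]
% Measurement collapses the state to |0> with probability |alpha|^2,
% or to |1> with probability |beta|^2.
```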

Hardware/Physical Components

  • Qubit: Also known as a quantum bit, a qubit is the basic building block of a quantum computer. In addition to the conventional—binary—states of 0 or 1, it can also assume a superposition of the two values.  There are several different ways that qubits can be created with no clear candidate emerging as the definitive method.
  • Auxiliary Qubit:  Unfortunately, there is no such thing as quantum-RAM so it is difficult for QC’s to store information for extended periods of time.  An “Auxiliary Qubit” serves as a temporary memory for a quantum computer and is allocated and de-allocated as needed (also referred to as an ancilla).
  • Cryogenics: Operating at extremely cold temperatures, generally meaning below -153 Celsius, and in the case of many QC systems, far colder still (fractions of a degree above absolute zero).  Cryogenics are of particular interest for QC because at these temperatures the superconducting materials used in many qubits and their control electronics conduct electricity with essentially no resistive loss.
  • Dilution Refrigerator: Used in superconducting qubits and often with quantum dots, whereby a series of physical levels (typically 7) are sequentially chilled to the lowest level, where the qubits operate.
  • High-Performance Computer (HPC): Sometimes also referred to as a “supercomputer,” this is generally meant to represent any ultra-high-performing classical computer.  Powerful gaming PCs operate at around 3 GHz (i.e., roughly 3 billion cycles per second), while HPCs operate at quadrillions of calculations per second.  Despite this blazing speed, there are many problems that HPCs cannot solve in a reasonable amount of time but that, theoretically, could be done by a QC in a very short amount of time.
  • Quantum Annealer: In metallurgy, annealing involves heating a metal so its atoms become mobile and then cooling it slowly so that the material settles into a more ordered, lower-energy structure.  Quantum annealing works in an analogous way, where temperature is replaced by energy and the lowest energy state, the global minimum, is found via annealing.  Quantum annealing is a quantum computing method used to find the optimal solution of problems involving many candidate solutions, by taking advantage of properties specific to quantum physics (a small worked example appears after this list).   Since there are no gates, the mechanics of annealing are less daunting than full gate-based QC, although the outputs are less refined and precise than they would be under a full gate-based QC.
  • Quantum Dot: Quantum dots are effectively “artificial atoms.” They are nanocrystals of semiconductor wherein an electron-hole pair can be trapped. The nanometer size is comparable to the wavelength of light and so, just like in an atom, the electron can occupy discrete energy levels. The dots can be confined in a photonic crystal cavity, where they can be probed with laser light.
  • Quantum Sensor: Quantum sensing has a broad variety of use cases including enhanced imaging, radar and for navigation where GPS is unavailable.   Probes with highly precise measurements of time, acceleration, and changes in magnetic, electric or gravitational fields can provide precise tracking of movement.  In this case, if a starting point is known, the exact future position is also known, without the need for external GPS signals, and without the ability for an adversary to jam or interfere with the signals, so this is of particular interest to the military.  Another application of quantum sensing involves ghost imaging and quantum illumination.  Ghost imaging uses quantum properties to detect distant objects using very weak illumination beams that are difficult for the target to detect, and which can penetrate smoke and clouds.  Quantum illumination is similar and can be used in quantum radar.
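To make the “lowest energy state” idea from the Quantum Annealer entry more concrete, here is a minimal sketch using D-Wave’s open-source dimod library; the two-variable problem and its coefficients are arbitrary illustrations, and a brute-force classical solver stands in for real annealing hardware:

```python
import dimod

# A tiny two-variable problem: find the assignment of binary variables
# a and b that minimizes the total "energy".
bqm = dimod.BinaryQuadraticModel(
    {"a": -1.0, "b": -1.0},   # linear terms reward turning each variable on
    {("a", "b"): 2.0},        # quadratic term penalizes turning both on
    0.0,
    dimod.BINARY,
)

# ExactSolver enumerates every assignment classically; a quantum annealer
# would instead physically relax toward the low-energy configurations.
best = dimod.ExactSolver().sample(bqm).first
print(best.sample, best.energy)   # e.g. {'a': 1, 'b': 0} with energy -1.0
```

On an actual annealer, the same model would simply be submitted to a hardware sampler instead of the exhaustive classical solver.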

Computing Operations

  • Gate: A basic operation on quantum bits and the quantum analogue to a conventional logic gate. Unlike conventional logic gates, quantum gates are reversible. Quantum algorithms are constructed from sequences of quantum gates.
  • Hadamard Gate: The Hadamard operation acts on a single qubit and puts it into an even superposition (i.e., it turns the qubit so the poles face left and right instead of up and down).  It is a fundamental single-qubit gate used to establish superposition (see the short example after this list).
  • Fault Tolerance: Technical noise in electronics, lasers, and other components of quantum computers leads to small imperfections in every single computing operation. These small errors ultimately lead to erroneous computation results. Such errors can be countered by encoding one logical qubit redundantly into multiple physical qubits. The required number of redundant physical qubits depends on the amount of technical noise in the system. For superconducting qubits, experts expect that about 1,000 physical qubits are required to encode one logical qubit. For trapped ions, due to their lower noise levels, only a few dozen physical qubits are required. Systems in which these errors are corrected are fault tolerant.
  • Measurement: the act of observing a quantum state. This observation will yield classical information, but the measurement process will change the quantum state. For instance, if the state is in superposition, this measurement will ‘collapse’ it into a classical state of 1 or 0. Before a measurement is done, there is no way of knowing what the outcome will be.
  • NISQ: Noisy intermediate-scale quantum, a term coined by John Preskill in 2017 to depict the current state of QC, in which qubits suffer from noise and rapid decoherence.  It generally refers to devices with roughly 50 to a few hundred noisy physical qubits (the “intermediate-scale” portion of the definition): large enough to be interesting, but too small and too noisy to support the full error correction (on the order of 1,000 physical qubits per logical qubit) needed for fault-tolerant logical qubits.
  • Noise: In QC, noise is anything which impacts a qubit in an undesirable way, namely electromagnetic charges, gravity or temperature fluctuations, mechanical vibrations, voltage changes, scattered photons, etc.  Because of the precise nature of qubits, such noise is nearly impossible to prevent and requires substantial error-correction (to correct for the noise) in order to allow the qubits to perform desired calculations.
  • Quantum Algorithm: An algorithm is a collection of instructions that allows you to compute a function, for instance the square of a number. A quantum algorithm is exactly the same thing, but the instructions also allow superpositions to be made and entanglement to be created. This allows quantum algorithms to do certain things that cannot be done efficiently with regular algorithms.
  • Quantum Development Kit (QDK): A number of providers offer different types of QDK’s including some that are proprietary and others that are open source.  It generally contains the programming language for quantum computing along with various libraries, samples and tutorials.  QDK’s are available from the following companies (with their QDK name in parentheses): D-Wave (Ocean), Rigetti (Forest), IBM (Qiskit), Google (Cirq), Microsoft (Microsoft QDK), Zapata (Orquestra), 1Qbit (1Qbit SDK), Amazon (Braket), ETH Zurich (ProjectQ), Xanadu (Strawberry Fields) and Riverlane (Anian).
  • Quantum Error Correction: The environment can disturb the computational state of qubits, thereby causing information loss. Quantum error correction combats this loss by taking the computational state of the system and spreading it out over an entangled state using many qubits. This entanglement allows observers to identify and remedy disturbances without observing the computational state itself, which would collapse it.  However, many 100’s or 1000’s of error correcting qubits are required for each logical qubit.
  • Speedup: The improvement in speed for a problem solved by a quantum algorithm compared to running the same problem through a conventional algorithm on conventional hardware.
  • Coherence/Decoherence: Coherence is the ability of a qubit to maintain its state over time.  Decoherence generally occurs when the quantum system exchanges energy with its environment, typically from gravity, electromagnetism, temperature fluctuation or other physical inputs (see “Noise”).  Longer coherence times generally enable more computations and therefore more computational power for QC.
  • No Cloning Theorem: The no-cloning principle is a fundamental property of quantum mechanics which states that, given a quantum state, there is no reliable way of producing extra copies of that state. This means that information encoded in quantum states is unique. This is sometimes annoying, such as when we want to protect quantum information from outside influences, but it is also sometimes especially useful, such as when we want to communicate securely with someone else.
  • Oracle: A subroutine that provides data-dependent information to a quantum algorithm at runtime.  It is often used in the context of “how many questions must be asked before an answer can be given” in order to confirm or establish quantum advantage.
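Tying together the Gate, Hadamard Gate, and Measurement entries above, here is a minimal sketch run on a free local simulator (shown here with the Amazon Braket SDK; Qiskit or Cirq versions would look very similar):

```python
from braket.circuits import Circuit
from braket.devices import LocalSimulator

# A Hadamard gate puts qubit 0 into an even superposition of |0> and |1>;
# measurement then collapses each run ("shot") to a classical 0 or 1.
circuit = Circuit().h(0)

result = LocalSimulator().run(circuit, shots=1000).result()
print(result.measurement_counts)   # roughly 500 '0's and 500 '1's
```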

Applications

  • Quantum Cloud: Access to Quantum Computers via a cloud-based provider.  Prominent firms currently offering such access include IBM, Amazon, Google, and Microsoft, among others.  Two benefits of such QC access include lower up-front costs (users do not need to buy any hardware) and future-proofing (i.e., as the QC makers create more powerful machines, cloud access can be directed to the newer machines without any added investment required by the users).
  • Quantum Communication: A method of communication that leverages certain features of quantum mechanics to ensure security.  Specifically, once a given qubit is “observed” or measured, it collapses to either a “1” or a “0”. Therefore, if anyone intercepts or reads a secure quantum message, the code will have changed such that the sender and receiver can see the impact of the breach.  QKD or quantum key distribution is an existing technology that is already in use over fiber optics, certain line-of-sight transmissions, and recently by China via a special satellite, between Beijing and Austria.
  • Shor’s Algorithm:  An integer factorization algorithm devised in 1994 by American mathematician Peter Shor.  Open-source implementations are freely available, and on a sufficiently powerful QC the algorithm could break RSA encryption or other protocols relying on the difficulty of factoring large numbers.  For this reason, it is often cited as a clear example of the need and desire for a powerful enough QC to run the algorithm.  No QCs are yet powerful enough to use this algorithm to circumvent RSA or related encryption, but that is expected to change at some point in the coming years.  “Post-quantum” encryption generally refers to protocols that would not be vulnerable to Shor’s algorithm.
  • Grover’s Algorithm:  Another open-source algorithm already written, intended for search optimization.  For most current computer searches, the target samples must either be processed one at a time until the desired result is found, or the data must be organized (i.e., put in numerical or alphabetical order) to be searched more efficiently.  Grover’s algorithm can effectively search much of the entire field at once (depending on the power of the QC) and therefore find results much faster.  Shor’s and Grover’s algorithms are often the first two algorithms cited when discussing quantum supremacy and are elegant examples of the speedup that QCs can provide (a rough comparison appears after this list).
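As a rough sense of the scale of that speedup (standard textbook query counts, not a benchmark of any particular machine), searching an unstructured list of N items takes on the order of N classical checks but only on the order of the square root of N Grover iterations:

```latex
% Unstructured search over N items:
%   classical: O(N) queries      Grover: O(\sqrt{N}) queries
\[
  N = 10^{6} \;\Longrightarrow\;
  \text{classical} \approx 5 \times 10^{5} \text{ checks on average},
  \qquad
  \text{Grover} \approx \sqrt{10^{6}} = 10^{3} \text{ iterations}
\]
```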

I hope this glossary is a useful companion for your journey in understanding and appreciating Quantum Computing.  Feedback is always invited.

Disclosure: I have no beneficial positions in stocks discussed in this review, nor do I have any business relationship with any company mentioned in this post.  I wrote this article myself and express it as my own opinion.


References:

Quantum Computing: Progress and Prospects, The National Academies Press, 2019

Azure Quantum Glossary, Microsoft.com, accessed January 22, 2022

The Rise of Quantum Computing, McKinsey & Company, December 14, 2021

Glossary, Dotquantum.io, accessed January 22, 2022

Dilmegani, Cem, Quantum Computing Programming Languages, AI Multiple, published April 11, 2021, updated January 4, 2022.

Parker, Edward, “Commercial and Military Applications and Timelines for Quantum Technology” Rand Corporation, July, 2020.

If you enjoyed this post, please visit my website and enter your email to receive future posts and updates:
http://quantumtech.blog
Russ Fein is a venture investor with deep interests in Quantum Computing (QC).  For more of his thoughts about QC please visit the link to the left.  For more information about his firm, please visit Corporate Fuel.  Russ can be reached at russ@quantumtech.blog.

Ten Fundamental Facts About Quantum Computing


The Quantum Leap

January 17, 2022


I’ve covered some of the key aspects of Quantum Computing in prior posts, including details about things like qubits, superposition, and entanglement.  I thought it would be helpful to readers to now synthesize and consolidate some of the fundamental properties of Quantum Computing in order to provide a bigger picture of the promise and potential of the industry. 

However, I want to be on alert for overblown claims or statements that disconnect fact from reality.  Some speak of a “Quantum Winter,” where the hype gets overblown and people get fed up with the unfulfilled promise and divert their attention (and resources) elsewhere, as has happened with nuclear fusion as a power source.  So, I will be careful to be as fact-based as possible.  As with all these posts, I hope that readers without any formal physics or computer science training can still appreciate and understand the information presented.  Feedback is always welcomed and encouraged.

  1. What is a Quantum Computer?

Quantum Computers (QCs) use incredibly tiny particles (e.g., atoms, ions, or photons) to process information.  The physics that govern the behavior of particles at this minute size scale is quite different from the physics we experience in our much larger “people-scale”.  QCs control and manipulate the individual particles as “qubits” which hold and process information analogous to how “bits” control our computers and electronic devices.  However, the quantum mechanics at work at this scale allow QCs to process more information much more quickly than ordinary computers.  Also, because of the different physics at play, different questions can be processed, and physical systems can be more accurately modeled, suggesting significant new advances as the machines continue to scale in size and power.  The following table highlights some of the differences between existing digital/classical computers and QCs:

Think about these differences as enabling a Quantum Computer to do more per step, which is another way of saying it can process information faster than a classical computer. As it turns out, this speed advantage is phenomenal, which is why there is such enormous potential for Quantum Computers.  See here for a prior post with additional details.

  2. How are Qubits made?

There are several different ways people are creating and manipulating qubits, each with an array of strengths and weaknesses. The overarching challenge for each method is the need to maintain a constant environment for the qubit, shielding it from light, electromagnetism, temperature fluctuations, etc. (i.e., keeping it undisturbed), while at the same time maintaining exquisite control of the qubit.  Any tiny disturbance in the environment can throw off the qubit and create “noise” in the calculations.  On top of this is the challenge of achieving precise control of such tiny elements, often in a cryogenic environment.  The power of the qubits resides in the ability to manipulate or rotate them very precisely.  This is a difficult engineering requirement that is increasingly being met by the players in the industry.  While there are a growing number of methods of creating and controlling qubits, here are some of the most common:

  • Superconducting Qubits:  Some leading QC players, including Google and IBM, are using superconducting circuits, operated at near absolute-zero temperatures, to control and measure electrons.  While there are a few different ways these qubits are created (charge, flux, or phase qubits), the approach generally utilizes a microwave resonator to excite an electron as it oscillates around a loop which contains a tiny gap, and measures how the electron crosses that gap.  Superconducting qubits have been used for many years so there is abundant experimental knowledge, and they appear to be quite scalable.  However, the requirement to operate near absolute-zero temperatures adds a layer of complexity and makes some of the measurement instrumentation difficult to engineer.
  • Trapped Ions:  Ions are normal atoms that have gained or lost electrons, thus acquiring an electrical charge.  Such charged atoms can be held in place via electric fields and the energy states of the outer electrons can be manipulated using lasers to excite or cool the target electron. These target electrons move or “leap” (the origin of the term “quantum leap”) between outer orbits, as they absorb or emit single photons.  Trapped Ions are highly accurate and stable although are slow to react and need the coordinated control of many lasers.
  • Photonic Qubits:  Photons do not have mass or charge and therefore do not interact with each other, making them ideal candidates for quantum information processing.  Photons are manipulated using phase shifters and beam splitters and are sent through a maze of optical channels on a specially designed chip, where they are measured by their horizontal or vertical polarization. 
  • Semiconductor/Silicon Dots: A quantum dot is a nanoparticle created from any semiconductor material such as cadmium sulfide, germanium, or similar elements, but most often from silicon (due to the large amount of knowledge derived from decades of silicon chip manufacturing in the semiconductor industry). Artificial atoms are created by adding an electron to a pure silicon atom which is held in place using electrical fields. The spin of the electron is then controlled and measured via microwaves.

The following table highlights some of the features of these strategies along with companies currently working on QCs with these qubits.  See here for a prior post which provides added details.

  3. What is Superposition and Entanglement?

Nearly every introduction to Quantum Computing includes an explanation of Superposition and Entanglement, because these are the properties that enable qubits to contain and process so much more information than digital computing bits and enable the phenomenal speed-up in calculations.  While these are profound properties that are difficult to conceptualize with our common frame-of-reference on the macro-scale world, they are well established quantum physical properties. 

  • Superposition: Classical computers use a binary system, meaning each processing unit, or bit, is either a “1” or a “0” (“on” or “off”), whereas Quantum Computers use qubits, which can be “1” or “0” (typically “spin up” or “spin down”) or both at the same time, a state referred to as being in a superposition.  This is a bit more subtle than it sounds because to use qubits for computational purposes, they need to be measured, and whenever you measure a qubit you will find it collapsing into either the 1 or 0 state.  But between measurements, a qubit can be in a superposition of both at the same time, which imparts more information per processing unit than a classical bit. 
  • Entanglement: Quantum entanglement is a physical phenomenon that occurs when a group of particles are created, interact, or share proximity in such a way that the particles are “connected,” even if they are subsequently separated by large distances.  Qubits are made of things such as electrons (which spin in one of two directions) or photons (which are polarized in one of two directions), and when they become “entangled,” their spin or polarization becomes perfectly correlated.  It is this feature of quantum mechanics that largely underpins the awesome power of Quantum Computers because it enables the information processing to scale by an exponential factor (n qubits can represent 2^n bits; a short sketch after the next paragraph illustrates this).  The following table showcases this feature:

To give this some context, 100 qubits is the equivalent of an Exabyte of classical computing RAM which is a million trillion bytes (18 zeros). It would take a powerful classical computer nearly the lifetime of the universe to process that amount of data! The corollary is that quantum computers can perform certain complex calculations phenomenally faster than classical computers, and this concept of entanglement is a key to the performance superiority.  See here for more details on superposition and entanglement.
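As an illustration of both ideas, here is a minimal Python/NumPy sketch (not tied to any particular quantum SDK) that builds the standard two-qubit Bell state: a Hadamard gate creates the superposition and a CNOT gate entangles the pair, so a measurement yields either 00 or 11, never 01 or 10:

    import numpy as np

    zero = np.array([1.0, 0.0])                      # single-qubit |0>
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)     # Hadamard gate: creates a superposition
    CNOT = np.array([[1, 0, 0, 0],                   # flips the second qubit when the first is |1>
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    state = np.kron(H @ zero, zero)                  # qubit 0 in superposition, qubit 1 still |0>
    bell = CNOT @ state                              # (|00> + |11>) / sqrt(2): the pair is now entangled

    print(np.round(bell, 3))                         # [0.707 0.    0.    0.707]
    print(len(bell))                                 # 4: two qubits already require 2**2 amplitudes

Note that the state vector doubles in length with every added qubit, which is exactly the 2^n scaling described above.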

However, the sobering reality is that this chart assumes the qubits can be perfectly controlled for the duration of the calculations and that all the qubits can entangle with each other.  We are still quite far away from being able to achieve these parameters at a meaningful scale, although progress and advances are being made continuously.  The other key to understanding and appreciating this is to distinguish between “logical qubits,” which this table describes, and “physical qubits.”  You may hear of companies using quantum computers with over 1,000 qubits, but in the current NISQ (noisy intermediate-scale quantum) environment, many of the physical qubits are dedicated to error correction as opposed to logic/calculations, and often the qubits lose their superposition or entanglement properties (decoherence) very quickly, before the algorithms can be completed.  So, discussions about the number of qubits in a given quantum computer need to have the proper context to understand the computing power implications.

  4. Is the Power of Quantum Computers Magical?

You may be hearing claims of phenomenal powers of Quantum Computers (including from yours truly) along with descriptions of “quantum” phenomena as surreal or supernatural (e.g., Schrodinger’s cat being both alive and dead).  Features like Superposition and Entanglement are very difficult for a lay person to understand or appreciate, let alone believe they can be used for computing purposes.  Some even describe quantum mechanics as “magical.”  Most people, when they think of magic, conjure up parlor tricks or optical illusions, so it would be natural to doubt the veracity of the claims of QC, especially since nobody has yet created a QC that can perform real-world useful computations that can’t be performed on a classical computer.  However, while the underlying mathematics are advanced, there is clear and agreed science concerning the construction and performance of Quantum Computers.  The mathematical principles of manipulating qubits and using them to create logic gates are based on well-established linear algebra and trigonometry.  Innumerable quantum algorithms are being written and will perform useful and important calculations once quantum machines scale to the required power.  At this point, it is difficult to predict precisely when such scale will be achieved, but those in the field will confirm that this is an engineering challenge, not a theoretical challenge.

  5. I Hear True Quantum Computing may be Decades Away.  Is that True?

This is very difficult to answer with precision.  My first “computer”, bought in 1980, was a Sinclair ZX80 with only 8K of memory, a puny amount compared to today’s PCs.  It certainly could not perform any applications or calculations that were of practical use at the time, although I was able to write some very basic code (ironically, BASIC was also the language it used).  But I could truthfully and accurately say in 1980 that I was using a personal computer to execute commands.  A similar statement can currently be made by users of existing QCs, and many people are using cloud-based Quantum Computers today to run simple algorithms.  While these machines are not yet capable of performing calculations that ordinary computers can’t perform, it is a dynamic and evolving situation.

At the same time, companies like D-Wave have “quantum” computers that use annealing, which leverages certain aspects of quantum mechanics but cannot yet perform typical gate functions.  They have many customers performing useful optimization calculations today, although these machines are not full-fledged QCs in the typical sense.

While there are no crystal balls, there are several high-profile quantum computing companies publishing their development timelines, which generally suggest a large-scale product (i.e., more than 100 logical qubits) before the end of this decade.   See below for IBM, Honeywell (Quantinuum) and IonQ versions: 

Many predict consistent Quantum Advantage (when Quantum Computers can consistently perform real-world calculations) in the next 5-10 years.  The key thing to follow as the industry advances, will be to monitor which players are successful in meeting their timeline milestones.  As more and more companies achieve important stated milestones, this timeline should become more precise.

  6. Can We Measure Quantum Computing Power?

Unfortunately, there is no universally recognized measurement standard for the power of a Quantum Computer.  There are several characteristics that are important including the number of qubits, the fidelity of the qubits, the length of time entanglement can be sustained, the numbers of gates that can be utilized, the numbers of connections between qubits that can be controlled, etc.  Recently, IBM proposed a metric called “quantum volume” which is intended to consolidate many of these features although not all players are utilizing this standard.  Barring any established metric, be careful to understand and appreciate the claims made by Quantum Computing companies realizing that the power of the computer is not necessarily directly correlated to the numbers of qubits it uses.   See here, for a prior post which covered performance measurement.

  7. Are People Really Using Quantum Computers?

This is a bit of a trick question.  The truth is that dozens of providers have made actual working Quantum Computers available for use via the cloud.  Some basic machines are available for no charge, some are available free for academic use, and some can be utilized for a modest cost.  You could finish reading this article and, assuming you were familiar with basic Python programming, download a development kit from IBM (Qiskit), Microsoft (Q#), Google (Cirq), Amazon (Amazon Braket) or others, and begin writing quantum algorithms.  You could then establish an account with one of the QC cloud providers and either wait in the queue for your turn on a given machine, or acquire time to have the algorithm run on one of dozens of machines available remotely (a short example of what such a first program might look like follows below).
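As a flavor of what “writing a quantum algorithm” looks like in practice, here is a minimal sketch of a two-qubit circuit in Qiskit, run on a local simulator rather than cloud hardware.  It assumes the qiskit and qiskit-aer packages are installed, and the exact import paths can differ slightly between Qiskit versions:

    from qiskit import QuantumCircuit, transpile
    from qiskit_aer import AerSimulator

    qc = QuantumCircuit(2, 2)          # two qubits, two classical bits
    qc.h(0)                            # put qubit 0 into a superposition
    qc.cx(0, 1)                        # entangle qubits 0 and 1
    qc.measure([0, 1], [0, 1])

    sim = AerSimulator()
    counts = sim.run(transpile(qc, sim), shots=1024).result().get_counts()
    print(counts)                      # roughly half '00' and half '11'

Submitting the same circuit to real cloud hardware is mostly a matter of swapping the local simulator for a provider’s backend and waiting in the queue.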

A recent study by Zapata Computing revealed that many companies are also using or planning to use QC in their businesses.  Specifically, the study indicates that “69% of enterprises across the globe reveal they have adopted or are planning to adopt QC in the next year,” with those already having adopted some form of QC amounting to 29% of their survey respondents.  In addition, you may read of many companies using Quantum Computers today to begin various optimization analyses.  The following highlights some of the companies currently exploring QCs for various business applications:

  8. Where will Quantum Computing Provide Early Impact?

The superposition and entanglement of qubits enable QCs to evaluate many dataset items simultaneously instead of linearly, hence the tremendous speed-up in processing.  One area where QCs can use these speedup features to provide a quantum advantage is in the ability to process currently unmanageable combinatorial problems (simulation and/or optimization).  To visualize this, consider that a simple seating chart for 16 people involves over 20 trillion possible configurations [see here for a prior post describing this in more detail; a quick arithmetic check also appears after this paragraph].  Imagine the complexity of trying to design new chemicals or materials or medicines or optimized financial portfolios.  The numbers of atoms, chemical bonds, or securities involved make computer simulations practically impossible with existing classical computers, and the trial-and-error of experimentation is costly and time consuming.  Therefore, problems involving combinatorics are the likely first uses of QCs.  The following table highlights some of these use cases:
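As that quick check, the seating-chart count is simply 16 factorial, which a few lines of Python confirm:

    from math import factorial

    arrangements = factorial(16)
    print(arrangements)                  # 20,922,789,888,000 -- roughly 20.9 trillion orderings
    print(arrangements / 1e9 / 3600)     # ~5.8 hours even at a billion arrangements checked per second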

  9. Are My Bitcoin Portfolio and Encrypted Bank Transactions Vulnerable to Quantum Attack?

The short answer is, not really.  While it is theoretically true that a powerful enough Quantum Computer could mine all remaining cryptocurrency and break standard RSA encryption (used for most secure messages and transactions communicated over the Web), this is a well-known issue that is receiving substantial remedial attention.  NIST (the National Institute of Standards and Technology), a government entity which oversees certain standards and measurements, is in the final round of approving candidates for a post-quantum cryptography standard.  There are four Round 3 finalists for Public-Key Encryption and Key-Establishment Algorithms, and three Round 3 finalists for Digital Signature Algorithms, so new approved protocols which are “quantum safe” are imminent.  In addition, there are other ways to secure on-line transactions besides RSA encryption, such as two-factor authentication, so more and more users are establishing enhanced protections (a toy illustration of why factoring threatens RSA appears below).  As for bitcoin, that is a bit more nuanced.  Since most cryptocurrencies rely on increasingly complex mathematics for the mining of new coins, there is a finite number of bitcoins that can be created, and with existing computing power, it is anticipated that the discovery, or mining, of new coins will take longer and longer until the supply reaches its final amount (estimated at ~100 years at the current pace).  So, if quantum computers are built which can mine faster, this end date may be accelerated, but the total number of possible bitcoins won’t change.
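To see why factoring matters, here is a toy RSA example with deliberately tiny, illustrative primes (real moduli are thousands of bits long); anyone who can factor the public modulus n back into p and q can reconstruct the private key:

    p, q = 61, 53                      # toy primes for illustration only
    n = p * q                          # public modulus: 3233
    e = 17                             # public exponent
    phi = (p - 1) * (q - 1)            # 3120 -- computable only if you can factor n
    d = pow(e, -1, phi)                # private exponent: 2753 (modular inverse; Python 3.8+)

    message = 65
    ciphertext = pow(message, e, n)    # encrypt with the public key -> 2790
    recovered = pow(ciphertext, d, n)  # decrypt with the derived private key -> 65
    print(ciphertext, recovered)

Shor’s algorithm attacks exactly the step labeled “computable only if you can factor n,” which is why the NIST post-quantum candidates avoid relying on factoring altogether.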

  10. How can I Learn More?

There are many excellent resources available including articles, papers, on-line tutorials, books, and other resources.  Please sign up to receive this blog as new posts are written and/or visit this section of the Quantum Leap blog for links to some additional resources.

Disclosure: I have no beneficial positions in stocks discussed in this review, nor do I have any business relationship with any company mentioned in this post.  I wrote this article myself and express it as my own opinion.


References:

IBM’s roadmap for scaling quantum technology | IBM Research Blog, retrieved January 16, 2022.

Scaling IonQ’s Quantum Computers: The Roadmap, retrieved January 16, 2022.

Jean-Francois Bobier, Matt Langione, Edward Tao and Antoine Gourevitch, “What Happens When “If” Turns to “When” in Quantum Computing”, Boston Consulting Group, July 2021.

Harnessing the Power of Quantum Computing | Honeywell Beyond 2021, accessed January 9, 2021

“Starting the Quantum Incubation Journey with Business Experiments”, Digitale Welt Magazine, accessed January 16, 2022

The First Annual Report on Enterprise Quantum Computing Adoption, Zapata Computing, July 5, 2022.

If you enjoyed this post, please visit my website and enter your email to receive future posts and updates:
http://quantumtech.blog
Russ Fein is a venture investor with deep interests in Quantum Computing (QC).  For more of his thoughts about QC please visit the link to the left.  For more information about his firm, please visit
Corporate Fuel.  Russ can be reached at russ@quantumtech.blog.

Quantinuum – Company Evaluation

The Quantum Leap

January 9, 2022

When I established this blog in November of last year, I noted that I would present posts regarding details underlying Quantum Computers (QC), the immense potential they hold, and advances being made.  I hope you have enjoyed those posts which will continue (see here for a link to prior posts), but I also stated an intention to reflect on current events, companies and breakthroughs.  I thought it fitting that Quantinuum be the first company profile presented in this series. 

Background

In June of 2021, after a series of successful collaborations, Cambridge Quantum Computing (CQC) reached an agreement to merge with Honeywell Quantum Solutions (HQS), a division of Honeywell. Then in November of 2021, Honeywell spun out the combined businesses into a new stand-alone company called “Quantinuum”. In addition, Honeywell invested $300m into Quantinuum, which is now 54% owned by Honeywell and 46% by CQC shareholders.

CQC, founded in 2014, is a global Quantum Computing software company which designs enterprise applications in the areas of quantum chemistry, machine learning and cybersecurity, among others.  Honeywell is a Fortune 100 multinational conglomerate with operations in aerospace, building technologies, performance materials, and safety and productivity solutions.  Its diverse industrial footprint includes expertise in cryogenic systems, ultra-high vacuum systems, photonics, RF (radio-frequency) magnetic systems, and ultra-high precision control systems, all of which turned out to be extremely well suited for building a quantum computer.  In ~2010 Honeywell Quantum Solutions was secretly formed; it reached some critical technical milestones in 2015 and was publicly disclosed in 2018. In 2020 HQS released the “Model H1”, a modest 10-qubit trapped-ion QC, and it has been on an aggressive timetable for scaling up its QC portfolio, recently showcasing the achievement of a quantum volume of 2,048 using its 12-qubit Model H1-2, a 10x increase in quantum volume in less than one year.

Details on Honeywell Quantum Solutions

Leveraging its 130 years of innovation, including strengths in science, engineering and research, Honeywell has developed trapped-ion quantum computers using individual, charged atoms (ions) to hold quantum information. Their system uses electromagnetic fields to hold (trap) each ion so it can be manipulated and encoded using microwave signals and lasers.  These trapped-ion qubits can be uniformly manufactured and controlled more easily compared with alternative qubit technologies that do not directly use atoms, and the approach does not require cryogenic cooling (although an ultra-high vacuum environment is required).

In October of 2020, HQS introduced its first quantum computer, the System Model H1, which featured 10 fully connected qubits and a quantum volume of 128, which was the highest reported at the time (surpassing IBM’s prior record of 64).  By this past December, the Model H1-2 successfully passed the quantum volume benchmark of 2,048, a new global record and consistent with the Company’s stated timeline of annual 10x increases in quantum volume.  The hardware roadmap includes four key milestones to be achieved before the end of the current decade:

  1. Model H1: Creation of a linear device with 10 computational qubits [achieved], eventually scaling to 40 qubits.
  2. Model H2: Using the same lasers to perform operations on two sides of a racetrack configuration.  Once achieved, quantum volume should exceed that possible with classical computers (i.e., will not be able to be simulated on classical machines).
  3. Model H3: Geometries change to a grid, which will be much more scalable than linear or racetrack configurations.
  4. Model H4: Aim to integrate optics via photonic devices that allow laser sources to be an integrated circuit. 

The following chart showcases the planned roadmap:

Source: Honeywell

Details on Cambridge Quantum Computing

The team at CQC has been developing the theoretical foundations of Quantum Computing for over 25 years.  They design, engineer and deploy algorithms and enterprise level applications leveraging TKET, their hardware-agnostic software development platform, along with other technologies.  They have developed application specific quantum software across a number of fields including quantum chemistry, quantum artificial intelligence and quantum cybersecurity.  Here is a brief overview of their products and solutions:

TKET: A leading open-source development toolkit that enables optimization and manipulation of quantum circuits for current quantum computers.  As a platform-agnostic tool, TKET can integrate with most commercially available quantum hardware platforms including IBM, Honeywell, Google, IonQ and others, as well as third-party quantum programming tools including Cirq, Qiskit and Pennylane (a rough usage sketch follows below).
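As a rough illustration of the hardware-agnostic idea, a TKET workflow via its Python package pytket looks broadly like the sketch below.  This assumes the pytket and pytket-qiskit packages are installed; the exact method names can vary between versions, so treat it as indicative rather than definitive:

    from pytket import Circuit
    from pytket.extensions.qiskit import AerBackend    # swap in a different backend to retarget the same circuit

    circ = Circuit(2, 2)         # two qubits, two classical bits
    circ.H(0)
    circ.CX(0, 1)
    circ.Measure(0, 0)
    circ.Measure(1, 1)

    backend = AerBackend()
    compiled = backend.get_compiled_circuit(circ)       # TKET rewrites the circuit for the chosen target
    handle = backend.process_circuit(compiled, n_shots=100)
    print(backend.get_result(handle).get_counts())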

Quantum Origin: is an industry-defining, cryptographic key generation platform that employs quantum computers to generate quantum-enhanced cryptographic keys.  Using the Quantum Origin platform, both classical algorithms (e.g., RSA or AES) and post-quantum algorithms (e.g., CRYSTALS-Dilithium and Falcon) can be seeded to provide cryptographic keys that offer superior protection against even the most powerful of adversaries (see more on Quantum Origin below in the Strangeworks Collaboration section).

QNLP: The rapidly emerging field of Quantum Natural Language Processing, and its underlying theoretical foundations, has been pioneered by the team at CQC. lambeq is the world’s first software toolkit for QNLP capable of converting sentences into a quantum circuit. It is designed to accelerate the development of practical, real-world QNLP applications, such as automated dialogue, text mining, language translation, text-to-speech, language generation and bioinformatics. Their structural approach takes advantage of mathematical analogies between theoretical linguistics and quantum theory to design “quantum native” NLP pipelines. Combined with advances in Quantum Machine Learning (QML), CQC has successfully trained quantum computers to perform elementary text classification and question-answering tasks, paving the way for more scalable intelligence systems.

Quantum Artificial Intelligence: (QAI) is one of the most promising and broadly impactful application areas of quantum computing. CQC is simultaneously pioneering the highly interconnected areas of quantum machine learning, quantum natural language processing, quantum deep learning, combinatorial optimization and sampling (i.e., Monte Carlo simulations) to build intelligence systems of the future.

QA: The Quantum Algorithms division is seeking to realize definitive and unequivocal quantum computational advantage as soon as possible. Although ultimately interested in all quantum algorithms, at present, the focus is on three problems which show promise for early quantum advantage, including Monte Carlo estimation, optimization and solving Partial Differential Equations (PDEs).

QML: The Quantum Machine Learning division, in collaboration with industrial, academic and governmental partners, designs and engineers novel, application-motivated Quantum Machine Learning algorithms across industries such as finance, healthcare, pharma, energy and logistics.

EUMEN: Currently in advanced beta testing, EUMEN is an enterprise-grade quantum computational chemistry package and development ecosystem, enabling a new era of molecular and materials simulations. Developed in close collaboration with Fortune 500 partners, EUMEN’s modular workflow enables both computational chemists and quantum algorithm developers to easily mix and match the latest quantum algorithms with advanced subroutines and error mitigation techniques to obtain best-in-class results. Current applications in development with clients include new material discovery for carbon sequestration, drug design and discovery, and hydrogen storage.

The Combined Companies as Quantinuum

Quantinuum has the benefit of CQC’s software and algorithm expertise combined with HQS’s hardware expertise, creating the largest full-stack dedicated quantum computer company.  Quantinuum has about 400 employees in 7 offices in the US, UK and Japan.  On the hardware side, the Model H series of quantum computers are available via the cloud, facilitating broad access and ensuring it is “future-proof” for customers as the product evolves and advances.  On the software side, the open-source platform-agnostic approach will continue, ensuring customers always have access to the best tools for the target application and will not be dependent on a single company’s machines.

The predecessor companies had a long history of collaboration.  In fact, CQC was the first external user to run a quantum circuit on the System Model H0, Honeywell’s inaugural commercial system.  No organization outside of Honeywell had used the H-Series hardware more than CQC, so the formal combination of the businesses seems like a natural extension of their legacy collaborations.  By spinning the business out into a stand-alone company, you can expect to see a Quantinuum IPO some time this year.

Strangeworks Collaboration

“Quantum Origin” is the first commercially available product based on verifiable quantum randomness, a capability essential to strengthening existing security software and to protecting enterprise systems from threats posed by quantum computing-based attacks.  Just this past week, Strangeworks, a global quantum computing software company, announced a collaboration to implement Quantinuum’s quantum-enhanced cryptographic keys into the Strangeworks ecosystem.  By implementing Quantum Origin, Strangeworks will be the first to offer a seamless path to quantum-generated cryptographic keys, and the parties expect to expand the relationship, enabling rapid adoption, insights and continued development.  

Select Customer Usage Cases

Quantinuum has listed a few case studies on their website,  including the following:

Nippon Steel: Has collaborated with the Company to optimize scheduling.  As the recent global supply-chain disruptions have highlighted, complexities in managing manufacturing and supply often require companies to juggle resources.  Nippon Steel produces over 50 million metric tons of steel annually and has been using an algorithm, co-developed with Quantinuum and run on a System Model H1, to schedule the intermediate products it uses.  Having the right balance of raw materials and intermediate products is essential and is a delicate balancing act facilitated by Quantinuum. 

Samsung: The electronics giant teamed up with Imperial College London to investigate new battery materials using a System Model H1. The team created a simulation of the dynamics of an interacting spin model to examine changes and effects of magnetism.  They were able to run deep circuits and use as many as 100 two-qubit gates to support the calculations, confirming the Model H1 can handle complex algorithms with a high degree of accuracy.

BMW: Entropica Labs, a Singapore-based quantum software startup, and the BMW Tech Office teamed up to develop and run a Recursive Quantum Approximate Optimization Algorithm (R-QAOA) to benchmark logistics and supply chain optimization via number partitioning, a classic combinatorial problem that is an entry point to many logistics challenges.  More complex versions of R-QAOA are now being explored.

This is just a small sampling of current projects and customers, with more than 750 overall collaborations currently underway, suggesting substantial customer uptake and potential.

Summary

Cambridge Quantum Computing and Honeywell Quantum Solutions were each already formidable players in the evolving QC space and have been generating meaningful revenues from this nascent field. CQC is/was a reputable and well-established quantum software and algorithm provider, and HQS has created advanced QC devices which continue to scale and surpass performance records.  Assuming they can achieve synergies as a combined company, the upward trajectory should accelerate.  That said, the QC industry is still quite immature, and many players are dedicating substantial resources, so any early market leads will remain vulnerable to new technologies or competitive advances.  If Quantinuum can successfully leverage the broad client portfolio and historical industrial legacy of Honeywell with the substantial history and success of CQC, it should remain a leader in this growing field.  The following table highlights some of the key attributes of Quantinuum:

Rating

Apropos of the probabilistic nature of quantum algorithms, I wanted to leverage the nomenclature to create a company rating system and assign a scale to my overall assessment of a company’s potential.  Accordingly, I am going to use the formula below when reviewing companies, whereby the “alpha” coefficient correlates with “positivity” (and the formula adheres to the Born rule).  Given my overall assessment of Quantinuum, including its strong position as a full-stack player, the strengths of the legacy businesses and the potential synergies, I am assigning the highest rating to Quantinuum at this time, with an Alpha of 0.95, which equates to “Exceptional performance expected”.
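For readers unfamiliar with the notation, one way to sketch such a two-outcome, Born-rule rating state (the outcome labels here are purely illustrative) is:

    \[
    |\text{rating}\rangle = \alpha\,|\text{outperform}\rangle + \beta\,|\text{underperform}\rangle,
    \qquad |\alpha|^2 + |\beta|^2 = 1,
    \]

so an Alpha of 0.95 corresponds to a “positive outcome” probability of |alpha|^2, or roughly 0.90.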

Disclosure: I have no beneficial positions in stocks discussed in this review, nor do I have any business relationship with any company mentioned in this post.  I wrote this article myself and express it as my own opinion.


References:

Our Technology – Cambridge Quantum, retrieved January 8, 2022.

Strangeworks and Quantinuum partner to integrate world’s first quantum-enhanced cryptographic key service – Strangeworks, retrieved January 8, 2022.

TQD Exclusive: Interview with Tony Uttley, President of Honeywell Quantum Solutions, Kirmia, Andrew, May 3, 2021.

Cambridge Quantum Computing, Pitchbook profile, accessed August 2, 2021

Next Few Months Will Demonstrate Quantum Cybersecurity Value of the New Quantum Computing Company Quantinuum, The Qubit Report, December 3, 2021

If you enjoyed this post, please visit my website and enter your email to receive future posts and updates:
http://quantumtech.blog
Russ Fein is a private equity/venture investor with deep interests in Quantum Computing (QC).  For more of his thoughts about QC please visit the link to the left.  For more information about his firm, please visit
Corporate Fuel.  Russ can be reached at russ@quantumtech.blog.

Quantum Supremacy vs. Quantum Advantage – and how do we measure these things? 

Quantum Supremacy vs Quantum Advantage 

In October of 2019, Google announced that they had demonstrated the ability to compute in seconds what would take the largest and most advanced supercomputers thousands of years, thereby achieving a milestone referred to as “quantum supremacy” for the first time. They used a processor named “Sycamore” with 54 programmable superconducting qubits to create quantum states on 53 qubits (one did not operate), corresponding to a computational state-space of 2^53 (equivalent to about 10^16, or over ten million billion, states).  They achieved this using a two-dimensional array of 54 transmon qubits, where each qubit is tunably coupled to four nearest neighbors. Each transmon has two controls: a microwave drive to excite the qubit, and a magnetic flux control to tune the frequency.  The claim was generally considered by many to be a “Wright Brothers Kitty Hawk” type of achievement.

Then, in December of 2020, researchers at the University of Science and Technology of China (“USTC”) announced that they had also achieved quantum supremacy, utilizing a Quantum Computer named “Jiuzhang” which manipulates photons via a complex array of optical devices including light sources, hundreds of beam splitters, dozens of mirrors and 100 photon detectors.  They claimed that their device performed calculations in 20 seconds that would take a supercomputer 600 million years. Google and USTC have each increased their qubit utilization since these breakthroughs, and several other companies have now successfully operated Quantum Computers with dozens of qubits, and a couple with 100 or more. 

Let’s review some semantics regarding the measurement of Quantum Computing performance.  In 2012 a leading quantum mechanics researcher named John Preskill, a professor of theoretical physics at CalTech, first coined the term “quantum supremacy” to “describe the point where quantum computers can do things that classical computers can’t, regardless of whether those tasks are useful.”   He coined this term before any actual Quantum Computers had been built.  At the time, Preskill was wondering, in his words, “whether controlling large-scale quantum systems was merely really, really hard or whether it was ridiculously hard.  In the former case we might succeed in building large-scale quantum computers after a few decades.  In the latter case we might not succeed for centuries.”  In this sense, and based on Preskill’s original intent, the announcement by Google is a bona fide example of Quantum Supremacy and indicated that “a plethora of quantum technologies are likely in the next decade or so” [Preskill, 2019]. 

So, although the Google Sycamore quantum supremacy claim was discounted by some (most notably IBM and researchers in China), and despite it being an admittedly highly contrived and not very useful calculation, it was a ground-breaking achievement.     

Before I get into the semantics of how we measure Quantum Computing power, here is what the quantum community generally means regarding quantum progress: 

Quantum Supremacy: This term still retains Preskill’s original context and is considered the first major step to prove quantum computing is feasible.  Specifically, it means: “demonstrating that a programmable quantum device can solve a problem that no classical computing device can solve in a feasible amount of time, irrespective of the usefulness of the problem.”  Based on this definition, the threshold has been passed since October 2019; in fact, at this point it has been demonstrated by several different companies beyond Google, which is why I refer to the current hurdles as engineering challenges rather than theoretical ones.

Quantum Advantage: Refers to the demonstrated and measured success in processing a real-world problem faster on a Quantum Computer than on a classical computer.  While it is generally accepted that we have achieved quantum supremacy, it is anticipated that quantum advantage is still some years away.  

How do we Measure Quantum Computing Performance? 

At the end of a prior post regarding Qubits, I alluded to the challenge of measurement metrics for Quantum Computing, highlighting that the count of operating qubits is not appropriate as a yardstick.  Imagine if you were shopping for a new car.  If the only metric that was available was “horsepower”, it would be very difficult to decide which car to buy.  By itself, horsepower is only one measure of car performance.  It does not factor in actual acceleration, fuel efficiency, ride comfort, handling, noise levels, legroom, sleekness, color/trim/style, etc.  Even if we are considering computers, just focusing on the clock speed, for example, would not provide enough breadth of information to make an informed purchase decision.  While Quantum Computers are in their very early stages, simply measuring a particular calculation speed or the number of qubits used is not enough to describe accurately the actual performance capabilities.  Researchers at IBM have proposed the term “Quantum Volume” to enable the systematic measurement of Quantum Computing performance.  It is a metric that measures the capabilities and error rates of a Quantum Computer by calculating the maximum size of square quantum circuits that can be implemented successfully.  While the details are a bit esoteric, it is intended to provide one number, or score, to be used to compare incremental technology, configuration and design changes and to compare the relative power of one Quantum Computer to another (a quick way to read the quantum volume figures quoted in these posts follows below). 
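Since quantum volume is defined as QV = 2^m, where m is the size of the largest “square” circuit (m qubits by m gate layers) a machine can run reliably, the headline figures can be translated back into circuit sizes with a quick check in Python:

    import math

    print(math.log2(128))     # 7.0  -> a quantum volume of 128 corresponds to 7x7 circuits
    print(math.log2(2048))    # 11.0 -> a quantum volume of 2,048 corresponds to 11x11 circuits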

In fact, the performance of a quantum computer involves many factors as shown below: 

Source: IBM and Forbes as adapted by Riccardo Silvestri 

Since quantum volume is not quite an industry term-of-art at this point, I won’t use it as the definitive measurement tool.  However, the concept of focusing on characteristics beyond just the “number of qubits” is crucial, and I will discuss the relative performance characteristics of competing Quantum Computers beyond just a mention of the number of qubits. 

While many of the balloons in the above graphic may be unfamiliar, there are three key metrics for measuring quantum computing performance: 

  1. Scale: The number of qubits which the computer can simultaneously process.  It is important to distinguish between physical and logical qubits, with logical qubits being the key element (as I’ll show below, many constructs are adding physical qubits for error correction overhead). 
  2. Quality: The quality of the circuits, which factors in both the time that the qubits remain in superposition and entangled before they decohere, and the number of qubits that can entangle with each other. 
  3. Speed: Typically measured by circuit layer operations per second (or CLOPS), or how many circuit layers can run on a Quantum Computer in a given time.  While this is a strong and objective measurement, it is not generally reported at this time. 

Another reason that the “number of qubits” is not useful to compare performance, is that we are currently operating in the NISQ environment (recall the “N” is for noisy).  Accordingly, many constructs are being proposed where certain qubits are dedicated to error correction and not for added entanglement.  IBM has a useful graphic to highlight the tradeoff between physical and logical qubits based on error rates: 

Quantum Computing Milestones 

While the semantics and various yardsticks used to describe Quantum Computer performance is confusing, evolving and not yet universally agreed upon, real progress is being made no matter which metric is showcased.   Here are a few recent advances in early working Quantum Computers, although not all report the same metrics, so it is difficult to compare these to each other: 

In addition to these Quantum Computers, Intel has a 49-qubit QC, Xanadu has a 24-qubit QC, and MIT has a 100-qubit QC; however, the other performance metrics noted in the table are not readily available for these. 

It is worth noting that USTC recently claimed that Zuchongzhi 2.1 is a million times more powerful than Google’s Sycamore, and that it is 10 million to 100 trillion times faster than the world’s fastest supercomputer.  While it is difficult to substantiate these claims, given China’s enormous focus on Quantum Computing, a China-US space race of sorts is certainly afoot.  Also, the Quantinuum achievement on H1, only very recently announced, is worth paying close attention to given its high quantum volume and long decoherence times. 

Semantics and yardsticks aside, it is fascinating to see the increasing number of companies creating working Quantum Computers with ever increasing performance metrics, confirming that it is merely “really, really hard” to build these devices and not “ridiculously hard”.  It seems like we are seeing new press releases each week showcasing quantum performance achievements by these and others in the field.  Stay tuned as we track the performance. 

References: 

arXiv:1203.5813, “Quantum Computing and the entanglement frontier”, Preskill, John, March 26, 2012 

Quanta Magazine, “Why I called it ‘Quantum Supremacy’”, Preskill, John, October 2, 2019 

Nature, “Quantum supremacy using a programmable superconducting processor,” Arute, Arya, Babbush, et. al., October 23, 2019 

The Independent – UK, “China builds world’s fastest programmable quantum computers that outperform ‘classical’ computers,” Sankaran, Vishwam, October 31, 2021 

Scorecards – Quantum Computing Report, Retrieved December 2021 

Silvestri, Riccardo. Masters Degree Thesis: “Business Value of Quantum Computers: analyzing its business potentials and identifying needed capabilities for the healthcare industry.” August 2020 

The Evolving Quantum Computing Ecosystem

In the past few blogs I have described what a Quantum Computer is and how it can be so powerful and transformative, basic features of qubits, and highlights on some of the major players in Quantum Computing (“QC”).  But just like the evolution of personal computing, there are many participants in the QC ecosystem beyond just the makers of the actual machines.  You likely use a PC today, manufactured by one of a number of various hardware makers.  However, your machine’s core operating memory is made by a different company, and it is built upon an operating system (likely Microsoft Windows) as well as various software applications.  You may also use external data drives, a mouse input device, a screen, a printer, various cables and other physical devices.  You likely also access the internet and some cloud services, utilize a virus protection program and other related activities and services.  There are likely dozens if not hundreds of companies whose technologies you use daily to operate your computing device. 

Companies like Oracle ($280 billion market cap; database management), Ingram Micro ($5.8 billion market cap; distributor of technology equipment), Cisco ($250 billion market cap; interconnecting equipment and services), Symantec ($15 billion market cap; antivirus protection), Adobe ($311 billion market cap; document and process software) and Salesforce ($262 billion market cap; productivity platform) have created enormous value despite not actually making any computers.  Quantum Computing will likely spur many similar players in its ecosystem; in fact, there are already hundreds of players engaged in this space.  Some of these participants may also carve out significant market positions and value.  To give a sense for the breadth and depth of players needed, you can visualize the basic inner workings of a Quantum Computer as follows:

As this graphic shows, there are various aspects of the physical creation and manipulation of qubits (the bottom section of the graphic) along with software needed to control the logical layer.  Also, covered in a prior post, there are various ways to create qubits, often requiring cryogenic temperatures and/or detailed laser or radio frequency controls. 

Here is another graphic to help visualize the complexities of building a quantum computer:

Source: IBM

You’ll note the various wiring, amplifying, microwave generation and cooling components, all requiring highly specialized design and control.  In order to describe the various QC players, it is helpful to segregate them into some functional categories or buckets as follows:

Hardware: Companies seeking to build a fully-functional Quantum Computer.  Many are also creating software and are integrating access to the cloud.  As discussed in a prior post, there are a few competing technologies underlying the creation of a working Quantum Computer including superconducting loops and Quantum Dots (which require cryogenics), or Ion traps and Photonics (which require sophisticated optics/laser controls), among others.

Circuits/Qubits: There are some companies focused on qubits and their interoperability for entanglement rather than attempting to build complete systems.

Cryogenics: Superconducting loops and quantum dots require temperatures that approach “absolute zero” (~negative 460 degrees Fahrenheit).  Many of the pictures you may see of Quantum Computers (like the graphic above) generally depict a 7-tiered structure, whereby the temperature is lowered in each of the layers, and there are companies that specialize in temperature control.

Wiring/Controllers: Operating near absolute zero, using lasers to control individual atoms or manipulating and controlling individual photons all require specialized and sophisticated devices and connections.  Some players are focused just on these types of challenges.

Error Correction: Due to the current NISQ (noisy intermediate-scale quantum) landscape and the need to have enormous computing “overhead” to correct for the noise in today’s qubits, some companies are concentrating on error correction strategies.

Photonics:  Lasers and/or photons are being utilized in various QC constructs and some companies are providing this specialization.

Software: Many of the major companies have developed quantum software to control and manipulate the qubits and the gates formed to perform quantum algorithms.  Some of these are creating open-source platforms while others are working on proprietary languages.

Applications:  Although this is still a somewhat immature portion of the market, as Quantum Computers continue to become more and more robust, I expect to see many more businesses develop applications and various related consulting services.

I will describe some of the players in this ecosystem, although the list is vast and growing, so this is not meant to be a definitive roster, rather a sampling to highlight the broad set of players and opportunities in Quantum Computing.  For a more complete list of players, I encourage you to visit this Quantum Computing Report listing.

In a prior post I noted that some of the largest players in the technology space have already dedicated large departments or divisions to Quantum Computing, as highlighted below: 

Each of these firms is making a major push in Quantum Computing, although their “valuation” is more driven by their other activities.  In any case, they are worth following and I expect their QC activities will make up an increasing portion of their values.

For the balance of this post I want to focus more on the players who are dedicated to QC or who have major operating divisions participating in the space, segregated by the categories described above:

Xanadu: Operator of a quantum photonic platform which it will combine with advanced artificial intelligence, integrating quantum silicon photonic chips into existing hardware to create full-stack quantum computers.

IonQ: IonQ is a quantum computing hardware and software company developing a general-purpose trapped ion quantum computer and software to generate, optimize, and execute quantum circuits. It is the only company with its quantum systems available through the cloud on Amazon Braket, Microsoft Azure, and Google Cloud, as well as through direct API access and was the first pure-play public QC company.

Atom Computing:  Developer of quantum computers built using individually controlled atoms, creating nuclear-spin qubits made from neutral atoms that can be controlled, scaled, and stabilized optically.

PsiQuantum: PsiQuantum was founded on the premise that if you want a useful quantum computer, you need fault tolerance and error correction, and therefore ~1,000,000 physical qubits, to address commercially useful quantum computing applications.

Rigetti: Developer of quantum computing integrated circuits packaged and deployed in cryogenic environments and integrated into cloud infrastructure using pre-configured software.   The company also develops a cloud platform called Forest that enables programmers to write quantum algorithms.

EeroQ: Developer of a quantum cloud platform based on trapping and controlling individual electrons floating in a vacuum above superfluid helium; the electrons form the qubits, and the purity of the superfluid protects the intrinsic quantum properties of each electron, allowing users to get seamless delivery of computing power.

ColdQuanta: Developer of quantum sensing technologies with a focus on improving the positioning and navigation systems as well as providing cold atom experimentation, quantum simulation, quantum information processing, atomic clocks, and inertial sensing products, enabling users to explore their own quantum matter innovations for sensing and other applications.

Quantum Circuits: The company’s computers are superconducting devices that include a quantum circuit model for quantum computation with an error correction system, enabling clients to make error-free computation through solid-state quantum bits.

D-Wave: Developer of quantum computing technologies offering annealing algorithms to solve optimization problems for commercial use in logistics, bioinformatics, life, and physical sciences, quantitative finance, and electronic design automation.

Oxford Instruments: Designs and manufactures tools and systems for industry and research. Their Quantum Technologies division helps companies with cryogenics, sensing photons and fabricating novel quantum materials.

Silicon Quantum Computing: SQC is currently developing a 10-qubit quantum integrated circuit in silicon to be delivered in 2023, and has the ultimate goal of delivering useful commercial quantum computing solutions.

Oxford Ionics: Manufacturer of computational electronic systems intended to create the most powerful, accurate, and reliable quantum computers. The company combines trapped-ion qubits with its noiseless electronic qubit control technology to create high-performance quantum computers. 

Teledyne e2V: The engineering groups of Teledyne draw on a portfolio of leading-edge technology, unique expertise and decades of experience in sensing, signal generation and processing for the development and commercialisation of Quantum technologies.

Quantum Brilliance: Using synthetic diamonds to develop quantum computers that can operate at room temperature, without the cryogenics or complex infrastructure, enabling disruptive quantum computing applications.

Chronos: Chronos Technology specializes in time, timing, phase, and monitoring solutions and services including highly accurate atomic clocks and clock synchronization.

BraneCell: Developer of a new quantum processing unit that can function at ambient temperatures. The company offers decentralized quantum computing hardware.

Quantum Machines: Designing quantum controllers that translate quantum algorithms into pulse sequences, enabling organizations to run complex quantum algorithms and experiments in a smooth, intuitive way.

Alpine Quantum Technologies: Developer of ion trap quantum computer technology where single, charged atoms are trapped inside vacuum chambers.  Each qubit is manipulated and measured by precisely timed laser pulses.

Bluefors: Developer of a cryogen-free dilution refrigeration system designed to deliver easy-to-operate refrigerators. The company’s system provides custom unit connection components for different specifications including dilution units, control systems and gas handling units.

kiutra: Developer of a cooling technology intended to offer cryogen-free cooling service. The company’s technology offers sub-Kelvin temperatures for basic research, material science, quantum technology, high-performance electronics, and detector applications.

Toptica: Manufacturer and distributor of high-end laser systems designed for scientific and industrial applications including qubit control.

M-Squared: Developer of photonics and quantum technology used specifically for quantum research, biophotonics and chemical sensing application. The company’s laser based systems offer lasers and photonic optical instruments for applications in remote sensing, frontier science, bio-photonics, defence, microscopy, spectroscopy and metrology.

Montana Instruments: Delivers best-in-class cryostats that are simple to set up, use, and grow with over time. Since 2009, Montana Instruments has worked with hundreds of category pioneers to build cryostats with purposeful modularity.

Single Quantum: Developer of single-photon detectors designed to detect particles of light. The company’s detectors are based on superconducting nanotechnology.

Sparrow Quantum: Spun out of the Niels Bohr Institute, a developer of a photonic quantum technology based on self-assembled quantum dots coupled to a slow-light photonic-crystal waveguide, enabling nanophotonics researchers to increase light-matter interaction and enhance chip out-coupling.

Quantum Motion: Developer of quantum computer architectures designed to solve the problem of fault tolerance. The company’s architectures leverage CMOS processing to achieve high-density qubits which can scale up to large numbers and tackle practical quantum computing problems, enabling users to help reduce errors and thereby improve quality.

QDevil: Developer of electronics and specialized components for quantum electronics research.  The QFilter is a cryogenic filter for reducing electron temperatures below 100 mK. The product portfolio also includes the QDAC, a 24-channel ultra-stable low noise Digital-Analogue-Converter, the QBoard, a fast-exchange chip carrier system, and the QBox, a 24-channel breakout box.

SeeQC: Develops and commercializes technologies for quantum information processing applications, including scalable fault-tolerant quantum computers and simulators, quantum communications, and quantum sensors. The company also gives businesses access to a full suite of electronic circuit design tools for integrated circuit design, including PSCAN2, XIC, WRspice and InductEx.

Delft Circuits: Manufacturer of cryogenic circuit technologies intended for scientific instrumentation, quantum computing, and astronomy. The company offers custom-engineered superconducting circuits, scalable cryogenic cabling with ultra-low thermal conductance, and cryogenic instrumentation and circuit packaging tailored to users’ research needs.

Q-CTRL: Developer of quantum control infrastructure software designed to identify and suppress potential errors in quantum hardware. The company’s platform characterizes and visualizes noise and decoherence and then deploys controls to defeat the errors, enabling R&D professionals and quantum computing end users to improve efficiency and performance, and supporting quantum sensing applications such as standoff detection and precision navigation and timing for defense and aerospace.

TMD Technologies: Manufacturer of professional microwave and radio frequency products primarily focused on the defense and communications markets, as well as a provider of compact, precise atomic clocks and new quantum-based gravimetric and magnetic sensors.

Terra Quantum: Developer of hybrid quantum algorithms, including an algorithm that utilizes quantum phase estimation to solve linear systems of equations with exponential speedup.

QxBranch: Developer of algorithms and software intended to provide predictive analytics, forecasting and optimization for quantum and classical computers.

Zapata: Spun out from Harvard in 2017, developer of quantum software and algorithms to compose quantum workflows and orchestrate their execution across classical and quantum technologies. The company’s platform combines artificial intelligence, machine learning and quantum autoencoders to deliver an end-to-end, workflow-based toolset for quantum computing that advances computational power.

Cambridge Quantum Computing: Quantum computing software company building tools for the commercialization of quantum technologies. The company designs software combining enterprise applications in the areas of quantum chemistry, quantum machine learning and augmented cybersecurity across a variety of corporate and government use cases.

RiverLane: Developer of quantum computing software, including an ultra-low-latency quantum operating system that accelerates quantum-classical hybrid algorithms to facilitate hardware research and development. The company also develops algorithms to make optimal use of the full quantum computing stack, enabling hardware partners to focus on the physics and build better full-stack solutions.

QCWare: Developer of enterprise software designed to perform quantum computing. The company’s software simplifies QC programming and provides access to QC machines while improving risk-adjusted returns and monitoring networks, enabling clients to integrate quantum computing power into any existing application and remove performance bottlenecks.

StrangeWorks: Strangeworks QC™ is used by thousands of researchers, developers, and companies around the world to learn, teach, create, and collaborate on quantum computing projects, enabling clients to overcome the risks of vendor lock-in and architectural uncertainty.

1Qbit: 1QB Information Technologies is a quantum computing software company with hardware partnerships including Microsoft, IBM, Fujitsu and D-Wave Systems. 1QBit develops general-purpose algorithms focused on computational finance, materials science, quantum chemistry, and the life sciences.

Quantum Computing Inc.: Quantum Computing Inc. is focused on providing software tools and applications for quantum computers. Its products include Qatalyst, Qatalyst Core, and the Quantum Application Accelerator. Qatalyst enables developers to create and execute quantum-ready applications on conventional computers while remaining ready to run on quantum computers once those systems achieve a performance advantage.

Quintessence Labs: Developer of quantum-cybersecurity applications designed to implement robust security strategies to protect data. The company’s cybersecurity technologies are used for cryptographic purposes to centralize the management and control of data-security policy and harness quantum science properties, thereby enabling businesses to increase returns on investment from existing assets and reduce data-security complexities.

MagiQ: A research and development company offering quantum cryptography systems. The company’s offerings include quantum cryptography, optical sensing applications for RF interference cancellation, and optical surveillance for advanced energy exploration, enabling customers to better communicate, safeguard and secure their worlds.

Quantinuum: A Honeywell spin-out, the company provides an open-access, architecture-independent quantum software stack and a development platform, enabling researchers and developers to work seamlessly across multiple platforms and tackle some of the most intriguing problems in chemistry, material science, finance, and optimization.

Nu Quantum: Developer of quantum cryptography systems designed to be more secure and time-efficient. By combining novel materials and semiconductor technology, the company has created a portfolio of patented, ground-breaking single-photon components fundamental to the realization of commercially viable photonic technologies, enabling the ultra-sensitive detection of light and the secure exchange of cryptographic keys worldwide.

ID Quantique: Provider of quantum-safe crypto services designed to protect data for the long-term future. The company offers quantum-safe network encryption, secure quantum key generation, and quantum key distribution, enabling financial clients, enterprises, and government organizations to solve problems by exploiting the potential of quantum physics.

Some of these companies are now publicly traded or about to go public, others are private but well-funded by preeminent venture firms or other institutions.  Many are independent and working hard to establish a strong position in the ecosystem.  Stay tuned to this blog for future reports which will showcase some of the individual players and investment opportunities.

References:

Nature, “Building logical qubits in a superconducting quantum computing system,” Gambetta, Chow and Steffen, January 13, 2017

AI Multiple, “QC Companies of 2021: Guide based on 4 ecosystem maps,” Cem Dilmegani, January 1, 2021

Fact Based Insight, Accessed December 2021

Qubits: A Primer

In a prior post about Superposition and Entanglement (click here to re-read), we learned that superposition allows a qubit to hold a value of not just “0” or “1” but both states at the same time, enabling simultaneous computation.  Entanglement allows one qubit to share its state with other qubits, so the information or processing capacity doubles with each additional entangled qubit.  These two features of Quantum Computing, embodied by “qubits,” enable it to perform certain types of calculations substantially faster than existing computers, and underlie the vast potential of Quantum Computing.  In this post I will describe how qubits are currently made and controlled.
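To make these two ideas concrete, here is a minimal sketch in plain Python/NumPy (deliberately avoiding any particular vendor’s SDK) that represents qubit states as vectors of amplitudes, puts one qubit into superposition with a Hadamard gate, and then entangles it with a second qubit using a CNOT gate to form a “Bell state”:

```python
import numpy as np

# Single-qubit basis state |0> as an amplitude vector
zero = np.array([1, 0], dtype=complex)

# Hadamard gate: puts a qubit into an equal superposition of |0> and |1>
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

# CNOT gate: flips the second (target) qubit when the first (control) qubit is |1>
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Start both qubits in |0>, i.e. the two-qubit state |00>
state = np.kron(zero, zero)

# Superposition: apply H to the first qubit (identity on the second)
state = np.kron(H, np.eye(2)) @ state

# Entanglement: apply CNOT, producing the Bell state (|00> + |11>) / sqrt(2)
state = CNOT @ state

# Measurement probabilities: 00 and 11 each occur 50% of the time,
# and the two qubits' outcomes are always correlated
for label, amp in zip(["00", "01", "10", "11"], state):
    print(label, round(abs(amp) ** 2, 3))
```

Note that describing just two qubits already requires four amplitudes; each additional qubit doubles the size of the state vector, which is exactly the doubling of information capacity described above and the reason classical machines struggle to simulate more than a few dozen qubits.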

There are competing requirements at play, which make qubit construction and manipulation exceedingly difficult, although not impossible.  On the one hand, for qubits to be as stable as possible, they need to be immune to external influences such as temperature changes, electromagnetic radiation and vibrations, so that they stay in their “state” until we need to use them.  However, that same isolation makes them very difficult to manipulate.  In addition, qubits operate based on quantum mechanics, which is the physics of incredibly small objects such as individual electrons (often measured by their spin, which is either “spin up” or “spin down”) or photons (measured by their polarization, which is either horizontal or vertical).  Controlling an individual electron or photon adds another layer of difficulty due to their extremely small scale.

When bits for classical computing were first developed, several different transistor designs were tried before the industry settled on the MOSFET (metal-oxide-semiconductor field-effect transistor).  Similarly, today there are many ways to create a qubit.  The following is a brief overview of some of the more common types:

Superconducting Qubits: Some leading Quantum Computing firms, including Google and IBM, are using superconducting transmons (an abbreviation derived from “transmission line shunted plasma oscillation qubit”) as qubits.  The core of a transmon is a Josephson junction, which consists of a pair of superconducting metal strips separated by a tiny gap of just one nanometer (less than the width of a DNA molecule).  The superconducting state, achieved at near absolute-zero temperatures, allows a resistance-free current to oscillate back and forth around a circuit loop.  A microwave resonator then excites the current into a superposition state, and the quantum effects result from how the electrons cross this gap.  Superconducting qubits have been used for many years, so there is abundant experimental knowledge, and they appear to be quite scalable.  However, the requirement to operate near absolute zero adds a layer of complexity and makes some of the measurement instrumentation difficult to engineer in the low-temperature environment.

Trapped Ions: Another common qubit construct utilizes the electrical charge that certain elemental ions carry.  Ions are normal atoms that have gained or lost electrons, thus acquiring an electrical charge.  Such charged atoms can be held in place with electric fields, and the energy states of the outer electrons can be manipulated using lasers to excite or cool the target electron.  These target electrons move or “leap” (the origin of the term “quantum leap”) between outer orbits as they absorb or emit single photons.  These photons are measured using photomultiplier tubes (PMTs) or charge-coupled device (CCD) cameras.  Trapped ions are highly accurate and stable, although they are slow to react and need the coordinated control of many lasers.

Photonic Qubits: Photons have no mass or charge and therefore do not interact with each other, making them ideal candidates for quantum information processing.  Photons are manipulated using phase shifters and beam splitters and are sent through a maze of optical channels on a specially designed chip, where they are measured by their horizontal or vertical polarization.

Semiconductor/Silicon Dots: A quantum dot is a nanoparticle created from a semiconductor material such as cadmium sulfide or germanium, but most often from silicon (due to the large amount of knowledge derived from decades of silicon chip manufacturing in the semiconductor industry).  An “artificial atom” is created by adding an electron to pure silicon and holding it in place with electrical fields.  The spin of that electron is then controlled and measured via microwaves.

Diamond Vacancies: There is a well-known defect that can be manufactured into artificial diamonds, which leaves a nitrogen vacancy inside the diamond that is filled by a single electron.  The spin of this electron can then be manipulated and measured with laser light.  This technology can operate at room temperature and ambient pressure, which are extremely positive attributes, although it has so far proven very difficult to scale to large numbers of qubits.

Topological Qubits: Quasiparticles can be observed in the behavior of electrons channeled through semiconductor structures.  Braided paths can encode quantum information via electron fractionalization and/or ground-state degeneracy, which can be manipulated via magnetic fields.  While this form of qubit is only theoretical at this point, it is being pursued by some large players, including Microsoft.

There are a few other approaches, including Neutral Atoms, Nuclear Magnetic Resonance (which remains experimental and is very difficult to scale) and Quantum Annealing (used by D-Wave, one of the first firms to offer commercial “Quantum” computers, although annealing is not a true gate-capable construct), and it is likely that more methodologies will be developed.  Hopefully this provides a high-level flavor for the various types of qubits.  The good news is that many entities have created, manipulated and measured qubits, and there has often been success in controlling them into superpositions and in entangling a limited but growing number of qubits at a time.

The following table summarizes some of the benefits and challenges along with selected current proponents of key qubit technologies currently in use:

The “Noisy Intermediate-Scale Quantum” (or “NISQ”) Environment

In prior posts I have covered how qubits use superposition and entanglement to enable massive processing speed for certain applications.  However, the technical and manufacturing challenges noted above for the various qubit types have so far prevented the construction of a very large, error-free system.  There are many competing strategies for creating qubits, each with a different set of advantages and challenges.

In order to have a Quantum Computer that can demonstrate supremacy over a classical computer, it is estimated that we need at least ~100 “logical” qubits, meaning 100 qubits that maintain their fidelity and coherence for as long as needed to perform a desired analysis.  However, as noted above, qubits are unstable, are easily affected by environmental factors, and are difficult to keep entangled.  These challenges are generally referred to as “noise,” hence the “N” in NISQ.  One way to address this noise is to allocate additional qubits to check on or correct the target qubit.  Currently, it is thought that as many as 1,000 “physical” qubits may be required to ensure stable utilization of one “logical” qubit, and many firms are focusing exclusively on quantum error correction schemes to address this challenge.  Therefore, to create a Quantum Computer with 100 logical qubits in the NISQ phase of quantum computing, 100,000 – 1,000,000 physical qubits are being targeted, as sketched below.  To date, the most entangled qubits reported are still measured in the hundreds, so there is a long way to go.  That said, this is now an engineering challenge more than a theoretical one, and many of the companies noted in this blog have announced product roadmaps to reach 1,000,000 active physical qubits within the next five years or so.
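As a back-of-the-envelope illustration of that arithmetic (the overhead ratio is an assumption that varies with the error-correction scheme and the underlying hardware error rates):

```python
# Rough NISQ-era sizing arithmetic (illustrative assumptions, not a specification)
logical_qubits_needed = 100               # rough threshold for useful advantage, per the text
physical_per_logical = (1_000, 10_000)    # assumed range of error-correction overhead

low = logical_qubits_needed * physical_per_logical[0]
high = logical_qubits_needed * physical_per_logical[1]
print(f"Physical qubits targeted: {low:,} to {high:,}")   # 100,000 to 1,000,000
```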

An alternative or competing framework is to create error-correcting qubits.  Today’s transistors have error correction built in, so they operate at extremely high accuracy rates.  The hope is that a method of qubit construction can be devised that self-corrects, obviating the need for the massive error-correction overhead implied by the 1,000:1 ratio of physical to logical qubits noted above.

How do we measure Qubit Performance?

Unfortunately, there is no common, agreed-upon set of metrics to allow apples-to-apples comparisons among the various Quantum Computing configurations.  A few important measurement factors include the number of operations that can be performed before an error, gate fidelities, and gate speeds.  IBM has proposed a “Quantum Volume” construct, intended to provide a single-number metric that factors in several key items in order to quantify the largest random circuit of equal width and depth that the Quantum Computer can successfully implement.  While the idea of a single, agreed-upon metric has broad interest, not everyone agrees with the IBM methodology, so a universal standard is still not available.  In the meantime, two great resources for tracking and comparing qubit/Quantum Computer performance metrics include the Quantum Computing Report and Fact Based Insight; I’ve provided hyperlinks to their qubit dashboards.
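To give a flavor of how the Quantum Volume number is produced, here is a simplified sketch of the bookkeeping (the real IBM protocol runs many random square circuits per size and adds statistical-confidence requirements on the heavy-output fraction; the results dictionary below is hypothetical):

```python
# Simplified Quantum Volume bookkeeping (illustrative only)
def quantum_volume(heavy_output_fraction_by_size):
    """Map of circuit size n (width = depth) -> measured heavy-output fraction."""
    passing = [n for n, frac in heavy_output_fraction_by_size.items() if frac > 2 / 3]
    return 2 ** max(passing) if passing else 1

# Hypothetical results for random square circuits of size n
results = {2: 0.84, 3: 0.79, 4: 0.72, 5: 0.68, 6: 0.61}
print(quantum_volume(results))   # 32, i.e. 2**5, since the size-6 circuits fail the 2/3 test
```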

So, until a universal standard emerges, objective assessment of Quantum Computer performance requires acknowledging the various desired attributes and taking a holistic view of such performance announcements.

References:

Images from Science, C. Bickel, December 2016 and New Journal of Physics, Lianghui, Yong, Zhengo-Wei, Guang-Can and Xingxiang, June 2010

“What Happens When ‘If’ Turns to ‘When’ in Quantum Computing?”, Bobier, Langione, Tao and Gourevitch, BCG, July 2021

Fact Based Insight, Accessed December 2021

7 Primary Qubit Technologies for Quantum Computing, Dr. Amit Ray, December 10, 2018

Inside the race to build the best quantum computer on Earth, Gideon Lichfield, MIT Technology Review, February 26, 2020

Follow the Money…the Quantum Computing Goldrush

You’re likely thinking to yourself, “OK, I see there is some potential in Quantum Computers, and some theoretically important use cases, but nobody has created a robust working Quantum Computer…existing qubits only stay coherent for milliseconds at best, so isn’t this all just hype?”

While no one can say for sure, my suggestion, paraphrasing Deep Throat’s instructions to Bob Woodward, is to “follow the money.”

The amount of funding being dedicated to Quantum Computing on a global basis is staggering.  Governments, private companies, venture firms and academic institutions are all committing huge sums of money and resources to this field.  While investment flows are no guarantee of future value, there is a broad common theme to push the development of Quantum Computers, and the equivalent of the modern “space race” is garnering growing attention in the media.  Given the awesome power, potential and disruption that Quantum Computers can deliver, these trends should not be surprising.

The industry is at an interesting crossroads, where it has evolved from an esoteric theoretical construct to having many dozens of firms and academic institutions creating actual working (albeit still not very powerful) Quantum Computers. The challenge now is an engineering one, not a theoretical one. And with the growing pull of resources, it should be expected that engineering challenges will be overcome and developments will accelerate. When integrated circuits were still being created in the 1950s, very few people could have imagined the boon they would create. Things like personal computers, cellular phones or the Internet were not yet contemplated. Even when PCs became available in the early 1980s, many were skeptical that there was an actual market for such an esoteric device. In fact, here is a reprint of an editorial by William F. Buckley Jr., as printed in the Lancaster New Era on July 19, 1982, in which he muses that he cannot fathom any possible way a personal computer could be useful in the home:

Not surprisingly, his point-of-view was strictly in the context of the written word, since he was a writer, so his myopia makes contextual sense. Given that Quantum Computers are based on a completely different set of physics, logic gates and architecture, I am confident that the use cases will expand well beyond any currently contemplated uses and that current skeptics should try to maintain an open mind.

Government Directed Quantum Computing Investments

As can be seen in the chart below, the top ten countries focused on Quantum Computing technology have recently invested or committed over $21 billion towards this field:

The breadth and depth of these commitments are catalyzing the industry and I expect these trends to continue, so even excluding private company investment, there will be significant advancements achieved at the national level.

Major Current Players

Some of the largest players in the technology space have already dedicated large departments or divisions to Quantum Computing, and lead the push to broad adoption, as highlighted below:

Many are already offering their own quantum software platforms and providing early access to prototype machines over the web. For example, anyone can download the IBM Qiskit open-source Quantum Software Development Kit (SDK), create programs and run them on an IBM quantum emulator. Similarly, you can work with Google’s Cirq, Microsoft’s Azure Quantum tools, Alibaba’s Aliyun quantum services, and others. These firms are leveraging their broad infrastructure, technological resources and established web-based platforms to expand access to, and utilization of, evolving Quantum Computing resources. In addition, in June Honeywell agreed to invest $300 million into its Quantum Computing unit after it merged with Cambridge Quantum Computing.
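As a taste of what that access looks like, here is a minimal “hello, quantum world” sketch in Qiskit (assuming a recent Qiskit installation; simulator interfaces have shifted between versions, so this uses the built-in statevector utilities rather than a specific backend). It builds the same Bell state described in the qubit primer above and samples measurement outcomes:

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

# Two-qubit circuit: Hadamard puts qubit 0 into superposition,
# CNOT entangles qubit 0 with qubit 1, producing a Bell state
qc = QuantumCircuit(2)
qc.h(0)
qc.cx(0, 1)

# Simulate the ideal (noise-free) state and sample 1,000 measurement shots
state = Statevector.from_instruction(qc)
print(state.probabilities_dict())   # {'00': 0.5, '11': 0.5}
print(state.sample_counts(1000))    # roughly half '00' and half '11'
```

Running the same circuit on real hardware is then largely a matter of pointing the program at a cloud backend through the vendor’s account service rather than rewriting the algorithm.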

Venture Investment in Quantum Computing

In addition to the large government programs and the major push by leading technology firms, there is a growing and accelerating focus on Quantum Computing among venture investors. According to the Quantum Computing Report, there have been more than 450 venture investments in Quantum Computing companies made by more than 300 different venture investment firms.  Echoing the growth of Silicon Valley companies funded by legendary Sand Hill Road venture investors, current venture investors are making increasingly large and diverse bets on many parts of the Quantum Computing ecosystem.  The following chart showcases aggregate venture investments in each of the past three years (with more than a month still left in 2021):

A few venture firms have focused on Quantum Computing investments, with 17 firms making 3 or more such investments and with two (Quantonation and DCVC) making 10 or more each, as highlighted in the following table:

Not only has the playing field for Quantum Computing investments been growing, but there have been some very significant investments made. The following highlights some of the larger announced venture investments:

Sources: PitchBook, Boston Consulting Group

Of these companies, IonQ became the first-ever pure-play Quantum Computing company to go public, debuting on the NYSE on October 1, 2021 and as of Nov. 23rd had a market capitalization of $4.8 BILLION. Rigetti Computing also recently announced it would be going public in an expected $1.5 billion reverse merger with a SPAC. The latest PsiQuantum investment was announced this past summer and included a $450 million investment at a valuation exceeding $3 billion, with ambitious plans to build a commercially viable Quantum Computer by 2025.

University Focus on Quantum Computing

Quantum computing and quantum information theory have gone from being a fringe subject to a full complement of classes in well-funded programs at quantum centers and institutes at leading universities.  Some world-class universities offering dedicated Quantum Computing classes and research efforts include:

  • University of Waterloo – Institute for Quantum Computing
  • University of Oxford
  • Harvard University – Harvard Quantum Initiative
  • MIT – Center for Theoretical Physics
  • National University of Singapore and Nanyang Technological University – Centre for Quantum Technologies
  • University of California Berkeley – Berkeley Center for Quantum Information and Computation
  • University of Maryland – Joint Quantum Institute
  • University of Science and Technology of China – Division of Quantum Physics and Quantum Information
  • University of Chicago – Chicago Quantum Exchange
  • University of Sydney, Australia
  • Ludwig Maximilian University of Munich – Quantum Applications and Research Lab
  • University of Innsbruck – Quantum Information & Computation

These Colleges and Universities, as well as many others, continue to add courses and departments dedicated to Quantum Computing.

We are witnessing an unprecedented concentration of money and resources focused on Quantum Computing, including substantial government initiatives, major industrial commitments, accelerating venture investment and evolving university programs. While not every investment will pay off, and the landscape continues to evolve, serious, smart money is backing this trend. The clear message is that this concentration of resources will lead to engineering breakthroughs and immense value creation. There are now hundreds of companies jockeying for position in this evolving field. Stay tuned to this blog as we watch for the winners and losers.


References:

Jean-Francois Bobier, Matt Langione, Edward Tao and Antoine Gourevitch, “What Happens When ‘If’ Turns to ‘When’ in Quantum Computing”, Boston Consulting Group, July 2021.

Alamira Jouman Hajjar, “33+ Public & Private Quantum Computing Stocks,” AI Multiple, May 2, 2021

Inside Quantum Technology News, Government Investments in Quantum Computing Around the Globe, May 31, 2021.

Pitchbook Database, Retrieved November 2021

Universities With Research Groups — Quantum Computing Report, Retrieved November 2021

Venture Capital Organizations — Quantum Computing Report, Retrieved November 2021