The Quantum Computing Elevator Pitch

Readers of this blog may have read prior posts where I attempt to summarize the features and benefits of Quantum Computing (QC) that underlie its enormous power and potential to transform computing.  I have tried to do so without layering in too much math or physics, and I hope you have found these posts helpful.

In this post I will summarize the key aspects of QC in an “elevator pitch” without any math or physics, intended to pique your interest in digging in and learning more. Before I launched this site/blog, I wanted to understand some of the underlying quantum physics, linear algebra and computational theory associated with qubits to satisfy my own curiosity, but none of that is essential to appreciate the power of QC.  Here is the “pitch” (useful if the elevator is only going up a couple of floors):

Quantum Computers will transform the way we use computers by massively accelerating certain computations and, more importantly, by enabling a wholly new form of computing.

That is the headline and take-away message.  Many have conveyed the first part (speedup of computing), but the second part is less noted, although more potent.  Here are some further details to support this statement (if the elevator trip is slightly longer):

  • Taming tiny particles (atoms, electrons, photons, etc.) enables transformative computational power. 
  • At this tiny scale, matter behaves both like a particle and a wave.  We are familiar with each behavior on its own, but we struggle to picture particle and wave features occurring at the same time, as they do in QCs.
  • This particle-wave duality underpins the features of superposition and entanglement, from which the power of QC is derived.  Superposition simply means that each computing bit, or “qubit”, can be 1, 0, or a combination of both.  Entanglement simply means that qubits can be connected to and dependent on each other, enabling simultaneous processing/computation.
  • Over the past 100 or so years, we have refined our ability to control these tiny particles, and actual working quantum computers now exist and can be accessed today over the cloud, although these early QCs are not yet more powerful than classical computers.
  • Now that QC has moved from theoretical to practical, billions of dollars and enormous resources are being funneled into the QC space by governments, major corporations, new companies, venture investors and academic institutions in order to perfect and leverage this new computing power.
  • Over the next few years, QC power should increase to the point where it can be used to understand chemical reactions in a way that leads to new medicines, and to design highly specialized materials that improve batteries, solar power, fertilizers and more, among other remarkable advances.
  • However, the really exciting applications of QC will arise not because QCs can do current computing faster (which is immensely valuable) but because QCs will let us tackle problems that at present we don’t even bother trying with regular computers because we know they are much too hard, or enable us to provide answers to questions we haven’t even thought to ask. [Rudolph, 2017]

This is the spine-tingling, wide-open blank canvas that makes Quantum Computing so exciting for me (and, by the time you finish this post, hopefully for you too).


Now, let’s break this down into a few baby steps.

You don’t need to understand the physics of most technologies in order to use and benefit from them

New technologies are hard to appreciate and understand, but they can be transformative to society even if users don’t understand how they work.  Many people who are new to QC get tripped up trying to understand exactly how the underlying quantum mechanics “work”.  Unfortunately, understanding the quantum physics is extremely challenging because it involves a scale so small that it is hard to relate to from our human-scale frame of reference.  However, if you think back to prior transformational technologies, you will likely note that most were (and still are) not understood by lay people.  Here are a few examples:

  • Radio (and television)
  • Electricity
  • Integrated Circuits

In the late 1800s we mastered “waves” in order to transmit voice signals across long distances.  We are all familiar with AM/FM radios but likely do not understand the physics of “amplitude modulation” for AM or “frequency modulation” for FM.  We cannot see radio waves with our eyes.  Yet we can all enjoy listening to music in our cars or speaking with loved ones on our phones.

Also in the late 1800s (specifically on September 4, 1882) New York City was illuminated by electrical light for the first time, showcased by the lighting of the New York Times building (as depicted in episode 7 of the Gilded Age).  During the subsequent decades, there was raging debate about the dangers of electricity and the relative strengths/weaknesses of AC vs DC power.  Despite these debates, electrical wires were installed throughout the country/world, and today’s electrical grid is a complex and inter-connected wonder.

You take for granted that you can plug a lamp into a power socket in your home and instantly have “light”.  You may not realize that the electrical energy lighting that bulb’s filament arrived nearly instantly from a power plant many miles away.  You do not need to know anything about electricity to turn on that lamp and enjoy its benefits.  Similarly, when PCs were first released, many critics were baffled as to why anyone would want a home computer, and certainly did not understand how they worked.  Yet today, nearly everyone has access to PCs and they are ubiquitous in business and education.  People use them daily without any underlying appreciation for how integrated circuits function.

The power and potential of QC can rival the transformative impacts of radio, electricity, and integrated circuits, and it will do so whether or not users understand the inner physics.

What’s with the Particle/Wave Duality?  I thought you promised no math or physics…

Yes, I understand this sounds very “science-y” but it is quite straightforward.  We are all familiar with the way particles behave.  We don’t have to understand the underlying Newtonian physics to appreciate the way billiard balls move on the pool table, the way the golf ball travels when you hit it, or for that matter, the way the tides respond to the movement of the moon.  The actions and reactions of “particles” that we experience in our frame of reference are intuitively understood even if the underlying physics are not.

Similarly, we understand and appreciate the behavior of waves.  If we throw two stones into a pond we can see the resulting ripples (waves) in the water and we can see how those waves interact with each other.  We can wave a streamer and see the waveform in the ribbon.  And anyone who has used noise-cancelling headphones can appreciate how the noise disappears once that feature is engaged.  Noise cancellation works because the unit “listens” to the ambient sound and creates a sound wave that is the opposite of (out of phase with) what it hears.  When the two waves reach the ear together, they cancel each other and we hear silence.
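If you like seeing ideas in code, here is a tiny illustrative sketch (my own, not drawn from any headphone maker’s documentation) of that cancellation: a wave added to its exact opposite sums to nothing.

```python
import numpy as np

t = np.linspace(0, 1, 1000)          # one second of "time"
noise = np.sin(2 * np.pi * 440 * t)  # a 440 Hz tone standing in for ambient noise
anti_noise = -noise                  # the headphones' inverted (out of phase) copy

combined = noise + anti_noise        # what actually reaches your ear
print(np.max(np.abs(combined)))      # 0.0 -- the two waves cancel completely
```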

In Quantum Computing, the physical qubits (electrons, atoms, or photons, etc.), because they are so tiny, behave a bit like particles and a bit like waves.  It is not important to understand precisely how, but the “wave” aspect, similar to the “cancellation” that happens in noise cancelling headphones, is part of what empowers Quantum Computers to process information so much faster than classical computers by amplifying what is being sought and/or cancelling out what is not. 

Transformative technologies create transformative wealth

It is certainly not an overstatement to say that integrated circuits have wildly transformed society.  There are now integrated circuits in nearly every powered device.  Everything from kids’ toys to smart thermostats to cell phones is dependent on integrated circuits. And this transformation is evidenced in enormous wealth creation.  Look at the top ten companies, by market capitalization (as of 3/20/22):

  • Apple ($2.8 trillion)
  • Saudi Aramco ($2.3 trillion)
  • Microsoft ($2.2 trillion)
  • Alphabet/Google ($1.8 trillion)
  • Amazon ($1.7 trillion)
  • Tesla ($1.0 trillion)
  • Berkshire Hathaway ($771 billion)
  • NVIDIA ($675 billion)
  • Meta/Facebook ($608 billion)
  • Taiwan Semiconductor ($555 billion)

Seven of these top ten companies are directly in the “integrated circuit” business, either as a manufacturer or for their primary value proposition (Aramco, Tesla and Berkshire Hathaway are the exceptions, although you might argue that Tesla’s cars could not operate without integrated circuits and Berkshire has large positions in tech companies).  These seven companies have created over $10 trillion in wealth, which is a staggering amount, and they have done so in an astonishingly short period of time.  Those seven companies combined are worth more than the GDP of every country on the planet except the US and China.  I won’t be so bold as to definitively say that QC will do the same (or more accurately, I won’t predict when), but the potential for QC to create wealth at these levels certainly exists.

A Quantum Computer, by definition, is a Computer.  What is so Different?

Today’s personal computers are awesome and have power that would have been considered unimaginable just a few decades ago.  My current PC (a Dell OptiPlex 7780) has 64 GB of RAM and runs at 2.90 GHz.  That means its RAM, or random access memory (the playing field for my machine’s computations), holds 64,000,000,000 bytes, or 64 billion units, and its processor ticks through 2.9 billion clock cycles every second.  Think about that.  My basic desktop computer has billions of memory units and a processor cycling 2.9 billion times per second.  That means that in the 6 seconds it took me to type this sentence, my computer could step through over 17 billion cycles of computation.
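Here is that back-of-envelope arithmetic spelled out (the hardware figures are the ones quoted above; only the multiplication is new):

```python
ram_bytes = 64 * 10**9     # 64 GB of RAM, counted loosely as 64 billion bytes
clock_hz = 2.9 * 10**9     # 2.90 GHz = 2.9 billion clock cycles per second
seconds_typing = 6

print(f"{ram_bytes:,} bytes of memory")
print(f"{clock_hz * seconds_typing:,.0f} clock cycles while typing")  # 17,400,000,000
```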

That sounds unfathomably powerful and fast to me, and it is.  So why could we possibly need even faster calculations, and what can’t I already do on my Internet-connected machine that I might want to do?  Answers to that question are where Quantum Computing gets fun and exciting.  The purpose of today’s post is not to explain all of the details, physics or specific use cases, but rather to excite you enough to want to learn more about those things.  So, let’s change the perspective a bit (pun intended for those of you familiar with the Hadamard gate).

Because QCs can utilize superposition and entangled qubits, they approach calculations and algorithms differently from classical computers.  As a reminder: a) qubits operate in three dimensions; b) QC gates are more complex than the AND/NOT/OR gate functions of classical computers; c) quantum algorithms are bi-directional (reversible); and d) results are probabilistic (not deterministic) [see here for a prior post which explains these features in greater detail].
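For readers who like to see things concretely, here is a minimal NumPy sketch (my own illustration, not taken from that prior post) of a few of those features: superposition, reversible gates, probabilistic results and entanglement.

```python
import numpy as np

zero = np.array([1, 0], dtype=complex)         # the |0> state as a vector
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

superposed = H @ zero                          # an equal mix of |0> and |1>
print(np.abs(superposed) ** 2)                 # [0.5, 0.5] -- 50/50 measurement odds

# Gates are reversible: applying H a second time returns the original state.
print(np.allclose(H @ superposed, zero))       # True

# Results are probabilistic: sampling the superposed qubit many times
# gives roughly half 0s and half 1s.
samples = np.random.choice([0, 1], size=10_000, p=np.abs(superposed) ** 2)
print(samples.mean())                          # close to 0.5

# Entanglement: a CNOT gate links a second qubit to the first, so their
# outcomes become correlated -- only 00 or 11 is ever observed.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
bell = CNOT @ np.kron(superposed, zero)
print(np.abs(bell) ** 2)                       # [0.5, 0, 0, 0.5]
```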

So what do these features do that makes QCs so different from classical computers?  Let’s use an analogy to help convey this.  Classical computers are a bit like the radios that existed before television.  You could listen to sports events live without going to the stadium.  You could hear the news and listen to stories about faraway lands.  And you could be entertained for hours.  These were deeply satisfying activities in their day, but compare those machines with today’s high-definition big-screen televisions and the leap in information and entertainment is massive.

To extend this thought exercise a bit further, consider the evolution of the Internet.  When AOL was distributing their enrollment CDs seemingly everywhere and was the first company to make e-mail a household thing, most people did not imagine such technology would also let you sell your junk (eBay), get a ride to the airport (Uber), or find the answer to nearly any basic question nearly instantly (Google).

Putting it all Together

To paraphrase Terry Rudolph (Imperial College quantum physics professor and co-founder of Psi Quantum), explaining how quantum computers work is a bit like having someone describe van Gogh’s “Starry Night” after only seeing a black and white photograph of it, which has been chewed by a dog.  It is difficult to do it justice. 

There are countless articles about the power of Quantum Computers and the marvels society will enjoy once they are powerful enough.  This is generally framed in terms of combinatorics and what can be achieved if such calculations can be sped up considerably.  I am extremely excited about this aspect of QC and much of my blog writing has been to sing those praises.  However, what is most exciting to me is to contemplate the “eBays” and “Googles” of Quantum Computing.  What will we be able to do with this completely new form of computing?  What new questions can we ask and have answered?  What types of products, companies and industries will be created?  What programming masterpieces will programmers create with this new medium?

I look forward to finding out and hope you continue to join me on the journey.

Disclosure: I have no beneficial positions in stocks discussed in this review, nor do I have any business relationship with any company mentioned in this post.  I wrote this article myself and express it as my own opinion.


References:

Rudolph, Terry, “Q is for Quantum”, 2017

If you enjoyed this post, please visit my website and enter your email to receive future posts and updates: http://quantumtech.blog

Russ Fein is a venture investor with deep interests in Quantum Computing (QC).  For more of his thoughts about QC please visit the link to the left.  For more information about his firm, please visit Corporate Fuel.  Russ can be reached at russ@quantumtech.blog.

Learning More About Quantum Computing

[Image: Rigetti Computing chandelier]

As I’ve interacted with readers of the Quantum Leap, I’m often awed by the diversity and openness of so many of the folks involved in the Quantum Computing (QC) field.  However, I find that many people new to the field are still put off or intimidated by the concepts covered.  While I have tried to orient my posts so that non-technical readers can still benefit from them, it seems that many well-intentioned, intelligent readers are still baffled by most things “Quantum”.  And while I don’t have a monopoly on the best learning resources, I’ve been on a pretty long and deep dive in my own quantum journey, so I wanted to provide links to some resources to help my readers accelerate their quantum educations.  Most of these are available for free although some require modest subscriptions for premium content.

In addition, I’ll provide some guidance as follows:

  • Easy items include topics where ZERO math, physics or prior technical knowledge is needed.
  • Moderate items provide a bit more technical detail but should be approachable for those intellectually curious enough to dig in a bit deeper.  Many refer to some advanced concepts such as Entanglement and Superposition, but no formal math or physics training is required to understand the fundamental details.
  • Advanced items will be reserved for resources that are more technical in nature and/or would take a longer time to get through.

Finally, there is a plethora of resources available on the Internet, including articles, podcasts, blogs, interactive learning tools, classes, lectures, seminars and academic papers, delivered in a variety of ways.  I encourage you to try a few, and if one doesn’t resonate or feel helpful, skip to the next.  I hope these items accelerate your understanding and spur further learning.  This is only meant to be a sampling of resources that I have read or used myself and is not intended to be a complete directory.  If any of my readers find additional resources that should be added to this list, please reach out to me at russ@quantumtech.blog.

Essentials of Quantum Computing

For an EASY introduction, consider the following:

For a more MODERATE introduction, the following are included largely because they are longer and therefore take more time to review.  They contain some intermediate concepts but are generally still introductory and should be readable by nearly anyone curious about QC:

  • Research Piece: The Next Decade in Quantum Computing – and How to Play, by Philipp Gerbert and Frank Ruess, Boston Consulting Group, November 2018: a good overview including some technical details and potential use cases.
  • Research Piece: The Coming Quantum Leap in Computing, by Anant Thaker and Suhare Adam, Boston Consulting Group, May 16, 2018: Mostly non-math oriented although there are some graphics depicting speedup, with some intermediate math concepts.  That said, you should be able to absorb the full content without any math background.
  • Research Piece: What Happens When ‘If’ Turns to ‘When’ in Quantum Computing, by Jean-Francois Bobier, Matt Langione, Edward Tao, and Antoine Gourevitch, Boston Consulting Group, July 2021: Another good overview with use cases and some details on various hardware approaches where math is not required.
  • Research Piece: Quantum computing: An emerging ecosystem and industry use cases, McKinsey & Company, December 2021: Another broad overview with details on players and use cases, heavy on details but light on math.
  • Research Piece: Economic-technological revolution through Quantum 2.0: New super technologies are within reach, by Dr. Hermann Rapp, Deutsche Bank, December 17, 2021: A detailed compendium broadly covering QC with very modest levels of technical details or math.
  • Research Piece: Broad Interest in Quantum Computing as a Driver of Commercial Success, by Bob Sorensen, Hyperion Research (sponsored by D-Wave), November 2021: A focus on use cases and potential commercial users.
  • Article: Inside the race to build the best quantum computer on earth, by Gideon Lichfield, MIT Technology Review, February 26, 2020: A broad overview with a deep dive into IBM’s QC history.
  • Article: What is Quantum Computing, CB Insights, January 7, 2021: A good mix of some modest technical details and examples along with a broad overview of use cases.
  • Online Tutorial: Q-CTRL’s Black Opal, a hands-on tutorial with excellent visualizations and short sessions.  There is also a fantastic “Practice” section which is a very visual tool for understanding how gates act on qubits (including animations).  There are three “beginner” modules on Superposition, Qubits and Measurement which I have included here as “intermediate” level.  More advanced modules are shown below.
  • YouTube: “Quantum Computing, Software and Tech” by Anastasia Marchenkova: A charming and approachable series of short videos on various introductory topics and concepts.
  • Book: Quantum Computing for Everyone, by Chris Bernhardt, published 2019: Another good intro book although the “for Everyone” in the title is a bit misleading since it digs in a bit on quantum theory and math.
  • Book: Quantum Boost, by Brian Lenahan, published 2021: A good intro book covering the basics of QC along with some specific applications.

And here are more ADVANCED introductions.  While you don’t need a formal linear algebra or physics background, these get a bit more technical and introduce some rudimentary linear algebra in the context of describing qubit gates.

  • Article: The Need, Promise and Reality of Quantum Computing, by Jason Roell, published in Medium February 1, 2018: Some good introductions to exponential speed-up, details on qubits, superposition and entanglement and quantum volume.
  • Article: Quantum Computing, by Sarvesh Patil, published in Medium May 16, 2021: Dense but strong overview.
  • Book: Quantum Computing: An Applied Approach, by Jack D. Hidary, published 2019: a great foundational overview with a practical approach to programming and detailed appendices focusing on the QC-specific applications of linear algebra.  There is a more recent edition as well as a great companion GitHub site (http://github.com/jackhidary/quantumcomputingbook).
  • Book: Quantum Computation and Quantum Information, by Michael Nielsen and Isaac Chuang, 10th Edition published 2010: This is one of the most cited books in physics of all time and is a standard college course textbook often referred to as “Mike & Ike”.  I have not read this yet but hope to this year, and include it based on its nearly-universal mention by advanced QC users.
  • Online Tutorial: Q-CTRL’s Black Opal, the hands-on tutorial with excellent visualizations and short sessions noted above.  There are three “Intermediate” modules on Circuits, Entanglement and Noise which I have included here as “advanced” level. 
  • Online Tutorial: Quantum Country, by Andy Matuschak and Michael Nielsen (of “Mike & Ike” fame), has a novel and engaging method of introducing complex quantum principles, including their own “mnemonic” approach which they hope will help users retain the information longer.  While this is included in the “advanced” section of this post, I highly recommend it for readers with some basic working understanding of quantum mechanics.

Hardware/Software/Key Players/Use Cases

While some of these topics are broadly covered in the resources noted above, here are some additional resources which drill down further on specific players and uses.

For EASY summaries, the following provide a broad overview:

Additional Resources

In addition to the articles, research pieces, on-line tutorials and books noted above, here are a few more places to continue your QC education:

Podcasts

Classiq publishes a weekly podcast called “The Qubit Guy’s Podcast,” hosted by CMO Yuval Boger, which I highly recommend (and for an added treat, make sure to check out his February 23rd interview of yours truly).  Yuval has an excellent interview style and his podcasts are interesting and right-sized.  I particularly recommend his interviews of Jack Hidary (Google/Sandbox/Alphabet), Dr. Robert Sutor (IBM) and Paul Lipman (Coldquanta), although all are worth listening to.

Inside Quantum Technology also showcases its own podcasts, hosted by Christopher Bishop, another highly skilled and interesting interviewer.  I suggest his interviews of Chad Rigetti (Rigetti Computing) and Ilyas Khan (Quantinuum).

Aggregators

The Quantum Insider, led by Alex Challans and Evan Kubes, is a great starting point for general QC information and includes directories of Companies, Investors, Funding Rounds, Universities, Government Entities and Quantum Users among other details.  They also publish a quarterly and an annual report as well as daily news aggregators.

Quantum Computing Report, put out by Doug Finke, is another aggregator that includes deep dives on players (segregated by Public Companies, Private Companies, Universities, Government and Venture) and hardware scorecards.  Doug is a key intermediary in the quantum realm and he is available for a variety of consulting assistance.

Fact Based Insight, founded by David Shaw, is yet another compendium of broad QC information including links to lots of introductory content as well as a daily news aggregator and various company directories and summaries.

Inside Quantum Technology News, in addition to hosting the excellent podcasts noted above, has a robust website and news aggregation, and hosts periodic industry events, like the upcoming Quantum Enterprise event May 10-12 in San Diego, CA.

The Qubit Report – Because Quantum is Coming, is another news aggregator with its finger on the pulse of current events, research, cybersecurity, software, business and technology among other important QC developments.  Their regular LinkedIn posts are a great way to stay current on evolving events and announcements.

Quantum Strategy Institute, led by Brian Lenahan, has a lot of good content including position papers and links and is focused on connecting customers with its network of cross-domain experts to provide consulting services.

Linear Algebra

The bad news is that in order to really understand and appreciate the power of QC, some basic linear algebra is required.  This is especially true when trying to understand how gates manipulate qubits.  While you don’t need to be an expert in all aspects of linear algebra, a working understanding of vectors, scalars, dot products (and orthogonal bases), matrix addition, matrix multiplication, eigenvalues and tensor products will enable a clearer understanding of gates and quantum algorithms.  The good news is there are a ton of good resources available for self-learning.  Here are two particularly good ones:

3Blue1Brown: a series of video lessons with excellent graphics to help visualize the concepts. The animation engine behind the graphics is a fantastic tool.

Khan Academy: Another resource for video lessons, with accompanying practice questions.
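To give a flavor of how those operations show up in quantum computing, here is a short illustrative NumPy sketch (my own, not drawn from either resource):

```python
import numpy as np

zero = np.array([1, 0])              # qubit basis state |0> as a vector
one = np.array([0, 1])               # qubit basis state |1>

print(np.dot(zero, one))             # 0 -- the basis states are orthogonal

X = np.array([[0, 1], [1, 0]])       # a gate is a matrix; X is the quantum NOT
print(X @ zero)                      # matrix multiplication flips |0> into |1>

print(np.linalg.eigvals(X))          # eigenvalues (+1 and -1) label measurement outcomes

two_qubits = np.kron(zero, one)      # the tensor product builds the 2-qubit state |01>
print(two_qubits)                    # [0, 1, 0, 0]
```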

Other Resources

Medium: is a platform for self-publishing, used by a myriad of authors and readers.  It doesn’t have any particular industry focus, although it can be searched by topics and there are many excellent articles on Quantum Computing included.  In addition, some users curate newsletters (“publications”) aggregating content from various authors. 

qBraid QuBes Course: qBraid is focused on helping high school students, but their content is well done, intuitive and thorough, combining video lectures with coding examples.  While some topics dive a bit deeper than non-science readers might normally go, because it is oriented towards high school students you should be able to get through the material without a deep math or science background.

Qmunity.tech: Q-munity is a 501(c)(3) non-profit aiming to connect and teach young people about Quantum Computing.  They offer a number of free and paid courses on QC, and while some include technical details, they are geared towards high school students so should be relatively approachable.

Disclosure: I have no beneficial positions in stocks discussed in this review, nor do I have any business relationship with any company mentioned in this post.  I wrote this article myself and express it as my own opinion.


The Case for an Annual “Quantum Games” Competition

While the amount of innovation and technical advancement in the Quantum Computing (QC) realm has been incredible over the past 12 months or so, it is still very hard to quantify the power of existing QCs, to compare one QC against another, or to compare a QC against a classical computer.

Last week (February 23rd) IonQ boldly announced that their latest QC named Aria with 32 qubits, “has achieved a record 20 algorithmic qubits and has furthered its lead as the most powerful quantum computer in the industry…”  But is this actually true?  In November 2021 IBM unveiled their Eagle 127-qubit quantum processor.  Isn’t 127 a lot more than 32?  What gives here?

In this post I suggest an annual “Quantum Games”, a world Olympics of sorts, in order to spur innovation, friendly competition and collaboration.  I’ll describe this in more detail towards the end of this post, but first, let’s set out some of the parameters in QC so that these games have added context.

The Guts of Quantum Computation

While digging into all the details of the full quantum stack and the various types of algorithms being written and run on existing QCs is beyond the scope of this post, some background will be helpful in understanding the nuances involved in building, operating and measuring QCs.

The fundamental core of a QC is its qubits, which can be electrons, atoms, photons or other tiny elements.  Storing such tiny elements in a given state, then precisely manipulating and measuring them, poses significant challenges, including maintaining near absolute-zero temperatures and/or vacuums.  Over the past 12 months or so, QC companies have gone from creating machines with tens of qubits to machines with hundreds of qubits, with many predicting that this “order of magnitude increase” can be repeated each year.  It is generally thought that we will need to implement ~1,000,000 physical qubits in order to achieve consistent quantum advantage (i.e., when QCs can surpass classical computers performing real-world applications), so if that cadence of 10x improvement per year can be maintained, quantum advantage could be achieved within 4-5 years.
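As a rough sanity check of that timeline, here is the arithmetic (my own back-of-envelope math; the qubit counts and the 10x-per-year cadence are the assumptions stated above):

```python
import math

current_qubits = 100        # "hundreds of qubits" today, taken conservatively
target_qubits = 1_000_000   # rough threshold cited for consistent quantum advantage

years_needed = math.log10(target_qubits / current_qubits)  # years at 10x per year
print(f"about {years_needed:.0f} years")                   # about 4 years
```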

However, there are many other factors in building and implementing QCs beyond simply the number of qubits.  QCs derive most of their computational advantages from the principles of Superposition and Entanglement, but because qubits are so sensitive and fragile, any noise in the system threatens to sabotage the computational power via decoherence.  Therefore, in the current NISQ (noisy intermediate-scale quantum) environment, a lot of the qubits are earmarked for error correction.  In addition, due to the no-cloning property of quantum mechanics, there is no “quantum RAM”, and therefore some of the qubits need to be allocated to storage overhead (i.e., noting the result of a predecessor calculation in an algorithm).  Without digging into all of the technical detail, you can think of QCs needing to address all of the following:

  1. Placing all the physical qubits in an initial state, including requisite cryogenics, vacuums, microwave pulses and/or laser pulses, etc.
  2. Manipulating the various qubits to establish Superposition
  3. Applying gates to the qubits to program algorithms, including entangling certain qubits
  4. Applying error correction overhead to confirm the algorithms are performing the desired calculations before decoherence
  5. Applying compiler-level logic as well as various other layers in the QC stack
  6. Measuring the readouts of the tens of thousands of “shots” of each algorithm run (QC results are probabilistic, so each circuit is run many times and the readouts are averaged to converge on the answer; see the sketch after this list)
  7. Resetting the system between calculations

While the above list is not meant as an actual blueprint, it is intended to give some sense to the various activities underway in a working QC.  There are performance bottlenecks and areas for performance enhancement in each of these activities.  Let’s categorize them for ease of further discussion.

Key QC Performance Metrics

There are four core functions or parameters of performance needed to measure QC power:

  1. Scale, or the total number of qubits
  2. Quality, or the ability of the qubits to implement circuits before errors enter the system
  3. Speed, or the number of circuits that can be implemented at a given time
  4. Context, or the type of calculation being measured.  Some benchmarks focus on the physical system and others on applications; some focus on simulation and others on optimization, etc.

[A table comparing four proposed benchmarking strategies, based on BCG analysis which included many other competing benchmarks, appears in the original post.  See References for the link.]

Here is a bit more color on each of these four proposed benchmarking strategies.

IBM: Has proposed a three-pronged set of metrics including the number of qubits, quantum volume (an indication of the quality of circuits and how faithfully circuits are implemented) and speed as measured in CLOPS (circuit layer operations per second), which indicates how many circuits can run in a given time.  While this seems like a fairly straight-forward and objective set of metrics, the criticism has been that the metrics are based on a random set of gates (the theory being that this keeps them objective), and therefore they don’t factor in real-world usage.

QED-C: The US Quantum Economic Development Consortium, which was established as a result of 2018’s National Quantum Initiative Act, has developed a suite of application benchmarks targeted towards practical applications and based on executing common quantum algorithms or programs.  Given that these benchmarks were derived from industry input, this seems like a broadly validated set of measurements.

IonQ: Has proposed #AQ, or algorithmic qubits, as the yardstick, and has used this standard to perform apples-to-apples comparisons with other leading QC makers.  They claim that by using the series of algorithmic benchmarks developed by QED-C, they are featuring important real-world algorithms, and that by tracking one metric (#AQ), they offer an easy measurement to track and compare.  They claim that having an #AQ of 20 means they can execute a reference quantum circuit over 20 qubits that contains over 400 (20 x 20) entangling gate operations and expect the results to be correct with meaningful confidence.  Their latest announcement charts this metric for the new Aria machine against Quantinuum’s Model H1.1, IBM’s Falcon and Rigetti’s Aspen M-1, with the size of the rectangle outlined in pink denoting the QC “size” (the chart appears in the original post).

SupermarQ: Just last week Super.tech released SupermarQ, another application-centric benchmarking suite for QCs.  The target applications mirror real-world problems in a variety of domains such as finance, chemistry, energy and encryption. 

While these are some useful ways to consider measuring QC performance, it is important to realize that these firms are battling over very modest performance yardsticks in the grand scheme of QC’s eventual potential.  If we assume a scale of 1-100 where 100 is a robust QC that consistently achieves quantum advantage, current machines are roughly in the 5-10 range, so arguing whether a given machine is a 5 out of 100 or an 8 out of 100 is not that meaningful in a practical sense.

That said, in addition to the metrics proposed in the table above, there are other proposed benchmarking strategies including Mirror Circuits by Sandia National Labs, Quantum LINPACK by UC Berkeley, and Q-Score by Atos, among others.  In fact, to provide standards against which to measure quantum computing progress and drive current research toward specific goals, DARPA announced its Quantum Benchmarking program.  Its aim is to re-invent key quantum computing metrics, make those metrics testable, and estimate the quantum and classical resources needed to reach critical performance thresholds.

For now, my advice is to use caution when describing the power of a given Quantum Computer.  While the number of qubits is important, it is not the only important metric.  Focusing just on numbers of qubits is like assessing the performance of a high-end automobile solely by the number of cylinders in the engine.  Clearly there are many other factors that impact drivability and performance, and a similar analogy applies to QC.

So Let the Games Begin!

Given that some benchmarks favor optimization strategies, some favor simulation, some focus on contrived theoretical tasks and others try to reflect real-world applications, some are great at 2-qubit gates but not at larger entanglements, etc., it unfortunately does not look like a universally accepted standard is going to be agreed upon in the near future.  So instead, what if there were an annual contest like a global QC decathlon?  I think it would be reasonably easy to agree on a set of measurement algorithms, similar to those proposed by QED-C.  Different entrants could compete for the fastest correct results in several different categories of algorithms and problems, with the start-and-stop times agreed upon and a panel of experts to arbitrate any discrepancies among entrants.  Gold, Silver and Bronze medals could be awarded for each category, with an overall “best in show” award to the team that wins the most individual events or achieves the highest overall score.

I’ll nominate myself as one of the judges.  I’d certainly love a front-row seat to watch the players compete, each driving the others to their best.  What do you think?

Disclosure: I have no beneficial positions in stocks discussed in this review, nor do I have any business relationship with any company mentioned in this post.  I wrote this article myself and express it as my own opinion.


References:

Langione, Bobier, Krayer, Park and Kumar, “The Race to Quantum Advantage Depends on Benchmarking,” Boston Consulting Group, published February 23, 2022.

IonQ press release entitled “IonQ Aria Furthers Lead As World’s Most Powerful Quantum Computer”, issued February 23, 2022.

“IBM Quantum breaks the 100-qubit processor barrier,” International Business Machines, November 16, 2021.

Cross, Bishop, Sheldon, Nation, Gambetta, “Validating quantum computers using randomized model circuits,” Physical Review A October 11, 2019.

“Driving quantum performance: more qubits, higher Quantum Volume, and now a proper measure of speed,” International Business Machines, accessed February 27, 2022.


At the Intersection of Quantum Computing, Artificial Intelligence and Machine Learning

There are some obvious and not-so-obvious overlaps among various “advanced computing” concepts.  Before I describe some of the inter-relationships among these concepts, it would be helpful to level-set the general definitions:

Classical Computing: is the form of data storage and analysis utilizing transistors in integrated circuits to turn switches on or off, hence storing a given computational state as a “bit”.  These circuits are coordinated into logic gates to perform various instructions such as “AND”, “OR” and “NOT” and do so in a sequential manner.  Today’s computers are increasingly fast and robust, having enjoyed Moore’s law for nearly 50 years.  However, classical computers are beginning to hit an advancement ceiling and with the ever-increasing amount of data being collected and stored, the sequential nature of classical computing analysis is leading to longer and longer processing times for large data sets.

High-Performance Computing (HPC):  is a technology that harnesses the power of supercomputers or computer clusters to solve complex problems requiring massive computation.  While aggregating computing resources can improve overall power and speed, such increases in performance are linear (i.e., classical computing based), so an increasingly large set of resources is required as the data increases.

Quantum Computing: Quantum Computers (QCs) utilize evolving new technologies which take advantage of certain features of quantum mechanics.  They use “qubits” instead of classical computing bits and harness the properties of superposition, entanglement, and interference to perform calculations.  Combining these quantum properties with a broader array of logic gates, QCs can perform calculations simultaneously (instead of sequentially) and therefore much faster than classical computers.  QCs are relatively new, and the existing devices are still not very powerful, but they are becoming more powerful all the time.

Artificial intelligence (AI): is intelligence demonstrated by machines, as opposed to natural intelligence displayed by animals including humans. In AI’s most basic form, computers are programmed to “mimic” human behavior using extensive data from past examples of similar behavior. AI applications include advanced web search engines (e.g., Google), recommendation systems (used by YouTube, Amazon and Netflix), understanding human speech (e.g., Siri and Alexa), self-driving cars (e.g., Tesla), etc.

Machine Learning (ML): the study of computer algorithms that can improve automatically through experience and by the use of data. It is seen as a part of artificial intelligence. Machine learning algorithms build a model based on sample data, known as training data, in order to make predictions or decisions without being explicitly programmed to do so.  There are three broad types of machine learning: supervised learning (classification and regression), unsupervised learning (clustering and dimensionality reduction), and reinforcement learning.

Big Data: refers to large, diverse sets of information that grow at ever-increasing rates. It encompasses the volume of information, the velocity or speed at which it is created and collected, and the variety or scope of the data points being covered (known as the “three v’s” of big data).  Data analysts look at the relationship between different types of data, such as demographic data and purchase history, to determine whether a correlation exists.

Quantum Machine Learning

Using these broad definitions, we can further refine this discussion to note that “Artificial Intelligence” today is a general catch-all category for using classical computers to parse, analyze and draw conclusions.  ML and Big Data are generally considered subsets of AI, and HPC is a general catch-all for using mainframes, supercomputers and/or parallel processing to scale the power of classical computing.  With the recent introduction of working QCs, and given that QCs operate with different processes and logic, there is an evolving field known as “Quantum Machine Learning” (QML) at the intersection of these technologies.

Over the past few years, classical ML models have shown promise in tackling challenging scientific issues, leading to advancements in image processing for cancer detection, predicting extreme weather patterns, and detecting new exoplanets, among other achievements. With recent QC advances, the development of new Quantum ML models could have a profound impact on the world’s biggest problems, leading to breakthroughs in the areas of medicine, materials, sensing, and communications.

In a milestone discovery, IBM and MIT revealed the first experimental proof that combining quantum computing and machine learning could work in practice.  They published their findings in Nature on March 13, 2019, using a two-qubit QC to demonstrate that QCs could bolster supervised classification.

TensorFlow and PyTorch are leading platforms for classical machine learning.  TensorFlow is an end-to-end open-source platform with a comprehensive ecosystem of tools, libraries and resources that lets researchers and ML developers easily build and deploy ML-powered applications.  PyTorch is also an open-source machine learning library, specializing in tensor computations, automatic differentiation, and GPU acceleration.

TensorFlow Quantum

Reimagining these concepts for use on a QC, Google has released the open-source TensorFlow Quantum (TFQ), which supports quantum algorithm research and ML applications within the Python framework and is designed to build QML models that leverage Google’s quantum computing tools.  To build and train such models, users would do the following:

  1. Prepare a quantum dataset
  2. Evaluate a quantum neural network model
  3. Sample or average measurements
  4. Evaluate a classical neural network model
  5. Evaluate cost functions
  6. Evaluate gradients and update parameters

(The original post depicts this workflow graphically.)
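As a rough illustration of that workflow, here is a minimal sketch assuming tensorflow, tensorflow-quantum and cirq are installed; the circuit, data and labels are invented for illustration, and exact APIs may differ across versions.

```python
import cirq
import sympy
import tensorflow as tf
import tensorflow_quantum as tfq

qubit = cirq.GridQubit(0, 0)

# Step 1: a "quantum dataset" is a set of circuits that prepare the input states.
data_circuits = tfq.convert_to_tensor([
    cirq.Circuit(),                     # prepares |0>
    cirq.Circuit(cirq.X(qubit)),        # prepares |1>
])
labels = tf.constant([[1.0], [-1.0]])   # invented target expectation values

# Steps 2-6: a parameterized "quantum neural network" whose rotation angle is
# trained by sampling/averaging measurements and following gradients.
theta = sympy.Symbol("theta")
model_circuit = cirq.Circuit(cirq.ry(theta)(qubit))

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(), dtype=tf.string),  # circuits arrive serialized as strings
    tfq.layers.PQC(model_circuit, cirq.Z(qubit)),       # measures <Z> after the trainable circuit
])
model.compile(optimizer=tf.keras.optimizers.Adam(0.1), loss="mse")
model.fit(data_circuits, labels, epochs=25, verbose=0)
```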

A key feature of TensorFlow Quantum is the ability to simultaneously train and execute many quantum circuits. This is achieved by TensorFlow’s ability to parallelize computation across a cluster of computers, and the ability to simulate relatively large quantum circuits on multi-core computers. 

PennyLane

Similarly, Xanadu’s PennyLane is another open-source software framework for QML, built around the concept of quantum differentiable programming.  It integrates classical ML libraries with quantum hardware and simulators, giving users the power to train quantum circuits.  Companies such as Menten AI are using PennyLane to design novel drug molecules that can efficiently bind to a specific target of interest.  Menten AI is seeking to develop new approaches that are beyond the reach of current classical computation by integrating QC and classical machine learning techniques.

PennyLane is integrated with Amazon Braket, a fully managed quantum computing service from Amazon Web Services (AWS).  Together with Amazon Braket, it seamlessly integrates classical machine learning (ML) libraries with quantum hardware and simulators, giving users the power to train quantum algorithms in the same way they train neural networks.  Data scientists and machine learning researchers who work with TensorFlow or PyTorch on AWS now have a way to experiment with quantum computing and see how easily it can fit into their workflows.
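To make that concrete, here is a minimal PennyLane sketch (my own example, run on PennyLane’s built-in simulator rather than Braket hardware) of training a small quantum circuit with gradient descent, the way one would train a neural network:

```python
import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=1)   # swap in a Braket device to target AWS hardware

@qml.qnode(dev)
def circuit(theta):
    qml.RY(theta, wires=0)                   # a single trainable rotation
    return qml.expval(qml.PauliZ(0))         # measured expectation value

def cost(theta):
    return (circuit(theta) - (-1.0)) ** 2    # push the qubit toward the |1> state

opt = qml.GradientDescentOptimizer(stepsize=0.4)
theta = np.array(0.1, requires_grad=True)
for _ in range(50):
    theta = opt.step(cost, theta)

print(theta)            # converges toward pi, where <Z> = -1
print(circuit(theta))
```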

“Amazon Braket makes it easy for customers to experiment with quantum computing through secure, on-demand access to a variety of quantum hardware and fully managed simulators. We are delighted to be working with PennyLane to give our customers a powerful set of tools to apply proven and familiar machine learning concepts to quantum computing. Our goal is to accelerate innovation, and PennyLane on Amazon Braket makes it easy and intuitive to explore applications of hybrid quantum computing, an area of research that aims to maximize the potential of near-term quantum computing devices” said Eric Kessler, Sr. Product Manager for Amazon Braket.

Summary

While QC is still in its early stages, there are promising developments in applying QC to Artificial Intelligence/Machine Learning.  Menten AI’s use of this technology for drug discovery and Quantum Image Processing are but two examples of near-term applications.  As the amount of stored data and images continues to explode, along with the increasing adoption of voice recognition tools (e.g., Alexa and Siri), QML will be vital to making efficient use of these evolving tools.  I expect we’ll see many more collaborations and tools in the QML space in the next few years.

Disclosure: I have no beneficial positions in stocks discussed in this review, nor do I have any business relationship with any company mentioned in this post.  I wrote this article myself and express it as my own opinion.


References:

Uj, Anjaii, “Quantum Machine Learning: A Smart Convergence of Two Disruptive Technologies, ” Analytics Insights, October 24, 2018

“What is Quantum Machine Learning,” published by Discover Data Science, accessed February 20, 2022

Havlicek, Corcoles, Temme, Harrow, Kandala, Chow & Gambetta, “Supervised learning with quantum-enhanced feature spaces,” Nature, March 13, 2019

Pennylane.ai, accessed February 20, 2022

TensorFlow.org, accessed February 21, 2022

Ho, Alan and Mohseni, Masoud, “Announcing TensorFlow Quantum: An Open Source Library for Quantum Machine Learning,” Google AI Blog, March 9, 2020

“Menten AI Partners with Xanadu to Develop Quantum Machine Learning for Protein-Based Drug Discovery,” PR Newswire, January 25, 2022.


How to Invest in Quantum Computing

If you have been following these posts, or have otherwise been learning about the power and potential of Quantum Computing, you may be wondering how to invest in this emerging opportunity.  Unfortunately, there are not many ways for individual investors to participate, although that is an evolving situation.  I will cover some of the ways you can make direct investments, some options for indirect investments, and a few situations where publicly traded securities should be available later this year.  In this post I will not be expressing any investment opinion, but rather want to showcase the various avenues for making investments today (or in the near future).  In future posts, I will cover the investment strengths and weaknesses of some of the companies noted below.

Quantum Focused Public Companies

IonQ: Today, there is only one significant pure-play Quantum Computing company publicly traded, and that is IonQ ($IONQ), the College Park, MD based firm founded in 2015.  The Company was launched with seed funding from New Enterprise Associates, a pre-eminent venture investor, and a license to core technology from Duke University and the University of Maryland.  IonQ has built ion-trap based working Quantum Computers which can be accessed directly or through cloud partnerships with Microsoft, Amazon and Google.  In October 2021, IonQ began trading on the NYSE, and as of 2/11/22 it had a market capitalization of $3 billion.  The stock has had some recent gyrations, and will likely be dragged down a bit near-term as other players go public (see Rigetti and D-Wave below) and investors re-allocate some of their QC exposure from IONQ to those other firms, but this is an essential component of any long-term QC portfolio.

Rigetti: While Rigetti Computing is not quite public, they have signed a definitive agreement to merge with a SPAC called Supernova Partners Acquisition Company II ($SNII), which values Rigetti equity at approximately $1.5 billion and will provide over $450 million in cash proceeds to Rigetti.  Rigetti is another full-stack Quantum Computing provider, but they use superconducting loops for their qubits.  While the formal merger date has not been announced, a formal shareholder vote is scheduled for February 28, 2022 and the merger should be completed shortly thereafter.  Investors hoping to get in early can buy SNII today, or watch for it to trade post-merger, at which point its symbol will be RGTI.

D-Wave: Similar to Rigetti, D-Wave has signed an agreement to merge with a SPAC, this one called DPCM Capital ($XPOA).  For this transaction, D-Wave equity is valued at $1.2 billion and it will provide $300 million in cash.  D-Wave is a different type of Quantum Computing company in that it offers quantum annealing as opposed to gate-based algorithms.  While annealing is less powerful than gate-based systems, it is easier to operate and scale, and D-Wave has 25 customers from among the Forbes Global 2000, making it one of the few Quantum Computing companies with meaningful current revenues.  Investors hoping to get in early can buy XPOA today or watch for it to trade post-merger as QBTS.

Quantinuum: In June of 2021, after a series of successful collaborations, Cambridge Quantum Computing (CQC) reached an agreement to be acquired by Honeywell.  Honeywell merged CQC with its Honeywell Quantum Solutions (HQS) division and in November of 2021 spun out the combined businesses into a new stand-alone company called “Quantinuum”, which is owned 54% by Honeywell and 46% by Cambridge Quantum shareholders.  Separately, Honeywell invested $300 million in Quantinuum, which combines CQC’s software and algorithm expertise with HQS’s hardware expertise, creating the largest full-stack dedicated quantum computer company.  Company executives have been quoted as confirming a 2022 targeted IPO, although there has been no official company announcement.  See here for a prior post showcasing Quantinuum.

PsiQuantum: While currently private without any publicized plans to go public, PsiQuantum has been the most heavily venture-funded QC company in the US.  To date they have raised nearly $750 million, most recently at a $3.15 billion post-money valuation.  While they are not in immediate need of liquidity, nor have they announced a desire to go public, the broad investor base and recent completion of a “D-Round” hint at an IPO some time in the not-too-distant future.

Among the five companies noted in this section, only one ($IONQ) is a currently traded pure-play quantum investment.  Two have committed to going public via SPAC some time this year, one has announced plans to go public but has not taken formal steps, and the fifth is not necessarily going public this year but is worth watching for an IPO announcement in the future.  Interestingly, should all five become public, they would represent a broad bet on qubit construction (a mix of superconducting (Rigetti), ion traps (IonQ and Quantinuum) and photonics (PsiQuantum)), enabling diversification among these leading QC hardware strategies.

Exchange Traded Funds/Mutual Funds

In addition to these five pure-play companies, there are professionally managed, publicly traded funds with a focus on Quantum Computing and/or advanced computing.  Many of these have portfolios with considerable overlap, so the best strategy here would be to select one of these funds as your “advanced computing” vehicle to provide diversified exposure to QC.

Defiance Quantum ETF: Defiance Quantum ($QTUM) is an exchange traded fund with a portfolio of investments in advanced technology companies that operate in Quantum Computing as well as artificial intelligence and machine learning.  While not purely “quantum”, the companies in its portfolio should all benefit from increasing commercialization of Quantum Computing.  The fund trades at or near its net asset value; in other words, it is a relatively efficient way to own a diversified portfolio of about 70 companies.  Holdings of QTUM include Teradata, Lockheed Martin, Airbus, HP, IBM, IonQ and others.

Fidelity Select Technology Portfolio ($SFPTX): This non-diversified fund invests primarily in equity securities, especially common stocks of companies that are engaged in offering, using, or developing products, processes, or services that will provide or will benefit significantly from technological advances and improvements. Some of the fund’s top quantum holdings include Google, Nvidia, Microsoft and Micron Technology.

Fidelity Select Software & IT Services Portfolio ($FSCSX): This non-diversified fund invests a majority of its assets in common stocks of companies engaged in research, design, production or distribution of products or processes that relate to software or information-based services.  Some of the fund’s top quantum computing holdings are Microsoft, Google and International Business Machines.

T. Rowe Price Global Technology Fund ($PRGTX): aims for long-term capital growth. This non-diversified fund invests most assets in the common stocks of companies that will generate a majority of revenues from the development, advancement and use of technology. Some of the fund’s top quantum computing positions are Alibaba, Advanced Micro Devices, Micron Technology and NXP Semiconductors.

Franklin DynaTech Fund Class A ($FKDNX): The fund invests primarily in common stocks with a focus on companies that are leaders in innovation, take advantage of new technologies, have superior management, and benefit from new industry conditions. Some of the fund’s top quantum computing investments are Google, Nvidia, Microsoft and Alibaba.

Technology Select Sector SPDR Fund ($XLK): Seeks to provide exposure to companies from technology hardware, storage, and peripherals; software; communications equipment; semiconductors and semiconductor equipment; IT services; and electronic equipment, instruments and components.  Top holdings include Apple, Microsoft, NVIDIA, Broadcom and Cisco.

Public Companies with Quantum Initiatives

None of the following publicly traded companies are pure-play quantum investments, but each has major Quantum Computing initiatives and a varying level of reliance on successful penetration of the QC market.

International Business Machines ($IBM): As a leading legacy company focused on computing hardware, IBM seems like a natural company to lead QC efforts.  In fact, they have created the IBM Q Experience, which gives more than 100 customers cloud-based access to IBM’s quantum resources.  In addition, IBM has developed Qiskit, one of the more popular open-source quantum SDKs (software development kits).  Their latest 127-qubit Eagle quantum processor is one of the more robust QCs available and is being utilized by major firms including Goldman Sachs, Samsung, JPMorgan Chase, ExxonMobil, and Boeing, among others.  IBM features its quantum initiatives prominently in its corporate materials, so I expect QC to be an ever-increasing part of its value.

Microsoft ($MSFT):   As a leading software company, it makes sense that MSFT would be working on quantum software.  Specifically, it offers a widely used quantum programming language called Q# (pronounced “Q sharp”) and has been offering access to the quantum hardware systems from Honeywell, IonQ and QCI via its Azure Quantum cloud-based platform.  In addition, via its M12 corporate venture arm, Microsoft is an investor in PsiQuantum.   By remaining fairly agnostic to the quantum hardware used, and by developing open-source tooling, MSFT is well positioned to benefit from the growing usage of, and need for access to, QCs regardless of which hardware technologies ultimately gain the most traction.  However, despite Microsoft’s clear commitment to Quantum Computing through its Azure Quantum platform and Q#, its latest 10-K annual report (as of June 30, 2021) contains no mention of “quantum” or “Q#,” so it may be difficult in the near term for MSFT’s quantum efforts to move its equity value.

Honeywell International ($HON): As noted above, Honeywell spun its Honeywell Quantum Solutions (HQS) division out into Quantinuum, with a stated plan to take Quantinuum public.  However, until that spinout occurs, it is possible to obtain QC exposure via a direct investment in Honeywell.  Even once Quantinuum goes public, Honeywell is expected to retain significant ownership in it, so acquiring HON shares now is an early way to participate in Quantinuum’s potential upside.

Alphabet ($GOOG, $GOOGL): Alphabet/Google has been a major quantum headline grabber over the past couple of years, especially after it published the breakthrough paper in Nature describing how its Sycamore quantum processor was the first QC able to achieve “quantum supremacy.”  In addition to the Sycamore claims, Google maintains a robust quantum offering, including its Cirq SDK, cloud-based QC access and various libraries of quantum resources and algorithms.  However, like other large companies included in this section, Alphabet is a huge, diversified conglomerate, so the relative contribution of QC to the broader Alphabet valuation is likely modest.

Intel ($INTC): Intel has been a leading player in computing hardware since it was founded by Gordon Moore and Robert Noyce in 1968, so it is another corporate candidate for meaningful quantum exposure.  Additionally, as “Moore’s Law” begins to bump up against physics constraints, Quantum Computing seems like a natural extension of its technology, in order to continue to produce ever more powerful computing chips.  In fact, in 2019 Intel announced Horse Ridge, a cryogenic control chip designed to speed the development of full-stack QC systems.  Intel is hoping to leverage this chip, along with its legacy expertise around interconnect technologies, to become a major player in the QC realm.

Amazon ($AMZN): Similar to Microsoft, Amazon has a broad cloud-based quantum platform within its Amazon Web Services (AWS) offering, known as Braket.  It provides access to systems from D-Wave, Rigetti and IonQ.  They also have an AWS Center for Quantum Computing in partnership with the California Institute of Technology among others.  However, Amazon is a massive business with many interests and “quantum” is not often featured in its corporate description materials nor was it mentioned in their 2020 annual report, so its overall equity exposure to QC may not be very significant.

Nvidia ($NVDA): Founded in 1993 with a focus on advanced gaming, Nvidia makes GPUs (graphics processing units) that are now also being utilized for deep learning, parallel processing and artificial intelligence, so the company has become an important player in advanced computing.  Its newly announced cuQuantum enables large quantum circuits to be simulated dramatically faster, allowing quantum researchers to study a broader space of algorithms and applications.  Developers can simulate areas such as near-term variational quantum algorithms for molecules and error-correction algorithms to identify fault tolerance, as well as accelerate popular quantum simulators from Google and IBM.  Given Nvidia’s success in becoming a significant player in advanced computing generally, it seems likely to succeed in leveraging these assets in Quantum Computing.  Currently, “quantum” is a very modest focus within Nvidia’s press or shareholder reports, so it is unlikely to have a near-term major impact on its stock value, but it may be worth taking a modest, long-term position.

Summary

For those of you anxious to invest in the evolving Quantum Computing industry, there are a few publicly available options.  Some will provide a direct, pure-play investment, while others should enjoy enhanced returns based on their QC exposure.    The following table summarizes the public company investments (and stock symbols) that would provide decent portfolio exposure to Quantum Computing upside:

Those seeking meaningful investment exposure to QC should certainly maintain positions in IONQ, RGTI and QBTS and likely at least one of the funds noted.  For added exposure to a broader advanced computing portfolio that also adds QC exposure, you may consider adding some or all of MSFT, IBM, HON, GOOG, GOOGL, AMZN, INTC and/or NVDA.

Disclosure: I maintain personal long positions in IONQ, SNII, QTUM and XLK, but do not have any business relationship with any company mentioned in this post.  I wrote this article myself and express it as my own opinion.


References:

Nvidia Press Release, “Introducing cuQuantum: Accelerating State Vector and Tensor Network-based Quantum Circuit Simulation,” November 2021.

Zacks Equity Research, 4 Funds to Shine as Quantum Computing Comes Into Play, July 8, 2021

Intel Corporation Press Release, “Intel Introduces ‘Horse Ridge’ to Enable Commercially Viable Quantum Computers,” December 9, 2019.

Taulli, Tom, InvestorPlace, “These 7 Quantum Computing Stocks Are Futuristic Buys,” June 15, 2020.

Gecgil, Tezcan, InvestorPlace, “The 7 Best Quantum Computing Stocks to Buy for February 2022,” February 4, 2022.

Hajjar, Alamira J., AI Multiple,  “33+ Public & Private Quantum Computing Stocks in 2022”, published May 5, 2021 and updated Jan 11, 2022.

If you enjoyed this post, please visit my website and enter your email to receive future posts and updates: http://quantumtech.blog.  Russ Fein is a venture investor with deep interests in Quantum Computing (QC).  For more of his thoughts about QC please visit the link to the left.  For more information about his firm, please visit Corporate Fuel.  Russ can be reached at russ@quantumtech.blog.

Quantum Advantage is Closer than you Think

I recently had the pleasure of speaking with Anisha Musti, a delightful and empowering 16-year-old CEO and Co-Founder of Q-munity, a 501(c)(3) non-profit that is introducing and teaching young individuals about Quantum Computing (I encourage you to check out Anisha and her project(s) at the Q-Munity website).  She hopes to expose her peers to QC so that they will consider careers in the field, or “if they learn about it from us but choose not to pursue it, at least they will be making an educated assessment.”  Anisha’s poise and wisdom belie her age.

The fact that a 16-year-old, along with a few of her friends and fellow students, has established a robust and constructive free resource is one of the topics I highlight below.  But I am starting this post with this conversation because it was an interesting multi-generational dialogue.  She asked a bit about my QC journey, and I began explaining my first computer courses in college (COBOL and FORTRAN), where we used “punch cards” to store and retrieve commands.  The conversation went roughly as follows:

Me: I started in computers when we still used punch cards to record the commands.

Anisha: Huh?

Me: You know, that was before we even had floppy discs.

Anisha: I have no idea what a floppy disc is.

Which certainly made me chuckle.  I reflected on the amazing advances we’ve seen in just my lifetime.  Born in the 60’s, I entered college before personal computers (or GPS or cell phones or the Internet, etc.) and have witnessed amazing technological progress ever since.  Sometimes, when I consider the power contained in my iPhone, I am awed by it and it feels like we can’t possibly need any more technological advances…I can do almost anything, virtually instantly, in the palm of my hand.

But time and technology invariably move forward.  And in fact, we appear to be on the cusp of even more profound technological capabilities in the form of working, powerful Quantum Computers.  Using the growth in power and capacity of some electronics over the past 20 years, the following table provides a level of growth-speed context:

You may notice that the growth rate of your PC’s processor speed, while substantial at 4.3x, is a tiny fraction of the rate of growth in cellular data speeds.  This is a nuance of these sorts of growth rates, which are more explosive earlier in the life cycle but eventually slow down as physical limits become more difficult to overcome.  There is also a relative utility factor, in that PCs created in 2002 were pretty good at basic office tasks (email, word documents, spreadsheets, etc.), so the utility of further speed increases was less valuable.  Compare that to gaming consoles.  While the graphics of Grand Theft Auto: Vice City (the #2 videogame of 2002) may have made your mother cringe, they are a far cry from the realism experienced in today’s FIFA 22.  In other words, the consumer utility of increased speed and capacity still follows a steep demand curve for certain technologies, especially those with substantial headroom in progress and need.

Given the utility of improved Quantum Computing, it is my opinion that its capabilities will continue to grow at a phenomenal rate.  We are already seeing 10x/year increases in quantum volume (albeit over a short window of time), and I expect that pace to hold or accelerate in the near term, as I’ll explain below.  While there has been much written on this topic, and many billions of dollars invested, many still speak of a “quantum winter” where the hype overshoots the reality.  Readers of my posts know that I am mindful not to contribute to the hype, but I truly believe that useful, practical Quantum Computing applications are imminent (i.e., arriving by the end of this decade or sooner).  Let me explain a few reasons why.

  1. The Quantum Evolution is Quite Mature

In 1879, electricity was first harnessed for home use to power Edison’s electric light bulbs.  During the period of 1920-1935 the US went on an electrification campaign bringing power to 70% of US homes.  So, in about 50 years, a profound new technology became ubiquitous.  Nobody could have imagined the impact electricity would have on daily life in those early years.  Yet today we take for granted that we can plug a cord into any wall in our home and have instant, nearly free power.  Personal Computers and the Internet have had similar, profound impacts on our daily lives, generally over shorter and shorter spans of time.

Quantum Computing has the potential to be the next profound disruptor.  Many authors, including me, have covered the power and potential of QC, so that is not the focus of this post.  Rather, the concept to keep in mind is that while “Quantum Computing” is relatively new, the utilization of quantum physics/mechanics has been progressing for roughly a century.  We have had great success utilizing the dual wave-particle nature of electrons and photons for a variety of purposes including MRIs, lasers and GPS (which I covered in a prior post entitled “Quantum Quantum Everywhere”), among many others.  As that prior post noted, today we are already using quantum mechanics in Quantum Sensing for precise measurement probes (even where GPS is unavailable), ghost imaging and quantum illumination.  It is also being used today for certain applications of Quantum Communication.  And yes, while current Quantum Computers are not as powerful as we’d like, there are dozens of companies offering access to their working Quantum Computers today, with the power of the machines increasing quite rapidly.  While it is difficult to get consensus on exactly when QCs will become powerful enough to surpass classical computers for real-world problems, nearly everyone in the field will confirm it is a matter of “when,” not “if.”

  2. Cutting-Edge Quantum Processors are Available in the Cloud

As noted in a prior post, there are a variety of QC companies offering their latest QCs via cloud-based access.  This is important because it “socializes” access to QCs.  Today, anybody with some basic computing chops can access actual, working QCs for modest or, in some cases, no cost.  Quantum algorithms are being written and run every day.  Furthermore, because many QC makers are providing their latest QCs via the cloud, commercial users do not have to deal with a large up-front CapEx (capital expenditure) cost, nor do they have to worry about obsolescence.  When mainframe computers became available to commercial users in the middle of the 20th century, they were extremely expensive, difficult to operate, and subject to becoming outdated relatively quickly.  The same was generally true of desktop computers, which often were made obsolete by advancing software well before they stopped “working.”

By utilizing QCs over the cloud, this cycle of CapEx → Obsolescence → CapEx can be eliminated, which should spur greater utilization and adoption of QC than otherwise might occur.

  3. Open Source is the Default

I mean this in a broader sense than you might expect.  On the one hand, most of the existing QDKs (quantum development kits) are both open source (i.e., free to use) and cross-platform compatible.  What this means from a practical perspective is that the learning curve for QC proficiency is much less steep, because whatever skills are acquired can be used across many different platforms.  In addition, someone who creates a QC algorithm for a cloud provider such as Amazon’s Braket or Microsoft’s Azure Quantum can have the same algorithm run on a variety of QC hardware providers’ platforms, as the sketch below illustrates.  Contrast this with early PC access, where PCs did not speak to Macs or Linux boxes.  In addition, they required competing software, input devices and, in many cases, physical plugs.  All of that “confusion” made it difficult for the industry to scale at the pace it might have if all power users spoke the same language and used fully compatible hardware.
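To make the cross-platform point concrete, here is a minimal sketch of my own (not a vendor example) using Amazon’s Braket Python SDK: the circuit is written once and can be pointed at the free local simulator or at managed QPU hardware simply by swapping the device object.  The QPU ARN shown is a placeholder, not a real device identifier, so you would substitute whichever backend your account can access.

```python
# A hedged sketch: the same circuit submitted to different Braket backends.
from braket.circuits import Circuit
from braket.devices import LocalSimulator
from braket.aws import AwsDevice

# Define the algorithm once: a simple Bell-state circuit.
bell = Circuit().h(0).cnot(0, 1)

# Run it on the free local simulator for testing and debugging.
local_result = LocalSimulator().run(bell, shots=1000).result()
print(local_result.measurement_counts)

# The identical circuit can then be pointed at real hardware by swapping in
# a managed device ARN (placeholder shown here, billed per task + per shot).
qpu = AwsDevice("arn:aws:braket:::device/qpu/<provider>/<device-name>")
task = qpu.run(bell, shots=1000)
print(task.result().measurement_counts)  # blocks until the managed task completes
```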

Even more profound and telling in the current QC environment is the “open” nature of so many of the participants.  Access to the programs offered by Anisha’s Q-Munity, noted in the opening paragraph, is free.  Many authors have published complete textbooks on Quantum Computing for free (Thomas G. Wong’s Introduction to Classical and Quantum Computing and Brian Siegelwax’s Dungeons-n-Qubits are but two examples).  And there are innumerable first-rate online courses and programs about Quantum Computing available for free.  In addition to all the free resources, I have found that the players and participants in the industry are generally open, friendly, and eager to help folks on their quantum journeys.  This spirit of community and cooperation is refreshing, especially in an industry with such tremendous commercial potential.  Perhaps this openness will be less pervasive once the industry matures (and companies compete more vigorously for QC customers), but the essence of this post is that such a point will arrive quickly, and the current state of openness certainly accelerates access to, and development of, quantum technologies.

  4. QC is Leveraging Adjacent Technologies

In addition to leveraging the historical progress in taming quantum mechanics for commercial use, recent advances in machine learning, artificial intelligence and big data are quite complementary to Quantum Computing.  Many advances and breakthroughs in these industries can be accelerated or improved by applying QC technology, so the pool of experienced advanced-computing talent is already quite large, even at this relatively early stage of QC evolution.  Similarly, we see certain quantum hardware strategies leveraging existing advances in semiconductor technology (i.e., quantum dots) and optics (photonic qubits) to create QCs.  As the hardware advances and applications continue to evolve, I expect many of them to converge.

  5. Quantum Advantage is a Continuum, not a Milestone

As a refresher, while there is no definitive guide to QC definitions, “quantum supremacy” generally refers to a QC tackling a problem, even one without real-life application, faster than any classical computer.  This was achieved by Google in 2019 and has been repeated by others since.   “Quantum advantage,” on the other hand, denotes the point when QCs can out-perform classical computers in actual, useful applications.  The QC world is anxiously awaiting this quantum advantage threshold without clear consensus on when it might occur.  However, as those who study quantum effects well know, things are never so binary!  It is more constructive to think about QC progress as a continuum, not a specific threshold to be achieved.

I am not the first to suggest this perspective.  In a recent Harvard Business Review podcast, host Azeem Azhar interviewed Rigetti Computing founder and CEO Chad Rigetti.  In it, Rigetti noted select instances where a QC offered a very slight performance advantage on a small part of a broader problem.  He discussed how this happened with Rigetti’s attempt to improve weather forecasting.  While this is certainly not quantum advantage, it is a real-world example, today, of QC contributing to real analysis.  Chad elaborated on some of his thoughts around “narrow” versus “broad” quantum advantage, which I found very compelling.  Specifically, he referred to “narrow advantage,” where a specific use case might benefit from QC, such as the pricing of derivatives.  Any small advantage could produce outsized financial benefits in portfolio allocation or the timing of trades, and could occur well before “broad advantage” is achieved.  While financial markets are just one example, the finance industry is already computationally advanced and its underlying data is already in digital form, so this sort of narrow quantum advantage could be quite close.  Broader quantum advantage, where QCs can generally outperform classical computers, is more difficult and therefore further away, but I imagine we will see many steps up a spectrum of advances on the way to full quantum advantage.

  6. Calling Dr. Evil…

The final point I want to make is the enormous economic impact that a powerful QC will have.  The heading of this section, a tongue-in-cheek reference to the Austin Powers movies, is meant to evoke the massive commercial gains that could be captured with a powerful QC.  Much has been written about using Shor’s algorithm to break current encryption protocols and about the “HNDL” (hack now, decrypt later) movement, which unfortunately is a real thing.  A bad actor or nation-state could command enormous power if it were the first to create a powerful QC.  It could break most encryption, mine all remaining bitcoin and other cryptocurrency, and skim untold profits from financial systems by front-running traders, just to name a few powers.  In fact, the US and China are currently engaged in a ferocious race to develop the most powerful QC capabilities, each fearing the other will get there first and each establishing nationally supported quantum initiatives.

Naturally, I hope and expect that the “good guys” will have the most powerful QCs and will focus their powers on good uses such as better medicines, more efficient car batteries and optimized logistics, among other things.  Certainly, the rewards and upside for constructive use of QCs are enormous, and smart people are busy at work protecting us from those bad actors.  The point is that the massive financial upside from access to powerful QCs will spur accelerated development.

So can we say with any certainty what a QC timeline looks like?  Unfortunately not.  But as this post points out, the foot is on the accelerator, billions of dollars are being invested, and super smart people are working on creative solutions to the current rate-limiting factors.  For these reasons and those enumerated above, I am confident that we will see quantum advantage appear in our daily lives more and more over the next few years.

Disclosure: I have no beneficial positions in stocks discussed in this review, nor do I have any business relationship with any company mentioned in this post.  I wrote this article myself and express it as my own opinion.


References:

Azhar, Azeem, Host, “How Quantum Computing Will Change Everything (with Chad Rigetti)”, Season 6, Episode 11, Harvard Business Review, December 2021.

Intel® Microprocessor Quick Reference Guide – Year, accessed February 5, 2022

The astounding evolution of the hard drive (pcworld.com), accessed February 5, 2022             

A Brief and Abbreviated History of Gaming Storage – Techbytes (umass.edu), accessed February 5, 2022

If you enjoyed this post, please visit my website and enter your email to receive future posts and updates: http://quantumtech.blog.  Russ Fein is a venture investor with deep interests in Quantum Computing (QC).  For more of his thoughts about QC please visit the link to the left.  For more information about his firm, please visit Corporate Fuel.  Russ can be reached at russ@quantumtech.blog.

Cloud Based Quantum Computer Access – Available Today

If you have been following this blog, hopefully you have some broad appreciation for the promise and potential of Quantum Computing (QC).  This is a rapidly evolving field and generally, the hype has been front-running the actual capabilities.  While a narrow “quantum supremacy” has been achieved by Google and others, general “quantum advantage” (where a quantum computer can out-perform a classical computer for a given real-world problem) is still out of reach (for now). 

That said, the purpose of this post is to highlight and showcase the fact that people are using actual working Quantum Computers every day.  Amazon and Microsoft each offer cloud access to several QC hardware systems, while players like Google, IBM, IonQ, Rigetti, Honeywell and others provide access to their own systems via web-based interfaces.  I’m going to spell out some of the modes of access these firms have made available, not to be a definitive catalogue of all QCaaS (Quantum Computing as a Service) providers but to emphasize two facts:

  1. Many people and companies are already using quantum computers to process real-world quantum algorithms.  Results are generally less robust than can be achieved on existing classical computers, but routines are being run and occasionally, results surpass classical computing results.
  2. The industry is moving towards a largely open-source software environment for programming and accessing quantum processors. Many quantum hardware manufacturers are offering cloud-based access to their systems, obviating the need to purchase physical quantum hardware.  This substantially lowers the barriers to entry for companies seeking to begin exploring how QC can benefit their businesses, and “future-proofs” the investment since the providers continually upgrade the QC machines they provide via the cloud.

Working QCs are currently available to anyone (and as you’ll see below, the costs of operating QCs can be quite modest).  In fact, a recent study reviewed all of the QC cloud access of IBM’s Quantum systems over a two-year period and found over 6,000 jobs which contained over 600,000 quantum circuit executions and almost 10 billion “shots” (a shot is a single execution of a quantum algorithm on a QPU (quantum processing unit), further described below).  IBM notes on their website that they have run over 1 trillion circuits to date, which is clearly a non-trivial amount.  And that is just at IBM. 

The following tables highlight some aspects of the current state-of-play in using actual quantum computers to run algorithms via cloud-based access:

Note: Other providers include Alibaba Quantum Lab (China), Alpine Quantum Technologies (Austria) and Origin Quantum (China).

[1] Quantum annealing uses a different protocol than typical gate-based Quantum Computing, so its qubit counts are not directly comparable.

Quantum Computing Power Available Via the Cloud Today

Before I get into details about specific methods for accessing working Quantum Computers, I want to review a few facts about the state of the industry vis-à-vis QC power.  The current environment has been dubbed “NISQ,” or noisy intermediate-scale quantum.  Generally, this means that existing quantum computers operate with a lot of noise that interferes with qubit control and coherence, and that working quantum computers have a somewhat limited number of qubits.  QC power can be increased both with the addition of more qubits and with the successful implementation of error correction.  Generally, a QC with about 50-60 working logical qubits (representing around a petabyte of processing power) should begin to achieve consistent quantum advantage.  Some expect this will require as many as 1,000x more physical qubits per logical qubit to handle the error-correcting overhead, although as control and error correction improve, this number should decrease.  In any case, today’s working QCs provide tens of working qubits, not hundreds or thousands, but they are working, accessible machines nonetheless, and they are beginning to yield significant computing power.

To emphasize how existing Quantum Computers are already showing real-world promise, Rigetti Computing recently used its 32-qubit QC to augment a portion of GSWR (Global Synthetic Weather Radar) analysis and, in select instances, was able to modestly outperform results achieved using only classical computing power.  Similar select improvements over classical computing have also been noted in certain portfolio/security valuation algorithms.  So real-world benefits are beginning to appear, even in this early NISQ environment.

What does it Cost to run Quantum Algorithms via the Cloud?

In order to provide an example of how you can begin accessing Quantum Computers and running quantum algorithms, the following describes access via Amazon Braket:

You can use an account with Amazon Braket to access the Quantum Computers provided by IonQ, Rigetti or D-Wave.  Once you construct a quantum algorithm, it is recommended that you test and debug it on a simulator, which is generally available for no cost.   Once you are ready to actually run the algorithm on a bona fide quantum machine, there are some cost factors to keep in mind.  There are generally two pricing components when using a quantum computer or quantum processing unit (QPU) via the cloud: a “per-shot” fee and a “per task” fee.

As you may recall from prior posts, quantum algorithms are probabilistic, not deterministic.  A single run of a quantum circuit does not yield a definitive answer; rather, outputs are aggregated over many runs to determine the most likely (correct) result.  For this reason, algorithms are usually run many, many times (10,000 runs is a standard number).  A “shot” is a single execution of a quantum algorithm on a QPU.  For example, a shot is a single pass through each stage of a complete quantum circuit on a gate-based QPU.  The per-shot pricing depends on the type of QPU used but is not affected by the number or type of gates used in a quantum circuit or the number of variables used in a quantum annealing problem.

A task is a sequence of repeated shots based on the same circuit design or annealing problem. You define how many shots you want included in a task when you submit the task to Amazon Braket.  The current pricing to run algorithms via Amazon Braket are as follows:

  • D-Wave 2000Q: $0.30/task  + $0.00019/shot
  • D-Wave Advantage: $0.30/task + $0.00019/shot
  • IonQ: $0.30/task + $0.01/shot
  • Rigetti: $0.30/task + $0.00035/shot

For example, a scientist runs a quantum algorithm on the Rigetti Aspen-11 quantum computer in the AWS US West (N. California) Region. This task includes 10,000 repeated shots of the same circuit design. The cost to run this task includes a per-task charge of $0.30, plus 10,000 shots at a per-shot price of $0.00035.

So, the cost to run this algorithm:
Task charges: 1 task x $0.30 / task = $0.30
Shots charges: 10,000 shots x $0.00035 / shot = $3.50
Total charges: $3.80
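If it helps to sanity-check budgets programmatically, the arithmetic above is easy to wrap in a small helper.  This is a sketch of my own; the rates are simply the per-task and per-shot prices quoted above and will likely change over time.

```python
# A small helper mirroring the Braket pricing arithmetic shown above.
# Rates are the per-task / per-shot prices quoted in this post and may change.
PRICING = {
    "D-Wave 2000Q":     {"per_task": 0.30, "per_shot": 0.00019},
    "D-Wave Advantage": {"per_task": 0.30, "per_shot": 0.00019},
    "IonQ":             {"per_task": 0.30, "per_shot": 0.01},
    "Rigetti":          {"per_task": 0.30, "per_shot": 0.00035},
}

def estimate_cost(device: str, shots: int, tasks: int = 1) -> float:
    """Estimated charge in USD for running `tasks` tasks of `shots` shots each."""
    rates = PRICING[device]
    return tasks * rates["per_task"] + tasks * shots * rates["per_shot"]

print(estimate_cost("Rigetti", shots=10_000))  # 0.30 + 3.50 = 3.80
```

Running the same estimate for IonQ (10,000 shots at $0.01 per shot) returns $100.30, which illustrates why per-shot rates matter when planning larger experiments.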

Competing quantum cloud providers have similar pricing constructs or charge a fixed amount for a certain level of access/time.  Naturally, there is no guarantee that your circuit or algorithm will provide the desired or useful results, but the table stakes to begin testing QC for your business are quite modest.  These costs may increase in the future, but it is a very low bar considering the potential upside, and certainly less expensive (and less risky) than purchasing a dedicated Quantum Computing machine today.

Conclusion

By providing access to actual Quantum Computers via the cloud, a handful of QC hardware makers are providing QC access to virtually anyone.  With a basic working knowledge of Python (a common programming language, particularly good at connecting various components), a user can investigate many free open-source resources and QDKs (quantum development kits), begin compiling “quantum algorithms” and test/debug them for free on any number of cloud-based simulators.  Once ready to run on an actual, working Quantum Computer, you can then sign up directly at some QC providers or via Amazon or Microsoft’s web platforms (or others) and have the algorithm run on an actual QC for a very modest cost.  Such access eliminates the capital and technology risks of purchasing a Quantum Computer.

This is already happening, with, literally, trillions of circuits run to date.  While the power of the machines currently accessible is modest relative to high-performance classical computers, real-world achievements are becoming increasingly possible.  As more users are provided with broader access to ever larger QCs, and as advances in error correction and control continue, it is only a matter of time before consistent quantum advantage is available to nearly anyone.


References:

Enos, Graham; Reagor, Matthew; Henderson, Maxwell; Young, Christina; Horton, Kyle; Birch, Mandy; and Rigetti, Chad, “Synthetic Weather Radar Using Hybrid Quantum-Classical Machine Learning,” November 30, 2021.

The Quantum Insider, QCaaS write-up, accessed January 26, 2022

Dilmegani, Cem, Quantum Software Development Kits in 2022, https://research.aimultiple.com/quantum-sdk/amp/, accessed January 22, 2022

Ravi, Gokul Subramanian; Smith, Kaitlin; Gokhale, Pranav; Chong, Frederic, Quantum Computing in the Cloud: Analyzing job and machine characteristics, University of Chicago papers, November 1, 2021

Shaw, David, “Quantum Software Outlook 2022”, Fact Based Insight, January 19, 2022

If you enjoyed this post, please visit my website and enter your email to receive future posts and updates: http://quantumtech.blog.  Russ Fein is a venture investor with deep interests in Quantum Computing (QC).  For more of his thoughts about QC please visit the link to the left.  For more information about his firm, please visit Corporate Fuel.  Russ can be reached at russ@quantumtech.blog.

A Quantum Computing Glossary

Hopefully many of you have been following this blog since it began and are familiar with the terms highlighted below.  For some of you, a refresher for reference may be helpful.  For others, this may all be very overwhelming and confusing so I hope this guide will clarify things for you.  I’ve curated this list to provide a broad set of definitions that should help frame the Quantum Computing (QC) potential, and for ease of reference as you come across terms where a definitional reminder would be helpful.  In the first post in this series, I introduced QC with the following word-cloud graphic:


While not every word in this cloud bears defining in this post, I hope many of these definitions help you in your efforts to understand and appreciate QC, and I have grouped them into silos to add context (although some may naturally apply to more than one silo).  This is not intended to be a complete list, and it’s likely that more definitions will need to be added over time, but this should provide a good grounding in the general nomenclature and principles.

Quantum Concepts

  • Entanglement: Quantum entanglement is a physical phenomenon that occurs when a group of particles are created, interact, or share proximity in such a way that the particles are “connected,” even if they are subsequently separated by large distances.  Qubits are made of things such as electrons (which spin in one of two directions) or photons (which are polarized in one of two directions), and when they become “entangled “, their spin or polarization becomes perfectly correlated. 
  • Superposition: Classical computers use a binary system, meaning each processing unit, or bit, is either a “1” or a “0” (“on” or “off”), whereas Quantum Computers use qubits, which can be “1” or “0” (typically “spin up” or “spin down”) or both at the same time, a state referred to as being in a superposition.
  • Dirac Notation: Symbolic representation of quantum states via linear algebra, also called bra-ket notation.  The bra portion represents a row vector and the ket portion represents a column vector.  While a general understanding of QC does not necessarily require familiarity with linear algebra or these notations, it is fundamental to a deeper working knowledge (a short notational example follows this list).
  • Quantum Supremacy: Demonstrating that a programmable quantum device can solve a problem that no classical computing device can solve in a feasible amount of time, irrespective of the usefulness of the problem.  Based on this definition, the threshold was passed in October 2019.
  • Quantum Advantage: Refers to the demonstrated and measured success in processing a real-world problem faster on a Quantum Computer than on a classical computer.  While it is generally accepted that we have achieved quantum supremacy, it is anticipated that quantum advantage is still some years away.
  • Collapse: The phenomenon that occurs upon measurement of a quantum system where the system reverts to a single observable state.  Said another way, after a qubit is put into a superposition, upon measurement it collapses to either a 1 or 0.
  • Bloch Sphere: a geometrical representation of the state space of a qubit, named after the physicist Felix Bloch.  The Bloch Sphere provides the following interpretation: the poles represent classical bits, and we use the notation |0⟩ and |1⟩. However, while these are the only possible states for the classical bit representation, quantum bits cover the whole sphere. Thus, there is much more information involved in the quantum bits, and the Bloch sphere depicts this.
  • Schrodinger’s Cat: A quantum mechanics construct or thought experiment that illustrates the paradox of superposition wherein the cat may be considered both alive and dead (until the box is opened and its status is then known for certain).  This “both alive and dead” concept often confuses early students of quantum mechanics.
  • Heisenberg Uncertainty: (also known as Heisenberg’s uncertainty principle) Any of a variety of mathematical inequalities asserting a fundamental limit to the accuracy with which the position and momentum of a particle can be simultaneously known.  Generally, the more precisely the position is known, the less precisely the momentum can be known, and vice versa.  This also confuses early students of quantum mechanics, who are used to classical physics where speed and position can both be determined by observation.
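For readers curious how the Dirac notation above looks in practice, here is a brief, optional illustration of my own (not required for the rest of the glossary): a general single-qubit superposition and a maximally entangled two-qubit Bell state are commonly written as follows.

```latex
% A general single-qubit superposition (alpha and beta are complex amplitudes):
\lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle,
\qquad \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1

% A maximally entangled two-qubit (Bell) state -- measuring one qubit
% immediately fixes the outcome of the other:
\lvert \Phi^{+} \rangle = \frac{1}{\sqrt{2}}
  \left( \lvert 00 \rangle + \lvert 11 \rangle \right)
```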

Hardware/Physical Components

  • Qubit: Also known as a quantum bit, a qubit is the basic building block of a quantum computer. In addition to the conventional—binary—states of 0 or 1, it can also assume a superposition of the two values.  There are several different ways that qubits can be created with no clear candidate emerging as the definitive method.
  • Auxiliary Qubit:  Unfortunately, there is no such thing as quantum RAM, so it is difficult for QCs to store information for extended periods of time.  An “auxiliary qubit” serves as a temporary memory for a quantum computer and is allocated and de-allocated as needed (also referred to as an ancilla).
  • Cryogenics: Operating at extremely cold temperatures, generally meaning below about -153 Celsius; many QC platforms operate far colder, within a fraction of a degree of absolute zero (about -273 Celsius).  Cryogenics are of particular interest for QC because at these temperatures superconducting circuits carry current with no resistive loss, and the thermal noise that would otherwise disturb the qubits is dramatically reduced.
  • Dilution Refrigerator: Used with superconducting qubits and often with quantum dots; a series of physical stages (typically seven) is sequentially chilled down to the lowest level, where the qubits operate.
  • High Performance Computer (HPC): Sometimes also referred to as a “supercomputer,” this generally means any ultra-high-performing classical computer.  Powerful gaming PCs operate at around 3 GHz (i.e., 3 billion clock cycles per second), while HPCs operate at quadrillions of calculations per second.  Despite this blazing speed, there are many problems that HPCs cannot solve in a reasonable amount of time but that, theoretically, a QC could solve very quickly.
  • Quantum Annealer: In metallurgy, annealing involves heating a metal so its atoms can rearrange and then cooling it slowly, letting the material settle into a more stable, lower-energy structure.  Quantum annealing works in an analogous way, where temperature is replaced by energy and the lowest energy state, the global minimum, is found via annealing.  Quantum annealing is a quantum computing method used to find the optimal solution to problems with very many candidate solutions, by taking advantage of properties specific to quantum physics.  Since there are no gates, the mechanics of annealing are less daunting than full gate-based QC, although the outputs are less refined and precise than they would be under a full gate-based QC.
  • Quantum Dot: Quantum dots are effectively “artificial atoms.” They are nanocrystals of semiconductor wherein an electron-hole pair can be trapped. The nanometer size is comparable to the wavelength of light and so, just like in an atom, the electron can occupy discrete energy levels. The dots can be confined in a photonic crystal cavity, where they can be probed with laser light.
  • Quantum Sensor: Quantum sensing has a broad variety of use cases including enhanced imaging, radar and for navigation where GPS is unavailable.   Probes with highly precise measurements of time, acceleration, and changes in magnetic, electric or gravitational fields can provide precise tracking of movement.  In this case, if a starting point is known, the exact future position is also known, without the need for external GPS signals, and without the ability for an adversary to jam or interfere with the signals, so this is of particular interest to the military.  Another application of quantum sensing involves ghost imaging and quantum illumination.  Ghost imaging uses quantum properties to detect distant objects using very weak illumination beams that are difficult for the target to detect, and which can penetrate smoke and clouds.  Quantum illumination is similar and can be used in quantum radar.

Computing Operations

  • Gate: A basic operation on quantum bits and the quantum analogue to a conventional logic gate. Unlike conventional logic gates, quantum gates are reversible. Quantum algorithms are constructed from sequences of quantum gates.
  • Hadamard Gate: The Hadamard operation acts on a single qubit and puts it in an even superposition (i.e., it rotates the qubit on the Bloch sphere so the poles face left and right instead of up and down).  It is one of the fundamental gate operations used to establish superposition (see the short simulator sketch following this list).
  • Fault Tolerance: Technical noise in electronics, lasers, and other components of quantum computers leads to small imperfections in every single computing operation. These small errors ultimately lead to erroneous computation results. Such errors can be countered by encoding one logical qubit redundantly into multiple physical qubits. The required number of redundant physical qubits depends on the amount of technical noise in the system. For superconducting qubits, experts expect that about 1,000 physical qubits are required to encode one logical qubit. For trapped ions, due to their lower noise levels, only a few dozen physical qubits may be required. Systems in which these errors are corrected are fault tolerant.
  • Measurement: the act of observing a quantum state. This observation will yield classical information, but the measurement process will change the quantum state. For instance, if the state is in superposition, this measurement will ‘collapse’ it into a classical state of 1 or 0. Before a measurement is done, there is no way of knowing what the outcome will be.
  • NISQ: Noisy intermediate-scale quantum, a term coined by John Preskill in 2017 to describe the current state of QC, in which qubits suffer from noise and rapid decoherence.  It generally refers to machines with roughly 50 to a few hundred noisy physical qubits operating without full error correction (the “intermediate-scale” portion of the definition); achieving a comparable number of error-corrected logical qubits would require on the order of 100,000 – 1,000,000 physical qubits, with the balance dedicated to noise reduction.
  • Noise: In QC, noise is anything which impacts a qubit in an undesirable way, namely electromagnetic charges, gravity or temperature fluctuations, mechanical vibrations, voltage changes, scattered photons, etc.  Because of the precise nature of qubits, such noise is nearly impossible to prevent and requires substantial error-correction (to correct for the noise) in order to allow the qubits to perform desired calculations.
  • Quantum Algorithm: An algorithm is a collection of instructions that allows you to compute a function, for instance the square of a number. A quantum algorithm is exactly the same thing, but the instructions also allow superpositions to be made and entanglement to be created. This allows quantum algorithms to do certain things that cannot be done efficiently with regular algorithms.
  • Quantum Development Kit (QDK): A number of providers offer different types of QDKs, including some that are proprietary and others that are open source.  A QDK generally contains the programming language for quantum computing along with various libraries, samples and tutorials.  QDKs are available from the following companies (with their QDK name in parentheses): D-Wave (Ocean), Rigetti (Forest), IBM (Qiskit), Google (Cirq), Microsoft (Microsoft QDK), Zapata (Orquestra), 1QBit (1QBit SDK), Amazon (Braket), ETH Zurich (ProjectQ), Xanadu (Strawberry Fields) and Riverlane (Anian).
  • Quantum Error Correction: The environment can disturb the computational state of qubits, thereby causing information loss. Quantum error correction combats this loss by taking the computational state of the system and spreading it out over an entangled state using many qubits. This entanglement allows observers to identify and remedy disturbances without observing the computational state itself, which would collapse it.  However, many hundreds or thousands of error-correcting physical qubits are required for each logical qubit.
  • Speedup: The improvement in speed for a problem solved by a quantum algorithm compared to running the same problem through a conventional algorithm on conventional hardware.
  • Coherence/Decoherence: Coherence is the ability of a qubit to maintain its state over time.  Decoherence generally occurs when the quantum system exchanges energy with its environment, typically from gravity, electromagnetism, temperature fluctuation or other physical inputs (see “Noise”).  Longer coherence times generally enable more computations and therefore more computational power for QC.
  • No Cloning Theorem: The no-cloning principle is a fundamental property of quantum mechanics which states that, given a quantum state, there is no reliable way of producing extra copies of that state. This means that information encoded in quantum states is unique. This is sometimes annoying, such as when we want to protect quantum information from outside influences, but it is also sometimes especially useful, such as when we want to communicate securely with someone else.
  • Oracle: A subroutine that provides data-dependent information to a quantum algorithm at runtime.  It is often used in the context of “how many questions must be asked before an answer can be given” in order to confirm or establish quantum advantage.
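To tie the Gate, Hadamard Gate, and Measurement entries together, here is a minimal sketch of my own using the free local simulator that ships with Amazon’s Braket Python SDK: a single Hadamard gate puts a qubit into an even superposition, and each of the 1,000 measurement shots collapses it to a classical 0 or 1, roughly half the time each.

```python
# Hedged sketch: a Hadamard gate followed by measurement, run on the free
# local simulator included with the Amazon Braket SDK.
from braket.circuits import Circuit
from braket.devices import LocalSimulator

circuit = Circuit().h(0)            # Hadamard: even superposition of |0> and |1>
result = LocalSimulator().run(circuit, shots=1000).result()

# Each shot collapses the superposition to a classical 0 or 1; over many
# shots the counts should be close to 50/50.
print(result.measurement_counts)    # e.g. Counter({'0': 503, '1': 497})
```

At the time of writing the SDK is distributed on PyPI as amazon-braket-sdk; the local simulator runs entirely on your machine and costs nothing.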

Applications

  • Quantum Cloud: Access to Quantum Computers via a cloud-based provider.  Some prominent firms currently offering such access include IBM, Amazon, Google, and Microsoft, among others.  Two benefits of such QC access are lower up-front costs (users do not need to buy any hardware) and futureproofing (i.e., as the QC makers create more powerful machines, cloud access can be directed to the newer machines without any added investment required by the users).
  • Quantum Communication: A method of communication that leverages certain features of quantum mechanics to ensure security.  Specifically, once a given qubit is “observed” or measured, it collapses to either a “1” or a “0”. Therefore, if anyone intercepts or reads a secure quantum message, the code will have changed such that the sender and receiver can see the impact of the breach.  QKD or quantum key distribution is an existing technology that is already in use over fiber optics, certain line-of-sight transmissions, and recently by China via a special satellite, between Beijing and Austria.
  • Shor’s Algorithm:  An integer factorization algorithm published in 1994 by American mathematician Peter Shor.  It is publicly known, and implementations are freely available to anyone hoping to break RSA encryption or other protocols that rely on the difficulty of factoring large numbers.  For this reason, it is often cited as a clear example of the need (and desire) for a QC powerful enough to run it at scale.  No QCs are yet powerful enough to use this algorithm to circumvent RSA or related encryption, but that is expected to change in the coming years.  “Post-quantum” encryption generally means a protocol that would not be vulnerable to Shor’s algorithm.
  • Grover’s Algorithm:  Another publicly available quantum algorithm, designed to speed up searching.  For most current computer searches, the target samples must either be processed one at a time until the desired result is found, or the data must be organized (i.e., put in numerical or alphabetical order) to be searched more efficiently.  Grover’s algorithm can effectively search much of the entire field at once (depending on the power of the QC) and therefore find results much faster (a small two-qubit sketch follows this list).  Shor’s and Grover’s algorithms are often the first two algorithms cited when discussing quantum speedups and are elegant examples of the advantage that QCs can provide.
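As a concrete, hedged illustration of Grover’s algorithm (the sketch promised above), the two-qubit case is small enough to write out in full using the Braket SDK’s local simulator.  In this example, which is my own and not taken from any vendor material, the oracle marks the state |11⟩, and after a single Grover iteration essentially every shot should return ‘11’.

```python
# Hedged sketch: two-qubit Grover search for the marked state |11>,
# run on the Braket SDK's local simulator.
from braket.circuits import Circuit
from braket.devices import LocalSimulator

grover = Circuit()
grover.h(0).h(1)                    # uniform superposition over 00, 01, 10, 11
grover.cz(0, 1)                     # oracle: flip the phase of |11>
grover.h(0).h(1).x(0).x(1)          # diffusion operator (inversion about the mean)
grover.cz(0, 1)
grover.x(0).x(1).h(0).h(1)

counts = LocalSimulator().run(grover, shots=1000).result().measurement_counts
print(counts)                       # expected: essentially every shot reads '11'
```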

I hope this glossary is a useful companion for your journey in understanding and appreciating Quantum Computing.  Feedback is always invited.

Disclosure: I have no beneficial positions in stocks discussed in this review, nor do I have any business relationship with any company mentioned in this post.  I wrote this article myself and express it as my own opinion.


References:

Quantum Computing: Progress and Prospects, The National Academies Press, 2019

Azure Quantum Glossary, Microsoft.com, accessed January 22, 2022

The Rise of Quantum Computing, McKinsey & Company, December 14, 2021

Glossary, Dotquantum.io, accessed January 22, 2022

Dilmegani, Cem, Quantum Computing Programming Languages, AI Multiple, published April 11, 2021, updated January 4, 2022.

Parker, Edward, “Commercial and Military Applications and Timelines for Quantum Technology” Rand Corporation, July, 2020.

If you enjoyed this post, please visit my website and enter your email to receive future posts and updates: http://quantumtech.blog.  Russ Fein is a venture investor with deep interests in Quantum Computing (QC).  For more of his thoughts about QC please visit the link to the left.  For more information about his firm, please visit Corporate Fuel.  Russ can be reached at russ@quantumtech.blog.

Ten Fundamental Facts About Quantum Computing


The Quantum Leap

January 17, 2022


I’ve covered some of the key aspects of Quantum Computing in prior posts, including details about things like qubits, superposition, and entanglement.  I thought it would be helpful to readers to now synthesize and consolidate some of the fundamental properties of Quantum Computing in order to provide a bigger picture of the promise and potential of the industry. 

However, I want to be on alert for overblown claims or statements that disconnect fact from reality.  Some speak of a “Quantum Winter” in which the hype gets overblown and people get fed up with the unfulfilled promise, diverting their attention (and resources) elsewhere, as has happened with nuclear fusion as a power source.  So, I will be careful to be as fact-based as possible.  As with all these posts, I hope that readers without any formal physics or computer science training can still appreciate and understand the information presented.  Feedback is always welcomed and encouraged.

  1. What is a Quantum Computer?

Quantum Computers (QCs) use incredibly tiny particles (e.g., atoms, ions, or photons) to process information.  The physics that governs the behavior of particles at this minuscule scale is quite different from the physics we experience at our much larger “people scale”.  QCs control and manipulate the individual particles as “qubits,” which hold and process information analogously to how “bits” control our computers and electronic devices.  However, the quantum mechanics at work at this scale allow QCs to process more information much more quickly than ordinary computers.  Also, because of the different physics at play, different questions can be processed, and physical systems can be more accurately modeled, suggesting significant new advances as the machines continue to scale in size and power.  The following table highlights some of the differences between existing digital/classical computers and QCs:

Think about these differences as enabling a Quantum Computer to do more per step, which is another way of saying it can process information faster than a classical computer. As it turns out, this speed advantage is phenomenal, which is why there is such enormous potential for Quantum Computers.  See here for a prior post with additional details.

  2. How are Qubits Made?

There are several different ways people are creating and manipulating qubits, each with an array of strengths and weaknesses. The overarching challenge for each method is the need to maintain a constant environment for the qubit, shielding it from light, electromagnetism, temperature fluctuations, etc., while at the same time maintaining exquisite control of the qubit.  Any tiny disturbance in the environment can throw off the qubit and create “noise” in the calculations.  On top of this is the challenge of achieving precise control of such tiny elements, often in a cryogenic environment.  The power of the qubits resides in the ability to manipulate or rotate them very precisely.  This is a difficult engineering requirement that is increasingly being met by the players in the industry.  While there are a growing number of methods for creating and controlling qubits, here are some of the most common:

  • Superconducting Qubits:  Some leading QC players, including Google and IBM, are using superconducting circuits, operated at near absolute-zero temperatures, to control and measure electrons.  While there are a few different ways these qubits are created (charge, flux, or phase qubits), the approach generally utilizes a microwave resonator to excite an electron as it oscillates around a loop that contains a tiny gap, measuring how the electron crosses that gap.  Superconducting qubits have been used for many years, so there is abundant experimental knowledge, and they appear to be quite scalable.  However, the requirement to operate near absolute zero adds a layer of complexity and makes some of the measurement instrumentation difficult to engineer due to this low-temperature environment.
  • Trapped Ions:  Ions are normal atoms that have gained or lost electrons, thus acquiring an electrical charge.  Such charged atoms can be held in place via electric fields, and the energy states of the outer electrons can be manipulated using lasers to excite or cool the target electron. These target electrons move or “leap” (the origin of the term “quantum leap”) between outer orbits as they absorb or emit single photons.  Trapped ions are highly accurate and stable, although they are slow to react and need the coordinated control of many lasers.
  • Photonic Qubits: Photons do not have mass or charge and therefore do not interact with each other, making them ideal candidates for quantum information processing.  Photons are manipulated using phase shifters and beam splitters and are sent through a maze of optical channels on a specially designed chip, where they are measured by their horizontal or vertical polarization.
  • Semiconductor/Silicon Dots: A quantum dot is a nanoparticle created from a semiconductor material such as cadmium sulfide, germanium, or, most often, silicon (due to the large amount of knowledge derived from decades of silicon chip manufacturing in the semiconductor industry). An artificial atom is created by confining an extra electron within a tiny region of pure silicon using electrical fields. The spin of the electron is then controlled and measured via microwaves.

The following table highlights some of the features of these strategies along with companies currently working on QCs with these qubits.  See here for a prior post which provides added details.

  3. What is Superposition and Entanglement?

Nearly every introduction to Quantum Computing includes an explanation of Superposition and Entanglement, because these are the properties that enable qubits to contain and process so much more information than digital computing bits and enable the phenomenal speed-up in calculations.  While these are profound properties that are difficult to conceptualize with our common frame-of-reference on the macro-scale world, they are well established quantum physical properties. 

  • Superposition: classical computers use a binary system, meaning each processing unit, or bit, is either a “1” or a ”0” (“on” or “off”) whereas Quantum Computers use qubits which can be either “1” or “0” (typically “spin up” or “spin down”) or both at the same time, a state referred to as being in a superposition.  This is a bit more subtle than it sounds because to use qubits for computational purposes, they need to be measured and whenever you measure a qubit you will find it collapsing into either the 1 or 0 state.  But between measurements, a qubit can be in a superposition of both at the same time, which imparts more information per processing unit than a classical bit. 
  • Entanglement: Quantum entanglement is a physical phenomenon that occurs when a group of particles are created, interact, or share proximity in such a way that the particles are “connected,” even if they are subsequently separated by large distances.  Qubits are made of things such as electrons (which spin in one of two directions) or photons (which are polarized in one of two directions), and when they become “entangled”, their spin or polarization becomes perfectly correlated.  It is this feature of quantum mechanics that largely underpins the awesome power of Quantum Computers because it enables the information processing to scale by an exponential factor (n qubits = 2^n bits).  The following table showcases this feature:

To give this some context, 100 qubits is the equivalent of an Exabyte of classical computing RAM which is a million trillion bytes (18 zeros). It would take a powerful classical computer nearly the lifetime of the universe to process that amount of data! The corollary is that quantum computers can perform certain complex calculations phenomenally faster than classical computers, and this concept of entanglement is a key to the performance superiority.  See here for more details on superposition and entanglement.
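One common way (though by no means the only convention) to put numbers on this exponential scaling is to ask how much classical memory would be needed just to store the full state vector of n qubits.  The sketch below is my own back-of-the-envelope estimate, assuming 16 bytes per complex amplitude; the absolute figures depend on the convention used (and differ from the table’s equivalence), but the doubling with every added qubit is the essential point.

```python
# Hedged sketch: classical memory needed to store the full state vector of
# n qubits, assuming one 16-byte complex amplitude per basis state.
BYTES_PER_AMPLITUDE = 16

def state_vector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE

for n in (10, 30, 50, 100):
    print(f"{n:>3} qubits -> {state_vector_bytes(n):.3e} bytes")

# Under this convention, 50 qubits already lands in the petabyte range,
# and 100 qubits is far beyond anything classical hardware could hold.
```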

However, the sobering reality is that this chart assumes the qubits can be perfectly controlled for the duration of the calculations and that all the qubits can entangle with each other.  We are still quite far from achieving these parameters at a meaningful scale, although progress is being made continuously.  The other key to appreciating this is to distinguish between the “logical qubits” this table describes and “physical qubits.”  You may hear of companies using quantum computers with over 1,000 qubits, but in the current NISQ (noisy intermediate-scale quantum) environment, many of the physical qubits are dedicated to error correction rather than logic/calculations, and the qubits often lose their superposition or entanglement properties (decoherence) very quickly, before the algorithms can be completed.  So, discussions about the number of qubits in a given quantum computer need the proper context to understand the computing-power implications.
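As a rough, purely illustrative back-of-the-envelope (the real overhead depends on the error-correcting code, the physical error rates and the algorithm), a ratio on the order of 1,000 physical qubits per logical qubit is often quoted for surface-code error correction; under that assumption a “1,000-qubit” machine yields only a handful of logical qubits:

```python
# Hypothetical, illustrative numbers only -- real overheads vary widely.
physical_qubits = 1_000          # a headline "1,000-qubit" NISQ-era machine
overhead_per_logical = 1_000     # assumed physical-to-logical ratio (illustrative)
logical_qubits = physical_qubits // overhead_per_logical
print(f"~{logical_qubits} logical qubit(s)")   # ~1 logical qubit
```

This is why headline qubit counts, on their own, say little about usable computing power.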

  1. Is the Power of Quantum Computers Magical?

You may be hearing claims of phenomenal powers of Quantum Computers (including from yours truly), along with descriptions of “quantum” as doing things surreal or supernatural (e.g., Schrodinger’s cat being both alive and dead).  Superposition and Entanglement are very difficult for a lay person to understand or appreciate, let alone to believe they can be used for computing purposes.  Some even describe quantum mechanics as “magical.”  Most people, when they think of magic, conjure up parlor tricks or optical illusions, so it would be natural to doubt the veracity of the claims of QC, especially since nobody has yet created a QC that can perform useful real-world computations beyond the reach of a classical computer.  However, while the underlying mathematics is advanced, there is clear and agreed science concerning the construction and performance of Quantum Computers.  The mathematical principles of manipulating qubits and using them to create logic gates are based on well-established linear algebra and trigonometry.  Innumerable quantum algorithms are being written and will perform useful and important calculations once quantum machines scale to the required power.  It is difficult to predict precisely when such scale will be achieved, but those in the field will confirm that this is an engineering challenge, not a theoretical one.
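For readers who want a taste of that linear algebra, here is a small NumPy sketch (my own illustration) applying the Hadamard gate, a standard 2×2 unitary matrix, to the |0⟩ state to produce an equal superposition; nothing more exotic than matrix multiplication is involved:

```python
import numpy as np

# Hadamard gate: a well-known 2x2 unitary matrix.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

ket0 = np.array([1, 0])           # the |0> state as a vector
superposition = H @ ket0          # apply the gate
print(superposition)              # [0.7071 0.7071] -> equal superposition
print(np.abs(superposition) ** 2) # [0.5 0.5] -> 50/50 measurement odds
```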

  1. I Hear True Quantum Computing may be Decades Away.  Is that True?

This is very difficult to answer with precision.  My first “computer,” bought in 1980, was a Sinclair ZX80 with only 8K of memory, a puny amount compared to today’s PCs.  It certainly could not perform any applications or calculations that were of practical use at the time, although I was able to write some very basic code (fittingly, BASIC was also the programming language it used).  But I could truthfully and accurately say in 1980 that I was using a personal computer to execute commands.  A similar statement can currently be made by users of existing QCs, and many people are using cloud-based Quantum Computers today to run simple algorithms.  While these machines are not yet capable of performing calculations that ordinary computers can’t, it is a dynamic and evolving situation.

At the same time, companies like D-Wave offer “quantum” computers that use annealing, which leverages certain aspects of quantum mechanics but cannot yet perform typical gate operations.  Although these are not full-fledged QCs in the typical sense, D-Wave has many customers performing useful optimization calculations today.

While there are no crystal balls, there are several high-profile quantum computing companies publishing their development timelines, which generally suggest a large-scale product (i.e., more than 100 logical qubits) before the end of this decade.   See below for IBM, Honeywell (Quantinuum) and IonQ versions: 

Many predict consistent Quantum Advantage (when Quantum Computers can reliably outperform classical computers on real-world calculations) within the next 5-10 years.  The key thing to follow as the industry advances will be which players succeed in meeting their timeline milestones.  As more companies achieve important stated milestones, this timeline should become more precise.

  1. Can We Measure Quantum Computing Power?

Unfortunately, there is no universally recognized standard for measuring the power of a Quantum Computer.  Several characteristics are important, including the number of qubits, the fidelity of the qubits, the length of time entanglement can be sustained, the number of gates that can be utilized, the number of connections between qubits that can be controlled, and so on.  IBM has proposed a metric called “quantum volume” that is intended to consolidate many of these features, although not all players use this standard.  Absent an established metric, be careful to understand the claims made by Quantum Computing companies, recognizing that the power of the computer is not necessarily directly correlated with the number of qubits it uses.   See here, for a prior post which covered performance measurement.
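As a quick worked example of how to read the number (based on IBM’s definition, in which quantum volume is 2^n for the largest n-qubit, n-layer “square” circuit a machine can run reliably), a few representative values translate as follows:

```python
import math

# Quantum volume (QV) is defined as 2**n, where n is the width/depth of the
# largest "square" random circuit the machine runs reliably.
for qv in (64, 128, 2048):
    print(f"QV {qv:>5} -> reliable square circuits of size {int(math.log2(qv))}")
```

So a quantum volume of 2,048 corresponds to reliably running circuits roughly 11 qubits wide and 11 layers deep, regardless of how many physical qubits the machine contains.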

  1. Are People Really Using Quantum Computers?

This is a bit of a trick question.  The truth is that dozens of providers have made actual working Quantum Computers available for use via the cloud.  Some basic machines are available for no charge, some are free for academic use, and some can be utilized for a modest cost.  You could finish reading this article and, assuming you are familiar with basic Python programming, download a development kit from IBM (Qiskit), Microsoft (Q#), Google (Cirq), Amazon (Amazon Braket) or others, begin writing quantum algorithms, and then establish an account with one of the QC cloud providers and either wait in the queue for your turn on a given machine or acquire time to have the algorithm run on one of dozens of machines available remotely.
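To give a flavor of what “writing a quantum algorithm” looks like, here is a minimal Qiskit sketch (a rough illustration; the exact API varies slightly between Qiskit versions) that builds a two-qubit Bell state, the canonical superposition-plus-entanglement example, and inspects the resulting state locally before any cloud hardware is involved:

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

# Build the canonical Bell-state circuit.
qc = QuantumCircuit(2)
qc.h(0)        # Hadamard puts qubit 0 into superposition
qc.cx(0, 1)    # CNOT entangles qubit 0 with qubit 1

state = Statevector.from_instruction(qc)
print(state)   # amplitudes ~0.707 on |00> and |11>, zero on |01> and |10>
```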

A recent study by Zapata Computing revealed that many companies are also using or planning to use QC in their businesses.  Specifically, the study indicates that “69% of enterprises across the globe reveal they have adopted or are planning to adopt QC in the next year,” with those already having adopted some form of QC amounting to 29% of their survey respondents.  In addition, you may read of many companies using Quantum Computers today to begin various optimization analyses.  The following highlights some of the companies currently exploring QCs for various business applications:

  1. Where will Quantum Computing Provide Early Impact?

The superposition and entanglement of qubits enable QCs to evaluate many dataset items simultaneously instead of linearly, hence the tremendous speed-up in processing.   One area where QCs can use this speed-up to provide a quantum advantage is in processing currently unmanageable combinatorial problems (simulation and/or optimization).  To visualize this, consider that a simple seating chart for 16 people involves over 20 trillion possible configurations [see here for a prior post describing this in more detail].  Now imagine the complexity of trying to design new chemicals, materials, medicines or optimized financial portfolios.  The number of atoms, chemical bonds, or securities involved makes computer simulation practically impossible with existing classical computers, and the trial-and-error of experimentation is costly and time-consuming.   Therefore, problems involving combinatorics are the likely first uses of QCs.  The following table highlights some of these use cases:
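As a quick aside before the table, the seating-chart number is easy to verify (my own sanity check in Python):

```python
import math

# Number of ways to arrange 16 people in 16 distinct seats: 16!
print(math.factorial(16))   # 20,922,789,888,000 -- over 20 trillion
# Adding just four more guests makes it roughly 2.4 quintillion (20!).
print(math.factorial(20))
```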

  1. Are My Bitcoin Portfolio or Encrypted Bank Transactions Vulnerable to Quantum Attack?

The short answer is, not really.   While it is theoretically true that a powerful enough Quantum Computer could mine all remaining cryptocurrency and break standard RSA encryption (used for most secure messages and transactions communicated over the Web), this is a well-known issue that is receiving substantial remedial attention.  NIST (the National Institute of Standards and Technology), a government entity which oversees certain standards and measurements, is in the final round of approving candidates for a post-quantum cryptography standard.  There are four Round 3 finalists for Public-Key Encryption and Key-Establishment Algorithms and three Round 3 finalists for Digital Signature Algorithms, so new approved protocols which are “quantum safe” are imminent.  In addition, there are other ways to secure on-line transactions besides RSA encryption, such as two-factor authentication, so more and more users are establishing enhanced protections.  As for bitcoin, that is a bit more nuanced.  Since most cryptocurrencies rely on increasingly complex mathematics for the mining of new coins, there is a finite number of bitcoins that can be created, and with existing computing power it is anticipated that the discovery, or mining, of new coins will take longer and longer until the final amount is reached (estimated at ~100 years at the current pace).   So, if quantum computers are built which can mine faster, this end date may be accelerated, but the total number of possible bitcoins won’t change.

  1. How Can I Learn More?

There are many excellent resources available including articles, papers, on-line tutorials, books, and other resources.  Please sign up to receive this blog as new posts are written and/or visit this section of the Quantum Leap blog for links to some additional resources.

Disclosure: I have no beneficial positions in stocks discussed in this review, nor do I have any business relationship with any company mentioned in this post.  I wrote this article myself and express it as my own opinion.


References:

IBM’s roadmap for scaling quantum technology | IBM Research Blog, retrieved January 16, 2022.

Scaling IonQ’s Quantum Computers: The Roadmap, retrieved January 16, 2022.

Jean-Francois Bobier, Matt Langione, Edward Tao and Antoine Gourevitch, “What Happens When “If” Turns to “When” in Quantum Computing”, Boston Consulting Group, July 2021.

Harnessing the Power of Quantum Computing | Honeywell Beyond 2021, accessed January 9, 2022.

“Starting the Quantum Incubation Journey with Business Experiments”, Digitale Welt Magazine, accessed January 16, 2022.

The First Annual Report on Enterprise Quantum Computing Adoption, Zapata Computing, July 5, 2022.

If you enjoyed this post, please visit my website and enter your email to receive future posts and updates:
http://quantumtech.blog
Russ Fein is a venture investor with deep interests in Quantum Computing (QC).  For more of his thoughts about QC please visit the link to the left.  For more information about his firm, please visit Corporate Fuel.  Russ can be reached at russ@quantumtech.blog.

Quantinuum – Company Evaluation

The Quantum Leap

January 9, 2022

When I established this blog in November of last year, I noted that I would present posts regarding details underlying Quantum Computers (QC), the immense potential they hold, and advances being made.  I hope you have enjoyed those posts which will continue (see here for a link to prior posts), but I also stated an intention to reflect on current events, companies and breakthroughs.  I thought it fitting that Quantinuum be the first company profile presented in this series. 

Background

In June of 2021, after a series of successful collaborations, Cambridge Quantum Computing (CQC) reached an agreement to merge with Honeywell Quantum Solutions (HQS), a division of Honeywell. Then in November of 2021, Honeywell spun out the combined businesses into a new stand-alone company called “Quantinuum”. In addition, Honeywell invested $300m into Quantinuum, which is now 54% owned by Honeywell and 46% by CQC shareholders.

CQC, founded in 2014, is a global Quantum Computing software company which designs enterprise applications in the areas of quantum chemistry, machine learning and cybersecurity, among others.  Honeywell is a Fortune 100 multinational conglomerate with operations in aerospace, building technologies, performance materials, and safety and productivity solutions.  Its diverse industrial footprint includes expertise in cryogenic systems, ultra-high vacuum systems, photonics, RF (radio-frequency) magnetic systems, and ultra-high precision control systems, all of which turned out to be extremely well suited to building a quantum computer.  Around 2010, Honeywell Quantum Solutions was secretly formed; it reached some critical technical milestones in 2015 and was publicly disclosed in 2018.  In 2020, HQS released the “Model H1,” a modest 10-qubit trapped-ion QC, and it has been on an aggressive timetable for scaling up its QC portfolio, recently achieving a quantum volume of 2,048 with its 12-qubit Model H1-2, a 10x increase in quantum volume in less than one year.

Details on Honeywell Quantum Solutions

Leveraging its 130 years of innovation, including strengths in science, engineering and research, Honeywell has developed trapped-ion quantum computers using individual charged atoms (ions) to hold quantum information. Their system uses electromagnetic fields to hold (trap) each ion so it can be manipulated and encoded using microwave signals and lasers.  These trapped-ion qubits can be uniformly manufactured and controlled more easily than alternative qubit technologies that do not directly use atoms, and the system does not require cryogenic cooling (although an ultra-high vacuum environment is required).

In October of 2020, HQS introduced its first quantum computer, the System Model H1, which featured 10 fully connected qubits and a quantum volume of 128, the highest reported at the time (surpassing IBM’s prior record of 64).  By this past December, the Model H1-2 successfully passed the quantum volume benchmark of 2,048, a new global record and consistent with the Company’s stated timeline of annual 10x increases in quantum volume.  The hardware roadmap includes four key milestones to be achieved before the end of the current decade:

  1. Model H1: Creation of a linear device with 10 computational qubits [achieved], eventually scaling to 40 qubits.
  2. Model H2: Using the same lasers to perform operations on two sides of a racetrack configuration.  Once achieved, quantum volume should exceed that possible with classical computers (i.e., will not be able to be simulated on classical machines).
  3. Model H3: Geometries change to a grid, which will be much more scalable than linear or racetrack configurations.
  4. Model H4: Aim to integrate optics via photonic devices that allow laser sources to be an integrated circuit. 

The following chart showcases the planned roadmap:

Source: Honeywell

Details on Cambridge Quantum Computing

The team at CQC has been developing the theoretical foundations of Quantum Computing for over 25 years.  They design, engineer and deploy algorithms and enterprise level applications leveraging TKET, their hardware-agnostic software development platform, along with other technologies.  They have developed application specific quantum software across a number of fields including quantum chemistry, quantum artificial intelligence and quantum cybersecurity.  Here is a brief overview of their products and solutions:

TKET: is a leading open-source development toolkit that enables optimization and manipulation of quantum circuits for current quantum computers.  As a platform-agnostic tool, TKET can integrate with most commercially available quantum hardware platforms including IBM, Honeywell, Google, IonQ and others, as well as third-party quantum programming tools including Cirq, Qiskit and Pennylane.
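For illustration, TKET is typically driven from Python via the pytket package; the sketch below is my own assumption about the minimal workflow rather than something taken from Quantinuum’s documentation, so treat the method names as approximate and consult the TKET docs for specifics:

```python
# Assumed minimal pytket workflow -- names are from memory and may differ
# slightly between pytket versions; see the official TKET documentation.
from pytket import Circuit

circ = Circuit(2, 2)   # 2 qubits, 2 classical bits
circ.H(0)              # superposition on qubit 0
circ.CX(0, 1)          # entangle qubits 0 and 1
circ.measure_all()

# A backend-specific extension (e.g., pytket-qiskit or pytket-quantinuum)
# would then compile `circ` to the target machine's native gate set.
```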

Quantum Origin: is an industry-defining, cryptographic key generation platform that employs quantum computers to generate quantum-enhanced cryptographic keys.  Using the Quantum Origin platform, both classical algorithms (e.g., RSA or AES) and post-quantum algorithms (e.g., CRYSTALS-Dilithium and Falcon) can be seeded to provide cryptographic keys that offer superior protection against even the most powerful of adversaries (see more on Quantum Origin below in the Strangeworks Collaboration section).

QNLP: The rapidly emerging field of Quantum Natural Language Processing, and its underlying theoretical foundations, has been pioneered by the team at CQC. lambeq is the world’s first software toolkit for QNLP capable of converting sentences into a quantum circuit. It is designed to accelerate the development of practical, real-world QNLP applications, such as automated dialogue, text mining, language translation, text-to-speech, language generation and bioinformatics. Their structural approach takes advantage of mathematical analogies between theoretical linguistics and quantum theory to design “quantum native” NLP pipelines. Combined with advances in Quantum Machine Learning (QML), CQC has successfully trained quantum computers to perform elementary text classification and question-answering tasks, paving the way for more scalable intelligence systems.

Quantum Artificial Intelligence: (QAI) is one of the most promising and broadly impactful application areas of quantum computing. CQC is simultaneously pioneering the highly interconnected areas of quantum machine learning, quantum natural language processing, quantum deep learning, combinatorial optimization and sampling (i.e., Monte Carlo simulations) to build intelligence systems of the future.

QA: The Quantum Algorithms division is seeking to realize definitive and unequivocal quantum computational advantage as soon as possible. Although ultimately interested in all quantum algorithms, at present, the focus is on three problems which show promise for early quantum advantage, including Monte Carlo estimation, optimization and solving Partial Differential Equations (PDEs).

QML: The Quantum Machine Learning division, in collaboration with industrial, academic and governmental partners, designs and engineers novel, application-motivated Quantum Machine Learning algorithms across industries such as finance, healthcare, pharma, energy and logistics.

EUMEN: Currently in advanced beta testing, EUMEN is an enterprise-grade quantum computational chemistry package and development ecosystem, enabling a new era of molecular and materials simulations. Developed in close collaboration with Fortune 500 partners, EUMEN’s modular workflow enables both computational chemists and quantum algorithm developers to easily mix and match the latest quantum algorithms with advanced subroutines and error mitigation techniques to obtain best-in-class results. Current applications in development with clients include new material discovery for carbon sequestration, drug design and discovery, and hydrogen storage.

The Combined Companies as Quantinuum

Quantinuum has the benefit of CQC’s software and algorithm expertise combined with HQS’s hardware expertise, creating the largest full-stack dedicated quantum computer company.  Quantinuum has about 400 employees in 7 offices in the US, UK and Japan.  On the hardware side, the Model H series of quantum computers are available via the cloud, facilitating broad access and ensuring it is “future-proof” for customers as the product evolves and advances.  On the software side, the open-source platform-agnostic approach will continue, ensuring customers always have access to the best tools for the target application and will not be dependent on a single company’s machines.

The predecessor companies had a long history of collaboration.  In fact, CQC was the first external user to run a quantum circuit on the System Model H0, Honeywell’s inaugural commercial system.  No organization outside of Honeywell had used the H-Series hardware more than CQC, so the formal combination of the businesses seems like a natural extension of their legacy collaborations.  Now that the business has been spun out into a stand-alone company, a Quantinuum IPO can be expected sometime this year.

Strangeworks Collaboration

“Quantum Origin” is the first commercially available product based on verifiable quantum randomness, a capability essential to securing existing security software and to protecting enterprise systems from threats posed by quantum computing-based attacks.  Just this past week, Strangeworks, a global quantum computing software company, announced a collaboration to implement Quantinuum’s quantum-enhanced cryptographic keys into the Strangeworks ecosystem.  By implementing Quantum Origin, Strangeworks will be the first to offer a seamless path to quantum-generated cryptographic keys, and the parties expect to expand the relationship, enabling rapid adoption, insights and continued development.

Select Customer Usage Cases

Quantinuum has listed a few case studies on their website,  including the following:

Nippon Steel: Has collaborated with the Company to optimize scheduling.  As the recent global supply-chain disruptions have highlighted, the complexities of managing manufacturing and supply often require companies to juggle resources.  Nippon Steel produces over 50 million metric tons of steel annually and has been using an algorithm, co-developed with Quantinuum and run on a System Model H1, to schedule the intermediate products it uses.  Having the right balance of raw materials and intermediate products is essential, and it is a delicate balancing act facilitated by Quantinuum.

Samsung: The electronics giant teamed up with Imperial College London to investigate new battery materials using a System Model H1. The team created a simulation of the dynamics of an interacting spin model to examine changes and effects of magnetism.  They were able to run deep circuits and use as many as 100 two-qubit gates to support the calculations, confirming the Model H1 can handle complex algorithms with a high degree of accuracy.

BMW: Entropica Labs, a Singapore-based quantum software startup, and the BMW Tech Office teamed up to develop and run a Recursive Quantum Approximate Optimization Algorithm (R-QAOA) to benchmark logistics and supply-chain optimization via number partitioning, a classic combinatorial problem that is an entry point to many logistics challenges.  More complex versions of R-QAOA are now being explored.
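To see why number partitioning is a natural benchmark, note that a brute-force classical solver (my own toy sketch below, not BMW’s or Entropica’s method) has to examine up to 2^n subsets of the items, which quickly becomes intractable as the list grows:

```python
from itertools import combinations

def best_partition(values):
    """Brute force: try every subset and minimize the difference of the two sums."""
    total = sum(values)
    best_diff, best_subset = total, ()
    for r in range(len(values) + 1):
        for subset in combinations(values, r):
            diff = abs(total - 2 * sum(subset))
            if diff < best_diff:
                best_diff, best_subset = diff, subset
    return best_diff, best_subset

# 8 items means 2**8 = 256 subsets, which is trivial;
# 50 items means 2**50 (about 10**15) subsets, which is hopeless classically.
print(best_partition([8, 7, 6, 5, 4, 3, 2, 1]))
```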

This is just a small sampling of current projects and customers, with more than 750 overall collaborations currently underway, suggesting substantial customer uptake and potential.

Summary

Cambridge Quantum Computing and Honeywell Quantum Solutions were each already formidable players in the evolving QC space and have been generating meaningful revenues from this nascent field.  CQC is/was a reputable and well-established quantum software and algorithm provider, and HQS has created advanced QC devices which continue to scale and surpass performance records.  Assuming they can achieve synergies as a combined company, the upward trajectory should accelerate.  That said, the QC industry is still quite immature, and many players are dedicating substantial resources, so any early market leads will remain vulnerable to new technologies or competitive advances.  If Quantinuum can successfully combine the broad client portfolio and industrial legacy of Honeywell with the substantial history and success of CQC, it should remain a leader in this growing field.  The following table highlights some of the key attributes of Quantinuum:

Rating

Apropos of the probabilistic nature of quantum algorithms, I wanted to leverage the nomenclature to create a company rating system and assign a scale to my overall assessment of a company’s potential.  Accordingly, I am going to use the formula below when reviewing companies, whereby the “alpha” coefficient correlates with “positivity” (and the formula adheres to the Born rule).  Given my overall assessment of Quantinuum including its strong position as a full-stack player, the strengths of the legacy businesses and the potential synergies, I am assigning the highest rating to Quantinuum at this time, with an Alpha of 0.95 which equates to an “Exceptional performance expected”.
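Since the rating formula appears only as an image, here is one plausible reading of it (my reconstruction, assuming the standard two-state qubit notation the post alludes to): the company is written as a superposition of a “success” and a “failure” outcome, with the Born rule forcing the weights to sum to one.

```latex
% Hedged reconstruction of the rating formula -- not the author's original image.
\[
  |\text{Company}\rangle \;=\; \alpha\,|\text{success}\rangle \;+\; \beta\,|\text{failure}\rangle,
  \qquad |\alpha|^2 + |\beta|^2 = 1 .
\]
% Under this reading, the assigned alpha of 0.95 implies a "probability of
% success" of |alpha|^2 = 0.95^2, roughly 0.90.
```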

Disclosure: I have no beneficial positions in stocks discussed in this review, nor do I have any business relationship with any company mentioned in this post.  I wrote this article myself and express it as my own opinion.


References:

Our Technology – Cambridge Quantum, retrieved January 8, 2022.

Strangeworks and Quantinuum partner to integrate world’s first quantum-enhanced cryptographic key service – Strangeworks, retrieved January 8, 2022.

TQD Exclusive: Interview with Tony Uttley, President of Honeywell Quantum Solutions, Kirmia, Andrew, May 3, 2021.

Cambridge Quantum Computing, Pitchbook profile, accessed August 2, 2021

Next Few Months Will Demonstrate Quantum Cybersecurity Value of the New Quantum Computing Company Quantinuum, The Qubit Report, December 3, 2021

If you enjoyed this post, please visit my website and enter your email to receive future posts and updates:
http://quantumtech.blog
Russ Fein is a private equity/venture investor with deep interests in Quantum Computing (QC).  For more of his thoughts about QC please visit the link to the left.  For more information about his firm, please visit Corporate Fuel.  Russ can be reached at russ@quantumtech.blog.