Quantum Quantum Everywhere

Quantum Mechanics in Everyday Life, Near Term Use and Future Quantum Computing Applications

In prior posts, I conveyed some of the underlying reasons why Quantum Computers can do things that existing digital computers cannot do, or would take prohibitively long to do. In this post I will cover how quantum mechanics, the physics underlying the power of Quantum Computing, is already used in our daily lives, some near-term applications where quantum effects are providing powerful new capabilities, and finally, where the power of Quantum Computing will likely have the most impact.

Quantum Mechanics in Everyday Life

Anyone trying to learn about Quantum Computing or quantum mechanics is likely baffled by how to picture it in a relatable way. Because quantum mechanics occurs on such a small scale, and the physics is wholly unfamiliar, it is an intimidating field and very difficult to visualize. However, we use and benefit from quantum mechanics every day without understanding the underlying physics. Here are some examples (Choudhury, 2019):

  1. Electronic Appliances: If you notice the heating elements in your toaster glowing red, you are witnessing a quantum mechanical application of electricity: electrical power being converted to heat and light.
  2. Computers (transistors and microchips): The core transistors in every computer (and in the chips used in many other modern products) work via semiconductors, where the electrons behave like waves, which is a core principle of quantum physics.
  3. LEDs: Like transistors, LEDs are made of two layers of semiconductor joined at a junction; when current flows across that junction, electrons release their energy as light, again a quantum physical action.
  4. Lasers: Lasers produce their monochromatic light via a form of optical amplification based on the stimulated emission of photons, another quantum physical process.
  5. MRIs: Magnetic Resonance Imaging works by flipping the spins in the nuclei of hydrogen atoms, a quantum mechanical property.
  6. GPS: The ubiquitous Global Positioning System, whose satellites use atomic clocks, along with corrections from quantum theory and relativity, to measure time and distance.
  7. Incandescent Bulbs: As with the toaster noted above, current passes through a thin filament and makes it hot, which causes it to glow and produce visible light, all quantum mechanical processes.
  8. Sensors: Nearly all of us have digital cameras or use the cameras in our phones. These cameras use a lens to collect and convey photons, which the sensor, a form of semiconductor, converts to a digital image.

Hopefully, these examples give you the confidence to appreciate that quantum physics impacts your everyday life without any need to understand the underlying physics.  Let’s use that baseline to now explore applications of quantum physics in quantum sensing, quantum communications and, finally, Quantum Computing.

Quantum Sensing

Quantum sensing has a broad variety of use cases, including enhanced imaging, radar, and navigation where GPS is unavailable. None of these uses requires entanglement, so they are much nearer to actual utilization than robust Quantum Computers.

Probes that make highly precise measurements of time, acceleration, and changes in magnetic, electric or gravitational fields can provide precise tracking of movement. In this case, if a starting point is known, every subsequent position can be calculated precisely, without the need for external GPS signals and without the ability for an adversary to jam or interfere with the signals, so this is of particular interest to the military.

Another application of quantum sensing involves ghost imaging and quantum illumination.  Ghost imaging uses quantum properties to detect distant objects using very weak illumination beams that are difficult for the target to detect, and which can penetrate smoke and clouds (Shapiro, 2008).  Quantum illumination is similar and can be used in quantum radar. 

Tabletop prototypes of these quantum sensing applications have already been demonstrated and have the nearest-term commercial potential (Palmer, 2017).

Quantum Communication

The primary near-term application of quantum mechanics in communications involves quantum key distribution (QKD). QKD is a form of encryption (more on encryption below) used between two communicating parties who encode their messages in transmitted photons. Due to the quantum nature of photons, any eavesdropper who intercepts a message encoded with QKD will leave a telltale sign that the data stream was read, since the act of measuring a photon alters it (a fundamental principle of quantum mechanics). For this reason, quantum-secure communication is often referred to as "unhackable". This principle has already been demonstrated over fiber optics and across line-of-sight towers (both of which have limitations on distance) and, more recently, via satellite by China. China launched the Mozi (Micius) satellite in 2016 and beamed a completely secure QKD-encrypted message between China and Austria (Liao et al., 2017). And this past month, CAPSat, a quantum communication satellite built in collaboration between the University of Illinois Urbana-Champaign and the University of Waterloo, was deployed into orbit from the ISS and is designed to test unhackable quantum communications. So long-range quantum communication is already becoming a reality (Schwink, 2021).
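To make the eavesdropper-detection idea concrete, here is a toy Python simulation in the spirit of BB84, the canonical QKD protocol (the specific bits, bases and probabilities are illustrative assumptions, not a description of any deployed system). Without an eavesdropper the sifted key matches perfectly; with one, roughly a quarter of the sifted bits disagree, revealing the intrusion:

    import random

    def sifted_error_rate(n_photons=10000, eavesdropper=False):
        # Toy BB84-style run: Alice encodes random bits in random bases,
        # Bob measures in random bases, and they keep only the rounds
        # where their bases happened to match (the "sifted" key).
        errors, sifted = 0, 0
        for _ in range(n_photons):
            bit = random.randint(0, 1)          # Alice's raw bit
            a_basis = random.randint(0, 1)      # Alice's encoding basis
            photon = (bit, a_basis)

            if eavesdropper:
                e_basis = random.randint(0, 1)
                # Measuring in the wrong basis gives a random result and
                # re-prepares the photon in Eve's basis, disturbing it.
                e_bit = bit if e_basis == a_basis else random.randint(0, 1)
                photon = (e_bit, e_basis)

            b_basis = random.randint(0, 1)      # Bob's measurement basis
            p_bit, p_basis = photon
            b_bit = p_bit if b_basis == p_basis else random.randint(0, 1)

            if b_basis == a_basis:              # keep matching-basis rounds only
                sifted += 1
                if b_bit != bit:
                    errors += 1
        return errors / sifted

    print("error rate without an eavesdropper:", sifted_error_rate())                       # ~0.00
    print("error rate with an eavesdropper:   ", sifted_error_rate(eavesdropper=True))      # ~0.25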

Quantum Computing

So far in this post I have shown you how quantum physics already impacts your everyday life as well as some new applications that are already in use or have shown success via prototypes, so will be utilized near-term.  The least commercially developed feature of quantum physics, but the most profoundly beneficial, involves the superposition and entanglement of qubits in Quantum Computing [covered in detail in the prior post].

I want to make clear that "Quantum Computers" are not all-powerful supercomputers that will replace existing binary-based computers. An essential feature of Quantum Computing lies in the exponential increase in computing power as you increase the number of entangled qubits, which distinguishes it from digital computing for certain types of calculations or problems. The most fundamental area where this exponential speedup is valuable is known as combinatorics. Let me provide an example to set the stage for this discussion.

Assume you manage a networking group, and you are planning the seating chart for this month’s meeting where eight members are going to attend. You want to arrange the seating so that you help optimize the networking opportunities as well as respect seniority by having certain members sit facing the door, etc. (the reasons are not important, just assume that the seating chart has many nuances). You may think this is an easy exercise – for example, put Alice and Bob next to each other, but not next to Charlie since they already know each other.  Put Sam closest to the door, etc.  However, it turns out that there are more than 40,000 different seating arrangements with just 8 people (for those trying to decipher the math, it is 8! or 8 factorial, meaning place any of the 8 attendees in the first seat, then any of the 7 remaining attendees in the next seat, etc., or 8 x 7 x 6 x 5 x 4 x 3 x 2 x 1 = 40,320 different seating combinations).  This may seem more complicated than you expected, but intuitively you may feel that you could work it out if you had to.

However, imagine that at next month's meeting 16 members attend, and you want to be equally diligent with the seating arrangement. For this meeting there are now 20,922,789,888,000 different seating arrangements possible, more than 20 trillion, with just 16 people (16 x 15 x 14 x ...). This defies intuition but is simple factorial math. Now, I am not suggesting we need Quantum Computers to help with seating charts, but a seating chart represents a typical "optimization" challenge: for certain problems, as you increase the number of inputs, the potential combinations become unmanageable very quickly, hence the reference to combinatorics.
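A few lines of Python make this factorial growth concrete (the guest counts are just illustrative):

    import math

    for guests in (8, 16, 32):
        arrangements = math.factorial(guests)
        print(f"{guests:>2} guests -> {arrangements:,} possible seating orders")

    #  8 guests -> 40,320
    # 16 guests -> 20,922,789,888,000
    # 32 guests -> roughly 2.6 x 10**35

By 32 guests the count already dwarfs the number of stars in the observable universe, which is why brute-force checking of every arrangement quickly becomes hopeless.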

Where will Quantum Computers Provide Near-Term Results?

The superposition and entanglement of qubits enables Quantum Computers to consider many combinations simultaneously instead of linearly, hence the tremendous speed-up in processing.   Let’s now dig into two areas where Quantum Computers can use these speedup features to provide a “quantum advantage” in the ability to process currently unmanageable combinatorial problems, namely simulation/optimization and cryptography. 

Simulation and Optimization

For optimization, you can imagine our networking seating problem as analogous to molecular modeling for things such as drug development or materials science.


In these cases, as you tweak the atoms, molecules or proteins you are studying, the number of different alignments or configurations increases quickly, as shown with the seating chart example. A powerful Quantum Computer could simulate and evaluate many potential configurations simultaneously and could dramatically accelerate advances in these fields. Here are some examples where Quantum Computers can accelerate computational problems:

  • Simulation: Simulating processes that occur in nature and are difficult or impossible to characterize and understand with classical computers, which has the potential to accelerate advances in drug discovery, battery design, fertilizer design, fluid dynamics, weather forecasting and derivatives pricing, among others.
  • Optimization: Using quantum algorithms to identify the best solution among a set of feasible options, such as in supply chain logistics, portfolio optimization, energy grid management or traffic control.

The table below highlights additional examples of fields where Quantum Computing speedup will manifest:

Here are examples regarding a few of these applications along with some of the companies already deploying early quantum computing programs:

  • Today, most new drugs are formulated by trial and error and the time between finding a new drug molecule and getting it into the clinic averages 13 years and costs up to $2 billion. If we can use Quantum Computers to model various drugs in silico, instead of through the trial and error of lab experiments, we could shorten this timeline and decrease the overall costs. Recently, healthcare giant Roche announced a partnership with Cambridge Quantum Computing to support efforts in research tackling Alzheimer’s disease. And synthetic biology company Menton AI has partnered with quantum annealing company D-Wave to explore how quantum algorithms could help design new proteins with therapeutic applications.
  • Fertilizers are crucial to feeding the world’s growing population because they allow food crops to grow stronger, bigger and faster. More than half of the world’s food production relies on synthetic ammonia fertilizer created by the Haber-Bosch process, which converts hydrogen and nitrogen to ammonia. However, this process has an enormous carbon footprint, including the energy needed to perform the conversion (some estimate this to be 2%-5% of ALL global energy production) as well as the huge amount of carbon-dioxide by-product it emits. Scientists believe that, using a Quantum Computer, they could map the chemistry used by certain bacteria that naturally fix nitrogen and uncover an alternative to the current synthetic fertilizers created by the Haber-Bosch process. In fact, Microsoft has already shown how Quantum Computers could help simulate this chemistry and has created a Quantum Chemistry Library to facilitate such research.
  • There is a global push to expand battery powered automobiles in a transition to a greener economy, but existing car batteries have limited capacity/range and long charge times. Searching for materials with better properties is another molecular simulation problem that can be better handled by Quantum Computers. That is why German car maker Daimler has partnered with IBM to assess how Quantum Computers could help simulate the behavior of sulphur molecules in different environments, with the end-goal of building lithium-sulphur batteries that are longer-lasting, better performing and less expensive than existing lithium-ion batteries.
  • The “traveling salesman problem” generally describes the challenge of optimizing the routing for businesses, another area where combinatorics makes the problems exponentially difficult to resolve as inputs are added. For example, a fleet of more than 50,000 merchant ships carrying 200,000 containers each, with a total value of $14 trillion dollars, is actively in motion each day. Energy giant ExxonMobil has teamed up with IBM to find out if Quantum Computers could do a better job optimizing these routes and related logistics.

In the next blog I will cover additional details on the players currently working with Quantum Computers for these and similar applications.

Encryption

Another field where Quantum Computers will have a profound impact is encryption. Nearly every time you log into a site on your computer or perform online banking transactions, and whenever governments send confidential communications between entities, that activity is "on the web," meaning potentially accessible to others. It is protected by an encryption protocol developed by Ron Rivest, Adi Shamir and Leonard Adleman in 1977 and known as RSA public-key encryption.

In a very truncated description, the foundation of the RSA encryption lies in the fact that it uses two very large prime numbers to create a “factoring problem”.    Here is an over-simplified explanation:

  1. The Receiver generates two very large prime numbers and multiplies them together. This product (along with a related public exponent) is published as the Public Key.
  2. A Sender uses the Public Key to encrypt or encipher a message and sends it over the Internet (in theory, anyone can see/read both the Public Key and the encoded message).
  3. The Receiver keeps the two prime factors secret; they form the basis of the Private Key, which is never transmitted.
  4. The Receiver uses its Private Key to decrypt or decipher the message.

The encoded message cannot be decoded without knowing this Private Key. Said another way, finding the two prime factors of a very large number is exceedingly difficult, so if the RSA key is based on a sufficiently large number (e.g., 2048 bits, which is over 600 digits long), it is practically impossible with current computers to find the two prime factors. However, in 1994, mathematician Peter Shor proposed an algorithm that could factor large numbers into their primes in polynomial time, dramatically faster than any known classical method. Implementations of his algorithm are freely available on the Internet for anyone to download (for those interested in seeing actual code, there is a GitHub implementation of Shor's algorithm written in Python, calling Q# for the quantum part). Existing Quantum Computers only have the power to factor fairly small numbers, but the code is readily available for whoever builds a powerful enough Quantum Computer to use it to break existing RSA encryption.
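As an illustration of why RSA rests on factoring, here is a toy Python sketch using tiny "textbook" primes (real keys use randomly chosen primes hundreds of digits long, and real implementations add padding and other safeguards omitted here):

    from math import gcd

    # Toy RSA with tiny primes (illustration only).
    p, q = 61, 53
    n = p * q                      # 3233: the public modulus
    phi = (p - 1) * (q - 1)        # 3120
    e = 17                         # public exponent, chosen coprime to phi
    assert gcd(e, phi) == 1
    d = pow(e, -1, phi)            # private exponent (modular inverse, Python 3.8+)

    message = 65
    ciphertext = pow(message, e, n)    # encrypt with the public key (n, e)
    recovered = pow(ciphertext, d, n)  # decrypt with the private key d
    assert recovered == message

    # "Breaking" the key means recovering p and q from n alone.  Trial
    # division is instant for n = 3233 but hopeless for a 2048-bit modulus.
    factor = next(f for f in range(2, n) if n % f == 0)
    print(factor, n // factor)         # 53 61

With a four-digit modulus, trial division finds the factors instantly; with a 2048-bit modulus it would not finish in any realistic amount of time on classical hardware, which is exactly the gap Shor's algorithm would close.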

Cryptocurrency mining and wallets are also areas that could be vulnerable to Quantum Computers. Bitcoin and other cryptocurrencies are "mined" by computers that race to solve increasingly difficult computational puzzles, and each solved puzzle results in the creation of new bitcoins (which is also why mining consumes so much power). By some estimates, under the current bitcoin protocol it will take another 120 years to mine the remaining coins, so once Quantum Computers are powerful enough they could, in principle, mine the remaining coins much faster. In addition, the wallets that most people use to hold their cryptocurrency rely on public-key cryptography and so have vulnerabilities similar to those described above regarding encryption.

I hope this post helps you appreciate how quantum mechanics already affects your everyday life and to begin to appreciate areas where Quantum Computers will have a profound impact.   Stay tuned for a deeper dive into this subject.


References:

Bobier, Jean-Francois, Langione, Matt, Tao, Edward and Gourevitch, Antoine, "What Happens When 'If' Turns to 'When' in Quantum Computing," Boston Consulting Group, July 2021.

Bodur, Hüseyin and Kara, Resul, "Secure SMS Encryption Using RSA Encryption Algorithm on Android Message Application," 2015.

Bova, Francesco, Goldfarb, Avi and Melko, Roger G., "Commercial applications of quantum computing," EPJ Quantum Technology, 2021.

Cavicchioli, Marco, "How fast can quantum computers mine bitcoin?" The Cryptonomist, May 12, 2020.

Choudhury, Ambika, "8 Ways You Didn't Know Quantum Technology Is Used In Everyday Lives," Analytics India Magazine (analyticsindiamag.com), October 7, 2019.

Leprince-Ringuet, Daphne, "Quantum computers: Eight ways quantum computing is going to change the world," ZDNet, November 1, 2021.

Liao, Sheng-Kai, Cai, Wen-Qi, Pan, Jian-Wei, et al., "Satellite-to-ground quantum key distribution," Nature, August 9, 2017.

Palmer, Jason, "Here, There and Everywhere: Quantum Technology Is Beginning to Come into Its Own," The Economist, 2017.

Parker, Edward, "Commercial and Military Applications and Timelines for Quantum Technology," RAND Corporation, July 2020.

Schwink, Siv, "Self-annealing photon detector brings global quantum internet one step closer to feasibility," University of Illinois Urbana-Champaign Grainger College of Engineering, October 13, 2021.

Quantum Superposition and Entanglement

In prior posts I emphasized the excitement and potential of Quantum Computing, without any reference to the underlying quantum mechanics, but would like to introduce you to some unique quantum properties in this post. While understanding the nature of Quantum Computing is complex and contains many new concepts, a basic understanding of “Superposition” and “Entanglement” is fundamental to grasp why this new computing methodology is so novel, powerful and exciting.  I am going to try to describe these concepts in a way that does not require any math, although I will use some math references to highlight how these concepts manifest.

Superposition

As noted in the prior post, one of the fundamental differences between Quantum Computers and classical computers lies in the core components used to process information. We know that classical computers use a binary system, meaning each processing unit, or bit, is either a "1" or a "0" ("on" or "off"), whereas Quantum Computers use qubits, which can be "1" or "0" (typically "spin up" or "spin down") or both at the same time, a state referred to as being in a superposition. This is a bit more subtle than it sounds, because in order to use qubits for computational purposes they need to be measured, and whenever you measure a qubit you will find it has collapsed into either the 1 or the 0 state. But between measurements, a quantum system can be in a superposition of both at the same time. While this seems counter-intuitive, and somewhat supernatural, it is well proven, so please try to accept it at face value in order to get the gist of the other concepts covered in this post. For a deeper dive into superposition from a particle physics perspective (light is both a particle and a wave), you can investigate wave-particle duality. [Fun Fact: Einstein did not receive the Nobel prize for his famous E = mc² relativity equation but rather for his work on the photoelectric effect, which is fundamental to quantum mechanics; he postulated the existence of photons, or "quanta" of light energy, which underpins much of the power behind Quantum Computing.]

Without getting into the physics or explaining complex numbers, superposition can be mathematically depicted as:

    |ψ> = α|0> + β|1>        with        |α|² + |β|² = 1

Please bear with me here; I have promised not to overwhelm you with complex math, I only want to highlight how to think about Superposition in a way that will help you appreciate its power for computing and to share the nomenclature that is generally used in the field. Don't focus on the Greek characters (psi, alpha and beta) or the linear algebra notation (the |Ket> notations and the parenthetical portion). Simply note that the equation above on the left is the mathematical representation of a qubit, and it states that a given qubit (the psi or trident symbol) has some likelihood, related to the alpha symbol, of being measured as "0" and some likelihood, related to the beta symbol, of being measured as "1". The equation on the right, known as the Born rule, simply states that these two probabilities (the squares of alpha and beta) add to 100%. Let me reframe that in a simpler manner: before a qubit is actually measured, it is both a "1" and a "0" at the same time, and the relative odds of finding one or the other are encoded in the qubit equation. In practice this means that using Quantum Computers to solve problems becomes a probabilistic exercise, and if the computation is run enough times, the measurement results converge on the answer.

You may recall from a prior post that qubits are described as three-dimensional, as shown below in blue and red. The line-drawing version with the symbols shows how this is used for calculations when a qubit is put into superposition (the math and symbols are helpful for those comfortable with them, but not essential for a general understanding):

In this depiction, if the North pole is "0" and the South pole is "1", and the qubit is tilted to the side, the degree of its tilt translates roughly into these probabilities. In the image with the blue arrow pointing to psi, you will notice that the psi symbol is leaning between north and south, in this case closer to north. For example, this might be represented as "0.8 |0> + 0.6 |1>", meaning it is leaning closer to "0" and therefore has a higher probability of being 0 when measured. [You will also note that if you square each term, you get 0.64 + 0.36, which equals 1 and therefore satisfies the Born rule; roughly speaking, the odds of this qubit measuring 0 are 64% and the odds of it measuring 1 are 36%.]
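A short Python sketch shows how those odds play out over repeated measurements (the amplitudes 0.8 and 0.6 are taken from the example above; everything else is illustrative):

    import random

    alpha, beta = 0.8, 0.6      # amplitudes for |0> and |1>  (0.64 + 0.36 = 1)
    shots = 10_000

    zeros = sum(random.random() < alpha**2 for _ in range(shots))
    print(f"measured 0: {zeros/shots:.0%}   measured 1: {(shots - zeros)/shots:.0%}")
    # typically prints something close to "measured 0: 64%   measured 1: 36%"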

The important part to take away is that since a qubit can represent an input with various weightings of 1 and 0, it contains much more information than a simple binary bit.

Entanglement

If the above explanation of superposition seems a bit unintuitive, I’m afraid that entanglement might seem even more bizarre [don’t worry, you’re not alone, even Einstein struggled with it], but I will do my best to explain it.  I will ask you to accept that, like Superposition, this is an actual phenomenon which is well proven, even if you likely won’t be able to picture it in your mind in a way that will be satisfying.  And it is this feature of quantum mechanics that largely underpins the awesome power of Quantum Computers because it enables the information processing to scale by an exponential factor (more on that below).

Quantum entanglement is a physical phenomenon that occurs when a group of particles are created, interact, or share proximity in such a way that the particles remain "connected," even if they are subsequently separated by large distances. Qubits are made of things such as electrons (which spin in one of two directions) or photons (which are polarized in one of two directions), and when they become "entangled," their spin or polarization becomes perfectly correlated. In fact, this concept is much older than Quantum Computers; it was first described in 1935 by Einstein, along with Boris Podolsky and Nathan Rosen, and became known as the EPR paradox (after their initials), a behavior Einstein famously referred to as "spooky action at a distance." What it means, simply, is that qubits can be made to entangle, which then enables them to correlate with each other. Quantum Computers use microwaves or lasers to nudge the qubits into a state/alignment where they can be correlated, or entangled.

Now let’s walk through how this entanglement can manifest in an exponential increase in computing power.  If we consider two classical bits, we know that they can be either 1 or 0, so together they can take on the following values:

                0, 0

                1, 0

                0, 1

                1, 1

However, two entangled qubits can be in a superposition of all of those values at once, so in this case 2 qubits can do the work of 4 bits. If we consider three classical bits, they can be any of the following combined entries:

                0, 0, 0                    1, 0, 0

                0, 1, 0                    1, 1, 0

                0, 0, 1                    1, 0, 1

                0, 1, 1                    1, 1, 1

So, in this case there are 8 combinations, but this can be fully described using 3 qubits (again because they are entangled). Mathematically, the number of classical bits required to match the computing power of n qubits is 2^n, an exponential relationship. For now, don't try to picture how entangled qubits do this, just know that they do. The purpose of this line of analysis is to give some numerical context as to why this entanglement makes Quantum Computers (phenomenally) more powerful than classical computers.

If we continue the logic above for increasing numbers of bits/qubits we get the following:
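A minimal Python sketch, using the 2^n relationship described above, reproduces the progression:

    for n_qubits in (3, 10, 13, 23, 33, 43):
        bits = 2 ** n_qubits                      # classical bits needed to describe n qubits
        print(f"{n_qubits:>2} qubits ~ {bits:,} classical bits ({bits // 8:,} bytes)")

    # 13 qubits ~ 8,192 classical bits (1,024 bytes), i.e. one kilobyte;
    # 23 qubits is roughly a megabyte, 33 a gigabyte, 43 a terabyte.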

So it only takes 13 qubits to have the equivalent classical computing power as a kilobyte (KB). Now let’s see how that manifests in computer power/speed.

Let’s assume we have a pretty powerful current classical computer processor, which might have a clock speed of 4 GHz, meaning it can execute 4 BILLION cycles per second, which sounds (and is) phenomenally fast. High-end current gaming PCs generally operate at this speed and provide an excellent performance experience. Let’s now use this baseline processing speed, and scale up the prior table, to see the profound impact of exponential computing power on processing time:
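As a rough back-of-the-envelope sketch (assuming, very generously, that the classical machine can examine one combination per clock cycle), the processing times grow like this:

    SECONDS_PER_YEAR = 3.15e7
    CLOCK_HZ = 4e9                        # 4 GHz: 4 billion cycles per second

    for n_qubits in (30, 50, 70, 100):
        combinations = 2 ** n_qubits      # combinations a classical machine must step through
        seconds = combinations / CLOCK_HZ
        print(f"{n_qubits:>3} qubits: {seconds:.3g} seconds (~{seconds / SECONDS_PER_YEAR:.3g} years)")

    # 100 qubits works out to roughly 1e13 years, hundreds of times the age of the universe.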

To give this some context, describing the full state of 100 entangled qubits would require 2^100 classical bits, on the order of 10^29 bytes, vastly more than an exabyte (a million trillion bytes, or 18 zeros). It would take a powerful classical computer far longer than the lifetime of the universe to work through that amount of data! The corollary is that quantum computers can perform certain complex calculations phenomenally faster than classical computers, and this concept of entanglement is a key to the performance superiority.

While this analysis assumes certain types of computational problems, the point is to show how Quantum Computers can process information with an unprecedented speedup.

The key takeaway from this post is that Quantum Computers, using qubits that can be both in superposition and entangled, allow these machines to process inputs much faster than is possible with classical computing architecture.  Now all we need is a reliable working Quantum Computer with just 100 qubits, which would not only enable massive speedup for certain problems but would open the door to all sorts of new questions and analyses. Many experts predict this will be achieved within 5 years (IonQ has recently showcased a Quantum Computer with 32 entangled qubits, so real progress is being made).  Some general categories of problems where this phenomenal speedup will have a profound impact include simulation, optimization and encryption. In the next post I will provide more insights into what types of problems can be solved with the exponential speedup that can be provided by Quantum Computers.

References:

https://towardsdatascience.com/the-need-promise-and-reality-of-quantum-computing-4264ce15c6c0

https://vincentlauzon.com/2018/03/21/quantum-computing-how-does-it-scale/

What is a Computer? – Analog vs Digital vs Quantum

Before we can get into the inner workings of a Quantum Computer, we should make sure we are in alignment on what a computer actually is. At its core, a computer is a machine that is designed to perform prescribed mathematical and logical operations at high speed and display the results of these operations. Generally speaking, mankind has been using "computers," in the form of the abacus, since circa 2700 BC. Fast forward a few millennia, and we see the first "programmable computer" designed by Charles Babbage in 1833. It then took another 100+ years for the first working "electromechanical programmable digital computer," the Z3, to be built by Konrad Zuse in 1941.

During World War II, a flurry of advances occurred, including the use of vacuum tubes and digital electronic circuits. The famously depicted electromechanical Bombe machines at Bletchley Park in the UK were used to break the encryption of the German Enigma cipher, and they were soon followed by Colossus in 1944, the first "electronic digital programmable computer," which was also used for military codebreaking. ENIAC was the first such device built in the US, completed in 1945. It weighed 30 tons and contained over 18,000 vacuum tubes, 1,500 relays, and hundreds of thousands of resistors, capacitors and inductors, but could add or subtract 5,000 times per second, which was a thousand times faster than any prior device, and it could also handle multiplication, division and square roots. In many ways, the tangle of cables and electronics in photos of ENIAC (below left) seems eerily similar to photos of today's Quantum Computers (below right):

The next big advance in computers came with the invention of the integrated circuit in the late 1950s. The underlying transistor technology, the MOSFET (short for metal-oxide-semiconductor field-effect transistor), was invented at Bell Labs in 1959, and by 1968 the first silicon-gate integrated circuits had been developed at Fairchild Semiconductor. The MOSFET is the core technology underpinning most current "digital computers."

Moore’s Law

The first commercial MOSFET-based processors, built in 1971, had process nodes that were 10 microns in size, a fraction of the width of a human hair (which is about 50-70 microns). Gordon Moore was a co-founder of Fairchild Semiconductor (and later of Intel), and in 1975 he postulated that the number of transistors that could fit on an integrated circuit would double every two years, implicitly suggesting that costs would thereby decrease by a factor of two. This log-linear relationship was expected, at the time, to continue for about ten years but has remained remarkably consistent through today, meaning it has held for nearly 50 years. However, for this rule/law to hold, the size of the process nodes needed to continue to shrink. In fact, today's generation of MOSFETs includes 5 nm nodes ("nm" or nanometer is one-billionth of a meter), which is 1/2,000th the size of the first MOSFET nodes. Ironically, as these scales continue to shrink, they begin to approach the "quantum scale," whereby the electrons being used in the processors begin to exhibit quantum behaviors, such as quantum tunneling, thereby reducing their effectiveness at processing in traditional digital devices.

While Moore's Law has been amazingly prescient and consistent for these many decades, there is a practical minimum size below which transistors cannot be efficiently utilized, largely because of these scale/quantum limitations. While the 5 nm process is the current working minimum for semiconductors, and 3 nm and even 2 nm transistor scales are in development, there does appear to be an end in sight, likely due to the quantum tunneling challenge at such scales. The graphic below[1] shows the uncanny straight line (dark blue) of transistor scaling. However, the light blue and brown lines show some recent plateauing of maximum clock speed and thermal power utilization, indicating declining efficiency as scale shrinks.

Analogue vs Digital vs. Quantum

Readers who lived through the 90s are likely familiar with the transition from "analogue" to "digital". This manifested most notably in the music industry, with the replacement of analogue phonograph records by digital discs and streamed digitized music. I won't get into the audiophile arguments about which sound was purer, but I highlight this item to emphasize the "digitization" of things during our lifetime.

In the prior blog post I noted that computers use digital gates to process logic (i.e., AND, NOT and OR gates). However, each of these gates can be performed by analogue methods and can even be simulated using billiard balls, an idea proposed in 1982 by Edward Fredkin and Tommaso Toffoli. While this is a highly theoretical construct that assumes no friction and perfect elasticity between balls, I point it out because it shows that although current digital computation is amazing, efficient and powerful, it is just a sophisticated extension of basic analog (i.e., particle) movements. Let me briefly walk you through one example to emphasize this point.

Picture two billiard balls entering a specially constructed wooden box. When a single billiard ball arrives at the gate through an input (0-in or 1-in), it passes through the device unobstructed and exits via 0-out or 1-out. However, if a 0-in billiard ball arrives simultaneously as a 1-in billiard ball, they collide with each other in the upper-left-hand corner of the device and redirect each other to collide again in the lower-right-hand corner of the device forcing one ball to exit via 1-out and the other ball to exit via the lower AND-output. Thus, the presence of a ball being emitted from the AND-output is logically consistent with the output of an AND gate that takes the presence of a ball at 0-in and 1-in as inputs.[2]

Similar physical gates and billiard balls could be constructed to replicate the OR and NOT gates.  As you may recall from the prior blog, all Boolean logic operators can be created using combinations of these three gates, so a theoretical computer constructed entirely of wood and billiard balls, could replicate the results of any existing computer. 
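To illustrate that claim, here is a tiny Python sketch (with booleans standing in for the billiard balls or voltages) that builds other operations, such as XOR and NAND, from just AND, OR and NOT:

    # Every other Boolean operation can be built from these three.
    AND = lambda a, b: a and b
    OR  = lambda a, b: a or b
    NOT = lambda a: not a

    def XOR(a, b):                   # "exclusive or", built from the three basics
        return AND(OR(a, b), NOT(AND(a, b)))

    def NAND(a, b):                  # itself a "universal" gate
        return NOT(AND(a, b))

    for a in (False, True):
        for b in (False, True):
            print(a, b, "-> XOR:", XOR(a, b), " NAND:", NAND(a, b))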

Admittedly, this is a theoretical construct, but I cite it to point out that while our current digital computers are amazingly powerful and fast and have led to countless advances and improvements in our daily lives, today's digital computers are, at their essence, somewhat simplistic. The "digitization" vastly improves speed and the ability to stack gates for interoperability, and thereby to tackle increasingly complex processes, but there are certain limits to their capabilities (I will cover some specifics on speed-up and complexity in subsequent posts).

Quantum Computers, and the gates possible using qubits, are a very different animal. The underlying mechanics and processes cannot be replicated using standard analogue materials because they operate using different laws of physics. Therefore, it is not really appropriate to compare the performance of a Quantum Computer with that of a digital computer, to suggest the quantum version is more powerful or faster – it is an “apples to oranges” comparison. Stated another way, it would be like saying a light-emitting diode (LED) is a more powerful candle. It is, in fact, an entirely different form of creating light and comparisons between the two are therefore not useful.

In summary, mankind has been using different forms of “computing devices” for thousands of years and Quantum Computers are in some ways a natural extension of computing progress.  However, different laws of physics are involved and therefore Quantum Computers are in a new category of computing devices that have the potential to create new approaches to problems and novel new solutions.

In the next few posts I will dig into where this new computing approach will provide the most benefit, and how "Superposition" and "Entanglement" are used to massively increase the computing power of Quantum Computers.


[1] The Economist Technology Quarterly, published March 12, 2016

[2] Wikipedia contributors. (2021, May 4). Billiard-ball computer. In Wikipedia, The Free Encyclopedia. Retrieved 15:51, October 25, 2021, from https://en.wikipedia.org/w/index.php?title=Billiard-ball_computer&oldid=1021387675

Why is Quantum Computing so Exciting and How Can It Be So Powerful?

If you are reading this blog you are likely already familiar with some of the essential features of Quantum Computing, like superposition and entanglement. Even if you are not, I will cover some of those details in future posts. For now, I want to begin without any physics or linear algebra and instead give you some layman observations about Quantum Computing to establish a fundamental understanding of the potential that quantum computing holds. For those of you who are more familiar with some of these underlying principles, please allow me to take some liberty as I describe them. I am more interested at this point in conveying a general sense of why quantum computing has so much more power than classical computing (at least for certain problems), so I will generalize some things in ways that may not be explicitly or literally correct.

If you are early in your quantum journey, you are likely finding it to be a lot like buying a 1,000 piece, two-sided puzzle without edges and dumping the pieces on the table.

There is a jumble of new phrases and concepts, but for now, let's dispense with the peculiarities of why Schrodinger's cat can be both alive and dead, why Heisenberg talks about uncertainty, and why Einstein famously said that "God does not play dice with the universe."  And, with all due deference to Richard Feynman, the modern godfather of quantum computing, you don't actually need to understand the specifics of quantum mechanics to grasp the enormous potential of Quantum Computing.

I think we all appreciate the awesome power of the microchip and the advances in technology (and creature comforts it has provided) without understanding the actual engineering of the silicon chips or transistors or integrated circuits that underlie current computers. 

So if you’ll indulge me for a bit here, let me try and convey the potential of the powerful new computing methodology of quantum, without reference to superposition or Josephson Junctions or coherence or error correction (I’ll cover those in subsequent posts).  Let’s start with some fundamental features of classical computer “bits” versus quantum computer “qubits”.  As you may know, classical computers use “bits” to establish a binary yes/no or on/off state which is then interpreted in algorithms to perform calculations.  These bits are combined into bytes to represent letters and numbers, among other things.  Let’s picture a bit as a one-dimensional creature (this takes some liberties, since a one-dimensional object is a line, but bear with me here).  The first fundamental difference between classical computing and quantum computing is that the qubit is a three-dimensional structure.  The following graphic showcases this difference[1]:


[1] Image from "The Need, Promise, and Reality of Quantum Computing" by Jason Roell, published on Medium, 2/1/18.

As the graphic shows, the Classical Bit is in one of two states (here shown as 0 or 1). For the Qubit, if "up" is considered "0" and "down" is considered "1", until it is measured it can be pointing in any direction on the sphere, so think of it as holding lots of additional information besides just up or down. Or to think about it another way, a Qubit has more dimensions than a Bit, so it can process more information per step and therefore can speed up the processing.

So, the first major difference between classical and quantum computing lies in having three dimensions for the underlying data bits vs just one dimension for classical computers.   This by itself suggests enormous added potential of quantum computing.

Next, let’s dig a bit further into how these bits/qubits are processed. Computers use the information contained in the bits to process according to their program or computer code. So, an original input is entered into a processor, and it spits out an output based on some rules contained in the processor. Each step or operation is generally referred to as a "gate" or basic computing rule. It turns out there are three gates from which ALL current classical computing processes derive their output. While not essential to this discussion, these three gates are "AND", "OR" and "NOT". Stacking and using combinations of these rules, or gates, can perform every Boolean logic operation that exists in classical computers [2]. Generally, you should take this to mean that ALL classical computing can currently be done with only 3 operators, which conversely suggests that the universe of possible computational abilities is somewhat limited to only three rules. Consider it like a chess pawn, which can only move to one of three squares when being played. For quantum "gates", again without getting into the details, know that there are at least 6 in common use (not that it is needed for this train of thought, but they include the three Pauli X, Y and Z Gates, the Hadamard Gate, the SWAP Gate and the CNOT or Controlled-NOT Gate). There may be others, and entanglement allows gates to be conjoined, if you will, but let's stick with 6 for this discussion. So, coming back to the chess analogy, this is like the movement potential of the queen, and if we consider a contest between a classical computer, which only has pawns as its pieces, and a quantum computer, which has all queens, it is clear which would win the chess match.


[2] Technically, either the NOR (Not Or) or NAND (Not And) gates are considered “universal” meaning either can be used to reproduce the functions of the other gates, so technically, current digital computing logic can be done with ONE gate, but the analysis herein, assuming the three core gates, is still quite solid vis-a-vis the greater number of core gates applicable to quantum gates. For further details, consider Logic gate – Wikipedia

Hence, the second major difference between classical and quantum computing is a result of the essential gates that can be utilized to perform operations.  Classical uses three and quantum uses six. Think about this as enabling a Qubit to do more per process, which translates into being able to process things faster.
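For readers comfortable with a little code, here is a minimal NumPy sketch using the standard matrix forms of two of those gates: the Hadamard gate putting a single qubit into superposition, and the CNOT gate entangling it with a second qubit (a simplified illustration, not how a real quantum device is programmed):

    import numpy as np

    zero = np.array([1, 0], dtype=complex)           # the |0> state of one qubit
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)     # Hadamard gate

    superposed = H @ zero                             # equal superposition of |0> and |1>
    print(np.round(superposed, 3))                    # [0.707+0.j 0.707+0.j]

    # CNOT flips the second qubit when the first is 1; applied to a
    # superposed control and a |0> target it produces an entangled pair.
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=complex)

    pair = np.kron(superposed, zero)                  # joint state of both qubits
    bell = CNOT @ pair                                # the entangled "Bell" state
    print(np.round(bell, 3))                          # [0.707 0 0 0.707] -> (|00> + |11>)/sqrt(2)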

The final major difference between classical computers and quantum computers is rooted in computing logic "efficiency". We are all familiar with the physical noise our current computers emit, which is the sound of the fan, and we know how hot a computer can get with usage. This is because classical processing "loses" information whenever a calculation is performed, and that lost information is dissipated as heat. Classical computer gates are one-directional: because energy is expended by the gate and information is discarded, if you perform a calculation on a classical gate, you cannot run the output in reverse through the gate to recover the original input. However, quantum gates are bi-directional (reversible), meaning that any output can be run back through the gate to reveal the original input.

The final major difference between classical and quantum computing is that classical computing operators are one-directional and not reversible, while quantum operators are bi-directional. This bi-directionality suggests more processing information per operation, which also translates into being able to operate faster.
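A small sketch makes the contrast concrete: a classical AND gate's output cannot be traced back to a unique input, whereas applying a quantum gate such as the Hadamard twice returns the state exactly to where it started (illustrative only):

    import numpy as np

    # Classical AND is irreversible: an output of 0 could have come from
    # any of three different input pairs, so the inputs cannot be recovered.
    inputs_giving_zero = [(a, b) for a in (0, 1) for b in (0, 1) if (a and b) == 0]
    print(inputs_giving_zero)                  # [(0, 0), (0, 1), (1, 0)]

    # Quantum gates are unitary, hence reversible: applying the Hadamard
    # gate twice returns any qubit state exactly to where it started.
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    print(np.allclose(H @ H, np.eye(2)))       # True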

In summary we have the following:

                    Classical Computer    Quantum Computer
Bit Dimensions      1                     3
Core Operators      3                     6
Logic Direction     One-directional       Bi-directional

Think about these differences as enabling a Quantum Computer to do more per step, which is another way of saying it is able to process things faster than a classical computer. As it turns out, this speed advantage is phenomenal, which is why there is such enormous potential for Quantum Computers. In future posts, I will cover how this phenomenal speed advantage manifests and why it will allow us to solve problems currently unsolvable by the most powerful supercomputers.

I hope this helps convey the potential of Quantum Computers over classical computers, without any details around superposition or entanglement or the underlying non-intuitive features of quantum mechanics. Naturally, a greater knowledge of those other concepts is fundamental to the actual inner workings of a Quantum Computer, and I will get into some of that in future posts.

A little about me and this website

I remember being intrigued by “computers” as a high school student in the late 70’s. Personal Computers (PC’s) were not yet available, but I bought a Sinclair ZX80 at Radio Shack in 1980 and tried to teach myself how to program it. I was awed by the potential, and fascinated by the details, and wrote a few simple bits of code.

In college I took some basic “business” computing (FORTRAN and COBOL) but wasn’t particularly drawn to programming and those languages were kludgy and certainly were not user friendly (I remember spending hours searching for missing punctuation marks…but I digress).

During my senior year in college I convinced my father to get me my first computer so I could write my senior thesis; it was a great tool for basic text editing but couldn't readily do much more. By the time I was graduating from college (1985), PCs were beginning to become more accessible, although they were still not very user-friendly. This was before GUI (graphical user interface) or WYSIWYG (what-you-see-is-what-you-get) functionality, so there was lots of trial and error just to print out a page. Neither my Apple IIc nor my computer at my first job (investment banking in 1987) even had a hard drive. At that first job, I put the 5 1/4″ Lotus 1-2-3 version 1.0 disc in the A-drive and a blank disc in the B-drive, where the spreadsheet files were saved.

Fast forward just a few years, and there were AOL discs available EVERYWHERE. I was an early adopter of email and general on-line access (over a 1,200 baud modem for much of the early period), so I was participating in the computing revolution, although generally passively.

However, despite being an “investment professional” I missed the opportunities to buy Microsoft or Google or Facebook. I have often regretted not doing more with my Sinclair ZX80 or my Apple IIc or my first few “IBM clones” so that I could appreciate the potential of personal computing and the Internet, which presumably would’ve helped me become an early investor. Too bad because if I had invested $5,000 into Microsoft in 1986, it would be worth $10.5m today and would be yielding $150,000 per year in dividends!

In addition to that general proximity to the computer/Internet wave, I have always been fascinated by theoretical physics. I was intrigued by relativity and quantum theory and read dozens of books on those subjects. (It helped that when in graduate school at the University of Chicago, the student across the hall from my dorm room was getting his doctorate in theoretical physics and indulged me on countless evenings, explaining yet again how length and time shrank as speed increased…) I read nearly everything written by Stephen Hawking and Brian Greene, fascinated by astrophysics, string theory and quantum dynamics.

As I began to see my theoretical physics and computing science interests merging…and started to learn about Quantum Computing, I pledged to myself not to miss out on the investment opportunities this time. So I have been on a personal journey to satisfy my cravings for learning about quantum theory and its applications to computing, while at the same time focusing on where the commercial opportunities may be.

Website Rules of Engagement

I intend to start out with some broad posts about the details underlying Quantum Computers, the immense potential they hold and some advances being made. Eventually I aim to focus more on current events, companies and breakthroughs, with an aim to helping find investment opportunities. I welcome constructive feedback and engagement. Understanding quantum theory or how Quantum Computers work is immensely difficult and challenging, so if you have evolved some proficiency and aptitude, it’s okay to pound your chest a bit. But everyone starts from the beginning, and it is generally a long, non-linear journey, so I ask anyone reading or reacting to posts to do so with some humility. There are no dumb questions and no taboo topics, so please be respectful and constructive in commenting. If I, or someone responding, make a mistake, it’s okay to point that out if it helps the broader analysis, but it’s less helpful if it’s just to prove a point or convey superior knowledge for the sake of it. My general hope is that most concepts described in this blog are readily understood by laymen and deep practitioners alike and that we can all engage in spirited discussions that help expand our collective understanding and learnings in this rapidly evolving field.

This website and these blog posts reflect some of these learnings. I’ve had success in my career in synthesizing very difficult topics into “layman” terms, so I aim to do that with Quantum via these posts. I hope they are helpful and informative and welcome feedback and discussions. Thanks for visiting and I look forward to taking this exciting “quantum leap” together.

Russ Fein, August 2021

https://www.linkedin.com/in/russfein/

@russfein

https://corporatefuel.com/community-fuel/team/russ-s-fein