In recent posts I have focused on the technical specifics of Quantum Computing and quantum science more generally (e.g., optical clocks and quantum devices in space), and showcased certain companies operating in the industry. For this post, however, I want to focus on a more abstract theme.
If you ask just about any pundit or professional participating in the Quantum Computing industry, the biggest question of the day, nearly universally posed, is: “When will we achieve Quantum Advantage?” Quantum Advantage is generally defined as:
“The achievement of demonstrated and measured success in processing a real-world problem faster on a quantum computer than on a classical computer.”
In this writer’s opinion, this focus on having Quantum Computers do things faster emphasizes the wrong attributes of Quantum Computing. It is not speed, per se, that is the key attribute that will deliver QC value, so doing things faster than classical computers misses the point.
The brain and core workhorse of a classical computer is its CPU, or central processing unit. CPUs are made up of integrated circuits which (to grossly oversimplify) are simply billions of on/off switches. These circuits store and manipulate individual “bits,” each either a ‘one’ or a ‘zero’ (binary), and all computer processing is rooted in Boolean logic. Specifically, there are only three fundamental gates (AND, NOT, OR). That’s it. There is an art to programming and a skill for parsing and processing information. Today’s classical computers can apply these rules incredibly fast (gaming PCs operate at ~4 GHz, meaning roughly 4 billion processing cycles per second). Clever programmers have found increasingly efficient and profound ways to implement programs despite having only two values (1 or 0) and only three logic gates. We can do AMAZING things with this somewhat limited architecture.
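To make this concrete, here is a minimal sketch (in Python, purely illustrative) of how a richer operation like XOR can be composed from nothing but those three fundamental gates:

```python
def NOT(a): return 1 - a
def AND(a, b): return a & b
def OR(a, b): return a | b

def XOR(a, b):
    # XOR is "a or b, but not both" -- built entirely from AND, NOT, and OR
    return AND(OR(a, b), NOT(AND(a, b)))

# Truth-table check
for a in (0, 1):
    for b in (0, 1):
        print(a, b, XOR(a, b))  # prints 0, 1, 1, 0 for the four input pairs
```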
Now let’s switch our focus to QC. Instead of bits, Quantum Computers operate using qubits, which are the quantum version of on/off switches, except that qubits can be in a superposition of both ‘on’ and ‘off‘ at the same time. They can also be entangled, they have wave functions, and they support a much richer set of logic gates, so they can perform vastly different operations. And despite what many famous physicists have proclaimed, this is not voodoo science that nobody can understand. Quantum Computers are available today, albeit with limited numbers of qubits. It is fundamental physics…it’s just different from classical physics.
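For readers who like to see the math, here is a small sketch using plain NumPy (not any particular vendor’s SDK) that puts one qubit into an equal superposition and then entangles it with a second qubit, producing a Bell state:

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)                        # |0>, the qubit "off" state
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate: creates superposition
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)                # two-qubit gate: entangles

state = np.kron(H @ ket0, ket0)   # qubit 0 in superposition, qubit 1 still |0>
bell = CNOT @ state               # entangled Bell state (|00> + |11>) / sqrt(2)

print(np.abs(bell) ** 2)          # [0.5, 0, 0, 0.5]: a measurement only ever yields 00 or 11
```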
What does this mean and what is the thrust of this post? Since Quantum Computers operate so differently, we can ask different questions. Doing anything faster is not that novel (yeah, sure, you can break RSA encryption and do a few other notable things super-fast). SPEED is not the value-add, per se. With different physics you can (and should) ask different questions.
Let’s look at the following example to make this point more tangible. Imagine that you and your partner are planning a San Francisco dream vacation. You are considering staying at one of the following two hotels:
Based on the above descriptions, which hotel should you pick?
One strategy might be to score each of the features, and then add up the scores and select the hotel with the highest score. But what about trade-offs? You may not care that much about amenities if you are conveniently located. You may also really enjoy certain features, but what if the view from the room is the most important consideration? Are any of these items deal-breakers by themselves?
Now, let’s approach this problem from a quantum perspective. Assigning a “score” to each feature sounds a lot like weighting, or using a superposition to program each feature. There are also various trade-offs. You might be willing to sacrifice having a spa for proximity to a dynamic neighborhood, or perhaps room amenities are the most important feature and outweigh all others. These trade-offs suggest that certain features are correlated or entangled. Most of us don’t need a Quantum Computer to select which hotel we would prefer, because our brain already does an informal weighting of the various features, considers the trade-offs, and likely factors in other subtle variables not in the chart. Interestingly, different people will choose different outcomes from the same inputs…and the same person might select a different outcome at different times.
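To make the “weighting plus trade-offs” framing concrete, here is a toy sketch (all weights invented for illustration) that scores a hotel with linear feature weights plus pairwise “trade-off” terms, much like the cost functions quantum optimizers are fed:

```python
# Illustrative feature weights: how much each feature matters on its own
weights = {"location": 0.9, "view": 0.7, "amenities": 0.4, "spa": 0.3}

# Illustrative trade-off coupling: a great location partly substitutes for a spa
couplings = {("location", "spa"): -0.2}

def score(hotel_features):
    s = sum(w for f, w in weights.items() if f in hotel_features)
    s += sum(v for (a, b), v in couplings.items()
             if a in hotel_features and b in hotel_features)
    return s

print(score({"location", "spa"}))    # 0.9 + 0.3 - 0.2 = 1.0
print(score({"view", "amenities"}))  # 0.7 + 0.4 = 1.1
```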
In this context, using superposition (weighting) and entanglement (trade-offs) does not involve mysterious quantum physics that is beyond comprehension. It is the way our brains already work. We assess multiple variables with all sorts of subtleties and complex “entangled” trade-offs and inter-relationships. Leveraging these features of analysis is where the “art” of QC programming will lie. This is where QC programmers will create the next generation of eBay, DoorDash, Oracle, Google Search or _______ (insert your favorite killer app).
How can superposition, entanglement, and other unique features of QCs alter the framing of the problem or the context of the answer? What are the questions nobody has ever thought to ask a computer before? Here are just a few simplistic examples:
As with the hotel choices above, QCs will be particularly valuable in problems involving weighting and tradeoffs, for example:
Given the detailed profile of each athlete in the draft pool, and matching that with the specific needs of the team, which player should be drafted?
With the following list of symptoms, what is the prognosis?
What asset portfolio gives me the best risk/reward profile?
These are generally “optimization” problems, which have been well addressed by those following Quantum Computing, and early use cases using optimization are abundant.
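For instance, the portfolio question above is often cast as a QUBO (quadratic unconstrained binary optimization), the form many quantum optimizers accept: linear terms reward expected return, quadratic terms penalize correlated risk. A minimal brute-force sketch, with invented numbers:

```python
import itertools
import numpy as np

mu = np.array([0.08, 0.12, 0.10])        # illustrative expected returns per asset
Sigma = np.array([[0.10, 0.02, 0.01],    # illustrative covariance matrix (risk)
                  [0.02, 0.15, 0.03],
                  [0.01, 0.03, 0.12]])
lam = 0.5                                # risk-aversion weight: the "trade-off" knob

def objective(x):
    x = np.array(x)
    return mu @ x - lam * (x @ Sigma @ x)   # reward minus weighted risk

# Exhaustive search over all include/exclude choices (a QC would search this space natively)
best = max(itertools.product([0, 1], repeat=3), key=objective)
print(best, objective(best))   # (1, 0, 1) wins: assets 1 and 3 are weakly correlated
```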
So here are a few others, more outside the box:
Is there a way to hear colors?
Can we leverage QCs to enhance the capabilities of AI Chatbots like ChatGPT or Google Lambda? Should we?
Can we leverage quantum effects to detect neutrinos and if so, can we then use them to create high-definition holograms?
Some of these seem fairly straightforward, such as the pro team draft choice. There are many inputs that can be weighted and for which there are trade-offs. Could someone program a quantum computer to do a better job of making draft choices than a classical computer? I have to imagine that as the list of features grows substantially, there could be a Quantum Advantage, and that would be a valuable tool in that instance. But it’s the less clear-cut problems that excite me: imagining ways to make colors drive auditory sensations, or finding solution sets to complex problems that were never before considered. Inquiries such as these can open all sorts of amazing new potential.
So as the QC hardware makers continue to let people test things on their increasingly powerful machines, and as the QC software companies expand the capabilities of their programs, I’m excited to see what people try: what new and novel questions they ask, and what interesting and unpredicted answers are returned.
What do you think the new “killer app” will be on a Quantum Computer? What question would you like a QC to answer? I’d love to hear your ideas.
Disclosure: The views expressed herein are solely the views of the author and are not necessarily the views of Corporate Fuel Partners or any of its affiliates. Views are not intended to provide, and should not be relied upon for, investment advice.
If you enjoyed this post, please visit my website, and enter your email to receive future posts and updates: http://quantumtech.blog
Russ Fein is a venture investor with deep interests in Quantum Computing (QC). For more of his thoughts about QC please visit the link to the left. For more information about his firm, please visit Corporate Fuel. Russ can be reached at russ@quantumtech.blog.
In an early Quantum Leap post back in December 2021, I wrote about the various qubits being used to drive Quantum Computers (QC) and neutral atoms didn’t make it into the post. When I revisited the QC landscape in October of last year, four neutral atom companies made the list, representing a significant advancement in that modality. There are numerous strengths to this approach, which I’ll describe in greater detail below, but to pique your interest in learning more, consider the following recent announcements by neutral atom companies:
Infleqtion (f/k/a ColdQuanta) completed a $110 million Series B equity round
Professor Alain Aspect, a co-founder of PASQAL, was awarded the 2022 Nobel Prize in Physics
Atom Computing had a ribbon-cutting ceremony for the opening of its new $100 million facility in Boulder, Colorado
QuEra’s 256-qubit Aquila QC was made available on Amazon Braket, Amazon’s cloud-based QC platform
Why the recent surge in jaw-dropping announcements? Why are neutral atoms seeming to leapfrog other qubit modalities? Keep reading to find out.
The table below highlights the companies working to make Quantum Computers using neutral atoms as qubits:
And as an added feature, I am writing this post to be “entangled” with the posts of Brian Siegelwax, a respected colleague and quantum algorithm designer (see his overview on Neutral Atoms here). My focus will be on the hardware and corporate details of the companies involved, while Brian’s focus will be on actual implementation of the platforms and what it is like to program on their devices. Unfortunately, most of the systems created by the companies noted in this post are not yet available (other than QuEra’s), so I will update this post, along with links to Brian’s companion articles, as they become available.
Neutral Atoms as Qubits
Neutral-atom qubits, sometimes referred to as “cold atoms,” are arrays of individual atoms trapped in a room-temperature vacuum using lasers as optical “tweezers” to restrict the movement of the individual atoms and thereby chill them (hence the “cold atom” reference). These neutral atoms can be put into a highly excited state by firing certain laser pulses at them, which expands the radius of the outer electron(s) (a Rydberg state) and can be used, among other things, to entangle them with each other.
While there are a few notable differences among the approaches the neutral atom players use, there are also many similarities. The graphic below highlights the Atom Computing set-up, which is representative of the broad cold atom approach. It includes two sets of lasers with related controllers and AODs (acousto-optic deflectors), a vacuum chamber, and a photon-sensitive camera to read results.
Let’s drill down a bit further into the underlying science.
Each of the players focused on neutral atoms uses elements from either the first column of the periodic table (alkali metals such as Rubidium or Cesium) or the second column (alkaline earth metals such as Strontium). In either case, these atoms have equal numbers of electrons and protons, so the electrical charges balance out, hence the “neutral” label. The alkali metals have a single electron in the outer orbit whereas the alkaline earth metals have two (some believe the two-valence-electron configuration, a “closed shell,” provides greater stability and protection from external noise). It is these outer electrons that produce the quantum-mechanical effects that drive the algorithms or desired analog activity.
In a neutral-atom quantum processor, atoms are first heated into a gaseous cloud and then suspended in an ultrahigh vacuum via arrays of tightly focused lasers of specific wavelengths, often referred to as “optical tweezers.” Every element reacts to very specific wavelengths of light, so it can be manipulated by lasers tuned to those wavelengths. These optical tweezers can also be used to configure the atoms into specific geometric arrays. For digital, gate-based computation, single-qubit and multi-qubit gates can be implemented via differing light pulses. Rob Hays, CEO of Atom Computing (a veteran of the computing industry: former Chief Strategy Officer of Lenovo, with a 20-year leadership tenure at Intel where he led the Xeon processor roadmap) explained that “every element has a magic wavelength of light that allows atoms to be captured by optical tweezers.” He further noted that “with a different wavelength of light, we can effectively control the spin of the nucleus in any position in three dimensions…and that’s how we create single qubit gates…and then what we can do is create entanglement with two qubit gates by using different wavelengths of light to excite the electron cloud into what’s called a Rydberg state where the radius of the electron orbit gets much larger to the point where it crosses paths with neighboring atoms and gets entanglement.” This is the foundation for one of the key strengths of neutral-atom QCs, namely their strong connectivity. For analog operations, the tweezers move the atoms into the desired configuration and other laser or microwave pulses then drive the atoms to evolve under programmed Hamiltonians (more on this, as well as the differences between the digital and analog approaches, below). In both cases, the final results are read out optically.
Some important characteristics of neutral atom-based qubits include:
Exceptionally Long Coherence Times: Leading superconducting and photonic Quantum Computers have achieved coherence times measured in microseconds (millionths of a second), which doesn’t provide much time to run algorithms (although they also have very fast gate speeds). Neutral atom players generally enjoy coherence measured in full seconds; in fact, Atom Computing published a paper in Nature Communications in May 2022 touting coherence times exceeding 40 seconds.
Strong Connectivity: The topology of the neutral atom structure is quite flexible, and these modalities typically enjoy robust connectivity among qubits, often achieving all-to-all connectivity. In fact, neutral atoms can also implement multi-qubit gates (involving more than two qubits, such as the CCNOT or Toffoli gate) and can even implement three-level qubits, or “qutrits.”
Scalability: Because all atoms of a given isotope are intrinsically identical, all qubits based on such elements are identical to each other. In addition, since the atoms carry no ionic charge, they can be packed into tight arrays, often only microns apart. Also, rather than a separate laser for each qubit, since the atoms are manipulated by common wavelengths, a laser of a specific wavelength can be split into “beamlets” in order to control multiple atoms.
External Cryogenics Not Required: Modalities which require cryogenic chillers are burdened with significant added overhead and must typically contend with long chill-up/chill-down cycles.
Reduced Wiring Complexity: All of the functions to control the neutral atoms are performed via light propagating through free space. In contrast, superconducting qubits require multiple electrical cables for each qubit.
Can be Operated in Analog or Digital Mode (or both): Digital or gate-based operations are required for full algorithm development, but some early quantum advantages may be achieved utilizing qubits geometrically or in analog mode. This is an important distinction, so I will elaborate further in the next section.
Leveraging Three Decades of Legacy Research: While using neutral atoms in quantum computing is relatively new, the neutral atom technology has been successfully deployed in other physics research and has powered the world’s most accurate atomic clocks for many years. The laser-cooling technology is based on research that led to the 1997 Nobel Prize and optical tweezers are based on research that led to the 2018 Nobel Prize.
While this is an impressive feature list, neutral atom quantum computers are relatively new to the Quantum Computing landscape and have yet to showcase important real-world results. There are also meaningful technological challenges in refining the lasers and the ultra-high vacuums. Dr. Mark Saffman, Chief Scientist for Quantum Information at Infleqtion and Professor of Physics at the University of Wisconsin-Madison, shared tremendous insights with me regarding the differences between analog and digital modes for neutral atom QC. He noted that Infleqtion has “recognized the challenges of some of the specialized laser systems being used” and is “working with partners on developing more integrated laser technology…with a real challenge currently being developing faster calibration and tuning routines in order to keep the machines in a calibrated state.” That said, Infleqtion and their neutral atom peers are advancing at a furious pace, and I expect significant progress to be made in 2023.
Analog vs Digital/Gate Mode
Richard Feynman is often cited as the father of quantum computing and he is credited with saying “…trying to find a computer simulation of physics seems to me to be an excellent program to follow… and nature isn’t classical dammit, and if you want to make a simulation of nature, you’d better make it quantum mechanical…” While Feynman said this in 1981, well before today’s Quantum Computers were possible, he was quite prescient. He wasn’t referring to algorithms or gates or quantum computer code, he was talking about literally simulating nature and that is what some of these neutral atom companies are offering today in analog mode.
Many of you are likely familiar with “digital quantum” algorithms, where the information is encoded into single- and multi-qubit functions driven by a series of commands or gates, much like traditional computers are currently programmed. The specific steps and their order are vital to a successful code or algorithm, and there is an art to how such commands are created and sequenced. Most press about QC covers this “digital” mode, and the fidelities and speeds of the gates, as well as the length of coherence, are among the bigger hurdles being addressed by today’s players. The challenges facing current digital QC approaches are rooted in the fidelities of the systems, which are quite fickle today and subject to many disruptive factors or “noise” (see a prior post about this noise here).
“Analog quantum” computing also uses qubits and the various quantum mechanical properties that power digital quantum computers (superposition, entanglement, wave properties, etc.), but there are no gates. The exquisite control required to execute gates is one of the major hurdles facing QC development, and by circumventing the need for gates, analog quantum computing has surpassed the digital mode on a number of fronts. By transforming certain problems into a “geometric” structure (as Feynman suggested) instead of a sequential gate-based formula, results can be derived without gates. As Alex Keesling, CEO of QuEra told me, “…whereas in gate-based [digital] quantum computing the focus is on the sequence of the gates, in analog quantum processing it’s more about the position of the atoms and where you place them so they can mirror real life problems. We arrange the atoms and define the forces that drive them and then measure the result…so it’s a geometric encoding of the problem itself.”
It took me a while to appreciate this difference. Analog mode is only useful for a certain subset of problem types, but because analog quantum computers require less engineering overhead than digital Quantum Computers, they are already providing meaningful results and can operate with larger numbers of qubits (such as QuEra’s 256-qubit Aquila and PASQAL’s 324-qubit Fresnel). So let me explain this further.
Analog quantum computers rely on converting a problem into a mathematical object known as a Hamiltonian. The Hamiltonian is an operator that corresponds to the total energy of a system, including both kinetic and potential energy. It is somewhat similar to how some companies, such as D-Wave, use quantum annealing to find the global minimum energy of a system and thereby get useful output from today’s noisy Quantum Computers. The “Traveling Salesman Problem” is a typical optimization problem (finding the shortest or least expensive route for a salesman to cover all of his customers); finding the best placement of cell-phone towers to cover a given area is another. In addition to optimization problems, however, analog quantum computers can also address problems in chemistry simulation and materials engineering. Specifically, by geometrically creating a “digital twin” of the system under study and then evolving it under a programmed Hamiltonian, users can better understand underlying physics, phase transitions of materials and the dynamics of particle collisions, among other phenomena. Further, given the analog mode’s ability to parse sets of data into subsets via Hamiltonian simulations, it is also showing increasing promise in machine learning.
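For concreteness, the analog Hamiltonian implemented on Rydberg-atom arrays typically takes a form like the following (my notation, not any one vendor’s; Ω is the laser drive, Δ the detuning, and the last term the distance-dependent Rydberg interaction):

```latex
H(t) = \sum_i \frac{\Omega(t)}{2}\bigl(|g_i\rangle\langle r_i| + |r_i\rangle\langle g_i|\bigr)
       \;-\; \Delta(t)\sum_i n_i
       \;+\; \sum_{i<j} \frac{C_6}{\lVert \mathbf{x}_i - \mathbf{x}_j \rVert^6}\, n_i n_j
```

Here n_i counts whether atom i is in the Rydberg state. Because the interaction term falls off with the sixth power of the distance between atoms, where you place the atoms literally is the program, which is exactly the geometric encoding Keesling describes.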
In summary, analog mode for neutral atom Quantum Computers is showing near-term utility for certain classes of optimization and materials engineering problems as well as for accelerating quantum machine learning. Currently, both QuEra and PASQAL use 2D arrays of neutral atoms in their analog processors. They also have the capability of using 3D arrays as the technology further evolves, which would provide even greater power from the geometric approach, and can eventually use an analog-digital hybrid approach with the same neutral atom technology. It will be fascinating to watch as the analog and digital approaches scale, and to see which company is able to provide the fastest path to quantum advantage.
The Leading Neutral Atom Players
A special thanks to Yuval Boger and Brian Siegelwax, and to Georges-Olivier Reymond of PASQAL; Alexander Keesling of QuEra; Rob Hays, Mickey McDonald, and Kortny Rolston-Duce of Atom Computing; and Mark Saffman, Max Perez and Sarah Schupp of Infleqtion for their patience and insights about their companies as well as Quantum Computing more generally. Many of the details in this post were derived from my conversations with them.
PASQAL
The Nobel Prize in Physics for 2022 was awarded to Alain Aspect, John Clauser and Anton Zeilinger “for their experiments with entangled photons, establishing the violation of Bell inequalities and pioneering quantum information science.” Professor Aspect eventually shifted his research focus from photons to neutral atoms and partnered with Georges-Olivier Reymond and Christophe Jurczak to create PASQAL. In fact, PASQAL is the first quantum computing company with a Nobel Prize-winning co-founder. In September of last year, PASQAL unveiled “Fresnel,” its 324-qubit quantum processor, and they expect a 1,000-qubit machine to be available next year.
PASQAL is advancing neutral atom Quantum Computers focused on analog mode and has already amassed an impressive roster of customers (including BMW, Airbus, LG, Siemens, Saudi Aramco and others) and use cases. For example, the company has developed a Quantum Machine Learning algorithm applicable to smart grids, aiming to improve the efficiency of electricity distribution. As noted above, analog quantum computing has interesting applications in problems that can be structured graphically, such as material design and optimization. According to PASQAL, some of the world’s most interesting data is relational and can be encoded in graphs: nodes and links in a network, financial indicators (for portfolio risk optimization) and atoms in a molecular diagram. Graph structures can be rich sources of information, allowing the system to uncover hot spots in a network, clusters in a dataset, or infer function from structure in chemical compounds. Such problems are extremely hard to solve with classical computers but lend themselves to analog quantum computing. As Georges Reymond told me, “you just need quantum developers that are smart enough to design the specific Hamiltonian that you need. Alternatively, we have a team that can help you do that.” He added that “since you are programming very close to the qubits, you can change the geometry of the register into any shape you want.” He also noted that their Pulser tool, a Python library of applicable primitives, and the related Pulser Studio, a no-code graphical interface that helps structure a given problem and then automatically generates the required code, make analog QC mode more accessible.
While Fresnel is not available other than to existing customers (and so my colleague Brian Siegelwax was not able to test it out himself, although he has access to Pulser and Pulser Studio), the strong client roster is testament to its general utility. I look forward to PASQAL making its machine(s) more broadly available and to seeing how Brian gauges their utility. UPDATE: Brian has now provided his thoughts on PASQAL’s Pulser Studio here.
QuEra
Full-stack Quantum Computing firm QuEra, based in Boston, traces its roots to quantum research performed at nearby Harvard University and MIT. Their signature 256-qubit Quantum Computer, known as “Aquila,” is available now for general use on Amazon Braket and is complemented by their Bloqade open-source software and GenericTensorNetworks algorithm platform. The management team is quite strong, and the fact that they are the first neutral atom player to broadly offer access to their QC gives them a bit of front-runner status in the neutral atom field. While their underlying technology and approach can apply to digital gate-based algorithms, they have opted instead to focus on analog processing. Their field programmable qubit arrays (FPQA) offer near-arbitrary configurations of atoms and highly flexible connectivity. Aquila promises rapid development cycles, easy geometric encoding of problems and the exploration of exotic topologies.
The Company has generated 11-figures of R&D and development income from a broad array of government agencies and commercial customers and is the only neutral atom quantum computing company that has made its systems generally available to the public (albeit at a somewhat limited 10 hours per week). Management encourages users to try the platform, and they are interested in real-world feedback, including error analysis, so they can use the input to further evolve their technology. Brian Siegelwax is generally bullish on Aquila, as he describes in a recent post and follow-up on using Aquila for Maximum Independent Set (MIS) and a deeper dive regarding implementation of Rydberg Toffoli gates. I applaud QuEra’s accessibility focus and firmly believe that QC makers will learn the most, and gain the quickest technological progress, with diverse direct feedback from users, warts and all. The mix of QuEra’s strong management team and the current availability of their system suggests they will continue to make rapid and important progress. I look forward to following progress along their roadmap and to learning what novel applications users are able to execute.
Atom Computing
I recently had the benefit (and pleasure) of spending some time with Rob Hays via video call, as well as an on-site tour of the new Atom Computing facility in Boulder led by Mickey McDonald (Principal Quantum Engineer) and Kortny Rolston-Duce (Director of Marketing and Communications), who also indulged me with a private white-boarding session where they did their best to answer all of my neutral atom 101 questions, which (finally) helped connect many of the dots that had been swimming around my head. All of these interactions were immensely informative, and I was impressed with the team and with what I learned. Atom’s headquarters and original R&D machine are in Berkeley, but they are using Boulder to create their production unit(s). In fact, they have an interesting approach of simultaneously creating twin machines with the intention of always maintaining customer access to one while any upgrades or maintenance are performed on the other. Because they do not require cryogenic refrigeration, their QCs are not the usual chandeliers many of us are familiar with, but instead are room-sized “black boxes” housing all their optics and the majority of the controllers in various modularized sections.
Atom has an impressive roster of employees and consultants including Dr. Ben Bloom, a co-founder and CTO, who has deep connections in the Boulder quantum ecosystem, and Dr. Jun Ye, their Scientific Advisor, who is a physics professor at nearby CU Boulder, a Fellow of the Joint Institute for Laboratory Astrophysics (JILA) and the National Institute of Standards and Technology (NIST), and was recently named a member of President Biden’s National Quantum Initiative Advisory Committee. They also have an enviable roster of investors including Venrock, Innovation Endeavors, Prelude Ventures, Prime Movers Lab, and Third Point Ventures, among others.
While they have published some impressive results from their Quantum Computers including “Phoenix”, their first-generation platform, they have opted not to make Phoenix publicly available (although it is accessible by select early customers). However, they are working furiously on their second-generation systems which they plan to make available online via a Quantum Computing as a Service (QCaaS) model. They are actively collaborating with software and application developers, and I look forward to feedback from users (including Mr. Siegelwax), once Atom makes their systems more widely available.
Infleqtion (f/k/a ColdQuanta)
Infleqtion, located a bike ride away from Atom Computing, traces its roots to Drs. Eric Cornell and Carl Wieman, who created the first-ever Bose-Einstein Condensate (BEC) at CU Boulder in 1995, a feat for which they were awarded a 2001 Nobel Prize. A BEC is a state of matter created when atoms are cooled to near absolute zero. Infleqtion uses neutral atoms across multiple quantum applications including gate-based quantum computers as well as a variety of quantum sensing and signal processing applications such as High Precision Clocks, Quantum Positioning Systems (QPS), Quantum Radio Frequency Receivers (QRF) and Quantum Networking and Communications, as well as some of the fundamental components used by others (e.g., ultra-high vacuum cells). While Quantum Computing steals most of the “quantum” headlines these days, these other devices bring enormous advances in their fields and, importantly, current revenues. I have been fortunate to know a number of Infleqtion’s management members and have been closely following their progress since an original blog post in April 2022 about Collaborations and a follow-up dedicated to ColdQuanta in May 2022.
Infleqtion hit a number of important milestones in 2022, including:
Completion of a $110 million B-round, including A$29 million earmarked to create a Quantum Technology Centre in Australia.
Acquisition of Super.tech, a leading developer of quantum software and related platforms, and announced collaboration with Morningstar to integrate Super.tech’s SuperstaQ software into Morningstar Direct.
Participation as a subcontractor on the Office of Naval Research’s Compact Rubidium Optical Clock program, valued at up to $16.2m.
“Albert,” their BEC design device, was named one of TIME’s Best Inventions of 2022 and won the 2022 Prism Award in the Quantum category.
Won the 2022 Best of Sensors Award, for their high-performance test and calibration instrument known as “Maxwell.”
Dr. Fred Chong, Chief Scientist for Quantum Software, was named IEEE Fellow for his Enabling Practical-scale Quantum Computing (EPiQC) project.
Dr. Bob Sutor, VP and Chief Quantum Advocate, testified at Senate committee hearings regarding the importance of Quantum Computing technologies.
While neither Albert (their BEC design platform) nor Hilbert (their 100-qubit QC) is regularly available to the public, they continue to make progress advancing both systems, and I look forward to an update from Mr. Siegelwax once those systems can be tested (for now, here is his review of Albert from when it was available last year). In the meantime, Infleqtion continues to generate meaningful revenues and advance the technologies of its broad quantum-related components, and I’m certain they are leveraging their learnings across their portfolio.
[Note: “planqc”, a recent graduate of the Creative Destruction Lab startup incubator, is the newest entrant to the neutral atom field, and is included in the table of players, but was not covered in detail in this post due to its very early stage. I look forward to providing more details on planqc in future posts.]
Conclusion
While neutral atom Quantum Computing is not without its shortcomings, has yet to supply consistent and robust performance, and lags behind other modalities that have been accessible longer (i.e., superconducting and ion trap modalities), it is gaining strong momentum and features important theoretical advantages. If the speed of innovation in 2022 is a harbinger of the rate of progress we should expect in 2023, I am excited about the prospect of reporting on this progress and look forward to providing updates.
Disclosure: The author does not have any business relationship with any company mentioned in this post. The views expressed herein are solely the views of the author and are not necessarily the views of Corporate Fuel Partners or any of its affiliates. Views are not intended to provide, and should not be relied upon for, investment advice.
References:
Hays, Rob, CEO of Atom Computing, Interview conducted by author on Nov. 17, 2022.
Keesling, Alex, CEO of QuEra, and Boger, Yuval, Consultant, Interview conducted by author on Nov. 10, 2022.
McDonald, Mickey, Principal Quantum Engineer Atom Computing and Rolston-Duce, Kortny, Director of Marketing and Communications Atom Computing, Interview conducted by author during site tour on Nov. 17, 2022.
Reymond, Georges-Olivier, CEO of PASQAL, Interview conducted by author on Nov. 15, 2022.
Saffman, Mark, Chief Scientist for Quantum Information, Infeqtion, Interview conducted by author on Dec. 16, 2022.
Ebadi, Keesling, Cain, Wang, Levine, et al., “Quantum Optimization of Maximum Independent Set using Rydberg Atom Arrays,” February 18, 2022, arXiv:2202.09372v1 [quant-ph].
Silverio, Grijalva, Dalyac, Leclerc, et al., “Pulser: An open-source package for the design of pulse sequences in programmable neutral-atom arrays,” Quantum, January 11, 2022, arXiv:2104.15044v3 [quant-ph].
When Robert Lamm wrote that first hit song for the band Chicago in 1969, he was likely referring to the pressure that time places on society, not the technological advances dependent on precise timekeeping. Still, it’s a crucially important and prescient question, because precise time enables modern technology in ways most people are unaware of. Why am I featuring this in a “Quantum Leap” blog? If we can now readily obtain the “official” time by syncing our cell phone with GPS satellites, or our computer with an atomic clock accurate to within one second per 60 million years, why do we need to measure time more accurately than that? Keep reading and I hope you’ll understand.
How do Clocks Work?
“Time” is not some absolute and discrete “thing”; it’s a somewhat arbitrary convention that society agrees to agree on. [It’s also “relative,” as in Einstein’s theory, which essentially means that time can differ based on the conditions of the measurer.] In the very early days, it was measured by the earth’s rotation, with a day being defined as one rotation. The ancients divided the day by 24 into hours (a convention rooted in Egyptian night cycles and decans), then by 60 into minutes and again by 60 into seconds, so a second was 1/86,400th of a day. In other words, our current “second” is a man-made construct. In this section I want to explain a bit of the history of clocks so that you have a fundamental understanding of how time is measured. The subsequent sections will explain why accuracy and precision of time measurement are so important and enabling, along with listing some of the companies in this field.
Clocks work by counting a periodic event with a known frequency. In the above example, it is the daily rotation of the earth. When grandfather clocks were the standard time-keeping devices, they worked by having a pendulum swing back and forth with its gears counting the swings. The arm of the pendulum in that grandfather clock is typically adjusted to make each half-swing one second. One “cycle” per second is known as 1 Hertz (Hz).
When electronic wristwatches were developed, they used a piece of quartz which vibrates at a certain frequency (32,768 Hz), so in this case a “second” is measured as 32,768 vibrations. The higher the base frequency, generally the more accurate the clock. For example, if that grandfather clock is off by 0.1 Hz, it will be off by one second in ten. If the quartz wristwatch is off by 0.1 Hz, it will be off by only one second out of every 327,680 seconds, or roughly 0.26 seconds per day.
Around 70 years ago, scientists realized that atoms could be used as clocks. When certain atoms are exposed to specific energies (e.g., microwave frequencies), the outer electrons transition between orbits. Specifically, the electron jumps to a higher energy orbit (takes a “quantum leap”), and the radiation associated with this transition provides the frequency that gets counted, hence the “quantum” connection. Since 1967, the International System of Units (SI) has defined the “second” as the period equal to 9,192,631,770 cycles of the radiation transition of Cesium-133. Cesium oscillators, such as the atomic clock maintained by NIST in Boulder, CO (UTC(NIST)), are accurate to within 0.03 nanoseconds per day. The International Bureau of Weights and Measures aggregates the data of more than 400 atomic clocks operated by over 80 laboratories around the world, averaging their “time” to create the world’s “official” time, known as UTC.
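The arithmetic behind these accuracy comparisons is simple enough to check directly (a quick sketch using the figures quoted above):

```python
SECONDS_PER_DAY = 86_400

# Quartz watch: a 32,768 Hz crystal that is off by 0.1 Hz
quartz_fractional_error = 0.1 / 32_768
print(quartz_fractional_error * SECONDS_PER_DAY)   # ~0.26 seconds lost or gained per day

# Cesium standard: ~0.03 ns of drift per day, expressed as a fractional error
cesium_fractional_error = 0.03e-9 / SECONDS_PER_DAY
print(cesium_fractional_error)                     # ~3.5e-16, roughly ten orders of magnitude better
```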
Why Do We Need Such Precision Regarding Time?
GPS satellites are a relatively ubiquitous technology, and while the name refers to their role in positioning, it is their role as timekeepers that is most relevant to this post; indeed, many systems use GPS to derive their time. Specifically, each satellite has an onboard atomic clock, and the signals it beams down to your GPS receiver utilize the precision of that clock to enable the receiver to triangulate signals and determine position (as well as the transmitted time).
In January of 2016, the US Air Force took one of the many satellites in the US GPS constellation offline, and an incorrect time stamp was accidentally uploaded to several other GPS satellites, leading to a thirteen-millionths of a second error in their time – less time than it takes the sound of a bullet to leave the chamber. It caused global telecommunications networks to begin to fail, BBC digital radio was out for two days, electrical power grids began to malfunction, and even police and fire EMS radio equipment in the US and Canada stopped functioning. This 13-microsecond error in GPS clocks wreaked havoc on our modern world.
To help illustrate why accurate timekeeping is so important, imagine that you oversee a train tunnel that brings goods in and out of a city. If the trains that run on the tracks are accurate to within 5 minutes of their schedule, you must allow a 10-minute window for each train to have access to the tunnel (+/- the five minutes), and therefore you can only schedule 6 trains per hour. If those trains were more accurate and arrived within 2 minutes of their schedule, you could schedule 15 trains per hour (60 minutes divided by the 4-minute window). So the throughput of the tunnel is directly tied to the punctuality of the trains.
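The tunnel arithmetic generalizes neatly (a toy sketch of the scheduling-window calculation above):

```python
def trains_per_hour(accuracy_minutes):
    window = 2 * accuracy_minutes   # +/- the accuracy on either side of the schedule
    return 60 // window

print(trains_per_hour(5))   # 6 trains per hour
print(trains_per_hour(2))   # 15 trains per hour
print(trains_per_hour(1))   # 30: halving the uncertainty doubles throughput
```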
This same concept impacts many critical infrastructure elements of our modern society, including:
Stock exchanges
Power grids
Telecommunications systems
Computer networks
Defense applications (e.g., ballistics accuracy, navigation without GPS, etc.)
Stock exchanges are increasingly driven by high-frequency computer trading, and keeping the exchanges fair and equitable under such conditions is a core concern of regulators. All trades are required to maintain timestamps because cutting in line, known as “front running,” is illegal. FINRA, the regulatory body that governs domestic exchanges, maintains “clock synchronization” requirements relative to UTC(NIST). The more precise this requirement, the more trading volume the exchanges can accommodate.

The US power grid consists of more than 360,000 miles of transmission lines connecting about 7,000 power plants, all of which must be synchronized and monitored. Monitoring for faults is one of the core tasks requiring accurate time measurement. Faults in transmission lines are measured at both ends of a given line by synchronized clocks, which can then determine which transmission tower is the source of the fault. Given the broad interdependence of the energy grid and its many power plants, any fault in the system can affect the broader grid unless resolved quickly, and accurate clocks help pinpoint faults in real time.

The current telecommunications system is a two-way transmission medium, and maximizing the throughput of data is important both for user experience and for profit. Fitting more bandwidth within a given transmission line means the telecom can earn more money on it. In fact, there is talk that the next generation of cellular protocol (i.e., “6G”) will require each cell tower to maintain an internal atomic clock to optimize bandwidth and throughput.

This throughput concept also applies to dispersed networks (i.e., the Internet, the Metaverse, etc.). For example, Google Spanner is a worldwide database designed to operate seamlessly across hundreds of datacenters, millions of machines and trillions of rows of information. Precise timing is vital for seamless handoffs between locations and to eliminate drag, but also to ensure that nobody is writing to a given byte at the same time someone is reading it. Google achieved this global low-latency network by using its own atomic clocks to create a proprietary time protocol (the TrueTime API). Similarly, Meta has utilized atomic clocks in its Metaverse work to ensure minimal latency, among other important features.
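Google’s published TrueTime design makes the clock-uncertainty idea explicit: instead of a single timestamp, the API returns an interval guaranteed to contain the true time, and a transaction “commit waits” until its timestamp is safely in the past. Here is a simplified sketch of that idea (not Google’s actual API; the uncertainty bound is an invented illustrative value):

```python
import time
from dataclasses import dataclass

EPSILON = 0.004  # assumed clock-uncertainty bound in seconds (illustrative)

@dataclass
class TTInterval:
    earliest: float
    latest: float

def tt_now() -> TTInterval:
    t = time.time()
    return TTInterval(t - EPSILON, t + EPSILON)   # true time lies somewhere in this interval

def commit_timestamp() -> float:
    s = tt_now().latest              # pessimistic commit timestamp
    while tt_now().earliest < s:     # "commit wait": spin until s is definitely in the past
        time.sleep(0.001)
    return s                         # safe to expose: no clock anywhere still reads before s
```

Tighter clocks mean a smaller EPSILON and shorter commit waits, which is precisely why Google put atomic clocks in its datacenters.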
Time is Money
You undoubtedly have heard the cliché that “time is money”. Here are two examples of how this can be literally true, especially as it relates to US defense initiatives:
In January of this year, Frequency Electronics was awarded a contract worth up to $20.2 million for the development of a Mercury Ion Atomic Clock for applications in various US Naval platforms.
In December 2021, Vescent Photonics was awarded a contract worth up to $16.2 million to develop portable atomic clocks for the Office of Naval Research (ONR) Compact Rubidium Optical Clock (CROC) program.
These are just two examples of such programs, highlighting the increasing importance of exquisitely accurate, field-deployable atomic clocks. A report released earlier this month suggests that the overall size of the atomic clock market will exceed $740 million by 2028. The Vescent deal cited above is being fulfilled in partnership with Infleqtion, Octave Photonics and NIST. The group aims to improve upon existing commercial atomic clocks by interrogating a two-photon optical clock transition in a warm vapor of rubidium (Rb) atoms. As Scott Davis, CEO of Vescent told me, “Exploiting the frequencies of quantized atomic energy levels to define the second, i.e., atomic clocks, has changed the world. These historically have used microwave transitions (lower energy). After the advent of the optical frequency comb, quantized transition at optical frequencies can be utilized. This represents an orders of magnitude step in performance. Vescent manufactures combs designed, for the first time, to leave the lab. This is enabling a next generation of deployed, higher performing, optical atomic clocks.” The CROC program is likely the first of many similar programs where Vescent will apply its technologies.
Can we Create Even More Precise Clocks?
The short answer is yes, by using “quantum” clocks versus existing atomic clocks. As the quantum information industry continues to advance, developments have broad benefits across the industry. As Jun Ye, a Fellow at both NIST and JILA and a recently named member of President Biden’s National Quantum Initiative Advisory Committee, noted to me, he is working on “experimental atomic clocks which explore the new measurement frontier based on quantum science. From this perspective quantum [optical] atomic clocks and quantum information processing are connected through shared intellectual development and technological advances.”
The following graphic helps display the way frequency combs act like a gear between the ultrafast optical frequencies and the microwave frequencies, which can be counted by current detectors.
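The “gear” analogy has a precise form: every optical line of a comb sits at a frequency determined by two microwave-rate quantities that ordinary electronics can count, the repetition rate and the carrier-envelope offset:

```latex
\nu_n = f_{\mathrm{ceo}} + n \, f_{\mathrm{rep}}
```

Here n is a large integer, so counting the two microwave frequencies effectively pins down optical frequencies of hundreds of terahertz.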
In addition to the creation of frequency combs (the 2005 Nobel Prize in Physics), controlling the atoms used for measuring frequency transitions is also vital to increasing the accuracy of the underlying clocks: the movement of the atoms leads to Doppler shifts (similar to what you hear as a speeding car goes by) that reduce measurement precision, so “laser cooling” helps push accuracy up. Companies like Infleqtion (f/k/a ColdQuanta) are leveraging their broad capabilities in cold atom science to contribute to improved clocks, with the challenge now being to move these optical atomic clocks out of the lab and into the field. As Max Perez, VP of Research and Security Solutions at Infleqtion, told me, “It’s important to get noise out of the system so you can achieve long-term stability…the challenge has been that the laser systems required for cold-atom clocks have been expensive and complicated. The lasers need to be highly tuned with very specific and narrow line widths…and a big part of what we are doing is bringing down the cost and size by leveraging our various technologies.”
Early atomic clocks were room-sized and much less accurate than today’s. As is common with many technologies, science has been able to improve the accuracy and decrease the size (and cost) of atomic clocks. In fact, today it is possible to compact certain atomic clocks onto a microchip.
The example to the right weighs only 35 g and is less than 17 cm³ in volume. Photonic clocks are earlier in their evolution, and progress is being made to move these out of the lab and into the field (with the aim of also bringing them down to chip scale via photonically integrated circuits).
The following highlights some of the companies manufacturing atomic clocks and/or the components that are used to create them:
Conclusion
The surging attention and resources dedicated to quantum mechanics have yielded amazing technological advances. Much of the Quantum Leap blog has focused on applications in Quantum Computing (QC), but other related technologies are also pushing the frontiers of technology and knowledge. More accurate clocks, leveraging certain advances developed for broader quantum information science, are already beginning to have practical applications beyond QC, including fundamental advances in physics, more accurate sensors and more precise timekeeping. As Jun Ye further noted, “Atomic clocks…represent some of the most exquisitely sensitive and accurate scientific instruments that humankind has built to explore the unknowns of nature.” It’s an exciting time to be following quantum science, and I look forward to tracking and reporting on evolving breakthroughs.
Disclosure: The author does not have any business relationship with any company mentioned in this post. The views expressed herein are solely the views of the author and are not necessarily the views of Corporate Fuel Partners or any of its affiliates. Views are not intended to provide, and should not be relied upon for, investment advice.
References:
“Clocks” graphic by Andrey Grushnikov via Pexels.com.
Davis, Scott, CEO of Vescent, Interview conducted by the author on December 16, 2022.
Perez, Max, VP of Research and Security Solutions at Infleqtion, interview conducted by the author on December 16, 2022.
Ye, Jun, NIST Fellow, head of the Ye Group at JILA and Adjoint Professor at the University of Colorado Boulder, via email exchange, November 24, 2022.
Some have described the rapidly accelerating global push in Quantum Computing as a figurative “space race” given the potential reach of its computational power and its applications in drug development, logistics, material science, and its potential ability to overpower existing encryption techniques. However, this post is focused on the literal quantum space race – the increasing number of quantum devices in orbit and their profound applications. While the fragility of quantum states has been a core challenge in advancing Quantum Computers, that same challenge is a powerful asset for creating ultra-sensitive measuring instruments, and these quantum sensors are now making their way into orbit.
Quantum sensing and quantum communications are making important advances in space in the following areas:
Earth Sensing and Observation
Quantum Key Distribution (QKD) and Secure Communication Networks
Time and Frequency Transfer
Fundamental Physics and Space Exploration
Today there are 77 countries with space agencies; 16 of these countries have launch capabilities, and more than 4,500 satellites are currently in earth orbit. Satellites containing quantum devices are increasingly being placed into orbit, and quantum devices have been used in, and deployed from, the International Space Station. As Arthur Herman noted in a recent Forbes article: “Quantum communication satellites will become hubs of not only a future quantum internet, but hubs for hack-proof networks for transfer of classified data and communications – not to mention a command-and-control architecture that will be an integral part of space domain dominance” [emphasis added].
The following chart is a partial sampling of existing and planned quantum space launches:
Note: Above chart not intended to be all-inclusive, and some programs have contributions from additional countries.
We are already increasingly dependent on satellites for global communications and GPS service, among other applications, and space-based experiments are advancing basic science and human knowledge. Adding the powerful capabilities of quantum technologies will accelerate and expand upon these space-based advances. The following summarizes some important space-based quantum initiatives:
Earth Sensing and Observation
A key attribute of quantum mechanics, and one of the main rate limiters in advancing Quantum Computing, is the fragility of the tiny particles placed into a quantum state. Specifically, controlling individual atoms, electrons or photons has been very difficult due to the sensitivity of such particles to external forces including gravity, electromagnetic radiation, temperature fluctuations, and vibrations. However, it is this very sensitivity that makes “qubits” such powerful sensors, enabling them to study and assay the earth in detail never before available.
Space-based (satellite) quantum sensors can provide reliable detection, imaging, and mapping of underground environments, from transit tunnels, sewers and water pipes to ancient ruins, mines, and subterranean habitats. There are important civil engineering benefits that more precise sensing can achieve, particularly around large projects (e.g., nuclear power plants, high-speed rail, etc.) where existing subsurface surveys are extremely expensive, time-consuming, and often not as precise as necessary. Such space-based sensors can also be used to track minute gravitational changes and tectonic shifts that can forewarn of avalanches, earthquakes, volcanic eruptions, or tsunamis. The strength of Earth’s gravitational field varies from place to place: variations are caused by factors such as the relative positions of mountains and ocean trenches and differences in the density of the Earth’s interior, but also by small fluctuations in underground water reservoirs and changes in ice mass. Gravimetry is therefore an important new tool to help monitor global warming.
Quantum Key Distribution (QKD) and Secure Communication Networks
QKD is a secure communication method that uses the quantum properties of photons to distribute secret keys that can be shared by two parties to encode their communications. The technique is considered un-hackable since any attempt to eavesdrop disturbs the keys and is immediately detectable. Current forms of encryption, such as the widely used RSA public-key cryptosystem, rely on the difficulty of solving mathematical problems, whereas QKD instead relies on physical processes. In quantum physics, the “no-cloning” theorem states that it is impossible to create identical copies of an unknown quantum state. This prevents hackers from simply copying the quantum-encoded information. Another quantum property, known as the “observer effect,” causes quantum states to change upon observation; therefore, if anyone were to try to read the QKD transmission, it would change, and that change would be instantly known by the parties involved. (If interested in learning more about QKD please see here.)
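A stripped-down simulation of the best-known QKD protocol (BB84) shows the mechanics of sifting a shared key; real systems add error estimation and privacy amplification, omitted here:

```python
import random

n = 16
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("XZ") for _ in range(n)]   # random encoding bases
bob_bases   = [random.choice("XZ") for _ in range(n)]   # random measurement bases

# If Bob guesses the basis correctly he recovers Alice's bit;
# a mismatched basis yields a random result (quantum measurement).
bob_bits = [a if ab == bb else random.randint(0, 1)
            for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: both parties publicly compare bases (never bits) and keep the matches.
key = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
print(key)   # ~half the rounds survive; an eavesdropper would raise the error rate
```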
QKD has already been successfully implemented via fiber optic cables, but only over short distances. Beyond 100 kilometers (about 60 miles) the signal degrades and beyond 300 kilometers the information transmission becomes prohibitively slow (i.e., only about one bit per second). In fact, the signal degradation increases exponentially as the distance increases. By using satellites in low-earth orbit (LEO) to send and receive transmissions via line-of-sight, this distance challenge can be largely overcome. LEO orbits can provide line-of-sight transmission between earth-based ground stations that are up to about 700 kilometers (about 430 miles) apart, although this limitation can be exceeded if the key can be stored in the satellite while it orbits or, preferably, by relaying the signal among connected satellites.
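The exponential degradation is simply fiber attenuation compounding with distance. Assuming a typical telecom-fiber loss of ~0.2 dB/km at 1550 nm, a quick sketch:

```python
LOSS_DB_PER_KM = 0.2   # typical single-mode fiber loss at 1550 nm

def photon_transmission(km):
    return 10 ** (-LOSS_DB_PER_KM * km / 10)

print(photon_transmission(100))   # ~1e-2: one photon in a hundred arrives
print(photon_transmission(300))   # ~1e-6: key rates collapse toward bits per second
print(photon_transmission(1000))  # ~1e-20: hopeless without satellites or quantum repeaters
```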
Naturally, un-hackable communications is a key objective of many governments as well as certain industrial firms, hence the broad number of countries currently working on space-based QKD.
Time and Frequency Transfer
An overwhelming array of modern conveniences relies upon highly accurate clocks. [In fact, this is such a prevalent and important observation that my next post will be dedicated to the need for more precise time measurement.] Many electric power grids use clocks to fine-tune current flow. Telecom networks rely on GPS clocks to keep cell towers synchronized so calls can be passed between them. The finance sector uses clocks to timestamp ATM, credit card and high-frequency trades. Doppler radar, seismic monitoring and even multi-camera sequencing for film production all use highly precise clocks. Today’s earth-based atomic clocks are extremely accurate, and you can readily synchronize your computer to the atomic clock of your choice. However, relying on existing atomic clocks for timestamping, as currently done for GPS satellites, is becoming increasingly challenging. GPS navigation is currently accurate to about three meters (about 10 feet), which presents challenges for applications such as autonomous driving.
In order to improve on existing timekeeping and related applications, we need both a more accurate clock and more precise dissemination and sharing of time. Quantum technologies can improve time accuracy by orders of magnitude, and placing them in space can enhance dissemination. Increased time accuracy will improve current communications and geolocation services as well as enable new applications, and a space-based quantum clock can enable long-range time transfer.
Fundamental Physics and Space Exploration
NASA’s Cold Atom Lab aboard the International Space Station (ISS) has used atom interferometry to create a new generation of exquisitely precise quantum sensors that scientists are using to explore the universe. Applications of these spaceborne quantum sensors include tests of general relativity, searches for dark energy and gravitational waves, spacecraft navigation and drag referencing, and gravity science, including planetary geodesy—the study of a planet’s shape, orientation, and gravity field.
In 2019, the image of a supermassive black hole was created using earth-based synthetic aperture telescopes. By precisely measuring the arrival time of radio waves at two different locations, an image of their source was created. Because visible light wavelengths are much shorter than radio waves (nanometers vs meters), more sensitive detectors and clocks are required to use this methodology for visible light, such as those now being placed in orbit. The resolution of such an image would match the resolution of a conventional telescope with an aperture equal to the distance between the two satellites. Such telescopes would be extremely sensitive, potentially enabling astronomers to study planets around other stars in vast detail.
Space-based quantum sensors will also be crucial for space exploration. As spacecraft venture further from Earth, the ability to provide navigational instructions diminishes. Naturally, “GPS” would be unavailable in deep space, and Earth-based control signals suffer increasing time lags as spacecraft travel farther away. Additionally, if such Earth-based navigational commands are not precise enough, the target craft may miss its destination completely. Sensors that can measure a vehicle’s acceleration and rotation can enable navigation without requiring external commands. In addition, space-based quantum sensors are planned to help search for water and other resources on the moon and Mars.
Conclusions
The pace of advances in quantum science is rapid and paradigm shifting. While Quantum Computing gets most of the headlines, quantum sensing and communication are also advancing rapidly, including via deployment in space. By placing powerful quantum devices into space, significant advances in earth observation, space exploration and secure communications are being achieved. Given the intensely competitive nature of terrestrial quantum advances, extending this to a “space race” is inevitable and, in fact, already underway. Readers should anticipate more and more headlines on this topic, and I look forward to providing periodic updates.
Disclosure: The author does not have any business relationship with any company mentioned in this post. The views expressed herein are solely the views of the author and are not necessarily the views of Corporate Fuel Partners or any of its affiliates. Views are not intended to provide, and should not be relied upon for, investment advice.
If you enjoyed this post, please visit my website, and enter your email to receive future posts and updates: http://quantumtech.blog
Russ Fein is a venture investor with deep interests in Quantum Computing (QC). For more of his thoughts about QC please visit the link to the left. For more information about his firm, please visit Corporate Fuel. Russ can be reached at russ@quantumtech.blog.
In December 2021, in an early iteration of this Blog, I described the various qubit modalities in use by some of the Quantum Computing (QC) hardware players. A lot has happened since that post, so I thought it would be constructive to revisit the topic.
When that earlier post was published (click here if interested in reviewing), it described 10 leading quantum hardware companies focusing on four core qubit types (superconducting, trapped ions, photonics and quantum dots). Today there are dozens of quantum hardware companies, a few additional common modalities (notably neutral atoms) and significant advances made across the spectrum.
Qubit Dynamics
While many articles describing and comparing QCs focus on the number of qubits, this core number belies the complexity in comparing actual QC performance due to additional limitations described below. Qubit count is the equivalent of only using horsepower to describe a car. While horsepower is an important metric, most car buyers are equally if not more focused on comfort, handling, fuel economy, styling, etc. Some effort has been made to “consolidate” these variables for QC into a single performance metric (such as Quantum Volume, CLOPS (circuit layer operations per second) or QED-C’s Benchmarks), although no single measurement has yet been adopted by the broad QC ecosystem. For the casual reader, I’d caution you to not focus too much on the number of qubits a given QC has. While “more is better” is generally a useful mantra, as you’ll see below, it is not that simple.
As you may know or recall, superposition (a qubit being both “0” and “1” at the same time) and entanglement (the state of one qubit being dependent on the state of another) are two fundamental quantum properties that empower Quantum Computers and allow them to perform certain calculations that can’t easily be executed on traditional computers. Before we review the various types of qubits (i.e., quantum hardware platforms), it may be helpful to summarize some of the limitations faced when placing qubits in superposition and/or entangling multiple qubits, and to discuss the key metrics used to measure these properties.
Two-qubit Gate Error Rate: Entanglement is a core property of QCs and the two-qubit gate error rate is the second-most-often reported metric (after qubit count). An error rate of 1% is the equivalent of 99% gate fidelity. You may have come across the concept of a ‘CNOT gate’ or controlled-not gate, which simply takes two qubits and when the first (control qubit) is in a desired state, it flips the second (target qubit). While this sounds basic and simplistic, it is this correlating of the qubits that enables the exponential speedup of QCs. Said another way, it is a method for enabling QCs to analyze multiple pathways simultaneously, and so is truly a fundamental property being leveraged by QCs. Many in the industry suggest that 2Q fidelities exceeding 99.99% will be required to achieve quantum advantage and some modalities are approaching that (for example, IonQ has achieved 99.92%), but most are still considerably below that threshold.
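For readers who like to tinker, here is a minimal sketch of the CNOT behavior described above, assuming the open-source Qiskit library is installed (the circuit is illustrative and not tied to any vendor’s hardware):

```python
# Entangling two qubits with a CNOT (assumes qiskit is installed).
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h(0)      # put the control qubit into superposition
qc.cx(0, 1)  # CNOT: flip the target only when the control is 1

# The result is the entangled Bell state (|00> + |11>)/sqrt(2).
print(Statevector.from_instruction(qc))
```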
Single Qubit/Rotation Error Rate: Single-qubit gates, often referred to as “rotations”, adjust the qubit around various axes (i.e., the x-axis, y-axis, and z-axis). In classical computing, you may be familiar with a NOT gate, which essentially returns the opposite of whatever is read by the machine. So, a NOT applied to a 0 “flips” it to a 1. Similarly, in quantum computing, we have the X-Gate, which rotates the qubit 180 degrees (around the X-axis) and so also takes a 0 and “flips” it to a 1. Given the exquisite control required to manipulate qubits, it is possible that the pulse instructing the qubit to “flip” may only apply 179 degrees of rotation instead of the required 180 and therefore introduce some error, especially if such imprecision impacts many qubits within an algorithm (a rough calculation follows).
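To put a number on that 179-degree example, the numpy sketch below (my own illustration, not a vendor benchmark) computes the fidelity of an under-rotated flip and shows how the shortfall compounds over many gates:

```python
# Fidelity cost of a 179-degree "flip" instead of a 180-degree one.
import numpy as np

def rx(theta):
    """Single-qubit rotation about the X-axis."""
    return np.array([[np.cos(theta / 2), -1j * np.sin(theta / 2)],
                     [-1j * np.sin(theta / 2), np.cos(theta / 2)]])

ket0   = np.array([1, 0], dtype=complex)
ideal  = rx(np.pi) @ ket0            # perfect 180-degree rotation
actual = rx(np.deg2rad(179)) @ ket0  # slightly under-rotated pulse

fidelity = abs(np.vdot(ideal, actual)) ** 2
print(f"single-gate fidelity: {fidelity:.6f}")          # ~0.999924
print(f"after 1,000 such gates: {fidelity**1000:.3f}")  # ~0.927
```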
Decoherence Time (T1 and T2): T1 (qubit lifetime) and T2 (qubit coherence time) are effectively two views of the same question, namely “how long do the qubits remain in a state useful for computation?” Specifically, T1 measures a qubit’s lifetime, or how long we can distinguish a 1 from a 0, while T2 is focused on phase coherence, a more subtle but also crucial aspect of qubit performance. Many early QC modalities, such as superconducting, have modest T2 times, capping out at around 100 microseconds (millionths of a second), whereas some recent entrants such as neutral atoms have achieved T2 as long as 10 seconds, and certain trapped ions have extended that to 50 seconds. This difference of many orders of magnitude in T2 is a key differentiator among qubit modalities (a rough illustration follows).
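As a rough illustration of how T2 interacts with gate speed (the next metric), the sketch below counts how many 100-nanosecond gate windows fit inside each T2 figure quoted above. The single shared gate time is a simplifying assumption on my part; real gate times vary enormously by modality:

```python
# How many gate windows fit within T2? (The uniform 100 ns gate time is
# an assumption for illustration; real gate times differ by modality.)
gate_time_s = 100e-9
for name, t2_s in [("superconducting", 100e-6),
                   ("neutral atom", 10.0),
                   ("trapped ion", 50.0)]:
    print(f"{name}: T2 = {t2_s} s -> ~{t2_s / gate_time_s:.0e} gate windows")
```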
Gate Speed: A metric that measures how quickly a QC can perform a given quantum gate. It is especially important relative to the decoherence time noted above: the QC must implement its gates BEFORE the system breaks down or decoheres. Gate speed will become increasingly important as a raw metric of time-to-solution, where microseconds add up. Interestingly, the modalities with relatively short T2 times (i.e., superconducting and photonic) generally have the fastest gate speeds (measured in nanoseconds, or billionths of a second).
Connectivity: Sometimes referred to as topology, connectivity describes the layout of the qubits in a grid and how many neighboring qubits a given qubit can interact with. In many standard layouts, the qubits are lined up in rows and columns, with each qubit able to connect to its four “nearest neighbors”. Other systems have “all-to-all” connectivity, meaning every qubit is connected to every other one. If two qubits can’t directly interact, “swaps” can be inserted to move the information around and enable virtual connections; however, this adds overhead, which translates into increased error rates (a sketch of this overhead follows).
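The swap overhead is easy to see with a transpiler. The sketch below (assuming Qiskit is installed; the three-qubit line is a toy coupling map) routes a CNOT between non-neighboring qubits and counts the extra gates that appear:

```python
# Swap overhead on a 3-qubit line topology (assumes qiskit is installed).
from qiskit import QuantumCircuit, transpile

qc = QuantumCircuit(3)
qc.cx(0, 2)  # qubits 0 and 2 are not neighbors on a line

line = [[0, 1], [1, 2]]  # nearest-neighbor coupling map: 0-1-2
routed = transpile(qc, coupling_map=line, basis_gates=["cx", "u"],
                   initial_layout=[0, 1, 2])  # pin layout to force routing
print(routed.count_ops())  # extra cx gates appear from SWAP insertion
```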
SPAM (State Preparation and Measurement) Error Rate: At the start of any quantum algorithm, the user must first set the initial state, and at the end, that user must measure the result. SPAM error measures the likelihood of a system doing this correctly. A 1% SPAM error on a five-qubit system still gives a very high likelihood that the results will be read correctly (0.99^5 ≈ 95%), but as the system scales this becomes more problematic (see the arithmetic below).
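The arithmetic behind that claim, and how quickly it worsens with scale, fits in a couple of lines:

```python
# SPAM success probability versus qubit count, at 1% per-qubit error.
p_ok = 0.99
for n in (5, 50, 500):
    print(f"{n} qubits: {p_ok**n:.1%} chance of a clean prep-and-read")
# 5 qubits -> ~95%, 50 -> ~61%, 500 -> ~0.7%: SPAM dominates at scale.
```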
Qubit Modalities
When classical computing was in its infancy, several different transistor designs were developed. Similarly, today there are many ways to create a qubit, and there are crucial performance trade-offs among them. The following is a brief overview of some of the more common types:
Superconducting Qubits: Some leading Quantum Computing firms, including Google and IBM, use superconducting transmons as qubits, the core of which is a Josephson junction: a pair of superconducting metal strips separated by a tiny gap of just one nanometer (less than the width of a DNA molecule). The superconducting state, achieved at near absolute-zero temperatures, allows a resistance-free oscillation back and forth around a circuit loop. A microwave resonator then excites the current into a superposition state, and the quantum effects are a result of how the electrons cross this gap. Superconducting qubits have been used for many years, so there is abundant experimental knowledge, and they appear to be quite scalable. However, the requirement to operate near absolute zero adds a layer of complexity and makes some of the measurement instrumentation difficult to engineer.
Trapped Ions: Another common qubit construct utilizes the electrical charge that certain elemental ions exhibit. Ions are normal atoms that have gained or lost electrons, thus acquiring an electrical charge. Such charged atoms can be held in place via electric fields, and the energy states of the outer electrons can be manipulated using lasers to excite or cool the target electron. These target electrons move or “leap” (the origin of the term “quantum leap”) between outer orbits as they absorb or emit single photons. These photons are measured using photo-multiplier tubes (PMTs) or charge-coupled device (CCD) cameras. Trapped ions are highly accurate and stable, although they are slow to react and need the coordinated control of many lasers.
Photonic Qubits: Photons do not have mass or charge and therefore do not interact with each other, making them ideal candidates for quantum information processing. However, this same feature makes two-qubit gate implementation particularly challenging. Photons are manipulated using phase shifters and beam splitters and are sent through a maze of optical channels on a specially designed chip, where they are measured by their horizontal or vertical polarity.
Neutral Atoms: Sometimes referred to as “cold atoms”, these systems are built from an array of individual atoms trapped in a room-temperature vacuum and chilled to ultra-low temperatures using lasers as optical “tweezers”, which restrict the movement of the individual atoms and thereby chill them. These neutral atoms can be put into a highly excited state by firing laser pulses at them, which expands the radius of the outer electron (a Rydberg state) and can be used to entangle them with each other. In addition to large connectivity, neutral atoms can implement multi-qubit gates involving more than two qubits, which is instrumental in several quantum algorithms (e.g., Grover search) and highly efficient for Toffoli (CCNOT) gates.
Semiconductor/Silicon Dots: A quantum dot is a nanoparticle created from a semiconductor material such as cadmium sulfide or germanium, but most often from silicon (due to the large amount of knowledge derived from decades of silicon chip manufacturing in the semiconductor industry). Artificial atoms are created by adding an electron to a pure silicon atom, which is held in place using electrical fields. The spin of the electron is then controlled and measured via microwaves.
The following table highlights some of the features of these various qubit modalities, as of Oct. 2022:
There are a few other modalities, including NV diamonds, topological qubits, Nuclear Magnetic Resonance (which seems more experimental and very difficult to scale) and quantum annealing (used by D-Wave, one of the first firms to offer commercial “quantum” computers, though annealing is not a true gate-capable construct), and it is likely that more methodologies will be developed.
The following table summarizes some of the benefits and challenges along with select current proponents of key qubit technologies currently in use:
The table above is not intended to be all-inclusive. In fact, there is an excellent compendium of qubit technologies put out by Doug Finke’s Quantum Computing Report which can be accessed here (behind a pay wall, but well worth the fee), and which includes over 150 different quantum hardware computing programs/efforts. A special thank-you also to David Shaw and his Fact Based Insight website which has covered this topic in great detail.
Conclusions
As noted in this post, there have been significant advancements in Quantum Computing hardware over the past year or so, and I expect this momentum will continue in 2023. Presently there are QCs with 10s to 100s of qubits, and the coherence, connectivity and control on these early machines continue to improve. In 2023 we should see machines with 1,000s of qubits (e.g., IBM is on pace to release their Osprey QC with 433 qubits before year end and their 1,121-qubit Condor QC next year). Adding sophisticated control and advanced algorithm compilation further extends the capability of these early machines. Whether and when we can achieve universally recognized quantum advantage (i.e., these QCs performing operations that existing supercomputers cannot do) during this NISQ (noisy intermediate-scale quantum) era remains to be seen, but this author believes this will happen in the ’23-’24 timeframe and is excited to continue tracking (and reporting on) the progress.
Disclosure: The author has modest positions in some stocks discussed in this review but does not have any business relationship with any company mentioned in this post. The views expressed herein are solely the views of the author and are not necessarily the views of Corporate Fuel Partners or any of its affiliates. Views are not intended to provide, and should not be relied upon for, investment advice.
References:
Qubit images from Science, C. Bickel, December 2016, Science, V. Altounian, September 2018, New Journal of Physics, Lianghui, Yong, Zhengo-Wei, Guang-Can and Xingxiang, June 2010
Performance Tables and additional modality details from Fact Based Insight, Accessed October 2022
We all know what “noise” is. And we all appreciate that it is usually an unwelcome invasion of our peace and quiet. Screaming babies on airplanes, jackhammers in the street, leaf blowers outside your window: all can ruin an otherwise tranquil setting. “Noise” in computer lingo represents a similar disconcerting situation. In Quantum Computing (QC), you have likely come across the concept of noise as a major obstacle to QCs achieving their potential. In fact, John Preskill, a professor of theoretical physics at Caltech and one of the pioneers of QC, coined the acronym “NISQ”, standing for Noisy Intermediate-Scale Quantum, which is used to describe today’s QC stage. There are several significant challenges facing QC makers today, and “noise” is one of the most difficult to overcome.
Quantum Computing Noise
There are many causes for the underlying noise in QCs. In order to best visualize and understand this, here is a reminder of how qubits (the underlying components of QC processing, comprised of individual atoms, photons or electrons) store and manipulate information:
The graphic above depicts a few rotations of a qubit, with the blue arrows pointing to various points before and after a rotation (various rotations are implemented via gates representing algorithm commands) and the red arrow showing the axis of rotation. The ending position of the blue arrow contains important and precise information but can move incorrectly due to several noise factors. Here are a few of the core sources:
The Environment: Qubits are exquisitely sensitive to any changes in their environment. Small changes in temperature or stray electrical or magnetics fields can disturb qubits and cause a degradation of the information. Even weak galactic space radiation can push qubits and thereby degrade them.
Crosstalk: Quantum Computers are powered by qubits acting together. Generally, individual qubits are manipulated by lasers or microwaves. However, sometimes the laser or microwave signal can impact nearby qubits as well as the target qubit, an issue known as crosstalk.
Quantum Decoherence: A qubit’s quantum state deteriorates rapidly, often even after just fractions of a second, requiring QCs to initiate and complete their algorithms before quantum states collapse.
Implementation Errors: The commands or gates of a quantum algorithm apply various rotations to the qubit; these are implemented by laser or microwave pulses, which can themselves be somewhat imprecise. For example, an X-Gate, which is analogous to a NOT gate in a classical computer, essentially “flips” the qubit by rotating it 180 degrees. If the pulse command only produces a 179-degree rotation, the subsequent calculations will be off by a potentially meaningful amount.
You may be familiar with the term “five 9’s”, which is often used in the context of super-high performance. It generally means a system with 99.999% accuracy, or only one error per 100,000 instances. For service level agreements with, say, your cloud provider, five nines would mean less than 5.26 minutes of downtime per year. It’s a high standard, recognizing the reality that certain systems suffer from various unknown or unpredictable challenges. While Quantum Computer makers continue to improve the fidelities of their qubits (the underlying physical components which process quantum gates and algorithms), none have been able to achieve greater than 99.9% two-qubit gate fidelities. While that may sound high and would likely have been an acceptable grade on your physics final, it is not enough to enable Quantum Computers to perform the complex algorithms necessary to outperform existing classical computers.
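To see why 99.9% falls short, a rough rule of thumb is that a circuit’s chance of finishing error-free is its gate fidelity raised to the power of the gate count. The sketch below is a simplification (it ignores error mitigation and correction entirely), but it conveys the scaling:

```python
# Rough circuit success probability: fidelity ** gate_count.
for fidelity in (0.999, 0.9999):
    for gates in (100, 1_000, 10_000):
        print(f"fidelity {fidelity}, {gates:>6} gates: "
              f"{fidelity**gates:.1%} error-free runs")
# At 99.9% fidelity, a 1,000-gate circuit succeeds only ~37% of the time
# and a 10,000-gate circuit almost never; hence the push toward 99.99%+.
```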
The non-technical takeaway: Quantum Computations are run via qubits which are very difficult to control, are vulnerable to the tiniest environmental changes and have a natural tendency to move, leading to a degradation of the information.
Error correction is the single largest challenge facing QC advancement today, and there are many ways that companies are addressing this issue.
How to Overcome Noise Constraints in Quantum Computing
In the 19th century, ships typically carried clocks using the time in Greenwich in combination with the sun’s position in the sky for determining longitude during long trips. However, an incorrect clock could lead to dangerous navigational errors, so ships often carried three clocks. Two clocks showing differing times would detect a fault in one, but three were needed to identify which one was faulty (if two matched the third one was off). This is an example of a repetition code, where information is encoded redundantly in multiple devices, enabling detection and correction of a fault. In QCs, because measurement fundamentally disturbs quantum information, we can’t do interim measurements to identify errors because that would terminate the process, so data is shared among multiple qubits, often referred to as ‘ancillary’ qubits, ‘syndrome’ qubits or ‘helper’ qubits. A series of gates entangles these helper qubits with the original qubits, which effectively transfers noise from the system to multiple helpers. We can then measure the helpers via parity check, which, like those redundant clocks, can reveal errors without touching or measuring the original system. However, the trade-off is the requirement for many physical qubits to act as helpers, adding enormous overhead to QCs.
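Here is a minimal sketch of the quantum analogue, a three-qubit bit-flip repetition code, assuming Qiskit and its Aer simulator are installed. The deliberately injected error and the qubit labels are my own illustrative choices:

```python
# 3-qubit bit-flip repetition code with helper-qubit parity checks
# (assumes qiskit and qiskit-aer are installed).
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

qc = QuantumCircuit(5, 2)   # qubits 0-2: data, qubits 3-4: helpers

qc.cx(0, 1); qc.cx(0, 2)    # encode |0> redundantly as |000>
qc.x(1)                     # deliberately inject a bit-flip on qubit 1

qc.cx(0, 3); qc.cx(1, 3)    # helper 3 records parity of data qubits 0,1
qc.cx(1, 4); qc.cx(2, 4)    # helper 4 records parity of data qubits 1,2
qc.measure([3, 4], [0, 1])  # read ONLY the helpers, never the data

counts = AerSimulator().run(qc, shots=100).result().get_counts()
print(counts)  # '11' every time: both parities odd pinpoints qubit 1
```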
Also, since each step of a quantum algorithm is an opportunity for noise to be introduced, efforts to quicken the runtime or reduce the number of steps (i.e., gates) are intended to minimize the opportunity for noise to corrupt the output. In addition to repetition code methods of finding and correcting errors and overall efforts to minimize circuit depth, there are a few other tools being used to tackle quantum noise. A high-level view of the quantum computing software “stack” should help provide some context for these added methods:
The graphic above is generally referred to as the “full stack” and there are opportunities at each level of the stack to help compensate for or minimize noise. Here are a few methods being deployed:
Quantum Control: At the qubit level, often referred to as the “metal”, engineers continue to optimize the pulses and control signals focused on the qubits as well as create modalities with increasing coherence times. Various ways that the qubits are aligned and/or inter-connected affect this level and advances are being continually announced.
Hardware Aware Optimization: At the Machine Instruction level, focus on transpiler efficiencies can reduce errors and minimize noise impacts. Again, various qubit configurations as well as the specific modalities utilized (superconducting vs optical vs ion vs cold atom, etc.) have an impact on the performance of the algorithms and attention to this level of the stack provides another opportunity for noise reduction.
Compiler Efficiency: Circuit optimization is a target of many players in the QC space. Tools that re-write algorithms to focus on this level of the stack are a growing and important part of the ecosystem. For example, efficient usage of ancillary qubits, and/or resetting them quickly to be re-utilized, requires less run-time and fewer steps, which means less opportunity for noise to impact the programs.
Algorithm efficiency: There are many ways to write quantum algorithms so ensuring that the code is as efficient as possible (i.e., eliminating redundant steps or minimizing needs to reset or recalibrate qubits) is another opportunity to minimize noise. The more efficient the code, the quicker it can run, or the shorter its circuit depth needs to be.
Many Shots: A final tool which is a standard procedure in quantum algorithms, is to run the algorithm many times. Each run is referred to as a “shot” and typical algorithms are run with 1000’s of shots. By averaging the output of these shots, a “regression to the mean” is often realized, meaning the averaging of the results helps various noise impacts cancel each other out. [The fact that quantum algorithms are probabilistic and not deterministic is a major reason for the redundant shots, but this redundancy is also a tool to help overcome noise].
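The plain-Python sketch below mimics the idea: an outcome that should occur half the time, read out through a channel that flips the result 2% of the time (both numbers are arbitrary assumptions). Single shots are unreliable; thousands of shots average the noise down:

```python
# Shot averaging: estimating an outcome probability under readout noise.
import random

def run_shots(n, p_true=0.5, readout_flip=0.02):
    hits = 0
    for _ in range(n):
        bit = 1 if random.random() < p_true else 0
        if random.random() < readout_flip:  # noisy readout flips the bit
            bit ^= 1
        hits += bit
    return hits / n

for shots in (10, 100, 1_000, 10_000):
    print(f"{shots:>6} shots -> estimated probability {run_shots(shots):.3f}")
# The estimate tightens toward the true value as shots grow, though a
# systematic readout bias would still need separate mitigation.
```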
The non-technical takeaway: Noise is a major problem impacting the ability of Quantum Computers to achieve their potential. Until fault-tolerant hardware can be developed, quantum engineers are deploying several creative ways to overcome noise in current QCs.
Quantum Companies Addressing Quantum Noise
There are a number of players focused on noise reduction, deploying inventive solutions to optimize the performance of today’s quantum machines; some of these methodologies achieve performance improvements of orders of magnitude. As the quantum hardware players release ever-larger quantum machines (for example, IBM has announced it will release a machine with more than 1,000 qubits next year), these error-correcting strategies will greatly accelerate the ability of QCs to achieve quantum advantage, with many prognosticators (including yours truly) expecting such achievement sometime next year (at least for certain types of problems). The following is a brief overview of some of the players that offer various quantum noise-reduction solutions:
Classiq: Their flexible and powerful platform automatically creates optimized and hardware-aware circuits from high-level functional models. It automates and simplifies the difficult process of creating quantum algorithms.
Parity QC: Develops blueprints for quantum computers based on their ParityQC architecture, which makes quantum computers scalable by radically reducing control complexity. This allows them to provide a fully programmable, parallelizable (no SWAP gates), and scalable architecture that can be built with greatly reduced complexity, along with a quantum optimization architecture that is independent of the problem. Due to its ability to parallelize gates, the ParityQC architecture introduces algorithms based on global gates: in each step, a pattern of gates is executed at the same time. This removes the need to implement a control signal for each individual gate, requiring only ONE control signal for all gates instead. This provides a huge advantage for hardware design and a route to mitigating cross-talk errors during qubit design.
Q-CTRL: Their quantum control infrastructure software for R&D professionals and quantum computing end users delivers the highest performance error-correcting and suppressing techniques globally, and provides a unique capability accelerating the pathway to the first useful quantum computers. This foundational technology also applies to a new generation of quantum sensors, and enables Q-CTRL to shape and underpin every application of quantum technology.
Riverlane: A quantum software provider with a whole-stack focus, aiming to squeeze out every bit of efficiency. Their Deltaflow.OS® operating system is compatible with all current quantum hardware platforms, including both gate-based and annealing methods, and they work in conjunction with hardware partners to optimize the design of their architecture for error correction.
Super.tech: This successful member of the first cohort of the Chicago Quantum Exchange/ UChicago Duality Accelerator was acquired by ColdQuanta earlier this year. Their SuperstaQ quantum software platform is optimized across the entire quantum stack enabling 2x reductions in error on typical quantum programs. SuperstaQ includes a library of sophisticated error mitigation techniques, including dynamical decoupling, excited state promotion, and zero noise extrapolation. SuperstaQ automatically optimizes quantum programs based on the target hardware’s pulse-level native gates.
Xanadu: Their PennyLane software is a leading programming tool built on a cross-platform Python library for quantum differentiable programming, which enables seamless integration with machine learning tools (a short sketch follows this list). PennyLane also supports a comprehensive set of features, simulators, hardware, and community-led resources that enable users of all levels to easily build, optimize and deploy quantum-classical applications.
Zapata: A quantum computing software company that develops solutions for a wide range of industries. Zapata’s Orquestra™ platform allows users to compose quantum-enabled workflows and orchestrate their execution across classical and quantum technologies. Orquestra combines a powerful software platform, quantum algorithm libraries, and example workflows across machine learning, simulation and optimization. Orquestra automatically scales up and exploits task parallelization opportunities to run quantum algorithms faster.
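Circling back to Xanadu’s PennyLane, here is a minimal sketch of what “quantum differentiable programming” looks like in practice (assuming the pennylane package is installed; the one-qubit circuit is purely illustrative):

```python
# A differentiable quantum circuit in PennyLane.
import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=1)

@qml.qnode(dev)
def circuit(theta):
    qml.RX(theta, wires=0)
    return qml.expval(qml.PauliZ(0))

theta = np.array(0.3, requires_grad=True)
print(circuit(theta))            # expectation value <Z>
print(qml.grad(circuit)(theta))  # its gradient, usable by ML optimizers
```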
Disclosure: The author has no beneficial positions in stocks discussed in this review, nor does he have any business relationship with any company mentioned in this post. The views expressed herein are solely the views of the author and are not necessarily the views of Corporate Fuel Partners or any of its affiliates. Views are not intended to provide, and should not be relied upon for, investment advice.
Graphic from Dunning, Alexander & Gregory, Rachel & Bateman, James & Cooper, Nathan & Himsworth, Matthew & Jones, Jonathan & Freegarde, Tim. Composite pulses for interferometry in a thermal cold atom cloud. Physical Review A. 90. 033608. 10.1103/PhysRevA.90.033608. (2014).
If you enjoyed this post, please visit my website, and enter your email to receive future posts and updates: http://quantumtech.blog
Russ Fein is a private equity/venture investor with deep interests in Quantum Computing (QC). For more of his thoughts about QC please visit the link to the left. For more information about his firm, please visit Corporate Fuel. Russ can be reached at russ@quantumtech.blog.
There have been an increasing number of articles describing a coming “Quantum Winter”. While I am still extremely bullish on the sector and do not believe the industry will suffer a full abandonment by investors, the blunt reality is that the investment winds are shifting and will require a more sober view on quantum companies in the near-term.
Here are two graphics to help frame this situation. The first depicts recent movements in the public markets, and the second traces key venture investments. Specifically, the table below highlights the decline in stock price of four publicly traded quantum computing companies including ATOS (European based broad information tech), IONQ (trapped ion quantum computers), Quantum Computing Inc. (quantum software provider) and Rigetti (superconducting full-stack quantum computers). Together, these companies are off 75% from their recent highs whereas the broader NASDAQ index is down 30%.
So, while the overall market is suffering a broad decline, including the tech-heavy NASDAQ, this bucket of quantum stocks is down more than double that amount. It is somewhat encouraging that these firms were able to go public recently, but their poor stock performance will make it increasingly difficult for other early-stage quantum companies to follow suit. These four firms are a small sample of the overall quantum industry and the chart is not market-weighted, so this isn’t a statistically clean analysis. Still, the undeniable conclusion is that investors in publicly traded quantum stocks face a very steep hill (as do employees granted stock options at anything close to the IPO prices), and private quantum companies considering the public markets as a way to raise operating capital will likely need to wait at least a few quarters, if not longer, before they can consider an IPO.
As for the private sector, venture funding of quantum companies had a break-out year in 2021 with nearly $1.5 billion invested in the top 20 funded quantum businesses. And while 2022 had started out strong, we’ve seen a significant decline in funding in the recently ended quarter, as highlighted in the last column below.
Source: PitchBook (excludes grants and debt financing)
A few additional observations:
The largest equity rounds were for firms creating quantum hardware. The bar to entry for others working on various qubit modalities is now exceptionally high. This is not to say others won’t be added to the list, but the days of seed-funded quantum hardware companies are likely over; major institutional support will be required.
Venture-led boards are beginning to urge an increase from 24 months of operating capital to 36 months, to ensure adequate runway. This will necessitate lower spend by portfolio companies, which will translate into longer milestone timelines.
Given the overall market malaise and the recent pull-back by venture investors generally, new QC rounds will become more challenging, and down-rounds are likely. Down rounds have lingering, residual negative effects on capital markets, so this will undoubtedly cause some heartburn in the industry.
Given the existing dearth of talent in the quantum information industry, combined with rationalized firm valuations and needs to preserve capital, I expect we’ll see increasing M&A activity.
It’s well known that markets move in cycles, so difficult fundraising environments are to be expected. That said, I’m still extremely bullish on the space in general, especially taking a 5-10 year view which is the time range most often cited for achievement of consistent quantum advantage.
My general take-away from this analysis is that valuations for quantum companies will become rationalized in the next few quarters, providing an attractive investment window. In addition, while quantum hardware companies have taken much of the spotlight, there are many other players in the quantum ecosystem that will benefit from broader industry adoption, particularly those involved with the “picks and shovels” of QC such as cryogenics, lasers, optics, controllers, vacuums, etc., and certainly for the software providers, especially those agnostic to the form of hardware used. Quantum sensing and communications are also appropriate focus areas.
In summary, I’m not a believer in a full-on quantum winter, but we are in for some near-term challenges and disruption in the Quantum Computing arena. Tighter budgets, more difficult access to funding, and laser-focus on milestone achievement will be the norm. Of course, the evaporating liquidity will make milestone achievement that much more difficult, so there is likely to be some negative feedback loop effect as well. However, in some sense, this will be positive long-term in that “survival of the fittest” will winnow away some of the marginal players. I predict and expect the industry will come away stronger and I look forward to the eventual “Quantum Spring”.
Disclosure: The author maintains personal long positions in certain companies mentioned herein but does not have any business relationship with any company referenced in this post. The views expressed herein are solely the views of the author and are not necessarily the views of Corporate Fuel Partners or any of its affiliates. Views are not intended to provide, and should not be relied upon for, investment advice.
If you enjoyed this post, please visit my website and enter your email to receive future posts and updates: http://quantumtech.blog
Russ Fein is a venture investor with deep interests in Quantum Computing (QC). For more of his thoughts about QC please visit the link to the left. For more information about his firm, please visit Corporate Fuel. Russ can be reached at russ@quantumtech.blog.
When I created this blog, my stated purpose was to follow Quantum Computing (QC) from the perspective of an investor. To date, I have generally posted blogs that either covered technical aspects of QC (e.g., this post explaining superposition and entanglement), or showcased the companies involved in commercializing QC (e.g., this post on the evolving ecosystem). However, I hope you’ll indulge me a bit for this latest post, which approaches QC from a philosophical perspective. It’s an aspect of this field that originally gripped my attention and which underlies much of why quantum mechanics conjures such non-intuitive conclusions. Here are a few concepts that will be covered, each of which likely induces head-scratching:
Wave/Particle Duality
Matter/Energy Equivalence
Superposition and Entanglement
The Observer Effect
The Uncertainty Principle
“Imaginary” Numbers
As many of you may already know, a core feature of quantum mechanics concerns the “duality” between particles and waves. Certain aspects also deal with the interchange of matter and energy (you are likely already familiar with Einstein’s famous E=mc², which simply and elegantly showed the equivalence between matter and energy). These somewhat non-intuitive principles underpin some fascinating philosophical questions regarding QC. That said, I am approaching this as a lay person, so will not debate any of the theological roots or delve deeply into the underlying physics. However, I hope you will enjoy this mental exercise and that it will spur your curiosity to dig in deeper yourself.
The Quantum Computing “Chicken-and-Egg” Quandary
If you search for resources about the origin of Quantum Computing, you will invariably come across a quote by Richard Feynman, generally cited as the father of QC. In 1981, Feynman said:
“Nature isn’t classical, dammit, and if you want to make a simulation of nature, you better make it quantum mechanical…”
Most current descriptions of how QCs work approach it from the qubit perspective: how to harness the quantum mechanical features of the underlying qubit, be it an atom, electron or photon, in a new computing paradigm where we use machines to solve problems or equations that current classical computers would take too long to solve. While this is truly fascinating, and I am confident it will unlock massive opportunities (and value), it is a bit “backwards” from what Feynman was suggesting. His premise was focused on “simulating nature”: since nature is governed by quantum physics, he was suggesting we need to use quantum physics to better understand nature. It is expected that as QCs become larger and more powerful, we will be able to simulate nature to create better batteries, fertilizers, and medicines, among other things. But QCs will also enable us to answer questions we’ve never thought to ask and which would essentially be gibberish to classical computing processes.
The metaphysics of this concept revolves around using QCs to create better QCs. As we work to scale existing QCs which currently contain tens or hundreds of qubits, an obvious early question is “how do we build better and larger configurations of qubits?” As industry drives towards 1,000,000-qubit machines, it seems obvious (at least to me) that it will take QCs to optimize the configurations of these larger QCs. What is the upper limit of the capabilities such a self-supporting loop can create? This 1,000,000-qubit goal assumes “noisy” qubits, so it is thought that we need 1 million qubits to net-out to 100 logical qubits, and much has been written about the awesome power of 100 logical qubits…but why stop there? What if we had 1,000 or even 1,000,000 logical qubits? The power of such a machine would, essentially, be so massive as to be indescribable.
More on Wave-Particle Duality
Quantum computers derive their power from quantum mechanics, which is the study of the physical properties of nature at the scale of atoms, photons and subatomic particles. Many of the fundamental properties of quantum mechanics revolve around the behaviors of these particles, which exhibit characteristics of both particles and waves. Intuitively, we understand particle behavior, which guides the path of a baseball or the motion of a billiard ball. Similarly, we are familiar with waves and how they can sometimes cancel or enhance each other. However, when particles exhibit both properties simultaneously, non-intuitive things happen, such as superposition and entanglement. While non-intuitive, these features are well proven experimentally and can be explained and predicted using established mathematics, so we must wrestle with the fact that something so non-intuitive is occurring at the smallest scales. Conversely, I have yet to find a satisfactory explanation or formula to describe “the observer effect”. For those of you not familiar with this feature of quantum mechanics, it essentially says the act of measuring something (i.e., observing it) actually changes it. An example of how this manifests in Quantum Computing can be seen if we apply two sequential Hadamard gates. Skipping over the linear algebra and matrix multiplication, just know that if you input a |0〉 to two sequential Hadamard gates, |0〉 is output 100% of the time (i.e., the pair is mathematically equivalent to the identity matrix). However, if you measure the qubit between the two Hadamard gates, the final output becomes |0〉 half of the time and |1〉 the other half of the time. The mere act of “observing” the qubit between gates changes the outcome! How does the qubit know it is being observed?
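This is easy to verify on a simulator. The sketch below (assuming Qiskit and its Aer simulator are installed) runs the two cases described above: two back-to-back Hadamards, and the same pair with a measurement in between:

```python
# Two Hadamards, with and without an interim "observation"
# (assumes qiskit and qiskit-aer are installed).
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

sim = AerSimulator()

hh = QuantumCircuit(1, 1)   # H then H: equivalent to the identity
hh.h(0); hh.h(0)
hh.measure(0, 0)

hmh = QuantumCircuit(1, 2)  # same, but measured between the gates
hmh.h(0)
hmh.measure(0, 0)           # the interim observation collapses the state
hmh.h(0)
hmh.measure(0, 1)

print(sim.run(hh, shots=1000).result().get_counts())   # ~{'0': 1000}
print(sim.run(hmh, shots=1000).result().get_counts())  # final bit ~50/50
```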
The Y-Gate and “Imaginary” Numbers
Nearly any “Intro to Quantum Mechanics” course, book or article, will mention the Stern-Gerlach experiment as one of the first topics. It’s a fascinating subject that is well covered elsewhere, so I won’t provide much detail here (if interested in learning more, the Wikipedia post on the subject is a great intro and a link is included in the References at the end of this post). The Stern–Gerlach experiment involves sending a beam of silver atoms through a magnetic field and observing the deflection. The results show that particles possess an angular momentum that is similar to the angular momentum of a classically spinning object, but that it has only certain quantized values. Another important result is that only one component of a particle’s spin can be measured at one time, meaning that the measurement of the spin along the z-axis destroys information about a particle’s spin along the x and y axes.
Now, if you’ll bear with me a bit as I reference linear algebra (don’t worry, you don’t need to understand linear algebra to appreciate this point), I want to highlight a very metaphysical aspect of this concept. You’ll note below the matrix notation for two essential “gates”, or basic QC functions. The first is known as the “X-Gate”, which is analogous to the “NOT” gate in classical computing. If you apply a NOT gate in classical computing, it switches a 1 to a 0 or a 0 to a 1. In Quantum Computing the X-Gate essentially flips the qubit on its head, also switching a |1〉 to a |0〉 or a |0〉 to a |1〉. This is straightforward, requiring only the most basic familiarity with matrix multiplication to prove. However, the “Y-Gate” is quite different. The Y-Gate essentially turns the qubit on its side, and its matrix representation is suddenly quite foreign. The matrix representation of these two gates is shown below:
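(The original post showed these as an image; the standard matrix forms are reproduced here.)

```latex
X = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}
\qquad
Y = \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix}
```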
You will note for the Y-Gate the introduction of “i” (and -i), the symbol for the unfortunately named “imaginary” number. “i” is mathematically defined as the solution to “x² + 1 = 0.” Although there is no “real” number that solves this equation, it can still be used for certain mathematical functions. It would likely be more fitting to call these “complex” numbers instead of imaginary; mathematicians would describe “i” as “lateral” or “perpendicular” to the plane where the “real” numbers lie. Evoking this concept of “Real” versus “Imaginary” suggests the imaginary numbers are surreal or mystical, and while that is itself a metaphysical notion, the key point is that the information is quite different when oriented along the X-axis versus perpendicularly along the Y-axis. Again, for those familiar with linear algebra, this is rudimentary matrix multiplication, and for those studying quantum physics, it is one of the first topics covered, proven by the Stern-Gerlach experiments back in the 1920s. The take-away for this post is that the same quantum “thing”, oriented in one direction, contains different information if you orient it in a perpendicular manner.
Back to the Beginning
As in the beginning of time. That tiny fraction of an instant before the Big Bang. It is generally believed that our current universe was preceded by a reality where everything (all energy and matter) was confined to an infinitesimally small point. For reasons still largely unexplained, this super-concentrated point exploded and expanded into what is now the observable universe. From apparent nothingness came a stupendously large amount of space, time, energy and matter. Have you ever considered why that happened? Surely many of you studied this as you learned about your religion, and largely consider it from a spiritual perspective. But “something” led to the conversion of the pre-universe composition into the current universe comprised of matter and energy. What force led some aspects of the original pinpoint to manifest as matter and some to manifest as energy? Why isn’t it all “energy” or all “matter”? I like to believe that “quantum” was the driving force even at this time-zero. Let me explain.
Most introductory texts on quantum mechanics refer to the “uncertainty” principle. Heisenberg framed it in the context of never quite knowing both the speed and position of a particle, and it also leads to QC calculations being probabilistic rather than deterministic. This is the concept Einstein was referring to in his famous “God doesn’t play dice…” quote. Imagine for a moment that the original laws governing the Big Bang were completely deterministic. In that case it would seem likely, to me, that the universe would not today be made of various “stuff” but would rather be all of one thing. However, nothing interesting can be built from just one component, and certainly nothing organic. So, the propensity of uncertainty may have led to the creation of energy and of matter of varying configurations, which spurred a universe made of a dizzying array of particles, forces, stars, planets, black holes and the other various wonders of nature. It’s this “quantum-ness” that allows for variability, and it’s the variability that creates differing “things”.
Surfing Across Dimensions
This has just been a sampling of some of the head-scratching aspects of quantum and is intended to spur questions to contemplate as opposed to provide answers. The mathematics which helps explain quantum mechanics, also govern the addition (or subtraction) of spatial dimensions, which also challenge our current world view. Perhaps some of the remaining unanswered questions in quantum can be explained by action/forces in dimensions we cannot see? Perhaps someone will come up with a “grand unified theory” to explain how the strong, weak, and electromagnetic forces all work and interact and how they relate to gravity, and perhaps that will help us understand these questions from an intuitive perspective.
In any case, despite the challenging mathematics, the non-intuitiveness of certain features, and the inability to definitively tie together all the disparate features of matter and energy, Quantum Computers continue to scale and to successfully run algorithms. As these devices become more powerful, perhaps they will help uncover some of these mysteries. In the meantime, I hope this post helps stimulate your wonder, and that you dig in deeper to learn and understand more. I welcome your feedback and ponderings and you can reach me at russ@quantumtech.blog.
Disclosure: The views expressed herein are solely the views of the author and are not necessarily the views of Corporate Fuel Partners or any of its affiliates.
If you enjoyed this post, please visit my website and enter your email to receive future posts and updates: http://quantumtech.blog
Russ Fein is a private equity/venture investor with deep interests in Quantum Computing (QC). For more of his thoughts about QC please visit the link to the left. For more information about his firm, please visit Corporate Fuel. Russ can be reached at russ@quantumtech.blog.
A prior post entitled “Collaboration Dominates Quantum Computing” included an overview of ColdQuanta, a global quantum technology company building quantum computers, sensors, and related products. This post provides additional information and details on its broad yet complementary business units. As ColdQuanta management relayed to me, ColdQuanta is “not just a Quantum Computing company,” it really is “a quantum technology company.” ColdQuanta’s strong history and momentum in quantum sensing, combined with its recently proven Quantum Computing capabilities, amounts to a powerful leader in broad quantum commercialization. This quantum platform focus supports an overall assessment of the likelihood of their success with a RatingAlpha = 0.95/Exceptional Performance Expected (see the Rating section for details).
Background
Based in Boulder, Colorado, ColdQuanta traces its roots to Drs. Eric Cornell and Carl Wieman, who created the first ever Bose-Einstein Condensate (BEC) at CU Boulder in 1995, a feat for which they were awarded the 2001 Nobel Prize in Physics. BEC is a new form of matter, created when atoms are cooled close to absolute zero. ColdQuanta uses lasers to arrange and hold in place either cesium atoms (cooled to a few microKelvin, or millionths of a degree, for qubits) or rubidium atoms (cooled to nanoKelvin, or billionths of a degree, to make BEC, where the atoms act as a single quantum object, used most notably in sensing). Since temperature is a measure of kinetic movement, locking these atoms in place reduces their movement and hence their temperature. ColdQuanta uses this cold atom method across multiple quantum applications including gate-based quantum computers as well as a variety of quantum sensing and signal processing applications such as High Precision Clocks, Quantum Positioning Systems (QPS), Quantum Radio Frequency Receivers (QRF) and Quantum Networking and Communications. While Quantum Computing steals most of the “quantum” headlines these days, these other quantum-enabled devices bring enormous advances in their fields and, importantly, current revenues.
Core Technology
ColdQuanta, as its name implies, uses the quantum mechanics of “cold” atoms as its fundamental technology for a variety of important and compelling applications. As Paul Lipman, President of Quantum Information Platforms for ColdQuanta conveyed to me:
“Atoms are nature’s perfect qubit. Atoms are in and of themselves quantum objects. And by cooling the atoms down we remove noise and we’re able to utilize their quantum nature for a variety of applications. So, it’s one core technology, but with applications to compute, to quantum signal processing, to quantum sensing, to extremely sensitive quantum clocks. We’re addressing a wide array of applications and use cases and technologies but with a single core underlying ‘qubit’ if you will.”
Here is a brief summation of how they can create so many disparate devices from a single, core technology [Hemsoth, 2021]:
Each device/application begins with a basic glass cell (see examples pictured below)
The cell is evacuated to an ultra-high vacuum (UHV)
It’s then filled with atoms of a single element
Lasers are used to “trap” the individual atoms, which makes them cold, which in turn allows them to take on quantum properties
Other lasers then arrange the atoms in specific configurations, depending on the application. For example, a checkerboard-type arrangement is used to create qubits, counter-rotating atoms create gyroscopes and linear configurations can be used for accelerometers, etc.
For quantum computing, additional lasers are used to further manipulate the atoms for computational purposes.
ColdQuanta uses this general configuration for two classes of products. One, manifested in its “Albert” quantum matter design platform, is used for quantum sensing and related applications and the other, referred to as “Hilbert” is used for quantum computing.
Albert/Sensing Devices
ColdQuanta has been selling its various quantum sensing devices and components for many years, to notable customers like the Office of the Under Secretary of Defense for Research & Engineering (OUSD R&E) which awarded ColdQuanta a $1.8 million contract, the Defense Advanced Research Projects Agency (DARPA) which awarded a $3.6 million contract and a variety of UK government initiative awards totaling $3.5 million.
In addition to selling quantum sensing products and components, ColdQuanta offers its “Albert” quantum matter design platform via cloud access. Users of the beta platform can now remotely create and manipulate Bose-Einstein Condensate (BEC) on a quantum platform enabling them to control and arrange the quantum state of Albert to define its dynamic behavior and then capture and evaluate the results to accelerate research and refine designs.
“Albert” is the showcase quantum matter design platform ColdQuanta offers, encompassing its capabilities around quantum sensing. The key to ColdQuanta’s ability to leverage its cold atom system for quantum sensing is rooted in two important properties, among others. The first is the ability to place the individual atoms in a superposition (a fundamental quantum mechanical feature) and then measure the atoms to track ultra-minute changes, thereby “sensing” various factors (e.g., time) with exquisite precision. The other is adding energy to the atoms to place them into a Rydberg excitation, which significantly increases the “size” of the configuration (the inserted energy expands the outer orbit of the electrons, thereby stretching or enlarging the overall size of the atoms), creating a tunable dipole. This configuration is then extremely sensitive to radio frequency (RF) changes.
Examples of how users could create applications with Albert include the following:
Quantum sensing underpins a number of important products including:
Atomic Clocks
Sensors
Gyroscopes
Accelerometers
Gravimeters
These devices, in turn, enable or improve important applications such as:
GPS Resilience
Aircraft Guidance
Power Grids
Cell Towers
Financial Trading Systems
Autonomous Vehicles
Navigation Systems
ColdQuanta’s cold atom approach enables the creation of ruggedized, portable and compact systems. In fact, they have successfully operated two different ColdQuanta systems on the International Space Station (ISS). In addition to the UHV cells shown above, ColdQuanta has a broad quantum product offering including the following:
In fact, since its founding in 2007, ColdQuanta has been awarded over $60 million in contracts. Selling these components to other pioneers in the evolving quantum space, should provide meaningful and growing revenues, akin to the selling of “picks and shovels” to the early gold prospectors.
Hilbert
ColdQuanta recently released its cloud-based quantum computer called Hilbert which will reach 100 qubits. Hilbert promises superior error correction, high qubit connectivity (starting at 4:1 but should quickly scale to 8:1 and ultimately closer to 100:1), long coherence times, and high gate fidelity, among other features. In addition, because neutral atoms do not have an electrical charge, they can be packed close together making this method of qubit construction highly scalable and compact. And most importantly, despite the super-cold atoms, the device itself operates at room temperature.
As Paul Lipman noted in a recent press release on Hilbert, “the commercial release of Hilbert marks an important and exciting milestone for ColdQuanta and for the cold atom quantum computing modality. Building on our recent world first in executing algorithms on a cold atom quantum computer, Hilbert demonstrates the power and scalability of atomic qubits and their promise to transform the quantum computing landscape.”
Hilbert supports the Qiskit API and will initially be available in beta to customers through ColdQuanta’s comprehensive multi-tenant cloud platform and soon via the Strangeworks Backstage Pass program (see below for additional details). Integration with public cloud services is expected later this year. The product roadmap calls for Hilbert to scale to 1,000 qubits by 2024 with the same strong connectivity, fidelity, and miniaturization at room temperature.
ColdQuanta Executive Management
ColdQuanta has over 150 employees, including more than 90 physicists, across facilities in Boulder and Louisville, Colorado (USA), Madison, Wisconsin (USA), Oxford (England), and now Chicago, Illinois (USA). Senior management has deep and extensive experience in relevant quantum and technologically adjacent fields, and includes the following:
Scott Faris, CEO: Mr. Faris is an experienced technology executive with over three decades of operating, venture-financing, and scaling experience, including a diverse track record in new-venture investment, technology start-ups and scaling operations, innovation process management, technology commercialization, corporate development and strategy, strategic alliances, and federal and commercial business development. This background seems well matched to the Company’s stage and general capabilities.
Paul Lipman, President, Quantum Information Platforms: Mr. Lipman is an experienced leader in emerging technologies. Lipman is currently a Board Member at the Quantum Strategy Institute (QSI). Most recently, Lipman was CEO of BullGuard, a global leader in AI-driven cybersecurity, which was acquired by Avira/NortonLifeLock. His career experience includes the development of the world’s first IoT cybersecurity solution, Dojo (acquired by Forescout Technologies) as well as leading multiple innovative cybersecurity companies to successful exits. Prior to BullGuard, Lipman was CEO at SASE pioneer iSheriff (acquired by Mimecast). Earlier in his career, he held CEO, GM and executive leadership positions at Webroot, Keynote Systems, Total Defense and Accenture. Based in Silicon Valley, Lipman holds an MBA from Stanford and a bachelor’s degree in Physics from Manchester University in the UK.
Chester Kennedy, President, Research & Security Solutions: Mr. Kennedy has had a career focused on innovative technologies and their impacts on a variety of industries. Kennedy served as Chief Executive Officer of BRIDG from 2015 to 2020, leveraging his aerospace and commercial electronics experience to lead the construction of a microelectronics fabrication facility and the establishment of a robust customer base. Before BRIDG, Kennedy spent 30 years at Lockheed Martin and its heritage organizations, most recently as Vice President and Chief Engineer of Training and Logistics Solutions at Lockheed Martin Mission Systems and Training.
Dana Anderson, Co-Founder and CTO: Dr. Anderson is co-founder and former CEO of ColdQuanta. He is a Fellow of JILA and a Professor in the Department of Physics and Electrical & Computer Engineering at the University of Colorado, as well as Director of Quantum Applied Science and Engineering (QASE) at CU Boulder. Since 1993, he has been involved in guiding and manipulating cold and ultracold atoms. He and his collaborators Professor Carl Wieman and Dr. Eric Cornell (2001 Nobel Laureates in Physics) first demonstrated the guiding of cold atoms through hollow-core optical fibers in the mid-1990s. Drs. Anderson and Cornell performed many of the earliest works guiding cold atoms on an “atom chip,” including the first demonstration of a chip-based atom Michelson interferometer. Professor Anderson’s group demonstrated the first ultracold atom chip portable vacuum system in 2004 and has been heavily involved in DoD-funded activities to develop ultracold atom chip technology. Dana received his Ph.D. from the University of Arizona and his undergraduate degree from Cornell.
Mark Saffman, Chief Scientist for Quantum Information: Dr. Mark Saffman, Professor of Physics at the University of Wisconsin-Madison, is a preeminent expert in neutral atom quantum computing. He is an experimental physicist working in the areas of atomic physics, quantum and nonlinear optics, and quantum information processing. In 2010, his research team was the first to demonstrate a quantum CNOT gate and entanglement between two trapped neutral atom qubits. Mark has been recognized with an Alfred P. Sloan fellowship and the Vilas Associate Award from the University of Wisconsin-Madison, and is a fellow of the American Physical Society and the Optical Society of America. He worked as a Technical Staff Member at TRW Defense and Space Systems and subsequently as an Optical Engineer at Dantec Electronics Inc. in Denmark before returning to graduate school to earn his Ph.D. in Physics from the University of Colorado at Boulder. Mark received his B.Sc. with honors in Applied Physics from the California Institute of Technology.
Competition
While there are a number of competitors in the quantum sensing space, most are early stage, less well capitalized, or have a narrow product focus. Certainly, some of these companies, such as Qnami (imaging and diagnostics), Innatera Nanosystems (medical sensors), Spiden (medical sensors) and QDTI (medical sensors), may develop strengths in a narrower field within quantum sensing, but none match the breadth of ColdQuanta’s offering. Within Quantum Computing specifically, there are a few neutral atom competitors, including Atom Computing, Pasqal and QuEra. Each of these QC competitors is also working toward using neutral atoms as qubits, with Atom Computing having released its 100-qubit Phoenix system in July 2021, Pasqal’s 100-qubit machine coming soon to the Microsoft Azure platform, and QuEra’s planned for release on Amazon Braket later this year. Since all of these neutral atom quantum computers are extremely early in their release (or imminent), it is difficult to assess the feature and benefit differences among them. Additionally, other forms of qubit are already powering various quantum computers made by IBM, Rigetti, IonQ, Quantinuum, and others. Each of these other qubit modalities has various strengths and weaknesses compared to ColdQuanta’s architecture, so it will be interesting to follow the industry and see which platforms garner the strongest following and commercialization momentum. That said, in this early stage of QC, there are broad opportunities for early movers, including ColdQuanta, to gain traction and ultimately customers.
Funding
ColdQuanta has completed a series of investment rounds totaling $68.75 million. Its original funding in 2017 came via a $12 million grant from the Small Business Innovation Research (SBIR) program. This was followed by a $6.75 million seed round in 2018, led by Maverick Ventures and joined by Global Frontier Investments (each receiving board seats). Several additional grants and seed investments followed in 2019 and 2020, and a $32 million Series A round was completed in late 2020, with existing investors participating, joined by Foundry Group and Lennox Capital Partners. In 2021, a later-stage venture round of $20 million was completed, with GrayArch Partners and the Wisconsin Alumni Research Foundation joining the existing investors. Overall, this represents a well-capitalized enterprise supported by some prominent names in venture investing.
Collaborations, Partnerships and a Recent Acquisition
In the prior post on Collaboration Dominating Quantum Computing, some of ColdQuanta’s various partnerships and collaborations were highlighted, so readers of that post may recognize some of the following details:
Super.tech
In May of this year, ColdQuanta announced the acquisition of Super.tech, a startup known for its innovations in quantum software. Super.tech, co-founded by Pranav Gokhale and Fred Chong, is a member of the first cohort of the Duality incubator, run by the Chicago Quantum Exchange and the University of Chicago. Super.tech has developed SuperstaQ, which enables users to write quantum programs in any source language and target any quantum computer, providing API endpoints that enable deployment of quantum solvers for practical applications without needing any quantum experience. It has also created the SupermarQ suite of quantum computing benchmarks. Super.tech will become ColdQuanta’s Chicago office, and Gokhale, Chong, and the other dozen or so Super.tech employees will join the ColdQuanta staff.
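To illustrate the “write once, target anywhere” pattern SuperstaQ describes, here is a hedged sketch. The client and method names in the comments are placeholders I have assumed for illustration, not the actual SuperstaQ SDK:

```python
# Hypothetical sketch of the SuperstaQ pattern described above: write a
# circuit once, then retarget it across machines. The client, class and
# method names below are placeholders assumed for illustration, not the
# real SuperstaQ SDK.
from qiskit import QuantumCircuit

qc = QuantumCircuit(2, 2)
qc.h(0)
qc.cx(0, 1)
qc.measure([0, 1], [0, 1])

# client = SuperstaQClient(api_key="...")                   # placeholder
# for target in ("simulator_a", "hardware_b"):              # placeholder names
#     job = client.submit(qc, target=target, shots=1000)
#     print(target, job.counts())
```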
ColdQuanta/Classiq
In January of this year, ColdQuanta and Classiq announced a partnership to make 100-qubit quantum circuits a reality for companies and researchers seeking quantum computing solutions. The partnership combines ColdQuanta’s cold atom quantum computers with Classiq’s quantum algorithm design software. Together they aim to provide customers with the ability to create, simulate and execute unique quantum circuits to address a wide range of finance, materials science, supply chain, and machine learning challenges. As Nir Minerbi, CEO of Classiq, noted, “as the industry moves from toy problems solved by toy circuits running on small quantum computers to solving real problems that require complex circuits on larger quantum computers, there is an acute need for a high-level platform to develop these circuits quickly and efficiently.” By entering into this partnership now, the companies should be well positioned to scale together as ColdQuanta releases larger QCs in the future.
ColdQuanta/Strangeworks
This past December, ColdQuanta and Strangeworks announced the addition of the forthcoming Hilbert Quantum Computer to the Strangeworks Ecosystem. Hilbert will be available for early access by select members of the Strangeworks Backstage Pass program, with general availability later this year. As noted above, the Backstage Pass program is a vital tool for early development and evaluation of new QC capabilities, and ColdQuanta is benefiting from important feedback in advance of its broader public release. Think of it as a beta release made available to a curated set of users, who can therefore provide deep insights into the system’s strengths and weaknesses.
ColdQuanta/IBM Q
In May of last year, ColdQuanta announced that it had joined the IBM Quantum Network and would be integrating IBM’s Qiskit open-source software development kit (SDK). ColdQuanta plans to make its Hilbert QC available via IBM Q, IBM’s quantum network; combined with the Qiskit integration, this will enable ColdQuanta customers to accelerate their quantum computing initiatives. The companies also noted that they will pursue joint development opportunities with the goal of accelerating the adoption of other quantum technologies.
Learning More
For quantum enthusiasts and investors seeking to learn more about ColdQuanta and its Albert and Hilbert platforms, I encourage you to visit their website and sign up for updates. They are also highly active in the various quantum conferences held throughout the year, so you can learn more by speaking with them at any of those in-person and/or online events. ColdQuanta is quite active on social platforms as well, including LinkedIn, Twitter, Facebook and YouTube, and I encourage you to follow them on any or all of those channels.
For prospective customers interested in their sensing devices, it’s easy to create an Albert Beta Account here, or review some of their Albert resources and documentation here. For more information about their various sensing products, they maintain product details here. For details about their Hilbert universal and scalable Quantum Computing platform, visit here.
Summary
ColdQuanta has a solid team, protective IP, a highly regarded product portfolio, a strong balance sheet, and now a quantum computing platform. It has diverse customers and legacy revenues, and should enjoy synergies both between its two broad “quantum” offerings (sensing and computing, which both leverage neutral atom configurations) and across its Quantum Computing hardware platform and recently acquired software platform. A later start in Quantum Computing and a broad geographic footprint present a few modest headwinds.
Rating
Apropos of the probabilistic nature of quantum algorithms, I wanted to leverage the nomenclature to create a company rating system and assign a scale to my overall assessment of a company’s potential. Accordingly, I am going to use the formula below when reviewing companies, whereby the “alpha” coefficient correlates with “positivity” (and the formula adheres to the Born rule).
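Sketched in state-vector form (my reconstruction of the idea, with “exceptional” and “poor” as assumed basis labels):

$$|\text{rating}\rangle = \alpha\,|\text{exceptional}\rangle + \beta\,|\text{poor}\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1$$

Under the Born rule, the probability of the “exceptional” outcome is $|\alpha|^2$, so an Alpha of 0.95 corresponds to roughly a 90% weighting toward “exceptional.”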
Given my overall assessment of ColdQuanta including its strong IP, broad and complementary offering, and prestigious existing customers (and revenues), I am assigning the highest rating to ColdQuanta at this time, with an Alpha of 0.95 which equates to an “Exceptional performance expected.” When I began researching the Company, I had originally considered an evaluation one notch lower due to the non-availability at that time of their Hilbert Quantum Computer, but the release of the device provided the added impetus, in my view, to award this highest rating.
Disclosure: The author has no beneficial positions in stocks discussed in this review, nor does he have any business relationship with any company mentioned in this post. The views expressed herein are solely the views of the author and are not necessarily the views of Corporate Fuel Partners or any of its affiliates. Views are not intended to provide, and should not be relied upon for, investment advice.
References:
Lipman, Paul. Interviewed by Russ Fein, April 27, 2022, and March 11, 2022.
Hsin-Yuan (Robert) Huang, Michael Blythe Broughton, Jordan Cotler, Sitan Chen, Jerry Li, Masoud Mohseni, Hartmut Neven, Ryan Babbush, Richard Kueng, John Preskill, and Jarrod Ryan McClean, “Quantum advantage in learning from experiments”, arXiv:2112.00778, December 3, 2021.
If you enjoyed this post, please visit my website, and enter your email to receive future posts and updates: http://quantumtech.blog
Russ Fein is a private equity/venture investor with deep interests in Quantum Computing (QC). For more of his thoughts about QC please visit the link to the left. For more information about his firm, please visit Corporate Fuel. Russ can be reached at russ@quantumtech.blog.
In a prior post entitled “The Case for an Annual ‘Quantum Games’ Competition”, I described how the amount of innovation and technical advancement in Quantum Computing (QC) has been incredible over the past 12 months or so, but also how challenging it is to compare machine performance. Should we focus on who has the “most qubits”? [hint: the answer is no] Or the highest “quantum volume”? Or which can run the longest before decoherence? Or should we focus on #AQ, as IonQ has suggested? How about SupermarQ or the QED-C’s proposed alternative benchmarks?
So, I suggested an annual “Quantum Games” or world Olympics to spur innovation and friendly competition. I volunteered to be a judge so that I could enjoy a front-row seat to watch the competitors give their best to the challenges. I was thrilled that Classiq has since created the “Classiq Coding Competition” and honored to be a judge in this recently announced competition. Let me explain why I’m so excited, but first some more details on the Competition.
Classiq’s Worldwide Competition – a $25,000 Challenge to Build the Best Quantum Circuits
As Nir Minerbi, the Classiq CEO, noted in the Contest announcement, “Creating efficient quantum algorithms is part engineering, part art. The Classiq Coding Competition is a call to the world’s quantum software community to showcase their talents and demonstrate how quantum computing can take humans to new heights. Efficient circuits enhance the ability of any quantum computer to solve important problems.”
Minerbi went on to add, “You would be surprised how much can be achieved with compact, efficient circuits. The onboard computer used in the Apollo 11 space mission got a man to the moon using just 72 kilobytes of ROM. Quantum computing is taking off, and the need to create elegant and efficient quantum algorithms will exist for years to come. Organizations that manage to fit larger problems into available computers will reap their quantum benefits sooner than others. The Classiq Coding Competition will encourage the creativity and ingenuity required to make this happen and highlight the art of the possible (emphasis added) in compact, efficient circuits.”
The Competition includes four problems or challenges briefly described below:
Log-Normal State Preparation: Many quantum algorithms rely on initializing qubits in a specific state, and the promised speedup of an algorithm depends on the ability to prepare that quantum state efficiently. Success metric: shortest circuit depth that achieves an error below 0.01.
Kakuro (a Constraint Satisfaction Problem): Kakuro is a logic puzzle played on a grid of cells; the challenge is to solve the puzzle using Grover’s algorithm. Success metric: minimized number of 2-qubit CX gates.
Decomposing a Multi-Controlled Toffoli Gate: Decompose an MCX gate with 14 control qubits, one target qubit, and up to five auxiliary qubits. Success metric: shortest circuit depth.
Hamiltonian Simulation: Simulate the evolution of molecules and solid-state systems by solving the Schrödinger equation; quantum computers enable such simulation in a scalable manner. Success metric: shortest circuit depth using only CX and single-qubit gates and up to 10 total qubits, with an error below 0.1.
While the descriptions of these problems will not resonate with readers unfamiliar with quantum algorithm construction, suffice it to say that they form a disparate and varied set of challenges with many possible solutions, which is part of what makes the competition so interesting.
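As one illustration of the kind of tinkering these problems invite, the multi-controlled Toffoli challenge can be attempted directly in Qiskit using its built-in MCX decomposition modes. The sketch below is only a naive baseline, assuming Qiskit’s standard mcx method; competition entries would hand-craft far shallower circuits:

```python
# A naive baseline for the multi-controlled Toffoli challenge, using
# Qiskit's built-in MCX decomposition modes. This only shows the
# ancilla/depth trade-off; it is nowhere near competition-winning depth.
from qiskit import QuantumCircuit, transpile

n_controls = 14
qc = QuantumCircuit(n_controls + 2)   # 14 controls + 1 target + 1 ancilla
qc.mcx(
    control_qubits=list(range(n_controls)),
    target_qubit=n_controls,
    ancilla_qubits=[n_controls + 1],
    mode="recursion",                 # recursive decomposition with one clean ancilla
)

decomposed = transpile(qc, basis_gates=["cx", "u"], optimization_level=3)
print("circuit depth:", decomposed.depth())
```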
Here are a few more highlights of the Competition:
Submission Deadline: June 5, 2022 (by midnight US Eastern time)
Cash prizes totaling $25,000: $3,000 to the Gold medal winner, $1,500 to the Silver medal winner, and $500 to the Bronze medal winner for each of the four problems, plus $1,000 to each of two select winners in the “Youth” category (aged 18 and under) and $1,000 to each of the three most innovative solutions
Multiple submissions are allowed
Submissions may rely on any preferred framework (e.g., Qiskit, Pennylane), however the Classiq platform may not be used
For those unable to receive a cash prize, Classiq has designated several worthy charities where the prize can be donated
A panel of five judges will review submissions
For more details, please visit the Classiq Competition page here
There has been significant early enthusiasm for the Coding Competition, with about 250 registrations received as of today from all over the world, and more than 30 solutions already submitted even though the contest remains open for another 12 days.
Why is This Exciting for Non-Coders?
Nir’s quote about “highlighting the art of the possible” is why I’m so excited and why you should be too. Some think of computer programs as rigid, dull instructions that either solve a problem or return an error. However, the reality is that programming is a true art, and there are nuanced ways of sequencing, connecting and interweaving programming commands. This is especially true with Quantum Computing in the early “NISQ” (Noisy Intermediate-Scale Quantum) era, where programmers need to deal with a few additional challenges beyond simply programming the solution. These include:
Building in error correction: Because of the noise inherent in current QCs, many of the available physical qubits must be used for error correction overhead. Handling this in an efficient manner is quite challenging, especially with the limited numbers of qubits available today.
The probabilistic nature of quantum algorithms: Quantum programs, or algorithms, are not simply run once with the program returning an answer. Quantum outcomes are given by probabilities, so algorithms are generally run many times, each run being referred to as a “shot”. Often 10,000 shots are performed before an answer is determined, and the program needs to “reset” the qubits between shots so the next shot can begin (see the short sketch after this list).
Not all qubits are created equal: Some QCs limit which qubits can “entangle” with which others; some qubits can only entangle with their nearest neighbors, while others are more flexible. Some QCs have qubits that can maintain their state longer and so can run deeper algorithms. In some instances, certain qubits within a given QC are less reliable than others, and there are other peculiarities among different QCs. Programmers need to adapt their programs to factor in these differing characteristics.
Machines and Development Kits vary: There are a number of different quantum computers available via the cloud, each with differing constraints and capabilities. In addition, there are a variety of development kits and programming environments (e.g., Qiskit, Q#, Cirq, Strawberry Fields, Forest).
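To ground the “shots” point above, here is the workflow in miniature, using Qiskit’s bundled simulator as a stand-in for real hardware:

```python
# The shot-based workflow in miniature: prepare, measure, repeat, tally.
# Qiskit's simulator stands in for real hardware here.
from qiskit import QuantumCircuit, Aer, execute

qc = QuantumCircuit(1, 1)
qc.h(0)            # 50/50 superposition
qc.measure(0, 0)   # each shot collapses to 0 or 1; qubits reset between shots

backend = Aer.get_backend("qasm_simulator")
result = execute(qc, backend, shots=10000).result()   # 10,000 shots, as in the text
print(result.get_counts())  # e.g. {'0': 4987, '1': 5013}: an estimated distribution
```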
This is not meant to be a technical primer on programming QCs but rather is intended to showcase how much “art” there is in programming. For those interested in learning more about these various constraints and challenges I encourage you to read Yuval Boger’s (Classiq’s CMO) excellent post on this topic here.
Quantum Computers are still quite early in their development. Most available machines have limited numbers of qubits and varying strengths and weaknesses regarding coherence, connectivity and processing speed. Ultimately, Quantum Computers will only be as valuable and impactful as the programs written for them, so seeing creative, outside-the-box approaches to these early, noisy, faulty machines will be quite revealing. I’m excited to get a glimpse into the varying approaches entrants use, and the creative methods they employ, to overcome some of the challenges.
I look forward to reviewing the submissions in detail with my illustrious co-judges and plan to provide some insights about the competition in a future post. In the meantime, I encourage you to enter the contest and/or tell your friends and colleagues about it. It’s a great chance to test your skills, perhaps win some money, and most importantly, earn the bragging rights as a Medalist in the first ever Quantum Computing Coding Competition.