In January of 2022 The Quantum Leap published a profile on Quantinuum (see original post here), where it was assigned the highest rating with an Alpha of 0.95, which equates to “Exceptional Performance Expected”. Earlier this month I was invited to the official launch of Quantinuum’s newest quantum computer, the System Model H2, and was quite impressed with the Company’s progress, so I am pleased to say that Quantinuum continues to earn this highest rating (see the “Rating” section for further details).
Background
In October of 2020, Honeywell Quantum Solutions (HQS), a division of Honeywell, introduced its first quantum computer, the System Model H1, featuring 10 fully connected qubits and a quantum volume of 128, the highest reported at the time (surpassing IBM’s prior record of 64). In June of 2021, after a series of successful collaborations, Cambridge Quantum Computing (CQC) reached an agreement to merge with HQS. In November of 2021, Honeywell spun out HQS to formally combine the businesses into a new stand-alone company called “Quantinuum” (the Company) and capitalized it with $300m. In December of 2021, the Model H1-2 passed the quantum volume benchmark of 2,048, a new global record and consistent with the Company’s stated timeline of annual 10x increases in quantum volume. At the time of Quantinuum’s formation, a long-term roadmap was published, and on May 9th the Company announced the release of the Model H2, meeting the previously committed timeline for this model.
System Model H2 Commercial Release
I attended the Quantinuum press/analyst day earlier this month at their headquarters in Broomfield, Colorado, where the Company announced its latest quantum machine, the System Model H2. Quantinuum uses ytterbium ions to create its trapped-ion quantum computer. The new machine builds on the success of the previous models and features a novel “racetrack” design (which can be seen in the center of the photo below right) and some impressive, industry-leading performance metrics:
32 qubits
Single-qubit (1Q) gate fidelity: 99.997%
Two-qubit (2Q) gate fidelity: 99.8%
State preparation and measurement (SPAM) fidelity: 99.8%
Crosstalk error: 0.0005%
All-to-all connectivity
Mid-circuit measurement
Qubit reuse
Long coherence times
The day included the unveiling of the actual H2, an engineering marvel, as well as unfettered access to virtually the entire management team and the leading scientists involved with H2. It was an impressive and informative day, and I was especially struck by the transparency the Company exhibited, both with access to its team members as well as via the benchmarking of the H2’s performance metrics.
The all-to-all connectivity, facilitated by their quantum charge-coupled device (QCCD) architecture, with 32 qubits that have impressive fidelities and decent coherence times, has enabled Quantinuum to do some remarkable things including:
Achieving a record quantum volume of 65,536 (2¹⁶);
Creating a 32-qubit GHZ state (a non-classical state with all 32 qubits globally entangled), the largest on record; and
Creating non-Abelian anyons, a new state of matter, for the first time ever.
The Company plans to expand the H2 to 50 qubits sometime next year. At that level, it should begin to perform computations that are beyond the reach of classical simulation on even the most powerful supercomputers. This would be no small feat and makes the broader quantum computing leap into new realms feel imminent.
Quantinuum has had an over-arching philosophy, as emphasized by Ilyas Khan, founder and chief product officer, in the press release accompanying the formal announcement of the H2: “that when incredible tools are given to brilliant people, they will find something amazing to do…”. Quantinuum had already been working with certain clients on the H2, including JPMorgan Chase, which published a paper on using the machine for portfolio optimization. The H2 is now available via cloud-based access directly from Quantinuum and will be available through Microsoft Azure Quantum beginning next month. I am excited to see what amazing and creative things users are able to do with this new quantum computing power.
Company Philosophy Around Transparency
As noted above, Quantinuum provided access to much of their senior team and encouraged direct communications. I was able to engage in meaningful conversations with key employees including Dr. Russell Stutz, Director of Commercial Products; Dr. Steve Sanders, Senior Director for Hardware Technology Development; Dr. Patty Lee, Chief Scientist for Hardware Technology Development; Dr. Chris Langer, Fellow and Chairman of the Technology Board; Dr. Jenni Strabley, Senior Director of Offering Management; Dr. Brian Neyenhuis, Commercial Operations Group Leader; and of course Raj Hazra, CEO, and Tony Uttley, President and COO, among others. The team was engaging and shared many insights and descriptions of their process and achievements.
In addition to personnel, the team provided access to a wealth of academic papers and detailed presentations, including the performance metrics from the Company’s own deep dive into the machine’s capabilities and limits. This included a presentation by Dr. David Hayes, Senior Manager for Architecture and Theory, who was introduced by Tony Uttley as the person they gave the directive to “kick the tires as hard as you possibly can on this [new H2] system so that when we release it to the world we can say ‘here are all the specs, everything that you can do under all different conditions’ and then make that transparent to the entire community.” During Dr. Hayes’ portion of the presentations, he shared their benchmarking approach, which was summarized and published in a detailed 24-page report. They dug into the component/processor metrics (i.e., one- and two-qubit gate performance), the system-level performance (i.e., complex logic operations) and algorithm/application performance. Below are some highlights:
Of note was their robust approach to benchmarking and performance metrics. Rather than just cherry-picking favorable results, they presented deep and wide methodologies for performance measurement and made it all publicly available. This transparency is an important philosophy for the industry and is the best way to separate performance from hype. As the Company noted, they hope this unprecedented level of openness becomes a new standard for Quantum Computing. For those interested in the specific benchmarking results, they are published here.
Some Details and Commentary around H2 Achievements
Some of the announced achievements of the H2 were noted on page 2 and I expect additional achievements will be uncovered as broader access to the H2 is provided next month. Below are some added details around the announced capabilities:
A record quantum volume of 65,536: Quantum computers are difficult to compare, so “quantum volume” was devised as a single number designed to show the overall performance of a quantum computer. It is a measurement, not a calculation, and takes into account several features of a quantum computer, including the number of qubits, gate and measurement errors, crosstalk and connectivity. Roughly speaking, it is an exponential of the circuit size. IBM was the first company to espouse this metric, achieving a quantum volume of 32 on its 28-qubit “Raleigh” QC in 2020. As recently as October 2022, Quantinuum set a quantum volume record of 8,192, which it increased to 32,768 on the H1-1 in February of this year and then to 65,536 on the H2 this month. This >2,000x improvement in quantum volume in less than three and a half years suggests a very rapid scaling in QC capabilities.
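For readers who like to see the arithmetic, here is a tiny Python sketch (my own illustration, not Quantinuum’s or IBM’s benchmarking code) showing how the headline quantum volume number maps back to the size of the “square” random circuits a machine must run successfully:

```python
import math

# Quantum volume is reported as 2^n, where n is the largest "square" random
# circuit (n qubits, n layers of gates) the machine can run while producing
# "heavy" outputs more than two-thirds of the time.
def circuit_size_from_qv(quantum_volume: int) -> int:
    """Return the side length n of the largest square circuit implied by a quantum volume."""
    return int(math.log2(quantum_volume))

for qv in (32, 8_192, 32_768, 65_536):
    n = circuit_size_from_qv(qv)
    print(f"QV {qv:>6} -> {n} qubits x {n} layers")
# QV 65,536 corresponds to successfully running 16-qubit, 16-layer random circuits.
```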
32-Qubit GHZ State: In physics, a Greenberger-Horne-Zeilinger state (GHZ state) is a certain type of entangled quantum state that involves at least three subsystems. This GHZ metric is a demanding test of qubit coherence that is widely measured and reported across a variety of quantum hardware and becomes increasingly difficult to achieve as qubit count increases. The verification of the entangled GHZ state across all 32 qubits confirmed the all-to-all connectivity of the platform, which, combined with ultra-precise control mechanisms, enabled the team to achieve a globally entangled 32-qubit state with a fidelity of 82%, setting a new world record.
Creation of Non-Abelian Anyons: This was perhaps the least expected performance metric, given that it represents an entirely new form of matter, one which had been theorized but never before shown experimentally. Interestingly, both Quantinuum and Google announced this breakthrough achievement days apart (see Google announcement here), and Microsoft has long been working on topological qubits. While the physics describing this state are beyond the scope of this write-up, there are two important implications of this achievement:
Anyons are exotic quasi-particles that, as Quantinuum noted, “can theoretically store quantum information in their internal states which can only be changed by “braiding” them around each other in spacetime. Small perturbations in the trajectory of these braids would then leave the topology of the braid unchanged, making this paradigm inherently robust. It is as if they are “deaf” to the noise of the system.” It has been suggested that this could be a viable system for creating universal, fault-tolerant quantum computers; the problem was that non-Abelian anyons had never been detected, much less controlled, until now.
Given that this is a new form of matter, this discovery/achievement also has the potential to pave new paths for research within condensed matter and high-energy physics, akin to the discoveries of the Large Hadron Collider.
Summary
There are more than 25 different quantum hardware companies working on creating Quantum Computers, built on at least 6 different core modalities (superconducting, trapped ions, neutral atoms, photonics, etc.). While it is still unclear which modality and which company will eventually create an enterprise-grade Quantum Computer, Quantinuum has shown that it is not afraid to publish its roadmap and, to date, has proven it can meet its release and performance goals. This transparency is both refreshing and vital in these early stages of the NISQ (noisy intermediate-scale quantum) era. The Company has continued to post impressive performance metrics, has met its development timeline, and has now introduced non-Abelian anyons as a possible tool for fault-tolerant quantum computing, all of which confirms that Quantinuum is currently in a very strong market position.
Quantinuum released its new 32-qubit H2 machine with a commitment to increase this to 50 qubits by next year. As noted above, a 50-qubit quantum computer with all-to-all connectivity and high fidelities will begin to exceed the simulation capabilities of classical computers. They are also well on their way to additional H-model machines, including the H3, which will feature a grid-like architecture. This increasingly powerful quantum computing platform, combined with Quantinuum’s close association with beta customers and the availability of the H2 to all via Azure Quantum in the coming weeks, will allow users to push the H2’s capabilities to their limits and begin to obtain actionable results. I expect Quantinuum will continue to share important details, which I look forward to following and reporting on.
Rating
Apropos of the probabilistic nature of quantum algorithms, The Quantum Leap has created a company rating system to provide an overall assessment of a company’s potential. Accordingly, it uses the formula below when reviewing companies, whereby the “alpha” coefficient correlates with “positivity” (and the formula adheres to the Born rule). Given the overall assessment of Quantinuum, including its strong position as a full-stack player, the strengths of the legacy businesses, the latest record-setting achievements, and most importantly, its ability to continue to hit previously stated performance targets, I am confirming the highest rating for Quantinuum, with an Alpha of 0.95, which equates to “Exceptional Performance Expected”.
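For readers encountering the rating system for the first time: the original post renders the formula as a graphic, but its likely form, consistent with the Born-rule description above (this reconstruction is my own sketch, not the blog’s exact notation), treats the rating as a two-outcome quantum state:

$|\psi\rangle = \alpha\,|\text{success}\rangle + \beta\,|\text{underperform}\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1$

so an Alpha of 0.95 corresponds, per the Born rule, to a “success probability” of |α|² ≈ 0.90.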
Disclosure: The author does not have any business relationship with any company mentioned in this post. The views expressed herein are solely the views of the author and are not necessarily the views of Corporate Fuel Partners or any of its affiliates. Views are not intended to provide, and should not be relied upon for, investment advice.
Presenters: Hazra, Raj; Uttley, Tony; Dreyer, Dr. Henrik; Amaro, Dr. David; Hayes, Dr. David; Stutz, Dr. Russell; Sanders, Dr. Steve. Quantinuum Press and Analyst Event, May 3, 2023.
If you enjoyed this post, please visit my website and enter your email to receive future posts and updates: http://quantumtech.blog
Russ Fein is a private equity/venture investor with deep interests in Quantum Computing (QC). For more of his thoughts about QC please visit the link to the left. For more information about his firm, please visit Corporate Fuel. Russ can be reached at russ@quantumtech.blog.
During the height of the cold war, when the fear of a nuclear attack by the USSR was palpable, a large number of important government agencies were established in and around Boulder, Colorado. Given its position just east of the Rocky Mountains (often referred to as the Front Range), the area was thought to be protected from nuclear missiles launched from Russia. This led to several important national labs and government agencies tracing their origins to the shadows of the Flatirons and the 50+ “14ers” (peaks exceeding 14,000 feet in elevation) abutting them. Today, some of those entities still thrive, including the National Institute of Standards and Technology (NIST), the National Oceanic and Atmospheric Administration (NOAA) and the National Center for Atmospheric Research (NCAR), which were formed via the consolidation of legacy agencies, many of which were established in Boulder and nearby communities during the Eisenhower administration.
These common roots, and the important technologies they developed, seeded a rich technological ecosystem in the area. As some readers of The Quantum Leap may know, I’ve long been intrigued by the interconnected quantum ecosystem. A post entitled “Collaboration is Dominating Quantum Computing” covered some of the dynamics and synergies among Classiq, Strangeworks, Infleqtion (f/k/a ColdQuanta) and others. And if you’ll permit a small personal digression, I recently relocated to the Boulder area and have been struck by the vibrant quantum footprint here, so I wanted to provide some details on this local quantum environment. Today, Boulder and the surrounding towns enjoy a concentration of quantum-focused government entities, academic institutions, large corporations and growing entrepreneurial firms. Additionally, as Kenna Hughes-Castleberry, Science Communicator for JILA and a science and technology journalist, noted, “Colorado boasts a highly educated workforce, with over 42% of its population holding a bachelor’s degree or higher. While many quantum players within the state are smaller start-ups, we also have branches of Microsoft and Google in Boulder and Denver, adding more layers to the diverse technological ecosystem.” There are many obvious synergies, such as having the universities produce qualified quantum professionals and spin out technologies and companies, having the commercial players offer jobs to graduates, and cooperation and trade among neighboring companies. Given the highly complex nature of quantum science, and its position as a bleeding-edge technology, it makes sense that certain companies specialize in very specific areas so that collaboration among companies can enable broader approaches to physics challenges. [Note to readers: If I’ve missed any local quantum companies, please contact me and I will add them to the post as appropriate].
NIST (National Institute of Standards and Technology): is a non-regulatory federal agency within the U.S. Department of Commerce. For more than a century, NIST has helped to keep U.S. technology at the leading edge. Their measurements support the smallest of technologies to the largest and most complex human-made creations. NIST’s mission is to promote U.S. innovation and industrial competitiveness by advancing measurement science, standards and technology in ways that enhance economic security and improve our quality of life — from nanoscale devices so tiny that tens of thousands can fit on the end of a single human hair up to earthquake-resistant skyscrapers and global communication networks.
Congress established the agency to remove a major challenge to U.S. industrial competitiveness at the time — a second-rate measurement infrastructure that lagged behind the capabilities of the United Kingdom, Germany and other economic rivals. From the smart electric power grid and electronic health records to atomic clocks, advanced nanomaterials and computer chips, innumerable products and services rely in some way on technology, measurement and standards provided by NIST.
Even before “quantum” was the catchall descriptor, this area was home to crucial “AMO physics” (atomic, molecular and optical) research. In fact, the very first quantum computing gate was realized nearly 30 years ago by Chris Monroe and David Wineland at NIST, where they demonstrated a CNOT gate on an early trapped-ion computer, work which contributed to Wineland’s Nobel Prize in Physics in 2012. His colleague Jan Hall has been doing AMO/quantum research at CU Boulder since 1964 and was awarded a Nobel in 2005, largely for the creation of the optical frequency comb, which has been instrumental in advancing atomic clock precision. The 2001 Nobel Prize in Physics went to CU Boulder-affiliated researchers Carl Wieman and Eric Cornell for the creation of the Bose-Einstein condensate, a new type of matter which helped scientists better understand quantum behavior. Additional Nobel Prizes have been earned by NIST researchers Dan Shechtman (2011 in Chemistry for quasicrystals) and Bill Phillips (1997 in Physics for laser cooling). This concentration of Nobel talent and research has helped spur the current Boulder-centric quantum ecosystem, and many of the co-workers and collaborators of these Nobel winners have gone on to form, or take important roles in, local quantum companies.
In addition, the Quantum Physics Division of NIST is part of JILA, the joint research and training institute between NIST and the nearby University of Colorado Boulder (CU Boulder).
JILA:
Founded in 1962, its name was originally an acronym for “Joint Institute for Laboratory Astrophysics”. However, for many years JILA has been a world-renowned and award-winning physics institute delving into frontier-bending research in quantum information science & technology, atomic & molecular physics, laser physics, biophysics, chemical physics, nanoscience and precision measurement, so the name is no longer treated as an acronym.
Collaborations among JILA Fellows, JILA research associates and students, CU Boulder professors, NIST staff members, and other world-leading scientists from around the globe play a key role in generating JILA’s renowned pioneering research. JILA’s CU members hold faculty appointments in the Departments of Physics; Astrophysical and Planetary Science; Chemistry and Biochemistry; and Molecular, Cellular, and Developmental Biology as well as Engineering. JILA’s Quantum Physics Division of NIST members hold joint faculty appointments at CU Boulder in the same departments.
Many of the quantum companies in the area such as Vescent Photonics, Octave Photonics, Stable Laser Systems, KM Labs, ColdQuanta and others can trace their roots directly to NIST and/or JILA.
Regional Academic Institutions
University of Colorado Boulder: As noted above, CU Boulder has strong direct ties with NIST, which is part of the reason for its leading reputation in quantum science. CU is well positioned to be a national leader in quantum research and education, and its physics department is ranked among the top 15 in the world (per the Academic Ranking of World Universities, 2020). Heather Lewandowski, Associate Chair of the Physics department at CU Boulder and Fellow at JILA, noted: “We are partnering with many of the local quantum related companies to both enhance the research and to bring additional opportunities to our students. For instance, we have a new capstone course, where students work on a team on a project sponsored by a local company. The students get insights into what it’s like to work in the quantum industry, and are able to develop the skills to make them well prepared to enter the workforce.” The University has also established CUbit (pronounced “Q-Bit”), an interdisciplinary hub that reinforces Colorado’s prominence in quantum information science and technology by partnering with regional universities and laboratories and linking closely with quantum-intensive companies. Through CUbit, CU Boulder has a strong focus on quantum research in sensing & measurement, networks & communication, and computing, which it leverages through:
Quantum Systems through Entangled Science and Engineering (Q-Sense): led by CU Boulder in partnership with seven universities, three national labs and NIST. Prominent quantum researchers collaborate to explore how advanced quantum sensing can reveal new fundamental physics, develop and apply novel quantum technologies, provide tools for a national quantum sensing infrastructure, and train a quantum-savvy workforce.
Quantum Systems Accelerator (QSA): a multi-organization initiative established to design and deliver scalable quantum computers within five years.
Joint Quantum Engineering Initiative (JQEI): Faculty from the College of Engineering & Applied Science and scientific staff from NIST Boulder Labs establish and operate a lab cluster at CU Boulder. JQEI empowers research and development to deliver quantum innovations for adoption by industry and use in society.
CU Boulder also offers a “Quantum Scholars Program” which provides a scholarship and learning opportunities connecting with local industry and quantum applications in the Colorado community.
Colorado School of Mines (Mines): Located in Golden, Colorado, Mines boasts a robust Quantum Engineering Program. Mines launched one of the nation’s first quantum engineering programs at the graduate level in the Fall of 2020, with hands-on training on quantum hardware on campus and direct student access through the cloud to Google’s quantum computer. The program offers an undergraduate minor and graduate master’s degrees (thesis and non-thesis), with specialization tracks in quantum hardware and software, as well as professional upskilling via a graduate certificate for experienced engineers and scientists in industry. Doctorates in the quantum engineering program can be obtained in any of six departments on campus, which all contribute jointly to the program. In 2021 Mines received a $3m grant from the National Science Foundation in the form of an NSF Research Traineeship to support the development of rigorous, integrated and interdisciplinary training programs preparing both master’s and doctoral students for careers in the burgeoning field of quantum information science and engineering (QISE), whether for industry, government, academia or national labs. In 2019 the work of Mines’ Mark Lusk and the University of Denver’s Mark Siemens on the possibility of using a laser beam as the medium for quantum science received a $1m grant from the W.M. Keck Foundation. Then in 2021, two physicists at Mines and CU Boulder also received a $1 million grant from the W.M. Keck Foundation to develop a first-of-its-kind quantum simulator that could be used to develop novel materials and, in the future, lead to the development of a high-performance quantum computer. These grants highlight both the importance of quantum research in the area as well as the collaborative nature of these institutions.
University of Denver: Offers Bachelors, Masters and Doctoral degrees in physics. As noted above, in 2019, the W.M. Keck Foundation awarded a grant to fund a collaboration between University of Denver’s Mark Siemens and his colleague Mark Lusk at Mines. This inter-university project focuses on the possibilities of using laser light technology to conduct quantum experimentation at room temperature, rather than at ultra-low temperatures. “This new connection is really exciting for us,” Siemens said. “It could launch a new age of accessibility for quantum science and, ultimately, computing. Imagine doing quantum measurements and calculations with a glorified laser pointer!”
Colorado State University (CSU): located in Ft. Collins, also along the Front Range, CSU still refers to its “quantum” physics department as Atomic, Molecular and Optical Physics, a throwback to this field’s early roots; however, its quantum work is quite current. Professors and researchers are working on laser spectroscopy of trapped ions and other atoms, single-atom detection, ultracold neutral atom plasmas & novel ultracold cooling, atomic clocks, and quantum computing, among other areas.
Atom Computing: Atom Computing’s headquarters and original R&D machine are in California, but the company recently opened a new facility in Boulder where it is creating its production units and has pledged $100 million in investment. Atom has an impressive roster of employees and consultants, including Dr. Ben Bloom, a co-founder and CTO, who has deep connections in the Boulder quantum ecosystem including a PhD from CU Boulder; Dr. Jun Ye, a Scientific Advisor, who is a physics professor at CU Boulder, a Fellow of JILA and NIST, and was recently named a member of President Biden’s National Quantum Initiative Advisory Committee; and Scientific Advisor Dr. Eliot Kapit, who is currently an associate professor of physics and director of quantum engineering at Colorado School of Mines.
Atom Computing is building scalable quantum computers with atomic arrays of optically trapped neutral atoms. They have published some impressive results using their 100-qubit prototype system and are working on their second-generation systems. They are actively collaborating with software and application developers and were recently selected by the Defense Advanced Research Projects Agency (DARPA) to develop a next-generation system through its Underexplored Systems for Utility-Scale Quantum Computing (US2QC) program.
FieldLine: This Boulder-based company was founded by Orang Alem, Svenja Knappe, and Jeramy Hughs, each with NIST and CU Boulder backgrounds. Dr. Knappe continues her affiliation with CU Boulder as an Associate Research Professor. The FieldLine HEDscan system is a non-invasive, wearable magnetoencephalography (MEG) device based on their optically-pumped magnetometer (OPM) sensor technology. Small quantum sensors are placed directly on the head to record and map neural activity with high fidelity. These ultra-high-sensitivity magnetic field sensors are well suited for recording magnetic brain signals for basic neuroscience or clinical diagnosis and should help improve our understanding of diseases such as Parkinson’s or psychiatric disorders. Their lightweight, wearable HEDscan helmets can accommodate people of all ages and head sizes and can be used in any room of any medical facility, without the need for expensive building modifications.
Infleqtion: Infleqtion (f/k/a ColdQuanta) was co-founded by its current CTO Dr. Dana Anderson, who is a Fellow of JILA and a Professor in the Dept. of Physics and Electrical & Computer Engineering at CU Boulder. He was a collaborator with Drs. Eric Cornell and Carl Wieman, who created the first-ever Bose-Einstein Condensate (BEC) at CU Boulder in 1995, a feat for which they were awarded a 2001 Nobel Prize. BEC is a new form of matter, which is created when atoms are cooled close to absolute zero. Infleqtion uses lasers to arrange either cesium atoms (cooled to a few microkelvin, or millionths of a degree, for qubits) or rubidium atoms (cooled to nanokelvin, or billionths of a degree, to make BEC, where the atoms act as a single quantum object and are used most notably in sensing) and hold them in place. Infleqtion uses this cold atom method across multiple quantum applications including gate-based quantum computers as well as a variety of quantum sensing and signal processing applications such as High Precision Clocks, Quantum Positioning Systems (QPS), Quantum Radio Frequency Receivers (QRF) and Quantum Networking and Communications.
Infleqtion and CU Boulder recently announced a formal collaboration to advance quantum sensing through machine learning techniques focused on applications that require unprecedented positioning and navigation capabilities in real-world environments. Machine learning-enabled quantum signal processing provides a means of leveraging the quantum mechanical aspects of ultracold atom sensors. As Anjul Loiaconoa, VP of Quantum Signal Processing at Infleqtion, noted in the press release describing the collaboration: “At Infleqtion, we pride ourselves on our long-standing history of developing deployable compact quantum hardware. Our expertise in this area is unparalleled, and now, by combining it with the cutting-edge capabilities being developed at CU, we are poised to lead in the field of software-defined quantum sensors, a revolutionary solution for today’s challenges in navigation.”
KMLabs: has deep roots in the area, having been spun out of the Kapteyn-Murnane group at JILA. Henry Kapteyn, the co-founder and CTO, is an award-winning researcher in the area of ultrafast optical science. He is a Professor of Physics at CU Boulder and a Fellow of JILA. CEO Daisy Raymondson received her PhD from CU Boulder and was a graduate research assistant at JILA. KMLabs focuses on delivering optimized tabletop ultrafast laser sources that span the vacuum ultraviolet (VUV) to extreme ultraviolet (EUV) to soft X-ray (SXR) range of the electromagnetic spectrum (about 1 to 150 nm). Because laser technology has generally been limited to wavelengths only as short as 193 nm, scientific research in the VUV to SXR range has been relatively less explored. With recent developments in high harmonic generation (HHG), research in this area is coming of age and opening up exciting new opportunities for scientific discovery. Specifically, the potential for nanoscale imaging, spectroscopy, and probing ultrafast dynamics is extending the domain of nano and quantum experimentation.
Longpath Technologies: LongPath was co-founded by a group of professionals from CU Boulder including Greg Rieker (CTO), who is also an Associate Professor at CU Boulder; Caroline Alden (VP Products and Markets), who spent 7 years at CU Boulder and also worked at NOAA; Robert Wright (VP Engineering), who spent nearly 6 years at CU Boulder as a researcher; and Sean Coburn (Senior Research Scientist), who received his PhD from CU Boulder and has been a Senior Research Associate there for the past 16 years. The core laser technology in the LongPath system was the basis of Nobel Prize-winning work at the University of Colorado and NIST. Their eye-safe, long-path laser systems probe the distinct absorption ‘fingerprint’ of many different molecules (methane, H₂S, CO₂, H₂O, and more) across 50,000+ wavelengths (colors) of light. LongPath, CU Boulder, and NIST engineers were the first ever to make outdoor fielded measurements using this groundbreaking technology. In addition to applications in energy, LongPath’s technology is well suited to penetrate other large markets including agriculture, waste management, mining, and urban monitoring.
Maybell Quantum Industries: Denver-based Maybell Quantum has a number of locally grown scientists on its staff. They build the hardware for the quantum revolution, including sub-Kelvin cryogenic systems and superconducting quantum I/O. Many quantum computers must be cooled to nearly absolute zero to operate, in a device called a dilution refrigerator (DR). Typically, DRs require hundreds of square feet of specialized lab space. Maybell launched their first product, “The Fridge,” in 2022. The Fridge can cool 4X the qubits to 10 mK in 1/10th the space of competing platforms and can operate in any space with the right electrical outlet. This year, they followed it up with their “Big Fridge,” which has double the cooling power and can fit 10x the qubits in 1/8th the space of competitors’ systems. They also offer a line of ultra-high-density superconducting RF ribbon cables, or “Flexlines,” which reduce the thermal load and vibrational noise common in traditional cryogenic wiring. Maybell’s products are critical hardware for many qubit modalities, including superconducting, topological, and some photonics-based systems, as well as condensed matter physics, low-temperature physics, quantum sensing and other applications.
Meadowlark Optics: In 1979, Tom Baur, a researcher at the nearby NCAR (National Center for Atmospheric Research), needed custom Pockels cells and ended up manufacturing his own. With that flagship product, word spread and Meadowlark Optics quickly became a place to turn to for custom polarization optics. Today Meadowlark Optics designs, develops and manufactures an extensive range of high-quality polarization systems and components, including liquid crystal devices spanning the ultraviolet to the mid-wave infrared. Standard products include shutters, rotators, waveplates/retarders, spatial light modulators, tunable optical filters, tri-color filters, polarizers, polarimeters and more.
Octave Photonics: is another company with deep local quantum-DNA. Octave Photonics was founded in 2019 by David Carlson and Zach Newman when they were postdocs in the Time and Frequency Division at NIST. They were joined in 2021 by Daniel Hickstein, another former NIST post-doc and CU Boulder PhD, who worked at nearby KMLabs for three years before moving over to Octave. In addition to the NIST origins, the Company continues to collaborate with the QNS group at NIST and the Diddams group at CU Boulder, as well as local companies Vescent Photonics and Infleqtion. Octave Photonics enables next-generation laser systems by packaging nanophotonic chips into ready-to-use devices. Their nonlinear photonic devices provide precise control over the optical spectrum of a laser system, allowing laser frequency combs to be constructed with unprecedented compactness and robustness. Their products are used for supercontinuum generation, low-power frequency combs, and optical atomic clocks.
QuSpin: Louisville, Colorado-based QuSpin was founded by Vishal Shah, another CU Boulder PhD. The current team includes Daniel Barry, an engineer from CU Denver, and Jeff Orton, a Senior Engineer educated at Colorado State University. QuSpin builds optical atomic magnetometers for biomedical and geophysical applications. Their technology is based on optically pumped magnetometers (OPMs), which are passive field sensors comprised of a laser source, a glass vapor cell containing ‘sensing’ atoms in a gaseous state, and a photodetector. Their recently released Neuro-1 is a state-of-the-art, integrated OPM sensor system designed for high-channel biomagnetic applications such as magnetoencephalography (MEG).
Quantinuum: is the world’s largest standalone quantum computing company, formed by the combination of Honeywell Quantum Solutions’ hardware and Cambridge Quantum’s middleware and applications. Quantinuum accelerates quantum computing and the development of applications across chemistry, cybersecurity, finance and optimization. Its focus is to create scalable and commercial quantum solutions to solve the world’s most pressing problems in fields such as energy, logistics, climate change, and health. This past week Quantinuum announced the release of their trapped-ion System Model H2, which was built at their Broomfield, Colorado U.S. headquarters. This second-generation system currently features 32 fully connected qubits with impressive performance metrics, including a world-record quantum volume of 65,536 (2¹⁶), 99.997% single-qubit gate fidelity and 99.8% two-qubit gate fidelity. [Watch The Quantum Leap Blog for an upcoming post featuring Quantinuum].
Stable Laser Systems (SLS): was formed in 2009 by Mark Notcutt, who trained with Jan Hall (2005 Nobel Prize winner) at JILA; Hall remains a consultant to the company. Their aim is to provide best-in-class frequency stabilization products to the market. SLS offers complete systems which deliver narrow, stable linewidth lasers right “out of the box.” Their 1 Hz Stabilized Laser System provides < 1 Hz linewidth at 1530-1575 nm and less than 20 kHz daily drift from a relatively modest footprint (a 19” 6U rackmount box) with a convenient single-switch lock function. They also offer customized breadboard-based systems for various R&D requirements. Their products are used for atomic clocks, atomic laser cooling and trapping, high-precision spectroscopy, long-range radar and sensing, time distribution and a number of other applications.
Vapor Cell Technologies: Vapor Cell Technologies (VCT), based in Boulder, specializes in designing, manufacturing, integrating, and selling chip-scale vapor cells, the core components of quantum 1.0 technologies such as atomic clocks and atomic sensors. By harnessing the unique, quantum-noise-limited properties of atomic and molecular vapors, VCT’s aim is to create robust, reliable, and consistent vapor cells that can transition quantum technologies from laboratory demonstrations to market-ready applications.
VCT’s work has garnered the support of NIST, the National Science Foundation (NSF), the Department of Defense (DoD), the MEMS & Sensors Industry Group (MSIG), as well as various small businesses and publicly traded companies. Utilizing semiconductor techniques and methodologies, VCT leverages the highly evolved materials, processing, and analytics found in the semiconductor industry to produce “bulletproof” devices. This approach enables them to integrate thin films for the optical, electrical, and anti-corrosive properties essential for customizable quantum systems. They provide wafer-level services and multi-project wafers to clients, and have successfully produced rubidium, cesium, iodine, and exotic-species vapor cells in mm-scale geometries.
Doug Bopp, VCT’s founder, who received his PhD from CU Boulder and worked as a research assistant at NIST, confirmed the Company’s local connections, noting: “above all, we are committed to fostering a thriving quantum ecosystem. Building on our extensive experience at NIST, we apply state-of-the-art metrology and fabrication solutions to produce chip-scale vapor cells. We recognize the diverse range of atomic technologies, each requiring high-fidelity components, and are dedicated to supplying these essential elements to quantum engineers as they develop next-generation products with pioneering components.”
Vescent: was founded by Mike Anderson, Scott Davis and Scott Rommell, each of whom has strong Colorado lineage. Drs. Anderson and Davis received their PhDs from JILA at CU Boulder, and Scott Rommell studied physics at the nearby University of Northern Colorado. Dr. Davis is particularly well entrenched in the local quantum infrastructure, serving on the Steering Committee of the Quantum Economic Development Consortium (QED-C), and Vescent is active with CUbit and the Colorado Photonics Industry Association (CPIA).
Vescent specializes in the precision optics that operate nearly all quantum devices. It provides the “picks and shovels,” the technologically enabling components, to nearly every aspect of the quantum ecosystem. Vescent offers a suite of technologically advanced products and has successfully evolved from providing niche R&D-focused products (albeit widely admired and respected by researchers) to rugged, field-deployable components and OEM products. Vescent is the leading designer and manufacturer of lasers, electro-optic tools, and control electronics used in precision optical measurements based on the quantum nature of physical systems. They focus on providing low-SWaP (size, weight and power), truly field-deployable products which enable state-of-the-art timing, time transfer, frequency transfer, quantum computing, precision navigation in both GPS-enabled & GPS-denied environments, and next-generation spectroscopic techniques. Vescent recently leveraged its active local quantum connections by winning a $16 million contract from the Office of Naval Research (ONR) to develop portable atomic clocks under the Compact Rubidium Optical Clock (CROC) program in collaboration with Infleqtion, Octave Photonics and NIST.
Xairos: located south of Denver in Lone Tree, Colorado (and setting up a Boulder office), Xairos is a space and technology company that is revolutionizing time synchronization. They have successfully built a proof-of-concept (POC) demonstrating their patented quantum technology, providing orders of magnitude better accuracy and security than GPS can deliver. Governments and network operators worldwide are looking for a more accurate GPS alternative to enable technologies such as 6G, autonomous vehicles, and quantum networks. Xairos’ POC has demonstrated time synchronization that is >1,000x more accurate than GPS, solving a considerable problem for new technologies and applications. GPS represents a single point of failure for critical national infrastructure. Xairos’ patented technology is highly advanced and secure, eliminating the hacking and signal jamming to which GPS is prone. Their system could reduce expensive outages that disrupt travel, degrade network performance, and even counter national threats like the GPS interference from Russia in Ukraine. CEO David Mitlyng recently joined a Colorado quantum delegation on a trip to Finland, organized by the Colorado Office of Economic Development and International Trade, where he was joined by local colleagues from Atom Computing, Maybell, NIST and others.
FormFactor: is a semiconductor test and measurement provider that has an office in Boulder, which it maintained after acquiring High Precision Devices, a maker of cryogenic probe systems and cryostats used in the development of quantum computing, superconducting computing and ultra-sensitive sensors. The location now enables quantum developers to leverage FormFactor’s state-of-the-art Advanced Cryogenic Lab to characterize qubits and resonators using cryostats with groundbreaking probe sockets to accelerate development cycles by more than 2X, with no up-front capital investment.
Lockheed Martin: entered into a Master Research Agreement with CU Boulder in 2019 and in 2022 broadened its local quantum presence by joining CUbit’s partnership program. As noted in a press release at that time: “Lockheed Martin recognizes the value of collaborating across the entire innovation spectrum and our engagement with CUbit Innovation Partners is an important extension of our existing alliance with CU Boulder,” said Valerie Browning, vice president, Research and Technology at Lockheed Martin. “This new quantum focus will provide Lockheed Martin access to cutting-edge quantum sensing research while providing CU Boulder visibility into real world applications that may benefit from a quantum advantage.”
Thorlabs: is a multinational vertically integrated photonics products manufacturer. They maintain a laser manufacturing facility in Lafayette, Colorado and supply an array of photonics equipment for quantum technologies and applications including single photon sources and detectors, single-crystal diamonds with nitrogen-vacancy centers, mirror mounts, turnkey ultra-low noise lasers and related items.
Select Local Collaborations
Vescent/Infleqtion/Octave Photonics/NIST: In December 2021, Vescent Photonics was awarded a contract worth up to $16.2 million to develop portable atomic clocks for the Office of Naval Research (ONR) Compact Rubidium Optical Clock (CROC) program, which will be fulfilled by the consortium of local companies noted in the Vescent description above. The group aims to improve upon existing commercial atomic clocks by interrogating a two-photon optical clock transition in a warm vapor of rubidium (Rb) atoms.
CUbit: As noted above, CUbit is an agency affiliated with CU Boulder, partnering with industry to catalyze advancement of quantum information technologies and strengthen the regional quantum ecosystem. Its Innovation Partners include Atom Computing, Infleqtion and Meadowlark Optics and its Community Partners include Vescent, Maybell and Octave Photonics.
Microelectronics Commons: is a national network that will create direct pathways to commercialization for microelectronics researchers and designers from “lab to fab.” It is a program of the Department of Defense funded by the recently passed CHIPS and Science Act of 2022. Vescent, Infleqtion, Honeywell, Quantinuum and Octave Photonics have teamed up to participate in this program.
Summary
I have been quite impressed with the vibrancy of this local quantum community and look forward to meeting more of the participants. I hope readers have a broader appreciation for the energy and dynamics of this environment, and I look forward to providing further updates. For ease of learning more or contacting any of the companies or people noted in this post, below is a table of entities mentioned along with hyperlinks to their websites:
Disclosure: The author does not currently have any business relationship with any company mentioned in this post. The views expressed herein are solely the views of the author and are not necessarily the views of Corporate Fuel Partners or any of its affiliates. Views are not intended to provide, and should not be relied upon for, investment advice.
If you enjoyed this post, please visit my website, and enter your email to receive future posts and updates: http://quantumtech.blog
Russ Fein is a venture investor with deep interests in Quantum Computing (QC). For more of his thoughts about QC please visit the link to the left. For more information about his firm, please visit Corporate Fuel. Russ can be reached at russ@quantumtech.blog.
In recent posts I have focused on the technical specifics of Quantum Computing and quantum sciences more generally (e.g., optical clocks and quantum in space), and showcased certain companies operating in the industry. However, for this post I want to focus on a more abstract theme.
If you ask just about any pundit or professional participating in the Quantum Computing industry, the biggest question of today, nearly universally posed is: “When will we achieve Quantum Advantage?” Quantum Advantage is generally defined as:
“The achievement of demonstrated and measured success in processing a real-world problem faster on a quantum computer than on a classical computer.”
In this writer’s opinion, this emphasis on having Quantum Computers do things faster focuses on the wrong attributes of Quantum Computing. It is not the speed, per se, that is the key attribute that will deliver QC value, so doing things faster than classical computers misses the point.
The brains and core workhorse of a classical computer is its CPU, or central processing unit. CPUs are made up of integrated circuits which (to grossly oversimplify) are simply billions of on/off switches. These circuits operate on individual “bits,” each of which is either a ‘one’ or a ‘zero’ (binary), and all computer processing is rooted in Boolean logic. Specifically, there are only three fundamental gates (AND, NOT, OR). That’s it. There is an art to programming and a skill for parsing and processing information. Today’s classical computers can apply these rules incredibly fast (gaming PCs run at ~4 GHz, meaning roughly 4 billion cycles per second). Clever programmers have found increasingly efficient and profound ways to implement programs despite having only two values (1 or 0) and only three logic gates. We can do AMAZING things with this somewhat limited architecture.
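To make the “only three gates” point concrete, here is a minimal sketch in plain Python (my own illustration, not tied to any particular CPU design) showing how a richer operation, exclusive-or (XOR), can be built from nothing but AND, OR and NOT:

```python
# Everything a classical processor does ultimately reduces to combinations of
# these three Boolean operations on individual bits (0 or 1).
def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

# Even "new" operations are just compositions of the three. XOR (exclusive-or),
# a building block of binary addition, is one example:
def XOR(a, b):
    return OR(AND(a, NOT(b)), AND(NOT(a), b))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", XOR(a, b))  # prints the familiar XOR truth table
```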
Now let’s switch our focus to QC. Instead of bits, Quantum Computers operate using qubits, which are the quantum version of on/off switches; however, qubits can be in a superposition of both ‘on’ and ‘off‘ at the same time. They can also be entangled, they have wave functions, and they can utilize more logic gates, so they can perform vastly different operations. And despite what many famous physicists have proclaimed, this is not voodoo science that nobody can understand. Quantum Computers are available today, albeit with limited numbers of qubits. It is fundamental physics…it’s just different from classical physics.
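For the curious, here is a minimal sketch of those two behaviors, superposition and entanglement, using the open-source Qiskit library (any gate-level quantum SDK would do; this is purely illustrative and not code from any company discussed here):

```python
# A minimal superposition + entanglement example (assumes Qiskit is installed).
from qiskit import QuantumCircuit

qc = QuantumCircuit(2)
qc.h(0)      # Hadamard gate: qubit 0 is now in a superposition of 0 and 1
qc.cx(0, 1)  # CNOT gate: qubit 1 becomes entangled with qubit 0 (a Bell state)
qc.measure_all()

print(qc.draw())
# Run on a simulator or on real hardware and the two qubits always agree:
# roughly half the shots read '00' and half read '11', never '01' or '10'.
```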
What does this mean and what is the thrust of this post? Since Quantum Computers operate so differently, we can ask different questions. Doing anything faster is not that novel (yeah, sure, you can break RSA encryption and do a few other notable things super-fast). SPEED is not the value-add, per se. With different physics you can (and should) ask different questions.
Let’s look at the following example to help make this point more tangibly. Imagine that you and your partner are planning a San Francisco dream vacation. You are considering staying at one of the following two hotels:
Based on the above descriptions, which hotel should you pick?
One strategy might be to score each of the features, and then add up the scores and select the hotel with the highest score. But what about trade-offs? You may not care that much about amenities if you are conveniently located. You may also really enjoy certain features, but what if the view from the room is the most important consideration? Are any of these items deal-breakers by themselves?
Now, let’s approach this problem from a quantum perspective. If one assigns a “score” to each feature, this sounds a lot like weighting, or using a superposition to program each feature. There are also various trade-offs. You might be willing to sacrifice having a spa for proximity to a dynamic neighborhood, or perhaps room amenities are the most important feature and outweigh all others. These trade-offs suggest that certain features are correlated or entangled. Most of us don’t need a Quantum Computer to select which hotel we would prefer, because our brain already does an informal weighting of the various features, considers the trade-offs, and likely factors in other subtle variables not in the chart. Interestingly, different people will choose different outcomes from the same inputs…and the same person might choose differently at different times.
In this context, using superposition (weighting) and entanglement (trade-offs) does not involve mysterious quantum physics that is beyond comprehension. It is the way our brains already work. We assess multiple variables with all sorts of subtleties on characteristics and complex “entangled” trade-offs and inter-relationships. Leveraging these features of analysis is where the “art” of QC programming will lie. This is where QC programmers will create the next eBay, DoorDash, Oracle, Google Search or _______ (insert your favorite killer app).
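To make the weighting-and-trade-offs analogy a little more concrete, here is a toy classical sketch (all feature names and numbers are invented for illustration; this is not an actual quantum program) of how such a decision might be encoded before handing it to any optimizer, quantum or classical:

```python
# Toy encoding of the hotel decision: per-feature weights ("superposition-like"
# preferences) plus pairwise trade-off terms ("entanglement-like" interactions).
# All names and numbers are made up for illustration.
weights = {"location": 0.9, "view": 0.6, "spa": 0.3, "room_amenities": 0.7}

# Positive value = features reinforce each other; negative = one substitutes for the other.
tradeoffs = {("location", "spa"): -0.4, ("view", "room_amenities"): 0.3}

def score(hotel: dict) -> float:
    """Weighted sum of features plus pairwise interaction terms."""
    s = sum(weights[f] * hotel.get(f, 0) for f in weights)
    for (f1, f2), w in tradeoffs.items():
        s += w * hotel.get(f1, 0) * hotel.get(f2, 0)
    return s

hotel_a = {"location": 1, "view": 0, "spa": 1, "room_amenities": 1}
hotel_b = {"location": 0, "view": 1, "spa": 0, "room_amenities": 1}
print("Hotel A:", score(hotel_a), " Hotel B:", score(hotel_b))
```

In optimization language this is a quadratic cost function, which happens to be the general form that many early quantum optimization approaches (annealing and QAOA-style algorithms) take as their input.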
How can superposition, entanglement, and other unique features of QCs alter the framing of the problem or the context of the answer? What are the questions nobody has ever thought to ask a computer before? Here are just a few simplistic examples:
As with the hotel choices above, QCs will be particularly valuable in problems involving weighting and tradeoffs, for example:
Given the detailed profile of each athlete in the draft pool, and matching that with the specific needs of the team, which player should be drafted?
With the following list of symptoms, what is the prognosis?
What asset portfolio gives me the best risk/reward profile?
These are generally “optimization” problems, which have been well addressed by those following Quantum Computing, and early optimization use cases are abundant.
So here are a few others, more outside the box:
Is there a way to hear colors?
Can we leverage QCs to enhance the capabilities of AI chatbots like ChatGPT or Google’s LaMDA? Should we?
Can we leverage quantum effects to detect neutrinos and if so, can we then use them to create high-definition holograms?
Some of these seem fairly straight-forward, such as the pro team draft choice. There are many inputs that can be weighted and for which there are trade-offs. Could someone program a quantum computer to do a better job confirming draft choices than a classical computer? I have to imagine that as the list of features grows substantially, there could be a Quantum Advantage, and that would be a valuable tool to provide in that instance. But it’s the less clear-cut problems that excite me: imagining ways to make colors impact auditory senses, or finding solution sets to complex problems that were never before considered. Inquiries such as these can open all sorts of amazing new potential.
So as the QC hardware makers continue to let people test things on their increasingly powerful machines, and as the QC software companies expand the capabilities of their programs, I’m excited to see what people try. What new and novel questions they ask and what interesting and unpredicted answers are returned.
What do you think the new “killer app” will be on a Quantum Computer? What question would you like a QC to answer? I’d love to hear your ideas.
Disclosure: The views expressed herein are solely the views of the author and are not necessarily the views of Corporate Fuel Partners or any of its affiliates. Views are not intended to provide, and should not be relied upon for, investment advice.
If you enjoyed this post, please visit my website, and enter your email to receive future posts and updates: http://quantumtech.blog
Russ Fein is a venture investor with deep interests in Quantum Computing (QC). For more of his thoughts about QC please visit the link to the left. For more information about his firm, please visit Corporate Fuel. Russ can be reached at russ@quantumtech.blog.
In an early Quantum Leap post back in December 2021, I wrote about the various qubits being used to drive Quantum Computers (QC), and neutral atoms didn’t make it into the post. When I revisited the QC landscape in October of last year, four neutral atom companies made the list, representing a significant advancement in that modality. There are numerous strengths to this approach, which I’ll describe in greater detail below, but to pique your interest in learning more, consider the following recent announcements by neutral atom companies:
Infleqtion (f/k/a ColdQuanta) completed a $110 million Series B equity round
Professor Alain Aspect, a co-founder of PASQAL, was awarded the 2022 Nobel Prize in Physics
Atom Computing had a ribbon-cutting ceremony for the opening of its new $100 million facility in Boulder, Colorado
QuEra’s 256-qubit Aquila QC was made available on Amazon Braket, Amazon’s cloud-based QC platform
Why the recent surge in jaw-dropping announcements? Why are neutral atoms seeming to leapfrog other qubit modalities? Keep reading to find out.
The table below highlights the companies working to make Quantum Computers using neutral atoms as qubits:
And as an added feature, I am writing this post to be “entangled” with the posts of Brian Siegelwax, a respected colleague and quantum algorithm designer (see his overview on Neutral Atoms here). My focus will be on the hardware and corporate details of the companies involved, while Brian’s focus will be on actual implementation of the platforms and what it is like to program on their devices. Unfortunately, most of the systems created by the companies noted in this post are not yet available (other than QuEra’s), so I will update this post with the applicable hot links to Brian’s companion articles as they become available.
Neutral Atoms as Qubits
Neutral-atom qubits, sometimes referred to as “cold atoms,” are built from arrays of individual atoms trapped in a room-temperature vacuum chamber using lasers as optical “tweezers” to restrict the movement of the individual atoms and thereby chill them (hence the “cold atom” reference). These neutral atoms can be put into a highly excited state by firing specific laser pulses at them, which expands the orbital radius of the outer electron(s) (a Rydberg state); this can be used, among other things, to entangle the atoms with each other.
While there are a few notable differences among the approaches the neutral atom players use, there are also many similarities. The graphic below highlights the Atom Computing set-up, which is representative of the broad cold atom approach. It includes two sets of lasers with related controllers and AODs (acousto-optic deflectors), a vacuum chamber, and a photon-sensitive camera to read results.
Let’s drill down a bit further into the underlying science.
Each of the players focused on neutral atoms uses elements from either the first column of the periodic table (alkali metals such as Rubidium or Cesium) or the second column (alkaline earth metals such as Strontium). In either case, these elements have equal numbers of electrons and protons, so the electrical charges balance out, hence the “neutral” label. The alkali metals have a single electron in the outer orbit whereas the alkaline earth metals have two electrons in the outer orbit (some believe the two-valence-electron configuration, which is a “closed shell,” provides greater stability and protection from external noise). It is these outer electrons that produce the quantum-mechanical effects that drive the algorithms or desired analog activity.
In a neutral-atom quantum processor, atoms are first heated into a gaseous cloud and then suspended in an ultrahigh vacuum via arrays of tightly focused lasers of specific wavelengths, often referred to as “optical tweezers.” Every element reacts to very specific wavelengths of light, so it can be manipulated by lasers tuned to those wavelengths. These optical tweezers can also be used to configure the atoms into specific geometric arrays. For digital, gate-based computation, single-qubit and multi-qubit gates can be programmed via differing light pulses. Rob Hays, CEO of Atom Computing (and a veteran of the computing industry, formerly Chief Strategy Officer of Lenovo following a 20-year leadership tenure at Intel where he led the Xeon processor roadmap) explained that “every element has a magic wavelength of light that allows atoms to be captured by optical tweezers.” He further noted that “with a different wavelength of light, we can effectively control the spin of the nucleus in any position in three dimensions…and that’s how we create single qubit gates…and then what we can do is create entanglement with two qubit gates by using different wavelengths of light to excite the electron cloud into what’s called a Rydberg state where the radius of the electron orbit gets much larger to the point where it crosses paths with neighboring atoms and gets entanglement.” This is the foundation for one of the key strengths of neutral atom QCs, namely their strong connectivity. For analog operations, the tweezers move the atoms into the desired configuration and other laser or microwave pulses then drive the atoms to evolve under programmed Hamiltonians (more on this, as well as the differences between the digital and analog approaches, below). In both cases, the final results are read out optically.
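To put Hays’ description on a slightly more formal footing, here is a minimal sketch of the underlying physics, using the generic notation found in the neutral-atom literature (not any company-specific parameters): two nearby atoms driven toward Rydberg states interact via a strong van der Waals potential, and when that interaction dwarfs the laser drive, only one of the pair can be excited at a time, which is precisely what correlates (entangles) them.

```latex
% Illustrative Rydberg "blockade" condition (standard-literature form, not vendor-specific)
% V(r): interaction between two Rydberg atoms separated by distance r
% C_6:  van der Waals coefficient,  \Omega: Rabi frequency of the laser drive
V(r) = \frac{C_6}{r^6}, \qquad \text{blockade regime when } V(r) \gg \hbar\,\Omega
```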
Some important characteristics of neutral atom-based qubits include:
Exceptionally Long Coherence Times: Leading superconducting and photonic Quantum Computers have achieved coherence times measured in microseconds (millionths of a second), which doesn’t provide much time to run algorithms (although they also have very fast gate speeds). Neutral atom players generally enjoy coherence measured in full seconds and, in fact, Atom Computing published a paper in Nature Communications in May 2022 touting coherence times exceeding 40 seconds.
Strong Connectivity: The topology of the neutral atom array is quite flexible, and these modalities typically enjoy robust connectivity among qubits, often achieving all-to-all connectivity. In fact, neutral atoms can also implement multi-qubit gates (involving more than 2 qubits, such as a CCNOT or Toffoli gate) and can even implement three-level qubits, or “qutrits”.
Scalability: Because all atoms of a given isotope are intrinsically identical, all qubits based on such elements are identical to each other. In addition, since the atoms carry no net charge, they can be packed into tight arrays, often only microns apart. Also, rather than a separate laser for each qubit, since the atoms are manipulated by common wavelengths, a laser of a specific wavelength can be split into “beamlets” in order to control multiple atoms.
External Cryogenics Not Required: Modalities which require cryogenic chillers are burdened with significant added overhead and must typically contend with long cool-down/warm-up cycles.
Reduced Wiring Complexity: All of the functions to control the neutral atoms are performed via light propagating through free space, in contrast to superconducting qubits, which require multiple electrical cables for each qubit.
Can be Operated in Analog or Digital Mode (or both): Digital or gate-based operations are required for full algorithm development, but some early quantum advantages may be achieved utilizing qubits geometrically or in analog mode. This is an important distinction, so I will elaborate further in the next section.
Leveraging Three Decades of Legacy Research: While using neutral atoms in quantum computing is relatively new, the neutral atom technology has been successfully deployed in other physics research and has powered the world’s most accurate atomic clocks for many years. The laser-cooling technology is based on research that led to the 1997 Nobel Prize and optical tweezers are based on research that led to the 2018 Nobel Prize.
While this is an impressive feature list, neutral atom quantum computers are relatively new to the Quantum Computing landscape and have yet to showcase important real-world results. There are also meaningful technological challenges in refining the lasers and the ultra-high vacuums. Dr. Mark Saffman, Chief Scientist for Quantum Information at Infleqtion and Professor of Physics at the University of Wisconsin-Madison, offered me tremendous insights regarding the differences between analog and digital modes for neutral atom QC. He noted that Infleqtion has “recognized the challenges of some of the specialized laser systems being used,” and that they are “working with partners on developing more integrated laser technology…with a real challenge currently being developing faster calibration and tuning routines in order to keep the machines in a calibrated state.” That said, Infleqtion and their neutral atom peers are advancing at a furious pace, and I expect significant progress to be made in 2023.
Analog vs Digital/Gate Mode
Richard Feynman is often cited as the father of quantum computing and he is credited with saying “…trying to find a computer simulation of physics seems to me to be an excellent program to follow… and nature isn’t classical dammit, and if you want to make a simulation of nature, you’d better make it quantum mechanical…” While Feynman said this in 1981, well before today’s Quantum Computers were possible, he was quite prescient. He wasn’t referring to algorithms or gates or quantum computer code, he was talking about literally simulating nature and that is what some of these neutral atom companies are offering today in analog mode.
Many of you are likely familiar with “digital quantum” algorithms where the information is encoded into single- and multiple-qubit functions which are driven by a series of commands or gates, much like traditional computers are currently programmed. The specific steps and their order are vital to a successful code or algorithm and there is an art to how such commands are created and sequenced. Most press about QC covers this “digital” mode, and the fidelities and speeds of the gates, as well as the length of coherence, are among the bigger hurdles being addressed by today’s players. The challenges facing current digital QC approaches are rooted in the fidelities of the systems, which are quite fickle today and subject to many disruptive factors or “noise” (see a prior post about this noise here).
“Analog quantum” computing also uses qubits and the various quantum mechanical properties that power digital quantum computers (superposition, entanglement and wave properties, etc.) but there are no gates. The exquisite control required to execute gates is one of the major hurdles facing QC development and by circumventing the need to utilize gates, analog quantum computing has surpassed the digital mode on a number of fronts. By transforming certain problems into a “geometric” structure (like Feynman suggested) instead of a sequential gate-based formula, results can be derived without gates. As Alex Keesling, CEO of QuEra told me, “…whereas in gate-based [digital] quantum computing the focus is on the sequence of the gates, in analog quantum processing it’s more about the position of the atoms and where you place them so they can mirror real life problems. We arrange the atoms and define the forces that drive them and then measure the result…so it’s a geometric encoding of the problem itself.”
It took me a while to appreciate this difference, and it is only useful for a certain subset of problem types. However, because analog quantum computers require less engineering overhead than digital Quantum Computers, they are already providing meaningful results and can operate with larger numbers of qubits (such as QuEra’s Aquila QC with its 256 qubits and PASQAL’s Fresnel with 324 qubits). So let me explain this further.
Analog quantum computers rely on converting a problem into a mathematical object known as a Hamiltonian. The Hamiltonian is an operator that corresponds to the total energy of a system, including both kinetic and potential energy. It is somewhat similar to how some companies, such as D-Wave, use quantum annealing to find the global minimum energy of a system and thereby get useful output from today’s noisy Quantum Computers. The “Traveling Salesman Problem” is a typical optimization problem (e.g., finding the shortest or least expensive route for the salesman to follow to cover all of his customers), as is finding the best placement of cell-phone towers to cover a given area. However, in addition to optimization problems, analog quantum computers can also address problems in chemistry simulation and materials engineering. Specifically, by geometrically creating a “digital twin” of the system under study and then evolving it via the Hamiltonian functionality of the analog processor, users can better understand underlying physics, phase transitions of materials and the dynamics of particle collisions, among other things. Further, given the analog mode’s ability to parse sets of data into subsets via Hamiltonian simulations, it is also showing increasing promise in machine learning.
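For readers who want to see what “programming a Hamiltonian” actually looks like, here is a hedged sketch of the Ising-type form commonly written down for analog neutral-atom processors in the research literature (illustrative, not any one vendor’s exact specification). The laser amplitude Ω(t) and detuning Δ(t) are the programmable knobs, while the atom positions x_i set the interaction terms, which is exactly the geometric encoding described above.

```latex
% Generic analog Rydberg Hamiltonian (illustrative form from the literature)
% n_i = |r_i><r_i| indicates whether atom i is in the Rydberg state
\frac{H(t)}{\hbar} \;=\; \frac{\Omega(t)}{2}\sum_i \big(|g_i\rangle\langle r_i| + |r_i\rangle\langle g_i|\big)
\;-\; \Delta(t)\sum_i n_i \;+\; \sum_{i<j}\frac{C_6}{|x_i - x_j|^{6}}\, n_i\, n_j
```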
In summary, analog mode for neutral atom Quantum Computers is showing near-term utility for certain classes of optimization and materials engineering problems, as well as for accelerating quantum machine learning. Currently, QuEra and PASQAL each use 2D arrays of neutral atoms in their analog processors. They also have the capability of using 3D arrays (as the technology further evolves), which would provide even greater power from the geometric approach, and can eventually offer an analog-digital hybrid approach with the same neutral atom technology. It will be fascinating to watch as the analog and digital approaches scale, and to see which company is able to provide the fastest path to quantum advantage.
The Leading Neutral Atom Players
A special thanks to Yuval Boger and Brian Siegelwax, and to Georges-Olivier Reymond of PASQAL, Alexander Keesling of QuEra, Rob Hays, Mickey McDonald and Kortny Rolston-Duce of Atom Computing, and Mark Saffman, Max Perez and Sarah Schupp of Infleqtion for their patience and insights about their companies as well as Quantum Computing more generally. Many of the details in this post were derived from my conversations with them.
PASQAL
The Nobel Prize in Physics for 2022 was awarded to Alain Aspect, John Clauser and Anton Zeilinger “for their experiments with entangled photons, establishing the violation of Bell inequalities and pioneering quantum information science.” Professor Aspect eventually shifted his research focus from photons to neutral atoms and partnered with Georges-Olivier Reymond and Christophe Jurczak to create PASQAL. In fact, PASQAL is the first quantum computing company with a Nobel Prize-winning co-founder. In September of last year, PASQAL unveiled “Fresnel,” its 324-qubit quantum processor, and the company expects a 1,000-qubit machine to be available next year.
PASQAL is advancing neutral atom Quantum Computers focused on analog mode and has already amassed an impressive roster of customers (including BMW, Airbus, LG, Siemens, Saudi Aramco and others) and use cases. For example, the company has developed a Quantum Machine Learning algorithm applicable to smart grids, aiming to improve the efficiency of electricity distribution. As noted above, analog quantum computing has interesting applications in problems that can be structured graphically, such as material design and optimization. According to PASQAL, some of the world’s most interesting data is relational and can be encoded in graphs: nodes and links in a network, financial indicators (for portfolio risk optimization) and atoms in a molecular diagram. Graph structures can be rich sources of information, allowing the system to uncover hot spots in a network, clusters in a dataset, or infer function from structure in chemical compounds. Such problems are extremely hard to solve with classical computers but lend themselves to analog quantum computing. As Georges Reymond told me, “you just need quantum developers that are smart enough to design the specific Hamiltonian that you need. Alternatively, we have a team that can help you do that.” He added that “since you are programming very close to the qubits, you can change the geometry of the register into any shape you want.” He also noted that their Pulser tool, a Python library of applicable primitives, and the related Pulser Studio, a no-code graphical interface that helps frame a given problem and then automatically generates the required code, make analog QC mode more accessible.
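To give a flavor of what this looks like in practice, below is a minimal sketch using the open-source Pulser package (assuming its mock device and default channel names; class names and signatures may differ across versions, so treat this as illustrative rather than production code). The key point is that the atom coordinates and the pulse’s amplitude and detuning, rather than a sequence of gates, are what define the computation.

```python
# Minimal, illustrative Pulser sketch (assumes the open-source `pulser` package)
from pulser import Pulse, Register, Sequence
from pulser.devices import MockDevice

# The atom positions (in micrometers) encode the geometry of the problem
reg = Register({"q0": (0, 0), "q1": (6, 0), "q2": (3, 5)})

seq = Sequence(reg, MockDevice)
seq.declare_channel("rydberg", "rydberg_global")

# One global pulse: duration (ns), Rabi frequency (rad/us), detuning (rad/us), phase
seq.add(Pulse.ConstantPulse(1000, 2.0, 0.0, 0.0), "rydberg")
seq.measure(basis="ground-rydberg")
print(seq)
```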
While Fresnel is not available other than to existing customers (and so my colleague Brian Siegelwax was not able to test it out himself, although he has access to Pulser and Pulser Studio), the strong client roster is testament to its general utility. I look forward to PASQAL making its machine(s) more broadly available and to seeing how Brian gauges its utility. UPDATE: Brian has now provided his thoughts on PASQAL’s Pulser Studio here.
QuEra
Full-stack Quantum Computing firm QuEra, based in Boston, traces its roots to quantum research performed at nearby Harvard University and MIT. Their signature 256-qubit Quantum Computer known as “Aquila” is available now for general use on Amazon Braket, and is complemented by their Bloqade open-source software and GenericTensorNetworks algorithm platform. The management team is quite strong, and the fact that they are the first neutral atom player to broadly offer access to their QC gives them a bit of a front-runner status in the neutral atom field. While their underlying technology and approach can apply to digital gate-based algorithms, they have opted instead to focus on analog processing. Their field programmable qubit arrays (FPQA) offer near-arbitrary configurations of atoms and highly flexible connectivity. Aquila promises rapid development cycles, easy geometric encoding of problems and the exploration of exotic topologies.
The Company has generated 11-figures of R&D and development income from a broad array of government agencies and commercial customers and is the only neutral atom quantum computing company that has made its systems generally available to the public (albeit at a somewhat limited 10 hours per week). Management encourages users to try the platform and is interested in real-world feedback, including error analysis, so the input can be used to further evolve the technology. Brian Siegelwax is generally bullish on Aquila, as he describes in a recent post and follow-up on using Aquila for Maximum Independent Set (MIS), along with a deeper dive regarding implementation of Rydberg Toffoli gates. I applaud QuEra’s accessibility focus and firmly believe that QC makers will learn the most, and gain the quickest technological progress, with diverse direct feedback from users, warts and all. The mix of QuEra’s strong management team and the current availability of their system suggests they will continue to make rapid and important progress. I look forward to following progress along their roadmap and to learning what novel applications users are able to execute.
Atom Computing
I recently had the benefit (and pleasure) of spending some time with Rob Hays via video call, as well as an on-site tour of the new Atom Computing facility in Boulder led by Mickey McDonald (Principal Quantum Engineer) and Kortny Rolston-Duce (Director of Marketing and Communications), who also indulged me with a private white-boarding session where they did their best to answer all of my neutral atom 101 questions, and which (finally) helped connect many of the dots that had been swimming around my head. All of these interactions were immensely informative, and I was impressed with the team and with what I learned. Atom’s headquarters and original R&D machine are in Berkeley, but they are using Boulder to create their production unit(s). In fact, they have an interesting approach of simultaneously creating twin machines with the intention of always maintaining customer access to one while any upgrades or maintenance are performed on the other. Since they do not require cryogenic refrigerators, their QCs are not the usual chandeliers many of us are familiar with, but instead are room-sized “black boxes” housing all their optics and the majority of the controllers in various modularized sections.
Atom has an impressive roster of employees and consultants including Dr. Ben Bloom, a co-founder and CTO, who has deep connections in the Boulder quantum ecosystem, and Dr. Jun Ye, their Scientific Advisor, who is a physics professor at nearby CU Boulder, a Fellow of the Joint Institute for Laboratory Astrophysics (JILA) and the National Institute of Standards and Technology (NIST), and was recently named a member of President Biden’s National Quantum Initiative Advisory Committee. They also have an enviable roster of investors including Venrock, Innovation Endeavors, Prelude Ventures, Prime Movers Lab, and Third Point Ventures, among others.
While they have published some impressive results from their Quantum Computers including “Phoenix”, their first-generation platform, they have opted not to make Phoenix publicly available (although it is accessible by select early customers). However, they are working furiously on their second-generation systems which they plan to make available online via a Quantum Computing as a Service (QCaaS) model. They are actively collaborating with software and application developers, and I look forward to feedback from users (including Mr. Siegelwax), once Atom makes their systems more widely available.
Infleqtion (f/k/a ColdQuanta)
Infleqtion, located a bike ride away from Atom Computing, traces its roots to Drs. Eric Cornell and Carl Wieman, who created the first-ever Bose-Einstein Condensate (BEC) at CU Boulder in 1995, a feat for which they were awarded a 2001 Nobel Prize. A BEC is a distinct state of matter created when atoms are cooled to near absolute zero. Infleqtion uses neutral atoms across multiple quantum applications including gate-based quantum computers as well as a variety of quantum sensing and signal processing applications such as high-precision clocks, Quantum Positioning Systems (QPS), Quantum Radio Frequency Receivers (QRF) and quantum networking and communications, as well as some of the fundamental components used by others (i.e., ultra-high vacuum cells). While Quantum Computing steals most of the “quantum” headlines these days, these other devices bring enormous advances in their fields and, importantly, current revenues. I have been fortunate to know a number of the management members of Infleqtion and have been closely following their progress since an original blog post in April 2022 about Collaborations and a follow-up dedicated to ColdQuanta in May 2022.
Infleqtion had a number of important milestones noted in 2022 including:
Completion of a $110 million B-round, including A$29 million earmarked to create a Quantum Technology Centre in Australia.
Acquisition of Super.tech, a leading developer of quantum software and related platforms, and announced collaboration with Morningstar to integrate Super.tech’s SuperstaQ software into Morningstar Direct.
Participation as a subcontractor on the Office of Naval Research’s Compact Rubidium Optical Clock program, valued at up to $16.2m.
“Albert,” their BEC design device, was named one of TIME’s Best Inventions of 2022 and winner of the 2022 Prism Award, Quantum.
Won the 2022 Best of Sensors Award, for their high-performance test and calibration instrument known as “Maxwell.”
Dr. Fred Chong, Chief Scientist for Quantum Software, was named IEEE Fellow for his Enabling Practical-scale Quantum Computing (EPiQC) project.
Dr. Bob Sutor, VP and Chief Quantum Advocate, testified at Senate committee hearings regarding the importance of Quantum Computing technologies.
While neither Albert (their BEC design platform) nor Hilbert (their 100-qubit QC) is regularly available to the public, they continue to make progress advancing both systems and I look forward to an update from Mr. Siegelwax once those systems can be tested (for now, here is his review of Albert when it was available last year). In the meantime, Infleqtion continues to generate meaningful revenues and advance the technologies of its broad quantum-related components, and I’m certain they are leveraging their learnings across their portfolio.
[Note: “planqc”, a recent graduate of the Creative Destruction Lab startup incubator, is the newest entrant to the neutral atom field, and is included in the table of players, but was not covered in detail in this post due to its very early stage. I look forward to providing more details on planqc in future posts.]
Conclusion
While neutral atom Quantum Computing is not without its shortcomings, has yet to supply consistent and robust performance, and lags behind modalities that have been accessible longer (i.e., superconducting and trapped ion), it is gaining strong momentum and features important theoretical advantages. If the speed of innovation in 2022 is a harbinger of the rate of progress we should expect in 2023, I am excited about the prospect of reporting on this progress and look forward to providing updates.
Disclosure: The author does not have any business relationship with any company mentioned in this post. The views expressed herein are solely the views of the author and are not necessarily the views of Corporate Fuel Partners or any of its affiliates. Views are not intended to provide, and should not be relied upon for, investment advice.
References:
Hays, Rob, CEO of Atom Computing, Interview conducted by author on Nov. 17, 2022.
Keesling, Alex, CEO of QuEra, and Boger, Yuval, Consultant, Interview conducted by author on Nov. 10, 2022.
McDonald, Mickey, Principal Quantum Engineer Atom Computing and Rolston-Duce, Kortny, Director of Marketing and Communications Atom Computing, Interview conducted by author during site tour on Nov. 17, 2022.
Reymond, Georges-Olivier, CEO of PASQAL, Interview conducted by author on Nov. 15, 2022.
Saffman, Mark, Chief Scientist for Quantum Information, Infleqtion, Interview conducted by author on Dec. 16, 2022.
Ebadi, Keesling, Cain, Wang, Levine, et al., “Quantum Optimization of Maximum Independent Set using Rydberg Atom Arrays,” February 18, 2022, arXiv:2202.09372v1 [quant-ph].
Silverio, Grijalva, Dalyac, Leclerc, et al., “Pulser: An open-source package for the design of pulse sequences in programmable neutral-atom arrays,” Quantum, January 11, 2022, arXiv:2104.15044v3 [quant-ph].
If you enjoyed this post, please visit my website, and enter your email to receive future posts and updates: http://quantumtech.blog
Russ Fein is a venture investor with deep interests in Quantum Computing (QC). For more of his thoughts about QC please visit the link to the left. For more information about his firm, please visit Corporate Fuel. Russ can be reached at russ@quantumtech.blog.
When Robert Lamm wrote that first hit song for the band Chicago in 1969, he was likely referring to the pressure that time places on society, not the technological advances that depend on precise timekeeping. Yet it is a crucially important and prescient question, because accurate timekeeping enables modern technology in ways most people are unaware of. Why am I featuring this in a “Quantum Leap” blog? If we can now readily obtain the “official” time by syncing our cell phone with GPS satellites, or our computer with an atomic clock accurate to within one second per 60 million years, why do we need to measure time more accurately than that? Keep reading and I hope you’ll understand.
How do Clocks Work?
“Time” is not some absolute and discrete “thing”; it’s a somewhat arbitrary convention that society agrees to agree on. [It’s also “relative,” as in Einstein’s theory, which essentially means that time can differ based on the conditions of the measurer.] In the very early days, it was measured by the earth’s rotation, with a day being defined as one rotation. The ancient Egyptians began the practice of dividing the day into 24 hours (rooted in their night cycles and decans), and the later base-60 divisions into minutes and then seconds made a second 1/86,400th of a day. In other words, our current “second” is a man-made construct. In this section I want to explain a bit of the history of the evolution of clocks so that you have a fundamental understanding of how time is measured. The subsequent sections will explain why accuracy and precision of time measurement are so important and enabling, along with listing some of the companies in this field.
Clocks work by counting a periodic event with a known frequency. In the above example, it is the daily rotation of the earth. When grandfather clocks were the standard time-keeping devices, they worked by having a pendulum swing back and forth with its gears counting the swings. The arm of the pendulum in that grandfather clock is typically adjusted to make each half-swing one second. One “cycle” per second is known as 1 Hertz (Hz).
When electronic wristwatches were developed, they used a piece of quartz which vibrates at a specific frequency (32,768 Hz), so in this case a “second” is measured as 32,768 vibrations. The higher the base frequency, generally the more accurate the clock. For example, if that grandfather clock is off by 0.1 Hz, it will be off by one second in ten. If the quartz wristwatch is off by 0.1 Hz, it will be off by one second out of every 327,680, or roughly 0.26 seconds per day.
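A quick back-of-the-envelope way to see the relationship between base frequency and drift (the numbers below simply reproduce the pendulum and quartz examples above):

```python
# Rough drift estimate: a clock counting an oscillator of nominal frequency f0 (Hz)
# that is actually off by df (Hz) accumulates error in proportion to df / f0.
SECONDS_PER_DAY = 86_400

def daily_drift_seconds(f0_hz: float, df_hz: float) -> float:
    """Approximate timekeeping error accumulated per day, in seconds."""
    return SECONDS_PER_DAY * (df_hz / f0_hz)

print(daily_drift_seconds(1.0, 0.1))       # pendulum at 1 Hz: ~8,640 s/day (a 10% error)
print(daily_drift_seconds(32_768.0, 0.1))  # quartz at 32,768 Hz: ~0.26 s/day
```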
Around 70 years ago, scientists realized that atoms could be used as clocks. When certain atoms are exposed to specific energies (e.g., microwave frequencies), their outer electrons transition between energy levels. Specifically, the electron jumps to a higher energy state (it takes a “quantum leap”), and the very precise radiation frequency that drives this transition serves as the clock’s reference, hence the “quantum” connection. Since 1967, the International System of Units (SI) has defined the “second” as the period equal to 9,192,631,770 cycles of the radiation corresponding to the transition in Cesium-133. Cesium oscillators, such as the atomic clock maintained by NIST in Boulder, CO (UTC(NIST)), are accurate to within 0.03 nanoseconds per day. The International Bureau of Weights and Measures (BIPM) aggregates the data of more than 400 atomic clocks operated by over 80 laboratories around the world, averaging their “time” to create the world’s “official” time, known as UTC.
Why Do We Need Such Precision Regarding Time?
GPS satellites are a relatively ubiquitous technology, and while the name refers to their role in positioning, it is their role as timekeepers that is most relevant to this post; many systems use GPS to derive their time. Specifically, each satellite has an onboard atomic clock, and the signals it beams down to your GPS receiver carry the precision of that clock, enabling the receiver to triangulate signals and determine position (as well as the transmitted time).
In January of 2016, the US Air Force took one of the many satellites in the US GPS constellation offline, and an incorrect time stamp was accidentally uploaded to several other GPS satellites, leading to a thirteen-millionths-of-a-second error in their time – less time than it takes the sound of a bullet to leave the chamber. Global telecommunications networks began to fail, BBC digital radio was out for two days, electrical power grids began to malfunction, and even police and fire EMS radio equipment in the US and Canada stopped functioning. This 13-microsecond error in GPS clocks wreaked havoc on our modern world.
To help illustrate why accurate timekeeping is so important, imagine that you oversee a train tunnel that brings goods in and out of a city. If the trains that run on the tracks are accurate to within 5 minutes of their schedule, you must allow a 10-minute window for each train to have access to the tunnel (+/- the five minutes), and therefore you can only schedule 6 trains per hour to use the tunnel. If those trains were more accurate and arrived within 2 minutes of their schedule, you could schedule 15 trains per hour (60 minutes divided by the 4-minute window). So the throughput of the tunnel is directly tied to the timing accuracy of the trains.
This same concept impacts many critical infrastructure elements of our modern society, including:
Stock exchanges
Power Grids
Telecommunications systems
Computer networks
Defense applications (e.g., ballistics accuracy, navigation without GPS, etc.)
Stock exchanges are increasingly driven by high-frequency computer trading, and keeping the exchanges fair and equitable under such conditions is a core concern of regulators. All trades are required to maintain timestamps because cutting in line, known as “front running,” is illegal. FINRA, the regulatory body that governs domestic exchanges, maintains “clock synchronization” requirements relative to UTC(NIST). The more precise this requirement, the more trading volume the exchanges can accommodate.
The US power grid consists of more than 360,000 miles of transmission lines connecting to about 7,000 power plants, all of which must be synchronized and monitored. Monitoring for faults is one of the core tasks requiring accurate time measurement. Faults in transmission lines are measured at both ends of a given line by synchronized clocks, which can then determine which transmission tower is the source of the fault (see the illustrative sketch below). Given the broad interdependence of the energy grid and its many power plants, any faults in the system can affect the broader grid unless resolved quickly, and accurate clocks help pinpoint the faults in real time.
The current telecommunications system is a two-way transmission medium, and maximizing the throughput of data is important both for user experience and for profit. Fitting more bandwidth within a given transmission line means the telecom can earn more money on it. In fact, there is talk that the next generation of cellular protocol (i.e., “6G”) will require each cell tower to maintain an internal atomic clock to optimize bandwidth/throughput.
This throughput concept also applies to dispersed networks (i.e., the Internet, the Metaverse, etc.). For example, Google Spanner is a worldwide database designed to operate seamlessly across hundreds of datacenters, millions of machines and trillions of lines of information. Precise timing is vital for seamless hand-offs between locations and to minimize latency, but also to ensure that nobody is writing to a given byte at the same time someone is reading that byte. Google achieved this global low-latency network by using its own atomic clocks to create a proprietary time protocol (the TrueTime API). Similarly, Meta has utilized atomic clocks in its Metaverse infrastructure to ensure minimal latency, among other important features.
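To make the power-grid example above concrete, here is a small, illustrative sketch of how two synchronized clocks at the ends of a transmission line can locate a fault from the difference in arrival times of the disturbance (the propagation speed and line length are assumed round numbers, not utility specifications). Each microsecond of clock error shifts the estimated location by roughly 150 meters, which is why precise synchronization matters.

```python
# Illustrative traveling-wave fault location using synchronized end-of-line clocks.
PROPAGATION_FRACTION_OF_C = 0.98  # assumed wave speed as a fraction of light speed
C_KM_PER_US = 0.2998              # speed of light in km per microsecond

def fault_location_km(line_length_km: float, dt_us: float) -> float:
    """Distance of the fault from end A, given dt = t_A - t_B in microseconds."""
    v = PROPAGATION_FRACTION_OF_C * C_KM_PER_US
    return (line_length_km + v * dt_us) / 2

# A 100 km line where the disturbance reaches end A 150 microseconds before end B:
print(round(fault_location_km(100, -150), 1))  # ~28.0 km from end A
```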
Time is Money
You undoubtedly have heard the cliché that “time is money”. Here are two examples of how this can be literally true, especially as it relates to US defense initiatives:
In January of this year, Frequency Electronics was awarded a contract worth up to $20.2 million for the development of a Mercury Ion Atomic Clock for applications in various US Naval platforms.
In December 2021, Vescent Photonics was awarded a contract worth up to $16.2 million to develop portable atomic clocks for the Office of Naval Research (ONR) Compact Rubidium Optical Clock (CROC) program.
These are just two examples of such programs, highlighting the increasing importance of exquisitely accurate, field-deployable atomic clocks. A report released earlier this month suggests that the overall size of the atomic clock market will exceed $740 million by 2028. The Vescent deal cited above is being fulfilled in partnership with Infleqtion, Octave Photonics and NIST. The group aims to improve upon existing commercial atomic clocks by interrogating a two-photon optical clock transition in a warm vapor of rubidium (Rb) atoms. As Scott Davis, CEO of Vescent told me, “Exploiting the frequencies of quantized atomic energy levels to define the second, i.e., atomic clocks, has changed the world. These historically have used microwave transitions (lower energy). After the advent of the optical frequency comb, quantized transition at optical frequencies can be utilized. This represents an orders of magnitude step in performance. Vescent manufactures combs designed, for the first time, to leave the lab. This is enabling a next generation of deployed, higher performing, optical atomic clocks.” The CROC program is likely the first of many similar programs where Vescent will apply its technologies.
Can we Create Even More Precise Clocks?
The short answer is yes, by using “quantum” clocks versus existing atomic clocks. As the quantum information industry continues to advance, developments have broad benefits across the industry. As Jun Ye, a Fellow at both NIST and JILA and recently named member of President Biden’s National Quantum Initiative Advisory Committee noted to me, he is working on “experimental atomic clocks which explore the new measurement frontier based on quantum science. From this perspective quantum [optical] atomic clocks and quantum information processing are connected through shared intellectual development and technological advances.”
The following graphic helps display the way frequency combs act like a gear between the ultrafast optical frequencies and the microwave frequencies, which can be counted by current detectors.
In addition to the creation of frequency combs (the 2005 Nobel Prize in Physics), controlling the atoms used for measuring frequency transitions is also vital to increasing the accuracy of the underlying clocks (movement of the atoms leads to Doppler effects, similar to what you hear as a speeding car goes by, reducing measurement precision), so “laser cooling” helps push accuracy up. Companies like Infleqtion (f/k/a ColdQuanta) are leveraging their broad capabilities in cold atom science to contribute to improved clocks, with the challenge now being to move these optical atomic clocks out of the lab and into the field. As Max Perez, VP of Research and Security Solutions at Infleqtion, told me, “It’s important to get noise out of the system so you can achieve long-term stability…the challenge has been that the laser systems required for cold-atom clocks have been expensive and complicated. The lasers need to be highly tuned with very specific and narrow line widths…and a big part of what we are doing is bringing down the cost and size by leveraging our various technologies.”
Early atomic clocks were room-sized and much less accurate than today. As is common with many technologies, science has been able to improve the accuracy and decrease the size (and cost) of atomic clocks. In fact, today it is possible to compact certain atomic clocks into a microchip.
The example to the right weighs only 35 g and is less than 17 cm³ in volume. Photonic clocks are earlier in their evolution and progress is being made to move these out of the lab and into the field (with the aim of also bringing them down to chip scale via photonically integrated circuits).
The following highlights some of the companies manufacturing atomic clocks and/or the components that are used to create them:
Conclusion
The surging attention and resources dedicated to quantum mechanics have yielded amazing technological advances. Much of the Quantum Leap blog has focused on applications in Quantum Computing (QC), but other related technologies are also pushing the frontiers of technology and knowledge. More accurate clocks, leveraging certain advances developed for broader quantum information science, are already beginning to have practical applications beyond QC, including fundamental advances in physics, more accurate sensors and more precise timekeeping. As Jun Ye further noted, “Atomic clocks…represent some of the most exquisitely sensitive and accurate scientific instruments that humankind has built to explore the unknowns of nature.” It’s an exciting time to be following quantum science and I look forward to tracking and reporting on evolving breakthroughs.
Disclosure: The author does not have any business relationship with any company mentioned in this post. The views expressed herein are solely the views of the author and are not necessarily the views of Corporate Fuel Partners or any of its affiliates. Views are not intended to provide, and should not be relied upon for, investment advice.
References:
“Clocks” graphic by Andrey Grushnikov via Pexels.com.
Davis, Scott, CEO of Vescent, Interview conducted by the author on December 16, 2022.
Perez, Max, VP of Research and Security Solutions at Infleqtion, interview conducted by the author on December 16, 2022.
Ye, Jun, NIST Fellow, head of the Ye Group at JILA and Adjoint Professor at the University of Colorado Boulder, via email exchange, November 24, 2022.
If you enjoyed this post, please visit my website, and enter your email to receive future posts and updates: http://quantumtech.blog
Russ Fein is a venture investor with deep interests in Quantum Computing (QC). For more of his thoughts about QC please visit the link to the left. For more information about his firm, please visit Corporate Fuel. Russ can be reached at russ@quantumtech.blog.
Some have described the rapidly accelerating global push in Quantum Computing as a figurative “space race” given the potential reach of its computational power and its applications in drug development, logistics, material science, and its potential ability to overpower existing encryption techniques. However, this post is focused on the literal quantum space race – the increasing number of quantum devices in orbit and their profound applications. While the fragility of quantum states has been a core challenge in advancing Quantum Computers, that same challenge is a powerful asset for creating ultra-sensitive measuring instruments, and these quantum sensors are now making their way into orbit.
Quantum sensing and quantum communications are making important advances in space in the following areas:
Earth Sensing and Observation
Quantum Key Distribution (QKD) and Secure Communication Networks
Time and Frequency Transfer
Fundamental Physics and Space Exploration
Today there are 77 countries with space agencies, 16 of which have launch capabilities, and more than 4,500 satellites are currently in Earth orbit. Satellites containing quantum devices are increasingly being placed into orbit, and quantum devices have been used in, and deployed from, the International Space Station. As Arthur Herman noted in a recent Forbes article: “Quantum communication satellites will become hubs of not only a future quantum internet, but hubs for hack-proof networks for transfer of classified data and communications – not to mention a command-and-control architecture that will be an integral part of space domain dominance” [emphasis added].
The following chart is a partial sampling of existing and planned quantum space launches:
Note: Above chart not intended to be all-inclusive, and some programs have contributions from additional countries.
We are already increasingly dependent on satellites for global communications and GPS service, among other applications, and space-based experiments are advancing basic science and human knowledge. Adding the powerful capabilities of quantum technologies will accelerate and expand upon these space-based advances. The following summarizes some important space-based quantum initiatives:
Earth Sensing and Observation
A key attribute of quantum mechanics, and one of the main rate limiters in advancing Quantum Computing, is the fragility of the tiny particles placed into a quantum state. Specifically, attempting to control individual atoms, electrons or photons has been very difficult due to the sensitivities of such particles to external forces including gravity, electromagnetic radiation, temperature fluctuations, and vibrations. However, it is this very sensitivity that makes “qubits” such powerful sensors, enabling them to study and assay the Earth in detail never before available.
Space-based (satellite) quantum sensors can provide reliable detection, imaging, and mapping of underground environments, from transit tunnels, sewers and water pipes to ancient ruins, mines, and subterranean habitats. There are important civil engineering benefits that more precise sensing can deliver, particularly around large projects (e.g., nuclear power plants, high-speed rail, etc.) where existing subsurface surveys are extremely expensive, time-consuming, and often not as precise as necessary. Such space-based sensors can also be used to track minute gravitational changes and tectonic shifts that can forewarn of avalanches, earthquakes, volcanic eruptions, or tsunamis. The strength of Earth’s gravitational field also varies from place to place: variations in gravity are caused by factors such as the relative positions of mountains and ocean trenches and variations in the density of the Earth’s interior, but also by small fluctuations in underground water reservoirs or changes in ice mass, so gravimetry is an important new tool to help monitor climate change and its effects.
Quantum Key Distribution (QKD) and Secure Communication Networks
QKD is a secure communication method that uses the quantum properties of photons to distribute secret keys that can be shared by two parties to encode their communications. The technique is considered un-hackable since any attempt to eavesdrop disturbs the quantum states and is immediately detectable. Current forms of encryption, such as the widely used RSA public-key cryptosystem, rely on the difficulty of solving mathematical problems, whereas QKD relies on physical processes. In quantum physics, the “no-cloning” theorem states that it is impossible to create identical copies of an unknown quantum state. This prevents hackers from simply copying the quantum-encoded information. Another quantum property, known as the “observer effect,” causes quantum states to change upon observation; therefore, if anyone were to try to read the key in transit, it would change, and that change would be instantly known by the parties involved. (If interested in learning more about QKD please see here.)
QKD has already been successfully implemented via fiber optic cables, but only over short distances. Beyond 100 kilometers (about 60 miles) the signal degrades and beyond 300 kilometers the information transmission becomes prohibitively slow (i.e., only about one bit per second). In fact, the signal degradation increases exponentially as the distance increases. By using satellites in low-earth orbit (LEO) to send and receive transmissions via line-of-sight, this distance challenge can be largely overcome. LEO orbits can provide line-of-sight transmission between earth-based ground stations that are up to about 700 kilometers (about 430 miles) apart, although this limitation can be exceeded if the key can be stored in the satellite while it orbits or, preferably, by relaying the signal among connected satellites.
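To see why the fiber numbers above fall off so quickly, here is a tiny illustrative calculation, assuming a typical telecom-fiber attenuation of roughly 0.2 dB per kilometer (a common rule-of-thumb figure, not the spec of any particular QKD system):

```python
# Photon survival probability in optical fiber falls off exponentially with distance.
ATTENUATION_DB_PER_KM = 0.2  # assumed typical telecom-fiber loss

def transmission(distance_km: float) -> float:
    """Fraction of photons that survive the fiber."""
    return 10 ** (-ATTENUATION_DB_PER_KM * distance_km / 10)

for d in (50, 100, 300, 700):
    print(f"{d:>3} km -> {transmission(d):.1e}")
# 50 km  -> ~1e-01 (about 10% of photons arrive)
# 100 km -> ~1e-02
# 300 km -> ~1e-06
# 700 km -> ~1e-14 (effectively nothing, hence satellites or quantum repeaters)
```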
Naturally, un-hackable communication is a key objective of many governments as well as certain industrial firms, hence the large number of countries currently working on space-based QKD.
Time and Frequency Transfer
An overwhelming array of modern conveniences is reliant upon highly accurate clocks. [In fact, this is such a prevalent and important observation that my next post will be dedicated to the need for more precise time measurement.] Many electric power grids use clocks to fine-tune current flow. Telecom networks rely on GPS clocks to keep cell towers synchronized so calls can be passed between them. The finance sector uses clocks to timestamp ATM, credit card and high-frequency trades. Doppler radar, seismic monitoring and even multi-camera sequencing for film production all use highly precise clocks. Today’s earth-based atomic clocks are extremely accurate, and you can readily synchronize your computer to the atomic clock of your choice. However, relying on existing atomic clocks for timestamping, as is currently done for GPS satellites, is becoming increasingly challenging. GPS navigation is currently accurate to about three meters (about 10 feet), which presents challenges for uses such as autonomous driving.
In order to improve on existing timekeeping and related applications, we need both a more accurate clock and more precise dissemination and sharing of time. Quantum technologies can improve time accuracy by orders of magnitude, and placing them in space can enhance dissemination. Increased time accuracy will improve current communications and geolocation services as well as enable new applications, and a space-based quantum clock would enable long-range time transfer.
Fundamental Physics and Space Exploration
NASA’s Cold Atom Lab aboard the International Space Station (ISS) has used atom interferometry to create a new generation of exquisitely precise quantum sensors that scientists are using to explore the universe. Applications of these spaceborne quantum sensors include tests of general relativity, searches for dark energy and gravitational waves, spacecraft navigation and drag referencing, and gravity science, including planetary geodesy—the study of a planet’s shape, orientation, and gravity field.
In 2019, the first image of a supermassive black hole was created using Earth-based radio telescopes operating together as a single synthetic-aperture instrument. By precisely measuring the arrival times of radio waves at widely separated locations, an image of their source was reconstructed. Because visible light wavelengths are much shorter than radio waves (nanometers versus millimeters to meters), more sensitive detectors and clocks are required to apply this methodology to visible light, such as those now being placed in orbit. The resolution of such an image would match the resolution of a conventional telescope with an aperture equal to the distance between the two satellites. Such telescopes would be extremely sensitive, potentially enabling astronomers to study planets around other stars in vast detail.
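For a sense of scale, a diffraction-limited interferometer’s angular resolution is roughly the observing wavelength divided by the baseline between its receivers. The quick calculation below uses Event Horizon Telescope-like numbers for the radio case and an arbitrary visible wavelength for comparison; the figures are illustrative only.

```python
import math

def resolution_microarcsec(wavelength_m: float, baseline_m: float) -> float:
    """Approximate angular resolution (theta ~ wavelength / baseline) in microarcseconds."""
    return (wavelength_m / baseline_m) * math.degrees(1) * 3600 * 1e6

print(round(resolution_microarcsec(1.3e-3, 1.2e7), 1))  # 1.3 mm radio, Earth-scale baseline: ~22 uas
print(round(resolution_microarcsec(5.0e-7, 1.2e7), 4))  # 500 nm visible, same baseline: ~0.0086 uas
```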
Space-based quantum sensors will also be crucial for space exploration. As spacecraft venture further away from Earth, the ability to provide navigational instructions diminishes. Naturally, “GPS” is unavailable in deep space, and Earth-based control signals suffer increasing lag as spacecraft travel farther away. Additionally, if such Earth-based navigational commands are not precise enough, the target craft may miss its destination completely. Sensors that can measure a vehicle’s acceleration and rotation can enable navigation without requiring external commands. In addition, space-based quantum sensors are planned to help search for water and other resources on the Moon and Mars.
Conclusions
The pace of advances in quantum science is rapid and paradigm shifting. While Quantum Computing gets most of the headlines, quantum sensing and communication are also advancing rapidly, including via deployment in space. By placing powerful quantum devices into space, significant advances in earth observation, space exploration and secure communications are being achieved. Given the intense competitive nature of terrestrial quantum advances, extending this to a “space race” is inevitable and, in fact, is already underway. Readers should anticipate more and more headlines on this topic, and I look forward to providing periodic updates.
Disclosure: The author does not have any business relationship with any company mentioned in this post. The views expressed herein are solely the views of the author and are not necessarily the views of Corporate Fuel Partners or any of its affiliates. Views are not intended to provide, and should not be relied upon for, investment advice.
If you enjoyed this post, please visit my website, and enter your email to receive future posts and updates: http://quantumtech.blog
Russ Fein is a venture investor with deep interests in Quantum Computing (QC). For more of his thoughts about QC please visit the link to the left. For more information about his firm, please visit Corporate Fuel. Russ can be reached at russ@quantumtech.blog.
In December 2021, in an early iteration of this Blog, I described the various qubit modalities in use by some of the Quantum Computing (QC) hardware players. A lot has happened since that post, so I thought it would be constructive to revisit the topic.
When that earlier post was published (click here if interested in reviewing), it described 10 leading quantum hardware companies focusing on four core qubit types (superconducting, trapped ions, photonics and quantum dots). Today there are dozens of quantum hardware companies, a few additional common modalities (notably neutral atoms) and significant advances made across the spectrum.
Qubit Dynamics
While many articles describing and comparing QCs focus on the number of qubits, this core number belies the complexity in comparing actual QC performance due to additional limitations described below. Qubit count is the equivalent of only using horsepower to describe a car. While horsepower is an important metric, most car buyers are equally if not more focused on comfort, handling, fuel economy, styling, etc. Some effort has been made to “consolidate” these variables for QC into a single performance metric (such as Quantum Volume, CLOPS (circuit layer operations per second) or QED-C’s Benchmarks), although no single measurement has yet been adopted by the broad QC ecosystem. For the casual reader, I’d caution you to not focus too much on the number of qubits a given QC has. While “more is better” is generally a useful mantra, as you’ll see below, it is not that simple.
As you may know or recall, placing qubits in a superposition (both “0” and “1” at the same time) and entangling multiple qubits (so that the state of one depends on the state of another) are two fundamental quantum properties which help empower Quantum Computers and allow them to perform certain calculations that can’t easily be executed on traditional computers. Before we review the various types of qubits (i.e., quantum hardware platforms), it may be helpful to summarize some of the limitations faced when placing qubits in superposition and/or entangling multiple qubits, and to discuss the key metrics used to measure these properties.
Two-qubit Gate Error Rate: Entanglement is a core property of QCs and the two-qubit gate error rate is the second-most-often reported metric (after qubit count). An error rate of 1% is the equivalent of 99% gate fidelity. You may have come across the concept of a ‘CNOT gate’ or controlled-not gate, which simply takes two qubits and when the first (control qubit) is in a desired state, it flips the second (target qubit). While this sounds basic and simplistic, it is this correlating of the qubits that enables the exponential speedup of QCs. Said another way, it is a method for enabling QCs to analyze multiple pathways simultaneously, and so is truly a fundamental property being leveraged by QCs. Many in the industry suggest that 2Q fidelities exceeding 99.99% will be required to achieve quantum advantage and some modalities are approaching that (for example, IonQ has achieved 99.92%), but most are still considerably below that threshold.
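For readers who like to see code, here is a minimal illustrative example of the Hadamard-plus-CNOT pattern described above, written with Qiskit (assuming it is installed; any gate-based SDK would do the same job). The two qubits end up in a Bell state, so a measurement of one fully determines the other.

```python
# Minimal Bell-state sketch (illustrative; assumes Qiskit is installed)
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h(0)      # put the control qubit into superposition
qc.cx(0, 1)  # CNOT: flip the target only when the control is 1

print(Statevector.from_instruction(qc))
# ~0.707|00> + 0.707|11>  -- outcomes are perfectly correlated; |01> and |10> never appear
```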
Single qubit/Rotation Error Rate: Single qubit gates, also often referred to as “rotations,” adjust the qubits around various axes (i.e., the x-axis, y-axis, and z-axis). In classical computing, you may be familiar with a NOT gate, which essentially returns the opposite of whatever is read by the machine. So, a NOT applied to a 0 “flips” it to a 1. Similarly, in quantum computing, we have the X-gate, which rotates the qubit 180 degrees (around the X-axis) and so also takes a 0 and “flips” it to a 1. Given the exquisite control required to manipulate qubits, it is possible that the pulse instructing the qubit to “flip” may only apply 179 degrees of rotation instead of the required 180 and therefore introduce some error, especially if such imprecision impacts many qubits within an algorithm.
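To get a feel for how such a small miscalibration compounds, here is a quick illustrative calculation of the 179-degree example (the gate count is arbitrary, chosen only to show the scaling):

```python
import math

def flip_error(theta_deg: float) -> float:
    """Probability a qubit starting in |0> fails to reach |1> after an intended 180-degree flip."""
    theta = math.radians(theta_deg)
    return 1 - math.sin(theta / 2) ** 2

per_gate = flip_error(179)
print(f"{per_gate:.2e}")                    # ~7.6e-05 error per gate
print(f"{1 - (1 - per_gate) ** 1000:.3f}")  # ~0.073 chance of at least one error after 1,000 such gates
```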
Decoherence Time (T1 and T2): T1 (qubit lifetime) and T2 (qubit coherence time) are two related ways of answering the same practical question: how long do the qubits remain in a state useful for computation? Specifically, T1 measures a qubit’s lifetime, or for how long we can distinguish a 1 from a 0, while T2 is focused on phase coherence, a more subtle but also crucial aspect of qubit performance. Many early QC modalities such as superconducting have modest T2 times, capping out at around 100 microseconds (millionths of a second), whereas some recent entrants such as neutral atoms have achieved T2 as long as 10 seconds, and certain trapped ions have extended that to 50 seconds. This difference of many orders of magnitude in T2 is a key differentiator among qubit modalities.
Gate Speed: A metric that measures how quickly a QC can perform a given quantum gate. This is especially important relative to the decoherence time noted above, in that the QC must implement its gates BEFORE the system breaks down or decoheres. Gate speed will become increasingly important as a raw metric of time-to-solution, where microseconds add up. Interestingly, the modalities with relatively short T2 times (i.e., superconducting and photonic) generally have the fastest gate speeds (measured in nanoseconds, or billionths of a second).
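A crude but useful way to combine the last two metrics is to ask how many gates fit inside one coherence window. The numbers below are illustrative orders of magnitude drawn from the ranges discussed in this post, not vendor specifications:

```python
# Rough "gates per coherence window" comparison (illustrative orders of magnitude only)
examples = {
    "superconducting": {"t2_s": 100e-6, "gate_s": 50e-9},   # ~100 us T2, ~50 ns gates
    "trapped ion":     {"t2_s": 50.0,   "gate_s": 100e-6},  # ~50 s T2, ~100 us gates
    "neutral atom":    {"t2_s": 10.0,   "gate_s": 1e-6},    # ~10 s T2, ~1 us gates
}

for name, p in examples.items():
    print(f"{name:>16}: ~{p['t2_s'] / p['gate_s']:,.0f} gates per coherence window")
```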
Connectivity: Sometimes referred to as topology, this describes the layout of the qubits in a grid and how many neighboring qubits a given qubit can interact with. In many standard layouts, the qubits are lined up in rows and columns with each qubit able to connect to its four “nearest neighbors.” Other systems have “all-to-all” qubit connectivity, meaning every qubit is connected to every other one. If two qubits can’t directly interact with each other, “swaps” can be inserted to move the information around and enable virtual connections; however, this leads to added overhead, which translates into increased error rates.
SPAM (State Preparation and Measurement) Error Rate: At the start of any quantum algorithm, the user must first set the initial state, and at the end, that user must measure the result. SPAM error measures the likelihood of the system doing this correctly. A 1% SPAM error on a five-qubit system provides a very high likelihood that the results will be read correctly (0.99^5 ≈ 95%), but as the system scales, this becomes more problematic.
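The scaling issue is easy to see with a one-line calculation (per-qubit fidelity is assumed constant here, which is a simplification):

```python
# How a fixed per-qubit SPAM error compounds as qubit counts grow (illustrative)
def readout_success(per_qubit_fidelity: float, n_qubits: int) -> float:
    return per_qubit_fidelity ** n_qubits

for n in (5, 50, 500):
    print(f"{n:>3} qubits -> {readout_success(0.99, n):.1%}")
# 5 qubits   -> ~95.1%
# 50 qubits  -> ~60.5%
# 500 qubits -> ~0.7%
```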
Qubit Modalities
When the first classical computers were developed, several different transistor designs competed. Similarly, today there are many ways to create a qubit, with crucial performance trade-offs among them. The following is a brief overview of some of the more common types:
Superconducting Qubits: Some leading Quantum Computing firms, including Google and IBM, use superconducting transmons as qubits. At the core of each is a Josephson junction, which consists of a pair of superconducting metal strips separated by a gap of just one nanometer (less than the width of a DNA molecule). The superconducting state, achieved at near absolute-zero temperatures, allows a resistance-free oscillation back and forth around a circuit loop. A microwave resonator then excites the current into a superposition state, and the quantum effects are a result of how the electrons cross this gap. Superconducting qubits have been used for many years, so there is abundant experimental knowledge, and they appear to be quite scalable. However, the requirement to operate near absolute zero adds a layer of complexity and makes some of the measurement instrumentation difficult to engineer due to the low-temperature environment.
Trapped Ions: Another common qubit construct utilizes the electrical charge that certain elemental ions exhibit. Ions are normal atoms that have gained or lost electrons, thus acquiring an electrical charge. Such charged atoms can be held in place via electric fields, and the energy states of the outer electrons can be manipulated using lasers to excite or cool the target electron. These target electrons move or “leap” (the origin of the term “quantum leap”) between outer orbits as they absorb or emit single photons. These photons are measured using photo-multiplier tubes (PMTs) or charge-coupled device (CCD) cameras. Trapped ions are highly accurate and stable, although they are slow to react and need the coordinated control of many lasers.
Photonic Qubits: Photons have no mass or charge and therefore do not interact with each other, making them ideal candidates for quantum information processing. However, this same feature makes implementing two-qubit gates particularly challenging. Photons are manipulated using phase shifters and beam splitters and are sent through a maze of optical channels on a specially designed chip, where they are measured by their horizontal or vertical polarization.
Neutral Atoms: Sometimes referred to as “cold atoms”, these are built from an array of individual atoms that are trapped in a room-temperature vacuum and chilled to ultra-low temperatures using lasers as optical “tweezers”, which restrict the movement of the individual atoms and thereby cool them. These neutral atoms can be put into a highly excited state by firing laser pulses at them, which expands the radius of the outer electron (a Rydberg state) and can be used to entangle them with each other. In addition to strong connectivity, neutral atoms can implement multi-qubit gates involving more than two qubits, which is instrumental in several quantum algorithms (e.g., Grover search) and highly efficient for Toffoli (CCNOT) gates.
Semiconductor/Silicon Dots: A quantum dot is a nanoparticle created from a semiconductor material such as cadmium sulfide or germanium, but most often from silicon (due to the large amount of knowledge derived from decades of silicon chip manufacturing in the semiconductor industry). An artificial atom is created by confining an extra electron within the silicon, held in place using electrical fields. The spin of the electron is then controlled and measured via microwaves.
The following table highlights some of the features of these various qubit modalities, as of Oct. 2022:
There are a few other modalities, including NV Diamonds, Topological, Nuclear Magnetic Resonance (which seems more experimental and very difficult to scale) and Quantum Annealing (used by D-Wave, one of the first firms to offer commercial “quantum” computers, although annealing is not a true gate-based construct), and it is likely that more methodologies will be developed.
The following table summarizes some of the benefits and challenges along with select current proponents of key qubit technologies currently in use:
The table above is not intended to be all-inclusive. In fact, there is an excellent compendium of qubit technologies put out by Doug Finke’s Quantum Computing Report which can be accessed here (behind a paywall, but well worth the fee), and which includes over 150 different quantum hardware computing programs/efforts. A special thank-you also to David Shaw and his Fact Based Insight website which has covered this topic in great detail.
Conclusions
As noted in this post, there have been significant advancements in Quantum Computing hardware over the past year or so, and I expect this momentum to continue in 2023. Presently there are QCs with tens to hundreds of qubits, and the coherence, connectivity and control of these early machines continue to improve. In 2023 we should see machines with thousands of qubits (e.g., IBM is on pace to release its 433-qubit Osprey QC before year end and its 1,121-qubit Condor QC next year). Adding sophisticated control and advanced algorithm compilation further extends the capability of these early machines. Whether and when we can achieve universally recognized quantum advantage (i.e., these QCs performing operations that existing supercomputers cannot do) during this NISQ (noisy intermediate-scale quantum) era remains to be seen, but this author believes it will happen in the ’23-’24 timeframe and is excited to continue tracking (and reporting on) the progress.
Disclosure: The author has modest positions in some stocks discussed in this review but does not have any business relationship with any company mentioned in this post. The views expressed herein are solely the views of the author and are not necessarily the views of Corporate Fuel Partners or any of its affiliates. Views are not intended to provide, and should not be relied upon for, investment advice.
References:
Qubit images from Science, C. Bickel, December 2016, Science, V. Altounian, September 2018, New Journal of Physics, Lianghui, Yong, Zhengo-Wei, Guang-Can and Xingxiang, June 2010
Performance Tables and additional modality details from Fact Based Insight, Accessed October 2022
We all know what “noise” is. And we all appreciate that it is usually an unwelcome invasion of our peace and quiet. Screaming babies on airplanes, jackhammers in the street, leaf blowers outside your window – all can ruin an otherwise tranquil setting. “Noise” in computer lingo represents a similarly disconcerting situation. In Quantum Computing (QC), you have likely come across the concept of noise as a major obstacle to QCs achieving their potential. In fact, John Preskill, a professor of theoretical physics at Caltech and one of the pioneers of QC, coined the acronym “NISQ”, standing for Noisy Intermediate-Scale Quantum, to describe today’s QC stage. There are several significant challenges facing QC makers today, and “noise” is one of the most difficult to overcome.
Quantum Computing Noise
There are many causes for the underlying noise in QCs. In order to best visualize and understand this, here is a reminder of how qubits (the underlying components of QC processing, comprised of individual atoms, photons or electrons) store and manipulate information:
The graphic above depicts a few rotations of a qubit, with the blue arrows pointing to various points before and after a rotation (various rotations are implemented via gates representing algorithm commands) and the red arrow showing the axis of rotation. The ending position of the blue arrow contains important and precise information but can move incorrectly due to several noise factors. Here are a few of the core sources:
The Environment: Qubits are exquisitely sensitive to any changes in their environment. Small changes in temperature or stray electrical or magnetic fields can disturb qubits and cause a degradation of the information. Even weak galactic space radiation can perturb qubits and thereby degrade them.
Crosstalk: Quantum Computers are powered by qubits acting together. Generally, individual qubits are manipulated by lasers or microwaves. However, sometimes the laser or microwave signal impacts nearby qubits as well as the target qubit, an issue known as crosstalk.
Quantum Decoherence: A qubit’s quantum state deteriorates rapidly, often even after just fractions of a second, requiring QCs to initiate and complete their algorithms before quantum states collapse.
Implementation Errors: The commands or gates of a quantum algorithm apply various rotations to the qubit, implemented by laser or microwave pulses which can themselves be somewhat imprecise. For example, an X-Gate, which is analogous to a NOT gate in a classical computer, essentially “flips” the qubit by rotating it 180 degrees. If the pulse command only produces a 179-degree rotation, the subsequent calculations will be off by a potentially meaningful amount (a small numeric sketch of this compounding follows below).
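Here is a small numeric sketch of how that one-degree imperfection compounds over a deep circuit. It uses the standard single-qubit rotation matrix; the 179-degree figure is simply the example above.

```python
import numpy as np

# Sketch: how a tiny under-rotation compounds. An ideal X gate is a 180-degree
# rotation; here each "X" only rotates 179 degrees, and we track how far the
# state drifts from where an ideal circuit would leave it.
def rx(theta_deg: float) -> np.ndarray:
    t = np.deg2rad(theta_deg) / 2
    return np.array([[np.cos(t), -1j * np.sin(t)],
                     [-1j * np.sin(t), np.cos(t)]])

state = np.array([1.0 + 0j, 0.0 + 0j])          # start in |0>
ideal = state.copy()
for _ in range(100):                            # 100 consecutive "flips"
    state = rx(179.0) @ state                   # imperfect gate
    ideal = rx(180.0) @ ideal                   # ideal gate
overlap = abs(np.vdot(ideal, state)) ** 2       # fidelity with the ideal result
print(f"fidelity after 100 gates: {overlap:.3f}")  # ~0.41 -- far from 1.0
```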
You may be familiar with the term “five 9’s,” which is often used in the context of super-high performance. It generally means a system with 99.999% accuracy, or only one error per 100,000 instances. For service level agreements with, say, your cloud provider, five nines would mean less than 5.26 minutes of downtime per year. It’s a high standard, recognizing the reality that certain systems suffer from various unknown or unpredictable challenges. While Quantum Computer makers continue to improve the fidelities of their qubits (the underlying physical components which process quantum gates and algorithms), none have been able to achieve greater than 99.9% two-qubit gate fidelities. While that may sound high, and would likely have been an acceptable grade on your physics final, it is not enough to enable Quantum Computers to perform the complex algorithms necessary to outperform existing classical computers.
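A quick way to see why is to treat each two-qubit gate as an independent coin flip with the quoted fidelity. The rough sketch below compares “three nines” with “five nines”; it is a simplification that ignores error correction and mitigation.

```python
# Sketch: the probability a circuit runs error-free is roughly
# fidelity ** (number of two-qubit gates).
for fidelity in (0.999, 0.99999):            # "three nines" vs "five nines"
    for n_gates in (100, 1_000, 10_000):
        p_clean = fidelity ** n_gates
        print(f"fidelity {fidelity}: {n_gates:6,d} gates -> "
              f"{p_clean:.3g} probability of a clean run")
# At 99.9%, a 10,000-gate circuit almost never succeeds (~5e-5);
# at 99.999% it still succeeds ~90% of the time.
```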
The non-technical takeaway: Quantum Computations are run via qubits which are very difficult to control, are vulnerable to the tiniest environmental changes and have a natural tendency to move, leading to a degradation of the information.
Error correction is the single largest challenge facing QC advancement today, and there are many ways that companies are addressing this issue.
How to Overcome Noise Constraints in Quantum Computing
In the 19th century, ships typically carried clocks set to Greenwich time, which, combined with the sun’s position in the sky, allowed them to determine longitude during long voyages. However, an incorrect clock could lead to dangerous navigational errors, so ships often carried three clocks. Two clocks showing differing times would detect a fault, but three were needed to identify which one was faulty (if two matched, the third one was off). This is an example of a repetition code, where information is encoded redundantly in multiple devices, enabling detection and correction of a fault. In QCs, because measurement fundamentally disturbs quantum information, we can’t do interim measurements on the data qubits to identify errors, because that would terminate the process. Instead, data is shared among multiple qubits, often referred to as ‘ancillary’ qubits, ‘syndrome’ qubits or ‘helper’ qubits. A series of gates entangles these helper qubits with the original qubits, which effectively transfers noise from the system to the helpers. We can then measure the helpers via parity checks, which, like those redundant clocks, can reveal errors without touching or measuring the original system. However, the trade-off is the requirement for many physical qubits to act as helpers, adding enormous overhead to QCs.
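To make the repetition-code idea concrete, here is a purely classical toy sketch (it simulates bit-flips on ordinary bits, not real qubits, and all names are illustrative): one logical bit is encoded into three physical bits, and two parity checks locate a single fault, just as the third clock did.

```python
import random

# Classical sketch of a repetition code: encode one logical bit into three
# physical bits, let noise flip each bit with some probability, then use two
# parity checks ("which pairs disagree?") to locate and correct a single fault.
def encode(bit: int) -> list[int]:
    return [bit, bit, bit]

def noisy(bits: list[int], p_flip: float) -> list[int]:
    return [b ^ (random.random() < p_flip) for b in bits]

def correct(bits: list[int]) -> int:
    s1 = bits[0] ^ bits[1]            # parity check: do bits 0 and 1 agree?
    s2 = bits[1] ^ bits[2]            # parity check: do bits 1 and 2 agree?
    if s1 and not s2:
        bits[0] ^= 1                  # syndrome points at bit 0
    elif s1 and s2:
        bits[1] ^= 1                  # syndrome points at bit 1
    elif s2:
        bits[2] ^= 1                  # syndrome points at bit 2
    return bits[0]                    # decoded logical bit

trials = 100_000
p = 0.05                              # 5% chance each physical bit flips
failures = sum(correct(noisy(encode(1), p)) != 1 for _ in range(trials))
print(f"logical error rate: {failures / trials:.4f} vs physical {p}")
# ~0.007 logical vs 0.05 physical: redundancy beats down the error rate
```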
Also, since each step of a quantum algorithm is an opportunity for noise to be introduced, efforts to quicken the runtime or reduce the number of steps (i.e., gates) are intended to minimize the opportunity for noise to corrupt the output. In addition to repetition code methods of finding and correcting errors and overall efforts to minimize circuit depth, there are a few other tools being used to tackle quantum noise. A high-level view of the quantum computing software “stack” should help provide some context for these added methods:
The graphic above is generally referred to as the “full stack” and there are opportunities at each level of the stack to help compensate for or minimize noise. Here are a few methods being deployed:
Quantum Control: At the qubit level, often referred to as the “metal,” engineers continue to optimize the pulses and control signals directed at the qubits, as well as to create modalities with longer coherence times. The ways the qubits are arranged and inter-connected also affect this level, and advances are announced continually.
Hardware Aware Optimization: At the Machine Instruction level, focus on transpiler efficiencies can reduce errors and minimize noise impacts. Again, various qubit configurations as well as the specific modalities utilized (superconducting vs optical vs ion vs cold atom, etc.) have an impact on the performance of the algorithms and attention to this level of the stack provides another opportunity for noise reduction.
Compiler Efficiency: Circuit optimization is a target of many players in the QC space. Tools that re-write algorithms to optimize this level of the stack are a growing and important part of the ecosystem. For example, efficient usage of ancillary qubits, and/or resetting them quickly so they can be re-utilized, requires less run-time and fewer steps, which means less opportunity for noise to impact the programs.
Algorithm efficiency: There are many ways to write quantum algorithms so ensuring that the code is as efficient as possible (i.e., eliminating redundant steps or minimizing needs to reset or recalibrate qubits) is another opportunity to minimize noise. The more efficient the code, the quicker it can run, or the shorter its circuit depth needs to be.
Many Shots: A final tool, which is standard procedure in quantum algorithms, is to run the algorithm many times. Each run is referred to as a “shot,” and typical algorithms are run with thousands of shots. By averaging the output of these shots, various noise impacts tend to cancel each other out (a simple sketch follows below). [The fact that quantum algorithms are probabilistic rather than deterministic is the main reason for the repeated shots, but this redundancy is also a tool to help overcome noise.]
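As a simple illustration of shot averaging (with made-up numbers, not any particular hardware):

```python
import random

# Sketch of the "many shots" idea: a measurement that should return 1 with
# probability 0.7 is repeated many times; averaging the shots recovers the
# underlying probability far better than any single run.
true_p = 0.7
for shots in (10, 100, 10_000):
    results = [1 if random.random() < true_p else 0 for _ in range(shots)]
    estimate = sum(results) / shots
    print(f"{shots:6d} shots -> estimated p = {estimate:.3f} (true {true_p})")
```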
The non-technical takeaway: Noise is a major problem impacting the ability of Quantum Computers to achieve their potential. Until fault-tolerant hardware can be developed, quantum engineers are deploying several creative ways to overcome noise in current QCs.
Quantum Companies Addressing Quantum Noise
There are a number of players focused on noise reduction, deploying inventive solutions to optimize the performance of today’s quantum machines. Some of these methodologies can achieve performance improvements of orders of magnitude. As the quantum hardware players release ever-larger quantum machines (for example, IBM has announced it will release a machine with more than 1,000 qubits next year), these error-correcting strategies will greatly accelerate the ability of QCs to achieve quantum advantage, with many prognosticators (including yours truly) expecting such an achievement sometime next year (at least for certain types of problems). The following is a brief overview of some of the players that offer quantum noise-reduction solutions:
Classiq: Their flexible and powerful platform automatically creates optimized, hardware-aware circuits from high-level functional models. It automates and simplifies the difficult process of creating quantum algorithms.
Parity QC: Develops blueprints for quantum computers based on their ParityQC architecture, creating machines that are scalable because the control complexity is radically reduced. This allows them to provide a fully programmable, parallelizable (no SWAP gates), and scalable architecture which can be built with greatly reduced complexity, along with a quantum optimization architecture that is independent of the problem. Because it can parallelize gates, the ParityQC architecture supports algorithms based on global gates: in each step, a pattern of gates is executed at the same time. This removes the need to implement a control signal for each individual gate, requiring only ONE control signal for all gates instead, a huge advantage for the hardware design and a route to mitigating crosstalk errors.
Q-CTRL: Their quantum control infrastructure software for R&D professionals and quantum computing end users delivers the highest-performance error-correcting and error-suppressing techniques globally, and provides a unique capability for accelerating the pathway to the first useful quantum computers. This foundational technology also applies to a new generation of quantum sensors and enables Q-CTRL to shape and underpin every application of quantum technology.
Riverlane: A quantum software provider with a full-stack focus, aiming to squeeze out every bit of efficiency. Their Deltaflow.OS® operating system is compatible with all current quantum hardware platforms, including both gate-based and annealing methods, and Riverlane works in conjunction with hardware partners to optimize the design of their architecture for error correction.
Super.tech: This successful member of the first cohort of the Chicago Quantum Exchange/ UChicago Duality Accelerator was acquired by ColdQuanta earlier this year. Their SuperstaQ quantum software platform is optimized across the entire quantum stack enabling 2x reductions in error on typical quantum programs. SuperstaQ includes a library of sophisticated error mitigation techniques, including dynamical decoupling, excited state promotion, and zero noise extrapolation. SuperstaQ automatically optimizes quantum programs based on the target hardware’s pulse-level native gates.
Xanadu: Their PennyLane software is a leading cross-platform Python library for quantum differentiable programming, enabling seamless integration with machine learning tools. PennyLane also supports a comprehensive set of features, simulators, hardware, and community-led resources that enable users of all levels to easily build, optimize and deploy quantum-classical applications.
Zapata: A quantum computing software company that develops solutions for a wide range of industries. Zapata’s Orquestra™ platform allows users to compose quantum-enabled workflows and orchestrate their execution across classical and quantum technologies. Orquestra combines a powerful software platform, quantum algorithm libraries, and example workflows across machine learning, simulation and optimization. Orquestra automatically scales up and exploits task parallelization opportunities to run quantum algorithms faster.
Disclosure: The author has no beneficial positions in stocks discussed in this review, nor does he have any business relationship with any company mentioned in this post. The views expressed herein are solely the views of the author and are not necessarily the views of Corporate Fuel Partners or any of its affiliates. Views are not intended to provide, and should not be relied upon for, investment advice.
Graphic from Dunning, Alexander & Gregory, Rachel & Bateman, James & Cooper, Nathan & Himsworth, Matthew & Jones, Jonathan & Freegarde, Tim. Composite pulses for interferometry in a thermal cold atom cloud. Physical Review A. 90. 033608. 10.1103/PhysRevA.90.033608. (2014).
If you enjoyed this post, please visit my website, and enter your email to receive future posts and updates: http://quantumtech.blog
Russ Fein is a private equity/venture investor with deep interests in Quantum Computing (QC). For more of his thoughts about QC please visit the link to the left. For more information about his firm, please visit Corporate Fuel. Russ can be reached at russ@quantumtech.blog.
There have been an increasing number of articles describing a coming “Quantum Winter”. While I am still extremely bullish on the sector and do not believe the industry will suffer a full abandonment by investors, the blunt reality is that the investment winds are shifting and will require a more sober view on quantum companies in the near-term.
Here are two graphics to help frame this situation. The first depicts recent movements in the public markets, and the second traces key venture investments. Specifically, the table below highlights the decline in stock price of four publicly traded quantum computing companies: ATOS (European-based broad information technology), IONQ (trapped ion quantum computers), Quantum Computing Inc. (quantum software provider) and Rigetti (superconducting full-stack quantum computers). Together, these companies are off 75% from their recent highs, whereas the broader NASDAQ index is down 30%.
So, while the overall market is suffering a broad decline, including the tech-heavy NASDAQ, this bucket of quantum stocks is down more than double that amount. It is somewhat encouraging that these firms were able to go public recently, but their poor stock performance will make it increasingly difficult for other early-stage quantum companies to follow suit. These four firms are a small sample of the overall quantum industry and the chart is not market-weighted, so this isn’t a statistically clean analysis. The undeniable conclusion, however, is that investors in publicly traded quantum stocks are looking at a very steep hill (as are employees in these companies granted stock options at anything close to the IPO prices), and private quantum companies considering the public markets as a way to raise operating capital will likely need to wait at least a few quarters, if not longer, before they can consider an IPO.
As for the private sector, venture funding of quantum companies had a break-out year in 2021 with nearly $1.5 billion invested in the top 20 funded quantum businesses. And while 2022 had started out strong, we’ve seen a significant decline in funding in the recently ended quarter, as highlighted in the last column below.
Source: PitchBook (excludes grants and debt financing)
A few additional observations:
The largest equity rounds were for firms creating quantum hardware. The bar to entry for others working on various qubit modalities is now exceptionally high. This is not to say others won’t be added to the list, but the days of seed-funded quantum hardware companies are likely over; major institutional support will now be required.
Venture-led boards are beginning to urge an increase from 24 months of operating capital to 36 months, to ensure adequate runway. This will necessitate lower spending by portfolio companies, which will translate into longer milestone timelines.
Given the overall market malaise and the recent pull-back by venture investors generally, new QC rounds will become more challenging, and down-rounds are likely. Down rounds have lingering negative effects on capital markets, so this will undoubtedly cause some heartburn in the industry.
Given the existing dearth of talent in the quantum information industry, combined with rationalized firm valuations and the need to preserve capital, I expect we’ll see increasing M&A activity.
It’s well known that markets move in cycles, so difficult fundraising environments are to be expected. That said, I’m still extremely bullish on the space in general, especially taking a 5-10 year view which is the time range most often cited for achievement of consistent quantum advantage.
My general take-away from this analysis is that valuations for quantum companies will become rationalized in the next few quarters, providing an attractive investment window. In addition, while quantum hardware companies have taken much of the spotlight, there are many other players in the quantum ecosystem that will benefit from broader industry adoption, particularly those involved with the “picks and shovels” of QC such as cryogenics, lasers, optics, controllers, vacuums, etc., and certainly for the software providers, especially those agnostic to the form of hardware used. Quantum sensing and communications are also appropriate focus areas.
In summary, I’m not a believer in a full-on quantum winter, but we are in for some near-term challenges and disruption in the Quantum Computing arena. Tighter budgets, more difficult access to funding, and laser-focus on milestone achievement will be the norm. Of course, the evaporating liquidity will make milestone achievement that much more difficult, so there is likely to be some negative feedback loop effect as well. However, in some sense, this will be positive long-term in that “survival of the fittest” will winnow away some of the marginal players. I predict and expect the industry will come away stronger and I look forward to the eventual “Quantum Spring”.
Disclosure: The author maintains personal long positions in certain companies mentioned herein but does not have any business relationship with any company referenced in this post. The views expressed herein are solely the views of the author and are not necessarily the views of Corporate Fuel Partners or any of its affiliates. Views are not intended to provide, and should not be relied upon for, investment advice.
If you enjoyed this post, please visit my website and enter your email to receive future posts and updates: http://quantumtech.blog
Russ Fein is a venture investor with deep interests in Quantum Computing (QC). For more of his thoughts about QC please visit the link to the left. For more information about his firm, please visit Corporate Fuel. Russ can be reached at russ@quantumtech.blog.
When I created this blog, my stated purpose was to follow Quantum Computing (QC) from the perspective of an investor. To date, I have generally posted blogs that either covered technical aspects of QC (e.g., this post explaining superposition and entanglement), or showcased the companies involved in commercializing QC (e.g., this post on the evolving ecosystem). However, I hope you’ll indulge me a bit for this latest post, which approaches QC from a philosophical perspective. It’s an aspect of this field that originally gripped my attention and which underlies much of why quantum mechanics conjures such non-intuitive conclusions. Here are a few concepts that will be covered, each of which likely induces head-scratching:
Wave/Particle Duality
Matter/Energy Equivalence
Superposition and Entanglement
The Observer Effect
The Uncertainty Principle
“Imaginary” Numbers
As many of you may already know, a core feature of quantum mechanics concerns the “duality” between particles and waves. Certain aspects also deal with the interchange of matter and energy (you are likely already familiar with Einstein’s E = mc² equation, which famously and simply showed the equivalence between matter and energy). These somewhat non-intuitive principles underpin some fascinating philosophical questions regarding QC. That said, I am approaching this as a lay person, so I will not debate any of the theological roots or delve deeply into the underlying physics. However, I hope you will enjoy this mental exercise and that it will spur your curiosity to dig deeper yourself.
The Quantum Computing “Chicken-and-Egg” Quandary
If you search for resources about the origin of Quantum Computing, you will invariably come across a quote by Richard Feynman, generally cited as the father of QC. In 1981, Feynman said:
“Nature isn’t classical, dammit, and if you want to make a simulation of nature, you better make it quantum mechanical…”
Most current descriptions of how QCs work approach it from the qubit perspective: how to harness the quantum mechanical features of the underlying qubit, be it an atom, electron or photon, to create a new computing paradigm in which we use machines to solve problems or equations that current classical computers would take too long to solve. While this is truly fascinating, and I am confident it will unlock massive opportunities (and value), it is a bit “backwards” from what Feynman was suggesting. His premise was focused on “simulating nature,” and since nature is governed by quantum physics, he was suggesting we need to use quantum physics to better understand nature. It is expected that as QCs become larger and more powerful, we will be able to simulate nature to create better batteries, fertilizers, and medicines, among other things. But QCs will also enable us to answer questions we’ve never thought to ask and which would essentially be gibberish to classical computing processes.
The metaphysics of this concept revolves around using QCs to create better QCs. As we work to scale existing QCs which currently contain tens or hundreds of qubits, an obvious early question is “how do we build better and larger configurations of qubits?” As industry drives towards 1,000,000-qubit machines, it seems obvious (at least to me) that it will take QCs to optimize the configurations of these larger QCs. What is the upper limit of the capabilities such a self-supporting loop can create? This 1,000,000-qubit goal assumes “noisy” qubits, so it is thought that we need 1 million qubits to net-out to 100 logical qubits, and much has been written about the awesome power of 100 logical qubits…but why stop there? What if we had 1,000 or even 1,000,000 logical qubits? The power of such a machine would, essentially, be so massive as to be indescribable.
More on Wave-Particle Duality
Quantum computers derive their power from quantum mechanics, which is the study of the physical properties of nature at the scale of atoms, photons and subatomic particles. Many of the fundamental properties of quantum mechanics revolve around the behaviors of these particles, which exhibit characteristics of both particles and waves. Intuitively, we understand the particle behavior that guides the path of a baseball or the motion of a billiard ball. Similarly, we are familiar with waves and how they can sometimes cancel or enhance each other. However, when particles exhibit both properties simultaneously, non-intuitive things happen, such as superposition and entanglement. While non-intuitive, these features are well proven experimentally and can be explained and predicted using established mathematics, so we must wrestle with the fact that something so non-intuitive is occurring at the smallest scales. Conversely, I have yet to find a satisfactory explanation or formula to describe “the observer effect”. For those of you not familiar with this feature of quantum mechanics, it essentially says the act of measuring something (i.e., observing it) actually changes it. An example of how this manifests in Quantum Computing can be seen if we apply two sequential Hadamard gates. Skipping over the linear algebra and matrix multiplication, just know that if you input a |0〉 to two sequential Hadamard gates, |0〉 is output 100% of the time (i.e., the pair is mathematically equivalent to the identity matrix). However, if you measure the qubit between the two Hadamard gates, the final output becomes |0〉 half of the time and |1〉 the other half of the time. The mere act of “observing” the qubit between gates changes the outcome! How does the qubit know it is being observed?
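For readers who like to see it, here is a small numerical sketch of that double-Hadamard example; the “measurement” in the middle is simulated by randomly collapsing the state according to the Born rule.

```python
import numpy as np

# Without a mid-circuit measurement, H followed by H returns |0> with
# certainty; simulating a measurement between the two gates collapses the
# state and the final outcome becomes 50/50.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# Case 1: H then H, no measurement in between
final = H @ (H @ ket0)
print("P(|0>) without mid-measurement:", abs(final[0]) ** 2)   # 1.0

# Case 2: measure between the gates (simulated collapse), repeated many times
rng = np.random.default_rng(0)
zeros = 0
for _ in range(10_000):
    mid = H @ ket0                                   # equal superposition
    collapsed = ket0 if rng.random() < abs(mid[0]) ** 2 else ket1
    final = H @ collapsed
    zeros += rng.random() < abs(final[0]) ** 2       # sample final measurement
print("P(|0>) with mid-measurement:", zeros / 10_000)          # ~0.5
```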
The Y-Gate and “Imaginary” Numbers
Nearly any “Intro to Quantum Mechanics” course, book or article, will mention the Stern-Gerlach experiment as one of the first topics. It’s a fascinating subject that is well covered elsewhere, so I won’t provide much detail here (if interested in learning more, the Wikipedia post on the subject is a great intro and a link is included in the References at the end of this post). The Stern–Gerlach experiment involves sending a beam of silver atoms through a magnetic field and observing the deflection. The results show that particles possess an angular momentum that is similar to the angular momentum of a classically spinning object, but that it has only certain quantized values. Another important result is that only one component of a particle’s spin can be measured at one time, meaning that the measurement of the spin along the z-axis destroys information about a particle’s spin along the x and y axes.
Now, if you’ll bear with me a bit as I reference linear algebra (don’t worry, you don’t need to understand linear algebra to appreciate this point), I want to highlight a very metaphysical aspect of this concept. You’ll note below the matrix notation for two essential “gates,” or basic QC functions. The first is known as the “X-Gate,” which is analogous to the “NOT” gate in classical computing. If you apply a NOT gate in classical computing, it switches a 1 to a 0 or a 0 to a 1. In Quantum Computing the X-Gate essentially flips the qubit on its head, also switching a |1〉 to a |0〉 or a |0〉 to a |1〉. This is straightforward, requiring only the most basic familiarity with matrix multiplication to prove. However, the “Y-Gate” is quite different. The Y-Gate essentially turns the qubit on its side, and its matrix representation is suddenly quite foreign. The matrix representations of these two gates are shown below:
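For reference (the original graphic is not reproduced in this text), the standard matrix forms of these two gates are:

X = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \qquad Y = \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix}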
You will note for the Y-Gate the introduction of “i” (and -i), the symbol for the unfortunately named “imaginary” number. “i” is mathematically defined as a solution to x² + 1 = 0. Although no “real” number can solve this equation, i can still be used in many mathematical functions, and it would likely be more fitting to call these numbers “complex” rather than imaginary. Mathematicians would describe “i” as “lateral” or “perpendicular” to the line where the “real” numbers lie. Calling these numbers “Real” versus “Imaginary” suggests the imaginary ones are surreal or mystical, and while that is itself a metaphysical notion, the more striking point is that the information contained in a qubit is quite different when oriented along the X-axis versus perpendicularly along the Y-axis. Again, for those familiar with linear algebra, this is rudimentary matrix multiplication, and for those studying quantum physics, it is one of the first topics covered, proven by the Stern-Gerlach experiments back in the 1920s. The takeaway for this post is that the same quantum “thing,” oriented in one direction, contains different information than when it is oriented in a perpendicular manner.
Back to the Beginning
As in the beginning of time. That tiny fraction of an instant before the Big Bang. It is generally believed that our current universe was preceded by a reality where everything (all energy and matter) was confined to an infinitesimally small point. For reasons still largely unexplained, this super-concentrated point exploded and expanded into what is now the observable universe. From apparent nothingness came a stupendously large amount of space, time, energy and matter. Have you ever considered why that happened? Surely many of you studied this as you learned about your religion, and largely consider it from a spiritual perspective. But “something” led to the conversion of the pre-universe composition into the current universe comprised of matter and energy. What force led some aspects of the original pinpoint to manifest as matter and some to manifest as energy? Why isn’t it all “energy” or all “matter”? I like to believe that “quantum” was the driving force even at this time-zero. Let me explain.
Most introductory texts on quantum mechanics refer to the “uncertainty” principle. Heisenberg framed it as never quite knowing both the speed and position of a particle, and it also leads to QC calculations being probabilistic rather than deterministic. This is the concept Einstein was referring to in his famous “God doesn’t play dice…” quote. Imagine for a moment that the original laws governing the Big Bang were completely deterministic. In that case, it seems likely to me that the universe would not today be made of various “stuff” but would rather be all of one thing. However, nothing interesting can be built from just one component, and certainly nothing organic. So, this propensity for uncertainty may have led to the creation of energy and matter in varying configurations, which spurred a universe made of a dizzying array of particles, forces, stars, planets, black holes and the other wonders of nature. It’s this “quantum-ness” that allows for variability, and it’s the variability that creates differing “things”.
Surfing Across Dimensions
This has just been a sampling of some of the head-scratching aspects of quantum mechanics and is intended to spur questions to contemplate rather than provide answers. The mathematics that helps explain quantum mechanics also governs the addition (or subtraction) of spatial dimensions, which likewise challenges our current world view. Perhaps some of the remaining unanswered questions in quantum can be explained by actions or forces in dimensions we cannot see? Perhaps someone will come up with a “grand unified theory” explaining how the strong, weak, and electromagnetic forces work and interact, and how they relate to gravity, and perhaps that will help us understand these questions from an intuitive perspective.
In any case, despite the challenging mathematics, the non-intuitiveness of certain features, and the inability to definitively tie together all the disparate features of matter and energy, Quantum Computers continue to scale and to successfully run algorithms. As these devices become more powerful, perhaps they will help uncover some of these mysteries. In the meantime, I hope this post helps stimulate your wonder, and that you dig in deeper to learn and understand more. I welcome your feedback and ponderings and you can reach me at russ@quantumtech.blog.
Disclosure: The views expressed herein are solely the views of the author and are not necessarily the views of Corporate Fuel Partners or any of its affiliates.
If you enjoyed this post, please visit my website and enter your email to receive future posts and updates: http://quantumtech.blog
Russ Fein is a private equity/venture investor with deep interests in Quantum Computing (QC). For more of his thoughts about QC please visit the link to the left. For more information about his firm, please visit Corporate Fuel. Russ can be reached at russ@quantumtech.blog.