Quantum computing has been the technology sector's most persistently overhyped frontier for nearly three decades. The promise is genuine: quantum computers, operating on the principles of superposition and entanglement, could in principle solve certain classes of problems exponentially faster than any classical computer. The most important of these problems — simulating the quantum behaviour of molecular systems, optimising complex logistical networks, breaking widely used cryptographic schemes — would represent transformative capabilities with hundreds of billions of dollars of commercial value. The hype cycle has accordingly been enormous.
But a pattern has emerged in quantum computing commentary: enthusiasm outpaces capability, demonstrations are misrepresented as commercial milestones, and investors struggle to distinguish genuine technical progress from marketing. A quantum computer that performs well on a synthetic benchmark may be completely unable to perform any commercially useful computation. A company that announces a new qubit count may be making a device less useful, not more, if the additional qubits reduce overall system coherence. Understanding what is actually happening in quantum computing in 2024 requires engagement with the technical substance — the gate fidelities, the qubit coherence times, the error correction overhead, the architectural choices — that most investment commentary fails to provide.
This article attempts to provide that engagement. It examines the real technical progress made by the leading quantum computing companies — IonQ, Quantinuum, and PsiQuantum — in 2024, what these milestones actually mean, and how investors should think about the gap between current devices and the fault-tolerant quantum computers that would deliver commercial value at scale.
The Fundamental Technical Divide: NISQ vs Fault-Tolerant
To understand quantum computing progress in 2024, it is essential first to understand the divide between two fundamentally different classes of quantum computing device: noisy intermediate-scale quantum (NISQ) computers and fault-tolerant quantum computers.
NISQ devices are the quantum computers that exist today. They have qubit counts ranging from tens to hundreds of physical qubits, gate error rates typically in the range of 0.1 percent to 1 percent per two-qubit gate, and coherence times that limit circuit depth — the number of sequential gate operations — before errors accumulate to the point where the computation becomes meaningless. The "noisy" in NISQ is not a minor qualification: it describes a fundamental limitation. Every physical qubit interacts with its environment, and these interactions introduce errors at a rate that grows with circuit depth. Above a certain circuit depth, the output of a NISQ device is dominated by accumulated errors rather than meaningful computation.
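The way errors compound with circuit depth can be made concrete with a toy model. Assuming each gate succeeds independently with probability (1 − error rate) — a simplification that ignores error correlations and coherence-time limits — the probability that a circuit runs error-free decays exponentially with depth. The specific numbers below are illustrative, not any vendor's hardware figures:

```python
# Toy model of NISQ error accumulation: if each gate succeeds with
# probability (1 - error_rate), a circuit of `depth` sequential gates
# runs error-free with probability roughly (1 - error_rate) ** depth.

def circuit_success_probability(error_rate: float, depth: int) -> float:
    """Approximate probability that a circuit of the given depth runs error-free."""
    return (1.0 - error_rate) ** depth

# At a 0.5% per-gate error rate (mid-range for NISQ two-qubit gates),
# success probability falls below 50% after roughly 138 gates.
for depth in (10, 100, 200, 500):
    p = circuit_success_probability(0.005, depth)
    print(f"depth {depth:4d}: success probability {p:.3f}")
```

This is why "noisy" is not a minor qualification: halving the gate error rate roughly doubles the usable circuit depth, which is a far larger gain than adding qubits to a device whose circuits already fail before finishing.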
Fault-tolerant quantum computers are the devices that would deliver transformative commercial capability. In a fault-tolerant architecture, physical qubits are combined into logical qubits using quantum error correction codes — most prominently, the surface code — that can detect and correct errors without measuring the quantum state directly. The overhead is substantial: current theoretical estimates suggest that a single logical qubit with sufficiently low logical error rate requires between 1,000 and 10,000 physical qubits, depending on the physical error rate of the underlying hardware. A fault-tolerant quantum computer capable of simulating drug-relevant molecular systems at commercially meaningful scale might require millions of physical qubits. No such device currently exists.
This divide is not a reason for pessimism about quantum computing's commercial prospects. It is simply a statement of where we are in the development trajectory. Understanding this divide is essential to interpreting what the progress milestones of 2024 actually mean.
IonQ: Trapped-Ion Scaling and Algorithmic Qubits
IonQ, the first pure-play quantum computing company to complete a public listing (via SPAC in October 2021), is the leading public-market representative of the trapped-ion quantum computing approach. IonQ became publicly traded at a valuation of approximately $2 billion — a significant marker for the sector that validated investor appetite for pure-play quantum exposure before any company had achieved commercial revenue at scale.
The trapped-ion approach traps individual ytterbium ions using electromagnetic fields and manipulates their quantum states using precisely calibrated laser pulses. The fundamental advantage of trapped-ion qubits is their quality: because the qubits are identical atomic systems rather than manufactured devices, they have inherently uniform properties and achieve higher gate fidelities than any other physical qubit modality. IonQ's reported two-qubit gate fidelities are typically above 99.5 percent — among the highest of any commercial quantum computing platform.
IonQ introduced the concept of "algorithmic qubits" (#AQ) as a metric for the effective computational power of its systems — a composite measure that accounts for both qubit count and gate fidelity to estimate how complex an algorithm the device can reliably execute. This framing is useful precisely because it cuts through the raw qubit count comparison that dominated earlier quantum computing marketing. A system with 1,000 qubits and 95 percent gate fidelity may be less computationally useful than a system with 30 qubits and 99.9 percent gate fidelity, because the accumulated errors in the former system may dominate any computation of meaningful depth.
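The comparison in the paragraph above can be made quantitative with a rough sketch. Using the same exponential-decay model of gate errors, we can estimate the deepest circuit each hypothetical system can run before its success probability falls below some cutoff (the 2/3 cutoff here is an arbitrary illustrative choice, not part of IonQ's #AQ definition, which is a more involved benchmark):

```python
import math

def max_useful_depth(gate_fidelity: float, min_success: float = 2 / 3) -> int:
    """Largest number of sequential gates for which
    gate_fidelity ** depth stays at or above min_success."""
    return int(math.log(min_success) / math.log(gate_fidelity))

# The two hypothetical systems from the text:
print(max_useful_depth(0.95))    # 1,000 qubits at 95% fidelity -> 7 gates
print(max_useful_depth(0.999))   # 30 qubits at 99.9% fidelity -> 405 gates
```

The low-fidelity system exhausts its error budget after only a handful of gates, while the high-fidelity system can run circuits hundreds of gates deep — which is the intuition behind weighting fidelity, not just qubit count, in any measure of effective computational power.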
In 2024, IonQ continued advancing its hardware roadmap with the Forte system, targeting 35 algorithmic qubits, and announced partnerships with Hyundai for quantum machine learning applications in automotive materials research and with Airbus for quantum optimisation in aircraft loading and routing. These partnerships represent genuine commercial engagement — customers paying for access to quantum computing services and exploring whether quantum algorithms can deliver advantage on real industrial problems — rather than speculative research relationships. Whether quantum advantage is demonstrable on these specific problems with current hardware remains uncertain; but the existence of paying customers doing real work on quantum platforms is a meaningful commercial milestone.
IonQ's longer-term roadmap targets 64 algorithmic qubits by 2025 and the introduction of quantum error correction capabilities in subsequent years. The trapped-ion approach has an inherent constraint on scaling: as the number of trapped ions increases, the laser addressing and control systems become more complex, and the time required for gate operations increases because the shared motional modes used for two-qubit gates must be carefully managed across larger ion chains. IonQ is addressing this constraint through modular architectures and photonic interconnects that would allow multiple smaller ion traps to be connected into larger effective systems.
Quantinuum: The Highest Gate Fidelities and a Path to Error Correction
Quantinuum, formed through the merger of Honeywell Quantum Solutions and Cambridge Quantum Computing in 2021 and valued at approximately $5 billion after a $300 million funding round led by JPMorgan Chase announced in early 2024, has arguably made the most significant technical progress in commercial quantum computing of any company in 2024.

The company's H-series trapped-ion systems use ytterbium ions, like IonQ, but with a distinctive architecture: individual ions are transported between different zones of the ion trap using electric fields, allowing operations to be performed on selected pairs of qubits with exceptional precision. This "quantum charge-coupled device" (QCCD) architecture achieves the highest two-qubit gate fidelities of any commercial quantum computing platform — the H2 system has demonstrated two-qubit gate fidelities above 99.8 percent, and the company has reported certain benchmark gate fidelities exceeding 99.9 percent.
The significance of gate fidelity cannot be overstated for the path to fault tolerance. The surface code, the leading quantum error correction approach, has a threshold error rate: if physical qubit error rates are below approximately 1 percent per operation, the code can in principle correct errors and achieve arbitrarily low logical error rates by increasing the code distance. If physical error rates are above this threshold, error correction makes things worse rather than better. Quantinuum's physical gate fidelities — above 99.8 percent, or below 0.2 percent error per operation — represent a comfortable margin below the surface code threshold, which means that the company's hardware is, in principle, ready for quantum error correction today.
In 2024, Quantinuum demonstrated this readiness with a landmark result: the creation of the first logical qubit with a logical error rate below its physical error rate on a commercial quantum computing system. This is the defining milestone for quantum error correction — demonstrating that the encoding overhead is "worth it" by showing that the logical qubit is more reliable than the physical qubits it is made from. This milestone does not mean fault-tolerant quantum computing is here; the demonstration used small code distances and limited logical qubit counts. But it demonstrates that Quantinuum's hardware quality is sufficient to enter the fault-tolerant regime, and that the scaling path to fault tolerance is no longer purely theoretical.
Quantinuum's commercial trajectory reflects its technical position. The company's quantum chemistry and molecular simulation capabilities — built on the high-fidelity operations of its trapped-ion systems — are attracting serious engagement from pharmaceutical and materials companies exploring whether quantum simulation can accelerate their R&D pipelines. These are not pilot experiments; they are paid engagements with companies that are seriously evaluating whether quantum algorithms can deliver commercial value for specific research questions.
PsiQuantum: The Photonic Long Bet
PsiQuantum occupies a unique position in the quantum computing landscape: it is the company most explicitly focused on fault-tolerant quantum computing from inception, with a technical architecture designed from the outset for million-qubit scale rather than optimised for near-term NISQ applications. The company has raised approximately $665 million in total funding — making it the best-funded private quantum computing company in the world — from investors including Microsoft, Blackbird Ventures, and government programmes in both the US and Australia.
PsiQuantum's approach is built on silicon photonics: quantum computing using photons (particles of light) rather than trapped ions or superconducting circuits. The key insight of the company's founders — Jeremy O'Brien, Terry Rudolph, Pete Shadbolt, and Mark Thompson, drawn from the quantum photonics research groups at the University of Bristol and Imperial College London — is that photonic qubits can be manufactured using existing semiconductor foundry processes. The company has worked with GLOBALFOUNDRIES, a major semiconductor manufacturing partner, to produce photonic chips using standard CMOS-compatible processes at scale.
This foundry compatibility is the core of PsiQuantum's commercial thesis. Trapped-ion and superconducting qubit systems require bespoke manufacturing — specialised cryogenic hardware, custom ion trap geometries, or non-standard superconducting circuit fabrication. Scaling to millions of qubits with bespoke manufacturing implies manufacturing costs and complexity that may be prohibitive. Photonic chips manufactured in standard semiconductor foundries can, in principle, achieve the volumes required for million-qubit systems because the manufacturing infrastructure already exists at scale for classical semiconductor production.
The technical challenge of photonic quantum computing is substantial. Single photon sources and detectors — the fundamental hardware components of a photonic quantum computer — must achieve efficiencies and reliabilities not yet demonstrated in commercial silicon photonics. Two-qubit gates for photons are inherently probabilistic rather than deterministic, requiring complex photonic switching networks and resource overhead that partially offsets the manufacturing advantage. PsiQuantum's specific approach uses fusion-based quantum computing, a protocol developed by the company that performs quantum error correction using a particular type of entangling measurement rather than the conventional gate-based approach.
In 2024, PsiQuantum's most significant public development was the announcement of a partnership with the Australian and Queensland state governments for the construction of a fault-tolerant quantum computing facility in Brisbane — a commitment that reflects government conviction in the photonic approach and provides significant capital for the next phase of the company's development. The company also published technical papers advancing the theoretical foundations of fusion-based quantum computing, including analyses of resource requirements for commercially meaningful quantum algorithms.
PsiQuantum's timeline is explicitly longer than its competitors: the company is not building NISQ devices for near-term commercial deployment, and it does not expect to have commercially useful fault-tolerant quantum computing available until the late 2020s at earliest. This patience in development strategy reflects a conviction that NISQ devices will not deliver transformative commercial value and that the only path to the applications that justify the investment — pharmaceutical simulation, cryptography, optimisation — is through fault tolerance. Whether this conviction proves correct, and whether PsiQuantum's photonic approach reaches fault tolerance faster or more cheaply than competing approaches, are the central questions on which the company's investment thesis rests.
The Race to Fault Tolerance: Architecture Comparison
The competition between quantum computing architectures — superconducting (IBM, Google, Rigetti), trapped-ion (IonQ, Quantinuum), neutral atom (Pasqal, QuEra), and photonic (PsiQuantum) — is fundamentally a race to demonstrate fault-tolerant quantum computing at commercially relevant scale. Each architecture has distinct advantages and constraints that determine its path to this goal.
Superconducting qubits have the highest gate speeds — two-qubit gates in tens of nanoseconds — and have historically led in qubit count, with IBM's Condor system reaching 1,121 qubits in 2023. But superconducting qubits require dilution refrigerators operating at millikelvin temperatures, have limited connectivity without complex inter-qubit routing, and have gate fidelities that trail the best trapped-ion systems. IBM's published error rates for two-qubit gates on current systems are typically in the range of 0.1 to 1 percent — better than early devices but short of the extreme fidelities achieved by Quantinuum.
Neutral atom systems — where individual atoms are trapped in arrays of optical tweezers rather than electromagnetic ion traps — have attracted significant investment and excitement in 2024. The approach offers high qubit connectivity (atoms can be physically moved to create gates between arbitrary pairs), demonstrated gate fidelities competitive with trapped ions, and a native compatibility with quantum error correction codes that require two-dimensional qubit arrays. Pasqal in Paris and QuEra Computing in Boston (spun out of Harvard and MIT) are the leading companies in this space, each having demonstrated systems with hundreds of physical qubits and beginning to explore error correction implementations.
The investor's challenge in the quantum computing landscape is that the architecture race is genuinely uncertain: the physical and engineering trade-offs between approaches are well understood, but the rate at which each approach will advance along its development curve is not. A portfolio with exposure to multiple architectures — through both direct investment and public market positions in companies like IonQ — is better positioned than a concentrated bet on a single architecture to capture the upside from whichever approach reaches commercial fault tolerance first.
Commercial Applications: Where Is the Value?
The applications that are most frequently cited as motivating quantum computing investment are pharmaceutical molecular simulation, materials discovery, financial portfolio optimisation, cryptography, and logistics optimisation. Of these, molecular simulation has the strongest technical foundation for quantum advantage: the problem of simulating quantum chemical systems is provably hard for classical computers in general, and the precision required for drug discovery — understanding how a drug molecule binds to a protein receptor at a quantum mechanical level — is precisely the class of problem where quantum computing's inherent quantum mechanical nature gives it a structural advantage.
But the commercially meaningful molecular simulation problems — simulating the active site of a key enzyme involved in a drug target, for example, or calculating the binding affinity of a candidate molecule — require quantum computations of a complexity that current NISQ devices cannot perform reliably. The number of logical qubits required for commercially useful pharmaceutical molecular simulations has been estimated at between 1,000 and 100,000, depending on the specific problem — numbers that require fault-tolerant quantum computing at scales not yet demonstrated.
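Combining the two ranges quoted in this article — 1,000 to 100,000 logical qubits for commercially useful molecular simulation, and 1,000 to 10,000 physical qubits per logical qubit — gives a back-of-envelope sense of the hardware gap. Both ranges are the rough published estimates cited above, not precise requirements:

```python
# Back-of-envelope physical qubit requirements for fault-tolerant
# pharmaceutical simulation, using the two ranges quoted in the text.
logical_qubits = (1_000, 100_000)        # logical qubits needed
physical_per_logical = (1_000, 10_000)   # error correction overhead

low = logical_qubits[0] * physical_per_logical[0]    # optimistic end
high = logical_qubits[1] * physical_per_logical[1]   # pessimistic end
print(f"physical qubits required: {low:,} to {high:,}")
# -> physical qubits required: 1,000,000 to 1,000,000,000
```

Even the optimistic end of this range — one million physical qubits — is roughly three orders of magnitude beyond the largest device yet built (IBM's 1,121-qubit Condor), which is the arithmetic underlying the article's two-horizon investment framing.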
The near-term commercial value of quantum computing is therefore more modest and more heterogeneous than the headline applications suggest. Companies like Quantinuum are finding commercial traction in quantum-enhanced security (quantum random number generation and quantum key distribution) and in quantum chemistry benchmarking for research purposes. IonQ is finding commercial interest in quantum machine learning applications and in quantum simulation of materials properties for industrial chemistry. These are real revenues, but they are not the transformative applications that justify the sector's total private and public market capitalisation.
The investment thesis for quantum computing thus has two distinct time horizons: a near-term horizon in which NISQ applications generate modest but real commercial revenue while providing development capital for the transition to fault-tolerant systems; and a long-term horizon in which fault-tolerant systems deliver the transformative applications that justify the sector's capitalisation. Investors who understand this structure can calibrate their positions accordingly — owning public market exposure to established companies like IonQ for the near-term horizon, while seeking seed-stage positions in companies with credible fault-tolerant architectures for the long-term horizon.
Conclusion: Genuine Progress Amid Ongoing Uncertainty
Quantum computing in 2024 is genuinely progressing. Quantinuum has demonstrated a logical qubit that outperforms the physical qubits it is built from — the entry condition for useful quantum error correction. IonQ has established commercial customer relationships and a credible roadmap toward 64 algorithmic qubits. PsiQuantum has secured government partnerships that provide capital and validation for its long-term photonic approach. The sector is not at the point of transformative commercial value, but neither is it at the point of pure speculation.
The investor's task is to maintain the intellectual honesty to hold both truths simultaneously: that quantum computing's long-term commercial potential is genuine and large, and that the path to realising that potential involves substantial remaining technical and timeline uncertainty. The companies that will capture the bulk of quantum computing's commercial value — by achieving fault-tolerant systems at commercial scale — have not yet done so, and investors who price them as if they had are likely to be disappointed. But investors who dismiss the sector because commercial value has not yet arrived are likely to miss one of the most significant technology transitions of the next two decades.
At Lumino Capital, our approach to quantum computing is shaped by this dual perspective. We look for companies with clear technical differentiation in the path to fault tolerance, credible team pedigree from frontier academic and industrial research, and capital efficiency in navigating the development timeline. The quantum computing companies that will generate venture-level returns are those that reach fault tolerance first — and the evidence from IonQ, Quantinuum, and PsiQuantum suggests that the field is advancing toward that goal faster than the sceptics acknowledge and more slowly than the enthusiasts claim.