According to Associate Professor Jayne Thompson of Nanyang Technological University, the most significant shift in quantum computing is already happening: fault-tolerant quantum processors are no longer theoretical. In her presentation at Tech Week Singapore 2025, she outlined how recent advances in error correction, hardware engineering, and algorithm design have pushed the field into a phase where attacking modern cryptography, restructuring AI inference, and transforming scientific modelling are no longer distant possibilities but foreseeable outcomes of the coming generations of fault-tolerant processors.
- The Dawn of Fault-Tolerant Quantum Computing
- Fragmented Hardware Landscape: With Good Reason
- A Whole New Algorithmic Universe
- Where Will the Early Impact of Quantum Technology Be Felt?
- Scaling Challenges and the Path Ahead
- Quantum Computing Still in Its Infancy, Yet Growing Exponentially
- Quantum Computing Promises a Decade of Acceleration
Thompson, who also conducts research at the NTU College of Computing and Data Science and the Centre for Quantum Technologies, and who has worked across government labs and the quantum compiler development industry, described a technology that is still early but accelerating so quickly that organisations need to prepare now. Her assessment positions quantum computing as a shift with implications for global security, industrial R&D, and computational practice.
Her message underscores that the timelines are shortening and the implications are imminent.
The Dawn of Fault-Tolerant Quantum Computing
In Thompson’s view, the field has already entered an entirely new phase. “The era of fault-tolerant quantum computing began about two years ago,” she noted, describing one of the most important technological milestones since the invention of the transistor.

Fault tolerance matters because early quantum processors are unstable. Qubits, the quantum equivalent of classical bits, are highly sensitive and prone to decoherence, random flips, and noise within microseconds. Without error correction, large-scale computation is impossible. That barrier, Thompson argued, is now beginning to fall.
She explained that fault-tolerant quantum computing allows errors to be corrected during computation, ensuring accurate results even when the underlying hardware is imperfect. The long-term promise of quantum computing, particularly its ability to solve problems that classical computers cannot practically manage, depends on operations remaining reliable for the duration of long, complex algorithms.
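To make the idea concrete, the sketch below is a deliberately classical toy, not Thompson's scheme or a real quantum code: it protects one logical bit with a three-copy repetition code and a majority vote, repeating the detect-and-correct cycle throughout a long "computation" so that individual flips never accumulate into a logical error. Real quantum error correction cannot copy quantum states and instead measures error syndromes, but the principle of redundancy plus continuous correction is the same.

```python
import random

def encode(bit):
    # Redundancy: one logical bit is stored as three physical copies.
    return [bit, bit, bit]

def apply_noise(block, p_flip=0.001):
    # Each physical copy flips independently with small probability p_flip.
    return [b ^ 1 if random.random() < p_flip else b for b in block]

def decode(block):
    # Majority vote recovers the logical bit whenever at most one copy flipped.
    return int(sum(block) >= 2)

# Keep a logical bit alive through many noisy rounds by correcting continuously.
logical = 1
block = encode(logical)
for _ in range(1000):
    block = apply_noise(block)
    block = encode(decode(block))  # detect the likely error and restore redundancy

print(decode(block) == logical)    # almost always True: errors are fixed before they compound
```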
According to Thompson, the industry is converging on a common roadmap. She observed that most major quantum computing companies are targeting 100 to 200 logical, error-corrected qubits by 2030. Logical qubits behave as stable units of computation, each supported by many physical qubits and continuous error-correction cycles. With only a few dozen logical qubits available today, reaching 100 would represent a major milestone.
Yet, as Thompson emphasised, hitting this threshold is merely the beginning. Once quantum computers reach around 10,000 logical qubits and can keep computations alive for a week, potentially achievable as early as the late 2030s, they will be able to run algorithms powerful enough to break RSA (Rivest-Shamir-Adleman) public-key cryptography, the foundation of secure global transactions, encrypted communications, and much of the internet.
“By the time we hit around 10,000 logical fault-tolerant qubits, RSA public-key cryptography will likely fall,” she warned.
She added that the threshold could be even lower, perhaps "as low as 4,000 logical qubits depending on the exact algorithm used."
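For readers who want to see why a factoring machine threatens RSA, the toy sketch below (an illustration, not anything presented in the talk) walks through the classical reduction behind Shor's algorithm: once the multiplicative order r of a random base a modulo N is known, a factor of N usually follows from a greatest-common-divisor calculation. The order-finding step in the middle is what is exponentially hard classically and efficient on a fault-tolerant quantum computer; running it at cryptographic key sizes is roughly what the large logical-qubit estimates refer to.

```python
from math import gcd

def order(a, N):
    # Find the multiplicative order r of a modulo N, i.e. the smallest r with a**r % N == 1.
    # Brute force here; this is the step Shor's quantum algorithm performs efficiently.
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def factor_from_order(N, a):
    # Shor's classical post-processing: if r is even and a**(r//2) != -1 (mod N),
    # then gcd(a**(r//2) - 1, N) or gcd(a**(r//2) + 1, N) is a non-trivial factor of N.
    r = order(a, N)
    if r % 2 == 1:
        return None
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None
    f = gcd(y - 1, N)
    return f if 1 < f < N else gcd(y + 1, N)

print(factor_from_order(15, 7))   # 3, since 15 = 3 * 5 and 7 has order 4 modulo 15
```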
These thresholds could leave long-standing digital infrastructure vulnerable, including cryptosystems, financial exchanges, and other systems built over decades. She emphasised that migration to quantum-safe systems needs to begin immediately.
Fragmented Hardware Landscape: With Good Reason
The quantum hardware landscape remains unusually diverse. Unlike classical computing, which converged on silicon CMOS, quantum computing has no universally accepted physical implementation. Organisations are pursuing superconducting qubits, trapped ions, neutral atoms, photonic systems, and topological qubits.
Thompson argued that this fragmentation is justified. Each modality comes with specific strengths and trade-offs. She highlighted neutral atoms for their low noise and scalability, and superconducting qubits for their engineering maturity and fast gate speeds. She noted that companies such as Google already have error-corrected superconducting qubits, and IBM’s roadmap places significant emphasis on this approach.
Some critics point to the extreme cooling requirements of superconducting systems. Thompson dismissed this as a negligible factor relative to overall investment, stating that liquid-helium cooling is unlikely to represent even 1% of total costs. The real challenges lie in precision control, chip engineering, scalability, and deep technical expertise.
She noted that quantum computing draws on physics, computer science, material science, and electrical engineering, and that commercialisation depends on integrating these domains into workable systems.
A Whole New Algorithmic Universe
While hardware attracts significant public attention, Thompson’s background gives her a strong focus on algorithms, the true heart of quantum advantage. Early breakthroughs such as Shor’s algorithm were rare, but the landscape has expanded dramatically since the early 1990s.
She noted that “Shor’s was one of the first non-toy quantum algorithms,” adding that researchers have since developed “a lot of algorithms for linear algebra, Monte Carlo simulation, various tasks inside machine learning, algorithms for tasks inside simulation of different physical systems, and heuristics for optimisation.” These areas reflect the typical use cases of quantum computing and illustrate how widely algorithm development has progressed across scientific and industrial applications.
Thompson emphasised that quantum computation is not merely faster; it operates differently. Qubits exist in superposition, expanding the representational space exponentially. Two qubits form a four-dimensional vector; at 300 qubits, the associated state vector has more entries than there are atoms in the observable universe. Yet only a limited number of classical bits can be extracted from this space, so quantum algorithms rely on interference patterns to direct computation towards meaningful results.
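The arithmetic behind that comparison is easy to check. The short sketch below (an illustration, not code from the presentation) builds the four-amplitude state vector for two qubits and confirms that 2^300 dwarfs the commonly cited estimate of roughly 10^80 atoms in the observable universe.

```python
import numpy as np

# A single qubit in equal superposition: amplitudes for |0> and |1>.
plus = np.array([1.0, 1.0]) / np.sqrt(2)

# Two qubits: the joint state is the tensor product -- a vector of 2**2 = 4 amplitudes.
two_qubit_state = np.kron(plus, plus)
print(two_qubit_state.shape)       # (4,)

# At 300 qubits the state vector would have 2**300 entries,
# far more than the ~10**80 atoms estimated in the observable universe.
print(2**300 > 10**80)             # True
print(f"2**300 is about {2**300:.2e}")
```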
She illustrated this through a teaching example involving an exam: A student with only one bit of memory cannot store answers to two possible questions. A quantum student, however, stores both answers simultaneously in a superposition state, enabling a dramatically higher probability of success. This analogy reveals the deeper truth behind quantum advantage: Classical systems must prepare for all contingencies, but quantum systems need not choose in advance.
“Quantum mechanically, we have another option. We place the memory into a superposition,” she explained. “One part of the wavefunction stores the answer to question 1; the other stores the answer to question 2.” This reduces information waste, which she noted is a fundamental source of quantum efficiency.
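Written schematically in Dirac notation (a formalisation of the example, not a formula shown in the talk), the quantum student's memory is a single state whose two branches carry the two answers; the normalisation depends on how much the answer states overlap:

```latex
|\psi\rangle \;\propto\; |a_1\rangle + |a_2\rangle,
\qquad
|a_1\rangle = \text{answer to question 1},
\quad
|a_2\rangle = \text{answer to question 2}.
```

Reading the memory still returns only one answer, which is why interference, rather than parallel readout, is what quantum algorithms ultimately exploit.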
Where Will the Early Impact of Quantum Technology Be Felt?
The implications extend directly to real technologies, such as AI inference. Classical AI systems consume vast computational power, and projections suggest that by 2030, 12% of the energy consumption in the US may come from data centres and AI [1]. This constraint is where quantum computation may offer a path forward. Thompson noted that “whenever a classical agent wastes information, we can produce a quantum agent with a smaller chip size and less memory, and we also potentially save energy in AI inference.”
This perspective emphasises the physical constraints shaping classical and quantum computation. Quantum computing does not merely perform the same computations faster; it changes the nature of what computation can be. There are five areas where the early impact of these developments is likely to be felt.
1. Simulation of Quantum Systems: The First and Biggest Wave
The earliest breakthroughs will likely arise from chemical simulation, materials science, drug discovery, and healthcare.
“If quantum properties affect macroscopic behaviour, and they often do, then quantum-level modelling becomes essential,” Thompson said.
Chemical interactions are inherently quantum mechanical, and classical computers simulate them only through approximations that can become inaccurate for complex molecules.
Quantum computers, by contrast, natively simulate quantum behaviour.
This ability could revolutionise:
- Catalyst design
- Pharmaceutical drug discovery
- Energy-efficient materials
- Biological modelling
- Battery chemistry
As Thompson noted, “Quantum computing is expected to play a role in future simulation of chemical systems, as well as the design of materials and drugs.” Given the enormous R&D costs in these industries, even incremental improvements could generate trillion-dollar impacts.
2. Financial Services: Optimisation and Monte Carlo Simulation
Finance is poised to be one of the earliest sectors to feel tangible economic change. Quantum computing offers algorithmic speed-ups for:
- Pricing financial instruments
- Estimating risk
- Portfolio optimisation
- Monte Carlo simulations
Thompson explained that classical Monte Carlo simulations require roughly 1/ε² samples to reach accuracy ε, which quickly escalates computational cost. She noted that quantum algorithms use interference to achieve a quadratic speed-up, saying that “instead of needing 1 million classical samples, we may need around 1,000 quantum samples.” This represents a structural shift in computational economics rather than a marginal improvement.
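The scaling behind those numbers can be written down directly (a standard back-of-the-envelope comparison, with quantum amplitude estimation assumed as the source of the quadratic speed-up):

```latex
N_{\text{classical}} \sim \frac{1}{\varepsilon^{2}},
\qquad
N_{\text{quantum}} \sim \frac{1}{\varepsilon},
\qquad
\varepsilon = 10^{-3}
\;\Rightarrow\;
N_{\text{classical}} \approx 10^{6},
\quad
N_{\text{quantum}} \approx 10^{3}.
```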
Her presentation further highlighted that rare-event probability estimation, which is critical for financial stability, insurance modelling, and macroeconomic forecasting, can be quadratically faster on quantum hardware.
3. Cryptography and Cybersecurity
Perhaps the most urgent impact is in cybersecurity. Thompson was unequivocal: “Quantum computing will break the RSA public-key cryptography system.” It will also break discrete-logarithm-based schemes such as Diffie-Hellman key exchange and elliptic-curve cryptography. The timeline is not abstract either. She warned that cryptosystems “must migrate in the next 10 years, or they will be vulnerable.”
This includes cryptocurrencies. When asked whether Bitcoin is at risk, her answer was blunt: “Yes, Bitcoin will have problems. It is based on the hardness of the discrete logarithm.”
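Bitcoin's signatures (ECDSA over the secp256k1 curve) rest on the elliptic-curve version of that problem. The toy sketch below (an illustration, not from the talk) uses ordinary modular arithmetic rather than an elliptic curve, but it shows the shape of the assumption: recovering the secret exponent from the public value is brute-force work that scales with the group size classically, whereas Shor-type period finding solves it efficiently.

```python
def discrete_log(g, h, p):
    # Brute-force search for the secret x with g**x % p == h.
    # Classical cost grows with the group size, which is what keeps real key sizes secure today.
    x, y = 0, 1
    while y != h:
        y = (y * g) % p
        x += 1
        if x > p:
            raise ValueError("no solution in this group")
    return x

# Toy group: modulus p = 101 with generator g = 2 and a secret exponent of 57.
p, g, secret = 101, 2, 57
h = pow(g, secret, p)            # the public value
print(discrete_log(g, h, p))     # 57 -- feasible only because the group is tiny
```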
4. Optimisation, Logistics, and Scheduling
Quantum computing could reshape the architecture of global supply chains. Optimisation under constraints such as scheduling flights, routing deliveries, or balancing grids falls naturally within quantum advantage territory.
“We expect quantum computing to solve combinatorial optimisation problems, including optimisation on graphs,” Thompson noted. These capabilities could change logistics, manufacturing, transportation, and energy networks worldwide.
5. AI and Energy Efficiency
Thompson’s work on quantum-enhanced interactive agents points toward a future where quantum computing transforms AI inference.
“Quantum systems can be in many states at the same time. When a classical agent executes one sequence of decisions, it collapses into one future. But from a quantum perspective, we can evolve all possible future outcomes simultaneously,” she added. This allows machines to process uncertainty more efficiently, opening the door to AI that is more context-sensitive, energy-efficient, and powerful.
Scaling Challenges and the Path Ahead
The path from 100 to 10,000 logical qubits is not simply a matter of building larger chips. Quantum scaling requires improvements across the entire stack: qubit lifetimes, gate speeds, coherence, measurement precision, and error-correction throughput.
“We need very fast measurement capability to detect errors; fast decoding to determine what error occurred; and fast feedback to fix it before errors compound,” Thompson explained. Every modality, from superconducting to neutral atoms, faces unique challenges, and this diversity keeps innovation competitive.
Thompson also emphasised that commercial viability is not merely about hardware availability. Migration itself takes years. Companies must identify quantum-suitable problems, develop algorithmic expertise, integrate quantum compilers, and adopt hybrid quantum-classical workflows.
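A common shape for such hybrid workflows today is variational: a classical optimiser proposes parameters, a quantum processor evaluates a cost, and the loop repeats. The sketch below is a schematic only; `evaluate_on_qpu` is a hypothetical placeholder (here a classical surrogate so the example runs), not a call into any real SDK.

```python
import numpy as np

def evaluate_on_qpu(params):
    # Hypothetical placeholder: in a real workflow this would compile a parameterised
    # circuit, run it on quantum hardware, and return an estimated cost.
    # A classical surrogate stands in so the sketch runs end to end.
    return float(np.sum((params - 0.5) ** 2))

def hybrid_optimise(n_params=4, steps=200, lr=0.1, eps=1e-3):
    # Classical outer loop: propose parameters, query the quantum evaluator,
    # update via finite-difference gradients -- the typical division of labour.
    params = np.zeros(n_params)
    for _ in range(steps):
        grad = np.array([
            (evaluate_on_qpu(params + eps * e) - evaluate_on_qpu(params - eps * e)) / (2 * eps)
            for e in np.eye(n_params)
        ])
        params -= lr * grad
    return params, evaluate_on_qpu(params)

params, cost = hybrid_optimise()
print(params.round(3), round(cost, 6))   # converges towards the surrogate's minimum
```

The design point is that the quantum processor handles only the sub-routine where it has an advantage, while data handling and optimisation stay classical.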
“If quantum computing is going to begin to have a major impact in about 10 years, then the migration must begin now,” she warned. The future belongs to organisations that start early.
Quantum Computing Still in Its Infancy, Yet Growing Exponentially
Quantum computing is, remarkably, only about 30 years old. “We are about a century younger than our classical counterparts,” Thompson observed. Yet growth is accelerating. She cited Boston Consulting Group figures showing 50% year-on-year expansion of the global quantum computing market, with IBM pledging US$150 million for new technologies this decade.
Despite this momentum, the field is still defining its own trajectory. Thompson stressed that quantum scientists are not domain experts across all industries; meaningful impact requires close collaboration with end-users.
“End-users and domain specialists are the best people to help identify applications and remove pain points in workflows,” Thompson remarked. The shift she described comes against a backdrop of classical limitations that have shaped industrial practices for decades; once those constraints disappear, entirely new workflows may emerge.
This reflects a broader shift in how researchers and organisations understand what problems are solvable, how machines reason, and how digital infrastructure will evolve.
Quantum Computing Promises a Decade of Acceleration
The overall message is measured but unmistakable. Quantum computing applications are expanding, and they will be powerful and transformative. Quantum systems will not replace classical computers but will augment them through hybrid architectures. They will not immediately overhaul every industry, but they can reshape fields where classical computation is already at its limits.
The next decade will be shaped by how organisations prepare for quantum technologies, particularly in areas where classical systems already face fundamental limits.
Jayne Thompson’s core advice is clear: prepare now. The future of computation is already unfolding, and the organisations that adopt quantum thinking today will shape the technological world of tomorrow.
[1] Sophia Chen, “How much energy will AI really consume? The good, the bad and the unknown,” Nature, https://www.nature.com/articles/d41586-025-00616-z

