[Image: quantum computing. Credit: Lawrence Berkeley National Laboratory]

“Practical Large-Scale Quantum Computation” Could Be on the Horizon as Researchers Solve Problematic Error Rate

Quantum computing's long-standing weakness, its high propensity for errors, may finally be a thing of the past: Harvard physicists claim to have overcome what many researchers view as one of the technology's greatest hurdles.

Quantum computers operate by encoding information in units known as qubits which, unlike the classical binary bits of traditional computers, can exist in a superposition of two states at once and can entangle with other qubits to perform even more complex operations. However, the technology has largely remained a laboratory curiosity, as qubits' high error rates make quantum computing unreliable for practical applications.

The Benefit of Quantum Computers

Traditional computers, which use binary bits, are limited compared to quantum computers' ability to store data in multiple states simultaneously. While doubling the number of bits merely doubles a standard computer's capacity, each added qubit doubles the size of an entangled quantum register's state space, so processing power grows exponentially as qubits are added. Theorists estimate that a system of just 300 qubits could potentially represent more states than there are particles in the observable universe.
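The arithmetic behind that 300-qubit claim is easy to check: a register of n qubits spans 2^n basis states, and commonly cited estimates put the particle count of the observable universe near 10^80 (that figure is an assumption used here for scale, not taken from the article). A quick sketch in Python:

```python
# Back-of-the-envelope check: a register of n qubits spans 2**n basis
# states. The ~10**80 particle estimate for the observable universe is
# a commonly cited figure, assumed here only for scale.
n_qubits = 300
n_states = 2 ** n_qubits              # basis states the register spans
particles_estimate = 10 ** 80         # rough particle count, observable universe

print(n_states > particles_estimate)  # True: 2**300 is roughly 2 x 10**90
```

Because each additional qubit doubles the state count, the crossover happens long before hardware reaches millions of qubits, which is what makes even modest qubit numbers interesting.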

While quantum computers are theoretically capable of enormous amounts of data processing, they are highly susceptible to slipping out of their quantum states through decoherence, which corrupts that data.

Error Correction Solved

Now, a group of Harvard researchers has revealed a solution to the long-standing quantum error problem in a paper published in Nature.

“For the first time, we combined all essential elements for a scalable, error-corrected quantum computation in an integrated architecture,” said lead author Mikhail Lukin, co-director of the Quantum Science and Engineering Initiative. “These experiments—by several measures the most advanced that have been done on any quantum platform to date—create the scientific foundation for practical large-scale quantum computation.”

The researchers describe their new system as “fault-tolerant,” relying on techniques such as physical entanglement, logical entanglement, logical magic, and entropy removal to detect and correct errors. In some instances, it even uses quantum teleportation to transfer the quantum state of one particle to another.
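The paper's full architecture is far more involved, but the core logic of quantum error correction can be illustrated with the textbook three-qubit bit-flip repetition code, simulated classically below. This is a generic teaching example, not the Harvard team's method, and the function names are my own:

```python
# Classical simulation of the 3-qubit bit-flip repetition code, the
# textbook example of quantum error correction. Illustrative only; it is
# not the neutral-atom architecture described in the Nature paper.
import random

def encode(alpha, beta):
    # Logical qubit alpha|0> + beta|1>  ->  alpha|000> + beta|111>,
    # stored as an 8-entry state vector indexed by basis state.
    state = [0.0] * 8
    state[0b000] = alpha
    state[0b111] = beta
    return state

def bit_flip(state, q):
    # Apply an X (bit-flip) error to qubit q (0 = least significant bit).
    new = [0.0] * 8
    for i, amp in enumerate(state):
        new[i ^ (1 << q)] = amp
    return new

def syndrome(state):
    # Measure the parities Z0Z1 and Z1Z2. Both basis states in the code
    # share each parity, so this reveals the error without revealing
    # (and thus destroying) the encoded superposition.
    for i, amp in enumerate(state):
        if amp != 0:
            b = [(i >> k) & 1 for k in range(3)]
            return (b[0] ^ b[1], b[1] ^ b[2])

def correct(state):
    # Each single-qubit flip produces a unique syndrome; undo it.
    flipped = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(state))
    return bit_flip(state, flipped) if flipped is not None else state

state = encode(0.6, 0.8)                      # arbitrary normalized logical qubit
noisy = bit_flip(state, random.randrange(3))  # one random single-qubit error
print(correct(noisy) == state)                # True: the logical state is recovered
```

Real fault-tolerant schemes must also handle phase errors, faulty measurements, and errors during the correction itself, which is why scalable architectures like the one reported here are such a milestone.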

“There are still a lot of technical challenges remaining to get to [a] very large-scale computer with millions of qubits, but this is the first time we have an architecture that is conceptually scalable,” said lead author Dolev Bluvstein, Ph.D. ’25, who did the research during his graduate studies at Harvard and is now an assistant professor at Caltech. “It’s going to take a lot of effort and technical development, but it’s becoming clear that we can build fault-tolerant quantum computers.”

Bringing Quantum Computing Out of the Lab

“There have been many important theoretical proposals for how you should implement error correction,” said co-author Alexandra Geim, a Ph.D. student in physics in the Kenneth C. Griffin Graduate School of Arts and Sciences. “In this paper, we really focused on understanding what are the core mechanisms for enabling scalable, deep-circuit computation. By understanding that, you can essentially remove things that you don’t need, reduce your overheads, and get to a practical regime much faster.”

Qubits themselves are not yet standardized. The Harvard team created theirs from neutral atoms controlled by lasers, but researchers worldwide are also experimenting with other types of atoms, with ions, and with superconductors. This variety of approaches may make the technology appear even more unsettled, but the team emphasizes that physics advances through experiment, and this work, on the whole, moves the technology forward.

“This big dream that many of us had for several decades, for the first time, is really in direct sight,” Lukin said. “By realizing and testing these fundamental ideas in a lab, you really start seeing light at the end of the tunnel.”

The paper, “A Fault-tolerant Neutral-atom Architecture for Universal Quantum Computation,” appeared in Nature on November 10, 2025.

Ryan Whalen covers science and technology for The Debrief. He holds an MA in History and a Master of Library and Information Science with a certificate in Data Science. He can be contacted at ryan@thedebrief.org, and follow him on Twitter @mdntwvlf.