If the era of quantum computing began 3 years ago, its rising sun may have slipped behind a cloud. In 2019, Google researchers claimed to have reached a milestone known as quantum supremacy when their Sycamore quantum computer performed an abstruse calculation in 200 seconds that, they said, would tie up a supercomputer for 10,000 years. Now scientists in China have done the computation in hours using ordinary processors. A supercomputer, they say, could beat Sycamore outright.

“I think they’re right that if they had access to a big enough supercomputer, they could have simulated the task in seconds,” says Scott Aaronson, a computer scientist at the University of Texas at Austin. The advance takes some of the shine off Google’s claim, says Greg Kuperberg, a mathematician at the University of California, Davis. “Getting 300 feet from the top is less exciting than getting to the top.”

Yet the promise of quantum computing remains intact, according to Kuperberg and others. And Google Quantum AI lead scientist Sergio Boixo said in an email that the Google team knew its edge might not last very long. “In our 2019 paper, we said classical algorithms would improve,” he said. But, “we don’t believe this classical approach can keep up with quantum circuits in 2022 and beyond.”

The “problem” solved by Sycamore was designed to be hard for a conventional computer but as easy as possible for a quantum computer, which manipulates qubits that can be set to 0, 1 or, thanks to quantum mechanics, any combination of 0 and 1 at the same time. Together, Sycamore’s 53 qubits, tiny resonating electrical circuits made of superconducting metal, can encode any number from 0 to 2^{53} (about 9 quadrillion), or even all of them at once.

Starting with all qubits set to 0, Google researchers applied a random but fixed set of logical operations, or gates, to single qubits and pairs over 20 cycles, then read out the qubits. Roughly speaking, quantum waves representing all possible outputs sloshed among the qubits, and the gates created interference that reinforced some outputs and canceled others. Some should therefore have appeared with greater probability than others. Over millions of trials, a spiky output pattern emerged.
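The flavor of this random-circuit sampling can be seen at toy scale. The sketch below, which makes no claim to reproduce Google's actual circuit, simulates a hypothetical 3-qubit register through 20 cycles of random single-qubit gates and entangling CZ gates, then computes the output probabilities; interference makes some bit strings more likely than others.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3                          # toy register: 3 qubits, 2**3 = 8 possible outputs
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0                 # all qubits start in |0>

def random_single_qubit_gate():
    # A random 2x2 unitary, built via QR decomposition of a random matrix
    m = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    q, r = np.linalg.qr(m)
    d = np.diag(r)
    return q * (d / np.abs(d))

def apply_gate(state, gate, qubit, n):
    # Reshape the statevector so the target qubit is its own axis, then contract
    psi = state.reshape([2] * n)
    psi = np.tensordot(gate, psi, axes=([1], [qubit]))
    psi = np.moveaxis(psi, 0, qubit)
    return psi.reshape(-1)

cz = np.diag([1, 1, 1, -1]).astype(complex).reshape(2, 2, 2, 2)

def apply_cz(state, q1, q2, n):
    # Entangling gate on a pair of qubits
    psi = state.reshape([2] * n)
    psi = np.tensordot(cz, psi, axes=([2, 3], [q1, q2]))
    psi = np.moveaxis(psi, [0, 1], [q1, q2])
    return psi.reshape(-1)

for cycle in range(20):        # 20 cycles, as in the Sycamore experiment
    for q in range(n):
        state = apply_gate(state, random_single_qubit_gate(), q, n)
    state = apply_cz(state, 0, 1, n)
    state = apply_cz(state, 1, 2, n)

probs = np.abs(state) ** 2     # the spiky, non-uniform output distribution
```

With 53 qubits the statevector would have 2^53 entries, which is exactly why simulating Sycamore directly is so costly.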

Google researchers argued that simulating these interference effects would overwhelm even Summit, a supercomputer at Oak Ridge National Laboratory, which boasts 9,216 central processing units and 27,648 faster graphics processing units (GPUs). Researchers at IBM, which developed Summit, quickly countered that if they harnessed every bit of hard drive storage available to the computer, it could handle the computation in a few days. Now, Pan Zhang, a statistical physicist at the Institute of Theoretical Physics of the Chinese Academy of Sciences, and his colleagues have shown how to beat Sycamore in a paper in press at Physical Review Letters.

Following others, Zhang and his colleagues recast the problem as a 3D mathematical array called a tensor network. It consisted of 20 layers, one for each cycle of gates, with each layer comprising 53 dots, one for each qubit. Lines connected the dots to represent the gates, with each gate encoded in a tensor, a 2D or 4D grid of complex numbers. Running the simulation then reduced essentially to multiplying all the tensors together. “The advantage of the tensor network method is that we can use many GPUs to perform the calculations in parallel,” says Zhang.
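The tensor-network idea can be illustrated on a tiny circuit, without assuming anything about Zhang's actual code. In the sketch below, a single-qubit gate is a 2D tensor and a two-qubit gate a 4D tensor, and simulating the circuit amounts to one contraction (multiplication) over the whole network, here done with NumPy's einsum.

```python
import numpy as np

# A toy 2-qubit network: inputs -> Hadamard on qubit 0 -> CNOT across the pair.
h = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # single-qubit gate: a 2D tensor
cnot = np.zeros((2, 2, 2, 2))                  # two-qubit gate: a 4D tensor
for a in range(2):
    for b in range(2):
        cnot[a, a ^ b, a, b] = 1.0             # indices: (out1, out2, in1, in2)

zero = np.array([1.0, 0.0])                    # each qubit starts in |0>

# Multiply the whole network together in one contraction.
# Summed indices (a, b, c) are the internal lines connecting the tensors.
amps = np.einsum("a,b,ca,efcb->ef", zero, zero, h, cnot)

probs = np.abs(amps.reshape(-1)) ** 2
# This tiny network produces a Bell state: only outputs 00 and 11 appear
```

Contractions like this map naturally onto batched matrix multiplications, which is why, as Zhang notes, they parallelize well across many GPUs.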

Zhang and his colleagues also relied on a key insight: Sycamore’s calculation was far from exact, so theirs didn’t need to be either. Sycamore computed the output distribution with an estimated fidelity of 0.2%, just enough to distinguish the telltale spikes from the noise in the circuitry. So Zhang’s team traded accuracy for speed by cutting some lines from their network and eliminating the corresponding gates. Losing just eight lines made the computation 256 times faster while maintaining a fidelity of 0.37%.

The researchers calculated the output pattern for 1 million of the 9 quadrillion possible bit strings, relying on an innovation of their own to obtain a truly random, representative set. The computation took 15 hours on 512 GPUs and produced the telltale spiky output. “It’s fair to say that Google’s experiment was simulated on a conventional computer,” says Dominik Hangleiter, a quantum computer scientist at the University of Maryland, College Park. On a supercomputer, the computation would take tens of seconds, Zhang says, 10 billion times faster than the Google team estimated.

This advance underlines the pitfalls of the race between a quantum computer and a conventional computer, according to the researchers. “There is an urgent need for better quantum supremacy experiments,” says Aaronson. Zhang suggests a more practical approach: “We should find real-world applications to demonstrate quantum advantage.”

Yet Google’s demonstration wasn’t mere hype, the researchers say. Sycamore required far fewer operations and far less power than a supercomputer, Zhang notes. And had Sycamore achieved slightly higher fidelity, he says, his team’s simulation couldn’t have kept up. As Hangleiter puts it, “The Google experiment did what it was meant to do: start this race.”