A collaboration between the Applied Mathematics and Computational Research Division (AMCRD) and the Physics Division at Lawrence Berkeley National Laboratory (Berkeley Lab) has resulted in a new approach to error mitigation that could help realize the theoretical potential of quantum computing.
The research team describes this work in an article published in Physical Review Letters, “Mitigating Depolarizing Noise on Quantum Computers with Noise-Estimation Circuits.”
“Quantum computers have the potential to solve more complex problems much faster than classical computers,” said Bert de Jong, one of the study’s lead authors and director of the AIDE-QC and QAT4Chem quantum computing projects. De Jong also leads AMCRD’s Applied Computing for Scientific Discovery group. “But the real challenge is that quantum computers are relatively new. And there’s still a lot of work to be done to make them reliable.”
One of the problems for now is that quantum computers are still too error-prone to be consistently useful. This is largely due to so-called “noise” (errors).
There are different types of noise, including readout noise and gate noise. The first concerns reading out the result of a run on a quantum computer: the more noise, the more likely it is that a qubit – the quantum equivalent of a bit on a classical computer – will be measured in the wrong state. The second concerns the actual operations performed: noise here reflects the probability of applying the wrong operation. And noise accumulates dramatically the more operations one tries to perform on a quantum computer, making it harder to find the correct answer and severely limiting the usability of quantum computers as they are scaled up.
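The way gate noise compounds with circuit depth can be illustrated with a back-of-the-envelope model: if each gate independently succeeds with some probability, the chance of an error-free run decays exponentially with the number of gates. The per-gate error rate below is purely illustrative, not a figure from the study.

```python
# Illustrative sketch: how gate noise limits circuit depth.
# p_gate is a hypothetical per-gate error rate (not from the study).
p_gate = 0.01

for n in [10, 50, 100, 500]:
    # Probability that all n gates execute without error
    fidelity = (1 - p_gate) ** n
    print(f"{n:4d} gates -> ~{fidelity:.3f} chance of an error-free run")
```

Even with a 1% per-gate error rate, a 500-gate circuit almost never runs cleanly, which is why mitigation and, eventually, correction are needed as circuits grow.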
“So the noise here just means: It’s something you don’t want, and it obscures the outcome you want,” said Ben Nachman, a Berkeley Lab physicist and study co-author who also leads the cross-cutting Machine Learning for Fundamental Physics group.
And while error correction – standard in classical computers – is ideal, it is not yet feasible on current quantum computers because of the number of qubits required. The next best thing is error mitigation: methods and software to reduce noise and minimize errors in the scientific results of quantum simulations. “On average, we want to be able to tell what the correct answer should be,” Nachman said.
To achieve this, Berkeley Lab researchers have developed a new approach they call noise estimation circuits. A circuit is a sequence of operations, or a program, executed on a quantum computer to calculate the answer to a scientific problem. The team created a modified version of the circuit designed to give a predictable answer – 0 or 1 – and used the difference between the measured and predicted answers to correct the measured output of the actual circuit.
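The core idea can be sketched numerically. Under a simple depolarizing-noise model, a noisy expectation value is the ideal one scaled by an unknown survival factor; a companion circuit whose ideal answer is known by construction lets one estimate that factor and divide it out. All numbers below are illustrative assumptions, not results from the paper.

```python
# Minimal sketch of the noise-estimation-circuit idea under an assumed
# depolarizing-noise model: noisy value = f * ideal value, f unknown.

f_true = 0.8                       # hidden noise factor (illustrative)

# Noise estimation circuit: built so its ideal answer is known to be 1.0
est_ideal = 1.0
est_measured = f_true * est_ideal  # what the noisy hardware would return

# Physics circuit: ideal answer unknown; hardware returns the scaled value
phys_measured = f_true * 0.42      # 0.42 stands in for the unknown ideal

f_hat = est_measured / est_ideal   # noise factor estimated from the companion
corrected = phys_measured / f_hat  # divide it out of the physics result
print(corrected)
```

In this idealized setting the correction recovers the unknown ideal value exactly; on real hardware the noise model only holds approximately, which is why additional mitigation techniques are layered on top.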
The noise estimation circuit approach corrects some errors, but not all. The Berkeley Lab team therefore combined their new approach with three other error mitigation techniques: correcting readout errors with “iterative Bayesian unfolding,” a technique widely used in high-energy physics; a home-grown version of randomized compiling; and gate error extrapolation. By putting all these pieces together, they were able to obtain reliable results from an IBM quantum computer.
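One of these complementary techniques, error extrapolation, can be sketched in a few lines: the circuit is run at several artificially amplified noise levels, and the results are fit and extrapolated back to the zero-noise limit. The linear model and synthetic "measurements" below are illustrative assumptions, not data from the study.

```python
import numpy as np

# Sketch of linear zero-noise extrapolation with synthetic data.
noise_scales = np.array([1.0, 2.0, 3.0])   # artificially amplified noise
ideal = 0.5                                 # hidden true value (illustrative)
slope = -0.1                                # assumed linear noise response

# Synthetic noisy expectation values at each amplification level
measured = ideal + slope * noise_scales

# Fit a line to the measurements and evaluate it at zero noise
coeffs = np.polyfit(noise_scales, measured, 1)
zero_noise_estimate = np.polyval(coeffs, 0.0)
print(zero_noise_estimate)
```

Because the synthetic data here are exactly linear, the fit recovers the hidden value; in practice the noise response is only approximately linear, and higher-order fits or other extrapolation schemes are used.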
Making larger simulations possible
This work could have far-reaching implications for the field of quantum computing. The new error-mitigation strategy allows researchers to find the right answer from simulations that require a large number of operations, “much more than people have typically been able to do,” de Jong said.
Instead of performing dozens of so-called controlled-NOT (CNOT) entangling operations, the new technique allows researchers to perform hundreds of such operations and still get reliable results, he explained. “So we can actually do bigger simulations that couldn’t be done before.”
Additionally, the Berkeley Lab group was able to use these techniques effectively on a quantum computer that isn’t necessarily optimally tuned to reduce gate noise, de Jong said. This helps broaden the appeal of the new error mitigation approach.
“That’s a good thing because if you can do it on those kinds of platforms, we can probably do it even better on the less noisy ones,” he said. “So it’s a very general approach that we can use on a lot of different platforms.”
For the researchers, the new approach to error mitigation potentially means being able to solve bigger and more complex problems with quantum computers. For example, scientists will be able to perform chemical simulations with far more operations than before, said de Jong, a computational chemist by trade.
“My interest is trying to solve problems related to carbon capture, battery research, catalysis research,” he said. “And so my portfolio has always been: I do science, but I also develop the tools that allow me to do science.”
Advances in quantum computing have the potential to lead to breakthroughs in a number of areas, from power generation, decarbonization and cleaner industrial processes to drug development and artificial intelligence. At CERN’s Large Hadron Collider – where researchers send particles crashing into each other at incredibly high speeds to study how the universe works and what it’s made up of – quantum computing could help find hidden patterns in LHC data.
To advance quantum computing in the near term, error mitigation will be essential.
“The better the error mitigation, the more operations we can apply to our quantum computers, which means that one day, hopefully soon, we will be able to do computations on a quantum computer that we couldn’t do now,” said Nachman, who is particularly interested in the potential of quantum computing in high-energy physics, such as the further study of the strong force responsible for binding nuclei.
The study, which began in late 2020, marks the latest in a series of collaborations between Berkeley Lab’s physics and computational research divisions. This kind of cross-divisional work is especially important in quantum computing research and development, Nachman said. A funding call a few years ago from the U.S. Department of Energy (DOE) – part of a pilot program to see whether researchers could find ways to use quantum computing for high-energy physics – initially prompted Nachman and his colleague Christian Bauer, a Berkeley Lab theoretical physicist, to approach de Jong.
“We said, ‘We have this idea. We’re doing these calculations. What do you think?’” Nachman said. “We put together a proposal. It was funded. And now it’s a huge fraction of what we do.”
Many people are interested in this technology at all levels, according to Nachman. “We’ve benefited greatly from working with (de Jong’s) group, and I think it’s a two-way street,” he said.
De Jong agreed. “It’s been fun learning each other’s physics languages and seeing that deep down we have similar algorithmic requirements and needs for quantum computing,” he said.
The Oak Ridge Leadership Computing Facility, a DOE Office of Science user facility at Oak Ridge National Laboratory, provided researchers with access to IBM Q quantum computers used for research.
Along with de Jong, Nachman, and Bauer, participants in this research effort include Miroslav Urbanek, formerly of Berkeley Lab’s Computing Research Division and now at Atom Computing; Vincent R. Pascuzzi, former member of the Physics Division and now a research associate with Brookhaven National Laboratory’s Computational Science Initiative; and Andre He, formerly of the Physics division and now a quantum hardware engineer at IBM.
The study was supported by the DOE through the Office of Advanced Scientific Computing Research’s Quantum Algorithm Team Program and the Office of High Energy Physics through the Quantum Information Science Enabled Discovery Program.