Quantum computers have the potential to speed up computation, help design new medicines, break codes, and discover exotic new materials – but only when they’re functional.
A major obstacle to functional quantum computers is the noise, or errors, produced during computation.
However, Daniel Lidar, holder of the Viterbi Professorship in Engineering and Professor of Electrical & Computer Engineering at the USC Viterbi School of Engineering, has been iterating on quantum error correction and, in a new study with collaborators at USC and Johns Hopkins, has demonstrated an exponential quantum scaling advantage using two 127-qubit IBM Quantum Eagle processors accessed over the cloud.
The resulting paper, ‘Demonstration of Algorithmic Quantum Speedup for an Abelian Hidden Subgroup Problem’, was published in the APS flagship journal ‘Physical Review X’.
“There have previously been demonstrations of more modest types of speedups, like a polynomial speedup,” said Lidar, who is also the cofounder of Quantum Elements. “But an exponential speedup is the most dramatic type of speedup that we expect to see from quantum computers.”
The key milestone for quantum computing, according to Lidar, has always been to demonstrate that we can execute entire algorithms with a scaling speedup relative to ordinary “classical” computers.
A scaling speedup doesn’t mean that you can do things 100 times faster, for example: “Rather, it’s that as you increase a problem’s size by including more variables, the gap between the quantum and the classical performance keeps growing. And an exponential speedup means that the performance gap roughly doubles for every additional variable. Moreover, the speedup we demonstrated is unconditional.”
What makes a speedup “unconditional,” Lidar explained, is that it doesn’t rely on any unproven assumptions. Prior speedup claims required the assumption that there is no better classical algorithm against which to benchmark the quantum algorithm. Here, the team led by Lidar used an algorithm they modified for the quantum computer to solve a variation of ‘Simon’s problem,’ an early example of quantum algorithms that can, in theory, solve a task exponentially faster than any classical counterpart, unconditionally.
Simon’s problem involves locating a hidden repeating pattern in a mathematical function and is considered the precursor to what’s known as Shor’s factoring algorithm, which can be used to break codes and launched the entire field of quantum computing. Simon’s problem is like a guessing game, where the players try to guess a secret number known only to the game host (the ‘oracle’). Once a player guesses two numbers for which the answers returned by the oracle are identical, the secret number is revealed, and that player wins. Quantum players can win this game exponentially faster than classical players.
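The guessing game can be sketched classically. In this toy sketch (a hypothetical illustration, not the team’s implementation), the oracle hides a secret number s inside a two-to-one function f with f(x) = f(x XOR s); a classical player must keep querying until two inputs give the same answer, which on average takes on the order of 2^(n/2) queries for n-bit secrets, whereas Simon’s quantum algorithm needs only about n oracle queries:

```python
def make_oracle(secret: int):
    """Two-to-one function hiding `secret`: f(x) == f(y) iff y == x ^ secret."""
    def f(x: int) -> int:
        # Return a canonical representative of the pair {x, x ^ secret}.
        return min(x, x ^ secret)
    return f

def classical_solve(f, n_bits: int):
    """Query the oracle until two inputs collide; XOR-ing the colliding
    pair reveals the secret. Classically this needs ~2^(n/2) queries on
    average; Simon's quantum algorithm needs only O(n) oracle queries."""
    seen = {}      # maps f(x) -> x
    queries = 0
    for x in range(2 ** n_bits):
        queries += 1
        y = f(x)
        if y in seen and seen[y] != x:
            return seen[y] ^ x, queries  # collision found: secret recovered
        seen[y] = x
    return 0, queries  # no collision: the secret was 0

secret = 0b1011
found, queries = classical_solve(make_oracle(secret), n_bits=4)
print(found == secret, queries)
```

The exponential gap comes entirely from the query count: each extra bit in the secret roughly doubles the classical player’s work, while the quantum player’s work grows only linearly.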
The exponential speedup was achieved by “squeezing every ounce of performance from the hardware: shorter circuits, smarter pulse sequences, and statistical error mitigation,” said Phattharaporn Singkanipa, USC doctoral researcher and first author.
The researchers achieved this in four ways:
- They limited the data input by restricting how many secret numbers would be allowed (technically, by limiting the number of 1’s in the binary representation of the set of secret numbers). This resulted in fewer quantum logic operations than would be needed otherwise, which reduced the opportunity for error buildup
- They compressed the number of required quantum logic operations as much as possible using a method known as transpilation
- Most crucially, the researchers applied a method known as ‘dynamical decoupling,’ which means applying sequences of carefully designed pulses to detach the behavior of qubits within the quantum computer from their noisy environment and keep the quantum processing on track. Dynamical decoupling had the most dramatic impact on their ability to demonstrate a quantum speedup
- Finally, they applied ‘measurement error mitigation,’ a method that finds and corrects certain errors that are left over after dynamical decoupling due to imperfections in measuring the qubits’ state at the end of the algorithm
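The last step, measurement error mitigation, is commonly implemented as confusion-matrix inversion. The sketch below (a textbook illustration with assumed single-qubit error rates, not the paper’s exact procedure) calibrates how often each prepared state is misread, then inverts that calibration to recover the noise-free outcome distribution:

```python
import numpy as np

# Assumed readout-error rates for illustration (not taken from the study):
# probability of reading 1 when the qubit is 0, and of reading 0 when it is 1.
p_read1_given0, p_read0_given1 = 0.02, 0.05

# Calibration ("confusion") matrix A: column j holds the distribution of
# measured outcomes when the qubit was prepared in basis state j.
A = np.array([[1 - p_read1_given0, p_read0_given1],
              [p_read1_given0,     1 - p_read0_given1]])

true_probs = np.array([0.5, 0.5])   # ideal outcome distribution
observed = A @ true_probs           # what the noisy measurement reports

# Mitigation: solve A @ x = observed to estimate the noise-free distribution.
mitigated = np.linalg.solve(A, observed)
print(np.round(mitigated, 6))
```

In practice the calibration matrix is estimated by preparing each basis state many times and counting the measured outcomes; the inversion then corrects the leftover readout errors that dynamical decoupling cannot touch.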
“The quantum computing community is showing how quantum processors are beginning to outperform their classical counterparts in targeted tasks and are stepping into territory classical computing simply can’t reach. Our result shows that today’s quantum computers already firmly lie on the side of a scaling quantum advantage,” said Lidar. “The performance separation cannot be reversed, because the exponential speedup we’ve demonstrated is, for the first time, unconditional.”
In terms of next steps, Lidar cautioned that “this result doesn’t have practical applications beyond winning guessing games, and much more work remains to be done before quantum computers can be claimed to have solved a practical real-world problem.”
This will require demonstrating speedups that don’t rely on ‘oracles’ that know the answer in advance, as well as significant advances in methods for further reducing noise and decoherence in ever-larger quantum computers. Nonetheless, quantum computers’ previously ‘on-paper promise’ to provide exponential speedups has now been firmly demonstrated.