First Breakthrough in 30 Years! Peter Shor Responds to Speedups in the Multi-Dimensional Shor Algorithm

 

Shor's algorithm will enable future quantum computers to factor large numbers quickly enough to break many online security protocols. Now, a researcher has shown how this can be done even faster (for more on this result, see the Photon Box article, "First Time in 30 Years! A Qualitative Breakthrough in the Shor Algorithm").

 

Disrupting the Internet was never Peter Shor's intent, but an algorithm he developed in the mid-1990s had the potential to do just that. In a landmark paper, Shor showed how the "weirdness" of quantum physics could be exploited to let a hypothetical computer decompose large numbers into prime factors much faster than any ordinary classical machine.

 

 
Link to paper:
https://ieeexplore.ieee.org/document/365700
 

The implications of this result extend far beyond mathematics. At the time, public-key cryptography, a cornerstone of Internet security, relied on the assumption that decomposing large numbers into prime factors was so computationally difficult as to be practically impossible. That assumption still underlies a number of key protocols today. Shor's algorithm showed that in a world with powerful quantum computers, it would no longer hold.

 

Over the past 30 years, computer scientists have streamlined Shor's algorithm in anticipation of the day when quantum technology matures enough to run it. But a new variant proposed by New York University computer scientist Oded Regev radically accelerates the process: it improves, for the first time, the relationship between the size of the number being factored and the number of quantum operations required to factor it.

 
Regev developed a multidimensional version of Shor's algorithm that runs faster
 
Regev's personal website at NYU:
https://cims.nyu.edu/~regev/
 

Ashley Montanaro, a quantum computing researcher at the University of Bristol, commented, "It's really remarkable that many years later, someone has apparently been able to improve the complexity of this result. This is exciting!"

 

Martin Ekerå, a cryptographer at the Swedish National Communications Security Authority, agreed that Regev's paper is interesting, but cautioned that further optimization will be needed before it surpasses the current state of the art in practice. "Shor's original algorithm is already surprisingly efficient, so it will not be easy to make significant improvements," he wrote in an email.

 

Regev's approach to the new algorithm was to augment Shor's algorithm with techniques from the branch of cryptography that deals with high-dimensional geometry.


Shor's algorithm may not be optimal

 

The power of quantum computers comes from the special way they process information. While classical computers use bits, each of which must always be in one of two states, 0 or 1, quantum bits can exist in a combination of 0 and 1, a phenomenon known as superposition. Multiple quantum bits can also be "coaxed" into a collective superposition: a two-qubit superposition has four components, each of which can carry out a different computation at the same time, and the number of components grows exponentially as quantum bits are added. This lets a quantum computer efficiently perform exponentially many computations in parallel.
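As a rough illustration of that exponential growth (a classical bookkeeping sketch, not from the article; the function name is my own), one can simply enumerate the components of an equal superposition:

```python
from itertools import product

def equal_superposition(k):
    """All 2**k basis states of k qubits, each with amplitude 1/sqrt(2**k).

    This is only classical bookkeeping: the dictionary's size shows why
    simulating k qubits classically takes exponential resources.
    """
    amp = 0.5 ** (k / 2)
    return {bits: amp for bits in product((0, 1), repeat=k)}

print(len(equal_superposition(2)))   # a two-qubit superposition has 4 components
print(len(equal_superposition(10)))  # ten qubits already give 1024
```

The squared amplitudes sum to 1, reflecting the fact that a measurement yields exactly one component at random.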

 

But there's a catch: reading out the result of a superposition computation reveals only the answer from a single, randomly chosen component. To reap the benefits of superposition, the final result must somehow be mapped into a simpler state from which the answer can be reliably read. That is not possible in most cases, and at first no one knew how to make it work for any problem. "Very few people had the courage to think about quantum computing at that time," Regev says.

 
Internet security depends on how computationally difficult it is to factor large numbers into primes. In 1994, Peter Shor discovered a quantum algorithm that could factor large numbers quickly.
 

In 1994, Shor read a paper by computer scientist Daniel Simon showing how quantum superposition could be used to solve a hypothetical problem. Shor figured out how to extend Simon's result to a more general, practical problem called period finding: when the output of a mathematical function cycles through the same values over and over as the input increases, the function is said to be periodic, and the length of one cycle is called the function's period.

 

To find the period of a given function on a quantum computer, one first builds a very large superposition in which each component computes the function's output for a different input. That large superposition is then converted, using Shor's method, into a simpler state from which the result can be read. At that point, a classical computer can take over and finish the computation quickly. Overall, Shor's period-finding algorithm is exponentially faster than any classical alternative, because it can use superposition to compute many different outputs of a periodic function at the same time.

 

As Shor searched for applications of his quantum period-finding algorithm, he rediscovered a previously known but obscure mathematical theorem: for every number, there exists a periodic function whose period is related to that number's prime factors. So to find the prime factors of a number, you can compute the corresponding function and then solve the problem using period finding.
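A brute-force classical sketch of that connection (illustrative, with hypothetical function names; the period search below is exactly the step Shor's algorithm accelerates exponentially):

```python
from math import gcd

def period(g, N):
    """Smallest r > 0 with g**r = 1 (mod N). Brute force here, but this
    is the step a quantum computer performs exponentially faster."""
    r, y = 1, g % N
    while y != 1:
        y = (y * g) % N
        r += 1
    return r

def factor_via_period(N, g):
    """Shor's classical post-processing: turn the period into a factor."""
    if gcd(g, N) != 1:
        return gcd(g, N)                 # lucky guess: g already shares a factor
    r = period(g, N)
    if r % 2 == 1 or pow(g, r // 2, N) == N - 1:
        return None                      # unlucky g: retry with another base
    return gcd(pow(g, r // 2, N) - 1, N)

print(factor_via_period(15, 7))          # the period of 7 mod 15 is 4 -> factor 3
```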

 

"That's what quantum computers are good at," Regev said.

 

On a classical computer, this would be an excruciatingly slow method, even slower than trying every possible factor. But Shor's method speeds up the process exponentially, making period finding ideal for building a fast quantum factorization algorithm.

 

Shor's algorithm was one of several key early results that transformed quantum computing from an obscure subfield of theoretical computer science into the behemoth it is today. But putting the algorithm into practice is a daunting task, because quantum computers are notoriously error-prone: in addition to the quantum bits needed to perform the computation, they need many others doing extra work to keep those from making mistakes. Ekerå and Google researcher Craig Gidney have estimated that using Shor's algorithm to factor a security-standard 2048-bit number (about 600 digits long) would require a quantum computer with 20 million quantum bits, whereas current state-of-the-art machines have at most a few hundred.

 
 

This is why some key Internet protocols still rely on the difficulty of factoring large numbers, but researchers don't want to grow complacent. Theoretical and technological innovations could further reduce the number of quantum bits required, and there is no proof that Shor's algorithm is optimal: perhaps a better quantum factorization algorithm exists that no one has discovered yet.

 

If so, says Regev, "we should know sooner rather than later."

 
Finding the tantalizing connection between factorization and lattice cryptography
 
 

Regev's academic career began in the late 1990s, when cryptographers were searching for a new kind of public-key cryptography that would be unaffected by Shor's algorithm. The most promising approach, known as lattice-based cryptography, relies on the apparent difficulty of computational problems involving high-dimensional lattices, or grids of points; one of these problems resembles finding the tree closest to a random spot in a forest.

 

Greg Kuperberg, a mathematician at the University of California, Davis, says, "If it's a 100-dimensional forest, it's much more complicated than a two-dimensional forest."

 

Regev began working on lattice-based cryptography as a postdoc, initially as an attacker: he wanted to stress-test the new approach by finding weaknesses that quantum computers could exploit. But he couldn't make any headway, and he soon wondered whether there was a deeper reason for that. In 2005, he found a way to turn those failed attacks into a new lattice-based cryptosystem that outperformed all other variants.

 

"Oded was an absolute genius when it came to lattices," Kuperberg said.

 

Over the years, as Regev taught Shor's algorithm to generations of students, he found himself wondering whether the techniques he had used to attack lattice-based cryptography might actually be useful for factoring. That's because one of the steps in the final classical phase of Shor's algorithm amounts to finding the nearest point in a one-dimensional lattice. That one-dimensional problem is trivially easy, but its resemblance to the problems in hundreds of dimensions that underlie lattice-based cryptography was unmistakable.
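To see why the one-dimensional case is trivial, note that a one-dimensional lattice is just the integer multiples of some spacing, so the closest-point problem reduces to rounding (a sketch with an illustrative function name, not from the article):

```python
def nearest_lattice_point_1d(x, spacing):
    """Closest point to x in the lattice {0, +-spacing, +-2*spacing, ...}.

    In one dimension this is just rounding; in hundreds of dimensions the
    analogous closest-point problem is believed to be intractable, which is
    what lattice-based cryptography relies on.
    """
    return round(x / spacing) * spacing

print(nearest_lattice_point_1d(10.3, 3))   # 9
```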

 

Regev said, "If you're a lattice researcher like me, you think, 'Well, there must be some kind of lattice problem here.' But it wasn't clear to me how to capitalize on that."

 

For years he had been "toying" with other ideas for new quantum factorization algorithms, but never got anywhere; last winter he returned to the problem, determined to find a tantalizing link between factorization and lattice-based cryptography.

 

This time, he succeeded.

 
Expanding from one-dimensional to multi-dimensional
 
 

Regev knew he needed to start by generalizing the periodic function at the heart of Shor's algorithm from one dimension to many.

 

In Shor's algorithm, the function involves repeatedly multiplying a random number, called g, by itself. But the period of this function, the number of times g must be multiplied by itself before the output begins to repeat, can be very large. That means the quantum computer must multiply large numbers together in some components of the superposition it uses to compute the periodic function, and these large multiplications are the most computationally expensive part of Shor's algorithm.
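For context, the standard way to evaluate this function is square-and-multiply, which needs about 2·log2(x) multiplications of full-size numbers (a self-contained sketch, equivalent to Python's built-in three-argument `pow`):

```python
def modexp(g, x, N):
    """Square-and-multiply evaluation of g**x mod N.

    Each of the ~2*log2(x) multiplications acts on full n-bit numbers --
    these are the expensive operations Shor's algorithm must carry out
    inside the superposition.
    """
    result, base = 1, g % N
    while x:
        if x & 1:
            result = (result * base) % N
        base = (base * base) % N
        x >>= 1
    return result

print(modexp(7, 123456, 15))   # 1, since the period of 7 mod 15 is 4
```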

 

An analogous two-dimensional function uses a pair of numbers, g1 and g2: it multiplies g1 by itself some number of times, then repeatedly multiplies the result by g2. The period of this function is also two-dimensional: it is defined by the combinations of g1-multiplications and g2-multiplications that cause the function's output to start repeating, and there are many different combinations that accomplish this.
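A toy version of the two-dimensional function (the numbers are chosen purely for illustration):

```python
def f2(a, b, g1, g2, N):
    """Two-dimensional periodic function: f(a, b) = g1**a * g2**b mod N."""
    return (pow(g1, a, N) * pow(g2, b, N)) % N

# With N = 15, g1 = 2, g2 = 7, both bases have order 4 mod 15, so shifting
# a or b by 4 (or any integer combination of such shifts) leaves f unchanged:
N, g1, g2 = 15, 2, 7
print(f2(5, 3, g1, g2, N) == f2(5 + 4, 3, g1, g2, N))  # True
print(f2(5, 3, g1, g2, N) == f2(5, 3 + 4, g1, g2, N))  # True
```

The set of all such shifts forms a two-dimensional lattice of period vectors, which is what the higher-dimensional algorithm has to recover.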

 

With some technical machinery, Regev generalized the algorithm to an arbitrary number of dimensions, not just two, but his initial results were not encouraging. To compute the periodic function in many dimensions, a quantum computer still needs to multiply many numbers together. Each number need not be multiplied as many times as in the one-dimensional case, but there are many more distinct numbers to multiply.

 

The whole thing seemed like a wash.

 

"You think, 'Great, I just did everything in the high-dimensional case with exactly the same running time as Shor's,'" says Regev. "I was stuck with that thought for a while." Then he realized he could fix the problem by changing the order of the multiplications. Instead of folding numbers one at a time into a single running product that grows ever larger, he started with pairs of small numbers, multiplied the resulting products together, and kept working his way up. The total number of multiplications barely changed, but almost all of them now involve relatively small numbers, which makes the computation faster.
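The reordering can be sketched classically as a balanced product tree (hypothetical function names; the real gain appears when multiplication cost grows with operand size, as it does for big integers):

```python
from functools import reduce

def product_sequential(nums):
    """Naive order: the running product keeps growing, so later steps
    multiply one huge number by one small one."""
    return reduce(lambda acc, x: acc * x, nums, 1)

def product_tree(nums):
    """Balanced order: multiply in pairs, then pair up the results, so
    nearly every multiplication involves two similar-sized numbers."""
    nums = list(nums)
    while len(nums) > 1:
        if len(nums) % 2:
            nums.append(1)                # pad odd-length levels
        nums = [nums[i] * nums[i + 1] for i in range(0, len(nums), 2)]
    return nums[0] if nums else 1

nums = list(range(3, 50, 2))
print(product_tree(nums) == product_sequential(nums))  # True: same product
```

Both orders perform essentially the same number of multiplications; the tree just keeps the operands small for as long as possible.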

 
 

At first, it seemed that Regev had simply traded one problem for another. He had sped up the quantum computation of the periodic function by increasing the number of dimensions, but the classical computation required to extract the period was now analogous to locating the nearest lattice point in a high-dimensional space, a task widely regarded as intractable. The analogy with lattice-based cryptography seemed to doom his new approach.

 

On a cold morning in March, before heading to a seminar at Princeton University, Regev found himself waiting for a colleague with whom he was carpooling. "I was early, and he was a little late," he says. As he sat waiting, the last piece of the puzzle suddenly came to him. "Everything fell into place at that moment, but it still had to bake for a while."

 

It all came down to the right number of dimensions. When the number of dimensions was too low, his algorithm couldn't take full advantage of the speedup from multiplying smaller numbers; when it was too high, the quantum computation was fast, but the classical part required solving an impossibly hard lattice problem.

 

Regev had known from the start that he would have to work somewhere in between to have any hope of success, but he wasn't sure a "sweet spot" existed. That morning in March, he realized how to tweak the details of the algorithm so it ran fast in dozens of dimensions.

 
The extra quantum bits needed could be a major drawback
 
 

The improvement is profound. When factoring an n-bit number, the number of elementary logical steps in the quantum part of Regev's algorithm is proportional to n^1.5, rather than n^2 as in Shor's algorithm. The quantum part of the algorithm is repeated dozens of times, and the results are combined to map out a high-dimensional lattice, from which the period and then the factorization can be derived. So the algorithm as a whole may not run faster, but speeding up the quantum part by reducing the required number of steps should make it easier to put into practice.

 

Of course, the time required to run a quantum algorithm is only one of several considerations. Equally important is the number of quantum bits required, the analogue of the memory needed to store intermediate values in ordinary classical computing. The number of quantum bits required by Shor's algorithm to factor an n-bit number is proportional to n, whereas the original form of Regev's algorithm requires a number proportional to n^1.5, a big difference for a 2048-bit number.
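A back-of-the-envelope comparison for n = 2048, using only the asymptotic exponents quoted above (illustrative only; real gate and qubit counts depend heavily on constant factors the exponents hide):

```python
n = 2048                                        # bit length of the number being factored
print("quantum steps, Shor  ~ n^2  :", n ** 2)
print("quantum steps, Regev ~ n^1.5:", int(n ** 1.5))
print("qubits, Shor         ~ n    :", n)
print("qubits, Regev (orig) ~ n^1.5:", int(n ** 1.5))
```

Those hidden constant factors are why the article stresses that the algorithm as a whole may not actually run faster despite the better exponent.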

 

In classical computing, speed is usually a more important consideration than memory, because classical bits are extremely robust: you can save a file on your computer without worrying that it will randomly change when you open it again later; quantum computing researchers aren't always so lucky.

 

"Our quantum bits are constantly trying to fall apart, and we're trying to stop them from falling apart," says Gidney. "It's like trying to write in the sand as the wind blows it away." This means the extra quantum bits Regev's algorithm needs could be a major flaw.

 

But Regev's paper isn't the end of the story. Two weeks ago, MIT computer scientist Vinod Vaikuntanathan and his graduate student Seyoon Ragavan found a way to reduce the algorithm's memory usage. Their variant of Regev's algorithm, like Shor's original, requires a number of quantum bits proportional to n rather than n^1.5. Ekerå wrote in an email that this work "brings us closer to a more efficient implementation in practice."

 
 
Link to paper:
https://arxiv.org/abs/2310.00899
 

Beyond its implications for factoring, Regev's new algorithm carries a broader message: quantum computing researchers should always be open to surprises, even on problems that have been studied for decades.

 

Shor responded, "This variant of my algorithm went undiscovered for 30 years and came out of nowhere; there are probably many other quantum algorithms yet to be discovered."

 
Reference Links:
[1]https://ieeexplore.ieee.org/document/365700
[2]https://quantum-journal.org/papers/q-2021-04-15-433/
[3]https://www.quantamagazine.org/thirty-years-later-a-speed-boost-for-quantum-factoring-20231017/
 
2023-10-19