Classical computing is fighting back in a quantum showdown

As quantum computing evolves, there are regular announcements that the "superiority of quantum computing" has been demonstrated: quantum computers (QCs) can complete certain benchmark tasks much faster than classical computers can.

 

One of the most public of these results appeared in 2019, when a quantum computer built by Google (the 53-qubit Sycamore chip) solved a problem that, by Google's estimate, would have taken 10,000 years to reproduce on the supercomputing hardware of the time. The problem was to simulate the output of a random sequence of gates applied to the chip's quantum bits (qubits). While that may sound self-referential, the resulting sequence of 1s and 0s was derived from the random behavior of the qubits themselves, exhibiting a particular kind of statistical signature that the researchers could examine.
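The statistic used to score such experiments is the linear cross-entropy benchmark (XEB): sample bitstrings from the device, then average the *ideal* probabilities of those samples, which a classical simulator supplies. A minimal sketch with a toy 3-qubit simulator (the circuit and gate choices here are illustrative, not Google's actual gate set):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3  # toy register; Sycamore used 53 qubits

def apply_1q(state, gate, q):
    """Apply a 2x2 unitary to qubit q of an n-qubit state vector."""
    psi = np.moveaxis(state.reshape([2] * n), q, 0)
    psi = np.tensordot(gate, psi, axes=(1, 0))
    return np.moveaxis(psi, 0, q).reshape(-1)

def apply_cz(state, a, b):
    """Controlled-Z between qubits a and b: flip the sign where both are 1."""
    psi = state.reshape([2] * n).copy()
    idx = [slice(None)] * n
    idx[a] = idx[b] = 1
    psi[tuple(idx)] *= -1
    return psi.reshape(-1)

def random_circuit(depth=8):
    state = np.zeros(2 ** n, dtype=complex)
    state[0] = 1.0
    for _ in range(depth):
        for q in range(n):  # a random single-qubit rotation on each qubit
            t, p = rng.uniform(0, 2 * np.pi, size=2)
            g = np.array([[np.cos(t), -np.exp(1j * p) * np.sin(t)],
                          [np.exp(-1j * p) * np.sin(t), np.cos(t)]])
            state = apply_1q(state, g, q)
        a, b = rng.choice(n, size=2, replace=False)
        state = apply_cz(state, a, b)
    return state

probs = np.abs(random_circuit()) ** 2
probs /= probs.sum()  # guard against float drift before sampling
samples = rng.choice(2 ** n, 100_000, p=probs)  # an ideal, noiseless "device"
f_xeb = 2 ** n * probs[samples].mean() - 1
print(f"linear XEB fidelity: {f_xeb:.2f}")
```

A device outputting uniform random bits scores near zero; the closer its samples track the ideal distribution, the higher the score, which is how Google quantified Sycamore's fidelity.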

 

In response, IBM published a paper in 2019 arguing that the 250 petabytes of storage on the Oak Ridge Summit supercomputer could actually hold the entire quantum state vector of Google's Sycamore chip. With this configuration, a brute-force update of the whole state vector (all 250 PB of it) could compute the same result in about 2.5 days.

 

 

An analysis of the IBM paper at the time showed that, on the Summit supercomputer at Oak Ridge National Laboratory, Google's Sycamore circuit could be simulated with high fidelity to arbitrary depth, with all amplitudes output in a matter of days.

 

However, adding just a few more qubits would re-establish QC's insurmountable lead. If Google or others upgraded from 53 to 55 qubits, that alone would exceed Summit's 250 PB of storage capacity; at 60 qubits, it would take 33 Summit supercomputers. But who is going to build those?
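The storage arithmetic behind this is easy to check. A sketch assuming 8 bytes per complex amplitude (single precision; double the numbers for complex128, and note the exact Summit counts quoted in the debate depend on such encoding assumptions):

```python
SUMMIT_PB = 250  # Summit's storage in petabytes (1e15 bytes)

def state_vector_pb(n_qubits, bytes_per_amplitude=8):
    """Petabytes needed to hold all 2**n complex amplitudes."""
    return 2 ** n_qubits * bytes_per_amplitude / 1e15

for n in (53, 55, 60):
    pb = state_vector_pb(n)
    print(f"{n} qubits: {pb:12,.0f} PB = {pb / SUMMIT_PB:8,.1f} Summits")
```

At 53 qubits the vector (about 72 PB at this precision) still fits; at 55 it no longer does; at 60 it needs dozens of Summit-class machines. Each added qubit doubles the requirement, which is why brute-force state-vector simulation loses so quickly.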

 

So while QC walked back to its corner with arms raised in celebration, the classical approach stood up for another round.

 

In a 2021 paper, researchers noted that Google had chosen one very specific method to calculate the expected behavior of its processor, and that there were other ways to perform equivalent calculations. Since the 2019 result was published, several classical schemes have reported better performance. For example, Pan Zhang's group at the Institute of Theoretical Physics, Chinese Academy of Sciences, described a method that allowed a GPU-based cluster to reproduce the results of the QC run in just 15 hours. The researchers noted that running the problem on a GPU-equipped supercomputer such as Summit would outperform the Sycamore quantum processor.
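The alternative these schemes exploit is to treat the circuit as a tensor network and contract it in a clever order, computing individual output amplitudes without ever storing the full state vector. A toy illustration (this tiny circuit is our own, not Zhang's): the amplitude ⟨111| CZ(1,2) CZ(0,1) H⊗H⊗H |000⟩ as a single einsum contraction:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)       # Hadamard gate
CZ = np.diag([1.0, 1, 1, -1]).reshape(2, 2, 2, 2)  # controlled-Z as a 4-index tensor
e0 = np.array([1.0, 0.0])                          # |0> input state

# Wire the network: the |0> states feed three Hadamards (output indices
# i, j, k), CZ(0,1) maps (i, j) -> (p, q), CZ(1,2) maps (q, k) -> (u, w).
# 'optimize' lets einsum search contraction orders -- finding a cheap
# order is the crux of the classical schemes that beat brute force.
amps = np.einsum('a,b,c,ia,jb,kc,pqij,uwqk->puw',
                 e0, e0, e0, H, H, H, CZ, CZ, optimize='greedy')
print(amps[1, 1, 1])  # amplitude of |111>: the two CZ signs cancel
```

Here the amplitude of |111⟩ is 1/(2√2): the uniform superposition contributes 1/(2√2), and both CZ gates flip its sign once each. Real simulations do the same thing over thousands of tensors, where the contraction order makes the difference between days and hours.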

 

Quantum computing superiority, or quantum supremacy, refers to the use of quantum computers to solve well-defined problems that would take orders of magnitude longer to solve with any known algorithm running on existing classical computers, not by chance, but for reasons of asymptotic quantum complexity.

 

So, if quantum computing superiority has indeed been realized, does this mean that no code is now unbreakable?

 

No, it does not.

 

There are two problems here. First, the devices Google, IBM, and others are currently building have only a few hundred qubits and no error correction. Running Shor's algorithm to break the RSA cryptosystem would require thousands of logical qubits; with known error-correction methods, this could easily translate into millions of physical qubits, and those would need to be of higher quality than any physical qubits that currently exist.
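The scale of that overhead is easy to sketch. Assuming a surface code, a common textbook estimate is roughly 2·d² physical qubits per logical qubit at code distance d (the specific numbers below are illustrative ballparks; published resource estimates for breaking RSA-2048 vary widely):

```python
def physical_qubits(logical_qubits, code_distance):
    """Rough surface-code overhead: ~2*d**2 physical qubits per logical qubit."""
    return logical_qubits * 2 * code_distance ** 2

# ~4,000 logical qubits is a frequently cited ballpark for RSA-2048;
# a code distance around 27 is an illustrative choice for low logical
# error rates.
print(f"{physical_qubits(4000, 27):,} physical qubits")
```

With these assumptions the total lands near six million physical qubits, which is why "a few hundred noisy qubits" is nowhere close to a cryptographically relevant machine.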

 

The second problem is that even in a hypothetical future with scalable, error-corrected QCs, they will only be able to break some ciphers, not all of them, based on our current understanding. By an unfortunate coincidence, the public-key ciphers they can break include most of the ones we currently use to secure the Internet: RSA, Diffie-Hellman, elliptic-curve cryptography, and so on. Symmetric-key cryptography, however, should be only minimally affected. There are even candidate public-key cryptosystems (e.g., lattice-based systems) that, after more than 20 years of attempts, no one knows how to break quantumly, and, fortunately, major organizations now have efforts underway to begin migrating to these systems.
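The reason symmetric ciphers fare better: the best known quantum attack on them is Grover's algorithm, which offers only a quadratic speedup over brute-force key search, effectively halving the key length rather than breaking the scheme. A back-of-envelope sketch:

```python
def post_quantum_security_bits(key_bits):
    """Grover searches 2**k keys in ~2**(k/2) steps, so a k-bit
    symmetric key retains roughly k/2 bits of quantum security."""
    return key_bits // 2

for k in (128, 256):
    print(f"AES-{k}: ~{post_quantum_security_bits(k)} bits vs a quantum adversary")
```

Doubling the key length (e.g., moving to AES-256) restores the classical security margin, which is why symmetric cryptography needs only parameter tweaks while the public-key schemes listed above need wholesale replacement.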

 

In June 2023, IBM published a major QC result in the journal Nature. This time, instead of creating a special kind of randomness, the researchers used a 127-qubit IBM Eagle processor to compute an Ising model, which simulates the behavior of 127 interacting quantum spins in a magnetic field. The problem has genuine real-world relevance, with connections to ferromagnetism, antiferromagnetism, liquid-gas phase transitions, and protein folding. Encoded across 127 qubits, it presents a quantum size advantage rather than a speed advantage, since even the largest classical computers do not have enough memory to hold the full problem.
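To see why memory is the bottleneck, here is a transverse-field Ising chain built explicitly for a toy system of 8 spins (parameters are illustrative). The state space already has 2^8 = 256 dimensions; at 127 spins it would have 2^127, far beyond any classical memory:

```python
import numpy as np

n, J, h = 8, 1.0, 0.5            # 8 spins; Eagle handled 127
I2 = np.eye(2)
X = np.array([[0.0, 1], [1, 0]])  # Pauli X
Z = np.diag([1.0, -1.0])          # Pauli Z

def chain_op(site_ops):
    """Kronecker product of one single-site operator per spin."""
    out = np.array([[1.0]])
    for o in site_ops:
        out = np.kron(out, o)
    return out

# H = -J * sum_i Z_i Z_{i+1}  -  h * sum_i X_i  (transverse-field Ising)
H = np.zeros((2 ** n, 2 ** n))
for i in range(n - 1):
    ops = [I2] * n; ops[i] = Z; ops[i + 1] = Z
    H -= J * chain_op(ops)
for i in range(n):
    ops = [I2] * n; ops[i] = X
    H -= h * chain_op(ops)

E0 = np.linalg.eigvalsh(H)[0]  # exact ground-state energy by diagonalization
print(f"ground-state energy: {E0:.4f}")
```

IBM's experiment simulated dynamics on a heavy-hex lattice rather than a chain's ground state; the point of this sketch is only that the 2^n-dimensional state space is what exhausts classical memory so quickly.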

 

The IBM team used an interesting approach to mitigate quantum noise and produce more usable results: the researchers deliberately introduced additional noise and then precisely recorded its effect on each part of the processor's circuitry. Using this data, they were able to extrapolate to the result the calculation would have produced in the absence of noise.
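This technique is known as zero-noise extrapolation. A toy sketch of the fitting step, assuming (purely for illustration) that the measured expectation value decays exponentially with a noise-amplification factor λ:

```python
import numpy as np

o_ideal, decay = 0.8, 0.9                   # hidden "true" values in this toy model
lams = np.array([1.0, 1.5, 2.0, 3.0])       # noise amplification factors
measured = o_ideal * np.exp(-decay * lams)  # what the noisy device would report

# Fit log(O) linearly in lambda, then extrapolate to lambda = 0 (zero noise).
slope, intercept = np.polyfit(lams, np.log(measured), 1)
o_extrapolated = np.exp(intercept)
print(f"best noisy run: {measured[0]:.3f}, extrapolated: {o_extrapolated:.3f}")
```

In this noiseless toy the fit recovers 0.8 exactly; on real hardware the extrapolation inherits the error of the assumed noise model, which is why IBM's team characterized the processor's noise so carefully first.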

 

This IBM result looks like a real smack in the face for classical computing, but it was not enough to cause obsolescence. Within two weeks of the news, researchers at the Flatiron Institute's Center for Computational Quantum Physics rose to the challenge: they posted a preprint reporting that "by adopting a tensor network approach, we can perform a classical simulation with significantly higher accuracy than the results obtained with the quantum device." They also noted that the simulations used only "modest computational resources."

 

 

For classically verifiable systems, the tensor-network method can simulate the dynamics of the two-dimensional transverse-field Ising model; the figure above compares these simulation results with the Eagle quantum processor and with other tensor-network approaches.

 

 

For systems that cannot be verified classically, the figure above compares the simulation results with the Eagle quantum processor and other tensor-network approaches.

 

Not to be outdone, Tomislav Begušić and Garnet Kin-Lic Chan of Caltech note in their recent preprint, "Our classical simulations on a single core of a laptop were orders of magnitude faster than the reported quantum simulation times."

 

 

"Classical algorithms based on sparse Pauli dynamics can efficiently simulate the quantum circuits recently studied in a 127-qubit experiment on the IBM Eagle processor [Nature 618, 500 (2023)]. Our classical simulations on a single core of a laptop computer are orders of magnitude faster than the reported quantum simulation times, and orders of magnitude faster than estimated quantum hardware runtimes without classical processing, and are in excellent agreement with the zero-noise-extrapolated experimental results."

 

Indeed, as quantum computing advances, we can expect two complementary results:

 

- Quantum computing will continue to make progress toward viable systems and will ultimately succeed (qubit counts are increasing, though noise remains a major problem)

 

- Quantum computing will drive the development of new algorithms for classical computing, benefiting many existing users.

 

Quantum computers and supercomputers can also work in tandem, an arrangement known as "quantum-supercomputing fusion." In this forward-looking model of computing power, quantum and classical resources complement each other: depending on the structure of the computation, quantum computing accelerates the key steps of a complex problem, shortening the time to solution, and collaborates with supercomputing to improve the overall efficiency of problem solving.

 

Guo Guangcan, academician of the Chinese Academy of Sciences and professor at the University of Science and Technology of China, has also said: "Rapidly developing domestic quantum computers that can be applied directly, and bringing quantum advantage into play across all sectors of the economy, is a historical mission incumbent on our generation."

 

Let's keep looking forward: each round of technology iteration will bring more "super winners" to the market.

 

Reference links:

[1] https://scottaaronson.blog/?p=4317

[2] https://www.hpcwire.com/2023/08/23/classical-computing-keeps-counter-punching-in-the-latest-quantum-smackdown/

[3] https://www.jwview.com/jingwei/html/08-21/554736.shtml

2023-08-25