Big news: classical simulation time of the Jiuzhang "quantum computational advantage" experiment cut by a factor of one billion

Since claiming "quantum computational advantage", Google's Sycamore random-quantum-circuit sampling experiment has faced repeated challenges from classical simulation. In studies published in 2021, classical computers were shown to complete the same task faster than Sycamore.
 
Another route to quantum computational advantage is Gaussian boson sampling (GBS), carried out by the photonic quantum computing prototype Jiuzhang ("Nine Chapters"). In December 2020, the Jiuzhang paper [1] estimated that the sampling task, which the device completed in 200 seconds, would take the world's most powerful supercomputer 600 million years to simulate.
 
In the year or so since that paper was published, Jiuzhang had not faced a serious challenge from classical simulation. But in a Science Advances paper published on January 26 [2], researchers from the University of Bristol's Quantum Engineering Technology Labs (QET Labs), Imperial College London and Hewlett-Packard reduced the classical simulation time to a few months, roughly a billion times faster.


A billion-fold reduction in simulation time

 

In Gaussian boson sampling (GBS), squeezed states of light are injected into an interferometer, and the subsequent photon detection produces correlated events whose probabilities are related to the hafnian of a matrix. While the theoretical GBS scheme assumes photon-number-resolving detectors (PNRDs), experiments often use threshold detectors, which distinguish only between zero photons and one or more photons by a "click".
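
To make the hafnian connection concrete, here is a minimal, purely illustrative Python sketch (not the algorithm from the paper) that evaluates the hafnian of a small symmetric matrix by summing explicitly over perfect matchings. The hafnian sums over perfect matchings much as the permanent sums over permutations, which is why exact GBS probabilities are believed to be hard to compute classically; practical simulators use far more efficient methods than this.

```python
import numpy as np

def hafnian_bruteforce(A):
    """Hafnian of a symmetric (2m x 2m) numpy array, summed explicitly over
    all perfect matchings.  Exponential time -- illustration only."""
    n = A.shape[0]
    if n % 2:                      # odd dimension: no perfect matching exists
        return 0.0

    def _haf(idx):
        if not idx:
            return 1.0
        first, rest = idx[0], idx[1:]
        # pair `first` with each remaining index and recurse on the rest
        return sum(A[first, rest[k]] * _haf(rest[:k] + rest[k + 1:])
                   for k in range(len(rest)))

    return _haf(tuple(range(n)))

# Tiny check: for a 4x4 symmetric matrix, Haf(A) = A01*A23 + A02*A13 + A03*A12
A = np.array([[0., 1., 2., 3.],
              [1., 0., 4., 5.],
              [2., 4., 0., 6.],
              [3., 5., 6., 0.]])
print(hafnian_bruteforce(A))       # 1*6 + 2*5 + 3*4 = 28
```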


In this work, the researchers propose a classical algorithm that, for simulating GBS with PNRDs, computes exact photon-detection probabilities faster than existing methods in the presence of collisions (multiple photons arriving at the same detector).
 
They also introduce a new classical method for generating samples in GBS simulations with threshold detectors, which is orders of magnitude faster than sampling with PNRDs when collisions dominate.
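
As a toy illustration of what a "collision" means for the underlying linear algebra, the sketch below builds the submatrix whose rows and columns are repeated according to the photon pattern. It reuses the hafnian_bruteforce helper from the previous sketch and is not the paper's repeated-row algorithm; it only shows that a collision corresponds to repeated rows and columns in the matrix whose hafnian is evaluated.

```python
import numpy as np

def repeated_submatrix(A, pattern):
    """Submatrix whose hafnian enters the GBS outcome probability:
    row/column j of A is repeated pattern[j] times.  A collision simply
    means some pattern[j] > 1, i.e. repeated rows and columns."""
    idx = np.repeat(np.arange(A.shape[0]), pattern)
    return A[np.ix_(idx, idx)]

# Toy example: 4 modes, with two photons landing in mode 0 (a collision)
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
A = A + A.T                                  # make it symmetric
A_S = repeated_submatrix(A, [2, 1, 0, 1])    # rows for modes 0, 0, 1, 3
print(hafnian_bruteforce(A_S))               # helper from the sketch above
```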


 

Figure 1. Conceptual diagram of the repeated-row hafnian algorithm and the threshold-detector sampling method. (A) A GBS collision outcome measured with PNRDs. (B) To calculate the outcome probability, photons are grouped into pairs (red lines) so as to maximize the number of repeated identical pairs. (C) An inclusion/exclusion formula, or finite-difference sieve, then operates on the resulting photon pairs, with repeated pairs leading to speedups. (D) The same event measured with threshold detectors; "clicks" are shown in green. (E) A threshold detector can be viewed as fanning out onto a set of sub-detectors, each of which can receive at most one photon. All sub-detectors after the first one to register a photon can be ignored; x is the relative position of the detected photon, i.e. the fraction of sub-detectors that are ignored. (F) The probability of detecting the first photon at position x can be expressed as loss applied after single-photon detection.

 
Both results provide a quadratic speedup over existing methods in the high-collision regime. The researchers applied them to two sampling algorithms: probability chain-rule sampling and Metropolis independence sampling (MIS). With these techniques, the time needed to simulate an idealized threshold-detector version of the Jiuzhang GBS experiment fell by nine orders of magnitude.
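
For readers unfamiliar with Metropolis independence sampling, the following generic sketch shows the idea in its simplest form: candidates are drawn from a fixed proposal distribution and accepted or rejected so that the chain targets the desired distribution (chain-rule sampling, by contrast, draws each variable in turn from its conditional given the previous ones). This is a textbook toy that assumes nothing about GBS and is not the paper's implementation.

```python
import numpy as np

def metropolis_independence_sampling(log_p, log_q, draw_q, n_steps, rng):
    """Generic Metropolis independence sampler: candidates are drawn from a
    fixed proposal q and accepted with probability
    min(1, p(x') q(x) / (p(x) q(x'))).  Toy version only."""
    x = draw_q(rng)
    chain = []
    for _ in range(n_steps):
        x_new = draw_q(rng)
        log_accept = (log_p(x_new) - log_p(x)) - (log_q(x_new) - log_q(x))
        if np.log(rng.uniform()) < log_accept:
            x = x_new
        chain.append(x)
    return chain

# Toy usage: target N(2, 1), independent proposals from N(0, 3^2)
rng = np.random.default_rng(1)
log_p = lambda x: -0.5 * (x - 2.0) ** 2        # unnormalized log target
log_q = lambda x: -0.5 * (x / 3.0) ** 2        # unnormalized log proposal
draw_q = lambda rng: 3.0 * rng.standard_normal()
samples = metropolis_independence_sampling(log_p, log_q, draw_q, 10_000, rng)
print(np.mean(samples[2_000:]))                # should be close to 2
```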
 
Specifically, on a supercomputer with about 100,000 cores, the researchers could classically simulate a 100-mode GBS experiment with up to 60 detector "clicks" per event. Replacing the threshold detectors with PNRDs in this simulation would allow samples with up to 92 photons to be generated, but at a significantly increased runtime.
 
Using chain-rule sampling, an M = 60 mode experiment with PNRDs was simulated, generating 4,200 samples in 3 hours. Figure 2A shows a histogram of the number of samples versus photon number, in good agreement with the calculated distribution, and Figure 2B shows the corresponding per-sample runtimes. Below about 45 photons the sampling time is roughly constant, suggesting the problem is not yet large enough to fully occupy the machine. Beyond that point the runtime grows rapidly, albeit with wide variation depending on the specific configuration of output photons. The researchers provide an approximate fit to this scaling of (0.15 + 1.59 × 10⁻⁹ × N³e^(0.147N)) seconds. Extrapolating this curve to photon counts above 80, they estimate an average time of about 10 seconds per sample, which could be reduced to about 130 milliseconds (0.13 seconds) on the world's No. 1 supercomputer, Fugaku (which has roughly 66 times as many CPUs available).
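
The fitted runtime curve quoted above is easy to re-evaluate. The short sketch below does only that, together with the rough 66-fold CPU-count ratio for Fugaku mentioned in the text; the averaged per-sample figures are the authors' own estimates and are not reproduced here.

```python
import numpy as np

def pnrd_seconds_per_sample(n_photons):
    """Fitted per-sample runtime of the 60-mode PNRD chain-rule simulation,
    as quoted above: 0.15 + 1.59e-9 * N^3 * exp(0.147 * N) seconds."""
    n = np.asarray(n_photons, dtype=float)
    return 0.15 + 1.59e-9 * n**3 * np.exp(0.147 * n)

FUGAKU_CPU_FACTOR = 66   # rough ratio of available CPUs quoted in the text

for n in (30, 45, 60):
    t = float(pnrd_seconds_per_sample(n))
    print(f"N = {n} photons: {t:.2f} s per sample, "
          f"~{t / FUGAKU_CPU_FACTOR:.3f} s on Fugaku")
```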
 

 

Figure 2. 60-mode PNRD GBS chain-rule benchmark. (A) Number of samples as a function of photon number, with the theoretically calculated distribution (red line). (B) Runtime versus photon number, fitted with an exponential plus a constant (red line).

 
A chain-rule simulation of an M = 100 mode experiment with threshold detectors was then tested, generating 1,600 samples in 3.5 hours. Figure 3A shows a histogram of the number of "clicks", and Figure 3B shows the corresponding per-sample runtimes. Above about 45 "clicks" the sampling time grows approximately exponentially, allowing extrapolation to events with more than 60 "clicks". The runtime was fitted with a curve of (0.58 + 3.15 × 10⁻⁷ × 2^(N/2)) seconds. From this, the researchers predict an average time of 8.4 seconds per sample, which could be reduced to about 127 milliseconds (0.127 seconds) on Fugaku.
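
The same can be done for the threshold-detector fit; again this only re-evaluates the formula quoted above, with the same rough 66-fold Fugaku factor, and is not code from the paper.

```python
import numpy as np

def threshold_seconds_per_sample(n_clicks):
    """Fitted per-sample runtime of the 100-mode threshold-detector
    chain-rule simulation quoted above: 0.58 + 3.15e-7 * 2^(N/2) seconds."""
    n = np.asarray(n_clicks, dtype=float)
    return 0.58 + 3.15e-7 * 2.0 ** (n / 2.0)

for n in (40, 50, 60):
    t = float(threshold_seconds_per_sample(n))
    print(f"N = {n} clicks: {t:.2f} s per sample, ~{t / 66:.3f} s on Fugaku")
```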
 

 

Figure 3. 100-mode threshold-detector GBS chain-rule benchmark. (A) Number of samples as a function of the number of "clicks", fitted with a Gaussian (red line). (B) Runtime versus the number of "clicks", fitted with an exponential plus a constant (red line).
 

The researchers found that simulating a 60-mode experiment with PNRDs is comparable in complexity to simulating a 100-mode experiment with the same photon density but threshold detectors.

 

The race between quantum computing and classical simulation

 

Jake Bulmer, a PhD student in Bristol's QET Labs and co-first author of the paper, said: "There is an exciting competition underway: researchers keep building ever more complex quantum computing systems that they claim cannot be simulated by conventional computers, while at the same time researchers like us are improving the simulation methods so that we can simulate these machines that were thought to be impossible to simulate."
 
Co-first author Bryn Bell, a Marie Curie research fellow at Imperial College London and a senior quantum engineer at Oxford Quantum Circuits (OQC), said: "As researchers develop larger-scale experiments, they will look to make claims of quantum advantage over classical simulation. Our results will provide a key point of comparison with which to determine the computational power of future GBS experiments."
 
The team's methods do not exploit any imperfections in the experiments, so a natural next step is to combine them with techniques that exploit the flaws of real-world experiments. This would speed up simulation further and give a better understanding of where experiments need to improve.
 
Jake Bulmer said: "These quantum advantage experiments represent a huge achievement of physics and engineering. As a researcher, it is exciting to contribute to the understanding of where the computational complexity of these experiments comes from. We were amazed by the scale of the progress we made. It is not often you can claim to have found a billion-fold improvement!"
 

References:
[1] https://www.science.org/doi/10.1126/science.abe8770
[2] https://www.science.org/doi/10.1126/sciadv.abl9236
[3] https://phys.org/news/2022-01-team-advantage-quantum.html

2022-01-27