Scientists are exploring the potential of quantum machine learning. But whether the convergence of artificial intelligence and quantum computing will lead to useful applications is unclear.

Quantum machine learning is the Avengers of computing. Put together two of the hottest terms in technology, machine learning and quantum computers, and that is what you get. Like the Avengers comics and films, which assemble all-star superheroes into a dream team, the pairing is bound to attract a lot of attention. But in technology, as in fiction, it is important to have a good plot.
If quantum computers can be built on a large enough scale, they promise to solve certain problems more efficiently than ordinary digital electronics by exploiting the unique properties of the subatomic world. For years, researchers have wondered whether those problems might include machine learning, a form of artificial intelligence (AI) in which computers discover patterns in data and learn rules that can be used to draw inferences in unfamiliar situations.
Now, with the release of the much-hyped AI system ChatGPT (which relies on machine learning to enable human-like conversations by inferring relationships between words in text) and the rapid growth in the size and power of quantum computers, both technologies are making great strides.
So will the combination of the two produce anything useful?
A thriving field
Many technology companies, including stalwarts such as Google and IBM, as well as startups such as Rigetti in Berkeley, California, and IonQ in College Park, Maryland, are investigating the potential of quantum machine learning. Academic scientists are also interested.
CERN, the European Laboratory for Particle Physics outside Geneva, Switzerland, has used machine learning to look for signs of the creation of certain subatomic particles from data generated by the Large Hadron Collider. Scientists there are among the academics who are conducting experiments in quantum machine learning.
Physicist Sofia Vallecorsa, head of the Quantum Computing and Machine Learning research group at CERN, said, "The idea is to use quantum computers to accelerate or improve classical machine learning models."
The big unanswered question is whether quantum machine learning has an advantage over classical machine learning. Theory suggests that for specialized computational tasks, such as simulating molecules or finding the prime factors of large integers, quantum computers will speed up calculations that might otherwise take longer than the age of the universe. But researchers still lack sufficient evidence that the same is true for machine learning. Others say that quantum machine learning might find patterns that classical computers miss, even if it is no faster.
Maria Schuld, a physicist based in Durban, South Africa, who works for Xanadu, a quantum-computing company headquartered in Toronto, Canada, says researchers' attitudes toward quantum machine learning span two extremes: a strong interest in quantum learning methods, combined with growing skepticism about the prospect of short-term applications.
Some researchers are beginning to shift their focus to the idea of applying quantum machine learning algorithms to inherently quantum phenomena. Aram Harrow, a physicist at the Massachusetts Institute of Technology (MIT) in Cambridge, says that of all the proposed applications of quantum machine learning, this is "the area where the quantum advantage is pretty clear."
Do quantum algorithms help?
Over the past 20 years, quantum computing researchers have developed a large number of quantum algorithms that could theoretically improve the efficiency of machine learning.
In a seminal achievement in 2008, Harrow, along with MIT physicist Seth Lloyd and Avinatan Hassidim (now at Bar-Ilan University in Israel), invented a quantum algorithm that is exponentially faster than classical algorithms at solving large systems of linear equations, one of the core challenges of machine learning.
But in some cases, quantum algorithms have not lived up to their promise. One high-profile example came in 2018, when computer scientist Ewin Tang found a way to beat a quantum machine-learning algorithm devised in 2016. That algorithm was designed to provide the kind of recommendations that Internet shopping companies and services such as Netflix offer customers on the basis of their previous choices, and it was exponentially faster at producing such suggestions than any known classical algorithm.

Tang, who was an 18-year-old undergraduate student at the University of Texas at Austin (UT) at the time, wrote an algorithm that was almost as fast, but could run on a regular computer.
UT quantum-computing researcher Scott Aaronson, who was Tang's mentor, said, "The quantum recommendation algorithm was a rare example of an algorithm that seemed to provide a significant speedup on a practical problem, so her work makes the goal of exponential quantum speedups for practical machine-learning problems even more remote than it was before."
Tang, who now works at the University of California, Berkeley, says she remains "very skeptical of any claims of significant speedups from quantum technology in machine learning".
A potentially bigger problem is that classical data and quantum computing don't always go well together. Roughly speaking, there are three main steps in a typical quantum computing application:
First, the quantum computer is initialized: its individual storage units, known as quantum bits, or qubits, are placed in a collective entangled quantum state. Next, the computer performs a sequence of operations, the quantum analogues of logic operations on classical bits. In the third step, the computer performs a readout, measuring the states of individual qubits that carry information about the result of the quantum operations. This might mean, for example, determining whether a particular electron inside the machine is spinning clockwise or counterclockwise.
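The three steps can be illustrated with a toy state-vector simulation run on an ordinary computer. This is a hypothetical sketch for intuition only, not code for any real quantum machine: it tracks the four complex amplitudes of a two-qubit state, applies one operation, and samples probabilistic readouts.

```python
import random

# Step 1: initialization -- prepare two qubits in an entangled Bell state.
# Amplitudes are listed for the basis states |00>, |01>, |10>, |11>.
def bell_state():
    s = 1 / 2 ** 0.5
    return [s, 0.0, 0.0, s]

# Step 2: quantum operations -- here a single example gate, a bit-flip (X)
# on the second qubit, implemented by permuting the amplitudes.
def apply_x_on_qubit1(state):
    a00, a01, a10, a11 = state
    return [a01, a00, a11, a10]

# Step 3: readout -- measurement is probabilistic: each basis state is
# observed with probability |amplitude|^2, so the run is repeated many
# times and the outcomes are tallied.
def measure(state):
    probs = [abs(a) ** 2 for a in state]
    r = random.random()
    total = 0.0
    for outcome, p in enumerate(probs):
        total += p
        if r < total:
            return outcome
    return len(probs) - 1

random.seed(0)
state = apply_x_on_qubit1(bell_state())
counts = {}
for _ in range(1000):
    out = measure(state)
    counts[out] = counts.get(out, 0) + 1
# Only outcomes 1 (|01>) and 2 (|10>) occur, each about half the time.
print(counts)
```

Note how the readout step alone forces a thousand repetitions just to estimate two probabilities; this repetition overhead is one reason the initialization and readout stages can eat into quantum speedups.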
The thinnest of straws

Harrow, Hassidim and Lloyd's algorithm promises to speed up the second step: the quantum operations themselves. But in many applications, the first and third steps can be so slow that they cancel out those gains. The initialization step involves loading "classical" data into the quantum computer and converting it into a quantum state, which is often an inefficient process.
Moreover, because quantum physics is inherently probabilistic, the readout yields random results, so the computer must repeat all three steps many times and average the results to arrive at a final answer.
Even once the data have been processed into their final quantum state, it can take a long time to extract an answer, said Nathan Wiebe, a quantum-computing researcher at the University of Washington in Seattle. Speaking at a quantum machine-learning workshop in October, Wiebe said, "We can only draw information from the thinnest of straws."
Schuld said, "When you ask almost any researcher which applications quantum computers will excel at, the answer is, 'Probably not classical data.' So far, there is no real reason to believe that classical data requires quantum effects."
Speed is not the only criterion by which quantum algorithms should be judged, Vallecorsa and others say. There are also signs that quantum machine-learning systems could learn to recognize patterns in data that their classical counterparts would miss.
Karl Jansen, a physicist at the DESY Laboratory for Particle Physics in Zeuthen, Germany, says this could be because quantum entanglement establishes correlations between quantum bits and therefore between data points. He said, "We want to detect correlations in the data that are difficult to detect with classical algorithms."

But Aaronson disagrees. Quantum computers obey well-known laws of physics, so given enough time, an ordinary computer can perfectly simulate how a quantum computer works and predict the results of quantum algorithms.
Aaronson says, "The only interesting question, therefore, is whether the quantum computer is faster than a perfect classical simulation of it."
Fundamental quantum change
Another possibility is to bypass the barrier of converting classical data altogether by using quantum machine learning algorithms on data that is already quantum.
Throughout the history of quantum physics, measuring a quantum phenomenon has meant taking readings with instruments that "live" in the macroscopic, classical world. But a new idea has emerged, based on an emerging technology called quantum sensing, which allows the quantum properties of a system to be measured using purely quantum instruments.
By loading these quantum states directly onto the quantum bits of a quantum computer, it is then possible to use quantum machine learning to discover patterns without any connection to the classical system.
Hsin-Yuan Huang, a physicist at the Massachusetts Institute of Technology and a Google researcher, says this could have a big advantage in machine learning over systems that collect quantum measurements as classical data points. "Our world is inherently quantum mechanical. If you want a quantum machine that can learn, it could be much more powerful."
Huang and his collaborators conducted proof-of-principle experiments on one of Google's Sycamore quantum computers. They used some of the qubits to simulate the behaviour of an abstract material. Another part of the processor then took information from those qubits and analysed it using quantum machine learning. The researchers found that the technique was dramatically faster than making classical measurements and analysing the data classically.

Is it a superconductor?
Collecting and analyzing data entirely in the quantum world could allow physicists to address questions that classical measurements can only answer indirectly, Huang said. One such question is whether a material is in a particular quantum state that makes it a superconductor - able to conduct electricity with almost zero resistance.
Classical experiments require physicists to prove superconductivity indirectly, for example by testing the material's response to a magnetic field.
Jansen said particle physicists are also looking at using quantum sensing techniques to process data generated by future particle colliders, such as LUXE, an experiment at DESY that will smash electrons and photons together. But he added that the idea is at least a decade away from being realized.
Distant astronomical observatories could also use quantum sensors to collect data, which could be transmitted to a central laboratory via a future "quantum internet" and processed by a quantum computer. It is hoped that this will allow images to be captured with unparalleled clarity.
If this application of quantum sensing proves successful, then quantum machine learning could play a role in combining the measurements from these experiments and analyzing the resulting quantum data.
Ultimately, whether or not quantum computers can provide an advantage for machine learning will be determined by experimentation, not by mathematical proof of their superiority or lack thereof.
According to Harrow, "We can't expect to prove everything by way of theoretical computer science."
Aaronson, on the other hand, says, "I certainly think quantum machine learning is still worth investigating, whether or not it ultimately improves efficiency."