A new milestone: superconducting quantum circuits pass the Bell test, challenging Einstein's local realism
For the first time, superconducting circuits separated by 30 meters have passed a loophole-free Bell test. The experiment demonstrates that quantum mechanics violates "local causality" and confirms, with new precision, that the two circuits' quantum bits are truly entangled.
Among the many physical platforms explored for quantum computing, superconducting quantum-bit processors have stood out over the past decade for their growing bit counts and system performance. Researchers have now built superconducting quantum computers with up to several hundred quantum bits and have run a range of quantum algorithms on them, including simulations of physical systems and quantum error correction, paving the way toward useful quantum computing.
However, these quantum bits encode quantum information in tiny amounts of energy stored in circuit elements that must be cooled to millikelvin temperatures, making it extremely challenging to transfer quantum information between processors.
Writing in Nature on May 10, the team of Andreas Wallraff, professor of solid-state physics at ETH Zurich, reports a pair of superconducting quantum bits that can be entangled with high fidelity across a distance of 30 meters.

The research paper, titled "Loophole-free Bell inequality violation with superconducting circuits," is now available as open access.
Their approach relies on entanglement, a quantum property that is a key ingredient in proposed strategies for transmitting quantum information. The authors also demonstrate fast readout of a quantum bit's state, which allowed them to carry out an experimental feat known as a "loophole-free Bell test" - the first successful such test using superconducting quantum bits.

A diagram of the experimental setup. A 30-meter-long cryogenic vacuum system houses two superconducting quantum bits connected by a superconducting aluminum waveguide at temperatures below 50 millikelvin. To realize the experiment, the team developed techniques that push the limits of the superconducting quantum-bit platform, a promising technology for future quantum computers.
Quantum theory was conceived to reconcile observations inconsistent with classical physics, but its founders ran into stumbling blocks when the theory's logical consequences clashed with their physical intuitions. The central objection raised by Albert Einstein, Boris Podolsky and Nathan Rosen in a famous 1935 paper was that quantum theory was incomplete and needed to be augmented with hidden variables in order to capture the real physical situation.

Can the description of physical reality by quantum mechanics be considered complete?
The basis of their objection is as follows:
- According to quantum theory, a pair of quantum objects can be described as a single unit, and measurements on each object return correlated results even if the two objects are physically far apart - a property later termed entanglement.
- However, in an acceptable theory, each object should be completely describable individually, and no communication or other physical influence should travel faster than the speed of light - the principle of "locality".
Because quantum theory does not provide an independent description of each entangled object, Einstein and colleagues concluded that the theory was incomplete.
Although the physicists responsible for developing quantum mechanics found the theory deeply disturbing, this property of entanglement is now considered an important property of quantum technology and is widely used in technologies such as quantum computing.
Nearly three decades later came a turning point in the understanding of entanglement.
In 1964, John Bell discovered that there is a limit to how strongly measurements on distant quantum systems can be correlated in the kind of local hidden-variable theory envisioned by Einstein and his colleagues. Bell captured this limit in a mathematical expression, Bell's inequality, and observed that the predictions of quantum mechanics can violate it. This implies that no augmentation can make quantum theory consistent with a local hidden-variable theory, because in some cases the two theories predict different results.
The next natural step is to detect these cases in experiments.
To test Bell's inequality, one distributes a pair of entangled objects between two distant measurement stations, conventionally called Alice and Bob, each of whom chooses one of two measurement settings and records one of two possible outcomes. After repeating this process many times, Alice and Bob combine their data to determine the probability of each pair of outcomes under each pair of setting choices. These probabilities enter Bell's inequality; data that violate the inequality cannot be described by any local hidden-variable theory, and quantum mechanics predicts exactly such violations.
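The arithmetic behind the test can be sketched in a few lines. This is an illustrative calculation, not the authors' analysis: it uses the CHSH form of Bell's inequality and one common convention in which the correlation of a maximally entangled pair measured at analyzer angles a and b is E(a, b) = cos(a − b).

```python
import math

# CHSH form of Bell's inequality: S = E(a,b) - E(a,b') + E(a',b) + E(a',b').
# Any local hidden-variable theory obeys |S| <= 2; quantum mechanics allows
# values up to 2*sqrt(2) (Tsirelson's bound).

def correlation(theta_a, theta_b):
    # Quantum prediction for a maximally entangled pair, in the convention
    # where the average product of the +/-1 outcomes is cos(theta_a - theta_b).
    return math.cos(theta_a - theta_b)

def chsh(a, a2, b, b2):
    return (correlation(a, b) - correlation(a, b2)
            + correlation(a2, b) + correlation(a2, b2))

# Setting angles that maximize the quantum violation
S = chsh(0, math.pi / 2, math.pi / 4, 3 * math.pi / 4)
print(S)  # 2*sqrt(2) ≈ 2.828, above the classical bound of 2
```

Any experimentally measured S above 2 rules out local hidden-variable explanations, provided the loopholes discussed below are closed.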
The first Bell experiments were conducted in the 1970s, and most of them found results that violated Bell's inequality - contradicting local hidden-variable theories and consistent with quantum mechanics.
The first practical Bell experiment was conducted in the early 1970s by Stuart Freedman and John Clauser, the latter of whom shared the 2022 Nobel Prize in Physics. These two researchers showed that Bell's inequality was indeed violated; however, they had to make certain assumptions in order to perform the experiment at all. So, in principle, Einstein could still have been right to be skeptical of quantum mechanics.
This was very surprising at the time, and researchers began looking for experimental errors or alternative explanations for the results.
Mapping Bell's thought experiment onto laboratory experiments necessarily involved a number of assumptions, each of which opened a loophole through which a local hidden-variable theory could explain the observed results. For example, experiments using entangled photons often detect only a small fraction of the photons generated; researchers then invoke the fair-sampling assumption: that the detected subset is representative of the whole. However, one can write down a local hidden-variable theory that exploits this fair-sampling loophole to produce apparent violations of the inequality.
Similarly, a local hidden-variable model that includes hidden communication between the stations can explain the measurement results - the so-called locality loophole. Because no communication can travel faster than light, experimenters can rule out any such hypothetical signal by carefully timing Alice's and Bob's setting choices and measurements relative to the physical distance between them.
Since 2015, scientists have reported experiments that close all the major loopholes simultaneously, using systems based on defects in diamond, optical photons and trapped atoms. Now, Storz et al. report the first such experiment on a superconducting platform.
Building a "loophole-free" Bell experiment on any platform is challenging because the requirements for closing each loophole often conflict. Closing the locality loophole requires a large separation and fast, precise timing; closing the fair-sampling loophole requires low loss, which usually favors a smaller experiment so that the distributed entanglement is not lost along the way; and obtaining sufficient statistics requires generating data much faster than losses and environmental fluctuations perturb the system.
Two key technical developments allowed the Bell experiment by Storz and colleagues to succeed:
- They achieved single quantum-bit readout in about 50 nanoseconds, much faster than the several hundred nanoseconds typical of the most advanced multi-quantum-bit systems. From the total measurement time, the ETH researchers determined that the shortest distance over which a loophole-free Bell test can succeed is about 33 meters, since light in vacuum takes about 110 nanoseconds to travel that distance.
- They then developed low-loss cryogenic waveguides of this length and integrated them with the quantum bits to achieve a high-fidelity link.
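The timing budget above can be checked with back-of-the-envelope arithmetic. The numbers are those quoted in the article (the ~110 nanoseconds covers everything one station does, including the ~50-nanosecond readout); the calculation itself is just distance = speed of light × time.

```python
# Locality-loophole timing budget: each station's entire measurement, from
# setting choice to recorded outcome, must finish before a light-speed signal
# could cross from the other station.
c = 299_792_458            # speed of light in vacuum, m/s
measurement_time = 110e-9  # seconds, per the article's quoted total
min_separation = c * measurement_time  # shortest station separation that closes the loophole
print(f"minimum separation ≈ {min_separation:.1f} m")  # ≈ 33.0 m
```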
The team built this impressive facility in an underground tunnel on the ETH campus. At each end of it is a cryogenic device containing superconducting circuits. The two cooling devices are connected by a 30-meter-long tube whose interior is cooled to a temperature slightly above absolute zero (-273.15°C).

"We have 1.3 tons of copper and 14,000 screws in the machine, as well as a lot of physics knowledge and engineering know-how," Wallraff says. He believes it is possible in principle to build facilities that overcome greater distances in the same way. For example, the technology could be used to connect superconducting quantum computers over great distances.

A space-time diagram of the events in the experiment.

S-values, entangled states and readout fidelity.
Before each measurement begins, a microwave photon is transmitted from one of the two superconducting circuits to the other, entangling the two circuits. A random number generator then decides which measurement to make on each circuit as part of the Bell test. Finally, the measurement outcomes from the two sides are compared.
After evaluating more than a million measurements, the researchers showed with very high statistical certainty that Bell's inequality was violated in this experimental setup. In other words, they have shown that quantum mechanics also allows for non-local correlations in macroscopic circuits and, therefore, superconducting circuits can be entangled over a large range.
Specifically, thanks to the performance of the quantum bits, the researchers were able to perform more than a million individual trials in just 20 minutes; the resulting correlations exceeded the limit set by Bell's inequality by a staggering 22 standard deviations - a result with a p-value below 10⁻¹⁰⁸.
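To get a feel for why a million trials produces such overwhelming significance, here is a toy Monte Carlo sketch. It is purely illustrative - idealized, noise-free correlations of ±1/√2 for the four CHSH setting pairs, not the experiment's actual data or statistical analysis - but it shows how the standard error on S shrinks with the number of trials.

```python
import math
import random

random.seed(1)

r = 1 / math.sqrt(2)
# Idealized quantum correlations for the four CHSH setting pairs
E_true = {"ab": r, "ab'": -r, "a'b": r, "a'b'": r}

n_per_setting = 250_000  # ~1e6 trials total, the experiment's order of magnitude
E_hat, var = {}, {}
for pair, E in E_true.items():
    # Each trial yields an outcome product of +1 with probability (1 + E) / 2
    products = [1 if random.random() < (1 + E) / 2 else -1
                for _ in range(n_per_setting)]
    E_hat[pair] = sum(products) / n_per_setting
    var[pair] = (1 - E_hat[pair] ** 2) / n_per_setting  # variance of the mean

S = E_hat["ab"] - E_hat["ab'"] + E_hat["a'b"] + E_hat["a'b'"]
sigma_S = math.sqrt(sum(var.values()))
print(f"S = {S:.4f} ± {sigma_S:.4f}, "
      f"violation by ~{(S - 2) / sigma_S:.0f} standard deviations")
```

In this noise-free toy model the violation comes out at hundreds of standard deviations; the real experiment's 22 standard deviations reflect imperfect entanglement fidelity and readout errors.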
This Bell experiment set a record for the longest separation between two entangled superconducting quantum bits, and it is impressive for both its physical scale and its precision. Although the 50-nanosecond readout demonstrated here cannot be readily applied to multi-quantum-bit quantum computers, it pushes this quantum-bit technology to new limits. Similarly, although the superconducting waveguide approach does not scale to arbitrary distances, it represents a path to quantum information transfer between superconducting quantum-bit chips - a technology needed in large-scale quantum computers.
With the achievement of this fundamental quantum milestone, and the technological advances that make it possible, Storz et al. have expanded the superconducting quantum bit possibilities, providing interesting possible applications in the fields of distributed quantum computing and quantum cryptography.
The two main factors limiting the system's performance are errors in the quantum bits and loss of the photons used to entangle them. The researchers believe that by improving both, they can turn quantum bits into the most stringent test of Bell's inequality yet.
But this work could become even more important because of how it entangles quantum bits.
Everyone working on superconducting quantum bits agrees that we will eventually need to integrate thousands of quantum bits into a single quantum computer. Unfortunately, each quantum bit requires considerable space on the chip, which makes building chips with more than a few hundred quantum bits difficult. As a result, large companies such as Google and IBM eventually plan to connect multiple chips into a single computer (something the startup Rigetti is already doing).
However, with tens of thousands of quantum bits, we will almost certainly need so many chips that it will be hard to put them all in a single cooled piece of hardware. This means that we will eventually want to connect the chips to different cooling systems - which is exactly what is demonstrated here. So this is an important proof that we can in fact connect quantum bits in these systems.
Reference links:
[1] https://www.nature.com/articles/d41586-023-01488-x
[2] https://www.newscientist.com/article/2372828-superconducting-qubits-have-passed-a-key-quantum-test/
[3] https://arstechnica.com/science/2023/05/qubits-used-to-confirm-that-the-universe-doesnt-keep-reality-local/
[4] https://phys.org/news/2023-05-entangled-quantum-circuits-einstein-concept.html
[5] https://www.nature.com/articles/s41586-023-05885-0