IBM vs. Quantum Noise: The Looming Technology Battle

 

At one point, researchers thought that, at least in the short term, they would be stuck with noisy, error-prone systems.

 

But that is beginning to change.

 
 

"Taming" the Noise

 

Over the past 20 years, hundreds of companies, including giants like Google, Microsoft, and IBM, have thrown themselves into the race to build quantum computers. All of these efforts share one goal: to create the next transformative technology.

 

Quantum computers utilize counterintuitive rules that govern matter at the atomic and subatomic levels to process information in a way that traditional or "classical" computers cannot. Experts believe this technology will have an impact in areas as diverse as drug discovery, cryptography, finance and supply chain logistics.

 

The promise is bright, but so is the hype. In 2022, for example, Haim Israel, managing director of research at Bank of America, declared that quantum computing will be "bigger than fire and greater than all the revolutions humanity has experienced."

 

Even among scientists, the competing claims make quantum computing a difficult field to assess.

But assessing our progress in building useful quantum computers comes down to one central factor: whether we can handle the noise.

 

The delicate nature of quantum systems makes them highly susceptible to the slightest disturbance, whether it's stray photons from heat, random signals from surrounding electronics, or physical vibrations. This noise can wreak havoc, generate errors, and even bring quantum computing to a halt.

 

It doesn't matter how big the processor is, or what the "killer application" is: unless the noise can be "tamed", quantum computers will never outperform classical computers.

 

For years, researchers assumed that, at least in the short term, they would be stuck with noisy circuits, and many went looking for applications that could do something useful with that limited capacity. That quest has not gone well, but it may no longer matter. In the past few years, theoretical and experimental breakthroughs have led researchers to declare that the noise problem may finally be solved. A combination of hardware and software strategies shows promise for suppressing, mitigating, and removing quantum errors. It is not a particularly elegant approach, but it looks like it could work, and faster than anyone expected.

 

"I'm seeing more and more evidence in favor of optimism," says Earl Campbell, vice president of quantum science at Riverlane, a quantum computing company in Cambridge, England.

 

Even hardened skeptics have been persuaded. Sabrina Maniscalco, a professor at the University of Helsinki, for example, studies the effects of noise on quantum computing.

 
 

Ten years ago, she says, she was dismissive of quantum computing. "I thought there were some real fundamental problems with it. I wasn't sure there would be a way out."

 

Now, though, she's working on using quantum systems to design improved versions of light-activated anti-cancer drugs that are effective at lower concentrations and can be activated by a less harmful type of light. She believes the project is just two and a half years away from success. For Maniscalco, the era of "quantum utility" - where it makes sense to use quantum processors rather than classical ones for certain tasks - is coming.

 

"In fact, I am confident that we will soon enter the quantum era."

 
 

Putting quantum bits in the cloud

 

After more than a decade of disappointment, quantum computers are finally having their breakthrough moment. Throughout the late 2000s and early 2010s, researchers building and running real-world quantum computers found that they had far more problems than theorists expected.

 

For some, the problems seemed insurmountable. But others, like Jay Gambetta, were undaunted.

 

Gambetta is a quiet Australian with a doctorate in physics from Griffith University on Australia's Gold Coast, a school he chose in part because it let him feed his surfing habit. But in July 2004, he took the plunge and headed for the northern hemisphere to work on the quantum properties of light at Yale University.

 

Three years later (by which time he had stopped surfing due to the frigid waters near New Haven), Gambetta moved further north to the University of Waterloo in Ontario, Canada. Later, he learned that IBM wanted more hands-on work in quantum computing, and in 2011, Gambetta became one of the company's newest employees.

 
Jay Gambetta is in charge of the development of IBM's quantum computers and leads the initiative to put such systems in the cloud
 

IBM's quantum engineers had been busy building quantum bits, the quantum counterparts of the binary bits in classical computers.

 

A year after joining the company, Gambetta ran into a problem: IBM's quantum bits were getting pretty good, and everyone could see it. Whenever he met fellow physicists at conferences, they would ask him to test their latest ideas on IBM's hardware. Within a few years, Gambetta was struggling to keep up with the flood of requests.

 

He recalls, "I started to think this is crazy. Why should we do experiments just for physicists?"

 

"We watched the first jobs come in one after another. We could see them bouncing around the quantum computer. When it didn't break, we started to relax."

 

It occurred to Gambetta that his life might be easier if he could find a way for physicists to operate IBM's quantum bits themselves - perhaps through the cloud.

 

He told his boss about the idea, and then at a party in late 2014, he found himself with five minutes to sell the idea to IBM executives.

 

The only question they asked was whether Gambetta was sure he could make the idea happen. "I thought, how hard can it be?"

 

It turned out to be very hard, not least because the IBM executives told Gambetta he had to do it quickly. "I wanted two years to get it done, and they gave me a year," Gambetta says.

 

It was a daunting challenge: at the time, he barely knew what the cloud was. Fortunately, some of his colleagues did, and they upgraded the remote-access setup the team used to tweak the machines at night or on weekends into a set of interfaces that could be reached from anywhere in the world.

 

Ultimately, the world's first cloud-accessible quantum computer, built from five quantum bits, went live at midnight on May 4, 2016.

 

Cloud-based quantum computing was an instant hit: 7,000 people signed up in the first week, and by the end of the month there were 22,000 registered users. But the experience also made clear that quantum computing still had a big problem.

 

The ultimate goal of the field is to have hundreds of thousands or even millions of quantum bits working together.

But as researchers tested quantum computers containing even a handful of cooperating quantum bits, many of the theoretical assumptions about how much noise they would produce proved to be seriously off.

 

Quantum computers will always generate some noise. Since they operate at temperatures above absolute zero, thermal radiation is always present, so everyone expects some random disturbances to the quantum bits; but there are non-random disturbances too.

 

Temperature changes in the control electronics create noise. Applying a pulse of energy to put a quantum bit in the right state also creates noise. Worst of all, sending a control signal to one quantum bit creates noise in other quantum bits nearby.

 

When quantum algorithms run on a dozen or so quantum bits, their performance is consistently, astonishingly poor.

 

In a 2022 evaluation, Michael Biercuk, director of the Quantum Control Laboratory at the University of Sydney in Australia, and others calculated the probability that the algorithm would run successfully before the noise corrupted the information in the quantum bits and forced the computation off course.

 

If an algorithm with a known correct answer is run 30,000 times, the correct answer may be returned only three times.
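The arithmetic behind numbers like these is simple: per-gate errors compound multiplicatively, so even a small error rate collapses the success probability of a deep circuit. A back-of-the-envelope sketch in Python (the 1% per-gate error rate and 1,000-gate depth are illustrative assumptions, not figures from Biercuk's evaluation):

```python
# Back-of-the-envelope: per-gate errors compound multiplicatively.
# The error rate and gate count below are illustrative assumptions.
p_gate_error = 0.01   # assumed error probability per gate
n_gates = 1000        # assumed number of gates in the circuit

# Probability that every gate in a single run executes without error
p_success = (1 - p_gate_error) ** n_gates

runs = 30_000
expected_correct = p_success * runs
print(f"Per-run success probability: {p_success:.2e}")
print(f"Expected correct answers in {runs} runs: {expected_correct:.1f}")
```

At these assumed rates, only a handful of runs out of 30,000 finish error-free, the same order of magnitude as the evaluation above.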

 

It's disappointing, but it's also educational. "People are learning a lot by actually using these machines, and we're finding out a lot of things that no one knew; or that they knew, but didn't know what to do about."

 

Fixing the errors

 
 

Once the shock of all that noise had worn off, researchers rallied. They have now assembled a set of solutions that, working together, can keep the noise in check.

 

Broadly speaking, the solutions can be divided into three categories.

 

The base layer is error suppression. This is achieved through classical software and machine-learning algorithms that constantly analyze the behavior of circuits and quantum bits, then reconfigure the circuit design and the way instructions are given to better protect the information held in the quantum bits.

 

This is one of the things that Biercuk's company, Q-CTRL, works on; the company says that suppression alone can make quantum algorithms 1,000 times more likely to produce the right answer.

 

The next layer is error mitigation, which capitalizes on the fact that not all errors will cause a computation to fail; many will simply throw the computation off track.

 

By studying the errors that noise produces in a particular system running a particular algorithm, researchers can apply an "anti-noise" technique to quantum circuits to reduce the chances of errors in the computation and output.

 

The technique works a little like noise-canceling headphones, but it's not perfect: it relies on many repeated runs of the algorithm, which drives up operating costs, and the noise can only ever be estimated.
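One concrete mitigation technique in this family is zero-noise extrapolation: run the circuit several times with the noise deliberately amplified, then extrapolate the measurements back to the unreachable zero-noise limit. A minimal sketch with synthetic data (the exponential decay model and every number in it are invented for illustration):

```python
import math

# Noise scale factors: 1 is the hardware's native noise level; 2 and 3
# are produced by deliberately stretching gates to amplify the noise.
scales = [1, 2, 3]

# Synthetic "measured" expectation values at each noise level. In practice
# these would come from many repeated runs of the circuit on real hardware.
true_value = 0.80   # the noiseless answer (unknown in practice)
measured = [true_value * math.exp(-0.2 * s) for s in scales]  # toy decay model

# Quadratic (Lagrange) extrapolation through the three points, evaluated
# at noise scale 0, which no real run can reach directly:
# P(0) = 3*m(1) - 3*m(2) + m(3) for scales 1, 2, 3.
estimate = 3 * measured[0] - 3 * measured[1] + measured[2]

print(f"Value at native noise:    {measured[0]:.3f}")
print(f"Zero-noise extrapolation: {estimate:.3f}")
```

Extrapolating through the three noisy measurements recovers an estimate much closer to the true value than the raw reading at native noise, at the cost of running the circuit once per noise level.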

 

However, Gambetta says it is effective in reducing errors in the final output.

 

Helsinki-based Algorithmiq, of which Maniscalco is CEO, has its own method for cleaning up the noise after the calculation is complete.

 

"It basically removes the noise in post-processing, like cleaning up the garbage in a quantum computer," says Maniscalco. "So far, it seems to work on a fairly large scale."

 

On top of that, there are growing achievements in the field of "quantum error correction" (QEC), which encodes information in the quantum states of a set of quantum bits, rather than preserving information in one quantum bit.

 

An error caused by noise in any of these quantum bits is not as catastrophic as if the information were held in a single quantum bit: by monitoring each additional quantum bit, any changes can be detected and corrected before the information becomes unusable.
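The intuition can be seen in the simplest classical analogue, the three-bit repetition code: spread one bit across three, and a single flip is outvoted. (Real quantum codes such as the surface code are far more elaborate, since quantum states cannot simply be copied; this sketch only illustrates the redundancy principle.)

```python
import random

def encode(logical_bit):
    """Encode one logical bit into three physical bits."""
    return [logical_bit] * 3

def apply_noise(bits, flip_prob):
    """Independently flip each physical bit with probability flip_prob."""
    return [b ^ (random.random() < flip_prob) for b in bits]

def decode(bits):
    """Majority vote: any single flipped bit is corrected."""
    return int(sum(bits) >= 2)

random.seed(0)
flip_prob = 0.05      # assumed per-bit flip probability, for illustration
trials = 100_000

# Unprotected: a single bit fails whenever it flips.
raw_errors = sum(random.random() < flip_prob for _ in range(trials))

# Protected: the encoded bit fails only if two or more of the three flip.
encoded_errors = sum(
    decode(apply_noise(encode(0), flip_prob)) != 0 for _ in range(trials)
)

print(f"Unprotected error rate: {raw_errors / trials:.4f}")
print(f"Encoded error rate:     {encoded_errors / trials:.4f}")
```

With an assumed 5% flip probability, majority voting cuts the error rate from about p to about 3p², roughly an order of magnitude lower; QEC rests on the same trade of redundancy for reliability.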

 

Implementing QEC has long been considered one of the key steps on the road to large-scale, noise-resistant quantum computing, as well as one of the key steps for machines to realize all the promises of the technology, such as the ability to break popular encryption schemes.

 

The problem is that QEC requires a lot of overhead. The gold-standard error-correction architecture, the surface code, requires at least 13 physical quantum bits to protect one useful "logical" quantum bit.

 

When you connect logical quantum bits together, that number balloons: a useful processor might need 1000 physical quantum bits to protect each logical quantum bit.
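Put concretely, that overhead multiplies out quickly. A trivial sketch (the 1,000 logical qubits assumed for a useful processor is an illustrative figure consistent with the field's million-qubit ambitions, not a published requirement):

```python
# Surface-code overhead, using the per-logical ratio quoted in the text
# and an assumed logical-qubit count for a useful processor.
physical_per_logical = 1000   # physical qubits protecting each logical qubit
logical_qubits = 1000         # assumed logical qubits needed (illustrative)

total_physical = physical_per_logical * logical_qubits
print(f"Physical qubits required: {total_physical:,}")
```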

 

However, there are multiple reasons to be optimistic about this right now. For example, in July 2022, researchers at Google released a demonstration of surface codes, where performance gets better, not worse, when more quantum bits are connected together.

 

Theoretical alternatives to surface codes have also been well demonstrated.

 
IBM's Quantum System One is a commercial quantum computer that uses this chandelier-like structure to cool quantum bits
 

In August 2023, an IBM team including Gambetta demonstrated an error-correction technique that could control errors in a 12-qubit storage circuit using only 276 additional quantum bits, a major improvement over the thousands of additional quantum bits required for surface codes.

 

In September, two other teams demonstrated similar improvements in fault-tolerant circuits known as CCZ gates, using superconducting circuits and ion-trap processors.

 

The flourishing of so many noise-handling techniques is welcome, especially now that the hope of getting something useful out of small-scale, noisy processors has proved a dead end.

 

Actual error correction has not yet been implemented on commercial quantum processors (and generally cannot yet run as a real-time process during computation). But Biercuk believes that quantum computing is finally having its day.

 

He says, "I think we're on track now."

 

Alongside these innovations, hardware performance is generally improving: this means that there are fewer and fewer baseline errors in running quantum bits, and the number of quantum bits per processor is increasing, making bigger, more useful computations possible.

 

Biercuk says he has begun to see cases where users may soon choose a quantum computer over the best-performing classical one. Neither classical nor quantum computers can fully solve certain large-scale tasks, such as finding the best routes for delivery trucks across an entire country.

 

But Biercuk points out that accessing and running the best classical supercomputers costs a lot of money: probably more than accessing and running a quantum computer, which might even deliver a slightly better solution.

Kuan Tan, CTO and co-founder of Finnish quantum computer provider IQM, says: "Look at what high-performance computing centers are doing every day. They are running power-hungry scientific calculations that quantum computers can implement with much lower power consumption."

 

Tan argues that a quantum computer doesn't have to be better than other types of computers to attract paying users: it just needs to have comparable performance and be cheaper to run.

 

He predicts that we will realize the quantum energy advantage within the next three to five years.

 

Finding "utility"

 
 

There has long been a debate about what quantum computing researchers should aim for when competing with classical computers.

 

Is it Google's quest for "quantum supremacy" - proving that quantum computers can solve, in a reasonable amount of time, problems that classical computers cannot? Or the pursuit of "quantum advantage" - superior performance on genuinely useful problems? Or IBM's latest buzzword, "quantum utility"?

These semantics reflect different perceptions of the importance of near-term goals.

 

In June 2023, IBM announced that it would begin retiring its entry-level processors from the cloud, making its 127-qubit Eagle processor the smallest the company offers. The move is intended to push researchers toward truly useful tasks.

 

IBM says Eagle is a "utility-class" processor: if handled correctly, it can "provide useful results for problems that challenge the best scalable classical methods".

 

That's a controversial claim. Many people doubt that Eagle can really outperform properly prepared classical computers. But classical computers are already struggling to keep up with it, and IBM has even bigger systems: the 433-qubit Osprey processor (also accessible via the cloud) and the 1,121-qubit Condor processor (debuting in December 2023).

 

Gambetta's reason for naming IBM's quantum processors is simple: "I like birds." The company has a new modular design called Heron, and a Flamingo processor due in 2025 whose chips will be fully quantum-connected, allowing quantum information to flow freely between processors and enabling truly large-scale quantum computing.

 

Gambetta said, "My goal is to make 2025 an important year for demonstrating key technologies that will allow us to scale quantum computing to hundreds of thousands of quantum bits. This will make 2025 the first year that quantum computing can be demonstrably scalable."

 
IBM's modular Heron chips can be linked into multi-chip quantum computers by classical communication links; the company aims for future chips to support quantum links
 

As quantum computing takes great strides and quantum computers begin to process real-world data, technical and geographic diversity will be important to avoid geopolitical issues and problems with data-sharing regulations.

 

For example, there are restrictions aimed at maintaining national security, which may limit market opportunities for multinational giants such as IBM and Google.

 

In early 2022, France's defense minister declared quantum technology to be of "strategic importance" and announced a new national research program.

 

In July 2023, Deutsche Telekom announced a new partnership with IQM to provide cloud-based access to quantum computing, describing it as a way for Deutsche Telekom customers to access a "truly sovereign quantum environment built and managed within Europe".

 

This is more than just nationalist bluster: sovereignty matters.

 

As the era of massive quantum computers posing a serious threat to standard encryption protocols approaches, governments and commercial organizations are looking to test "post-quantum" encryption algorithms - algorithms that can withstand attack by any quantum computer, no matter how large - within their own borders.

 

There is no immediate danger: few believe that large-scale, security-breaking quantum processors are on the horizon. But what is certain is the growing belief that, in just a few years, the field could prove transformative and practical in other ways.

 

And today, that belief is based on real-world achievements. "At Algorithmiq, we believe in a future where quantum utility will soon be realized," says Maniscalco.

 

The only downside for her is that not everyone is as optimistic as she is. Quantum computing is here now, she insists, but old objections are hard to dispel, and many people refuse to see it coming.

 

"There are still a lot of misconceptions: when I see or hear certain conversations, I get very uneasy, and sometimes I wish I had a magic wand to open people's eyes."

 
Reference Links:
[1]https://www.nature.com/articles/d41586-023-03854-1
[2]https://www.technologyreview.com/2024/01/04/1084783/quantum-computing-noise-google-ibm-microsoft/
[3]https://scipost.org/SciPostPhysLectNotes.49/pdf
[4]https://fortworthreport.org/2022/09/17/listen-boa-global-strategist-sees-a-transition-from-oil-to-cleaner-energy-options-and-increased-u-s-based-manufacturing/
[5]https://www.technologyreview.com/2023/10/19/1081389/unbreakable-encryption-quantum-computers-cryptography-math-problems/
 
2024-01-08