Quantum is creeping closer - ready?

 

Imagine it's 2035 and the headline finally reads "Quantum Computing is Now".

 

With commercially viable quantum computing emerging since quantum advantage was achieved in 2030, a stream of innovation is spreading across all industries. Vaccine and drug simulations have revolutionized human and animal testing, and new international climate models are being trained at higher speeds and with greater accuracy than ever before. Major financial institutions are reporting record speeds and volumes of data processing, trade transactions have increased dramatically, and the world cheered the discovery of a new catalyst that could recycle carbon dioxide into hydrogen.

 

But a few weeks into what has been dubbed the "Summer of Quantum Success," the number of stable quantum bits in quantum computers reached 4,099, and cyber-attacks erupted worldwide as public-key encryption was cracked by Shor's algorithm.

 

This eventful moment has come to be known as "Q-Day."

 

 
 

Dataset after dataset is uncovered, unraveled, distorted, or held hostage for billions of dollars in ransom. Historical data collected long before Q-Day is subject to a series of damaging leaks that bring down organizations. Amid the turmoil, a handful of organizations remain unscathed, thanks to a decade of preparation: they had already moved to quantum-secure, cyber-agile algorithms.

 

Now you might say that imagination has overtaken journalism here, but there is no doubt that a future powered by quantum computing is bound to change the world. Much like the hype cycle of artificial intelligence, the promised capabilities of quantum computing are rapidly approaching their zenith. Fully commercialized quantum computing, like cryptography-cracking quantum computing, is still years away, but experts from IBM, Deloitte, and Accenture are already speaking out about the need to prepare now.

 

 

Today's quantum computers are 'noisy,' expensive, and in dire need of development

 

On the outskirts of Santa Barbara, California, between the orchards and the ocean, sits an unassuming warehouse with brown windows and its exterior painted a dull gray. The facility is barely marked, and its name doesn't appear on Google Maps. A small label on the door reads "Google AI Quantum."

 

Inside, computers are being reinvented from scratch.

 
 

In the middle of the warehouse floor, a device the size and shape of a ballroom chandelier hangs from metal scaffolding. Bundles of cables snake down from the top, passing through a series of gold-plated disks to the processor below. The processor, called Sycamore, is a small rectangular chip with dozens of ports embedded in it. Sycamore uses some of the strangest properties of physics to perform mathematical operations that defy all human intuition. Once connected, the entire device is placed in a cylindrical refrigerator and cooled for more than a day. The processor relies on superconductivity, meaning that its electrical resistance virtually disappears at extremely low temperatures. When the temperature around the processor is colder than the deepest voids of outer space, calculations can begin.

 

Classical computers speak a language of bits with values 0 and 1. Quantum computers, such as the one Google is building, use quantum bits, or qubits, which can take the value 0 or 1, or complex combinations of both. This makes qubits far more powerful: they can perform calculations that ordinary bits cannot. But because of this fundamental change, everything has had to be redeveloped: the hardware, the software, the programming languages, even the programmer's approach to problem solving.
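
To make "complex combinations of both" concrete, here is a minimal sketch, using NumPy and an assumed toy representation (not any real quantum SDK), of a single quantum bit as a pair of complex amplitudes. The squared magnitudes of the amplitudes give the probabilities of reading out 0 or 1.

```python
import numpy as np

# A qubit's state is a pair of complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1; measuring it yields 0 or 1 with those probabilities.
ket0 = np.array([1, 0], dtype=complex)   # behaves like a classical "0"
ket1 = np.array([0, 1], dtype=complex)   # behaves like a classical "1"
plus = (ket0 + ket1) / np.sqrt(2)        # an equal superposition of both

def measure_probs(state: np.ndarray) -> np.ndarray:
    """Born rule: the probability of each outcome is the squared amplitude."""
    return np.abs(state) ** 2

print(measure_probs(plus))   # a 50/50 chance of reading 0 or 1
```

A classical bit can only be one of the first two states; the third has no classical counterpart, and that extra freedom is what the rebuilt hardware and software stack exists to exploit.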

 

Google's quantum computer is the culmination of years of research and hundreds of millions of dollars of investment. It is also barely functional. Today's quantum computers are "noisy," meaning they fail at almost everything they attempt. Yet the race to build them has attracted as much genius as any scientific problem on Earth. Intel, IBM, Microsoft, and Amazon are also building quantum computers.

 

A full-sized quantum computer could crack our current encryption protocols and fundamentally disrupt the Internet. Most online communications, including financial transactions and popular messaging platforms, are protected by encryption keys that would take conventional computers millions of years to decipher. A working quantum computer could probably crack one in less than a day. And that's just the beginning.

 

A quantum computer could open new frontiers in mathematics, revolutionizing our idea of what "computing" means. Its processing power could spur the development of new industrial chemicals to address climate change and food shortages. It could reconcile Albert Einstein's elegant theories with the unruly microcosm of particle physics, perhaps even revealing the origins of space and time.

 

"The impact of quantum computing will be more far-reaching than any technology to date," Jeremy O'Brien, CEO of startup PsiQuantum, said recently. First, though, engineers must make it work.

 
 

Shor's algorithm "exposes the quantum threat."

 

Imagine two pebbles thrown into a calm lake; when the stones hit the surface, they create concentric ripples that collide to produce complex interference patterns.

 

At the turn of the twentieth century, physicists studying the behavior of electrons discovered similar wave-like interference patterns in the subatomic world. This discovery led to a moment of crisis: for under other conditions, these same electrons behaved more like individual points in space, called particles. Soon, in what many consider to be the strangest scientific result ever, physicists realized that whether an electron behaved more like a particle or more like a wave depended on whether someone was observing it.

 

The field of quantum mechanics was born.

 

Over the next few decades, inventors used the discoveries of quantum mechanics to build a variety of technologies, including lasers and transistors. In the early 1980s, physicist Richard Feynman proposed building a "quantum computer" to obtain results that could not be computed by conventional means. The response from the computer science community was lukewarm; early researchers struggled even to get a hearing at conferences. The practical utility of such a device was not demonstrated until 1994, when Peter Shor, a mathematician working at Bell Labs in New Jersey, showed that a quantum computer could help break some of the most widely used encryption standards.

 

Even before Shor published his results, he was approached by a concerned representative of the National Security Agency. "Such a decryption capability could render the military capabilities of the loser virtually irrelevant and its economy overthrown," an NSA official later wrote.

 
 

Shor now chairs the applied-mathematics committee at MIT. "He looks like the kind of guy who would invent algorithms," reads a comment on a video of one of his lectures.

 

An algorithm is a set of instructions for making calculations. A child doing long division follows an algorithm; so does a supercomputer simulating the evolution of the universe. The formal study of algorithms as mathematical objects began in the twentieth century, and Shor's research suggests that there's still a lot we don't know. "In terms of algorithms, we are probably at the level of the Romans relative to numbers," says experimental physicist Michel Devoret, who compares Shor's work to the eighteenth-century breakthroughs involving imaginary numbers.

 

Shor's most famous algorithm proposed using quantum bits to "factor" very large numbers into smaller components. The key to factoring, Shor says, is identifying prime numbers: integers divisible only by one and by themselves. There are twenty-five primes between 1 and 100, but they become rarer as you count higher. Shor drew a series of compact formulas on the board, explaining that certain sequences of numbers repeat periodically along the number line. The distances between these repetitions, however, grow exponentially, making them hard to calculate with a conventional computer.

 

"This is the core of my discovery." Shore said.

 

The challenge was to implement Shor's theoretical work in physical hardware. In 2001, experimental physicists at IBM tried to run the algorithm by firing electromagnetic pulses at molecules suspended in a liquid. "I think that machine cost about half a million dollars," says Shor, "and it told us that 15 is equal to 5 times 3." Classical bits are relatively easy to build: a light switch that can be "on" or "off" works like a bit. The quantum bits of a quantum computer require something more like a dial, or, more accurately, several dials, each of which must be tuned to a specific amplitude. Achieving such precise control at the subatomic scale remains a tricky problem.

 

Nonetheless, the protocols that protect text messages, emails, medical records, and financial transactions will have to be torn out and replaced, a transition security experts call "Y2Q." Earlier in 2022, the Biden administration announced that it was moving toward new quantum-resistant encryption standards designed to withstand Shor's algorithm. Implementing them is expected to take more than a decade and cost tens of billions of dollars, a gold mine for cybersecurity experts. "The difference between this and Y2K is that with Y2K we knew the actual date it would happen," said cryptographer Bruce Schneier.

 

In the run-up to Y2Q, spy agencies are storing encrypted Internet traffic in hopes of reading it in the near future. "We see our adversaries doing this: copying our encrypted data and holding on to it," said Dustin Moody, the mathematician who leads the U.S. post-quantum encryption standardization effort. "This is definitely a real threat." Within a decade or two, most communications of this era could be exposed.

The Biden administration's deadline for cryptographic upgrades is 2035, and quantum computers capable of running a simple version of Shor's algorithm could appear as early as 2029.

 

 

Quantum entanglement: the root of quantum computing research

 

The roots of quantum computing research lie in a scientific concept known as "quantum entanglement." Entanglement is to computing what nuclear fission was to explosives: a strange property of the subatomic world that can be harnessed to create technologies of unprecedented power.

 

If entanglement could be enacted at the scale of everyday objects, it would seem like magic. Imagine that you and a friend each toss one of two entangled coins without looking at the outcome. The outcome of each toss is decided only when you peek at the coins.

If you check your coin and find it heads up, your friend's coin will automatically come up tails; if your friend sees her coin heads up, yours will come up tails. This property holds no matter how far apart you and your friend are. If you traveled to Germany, or to Jupiter, and looked at your coin, your friend's coin would immediately show the opposite face.
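
The coin analogy maps directly onto a two-qubit "Bell state." Below is a minimal NumPy sketch (a toy classical simulation, not real hardware) of sampling from the anti-correlated state (|01⟩ + |10⟩)/√2: every joint peek yields opposite faces, yet neither coin's face is decided in advance.

```python
import numpy as np

# Amplitudes of the anti-correlated Bell state (|01> + |10>) / sqrt(2),
# over the four joint outcomes 00, 01, 10, 11 of the two "coins".
bell = np.array([0, 1, 1, 0], dtype=complex) / np.sqrt(2)
probs = np.abs(bell) ** 2          # Born rule: probabilities of each outcome

rng = np.random.default_rng(0)
for _ in range(5):
    outcome = int(rng.choice(4, p=probs))    # peek at both coins at once
    you, friend = outcome >> 1, outcome & 1  # split into the two coin faces
    print(you, friend)                       # always opposite: 0 1 or 1 0
```

The outcomes 00 and 11 have zero amplitude, so the two results are perfectly anti-correlated no matter how far apart the measurements happen.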

 

If you find entanglement puzzling, you're not alone: it took the scientific community the better part of a century to begin to understand its implications. Like many concepts in physics, entanglement was first described in one of Einstein's thought experiments. Quantum mechanics holds that the properties of a particle assume a fixed value only once measured; before that, the particle exists in a "superposition" of many states at once, described in terms of probabilities.

 

Physicist Erwin Schrödinger famously imagined a cat trapped in a box with a vial of quantum-triggered poison, leaving the cat in a superposition of alive and dead. This disturbed Einstein, who in his later years raised objections to the "new physics" that had succeeded his generation's. In 1935, in collaboration with the physicists Boris Podolsky and Nathan Rosen, he identified an apparent paradox in quantum mechanics: if one takes the implications of the discipline seriously, it should be possible to create two entangled particles, separated by any distance, that somehow interact faster than the speed of light. "No reasonable definition of reality could be expected to permit this," Einstein and his colleagues wrote.

 

However, in the decades that followed, other predictions of quantum mechanics were repeatedly verified in experiments, and Einstein's paradox was ignored.

"Because his views ran counter to the prevailing wisdom of his time, most physicists regarded Einstein's hostility to quantum mechanics as a sign of senility." Science historian Thomas Rickman writes.

 

Physicists at mid-century were preoccupied with particle accelerators and nuclear warheads; entanglement received little attention. In the early 1960s, the Northern Irish physicist John Stewart Bell, working alone, reformulated Einstein's thought experiment as a five-page mathematical argument, and in 1964 he published his findings in the obscure journal Physics Physique Fizika. Over the next four years, his paper was not cited once.

 

In 1967, John Clauser, a graduate student at Columbia University, stumbled upon Bell's paper while leafing through bound journals in the library. Clauser had struggled with quantum mechanics, taking the course three times before earning an acceptable grade. "I was convinced that quantum mechanics must be wrong," he later said. Bell's paper offered Clauser a way to test his objections. Against the advice of his professors, including Richard Feynman, he decided to run an experiment that would vindicate Einstein by showing that quantum mechanics was incomplete. In 1969, Clauser wrote Bell a letter announcing his intentions. Bell responded with delight; no one had ever written to him about his theorem before.

 

Clauser moved to Lawrence Berkeley National Laboratory in California, where, on a shoestring budget, he created the world's first intentionally entangled pairs of photons. He measured the photons when they were about ten feet apart.

Observing a property of one photon instantly produced the opposite result in the other. Clauser and his co-author Stuart Freedman published their findings in 1972. From Clauser's point of view, the experiment was a disappointment: he had definitively proved Einstein wrong. In the end, very reluctantly, Clauser accepted that the puzzling rules of quantum mechanics really do hold. "I confess even to this day that I still don't understand quantum mechanics," Clauser said in 2002.

 

But Clauser had also shown that entangled particles are more than a thought experiment. They are real, and even stranger than Einstein imagined.

 
Ultimately, the resolution of Einstein's paradox is not that particles signal one another faster than light; rather, once entangled, they cease to be distinct objects and function as a single system that exists in two parts of the universe at once. This phenomenon is called nonlocality. Since the 1980s, research on entanglement has produced a steady stream of breakthroughs in theoretical and experimental physics. In October 2022, Clauser shared the Nobel Prize in Physics for his work. In its press release, the Nobel Committee described entanglement as "the most powerful property of quantum mechanics." Bell did not live to see the revolution completed; he died in 1990. Today, his 1964 paper has been cited some 17,000 times.

 

Breakthroughs and challenges at the same time

 

At Google's lab in Santa Barbara, the goal is to entangle many quantum bits at once.

 

Imagine hundreds of coins arranged in a network. Manipulated in a carefully choreographed sequence, these coins can produce astonishing mathematical results. One example is Grover's algorithm, developed in the 1990s by Lov Grover, a colleague of Shor's at Bell Labs.

 

"Grover's algorithm is about unstructured search, which is a great example for Google," says Nevin, the lab's founder, "I like to think of it as a giant closet with a million drawers. In one of those drawers is a tennis ball. On average, someone rooting around in the closet will find the ball after opening half a million drawers. As amazing as that sounds, Grover's algorithm can do it in a thousand steps, and I think the whole magic of quantum mechanics can basically be seen here."

 
 

Neven's career has been eclectic. He originally studied economics, but switched to physics after attending a lecture on string theory. He earned a Ph.D. focusing on computational neuroscience and was hired as a professor at the University of Southern California. While at USC, his research team won a facial-recognition competition sponsored by the U.S. Department of Defense. He founded a company, Neven Vision, that developed facial-filter technology for social media; in 2006, he sold it to Google for $40 million. At Google he worked on image search and Google Glass, turning to quantum computing after hearing about it on public radio.

 

Neven's contributions to facial-analysis technology are widely admired. Over the past few years, in papers published in the world's leading scientific journals, he and his team have also revealed a series of small but strange wonders: photons that clump together; properties that change according to the order in which they are arranged; a singular, ever-mutating state of matter known as a "time crystal." "There are literally a dozen things like this, and every one of them is science fiction," Neven said.

 

Last November, a team led by physicist Maria Spiropulu used Google's quantum computer to simulate a "holographic wormhole," a conceptual shortcut through spacetime; the achievement was recently featured on the cover of the journal Nature.

 

Google's published quantum-computing science has sometimes drawn scrutiny from other researchers. One Nature commentator called their wormhole "the smallest, worst wormhole you can imagine"; Scott Aaronson, a professor at the University of Texas at Austin who specializes in quantum computing, said, "You have to squint," adding that quantum computing isn't going to replace classical methods anytime soon.

 

"Quantum computers are terrible at counting," says Marissa Giustina, a research scientist at Google.

 

Giustina is one of the world's leading experts on entanglement. In 2015, while working in the lab of the Austrian professor Anton Zeilinger, she updated Clauser's 1972 experiment. In October 2022, Zeilinger, too, was named a Nobel laureate.

"After that, I got a bunch of binging messages, and she spoke with some frustration about a machine that may soon simulate complex molecules but is currently incapable of basic arithmetic. "It's contrary to what we experience in our daily lives," she said, "and that's what's most annoying about it, and that's what's beautiful about it."

 

The main problem with Google's entangled quantum bits is that they are not "fault-tolerant." On average, the Sycamore processor makes an error every thousand steps or so.

But a typical experiment requires far more than a thousand steps, so to get meaningful results researchers must run the same program tens of thousands of times, then use signal-processing techniques to distill a little valuable information from mountains of data. The situation could be improved if programmers could examine the state of the quantum bits while the processor is running, but measuring a superposed quantum bit forces it to assume a definite value, ruining the calculation. Such a "measurement" need not be made by a conscious observer; any number of interactions with the environment cause the same collapse.
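
The arithmetic behind those tens of thousands of repeated runs is stark. A back-of-the-envelope sketch, using the roughly one-error-per-thousand-steps figure from the text above, shows how fast the fraction of error-free runs collapses as circuits get deeper:

```python
# Per-step success probability if the processor errs about once per
# thousand steps (the figure quoted above for Sycamore-class hardware).
p_step = 1 - 1 / 1000

for steps in (1_000, 5_000, 10_000):
    p_clean = p_step ** steps   # probability an entire run finishes cleanly
    print(f"{steps:6d} steps -> {p_clean:.2%} of runs are error-free")
```

At a thousand steps only about a third of runs survive, and at ten thousand steps essentially none do, which is why researchers lean on massive repetition and statistical post-processing instead of trusting any single run.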

 

"Getting quantum bits to survive in quiet, cold, dark places is an essential part of scaling quantum computing," said Giustina, noting that Google's processors sometimes fail when exposed to radiation from outside the solar system.

 
 

Improving Shor's codes with "topological" solutions

 

In the early days of quantum computing, researchers feared that the measurement problem would make error correction impossible. But in 1995, Shor showed that entanglement itself could be used to correct errors, mitigating the hardware's high failure rate. Shor's work caught the attention of Alexei Kitaev, a theoretical physicist then working in Moscow, and in 1997 Kitaev improved on Shor's codes with a "topological" quantum error-correction scheme. John Preskill, a theoretical physicist at the California Institute of Technology, speaks of Kitaev, now a professor at the school, with something close to awe. "He's very creative, and he's technically deep," Preskill said. "He's one of the few people I know who I can call a genius without hesitation."
 
 

"I've been to Mt. Wilson about a hundred times," Kitaev said. When the problem is really tough, Kitaev skips Mount Wilson and instead hikes around Mount Baldy, a 10,000-foot peak that is often covered in snow.

 

Quantum computing is a Mount Baldy problem. "In 1998, I predicted that quantum computers would be realized within thirty years," says Kitaev. "I'm not sure we'll make it." Kitaev's error-correction scheme remains one of the most promising routes to a functional quantum computer, and in 2012 he won the Breakthrough Prize, the world's most lucrative science award, for his work. Google later hired him as a consultant; so far, no one has been able to realize his idea.

 
 

Is everything hype?

 

In early 2020, Pfizer scientists began producing hundreds of experimental compounds designed to treat Covid-19. In July of that year, they synthesized seven milligrams of an investigational chemical labeled PF-07321332, one of twenty formulations the company produced that week. PF-07321332 remained an anonymous vial in a lab refrigerator until experiments showed that it was effective at inhibiting Covid-19 in rats. The chemical was then combined with another substance and renamed Paxlovid, a drug cocktail that reduces Covid-19-related hospitalizations by about ninety percent.

 

Paxlovid is a lifesaver, but the laborious trial-and-error process behind it could be shortened with the help of quantum computers. "We're guessing at what we can't yet design directly," says Peter Barrett, a venture capitalist and board member of the startup PsiQuantum. "Our civilization depends completely on that guesswork, and it's by no means the best we can do."

 

Fault-tolerant quantum computers should be able to simulate the molecular behavior of industrial chemicals with unprecedented accuracy, guiding scientists to results far faster. In 2019, researchers predicted that the Haber-Bosch process, the method used to produce ammonia for agriculture, could for the first time be accurately simulated using just a thousand fault-tolerant quantum bits.

Improvements to this process could significantly reduce carbon dioxide emissions. Lithium, the main ingredient in electric-car batteries, is a simple element with atomic number 3; fault-tolerant quantum computers, even primitive ones, might show how to extend batteries' capacity to store energy and increase vehicle range. Quantum computers could be used to develop biodegradable plastics or carbon-free jet fuel. Another use suggested by the consulting firm McKinsey: "simulating surfactants to develop better carpet cleaner."

 

"There is good reason to believe that quantum computers will be able to effectively model any process that occurs in nature," Preskill wrote a few years ago.

 

Hype aside, we can expect a number of real-world use cases for quantum computing in the coming years, with a significant impact on our planet and our lives. The initial hotspots will be areas such as simulations and calculations that don't require sub-millisecond responses, and molecular-level modeling.

 

Richard Hopkins, Distinguished Engineer and Senior Quantum Ambassador at IBM, shared what early quantum commercial viability will likely look like: "I expect there will be enough high-value use cases that don't require millisecond turnarounds, unlike payment-type scenarios for Visa and Mastercard. If we're trying to optimize the planet's energy flows, or biotech pipelines, or something else, that's a problem you could probably solve once a day, rather than trying to do it in a millisecond."

 
 

Currently, keeping quantum bits stable for usable lengths of time remains a hurdle for many use cases. In addition to continuing to increase the number of quantum bits per machine, one of IBM's goals is to create a cloud-based server farm containing multiple quantum processing units (QPUs), to prevent disruptions and avoid the need for a full reset, recalibration, and readout when one of the superconducting cylinders is disturbed.

 

Even the most optimistic analysts believe that quantum computing will not be meaningfully profitable in the next five years, and pessimists warn that it could take more than a decade. It seems likely that many expensive devices will be developed with little lasting use.

 
 

Quantum superiority still has a way to go

 

The true commercial viability of quantum computers on a global scale is still a few years away, with 2030 often touted as the expected time, but some expect it to be sooner.

 

Microsoft has partnered with the quantum hardware companies Quantinuum, IonQ, and Rigetti to provide cloud access to quantum hardware with up to 23 fully connected quantum bits, and Amazon has made a similar move with its Amazon Braket quantum service.

 

IBM currently has a 433-quantum-bit machine, "Osprey," running on its network, and it has announced that it will upgrade 24 other machines so that the entire fleet runs at 127 quantum bits or more, the level of its "Eagle" processor. Many of these machines are free for academics to use, and IBM Quantum's Composer is now free for building, visualizing, and running quantum circuits on quantum hardware or simulators.

 

Hopkins is quick to add a note of caution: while he believes IBM is leading the way in quantum computing research, with the most quantum bits, it still has a way to go before it can claim quantum superiority (that is, commercially viable quantum): "It's going to take some new science, some new math, a lot of research, and maybe even a large supercomputer to fully verify whether what we're doing is right. We want to claim it's right only when we can show that it produces correct results on a real-world problem that conventional computers have never been able to solve in a reasonable timeframe."

 

"The field is very competitive. Superconductivity, ion trapping, photonics, and other technologies, they're all at different stages of development, each with their own strengths and weaknesses. I'm certainly not surprised by the success of multiple quantum technologies, as different technologies are used to solve certain types of problems. But I think the frustrating thing is that we can't apply it everywhere at once. Engineering has a long way to go before it changes the world and the planet. It's going to take a lot more time."

 

Scott Buchholz, CTO of Deloitte's government practice and global quantum computing leader, shares a similar outlook, believing that the rollout of quantum technology will be slower than we expect: "I think people will have the impression that they're going to wake up one morning and quantum computers are going to sweep the world. But the challenge we face is, if we look ahead to January 1, 2030, what will the world look like before people feel that a shift has taken place? Over the next few years, we may start to see niche problems where quantum computers can do better than classical computers. Two years from now, there will be more problems that can be solved better with quantum computers, and so on."

 

"I think people are conceptualizing that a quantum computer will have a light switch or a Frankenstein switch. It will be much more gradual than that and won't be as dramatic as people think."

 
 

Changing strategies to face 2035 head on

 

It is predicted that when the number of quantum bits reaches 4,099, machines will be able to run an algorithm that breaks our current public-key encryption, and perhaps much of our distributed-ledger technology, leaving most of today's cybersecurity systems open to hackers.
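
The 4,099 figure is consistent with a well-known circuit construction (Beauregard, 2003) in which running Shor's algorithm against an n-bit RSA modulus needs about 2n + 3 logical, error-corrected qubits; note that these are logical qubits, and the physical-qubit count would be far larger. A one-line sketch (hypothetical helper name):

```python
def shor_logical_qubits(rsa_bits: int) -> int:
    """Beauregard-style estimate: ~2n + 3 logical qubits to run
    Shor's algorithm against an n-bit RSA modulus."""
    return 2 * rsa_bits + 3

print(shor_logical_qubits(2048))   # 4099: the figure quoted above, for RSA-2048
```

In other words, the threshold in the prediction corresponds to attacking today's standard RSA-2048 keys.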

 

"The timeframe for quantum computers to start doing bad things, the timeframe for cracking cryptography, cracking cyber problems like the things that underpin our daily lives now, looks like it could be between 2030 and 2035."

 

To build cyber protection beyond quantum risk, the next step for organizations is to achieve cyber agility, a more flexible and updatable cybersecurity model that allows them to update their security systems more quickly in the future.

Historically, the time required to update security models has typically been around ten years, while large companies suggest that current quantum-ready updates could take around eight years.

 

The clock is ticking, and we need to act faster than the projected 2035 deadline. Companies such as IBM, Deloitte, and Accenture are working with industry bodies to accelerate the pace of action.

 

The message from some of the world's most influential quantum computing experts is that the race is not about which country or organization achieves a commercially viable quantum advantage the fastest, but rather whether companies can successfully protect themselves against quantum cybersecurity threats in a timely manner and then be able to continue to reap the benefits of the use cases.

 

There is no doubt that the sheer scale of these projects seems insurmountable to some, and the move toward cyber agility calls into question the way businesses have been collecting data for the last decade. In contrast to the massive data grab of the past five or ten years, when the assumption was that the more data we had, the more we could process and the better we could run our businesses, Deloitte is encouraging people to rethink the data they keep.

 

"It's just cyber hygiene that needs to be done right," Buchholz said. "There are two things I hope will come out of this quantum lens. One is that we're going to have to step up cyber hygiene around cryptography, whether there is a quantum threat or not. Second, I think it will make organizations in the commercial and government sectors think twice about the data they store and whether they really need it, because, again, you can't steal data that isn't kept."

 

In addition to the cyber hygiene benefits mentioned above, we also need to consider the costs of storage, both monetary and environmental, as we keep large amounts of unused but potentially risky data all the time.

 

Regardless of the risks posed by potentially bad quantum actors, technologists remain optimistic that beyond the distant ideals of the hype cycle, quantum computing will continue to deliver breakthrough benefits that will outweigh the negative effects of cyber risk.

 

Accenture's Patterson puts it in a nutshell: "It's really going to change the world... Quantum communications will be running by the end of this decade; quantum computers are already available in the cloud. The real winner is humanity, and I fully support the continued global effort to advance science, because it benefits everyone."

 
Reference Links:
[1]https://www.newyorker.com/magazine/2022/12/19/the-world-changing-race-to-develop-the-quantum-computer
[2]https://erp.today/quantum-creeps-closer-ready-or-not/
 

 

2023-10-09