Four Challenges for Quantum Computing Hardware

Recently, the IEEE Spectrum website featured an article by James S. Clarke, Director of Quantum Hardware at Intel Labs, written for this week's IEEE Quantum Week 2022: "Sorting out the truth from the hype of quantum computing" [1].

 


Intel engineers are working to accelerate the development of quantum computing

 

The full article is translated below (the views expressed are the author's own).

 

Few fields have attracted as much unbridled hype as quantum computing. Most people's understanding of quantum physics extends only to the strange sense that it is unpredictable, powerful, and almost existential. In 2019, I provided an update on the state of quantum computing for IEEE Spectrum and looked at both the positive and negative claims across the industry. Just as then, I remain enthusiastically optimistic today. While the hype is real and has outpaced actual results, the quantum field has also accomplished a great deal in the last few years.

 

First, let's talk about the hype.

 

In the last five years, there has been undeniable hype around quantum computing: hype around methods, timelines, applications, and more. Back in 2017, vendors claimed that commercialization of the technology was only a few years away, announcing, for example, a 5,000-qubit system by 2020 (which didn't happen). There was even what I call "anti-hype," in which some people questioned whether quantum computers would ever come to fruition (I hope they will ultimately be proven wrong).

 

Recently, some companies have pushed their timelines out from a few years to a decade, yet they continue to publish roadmaps showing commercially viable systems as early as 2029. These hype-driven expectations are becoming institutionalized. The U.S. Department of Homeland Security has even released a roadmap for protecting against the threat of quantum computing, in an effort to help agencies transition to new security standards. All of this has created an "adopt or be left behind" mentality around quantum computing applications and post-quantum cryptographic security.

 

Market research firm Gartner (known for its "hype cycle" [2]) believes that quantum computing may have reached the peak of inflated expectations, the second phase of its five-phase model. That would mean the industry is about to enter the phase called the "trough of disillusionment." According to McKinsey & Company, "Fault-tolerant quantum computing is expected to materialize between 2025 and 2030, based on published hardware roadmaps for gate-based quantum computing players." I don't think this is entirely realistic, because we still have a long journey to quantum utility, the point at which quantum computers can do something unique enough to change our lives.

 

In my opinion, quantum utility is probably still 10 to 15 years away. However, progress toward this goal is not just steady; it is accelerating, much as we have seen with Moore's Law and the evolution of semiconductors. The more we discover, the faster we go. Semiconductor technology took decades to evolve to its current state, picking up speed along the way, and we expect a similar progression in quantum computing.

 

In fact, we have found that what we learned while designing transistors at Intel also helps accelerate our quantum computing development efforts today [3]. For example, in developing silicon spin qubits, we were able to leverage our existing transistor fabrication infrastructure to ensure quality and speed up manufacturing. We have begun producing qubits on 300 mm silicon wafers in a high-volume manufacturing facility, which allows us to put arrays of more than 10,000 quantum dots on a single wafer. We have also leveraged our semiconductor experience to create a cryogenic quantum control chip, called "Horse Ridge," which helps address the interconnect challenges of quantum computing by eliminating many of the cables that are crammed into dilution refrigerators today. And our experience in testing semiconductors led to the development of a cryogenic probe that allows our team to obtain test results for quantum devices in a matter of hours, rather than the days or weeks it took previously.

 


Horse Ridge Cryogenic Control Chip

 

Others are also benefiting from their own prior research and experience. For example, Quantinuum recently demonstrated entanglement of logical qubits in a fault-tolerant circuit using real-time quantum error correction. While still primitive, it is an example of the kind of progress needed in this critical area. For its part, Google has a new open-source library called Cirq for programming quantum computers. Along with similar libraries from IBM, Intel, and other companies, Cirq is helping to drive the development of better quantum algorithms. Finally, IBM's 127-qubit processor, Eagle, shows steady progress in increasing qubit counts.
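
Since Cirq is mentioned above, here is a minimal sketch of what programming a quantum circuit with it looks like: preparing and sampling a two-qubit entangled (Bell) state on Cirq's built-in simulator. This is only an illustrative snippet, not code from any of the experiments described in the article.

```python
# Minimal sketch: preparing and sampling a two-qubit entangled (Bell) state with Cirq.
# Requires the open-source cirq package (pip install cirq).
import cirq

# Two qubits laid out on a line.
q0, q1 = cirq.LineQubit.range(2)

# Hadamard on q0 followed by CNOT entangles the pair; then measure both qubits.
circuit = cirq.Circuit(
    cirq.H(q0),
    cirq.CNOT(q0, q1),
    cirq.measure(q0, q1, key="m"),
)

# Sample on the built-in simulator; counts should split roughly 50/50 between
# outcome 0 ("00") and outcome 3 ("11"), the signature of an entangled pair.
result = cirq.Simulator().run(circuit, repetitions=1000)
print(result.histogram(key="m"))
```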

 

There are also some key challenges that remain.

 

First, we still need better, higher-quality qubits. While the best single- and two-qubit gates meet the required fault-tolerance threshold, this has not yet been achieved across a larger system.
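
To make "fault-tolerance threshold" concrete, a commonly cited textbook relation for surface-code-style error correction (an illustrative rule of thumb, not a figure from this article) is:

```latex
% Illustrative below-threshold scaling of the logical error rate p_L with code distance d:
% the physical error rate p must sit below the threshold p_th before adding qubits helps.
p_L \approx A \left( \frac{p}{p_{\mathrm{th}}} \right)^{(d+1)/2}
```

Only when every gate in the system operates below the threshold does growing the code (larger distance d) actually suppress logical errors, which is why beating the threshold on a handful of qubits is not the same as beating it across thousands.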

 

Second, we have yet to see anyone propose an interconnect technology for quantum computers, something no less important than the way we wire up microprocessors today. Right now, each qubit requires multiple control lines, an approach that is untenable for a large-scale quantum computer.
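
As a rough back-of-the-envelope sketch of why brute-force wiring does not scale (the lines-per-qubit figure below is an assumption for illustration, not a number from the article):

```python
# Back-of-the-envelope sketch of the wiring problem: if every qubit needs its own
# dedicated control/readout lines, the cable count grows linearly with qubit count.
# The lines-per-qubit figure is an illustrative assumption and varies by technology.
LINES_PER_QUBIT = 3  # e.g. drive, readout, bias

for n_qubits in (100, 10_000, 1_000_000):
    lines = n_qubits * LINES_PER_QUBIT
    print(f"{n_qubits:>9,} qubits -> {lines:>9,} lines into the refrigerator")
```

At a million qubits, naive one-line-per-signal wiring would mean millions of cables running into a dilution refrigerator, which is exactly the bottleneck that control chips such as Horse Ridge aim to remove.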

 

Third, we need fast qubit control and feedback loops. Horse Ridge is a pioneer here: by putting the control chip inside the refrigerator, and thus closer to the qubit chip, we expect to improve latency.

 

Finally, there is error correction. While there have been recent signs of progress in error correction and mitigation, no one has yet run an error-correction algorithm on a large group of qubits.
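
To give a sense of what the smallest possible version of this idea looks like, here is a minimal sketch of the three-qubit bit-flip repetition code, again using Cirq. Real fault-tolerant schemes such as the surface code are far more involved; this toy example only illustrates the underlying principle of redundancy and majority voting.

```python
# Minimal sketch of the idea behind error correction: the three-qubit bit-flip
# repetition code. A logical bit is stored redundantly on three physical qubits,
# so a single bit-flip error can be outvoted at decode time.
import cirq

data = cirq.LineQubit.range(3)

circuit = cirq.Circuit(
    cirq.X(data[0]),               # prepare the logical bit as |1> on the first qubit
    cirq.CNOT(data[0], data[1]),   # fan the value out across the other two qubits
    cirq.CNOT(data[0], data[2]),
    cirq.X(data[1]),               # deliberately inject a single bit-flip error
    cirq.measure(*data, key="m"),  # read out all three physical qubits
)

samples = cirq.Simulator().run(circuit, repetitions=100).measurements["m"]

# Majority vote over the three physical bits recovers the logical bit despite the error.
decoded = (samples.sum(axis=1) >= 2).astype(int)
print("decoded logical values:", set(decoded))  # expected: {1}
```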

 

These are challenges we will overcome, as new research regularly demonstrates new approaches and advances. For example, many in the industry are working on ways to integrate qubits and their controllers on the same chip to create a quantum system-on-chip (SoC).

 

But we are still quite a long way from a fault-tolerant quantum computer. Over the next 10 years, Intel expects to be competitive with (or ahead of) the pack in terms of qubit count and performance, but as I've said before, a system powerful enough to deliver compelling value is still 10 to 15 years away for everyone. The industry needs to keep improving both the quantity and the quality of qubits; the next milestone is to produce thousands of high-quality qubits (still a few years away) and then to scale to millions.

 

Keep in mind that it took Google 53 qubits to demonstrate a computation that matched the capabilities of a supercomputer. If we want to explore new applications beyond the reach of today's supercomputers, we will need systems that are orders of magnitude larger.

 

Quantum computing has come a long way in the last five years, but there is still a long way to go, and investors will need to fund it for the long term. Significant developments are happening in the lab, and they show great promise for what could come. For now, it's important that we don't get caught up in the hype and instead focus on real results.

 

Reference links:

[1] https://spectrum.ieee.org/ieee-quantum-week

[2] https://www.gartner.com/en/research/methodologies/gartner-hype-cycle

[3] https://www.intel.com/content/www/us/en/research/quantum-computing.html

2022-09-21