Interview with supercomputing expert Estela Suarez: Options for the post-Moore's Law era - quantum and neuromorphic computing


 

Whether it's automatic speech recognition, self-driving cars or medical research: our need for computing power is constantly increasing. However, the possibilities for increasing the performance of microchips will soon reach their limits. So where do we go from here?

 

It's been more than 50 years since Intel co-founder Gordon Moore formulated the observation now known as "Moore's Law," which predicted that the capacity of our digital devices would double every two years for the foreseeable future. The basis for this is that the number of transistors on a silicon microchip can be doubled every two years, which means twice as much computing power in the same amount of space at a fraction of the cost. In the decades that followed, technology companies did manage to regularly double the computing power of silicon chips, driving rapid technological development. But will this continue?
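To get a feel for what a two-year doubling means, here is a small back-of-the-envelope calculation in Python. The 1971 starting point (the Intel 4004, roughly 2,300 transistors) is used purely for illustration.

```python
# Back-of-the-envelope illustration of Moore's Law: transistor count
# doubling every two years from an illustrative 1971 starting point.

START_YEAR = 1971
START_TRANSISTORS = 2_300
DOUBLING_PERIOD_YEARS = 2

def projected_transistors(year: int) -> int:
    """Project transistor count assuming a clean two-year doubling."""
    doublings = (year - START_YEAR) / DOUBLING_PERIOD_YEARS
    return round(START_TRANSISTORS * 2 ** doublings)

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(f"{year}: ~{projected_transistors(year):,} transistors")
```

Run forward to 2021, the clean doubling rule projects tens of billions of transistors per chip, which is roughly the order of magnitude the largest real chips reached.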

 


Estela Suarez, Senior Computer Scientist, Jülich Research Center (Germany)

 

Supercomputing expert Estela Suarez explains why Moore's Law will not hold forever, why development will nevertheless continue apace, and why that matters for advances in medicine, transportation and climate prediction.

 

Full interview.

 

Eva-Maria Verfürth: Since 1965, technological progress has followed a fairly reliable constant known as "Moore's Law." But now we hear that this principle will reach its limits. Why is that?

 

Suarez: Physical limitations are the problem. The structures on a chip, such as circuits, cannot be made infinitely smaller. At some point we will reach the size of an atom, and that is a hard limit.

 

Eva-Maria Verfürth: So, does the end of microchip optimization also mean the end of performance improvement?

 

Suarez: The end of miniaturization means that we will have one less way to improve performance, so we will have to rely on other strategies. There are other optimization methods that can give us higher computing power and that are not directly related to the size of the structures on the microchip.

 

For example, many applications have a problem getting data from memory to the processor quickly: they are constantly waiting for data before they can continue processing. New memory technologies are being developed to solve this problem. They provide more bandwidth so that data can be accessed more quickly. This increases computing power because more operations can be performed in the same amount of time. However, even this strategy will eventually reach its limits. That's why researchers and technology companies are always looking for completely new ways to make the computers of the future even more powerful.
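The situation Suarez describes, processors idling while they wait for data, is often summarized with the roofline model. The sketch below illustrates the idea in Python; the peak-compute and bandwidth figures are assumptions chosen for illustration, not numbers from the interview.

```python
# A minimal roofline-style estimate of when a computation is limited by
# memory bandwidth rather than raw compute. Hardware numbers are assumed.

PEAK_FLOPS = 10e12        # assumed peak compute: 10 TFLOP/s
PEAK_BANDWIDTH = 500e9    # assumed memory bandwidth: 500 GB/s

def attainable_flops(arithmetic_intensity: float) -> float:
    """Attainable FLOP/s for a kernel performing `arithmetic_intensity`
    floating-point operations per byte moved to or from memory."""
    return min(PEAK_FLOPS, PEAK_BANDWIDTH * arithmetic_intensity)

# Example: the vector update y[i] += a * x[i] does 2 FLOPs while moving
# about 24 bytes in double precision (read x and y, write y), giving an
# intensity of ~0.08 FLOP/byte.
intensity = 2 / 24
print(f"attainable: {attainable_flops(intensity) / 1e9:.0f} GFLOP/s "
      f"of a {PEAK_FLOPS / 1e12:.0f} TFLOP/s peak")
```

On these assumed numbers, the kernel reaches well under one percent of peak compute, which is why more memory bandwidth can raise real application performance even on the same chip.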

 

Eva-Maria Verfürth: Why is it important to further increase computational power? Can you give examples of applications or technologies that can only be developed and deployed if computing power continues to increase?

 

Suarez: In science, we want to use computers to analyze increasingly complex phenomena so that we can draw the right conclusions. These phenomena must be simulated with increasing accuracy, which increases our need for higher computational power.

 

Weather and climate models are an example. They draw on data from a variety of sources and take into account many interrelated aspects: from the chemistry of the atmosphere and the dynamics of the oceans to the geography and vegetation of different regions of the Earth's surface. All of these interrelated aspects, and more besides, need to be simulated in order to accurately predict how climate conditions will change over time. Today, even on the most powerful computers, these calculations take weeks to months of computing time.

 

Specifically, this means that higher computing power will lead to more accurate predictions, which in turn will help us guard against weather catastrophes and take more targeted measures to mitigate climate change.

 

Eva-Maria Verfürth: What other areas does this apply to?

 

Suarez: There are similar situations in other areas of research and industry, such as medicine. Today's computers are not yet able to simulate the entire human brain, although this could decisively improve our understanding of neurological diseases. Drug development also requires high computing power to determine the right composition of active ingredients more quickly. Supercomputers are likewise being used to identify new materials needed for more environmentally friendly products. And the development of language models requires ever greater computing power; these models are already part of many areas of our daily lives, such as cell phones, cars and smart TVs.

 

Eva-Maria Verfürth: At the Jülich Research Center, you are working on quantum technology, which will provide higher computational speeds for certain processes. What possibilities might it open up?

 

Suarez: This approach is being pursued vigorously all over the world, and not just at research institutions. Large IT companies such as Google, which is one of our partners, are also working on it.

 

Quantum computers can solve certain problems faster than standard computers because they compute in a completely different way. They take advantage of properties of quantum physics that allow for more compact data storage and faster execution of certain operations. Quantum computers are not necessarily better than "standard" digital computers in every application, but they are expected to be very efficient for optimization problems in particular, such as finding the minimum of a function in a multidimensional parameter space. Navigation is a practical example of this type of problem: when you enter a destination into a GPS system, the computer in the device calculates the best route, taking into account distance, speed limits, current traffic conditions and other factors. This is a typical optimization problem of the kind that quantum computers promise to handle much faster than a normal computer.
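For a concrete feel for the routing problem Suarez mentions, here is the classical baseline: Dijkstra's algorithm finding the cheapest route through a tiny toy road network in Python. The network and its weights are invented for illustration; edge weights stand in for travel time. Quantum approaches target far larger and more constrained instances of such optimization problems.

```python
# Classical baseline for the GPS routing example: Dijkstra's algorithm on
# a toy road network. Edge weights stand in for travel time (distance,
# speed limits, traffic). Network and weights are illustrative.
import heapq

roads = {
    "A": {"B": 4, "C": 2},
    "B": {"C": 1, "D": 5},
    "C": {"B": 1, "D": 8},
    "D": {},
}

def best_route(start: str, goal: str) -> tuple[float, list[str]]:
    """Return (total cost, node sequence) of the cheapest route."""
    queue = [(0, start, [start])]  # (cost so far, node, path)
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, weight in roads[node].items():
            if neighbor not in visited:
                heapq.heappush(queue, (cost + weight, neighbor, path + [neighbor]))
    return float("inf"), []

print(best_route("A", "D"))  # -> (8, ['A', 'C', 'B', 'D'])
```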

 

Imagine a future in which millions of cars in self-driving mode must drive on German roads in a coordinated fashion to get each passenger to their destination as quickly as possible. This would require enormous computing power, which may only be feasible with quantum computers.

 

Eva-Maria Verfürth: So far, no quantum computer has been used outside of research. How far away are we from quantum technology actually becoming available?

 

Suarez: Today's quantum technology is at the same level as ordinary computers from the 1940s. There are some prototype systems that have shown very promising results, but we are still struggling with unreliability and computational errors.

 

The software environment is also still in its infancy: many of the utilities we are used to on ordinary computers, such as highly optimized compilers, specialized libraries and debuggers, are not yet available for quantum computers. And algorithm development still has a long way to go.

 

Eva-Maria Verfürth: What are some other approaches or ideas to enable increased data processing, and how far along are they?

 

Suarez: Neuromorphic computing is another innovative technology: the concept of building a computer that mimics the way the brain works. Our brain is not very fast at performing mathematical operations, but thanks to the massive networking between neurons it is extremely efficient at learning and at recognizing connections between different observations. Our brains require far less energy than conventional computers for tasks such as pattern recognition or language learning. Neuromorphic computers attempt to reproduce such capabilities and apply them to data processing.
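One building block that neuromorphic hardware commonly models is the spiking neuron. The minimal Python sketch below shows a leaky integrate-and-fire neuron: unlike a processor clocked on every cycle, it produces output only when accumulated input crosses a threshold, which hints at where the energy efficiency comes from. All constants are illustrative assumptions.

```python
# A minimal sketch of the spiking-neuron principle behind neuromorphic
# hardware: a leaky integrate-and-fire neuron. It "computes" (spikes)
# only when accumulated input crosses a threshold; otherwise it stays
# quiet, which is one source of the energy efficiency mentioned above.

LEAK = 0.9        # membrane potential decays toward rest each step
THRESHOLD = 1.0   # potential at which the neuron fires
INPUTS = [0.0, 0.3, 0.4, 0.0, 0.5, 0.6, 0.0, 0.0, 0.9, 0.4]

potential = 0.0
for step, current in enumerate(INPUTS):
    potential = potential * LEAK + current  # integrate input, with leak
    if potential >= THRESHOLD:
        print(f"step {step}: spike!")       # event-driven output
        potential = 0.0                     # reset after firing
```

With these inputs the neuron fires only twice (steps 4 and 8); in between, no work is done at all, in contrast to a conventional processor that burns energy on every clock tick.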

 

Eva-Maria Verfürth: What are the current predictions? How fast do experts see the technology evolving in the coming years and decades? What assumptions are they basing their predictions on?

 

Suarez: The current trend is toward diversification. We expect that progress will not be driven by, and cannot be achieved with, a single technology in isolation.

 

This is a trend we have seen in the development of supercomputers for years: they increasingly combine different computing and storage technologies to achieve maximum performance and efficiency. The technological development of the individual components is very dynamic, and entirely new approaches keep emerging. One goal of research in this area is therefore to interconnect the individual components effectively.

 

At the Jülich Supercomputing Center, we are developing a modular supercomputing architecture that brings different approaches together as modules of one supercomputer. We are also working on combining conventional computers with quantum computers.
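To make the modular idea concrete, here is a hypothetical sketch of how a workflow might be split across modules. The module names and the dispatch rule are invented for illustration and do not describe the actual Jülich system.

```python
# Hypothetical sketch of the modular-supercomputing idea: stages of one
# workflow are dispatched to whichever module suits them best. Module
# names and the dispatch rule are illustrative assumptions.

MODULES = {
    "cpu_cluster": "general-purpose tasks",
    "gpu_booster": "highly parallel numerics",
    "quantum_module": "selected optimization subproblems",
}

def dispatch(stage: str) -> str:
    """Pick a module for a workflow stage (toy rule of thumb)."""
    if "optimize" in stage:
        return "quantum_module"
    if "simulate" in stage:
        return "gpu_booster"
    return "cpu_cluster"

workflow = ["preprocess input", "simulate climate fields", "optimize parameters"]
for stage in workflow:
    print(f"{stage!r} -> {dispatch(stage)}")
```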

 

Reference link:

https://thequantuminsider.com/2022/10/26/the-future-of-digitalisation-quantum-and-neuromorphic-computing/

2022-10-31