We're on the verge of a huge change in the DNA of computing, and it's not just quantum technology that's on the way.
Computers are built around logic: the use of circuits to perform mathematical operations. Logic is built around things like adders - the basic circuits that add two numbers together.
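To see what an adder actually does, here's a minimal Python sketch of a ripple-carry adder built from Boolean gate operations (XOR, AND, OR). It's purely illustrative, not how any particular CPU implements addition:

```python
# A full adder: add two bits plus a carry bit using only logic gates.
def full_adder(a: int, b: int, carry_in: int) -> tuple[int, int]:
    sum_bit = a ^ b ^ carry_in                  # XOR gates
    carry_out = (a & b) | (carry_in & (a ^ b))  # AND and OR gates
    return sum_bit, carry_out

# Chain full adders together to add whole numbers, one bit at a time.
def ripple_carry_add(x: int, y: int, bits: int = 8) -> int:
    result, carry = 0, 0
    for i in range(bits):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

print(ripple_carry_add(23, 42))  # 65, same as 23 + 42
```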
This is true of today's microprocessors, just as it was true of the earliest machines in computing history. The principle goes back to the abacus, which at some basic level does the same job as your shiny gaming PC; it's just much, much less powerful.
Today, processors can do a lot of math in a single clock cycle using any number of complex circuits, far beyond simply adding two numbers together. But building shiny new gaming CPUs rests on decades of iteration on the classical machines that came before them.
As might be imagined, it's a bit tricky to build something completely different from that, but that's exactly what some are trying to achieve with technologies like quantum computing and neuromorphic computing: two very different concepts that could revolutionize computing.
"Quantum computing is a technology that we've taken for granted, at least by name, and it's always referred to as the 'future of computing'." Carlos Andrés Trasviña Moreno, software engineering coordinator at CETYS in Ensenada, said.
Quantum computers use quantum bits, or qubits. Unlike classical bits, which can only exist in one of two states (0 or 1), a qubit can also exist in a superposition of those states: it can be 0, 1, or both 0 and 1 at once. If that sounds very confusing, that's because it is, but it also holds enormous potential.
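If you want to poke at the idea yourself, here's a toy single-qubit simulation in Python with NumPy. It's ordinary linear algebra standing in for real quantum hardware: a Hadamard gate puts the qubit into an equal superposition, and measurement picks 0 or 1 at random:

```python
import numpy as np

ket0 = np.array([1.0, 0.0])                   # the qubit starts in state |0>
hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

state = hadamard @ ket0                       # equal superposition of |0> and |1>
probs = np.abs(state) ** 2                    # Born rule: measurement probabilities

print(probs)                                  # [0.5 0.5] -- a coin flip
print(np.random.choice([0, 1], p=probs))      # measuring collapses it to 0 or 1
```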
Quantum computers are expected to be powerful enough to break modern "unbreakable" encryption, accelerate medical discovery, reshape the way goods are transported in the global economy, explore the stars, and revolutionize anything that involves a lot of number crunching.
The problem is that quantum computers are hard to build and perhaps even harder to run.

"One of the main drawbacks of quantum computing is its high power consumption, as the complexity of the algorithms it uses far exceeds that of any CPU currently available," Moreno continued, "In addition, quantum computations require a temperature environment that is close to absolute zero, which also increases the power requirements of the system. Finally, they are extremely sensitive to environmental disturbances such as heat, light and vibration."
"Any of these can alter the current quantum state and produce unexpected results."
While you can replicate the functionality of classical logic with qubits, so we're not exactly starting from scratch in developing these machines, harnessing the real power of quantum computers requires new, complex quantum algorithms that we're only just beginning to master.
IBM is one of the companies investing heavily in quantum computing, with the goal of building a quantum computer with 4,158 or more qubits by 2025; Google is also dabbling in the quantum field.


Admittedly, we're still a long way from ubiquitous "quantum supremacy": the moment when quantum computers outperform today's top classical supercomputers. Back in 2019, Google claimed to have achieved just that, and while it was for a niche task, it's still impressive. In practical terms, though, we're not there yet.
Scientifically speaking, quantum computers are a real pain in the ass to figure out. But that's never stopped good engineers.
"I do think we've scratched the surface in terms of quantum computing." Marcus Kennedy, Intel's general manager of gaming, said, "Again, just as we've broken the laws of physics with silicon over and over again, I think we've broken the laws of physics here."
Artificial intelligence, the buzzword of 2023, points more directly at the future of computing. For many, AI truly is a huge, life-changing development, and not just because of that smart-sounding, somewhat overhyped chatbot in your browser. Today we've only scratched the surface of what we can do with AI, and a whole new kind of chip is in the works to enable deeper, more impactful applications.
"In my opinion, neuromorphic computing is the most viable alternative to classical computing," says Moreno.
"In a sense, we could say that a neuromorphic computer is a biological neural network implemented in hardware. One would think that it just converts perceptrons into voltages and gates, but in reality it comes closer to mimicking the workings of the brain, mimicking how actual neurons communicate with each other through synapses."
What is neuromorphic computing? The answer lies in the name: "neuro", relating to the nervous system, and "morphic", taking its form. Neuromorphic computing aims to mimic the greatest and most complex computer we know of: the brain.
"I think the processing power of the neuromorphic chip will far exceed the processing power of a single chip based on the x86 architecture, which is a traditional architecture." Says Kennedy, "Based on how the brain works, we know it has far more capacity and capability than anything else."
"The most efficient systems tend to be very similar to what you see in nature."
Conventional central processors handle instructions according to a clock: information is transmitted at fixed intervals, as if managed by a metronome. Neuromorphic chips, on the other hand, communicate in parallel and without the rigidity of clock time, by building in digital equivalents of neurons that use "spikes": bursts of current that can be sent at any moment. Like the neurons in our brains, the chip's neurons process incoming current, and each one decides whether to send current on to the next based on the spikes it receives.
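A heavily simplified sketch of that behaviour is the leaky integrate-and-fire model, a textbook abstraction of a spiking neuron. The threshold and leak values below are arbitrary illustrative choices, not parameters of any real chip:

```python
THRESHOLD = 1.0  # potential at which the neuron fires
LEAK = 0.9       # fraction of potential retained each timestep

def simulate(input_current: list[float]) -> list[int]:
    """Return a spike train (1 = spike) for a stream of input current."""
    potential, spikes = 0.0, []
    for current in input_current:
        potential = potential * LEAK + current  # integrate the input, leak a little
        if potential >= THRESHOLD:              # enough charge accumulated: fire
            spikes.append(1)
            potential = 0.0                     # reset after spiking
        else:
            spikes.append(0)
    return spikes

print(simulate([0.3, 0.4, 0.5, 0.1, 0.9, 0.0]))  # [0, 0, 1, 0, 0, 0]
```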
What's striking is that these chips need much less power to process AI algorithms. For example, one neuromorphic chip made by IBM contains five times as many transistors as a standard Intel processor yet uses only 70 milliwatts of power; Intel processors, by contrast, consume between 35 and 140 watts, up to 2,000 times as much.
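The ratio is easy to verify from the article's own figures: 35-140 watts against 70 milliwatts works out to somewhere between 500 and 2,000 times the power draw:

```python
neuromorphic_watts = 0.070                 # 70 milliwatts
cpu_watts_low, cpu_watts_high = 35, 140    # typical Intel CPU range cited above

print(cpu_watts_low / neuromorphic_watts)   # 500.0
print(cpu_watts_high / neuromorphic_watts)  # 2000.0
```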
Neuromorphic chips haven't reached their breakthrough moment yet, but it's coming. Intel is developing its own neuromorphic chips: Loihi and its successor, Loihi 2.

What exactly is a neuromorphic chip? In essence, it's a brain, with neurons and synapses. But because it's still made of silicon, think of it as a hybrid of a traditional computer chip and brain biology.
But it's not necessarily a big brain: with 1 million neurons and 120 million synapses, Loihi 2 is many orders of magnitude smaller than the human brain, which has roughly 86 billion neurons and trillions of synapses. As you might imagine, counting every neuron is hard, so we don't know the exact figure, but our brains are big. Brag about it all you want to your fellow small-brained animals.
It's estimated that a cockroach has about as many synapses as Loihi 2, which gives you a better sense of the amount of grey matter we're talking about.
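For the curious, the gap is easy to put a number on from the figures above (the brain's synapse count here is a commonly cited rough estimate, not a measured value):

```python
loihi2_neurons, loihi2_synapses = 1_000_000, 120_000_000
brain_neurons, brain_synapses = 86_000_000_000, 100_000_000_000_000  # ~100 trillion, rough

print(brain_neurons / loihi2_neurons)    # 86000.0 -- about five orders of magnitude
print(brain_synapses / loihi2_synapses)  # ~833333 -- nearly six orders of magnitude
```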
Neuromorphic computing has a lot of room to grow, and with interest in AI growing rapidly, this emerging technology could be the key to powering the impressive AI models you keep reading about.
You might think AI models already run well today, thanks in large part to NVIDIA's graphics cards, and they do, but at a steep energy cost. "What makes neuromorphic computing so appealing is that it can dramatically reduce the power consumption of processors while still achieving the same computational power as modern chips," says Moreno.
"By comparison, the human brain can process hundreds of teraflops of power and consume only 20 watts, while an average graphics card can put out 40-50 teraflops of power and consume only 450 watts."
Basically, "If a neuromorphic processor were developed and implemented in a graphics processor, it would have more processing power than any existing product, while consuming a fraction of the energy of existing products."
Sound appealing? Yes, of course it does. That efficiency not only unlocks a huge amount of potential computing power, it also cuts energy bills, with knock-on benefits for cooling.
Moreno continues, "Changing the computing architecture also requires a different programming model, which would be a remarkable feat in itself."

It's one thing to make a neuromorphic chip; it's another to program it. That's one of the reasons Intel's neuromorphic computing framework, Lava, is open source.
"What we haven't cracked yet is how to utilize the software behind this structure," says Intel's Marcus Kennedy. "You can make a chip that looks very much like a brain, but it's the software that actually makes it work like a brain. So far, we haven't cracked that."
It will be some time before we can fully replace AI accelerators with something that looks like a brain, or replace adders and binary logic, as old as computing itself, with quantum computers. But we're already beginning to experiment with practical replacements for classical computing as we know it.
A recent breakthrough announced by Microsoft has made the company very bullish on the quantum future, and IBM recently predicted that quantum computers will surpass classical computers in important tasks within two years.
In the words of Intel's Kennedy, "I think we're getting there."
Source: https://thequantuminsider.com/2022/10/26/the-future-of-digitalisation-quantum-and-neuromorphic-computing/