With the rise of AI, photonic computing's moment has arrived!
Imagine clicking anything on your computer, laptop, phone, or smartwatch with no perceptible latency, a delay so small the human brain wouldn't even notice it. That is the promise of optical computers built on photonic technology: information flowing and being processed at the speed of light, right at our fingertips.
Right now, our systems compute with electrical signals: transistors, usually made of silicon, implement the logic of today's computers. Billions of microscopic transistors on a modern chip switch electrical currents, encoding the zeros and ones of binary, the language our computers understand.
But according to the International Energy Agency (IEA), data centers and data transmission networks each account for 1-1.5% of global electricity use. And much of a data center's energy goes not to computation but to overhead: cooling and power delivery consume almost as much as the servers themselves. The underlying electronic technology is approaching saturation.
Figure: Global trends in digital and energy metrics, 2015-2021
Moore's Law isn't dead, but it is in decline. At some point transistors will approach the size of a single atom, and it will no longer be possible to keep shrinking them and increasing their density. Scaling will become nearly impossible, power consumption will rise, and transistors will stop getting cheaper. We simply won't be able to keep packing more silicon transistors onto a chip the way we have in decades past.
The rise of artificial intelligence adds further pressure. AI chips now power Internet search, language translation, image recognition, self-driving cars, and more. Technology companies such as Google have found ways to make digital AI computing more efficient, but efficiency gains alone cannot keep pace with demand.
So what do we do today? We look to emerging technologies: quantum computing, neuromorphic chips, photonics, and new materials. Among these, recent advances make photonic computing the strongest contender to revolutionize computing hardware once again.
Unlike a traditional computer, a photonic computer does not encode and manipulate data as electrical currents representing zeros and ones; instead, it uses the physical properties of light itself to perform calculations.
A typical computer rests on three things it does well: computation, communication, and storage. In an electronic machine, these are carried out by manipulating current with transistors, capacitors, resistors, and other components; in a photonic machine, light is manipulated with photodetectors, phase modulators, waveguides, and the like. These are the building blocks of electronic and photonic computing, respectively.
In short, photonic computing relies on manipulating the properties of photons, where electronic computing relies on manipulating electrons. So how does working with light yield better performance on the same binary language?
Specifically, photonic computing uses light, which makes it faster, more efficient, and more compact.
1) Higher bandwidth. The wave nature of light allows many signals to travel through the same channel at once, for example on different wavelengths, so optical systems can pack in far more information and reach much higher bandwidths. This lets optical computers be more compact and handle more complex data (see the rough sketch after this list).
2) Higher efficiency. Light suffers far lower transmission losses than electric current, so it does not generate the same level of heat, making optical computing highly energy efficient. Nor does one need to worry about electrical shorts: optical signals are immune to electromagnetic interference.
3) Faster processing. Even under ideal conditions, electrical signals propagate at only 50-95% of the speed of light, which gives optical computers an edge over conventional ones.
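To make the bandwidth claim concrete, here is a minimal back-of-the-envelope sketch in Python. The channel count, per-channel rate, and electrical-lane figure are illustrative assumptions, not values from this article: wavelength-division multiplexing (WDM) lets a single waveguide carry many independent data streams at once.

```python
# Back-of-the-envelope: aggregate bandwidth of one waveguide under WDM.
# All numbers below are illustrative assumptions, not measured values.

num_channels = 64            # assumed number of WDM wavelength channels
rate_per_channel_gbps = 50   # assumed modulation rate per channel (Gb/s)

optical_aggregate_gbps = num_channels * rate_per_channel_gbps

# A single fast electrical lane, for comparison (assumed figure).
electrical_lane_gbps = 100

print(f"one optical waveguide: {optical_aggregate_gbps} Gb/s")  # 3200 Gb/s
print(f"one electrical lane:   {electrical_lane_gbps} Gb/s")
print(f"ratio: {optical_aggregate_gbps / electrical_lane_gbps:.0f}x")
```

The point is not the exact numbers but the structure: the optical figure scales with the number of wavelengths carried in parallel, while an electrical line is limited to a single carrier.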
The potential of photonic computing is huge, but how far away are we from realizing it? The answer starts with its history.
Back in the 1980s, scientists at Bell Labs made early attempts at building an optical computer. Such a computer would offer vastly more bandwidth than electronics: hundreds of terahertz (10^14 Hz) rather than several gigahertz (10^9 Hz). By the mid-1980s, hopes for the technology had reached a fever pitch.
"By the mid-1990s, we would have flexible programmable computers. You may never know there are optics inside. You'll see no blinking lights. It will be very dull looking. But it will go in circles with everything else. Electronics just can't keep up with us."
-- Dr. Henry J. Caulfield (New York Times, 1985)
Bell Labs' approach to optical computing relied on building an optical version of the electronic transistor, the device used to switch (or amplify) electrical signals. Unlike the electrons in the transistors of your cell phone and computer, beams of light do not interact directly with one another. They can, however, interact with materials: by temporarily changing the properties of the material it passes through, one beam of light can be "felt" by another.
Unfortunately, the Bell Labs scientists' predictions did not come true, in large part because of the difficulty of implementing an "optical transistor". Each optical transistor absorbs some light, so signals grow progressively weaker as they propagate, limiting the number of operations a system can perform in sequence. Storing optical data, moreover, is a challenging problem that remains unsolved. The unfulfilled promises and hype of the 1980s left the scientific community deeply skeptical of optical computing research.
Yet just as the optical transistor was falling out of favor, a new kind of optical computing was quietly being invented.
In the mid-1990s, quantum computing was growing rapidly as evidence emerged that quantum systems could solve problems intractable on classical computers. In 1994, with an eye toward building an optical quantum processor, Michael Reck described how an array of Mach-Zehnder interferometers (MZIs), a basic optical component, could perform the key mathematical operation of matrix multiplication.
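To see why an MZI mesh amounts to matrix multiplication, here is a minimal numpy sketch. It is not taken from Reck's paper; it uses the standard textbook model of an ideal MZI as two 50:50 couplers around a tunable internal phase, plus a phase on one input, and checks that the result is a programmable 2x2 unitary matrix acting on the optical amplitudes.

```python
import numpy as np

def coupler():
    """Ideal 50:50 directional coupler (a fixed 2x2 unitary)."""
    return np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)

def phase(p):
    """Phase shift p applied to the upper arm only."""
    return np.diag([np.exp(1j * p), 1.0])

def mzi(theta, phi):
    """Transfer matrix of one MZI: input phase, coupler, internal phase, coupler."""
    return coupler() @ phase(theta) @ coupler() @ phase(phi)

U = mzi(theta=0.7, phi=1.3)   # the two phases "program" the matrix

# The matrix is unitary: light passing through loses no energy (ideally).
assert np.allclose(U @ U.conj().T, np.eye(2))

# "Multiplying" a vector by U is just sending two optical amplitudes
# through the device in a single pass of light.
x = np.array([1.0, 0.0])      # light enters the top port only
y = U @ x
print(np.abs(y) ** 2)          # output power split between the two ports
```

Reck's insight was that a triangular mesh of such 2x2 blocks extends this to any NxN unitary matrix, so an entire matrix-vector product happens at the speed of light through the mesh.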
At the time, optical experiments were typically performed with bulky components bolted to an optical table (a heavy, vibration-damped metal surface often called a "breadboard") to ensure mechanical stability. On such platforms it is effectively impossible to stabilize an optical system juggling dozens of beams at once: even small vibrations or temperature changes introduce systematic errors. So although the idea of building a larger optical computer by connecting small optical circuits was revolutionary, the technology to realize it had not yet caught up with the theory.
One solution lay in shrinking optical elements from meters down to tens of microns, using "integrated photonics": components on computer chips that can be easily manufactured and controlled. The telecommunications industry was already actively developing photonic chips, driven by interest in the fiber-optic networks that form the backbone of today's Internet. Still, it was not until about 2004 that photonic integrated circuits with large numbers of components were manufactured, and not until 2012 that fabrication facilities began offering multi-project wafer (MPW) services for silicon photonic chips, letting academic research groups share resources and fabricate small batches of designs at low cost.
The figures below show several integrated photonic chip platforms currently available.
Photonic chips based on femtosecond laser-written silica. (a, b) Embedded photonic circuits built from basic components such as directional couplers, enabling complex three-dimensional interconnections. (c, d) Buried photonic lattices with continuous coupling along the longitudinal direction, with fractal and hexagonal cross sections, respectively.
Selected advances in programmable silica photonic chips. (a) Waveguides and phase shifters can be fabricated with the same femtosecond laser-writing system: the waveguides are first written into the chip, and the phase shifters are then formed by ablating a thin metal film deposited on the chip's surface. (b) Isolation trenches reduce heat leakage from a phase shifter to adjacent waveguides, cutting thermal crosstalk on the chip. (c) A three-dimensional optical lattice of coupled straight waveguides can be reconfigured by combining multiple phase shifters.
Silicon-on-insulator (SOI) and lithium-niobate-on-insulator (LNOI) platforms. (a) SOI offers a high density of integrated photonic elements, even for programmable circuits. (b) LNOI supports photonic components from passive to active. (c) Ultra-fast, energy-efficient optical switches exploit lithium niobate's strong nonlinearity.
Looking further ahead, some scientists envision the final form of photonic computing as a screenless crystal slab, with holographic projections in the air for data input and output. But that goal is decades away; for now it remains a distant vision.
We are, however, already seeing photonic computing applied in edge computing and data centers. With edge computing, 5G-connected IoT devices in a retail store can process and store some of the data they generate locally rather than shipping all the raw data to a distant data center, taking advantage of photonic computing's low latency and low transmission loss.
In artificial intelligence, computing architectures based on artificial neural networks mimic the way the human brain works, stepping outside the von Neumann framework. There are many kinds of neural networks, such as feedforward, recurrent, and convolutional networks, which differ in how their neurons are arranged and in how closely they resemble the brain. Crucially, photonic circuits are well suited to implementing artificial neural networks because of their advantages in matrix multiplication and interconnection. Machine-learning tasks on photonic chips, including image classification and vowel recognition, have demonstrated considerable accuracy and computational speed; a sketch of the underlying trick follows the figure below.
Applications of integrated photonic platforms in artificial intelligence. (a) An integrated Kerr microcomb serves as an accelerator for convolutional optical neural networks. (b) An all-optical neural network on an integrated photonic circuit performs vowel recognition. (c) A deep neural network implemented on a photonic chip classifies images with high accuracy.
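A common trick in photonic neural-network demonstrations is to map an arbitrary weight matrix onto MZI hardware via the singular value decomposition: any matrix W factors as W = U S V†, where U and V† are unitary (implementable as MZI meshes) and S is diagonal (implementable with per-channel attenuators or amplifiers). The numpy sketch below, with illustrative values of my own choosing rather than code from any cited work, shows the idea.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 4))   # a layer's weight matrix (illustrative)
x = rng.normal(size=4)        # input activations, encoded as optical amplitudes

# Factor the weights: W = U @ diag(s) @ Vh, with U, Vh unitary.
U, s, Vh = np.linalg.svd(W)

# Photonic evaluation: MZI mesh (Vh), then diagonal gains (s), then MZI mesh (U).
y_photonic = U @ (np.diag(s) @ (Vh @ x))

# It matches the ordinary electronic matrix-vector product exactly.
assert np.allclose(y_photonic, W @ x)

# A nonlinearity (e.g. electro-optic, or measured and reapplied) would then
# complete the neural-network layer: activation(y_photonic).
```

Because the matrix-vector product is the dominant cost in neural-network inference, doing it in a single optical pass is exactly where photonics pays off.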
With the information age and the era of artificial intelligence upon us, mature photonics and advancing quantum technologies have opened a new chapter for light-based computing and brought photonic computing into the race.
Finally, despite the technical difficulties, a variety of integrated photonic computing schemes have been proposed and applied to solve NP-hard problems and to perform machine-learning tasks. For the long term, large-scale reconfigurable photonic circuits are essential, yet the attenuation of optical signals in large-scale computations can significantly distort the results. To address this limitation, single-photon detectors and quantum detection techniques may greatly enlarge the scale of photonic computation that can be handled in practice; the sketch below shows why loss matters.
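A quick loss calculation makes the point. Assuming an illustrative 0.2 dB of insertion loss per on-chip component (a plausible but made-up figure, not from the text), the surviving fraction of input power falls off exponentially with circuit depth:

```python
# Why attenuation limits circuit scale: cascaded losses compound in dB.
loss_per_component_db = 0.2   # assumed per-component insertion loss

for n in (10, 100, 500):
    total_db = loss_per_component_db * n
    fraction = 10 ** (-total_db / 10)   # remaining fraction of input power
    print(f"{n:>3} components: {total_db:5.1f} dB loss, "
          f"{fraction:.1e} of input power remains")
```

At a few hundred components the signal is down to a vanishing fraction of the input, which is exactly the regime where single-photon detectors become attractive.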
These design techniques and their future developments will help further improve the scalability of integrated photonic platforms and pave the way for large-scale photonic computing.