Wednesday, August 17, 2022
The photonic computing industry is an ecosystem where dreams of future tech deployment come true.
The dreams of smart cities, smart autonomous mobility, smart personalized medicine, and beyond are truly unlimited. What’s more, despite the huge data processing involved, it can all be done in real time, with predictive, self-improving cybersecurity capabilities, sustainably and at scale. But there are many skeptics of the photonics sector who have been let down before, whether by overblown claims, immature technologies, or a host of other problems that kept bold ideas from ever reaching the market.
With recent discoveries and emerging new technologies, this sector is coming into the light, bringing promise to companies that seek to scale their real-time computing capabilities and reduce the costs associated with AI, both in data centers and at the edge.
What promises does modern photonic computing hold?
Photonic computing is a formidable high-bandwidth technology that promises to replace traditional electronic computing components. The benefits come from rethinking today’s standard copper wires and silicon-based chips, which were designed to carry electrons. Instead, silicon photonic chips, glass-based chips, and fiber-based devices can process light-carried information at far higher capacity.
With photonic computing, wires can be replaced either with fiber optic cables that can process signals before they reach their destination, or with special light-carrying waveguides within photonic chips. For denser functionality, integrated glass-based and silicon-based circuits do further heavy lifting.
Today’s state-of-the-art technology is limited in speed by the huge computational burden inherent to artificial intelligence models, which keep growing. However, placing a super-fast chip within a hybrid system, where some of the computing is done in standard electronics, will by definition create information bottlenecks. These bottlenecks, such as PCIe buses and memory bandwidth limitations, immensely reduce the overall performance benefit of the photonic processor.
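To make the bottleneck concrete, here is a rough back-of-envelope sketch in Python. The bandwidth figures are illustrative assumptions (an 80 Tb/s optical ingest rate and a roughly 256 Gb/s PCIe 4.0 x16 host link), not measurements of any particular product.

# Back-of-envelope: how much of a hypothetical photonic core's input
# bandwidth can a conventional host interface actually feed?
# All figures below are illustrative assumptions, not vendor specifications.

PHOTONIC_INPUT_GBPS = 80_000  # assumed optical ingest: 100 fibers x 800 Gb/s each
PCIE4_X16_GBPS = 256          # approximate PCIe 4.0 x16 throughput (~32 GB/s)

utilization = PCIE4_X16_GBPS / PHOTONIC_INPUT_GBPS
print(f"Host link feeds {utilization:.2%} of the photonic core's capacity")
# -> about 0.32%: the electronic interface, not the photonics, sets the pace.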
Yet it can’t be ignored that silicon chips reduce the space, time, and energy needed to conduct computations. Until now, the benefits of photonic chips, including reduced wear and lower overheating, have had to be balanced against scaling and yield challenges rooted in the nature of photons: their far larger wavelength dictates much larger components (up to 10,000X).
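A rough sanity check on where a figure like "up to 10,000X" can come from, using illustrative numbers (a ~10 µm photonic component such as a ring resonator, against transistor features of roughly 1–10 nm):

$$\frac{\ell_{\text{photonic}}}{\ell_{\text{electronic}}} \sim \frac{10\,\mu\text{m}}{1\text{--}10\,\text{nm}} \approx 10^{3}\text{--}10^{4}$$

per linear dimension, so even a single photonic element can dwarf an entire block of transistors.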
Thanks to revolutionary developments, the atmosphere in photonic computing today resembles the mid-century transition from vacuum tubes to transistors. Just as in the mid-1950s, photonic computing is in its relatively early stages, yet already making waves. Aware of the potential to unlock the technologies of the future, experts are motivated to reach the sweet spot of efficient architecture and mature technology. While the road ahead has difficult barriers to overcome, there is no doubt that a world of computational efficiency and scalability awaits over the horizon.
Why it’s different this time
In the mid-1970s, the first optical fiber links validated light’s clear supremacy over electric wires as an information carrier; optical fibers went on to replace submarine copper communications cables and enabled the accelerated information flow of the global internet.
Since then, a series of remarkable engineering feats by companies such as Cisco, Juniper, NTT, Mellanox, and others has allowed the bandwidth of optical communication to break its own records year after year, reaching a staggering 200 Gbps. If this isn’t enough to win over skeptics, speeds of 400 Gbps and 800 Gbps on a single fiber are just over the horizon, including an experiment by NTT that demonstrated transmission of 1 petabit (1,000,000 gigabits) per second over a single fiber.
However, the basic physics of controlling light interactions, together with light’s far longer wavelength, dictates elements 1,000X to 10,000X larger than their electronic counterparts. Beyond the physical footprint, this raises concerns about mass-production costs and yield. Nevertheless, attempts to invent, design, and construct optical switches, routers, and other high-transmission products have been on the roadmaps of photonic companies and startups for decades.
Bursting into the modern era, artificial intelligence (AI) applications present a unique opportunity for photonic computing, because their outputs are measured in relative probabilities rather than bitwise-exact computations. It does not matter if an AI system deviates 5% in its output as long as it consistently gets the right answer to the question, such as: Is there a face in the picture? A car? A dog? The correct answer is the goal, not the exact probability that led the system to give it.
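A minimal sketch of that tolerance, using made-up class scores: perturbing every score by up to ±5%, as an analog photonic stage might, leaves the winning class unchanged whenever the margin is clear.

import random

# Sketch: why approximate analog computation can be acceptable for AI.
# A classifier's answer is the argmax over class scores; a +/-5% perturbation
# of every score (hypothetical analog noise) rarely flips a clear winner.
scores = {"face": 0.81, "car": 0.12, "dog": 0.07}  # made-up softmax outputs

noisy = {k: v * random.uniform(0.95, 1.05) for k, v in scores.items()}
assert max(scores, key=scores.get) == max(noisy, key=noisy.get)
print(max(noisy, key=noisy.get))  # still "face" despite the injected noise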
This understanding, along with the high motivation to solve the painful limitations of current AI infrastructure, has propelled the young photonic computing industry onto an accelerated path, with large investments, large promises, and hopefully large advancements. Today, several technologies are competing for the holy grail: silicon photonics, photonic integrated circuits, glass-based photonic chips, and fiber-based photonic devices.
Unlocking future capabilities with photonic computing
Just as CPUs evolved to multiple cores, and GPUs evolved to handle thousands of cores, pure photonic computing is the natural next step toward brain-inspired, distributed photonic computing systems.
With incredibly low expected energy consumption, efficiency gains surpassing 100X, and speeds that require us to reimagine what is possible, it is clear that the future lies in pure photonics. The industry should aspire to a solution where data enters the system at the high speeds of optical communication and all calculations are done on the fly, without any conversion to electronics and without any dynamic memory read and write operations. What can we achieve with such a non-von Neumann architecture? Imagine a cable of 100 fibers, each carrying 800 Gbps, entering a photonic pod, streaming photons through a multitude of processors, and exiting as the desired outputs at exactly the same rate as the input: 800 billion full computations per second at an expected energy cost of 10–20 kW.
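Reconstructing that arithmetic as a quick sketch: the 100-bits-per-computation figure below is an assumption introduced here to make the quoted numbers consistent, not something stated in the original.

# Reconstructing the paragraph's throughput and energy arithmetic.
# The bits-per-computation value is an illustrative assumption.
fibers = 100
gbps_per_fiber = 800
total_bps = fibers * gbps_per_fiber * 1e9   # 80 Tb/s aggregate ingest

bits_per_computation = 100                  # assumed input size per inference
computations_per_sec = total_bps / bits_per_computation
print(f"{computations_per_sec:.1e} computations/s")  # -> 8.0e+11 (800 billion)

power_w = 15_000                            # midpoint of the quoted 10-20 kW
print(f"{power_w / computations_per_sec * 1e9:.0f} nJ per computation")  # ~19 nJ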
Today’s trend of reducing carbon footprints collides with ever-growing AI models. Pure photonics seems to be the only large-scale, sustainable solution for reconciling the two.
By: DocMemory Copyright © 2023 CST, Inc. All Rights Reserved