Optoelectronics
Exploring the intersection of photonics and artificial intelligence
By Prof. Callum Littlejohns, deputy director at the CORNERSTONE Photonics Innovation Centre
The rapid pace of our digital age demands ever-increasing speed. Improvements in data rates and computational power are not merely digital advancements – they directly enhance the efficiency of real-world operations, impacting everything from supply chains to scientific research. At the forefront of both technological advancement and the demand for computing power is artificial intelligence (AI). As AI moves forward, its scale, and therefore its potential impact – from large-scale data centres to edge computing devices – is intrinsically linked to hardware performance, primarily data rates and computational power.
However, in conventional electronic circuits, the speed of data transfer is fundamentally limited. Faster data transfer requires more power to overcome the resistance of the wires, leading to greater energy loss and reduced efficiency. These issues not only increase operational costs but also pose sustainability challenges in a world increasingly dependent on data-driven technologies.
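To give a rough sense of this scaling, the short sketch below applies the standard first-order dynamic-switching relation, energy per bit ≈ α·C·V², to an electrical link. The capacitance, voltage swing, and activity factor are illustrative assumptions chosen only to show how power grows linearly with data rate; they are not measurements of any particular interconnect.

```python
# Back-of-envelope sketch of why pushing bits faster over copper costs power.
# Every value below is an illustrative assumption, not a measurement of any
# particular interconnect; the point is the scaling behaviour only.

C_WIRE = 2e-12     # assumed effective trace capacitance per transition (farads)
V_SWING = 0.8      # assumed signalling voltage swing (volts)
ACTIVITY = 0.5     # assumed fraction of bits that toggle the line

def energy_per_bit() -> float:
    """First-order dynamic switching energy per bit: E = alpha * C * V^2."""
    return ACTIVITY * C_WIRE * V_SWING ** 2

def link_power(data_rate_bps: float) -> float:
    """Power (watts) dissipated charging the trace at a given data rate.
    Grows linearly with data rate and quadratically with voltage swing."""
    return energy_per_bit() * data_rate_bps

if __name__ == "__main__":
    for rate in (10e9, 100e9, 400e9):   # 10, 100 and 400 Gb/s links
        print(f"{rate / 1e9:>5.0f} Gb/s: {energy_per_bit() * 1e12:.2f} pJ/bit, "
              f"{link_power(rate) * 1e3:.0f} mW spent charging the trace")
```

Under these assumptions, doubling the data rate doubles the charging power, and that figure excludes the driver and receiver circuitry a real electrical link also needs.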
This is a growing, but not new, challenge. Telecommunications, an industry always pushing for faster data speeds and higher bandwidth, has long embraced light-based technology such as fibre optics and, more recently, silicon photonics (SiPh) to sidestep the energy inefficiency of electron-based systems. This is something that the CORNERSTONE Photonics Innovation Centre (C-PIC), the UK's Innovation and Knowledge Centre (IKC) for SiPh innovation led by the University of Southampton, is all too aware of. In this article, we explore the growing challenges facing AI and the potential of SiPh to address them.
Figure 1: A selection of photonic wafers within the University of Southampton’s cleanrooms (Source: University of Southampton).
The challenges of AI: performance, power, and cost
As AI models grow in size and complexity, power consumption escalates – a cost that is difficult to sustain in both economic and environmental terms.
The International Energy Agency (IEA) forecasts that global data centre electricity consumption, estimated at 460 terawatt-hours (TWh) in 2022, could surpass 1,000 TWh by 2026 – roughly equivalent to Japan's total electricity use.1 To highlight the impact of AI and similar workloads, typical data centres consume around 5 to 10 megawatts (MW), but large hyperscale facilities supporting centralised AI models and training usually require 100 MW or more.2 Furthermore, the clustering of many hyperscale data centres produces significant localised energy demands, such as in Ireland, where data centres use over 20 per cent of the country's electricity.3
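To put these forecasts into perspective, the short sketch below reproduces the arithmetic using the figures quoted above; the assumption of continuous full-power operation is mine, purely for illustration.

```python
# Rough arithmetic on the data centre figures quoted above. Assumes, purely
# for illustration, that a hyperscale facility draws its full rated power
# continuously throughout the year.

HOURS_PER_YEAR = 8760

def annual_twh(power_mw: float) -> float:
    """Annual energy (TWh) for a site drawing `power_mw` megawatts continuously."""
    return power_mw * HOURS_PER_YEAR / 1e6   # MW x h = MWh; 1e6 MWh = 1 TWh

if __name__ == "__main__":
    hyperscale_twh = annual_twh(100)       # a 100 MW hyperscale facility
    forecast_growth_twh = 1000 - 460       # IEA 2022 estimate vs 2026 forecast
    print(f"One 100 MW facility: ~{hyperscale_twh:.2f} TWh per year")
    print(f"Forecast growth of {forecast_growth_twh} TWh is equivalent to roughly "
          f"{forecast_growth_twh / hyperscale_twh:.0f} such facilities at full load")
```

On this simplified basis, a single 100 MW facility draws under 1 TWh per year, so the projected growth of around 540 TWh corresponds to several hundred such facilities running at full load; real utilisation is lower, but the order of magnitude illustrates the scale of the challenge.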
Nevertheless, managing these growing power demands is not the only challenge associated with AI. While relatively immature, today’s market-ready AI is already having a profound impact on our increasingly digital society. Therefore, it is critical to increase the computing power and speed of hardware not only in the cloud but also within edge devices to further extend the reach and efficacy of AI solutions.
Take, for example, AI's growing role in healthcare. The development of advanced diagnostic systems capable of analysing medical images for patient diagnosis is already underway. In remote or underserved areas, these types of AI models could be deployed to provide real-time, highly accurate diagnoses where expert radiologists are scarce, enabling timely treatment that can be lifesaving. Similarly, in the realm of edge computing, AI-powered devices are already impacting critical infrastructure. For example, smart city technologies now use edge AI to monitor environmental conditions, manage traffic flow, and enhance public safety by enabling immediate responses to dynamic urban challenges. Further developments in AI hardware can help to reduce latency and enable more sophisticated edge services, enhancing the efficiency of urban operations still further.
Ultimately, to sustainably scale AI’s impact, we must efficiently manage its escalating energy consumption, boost computing speed and power both in the cloud and at the edge, and simultaneously drive down hardware costs – a trio of challenges critical to ensuring economic and environmental viability.
How silicon photonics can potentially drive AI innovation
SiPh is emerging as a promising technology for AI applications. By leveraging light instead of electrons for data transmission, SiPh offers the possibility of dramatically higher speeds and significantly lower energy losses, without negatively impacting hardware costs. These benefits are crucial for both hyperscale data centres supporting large AI models and edge devices needing real-time analytics while minimising power and cost.
Three key areas are already emerging where SiPh shows strong potential for advancing AI performance: co-packaged optics (CPO), optical switches, and sensing technology tailored for edge applications. CPO directly integrates photonic components with switch ASICs (application-specific integrated circuits) or other high-performance processors, replacing traditional copper interconnects for chip-to-