WHY HEAT IS NOW AN ELECTRICAL ENGINEERING RESPONSIBILITY IN AI DATA CENTRES
BY MICHAEL POTO, PRODUCT MANAGER, VERTIV
The rapid growth of AI infrastructure is often framed as a challenge of compute scale and electrical capacity. Grid connections, transformer sizing and power delivery architectures dominate early-stage discussions. Yet in practice, many of the constraints now shaping AI data centre performance emerge downstream, as electrical energy is converted into heat and must be removed safely and efficiently. This shift has blurred the traditional boundary between electrical and mechanical domains. Heat is no longer simply a by-product managed by cooling systems; it is an outcome of electrical behaviour that feeds back into system stability, efficiency and resilience. As a result, thermal management is increasingly an electrical engineering responsibility, requiring a system-level view of the entire thermal chain.
From electrical load to thermal implications
At its core, every watt consumed by IT equipment becomes heat. In conventional enterprise settings, that relationship is relatively predictable. Electrical loads are steady, utilisation varies slowly and cooling systems can be designed around static assumptions with comfortable margins. AI workloads disrupt this balance. Training and inference tasks introduce rapid, high-amplitude swings in power draw, often across heterogeneous hardware estates. These electrical transients translate immediately into thermal stress at the component and rack level. If cooling response lags behind electrical demand, temperature excursions follow.
From an engineering perspective, the challenge is not peak load alone, but dynamic behaviour. Electrical systems are designed to tolerate transients, but thermal systems traditionally are not. Bridging that gap requires closer alignment between power delivery, thermal capacity and control.
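That lag can be illustrated with a simple lumped thermal model. The short Python sketch below is purely illustrative: the rack thermal capacitance, cooling response time constant and load step are assumed figures, not measurements from any particular system, but the shape of the result is the point, namely a temperature excursion whose size scales with how far the cooling response lags the electrical step.

# Minimal lumped-capacitance sketch: a rack's temperature excursion when
# cooling capacity ramps more slowly than the electrical load steps.
# All numbers are illustrative assumptions, not measurements.

DT = 0.1              # simulation time step, seconds
C_THERMAL = 200_000   # assumed rack thermal capacitance, J/K
TAU_COOLING = 30.0    # assumed cooling response time constant, seconds

t = 0.0
temp_rise = 0.0       # temperature rise above the pre-step level, K
cooling_kw = 20.0     # cooling currently being delivered, kW
peak_rise = 0.0

while t < 300.0:
    # Electrical load steps from 20 kW to 60 kW at t = 60 s (an AI job starts).
    load_kw = 20.0 if t < 60.0 else 60.0

    # Cooling capacity chases the load with a first-order lag.
    cooling_kw += (load_kw - cooling_kw) * DT / TAU_COOLING

    # Heat the lagging cooling fails to remove accumulates in the rack's thermal mass.
    # (The sketch omits the conduction path that would later pull the temperature back down.)
    temp_rise += (load_kw - cooling_kw) * 1_000 * DT / C_THERMAL
    peak_rise = max(peak_rise, temp_rise)
    t += DT

print(f"Peak temperature excursion above the pre-step level: {peak_rise:.1f} K")

Halving the cooling time constant in the sketch halves the excursion, which is essentially the argument for coupling thermal control more tightly to the electrical load signal.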
Power density and the limits of air cooling
Rising rack densities expose the limitations of conventional air-based cooling from both an electrical and a thermal standpoint. As power density increases, the airflow needed to carry heat away grows with the load, while the fan power required to move that air rises far faster, roughly with the cube of the flow rate, increasing fan energy consumption and complicating airflow management. For electrical efficiency, this creates a compounding effect: more electrical power is consumed by cooling fans and auxiliary systems, reducing overall facility efficiency and increasing heat generation further upstream. Liquid-based cooling approaches address this challenge by removing heat much closer to the electrical source, the processors themselves.
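The scaling behind that compounding effect can be shown with the standard sensible-heat relation and the fan affinity laws. In the Python sketch below, the air properties are textbook values, while the 12 K supply-to-exhaust temperature rise and the 20 kW and 40 kW rack loads are assumptions chosen only to illustrate the trend.

# Back-of-envelope sketch: air needed to remove rack heat at a fixed
# temperature rise, and how fan power scales when that heat doubles.
# The temperature rise and rack loads are illustrative assumptions.

CP_AIR = 1005.0      # specific heat of air, J/(kg·K)
RHO_AIR = 1.2        # air density, kg/m^3
DELTA_T = 12.0       # assumed supply-to-exhaust temperature rise, K

def airflow_m3_per_s(heat_kw: float) -> float:
    """Volumetric airflow needed to carry heat_kw at a DELTA_T rise."""
    mass_flow = heat_kw * 1_000 / (CP_AIR * DELTA_T)   # kg/s
    return mass_flow / RHO_AIR

base = airflow_m3_per_s(20.0)     # 20 kW rack
doubled = airflow_m3_per_s(40.0)  # 40 kW rack

# Fan affinity laws: for the same fan and system curve, power scales
# roughly with the cube of flow.
power_ratio = (doubled / base) ** 3

print(f"20 kW rack: ~{base:.2f} m^3/s of air")
print(f"40 kW rack: ~{doubled:.2f} m^3/s of air")
print(f"Approximate fan power ratio: {power_ratio:.0f}x")

Doubling the heat removed by air at a fixed temperature rise doubles the required airflow, and pushing twice the air through the same containment costs roughly eight times the fan power, which is precisely the overhead liquid cooling is intended to avoid.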
Direct-to-chip liquid cooling intercepts thermal energy right at the processor level, which significantly reduces reliance on high-volume airflow through the data centre and lowers the electrical overhead associated with powering fans. Recent expansions in the EMEA region have introduced new coolant distribution unit (CDU) models with capacities including 70 kW, 121 kW, 600 kW and even up to 2300 kW (2.3 MW). These are available in both in-rack and in-row configurations, supporting liquid-to-air and liquid-to-liquid cooling loops. This variety enables flexible deployments, whether for retrofitting existing facilities or building new greenfield data centres.
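A rough feel for what those CDU capacities imply on the liquid side comes from the same sensible-heat relation applied to the coolant loop. The Python sketch below assumes a 10 K coolant temperature rise and 100 kW of liquid-captured heat per rack purely for illustration; real flow rates and rack counts depend on the specific CDU, coolant and facility design.

# Rough sizing sketch for a liquid cooling loop: coolant flow needed for a
# given CDU capacity, and how many racks that capacity could notionally serve.
# Coolant temperature rise and rack density are assumptions for illustration only.

CP_WATER = 4186.0    # specific heat of water, J/(kg·K)
DELTA_T = 10.0       # assumed coolant temperature rise across the cold plates, K
RACK_KW = 100.0      # assumed heat captured by liquid per AI rack, kW

def coolant_flow_l_per_min(capacity_kw: float) -> float:
    """Water-equivalent flow required to absorb capacity_kw at DELTA_T."""
    mass_flow = capacity_kw * 1_000 / (CP_WATER * DELTA_T)   # kg/s
    return mass_flow * 60.0                                  # ~1 kg of water per litre

for capacity_kw in (70, 121, 600, 2300):
    flow = coolant_flow_l_per_min(capacity_kw)
    racks = capacity_kw / RACK_KW
    print(f"{capacity_kw:>5} kW CDU: ~{flow:,.0f} L/min, ~{racks:.0f} racks at {RACK_KW:.0f} kW each")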
There’s also an emerging trend of backing up CDUs with uninterruptible power supply (UPS) systems to provide consistent cooling availability and maintain operational continuity during power disruptions. For AI accelerators operating near thermal and electrical limits, this tighter coupling between electrical input and thermal removal supports more stable operation under fluctuating load conditions.
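One way to frame the UPS question is as a ride-through budget: enough stored energy to keep coolant moving until upstream power returns or generators pick up the load. The figures in the Python sketch below (pump power, ride-through window and margin) are assumptions for illustration, not ratings for any particular CDU.

# Back-of-envelope ride-through sketch for a UPS-backed CDU.
# Pump power, ride-through time and margin are illustrative assumptions.

PUMP_KW = 15.0            # assumed electrical draw of CDU pumps and controls, kW
RIDE_THROUGH_S = 120.0    # assumed time to generator start and stabilisation, s
MARGIN = 1.25             # allowance for battery ageing and conversion losses

energy_kwh = PUMP_KW * (RIDE_THROUGH_S / 3600.0) * MARGIN
print(f"UPS energy to keep coolant moving for {RIDE_THROUGH_S:.0f} s: "
      f"~{energy_kwh:.2f} kWh (on top of whatever the IT load itself requires)")

# With the pumps running, rack heat keeps flowing into the loop's thermal
# mass during the outage instead of accumulating at the chips.

Compared with the IT load itself, the energy involved is small, which helps explain the emerging trend described above: for a modest electrical cost, the cooling loop stays available through the same disruptions the UPS already covers for the IT equipment.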