THERMAL MANAGEMENT
Cooling the AI Frontier:
How liquid cooling and modular CDUs are powering the next generation of data centres
The data centre industry is standing at the threshold of a new era, one defined by artificial intelligence, accelerated compute, and the relentless rise of thermal demand.
Air cooling, once the workhorse of the digital economy, is straining against the limits of physics. The surge in AI and High-Performance Computing (HPC) workloads is driving a fundamental rethink of how facilities manage heat, scale infrastructure and ensure sustainability. At the core of this shift lies one critical technology: liquid cooling.
What are the forces driving liquid cooling into the core of AI infrastructure?
AI is not just redefining computational capability; it’s reshaping the physical infrastructure required to sustain it. As GPUs become denser and more powerful, with rack densities exceeding 120 kW and projected to reach 600 kW by 2027, traditional air systems can no longer cope. Every watt consumed by a GPU becomes heat that must be efficiently removed to prevent performance throttling and maintain uptime.
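To make the "every watt becomes heat" point concrete, a minimal sketch of the underlying flow calculation (Q = ṁ·c·ΔT) shows what a 120 kW rack demands of a direct-to-chip loop. The coolant properties and 10 K supply/return temperature rise are illustrative assumptions, not figures from the article; real loops vary by coolant and design.

```python
# Minimal sketch: coolant flow needed to carry away a rack's heat load,
# from Q = m_dot * c_p * delta_T. Assumes a water-like coolant
# (c_p = 4186 J/(kg*K), density 1000 kg/m^3) and an assumed 10 K
# supply/return temperature rise -- both illustrative values.

def coolant_flow_lpm(heat_load_w: float, delta_t_k: float = 10.0,
                     c_p: float = 4186.0, density: float = 1000.0) -> float:
    """Return the required coolant flow in litres per minute."""
    m_dot = heat_load_w / (c_p * delta_t_k)  # mass flow, kg/s
    return m_dot / density * 1000.0 * 60.0   # convert kg/s -> L/min

print(round(coolant_flow_lpm(120_000), 1))  # 120 kW rack -> ~172 L/min
```

Under these assumptions a 120 kW rack needs roughly 170 L/min of coolant, which is why flow, pressure and temperature must be actively managed rather than left to ambient air.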
Simultaneously, sustainability pressures are intensifying. Data centres already consume close to 40 per cent of their energy budget on cooling, and regulators in Europe, North America and Asia are demanding measurable reductions in energy and water use. Liquid cooling provides a dual advantage: it handles extreme heat loads while reducing both power consumption and water dependency, particularly when deployed in closed-loop systems.
Moreover, liquid cooling has evolved from a niche experiment to a strategic enabler. It’s now integral to the AI ecosystem, supporting the shift from model training in hyperscale clusters to real-time inference at the edge, where latency and energy efficiency are paramount. In both environments, operators need cooling systems that are not just powerful but modular, globally deployable and designed for lifecycle efficiency.
FEBRUARY 2026 | ELECTRONICS FOR ENGINEERS
How do high-capacity Coolant Distribution Units (CDUs) support high-density computing environments and enable future scalability?
Coolant Distribution Units (CDUs) are the unsung heroes of liquid cooling infrastructure. Acting as the thermal backbone of direct-to-chip systems, they manage the flow, pressure and temperature of coolant throughout the facility.
High-capacity CDUs are specifically designed to address the realities of AI-scale computing. Offering up to 10 MW of cooling capacity, they provide a modular, pay-as-you-grow platform that allows operators to scale in step with compute expansion. Unlike traditional monolithic builds, which often lead to over-provisioning, modular CDUs let operators deploy just the cooling they need today, then expand seamlessly as rack densities rise. This reduces stranded capital, accelerates deployment timelines and ensures that facilities remain adaptable to future generations of hardware. As Kevin Roof, global sales director at LiquidStack, notes, scalability “isn’t just a design feature anymore – it’s a strategy for survival”.
Beyond scalability, high-capacity CDUs
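The pay-as-you-grow idea reduces to a simple sizing rule: deploy only as many CDU modules as the current heat load requires, and add units as compute expands. A brief sketch, using an assumed 2.5 MW per-unit capacity (an illustrative figure, not one from the article):

```python
import math

# Illustrative "pay-as-you-grow" CDU sizing: how many modular units of a
# given capacity cover the facility's heat load at each growth stage.
# The 2.5 MW per-unit figure is an assumption for illustration only.

def cdus_needed(load_mw: float, unit_capacity_mw: float = 2.5) -> int:
    """Number of CDU modules required for the current load (ceiling division)."""
    return math.ceil(load_mw / unit_capacity_mw)

for load_mw in (2.0, 5.0, 10.0):  # facility load as compute expands
    print(f"{load_mw:>4} MW load -> {cdus_needed(load_mw)} CDU(s)")
```

A monolithic build sized for the 10 MW end state would strand most of that capital during the 2 MW phase; the modular approach defers it until rack densities actually rise.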