Feature: Power management
Silicones for data centre cooling save energy, reduce costs and improve performance
By Dr Dachao Li, Director, APAC Technical Service and Development – Consumer Solution Business, Dow
It is now a well-known fact that data centres use a tremendous amount of electricity. A 2025 report by the UK Department for Energy Security and Net Zero states that data centres have power densities an order of magnitude or more higher than standard office buildings: 1-2kW/m², compared with 0.01-0.1kW/m² for traditional commercial properties. With the introduction of power-hungry AI workloads, these energy requirements are now significantly higher still, especially at hyperscale data centres. System designers and materials specialists are therefore creating energy-efficient designs that reduce data centre energy consumption and manage heat dissipation efficiently.
Traditionally, air cooling has been used to dissipate heat in electronics. Fans are cost-effective, but they can be large and bulky, requiring more space in the racks, and they are not sufficient to meet the large-scale cooling requirements of concentrated high-power computing. Air cooling can include chilled air, but this requires a large supply of water, some of which is lost to evaporation, and it also occupies rack space.
Power usage effectiveness
Today, data centre architects and electronics designers require innovative and environmentally sustainable cooling methods, and a meaningful way to compare them. Power usage effectiveness (PUE), a metric that describes a data centre's efficiency, provides a starting point. It is the ratio of a data centre's total energy consumption to the energy consumption of its information technology (IT) equipment. The ideal value of PUE is 1.0, but this is a theoretical value, since not all power that enters a data centre is used by IT equipment: some energy is lost to power distribution and lighting, and some to fans or liquid cooling, a heat transfer method that uses an electrically powered pump to move liquid coolant through a closed-loop system. Liquid cooling includes both direct-to-chip (D2C) and immersion cooling types. The Technology Collaboration Programme on Energy Efficient End-Use Equipment (4E TCP), an International Energy Agency (IEA) consortium, recently reported that air-cooled data centres typically have PUE values of 1.5 to 1.8: 50-80% higher than the ideal PUE of 1.0.
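Because PUE is a simple ratio, it reduces to a one-line calculation. The sketch below shows how the reported figures relate; the energy values are illustrative assumptions, not measured data.

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy divided by
    IT equipment energy. 1.0 is the theoretical ideal."""
    return total_facility_kwh / it_equipment_kwh

# Illustrative annual figures (assumptions, not measured data): a facility
# drawing 16GWh in total, of which 10GWh is consumed by IT equipment.
print(f"PUE = {pue(16_000_000, 10_000_000):.2f}")
# Prints PUE = 1.60, within the 1.5-1.8 range the 4E TCP reports
# for typical air-cooled data centres.
```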
Today, high-performance processors such as Intel's Core i9 can draw as much as 350W under full load, and the thermal design power (TDP) of future CPUs is expected to approach 1000W. KAO Data, a UK data centre provider, notes that air cooling alone is insufficient once a chip's TDP exceeds 250W.
Thermal management materials
Thermally conductive silicones for data centre electronics can help lower PUE by moving heat away from hot components. Silicone is inherently thermally insulating, but the addition of specialised fillers can make it thermally conductive instead. For example, a silicone compound filled with aluminium oxide can achieve significant thermal conductivity, enabling it to transfer substantial amounts of heat – potentially up to 100W – depending on the material's thickness, contact area and temperature gradient, as described by Fourier's Law.
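As a rough sanity check on that figure, Fourier's Law for conduction through a flat pad is Q = kAΔT/d. The sketch below evaluates it for plausible assumed values of an alumina-filled silicone pad; the dimensions and conductivity are illustrative, not data for any particular product.

```python
def conductive_heat_flow(k: float, area_m2: float,
                         thickness_m: float, delta_t_k: float) -> float:
    """Steady-state heat flow (W) through a flat pad, per Fourier's Law:
    Q = k * A * dT / d, with k in W/(m.K)."""
    return k * area_m2 * delta_t_k / thickness_m

# Illustrative assumptions: a 40mm x 40mm, 0.5mm-thick alumina-filled
# silicone pad with k = 3 W/(m.K) and a 10K temperature drop across it.
q = conductive_heat_flow(k=3.0, area_m2=0.040 * 0.040,
                         thickness_m=0.0005, delta_t_k=10.0)
print(f"Heat transferred: {q:.0f} W")  # ~96W, in line with the ~100W figure
```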
As thermal management materials, silicones provide other advantages, too. For example, silicone compounds can protect electronics against high operating temperatures, moisture and dust. In addition, they can withstand thermal