DATA CENTRE COOLING

Keeping it cool


Simon Harris, Director of Critical Infrastructure at global data centre consultancy BCS, looks at the most critical challenge facing the development of AI and global digitalisation.


Simon Harris: "AI is rewriting the rules, demanding extreme power and smarter, greener infrastructure – and when combined with climate change and rising global temperatures, cooling is increasingly a key challenge."


The UK and Europe have set their sights on becoming global leaders in artificial intelligence, with national strategies supported by significant R&D investment across both the public and private sectors. But delivering on that ambition requires more than policy and funding – it depends on infrastructure, and at the heart of that infrastructure are data centres. These facilities are vital for training AI models, managing vast datasets, and keeping digital services available around the clock. They form a critical part of our digital future. However, the data centre infrastructure that supported the last decade of growth is no longer sufficient, as demand for capacity continues to accelerate with no indication of slowing. This was reflected in our latest independent BCS survey, in which 85% of the 3,000 respondents agreed that most data centre facilities are not yet ready for AI-heavy workloads and that the technology is arriving faster than the infrastructure.


Cooling is key


One vital element in getting the infrastructure AI-ready is the need for additional cooling, as AI workloads concentrate high-performance, power-intensive GPUs within dense racks and operate them at greater intensity and for longer durations than traditional IT applications. Legacy sites have usually been designed for rack densities of 5kW to 10kW, whereas an AI-ready design calls for 50kW to 100kW or more. As a result, advanced and often specialised cooling solutions are critical to prevent overheating and safeguard performance.
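To put those densities in context, the short Python sketch below shows how the heat a cooling plant must reject scales when a row of racks moves from a legacy to an AI-ready density. The per-rack figures come from the article; the ten-rack row size and the rule of thumb that virtually all IT power ends up as heat are illustrative assumptions, not BCS data.

# Illustrative heat-load comparison for one row of racks.
# Rack densities follow the article (5-10kW legacy, 50-100kW AI-ready);
# the rack count and the "IT power ~= heat load" rule of thumb are
# assumptions for illustration only.

RACKS_PER_ROW = 10  # hypothetical row size

def row_heat_load_kw(kw_per_rack: float, racks: int = RACKS_PER_ROW) -> float:
    # Nearly all electrical power drawn by IT equipment is rejected as heat,
    # so the cooling plant must remove roughly the same number of kilowatts.
    return kw_per_rack * racks

legacy = row_heat_load_kw(10)      # upper end of a legacy design
ai_ready = row_heat_load_kw(100)   # upper end of an AI-ready design

print(f"Legacy row heat load:   {legacy:.0f} kW")
print(f"AI-ready row heat load: {ai_ready:.0f} kW")
print(f"Cooling capacity must scale by roughly {ai_ready / legacy:.0f}x")

The exact numbers matter less than the order-of-magnitude jump: air systems sized for the first figure cannot simply be asked to absorb the second.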


Liquid cooling – a slow burn

While liquid cooling is often positioned as a key solution for addressing the significant heat output of AI and high-performance computing workloads in data centres, its adoption has been relatively slow. Initial costs are high, as liquid cooling requires substantial capital investment in specialised equipment. There is also a lack of universal standards, with suppliers offering bespoke solutions with limited compatibility, leading to vendor tie-in and increased integration risk, which complicates planning. We also find that operators have deeper familiarity with air-based cooling systems, so there are concerns about leaks and the complexity of maintenance, as well as questions around coolant disposal, material recyclability, and long-term environmental impact. Finally, not all workloads benefit from the high-density cooling capability that liquid systems offer, and for lower-density applications traditional air cooling is often considered sufficient.




This raises questions about whether investing in liquid cooling is the right choice, and whether future market shifts driven by consumer technology demands might affect that decision. Despite modern systems being highly reliable, the presence of liquid near sensitive electronic components still raises concerns about system longevity, downtime and reliability. It is, however, important to focus on the positives. Liquid cooling is significantly more efficient at transferring heat, allowing for higher server densities and reduced energy consumption, lower energy bills and lower operating costs. It is currently the best option for cooling AI and other compute-intensive workloads. It is also a good option in space-constrained environments, delivering more efficient use of rack space and reducing the overall footprint of the data centre.

