• • • DATA CENTRE MANAGEMENT • • •


Designing data centres for the new AI era


As AI continues to transform industries, data centres must evolve to meet the unique demands of AI workloads. Nick Ewing, Managing Director at EfficiencyIT, shares insights on how data centre companies can design and build facilities to accommodate these disruptive technologies.


How are the infrastructure requirements for AI workloads different from traditional data centre workloads?


AI workloads significantly impact data centre design. GPU-powered servers require more power and far more precise cooling than traditional servers, necessitating high-power distribution to racks and advanced cooling solutions such as direct-to-chip or precision immersion liquid cooling. The entire data centre environment needs to be designed and optimised to support these high-density systems effectively, which requires operators either to build entirely new facilities or to invest significantly in modernisation.


What networking innovations are required to support the evolving demands of AI workloads?

High-performance computing and AI systems often use InfiniBand and Ethernet-based technologies to interconnect GPU-powered systems. New networking and data transmission technologies are needed to distribute AI workloads efficiently across multiple environments, enabling scalable and flexible AI deployments across multiple geographies.


How does the rise of AI and edge computing impact data centre location decisions?

Training large language models (LLMs) requires vast computational resources, and in today's market these workloads are typically hosted in colocation and hyperscale data centres. In contrast, inference, the real-time operation of AI models, can be performed closer to end-users, driving the need for localised or edge compute. This has led to demand for high-density, modular data centres that can be deployed as prefabricated or containerised systems with liquid cooling. This brings computing power closer to users and, when coupled with high-bandwidth connectivity, can help to reduce latency significantly.


What sustainability challenges arise with new AI-focused data centres, and how can these be addressed moving forward?


AI workloads are already associated with an increase in energy consumption and carbon emissions, with some organisations reporting up to a 50 per cent rise in energy usage. To address these challenges, data centres should be designed using digital twins and computational fluid dynamics (CFD) software, allowing facilities to be modelled under different conditions to optimise energy efficiency. Additionally, careful consideration of cooling architectures, power provision and the type of backup power systems is essential to minimise environmental impact.


How can legacy infrastructures be integrated successfully and sustainably when building new data centres?


Integrating AI workloads into legacy data centres requires significant modernisation to handle increased power and cooling demands. While many of the considerations for new builds apply, retrofitting existing facilities can be as costly as constructing new ones. Assessing the feasibility of upgrading versus building anew is therefore crucial to ensure sustainability and energy efficiency without incurring excessive cost.


What advice can you give to data centre organisations looking to build new facilities? What hurdles do they need to be aware of?

Organisations should design data centres around the specific application requirements, determining the type of AI workload and the necessary GPU power and cooling infrastructure. Future-proofing is also essential to accommodate technological advancements such as transitioning to next-generation GPUs. Considerations should include sustainable design practices, renewable energy sources, optimal location, component selection (and thereby embodied carbon) and effective cooling methodologies to ensure efficiency and sustainability.


By carefully planning and considering both current and future needs, organisations can build data centres that are efficient, scalable, environmentally responsible and primed for the next generation of AI technologies.

