• • • DATA CENTRES • • •


ENGINEERING THE INVISIBLE: HOW ENTERPRISE IT IS PREPARING FOR AI


The increasing adoption of graphics processing units (GPUs) is reshaping the architecture of enterprise data centres.

By Giuseppe Leto, Senior Director IT Systems EMEA, Vertiv


This paradigm shift, though progressive, is gaining significant momentum as new artificial intelligence (AI) initiatives are routinely unveiled by medium and large-scale organisations.

Whether we are talking about digital twins, high-performance computing (HPC) or AI, these applications are in continuous, rapid evolution, and the conversation often centres on models and platforms. While these technologies are undoubtedly impressive and progressive, their performance depends on the underlying physical infrastructure. The efficiency, cost of power and location of data processing are critical factors that can determine how quickly an AI deployment delivers a return on investment (ROI) and brings value across the enterprise.


At the same time, AI models often need to act in real time, creating latency and bandwidth demands that the traditional cloud model struggles to meet. Running inference close to the point of data generation reduces delay and improves consistency. This is where edge computing infrastructure becomes relevant, offering more control over how data is handled, which helps with compliance and security.
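To put rough numbers on that latency argument, the short Python sketch below compares a round trip to a distant cloud region with a nearby edge node against a fixed response budget. Every figure in it is an illustrative assumption, not a measurement from this article.

```python
# Illustrative latency-budget sketch (all figures are assumptions).
# It compares a remote cloud round trip with a nearby edge node for a
# real-time inference loop that must respond within a fixed budget.

LATENCY_BUDGET_MS = 50.0      # assumed end-to-end budget for a "real-time" response
INFERENCE_MS = 20.0           # assumed model inference time on either platform

SCENARIOS = {
    "remote cloud region": 80.0,  # assumed network round-trip time, ms
    "nearby edge node": 2.0,      # assumed network round-trip time, ms
}

for name, rtt_ms in SCENARIOS.items():
    total_ms = rtt_ms + INFERENCE_MS
    verdict = "within budget" if total_ms <= LATENCY_BUDGET_MS else "misses budget"
    print(f"{name}: {rtt_ms:.0f} ms network + {INFERENCE_MS:.0f} ms inference "
          f"= {total_ms:.0f} ms -> {verdict} ({LATENCY_BUDGET_MS:.0f} ms target)")
```

Under these assumed numbers the remote round trip alone exceeds the response budget, while the edge deployment leaves comfortable headroom, which is the practical case for moving inference closer to where data is generated.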


Edge is becoming a practical choice


According to IDC, worldwide spending on edge computing is forecast to reach $378 billion in 2028, driven by demand for real-time analytics, automation and enhanced customer experiences. Placing compute infrastructure closer to users and devices can remove some of the bottlenecks that hold back AI adoption. This does not necessarily require a full redesign of enterprise architecture, but rather a retrofit of the existing one. Many organisations are introducing capacity at the edge in targeted ways to support specific services or use cases.


The applications vary. In some cases, it’s all about faster data analytics in a branch network. In others, it’s a large language model (LLM) deployed to train, fine-tune and run inference for generative AI applications. In still others, it’s scientific research in fields like physics, chemistry or life sciences driving breakthroughs in molecular dynamics and genome sequencing. The common thread is the need for infrastructure that can deliver performance without relying on long-distance data movement.




This shift changes how systems are planned. Infrastructure teams must now think about power, cooling and connectivity in more granular ways. These areas are becoming deeply interdependent, and weaknesses in one can affect the rest of the system.
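As a rough illustration of why power and cooling planning are now so tightly coupled, the Python sketch below estimates the electrical load of a dense GPU rack and the matching heat that has to be removed. The accelerator count, per-device power and overhead figures are assumptions chosen purely for illustration.

```python
# Back-of-envelope rack power and heat estimate (assumed figures, illustration only).
# Essentially all electrical power drawn by IT equipment is released as heat,
# so the cooling plant must be sized against the same number as the power feed.

GPUS_PER_RACK = 72            # assumed accelerator count in a dense AI rack
GPU_POWER_KW = 1.0            # assumed power draw per accelerator, in kW
HOST_OVERHEAD_KW = 15.0       # assumed CPUs, memory, networking and fans per rack

it_load_kw = GPUS_PER_RACK * GPU_POWER_KW + HOST_OVERHEAD_KW
heat_load_kw = it_load_kw     # ~all electrical input ends up as heat to remove

print(f"Estimated IT load per rack:  {it_load_kw:.0f} kW")
print(f"Heat to remove per rack:     {heat_load_kw:.0f} kW")
print(f"Per 10-rack row:             {10 * it_load_kw / 1000:.2f} MW")
```

Every kilowatt delivered to the rack reappears as a kilowatt of heat, which is why the power and cooling figures in a deployment plan have to be developed together rather than in isolation.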


AI workloads reshape the critical digital infrastructure

Modern AI hardware consumes more power and generates more heat than previous generations. AI models demand massive parallel computation from specialised hardware such as GPUs and tensor processing units (TPUs) to process vast datasets. These processors, by design, draw a great deal of electricity and, as a direct consequence, produce a large amount of heat that must be actively removed by advanced cooling systems in the data centre. This puts pressure on legacy systems and highlights the need for accurate load planning. The shift from traditional central processing units (CPUs) to power-intensive GPUs has caused rack densities to surge, with some reaching 100+ kW per rack - a dramatic increase from the typical

