

Meta is building a data centre the size of Manhattan. How cool is that?

Meta is spending hundreds of billions of dollars on a string of multi-gigawatt AI data centres, writes Kevin Roof, director of offer & capture management at LiquidStack.


That's not hyperbole. In July, Mark Zuckerberg said that Meta's Hyperion data centre would "be able to scale up to 5GW" over several years, and that "multiple more" of these titan clusters would be built.


"Just one of these covers a significant part of the footprint of Manhattan," the Facebook founder said. To prove his point, he posted a gif that illustrated precisely how one of these clusters would blanket the island, an area of land that is home to 1.66 million people. But if running a city like Manhattan is a challenge, keeping it cool is even harder. Manhattan is a classic example of the urban heat island effect, the phenomenon where cities are several degrees hotter than surrounding areas thanks to the city's built environment.


And that's without running millions of GPUs and associated equipment. We know that data centre operators already face a massive challenge when it comes to managing the heat their facilities produce, and calming the public's concern over their potential environmental impact. So, how should Meta approach the cooling challenge for these truly titanic data centres? Here are some thoughts based on our experience of cooling at scale.


Calculating the options

Current cutting-edge data centre designs stretch into the hundreds of kilowatts per rack, and we can safely assume that Meta is laying its plans with NVIDIA's roadmap in mind, and that roadmap envisions 1MW racks.
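To put those densities in context, here is a rough back-of-envelope sketch of how many racks a 5GW campus implies at current versus roadmap densities. The IT-load fraction and the 130kW figure for today's high-density racks are illustrative assumptions, not figures from the article:

```python
# Back-of-envelope: how many racks does a multi-gigawatt campus imply?
# All inputs are illustrative assumptions, not Meta's or NVIDIA's figures.

CAMPUS_POWER_W = 5e9   # Hyperion's stated 5GW ceiling
IT_FRACTION = 0.8      # assumed share of power reaching the IT load (PUE ~1.25)

for rack_kw in (130, 1_000):  # today's high-density racks vs. roadmap 1MW racks
    racks = CAMPUS_POWER_W * IT_FRACTION / (rack_kw * 1e3)
    print(f"{rack_kw:>5} kW racks -> ~{racks:,.0f} racks")

# -> ~30,769 racks at today's densities, ~4,000 racks at 1MW each:
#    either way, tens of thousands of megawatts of heat to reject.
```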


Traditional evaporative cooling is considered a non-starter for data centres of this scale, except for peripheral activities. Just the amount of water needed would be astronomical: tens of millions of Olympic-sized swimming pools.


So liquid cooling is the most practical option for the majority of cooling in these next-generation data centres.


But what type of liquid cooling? Immersion does have an edge when it comes to pure cooling potential, but it is complex to deploy at this scale, and direct-to-chip is the most likely option here. Indeed, NVIDIA's own reference designs lean towards direct-to-chip liquid cooling for its cutting-edge systems.
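For a sense of what direct-to-chip means in practice, the governing relation is the steady-state heat balance Q = ṁ·cp·ΔT. A minimal sketch, assuming a 1MW rack, water as the coolant, and a 10K temperature rise across the cold plates; all numbers are illustrative assumptions, not Meta's or NVIDIA's figures:

```python
# Coolant flow needed to carry away one rack's heat via cold plates,
# from the steady-state heat balance Q = m_dot * c_p * dT.

RACK_HEAT_W = 1_000_000   # 1MW rack, the roadmap scenario (assumption)
CP_WATER = 4186.0         # J/(kg*K), specific heat of water
DELTA_T = 10.0            # K, assumed supply/return temperature rise
DENSITY = 997.0           # kg/m^3, water near room temperature

mass_flow = RACK_HEAT_W / (CP_WATER * DELTA_T)   # kg/s
volume_flow_lps = mass_flow / DENSITY * 1000     # litres per second

print(f"~{mass_flow:.1f} kg/s, i.e. ~{volume_flow_lps:.1f} L/s per rack")
# -> roughly 24 L/s of water per 1MW rack; multiplied across thousands
#    of racks, the facility-level hydraulics become the real problem.
```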


Opting for direct-to-chip liquid cooling is the easy part though. Meta's engineers then face the challenge of implementing it at titanic scale. Taking the right approach from

