In this respect, LiquidCool is similar to most of the other cooling solutions, in that neither power supplies nor hard drives are directly cooled. According to Rich Whitmore
of Motivair: ‘We design our product to remove 100 per cent of the heat from the rack.’ The new £97 million Cray supercomputer being installed at the UK Met Office will use Motivair’s ChilledDoor product to remove the heat, which can reach up to 45kW per server rack. Whitmore pointed out that:
‘As efficient as on-chip cooling can be, you are left with a not insignificant amount of heat that gets rejected to the space and then needs to be cooled by traditional air conditioning. When you have these large clusters coming out now hitting 45kW per rack, if you take 30 per cent or more of that heat and put it into the room, then it is a significant challenge to air conditioners. When we put the ChilledDoor on, we are removing 100 per cent of the heat. We create this “heat neutral” environment, so that heat never leaves the server rack.’ The rack cooling system is an active, rear-door heat exchanger that is mounted directly at the back of a standard server rack. It takes the heat out of the air and is capable of removing up to 75kW per rack, using cool but not chilled water. The technology solves two
problems that, Whitmore believes, tend to be underestimated in the HPC community. The first is that the doors are ‘rack agnostic’, and the second is that they are scalable. If a university buys a cluster and opts for on-chip cooling then, when that cluster is refreshed, the cooling system leaves with it and the university faces another cooling dilemma, he pointed out. Since Motivair’s door is part of the rack system, ‘if they refresh and go with a different brand, the same cooling system can manage different server
manufacturers. It’s not just rack agnostic, it’s OEM computer agnostic. That often gets overlooked, and people are going to find this out if they go to on-chip cooling,’ he said. ‘The scale that is coming
down the line is really quite remarkable. We see some of these chips that are coming out – some of these manufacturers can provide chips where you change the chip out and add 30 per cent to the compute power, just by changing the chip. But it creates a significant challenge to the facility – and unlimited possibilities to us – on the cooling side, because somebody has to remove that heat. We see scale as a real challenge, which is why the products we are working
on are designed to scale with the clients. We see great opportunity – the heat is incredible.’ Motivair’s technology does use
inlet water that is colder than the others, but Whitmore points out that it still saves energy compared to normal facility air conditioning, which uses 7°C water: ‘We’re up at 15 to 17°C. In the UK, the difference between standard air conditioning plant and our cool water is 30 per cent in energy efficiency.’ Where free cooling is feasible, it is 70 per cent more efficient than a standard air conditioning system, he said. The temperature rise across the door is such that the outlet heat can be reused within the building. He concluded: ‘What we drive
home to the market is that you must be prepared for scale and you must be able to remove 100 per cent of the heat. If you can’t do that, you are not really solving the problem of cooling.’
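To put the figures quoted above in perspective, the short Python sketch below is a purely illustrative back-of-envelope calculation, not Motivair’s own methodology. It uses the numbers cited in the article (45kW per rack, roughly 30 per cent of heat rejected to room air with on-chip cooling, 100 per cent capture claimed for the rear door, 75kW door capacity); the 20-rack cluster size is an assumed example.

# Illustrative back-of-envelope calculation using the figures quoted in the article.
# The 20-rack cluster size is a hypothetical assumption for the example.
RACK_LOAD_KW = 45.0          # quoted per-rack heat load for dense clusters
ON_CHIP_AIR_FRACTION = 0.30  # ~30 per cent of heat still rejected to room air (Whitmore)
DOOR_CAPACITY_KW = 75.0      # stated per-rack capacity of the rear-door heat exchanger

def heat_to_room_kw(racks, load_kw, air_fraction):
    """Heat (kW) that the room air conditioning must still absorb."""
    return racks * load_kw * air_fraction

racks = 20  # assumed cluster size, for illustration only
print(f"On-chip cooling:   {heat_to_room_kw(racks, RACK_LOAD_KW, ON_CHIP_AIR_FRACTION):.0f} kW into the room")
print(f"Rear-door cooling: {heat_to_room_kw(racks, RACK_LOAD_KW, 0.0):.0f} kW into the room (100% captured at the rack)")
assert RACK_LOAD_KW <= DOOR_CAPACITY_KW  # per-rack load sits within the door's stated capacity

On these assumptions, an on-chip-cooled 20-rack cluster would still reject roughly 270kW to the room air, which is the kind of load Whitmore argues conventional air conditioning struggles to handle.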