HPC cooling using up to 75°F water.' Addressing concerns that exist surrounding the introduction of water into an electronic device, a Motivair Leak Prevention System (LPS) is standard equipment. On detecting water under a Chilled Door, the Motivair LPS simultaneously switches the fans to full speed, isolates the water supply to the door, and raises the alarm both locally and remotely. The adjoining Chilled Doors automatically increase cooling capacity to manage the additional load, while the Chilled Door in LPS alarm continues to cool the rack with maximum flow of room-temperature air.
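The leak-response sequence described above can be sketched in a few lines of code. This is a hypothetical illustration of the logic, not Motivair's actual firmware; the class and attribute names are invented for the example.

```python
# Illustrative sketch of an LPS-style leak response: on detection, the affected
# door goes to full fan speed, shuts its water valve and raises an alarm, while
# neighbouring doors step up to absorb the extra heat load.

class ChilledDoor:
    def __init__(self, name):
        self.name = name
        self.fan_speed = 0.5           # fraction of full speed
        self.water_valve_open = True
        self.in_lps_alarm = False

    def trigger_lps(self, neighbours):
        """Water detected under this door: run all responses at once."""
        self.fan_speed = 1.0           # fans to full speed
        self.water_valve_open = False  # isolate the water supply to this door
        self.in_lps_alarm = True       # local/remote alarm (stubbed as a flag)
        # Adjoining doors automatically increase capacity to cover the load.
        for door in neighbours:
            door.fan_speed = min(1.0, door.fan_speed + 0.25)

doors = [ChilledDoor(f"door-{i}") for i in range(3)]
doors[1].trigger_lps(neighbours=[doors[0], doors[2]])
```

Note that the door in alarm keeps moving room-temperature air at maximum flow, so the rack is never left uncooled.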

Chilled Door access panel for fans and controls

the total computing power.' Best said that the technology lowers the power consumption of the server and the power usage effectiveness (PUE) of the system, and provides compelling value starting at 6 kW per rack.

Cluster specialist ClusterVision is working with Green Revolution Cooling to design and engineer customised solutions using the full-contact oil-cooling technology. ClusterVision's Alex Ninaber explained that the combination of the fluid's high thermal conductivity and its intimate contact with the server components produces very effective heat dissipation. The coolant returns to the rack unit at a reduced temperature, lowering the overall temperature of the rack and delivering consistent, uniform cooling to the servers. 'While this is not necessarily a solution for all of our customers' current cooling challenges, we are already seeing that this new technology can deliver both highly effective cooling performance and energy-efficiency gains in excess of 90 per cent over standard air-alone methods,' said Ninaber.

Taking an alternative approach is Motivair,

whose Motivair Chilled Door solution is an active rear-door cooler comprising a chilled-water cooling coil; multiple, individually fused EC fans; a motorised water valve; and a PLC that controls air and water flow, with remote communication via LON, BACnet or Modbus. Graham Whitmore, president of Motivair, explained: 'The multiple, hot-swappable fans provide total airflow redundancy. The Chilled Door can match the maximum server airflow and automatically modulate both air and water flow to precisely match changing server loads. Cooling capacity is currently up to 45 kW per standard 600 mm x 42U rack.'
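The modulation behaviour Whitmore describes — air and water flow tracking the server load up to the door's rating — amounts to a simple proportional control law. The sketch below is a guess at the shape of that logic, not the PLC program itself; the function and constant names are invented.

```python
# Illustrative proportional modulation: fan speed and valve position scale with
# the measured heat load, saturating at the door's rated capacity.

RATED_CAPACITY_KW = 45.0  # per standard 600 mm x 42U rack, per the article

def modulate(server_load_kw):
    """Return (fan_fraction, valve_fraction) for a given heat load in kW."""
    demand = min(max(server_load_kw, 0.0) / RATED_CAPACITY_KW, 1.0)
    fan_fraction = demand    # EC fans track the airflow demand
    valve_fraction = demand  # motorised valve tracks the water-flow demand
    return fan_fraction, valve_fraction

print(modulate(22.5))  # half load -> (0.5, 0.5)
```

A real PLC loop would add hysteresis and minimum-flow limits, but the principle — both flows rising and falling together with the load — is the one described.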

Hot or cold? The temperature of the liquid is an important factor, as Asetek's Stephen Empedocles explained: 'The difference between warm and hot water cooling is not so much in the server cooling efficiency – in both cases they can capture 100 per cent of the heat. The difference is the ability to recover and re-use the waste heat. The hotter the water, the more energy can be extracted.' Asetek's direct-to-chip solution uses liquid running with an input temperature of 105°F and an output of 140°F. An integrated micro-pump cold plate transfers heat directly from the microprocessors into the liquid, and because the unit is the same size as the heat sink it replaces, it is a simple drop-in installation. There are also thin liquid channels that fit between each of the memory cards in the server. The hot water goes into the rack-cooling distribution unit, which fits on the back of a server rack and contains a series of liquid-to-liquid heat exchangers. A secondary loop picks up the heat and removes it from the building, where it can be cooled by the external ambient temperature. Also taking the direct

contact liquid-cooling approach is CoolIT. Geoff Lyon, CEO/CTO, commented that the company's solution differs from those using chilled water and large heat exchangers, whether in a rear-door or in-row configuration. Cold plates and pumps are used, with liquid heat dissipation applied directly to the CPU, GPU, RAM and other components. Lyon added that one current design challenge is making such solutions compatible with an architecture that has evolved in an air-cooling realm.
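Empedocles' point above — that hotter water means more recoverable energy — follows from the sensible-heat relation Q = ṁ·c·ΔT. The calculation below uses Asetek's quoted 105°F inlet and 140°F outlet; the flow rate is an assumed figure for illustration only, not a vendor specification.

```python
# Back-of-envelope heat-recovery calculation: the energy carried by the loop
# grows linearly with the temperature rise across the servers.

SPECIFIC_HEAT_WATER = 4186.0  # J/(kg*K)

def recoverable_power_watts(flow_kg_per_s, inlet_f, outlet_f):
    """Q = m_dot * c * delta_T, with the Fahrenheit difference converted to kelvin."""
    delta_t_kelvin = (outlet_f - inlet_f) / 1.8
    return flow_kg_per_s * SPECIFIC_HEAT_WATER * delta_t_kelvin

# Article's loop temperatures (105 F in, 140 F out) at an assumed 0.1 kg/s flow.
q = recoverable_power_watts(0.1, 105.0, 140.0)
print(round(q))  # -> 8139, i.e. roughly 8.1 kW of recoverable heat
```

Doubling the temperature rise at the same flow doubles the recoverable power, which is why the hotter return water is the more valuable waste-heat stream.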

The fact that the industry has previously relied so heavily on air cooling presents something of an adjustment for those considering liquid, as highlighted by Paul Wright, president of LiquidCool Solutions: 'There are concerns around the introduction of water into any electronic device. A water-based system can have hundreds if not thousands of potential failure points; all it takes is one small leak to create a significant problem.' LiquidCool Solutions (LCS) employs directed, or 'intelligent', flow submersion cooling of virtually any electronic device, not just servers,


using an eco-friendly, non-volatile, carbon-based dielectric oil. 'In a worst-case scenario, if we were to have a leak, you would have a small spill to wipe up. We don't need to introduce water into the device or facility,' he added. Wright also commented that facility footprint can be reduced by 50 per cent or more, and the energy needed to operate and cool IT equipment by 40 per cent or more. Offering further reassurance with its solution
is Iceotope, whose coolant is not a hydrocarbon – in fact, it is closer to a fire extinguisher than anything else, and is actually used in fire-suppressant systems. This primary coolant is in contact with every component that generates heat. As an ultra-convective material, it passively regulates its own flow rate, without the need for pumps.

Despite all the benefits outlined in this article, liquid cooling still has a lot of ground to cover in terms of adoption. However, as Wright commented: 'Liquid cooling is beginning to gain traction; no-one denies the physics and advantages of liquid over air in removing and handling heat. Even though the industry is averse to change, as most industries are, liquid cooling's day is coming.'

JUNE/JULY 2013 39

