Bathe your computer in baby oil

Tom Wilkie weighs up some of the options for improving energy-efficiency in high-performance computing
One path to better energy-efficiency in high-performance computing (HPC) appears to lead back to the future. In the early 1980s, when Seymour Cray was designing the iconic Cray-2, he tackled the issue of heat generation by submerging the electronics in liquid. Today, 30 years on, several companies are offering modern versions of liquid cooling to try to improve both the compute performance and the energy economics of modern HPC. Cray chose an inert fluorocarbon coolant, 3M's Fluorinert. Today a British company, Iceotope, is offering convective cooling with 3M's environmentally-friendly Novec, while two US companies are cooling computers with a liquid not too dissimilar to baby oil. The idea that future computers may be bathed
in baby oil may seem surprising, but more surprising still is that concerns about electricity consumption appear to be relatively recent. Until the past five years or so, the concern of most supercomputer centres was to squeeze the maximum number of flops out of their machines. Even today, efficiency in HPC is often regarded as an issue of keeping utilisation levels at or above 90 per cent, regardless of the power cost. This opens up a software-based approach to energy efficiency, whereby the scheduling software can shut down nodes that are not doing any processing, or run jobs so as to minimise power consumption, and where application software, too, is written to make the most efficient use of the machine. But in the longer term, the route to energy
efficiency in HPC may lie in switching the underlying chip technology. Nvidia's GPU technology has shown how unconventional chips can find an important place in HPC, setting a precedent for inherently energy-efficient chips, available cheaply as commodity items, such as those used in mobile phones or tablet computers. Thus, instead of HPC being the trailblazer of new compute technology, it may be the beneficiary of technology developed for the mass consumer market.

[Scientific Computing World, p. 30. Figure: In a 42U rack, Supermicro 1026GT-TF servers get the Green Revolution Cooling treatment]
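The software-based approach described above – packing jobs onto as few nodes as possible so that idle nodes can be powered down – can be sketched in a few lines. The scheduler below is a hypothetical illustration, not the API of any real workload manager: node names, job names, and the first-fit policy are all assumptions made for the example.

```python
# Minimal sketch of an energy-aware scheduler: greedily pack jobs onto
# the fewest nodes, then report which nodes are idle and can be shut down.
# All names and the first-fit policy are illustrative assumptions.

def schedule(jobs, nodes, cores_per_node):
    """Place each (job, cores) pair on the first node with spare capacity."""
    load = {n: 0 for n in nodes}
    placement = {}
    for job, cores in jobs:
        for n in nodes:
            if load[n] + cores <= cores_per_node:
                load[n] += cores
                placement[job] = n
                break
    # Nodes with no work assigned are candidates for power-down.
    idle = [n for n in nodes if load[n] == 0]
    return placement, idle

jobs = [("sim-a", 8), ("sim-b", 4), ("sim-c", 4)]
nodes = ["node0", "node1", "node2"]
placement, idle = schedule(jobs, nodes, cores_per_node=16)
print(placement)  # all three jobs fit on node0
print(idle)       # node1 and node2 can be powered down
```

Production schedulers such as Slurm offer comparable power-saving hooks (suspending idle nodes and resuming them on demand), but with far more sophisticated placement policies than this first-fit sketch.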
Refrigeration

Nonetheless, Richard Barrington, business development director of UK-based Iceotope, warns there is a risk that, as chips become more efficient, people will buy more, increase the density of their systems, and end up consuming as much power, if not more. Currently, he said, the industry 'spends vast amounts of money blowing air at the problem'. At Iceotope,
however, ‘We have taken a long hard look at this and done a lot of work using CFD to understand how we could use a liquid more efficiently – “harvesting” the heat rather than disposing of it. We recognise that heat, potentially, has a value. So, rather than see it as a waste product, if we think about it holistically, we can think of heat as an asset to be redeployed.’

A distinctive aspect of the Iceotope approach
is its modular design, he said. ‘We stick the PCB, the motherboard, inside a sealed container and inside this box we have the 3M Novec liquid which hyper-convects the heat.’ One of the problems with the old Cray design was that its heat transfer properties depended on the fluid boiling off. In contrast, Iceotope’s fluid ‘does not phase-change into a gas; we are working in a
www.scientific-computing.com