data centres

Chad Harrington, VP of marketing at Adaptive Computing

Within the data centre, everything comes down to cost, be it power consumption, cooling, hardware or space. The overall price of servers – certainly per unit of compute – is declining, yet hardware remains the biggest expense. In time, however, the ancillary costs may take over as the main investment. Power costs, for example, are continuing to rise on a global scale, and much of a data centre's power is consumed by cooling equipment. Data centres produce a lot of heat, and many of them are implementing cost-reduction strategies such as using free air cooling instead of air conditioning. Other operations are running higher voltages, which results in fewer power losses. Power is also lost each time a voltage or type of power is converted, so larger data centres are investigating running DC only instead of AC. There is also a range of other technologies that reduce the number of steps taken when converting power, which results in higher efficiencies.

On the space side, it really comes down to having the flexibility to find a cheaper location. Unfortunately, some scientific centres need to locate their compute power in a specific area and are therefore faced with paying the going rate. Others have more freedom – the US National Security Agency (NSA), for example, is building a $2 billion facility in Utah, which is much cheaper than Virginia or Maryland, where most of its facilities are located.

Historically, high-performance computing (HPC) data centres have had the largest footprint, but they have been outstripped by those operated by internet giants such as Google and Facebook. It is interesting that, as an industry, we are now learning from these companies. Facebook, for instance, is using evaporative cooling in its data centres and locating them in a dry climate, and Google has selected a site in Europe where sea water can be used for cooling. In the future, we believe that HPC data centres will be able to learn a lot from what these companies are doing, and that the largest will closely resemble their operations.

David Power, head of HPC at Boston

The biggest issue our customers within the data centre arena have faced in recent years is the consistent rise in costs – both in terms of energy consumption and the increasing demand for space and power. As energy costs rise, along with heightened concern for the environment and carbon footprints, IT decision makers have a new responsibility to address energy efficiency in the IT infrastructure. Companies have always been focused on business growth, but what if their data centre cannot support expansion because of power and cooling limitations or spiralling energy costs? The double impact of rising data centre energy consumption and rising energy prices has dramatically heightened the importance of data centre efficiency as a means of reducing costs, managing capacity and promoting environmental responsibility.

Data centre energy consumption is largely driven by demand for greater compute power and increased IT centralisation. While this demand was growing, global electricity prices also increased – by a staggering 56 per cent between 2002 and 2006. The financial implications, as you would expect, are significant, with estimates of annual power costs for US data centres ranging as high as $3.3 billion.

So, in terms of energy consumption within the data centre, what accounts for such a large demand for power? A recent study by Emerson Network Power, which looked at energy consumption within typical 5,000 square-foot data centres, found that power-hungry, x86-based servers and storage systems account for nearly 60 per cent of total consumption. A major factor behind the massive appetite of these hardware platforms is the overworked processor. With the latest-generation Intel Xeon E5-2600 series processors averaging a 95W TDP, and the fastest models peaking at 150W, the CPU is a significant pain-point in a server's power consumption, for the simple reason that processor architects continue to add more features and performance to each new design. However, multiple studies, from the likes of Microsoft and McKinsey, have consistently shown that most server CPUs typically run at only around 10 per cent of capacity in daily use.

This means that data centre administrators are currently wasting phenomenal amounts of their limited fiscal and power budgets on processing inefficient transactions. Over the last few years, numerous IT companies have proposed virtualisation as a potential solution to this problem, based on the idea that you can increase efficiency by running more than one application on a single server.

Boston, however, looked to develop a completely new server – one designed to run a single application as efficiently as possible. It uses an alternative processor architecture to x86 as a way of reducing power consumption and heat dissipation. The Viridis platform features the low-power, general-purpose system-on-chip (SoC) used in Calxeda's ARM-based EnergyCard, which delivers an order-of-magnitude improvement in performance per Watt. With each SoC using as little as 5W, a fully populated server within a 2U enclosure consumes 240W, plus the overhead of disks (so roughly 300W for 48 quad-core servers).
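The Viridis power figures quoted above can be checked with a little arithmetic. The sketch below uses only the numbers given in the article (5W per SoC, 48 SoCs per 2U enclosure, ~300W total); treating the remaining ~60W as disk overhead is an inference from the text, not a stated specification.

```python
# Sanity-check of the Viridis power budget described in the article.
SOC_POWER_W = 5     # article figure: each Calxeda SoC draws as little as 5W
SOCS_PER_2U = 48    # article figure: fully populated 2U enclosure
TOTAL_WITH_DISKS_W = 300  # article figure: ~300W including disks

# 48 SoCs at 5W each gives the quoted 240W of compute power.
compute_power_w = SOC_POWER_W * SOCS_PER_2U

# The gap between ~300W total and 240W compute is the implied disk overhead.
disk_overhead_w = TOTAL_WITH_DISKS_W - compute_power_w

# Whole-enclosure budget divided across the 48 "servers".
per_server_w = TOTAL_WITH_DISKS_W / SOCS_PER_2U

print(f"Compute power: {compute_power_w} W")       # 240 W
print(f"Implied disk overhead: {disk_overhead_w} W")  # 60 W
print(f"Budget per quad-core server: {per_server_w:.2f} W")  # 6.25 W
```

At roughly 6W per quad-core node, the contrast with a single 95W-TDP Xeon E5-2600 makes the claimed order-of-magnitude improvement in performance per Watt plausible for workloads that fit the architecture.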
