Equipment updates
Virtually none of the existing infrastructure escaped the refresh. Decisions on IT equipment were left to UEA, but the university was keenly aware that it would have to purchase wisely to support the infrastructure changes. Some things, such as the existing racks, would not change, which meant innovative solutions would be needed when installing aisle containment.
Power
The entire electrical subsystem was overhauled, with new supplies installed to support the change in cooling equipment and to ensure there was sufficient power per square foot for the latest IT equipment. As well as replacing virtually all the existing cabling, new distribution boards were added and all the lighting, both inside the data centre and in the plant facilities, was updated to low-power LEDs. A new monitoring system was deployed that makes it easy for operations to see exactly where power is being used and to quickly identify any potential leakage or power loss.
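The article does not describe the monitoring product or how it detects losses, so the sketch below is purely an illustration of the kind of check such a system enables: comparing an upstream distribution-board reading against the sum of its downstream circuit readings and flagging any shortfall as potential loss. The function name, readings and tolerance are all invented for the example.

```python
# Hypothetical illustration: flag potential power loss by comparing an
# upstream distribution-board reading with the sum of its downstream circuits.

def check_for_loss(upstream_kw: float, downstream_kw: list[float],
                   tolerance_kw: float = 0.5) -> str:
    """Return a short status string; values and threshold are illustrative."""
    accounted = sum(downstream_kw)
    unaccounted = upstream_kw - accounted
    if unaccounted > tolerance_kw:
        return f"ALERT: {unaccounted:.2f} kW unaccounted for (possible leakage or loss)"
    return f"OK: {unaccounted:.2f} kW within tolerance"

# Example readings (invented): one board feeding three rack circuits.
print(check_for_loss(upstream_kw=42.0, downstream_kw=[13.2, 14.1, 13.5]))
```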
Key temperatures
Perhaps the most important part of this project was the change to how cooling is delivered. While there were savings from changing the electrical system, cabling and distribution boards, the cooling would need to handle a greater heat load, be more flexible and use far less power. The target temperatures were set at a 22°C input temperature with a maximum hot air return of 32°C. There were key reasons for these figures: given historic external temperatures around the university, they would ensure that for 60 per cent of the year the university could rely on free air cooling. For the rest of the year some degree of mechanical cooling would be needed, with the amount depending on the temperature outside the data centre and the load on the IT equipment.
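To make the 60 per cent figure concrete, the sketch below estimates a free-cooling fraction from hourly ambient temperatures: any hour cool enough to deliver supply air at or below the 22°C target, allowing an assumed approach margin, counts as a free-cooling hour. The 2°C margin and the sample temperatures are invented for illustration; the article's figure comes from the university's own historic weather data.

```python
# Illustrative estimate of the free-cooling fraction of the year.
# Assumptions (not from the article): a 2 °C approach between outside air
# and supply air, and a made-up list of hourly ambient temperatures.

SUPPLY_TARGET_C = 22.0   # target input temperature from the article
APPROACH_C = 2.0         # assumed margin between ambient and supply air

def free_cooling_fraction(hourly_ambient_c: list[float]) -> float:
    """Fraction of hours where outside air alone can hold the supply target."""
    free_hours = sum(1 for t in hourly_ambient_c if t + APPROACH_C <= SUPPLY_TARGET_C)
    return free_hours / len(hourly_ambient_c)

# Invented sample: a full year of hourly temperature readings would go here.
sample = [8.0, 12.5, 15.0, 19.0, 21.0, 23.5, 26.0, 18.0, 11.0, 9.5]
print(f"Free cooling possible for {free_cooling_fraction(sample):.0%} of sampled hours")
```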
Aisle containment
The first major change was aisle containment. After much consideration, the decision was made to go with hot aisle containment, provided by DataRacks, with the room itself flooded with cold air. However, creating the aisle containment was more complex than expected. The racks in use are Rittal, and there was no off-the-shelf Rittal containment solution that would work.
A further complication came from the IT team’s requirements. They wanted the ability to remove an entire rack whenever they needed to work on it, and the aisle containment had to accommodate this. This meant the containment could not be secured to the physical racks, and it had to be flexible enough for additional elements to be added while racks were away being worked on. DataRacks had to create a custom solution very quickly.
In many data centres, the ‘Manhattan skyline’ of different sized racks is a challenge that DataRacks is used to contending with. According to DataRacks’ managing director Jeremy Hartley, having all the legacy racks in the university’s data centre at one height made a welcome change. The next issue was that all of the Ethernet switches are located at the rear of the racks. Hartley explained: “With the hot aisle now running at 32°C this would lead to possible overheating and reduced reliability of the switches. This is a frequent problem with aisle containment and one that we at DataRacks are used to solving. Fortunately, all the switches in use are cooled from side to side. So in double-quick time we designed and manufactured bespoke air-ducts that feed the cold air from the front of the rack to the input side of the switch.”
Another air-duct then takes the hot exhaust away from the switch and vents it into the hot aisle. As well as providing highly effective switch cooling, this also prevents the risk of hot air being vented into the centre of the rack and causing hot spots that would be impossible to effectively cool. This problem is not unique to UEA but is something that many data centres struggle to resolve easily.
Cooling
The Future Tech cooling solution uses a combination of direct fresh air delivery supplemented by DX (direct expansion) coolers. As the temperature rises, the number of coolers online is increased under software control, keeping power requirements to a minimum while ensuring that cooling stays within the target temperatures. The graduated scale of cooling means that at the full IT load of 220kW, while ambient temperatures are below 13°C, only free air cooling will be used. Between 13°C and 22°C, cooling will be a combination of free air with increasing amounts of DX cooling. Above 23°C, the primary cooling will come from the DX units. This latter stage, which will take the most power, is only expected to occur for around 300 hours each year (just 3.4 per cent of the year).
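The staging described above amounts to a simple control rule: below 13°C, free air only; between 13°C and 22°C, free air topped up with DX; above that, DX-led cooling. The sketch below is a minimal reading of that logic, not Future Tech's actual control software; the band boundaries come from the article, everything else is assumed for illustration.

```python
# Minimal sketch of the graduated cooling stages described in the article.
# The thresholds are from the text; the labels and anything beyond choosing
# a mode are assumptions for illustration only.

def cooling_mode(ambient_c: float) -> str:
    """Select a cooling mode from ambient temperature, per the described bands."""
    if ambient_c < 13.0:
        return "free air only"            # free cooling handles the full 220 kW IT load
    if ambient_c <= 22.0:
        return "free air + DX top-up"     # DX coolers brought online as needed
    return "DX-led cooling"               # expected ~300 hours/year (~3.4% of 8,760 hours)

for temp in (8.0, 18.0, 27.0):
    print(f"{temp:>5.1f} °C -> {cooling_mode(temp)}")
```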