

CASE STUDY


NEW DATA CENTRE FOR MMU: James Woodward (L) says the new data centre has contributed 20% of total carbon savings at MMU. Pictured with his colleague Jeff Hall (R) and John Thompson of APT (centre)

ABOVE: Hot aisle containment and InRow cooling enable a high density, …


Manchester Metropolitan University has introduced a new, high-density data centre as part of a dual strategy to improve the reliability of IT services and reduce environmental impact


Situated close to the city centre, Manchester Metropolitan University (MMU) is the largest campus-based undergraduate university in the UK, with a total student population of more than 37,000.

Sustainability is an important aspect of the University’s operations. In 2007–08, MMU’s carbon footprint from gas, electricity and business travel was 24,797 tonnes, which cost £4.6m. By implementing a series of dramatic changes, the University is on target to reduce its carbon footprint to 15,600 tonnes, saving £3.8m annually.


The University has recently completed the second phase of a new primary data centre. As a large consumer of energy, it developed a strategy to reduce the operating cost of providing IT services, which ranged from implementing power management software for staff PCs to consolidating communications rooms, server rooms and data centre facilities.

“As the greenest university in UK league tables, it’s important that everybody contributes to the sustainability agenda at MMU, and the data centre is an obvious opportunity,” says James Woodward, IT Client Services Manager at MMU. “Our consolidation strategy was aimed at improving the efficiency and availability of the data centre, as well as increasing capacity utilisation over the lifecycle of the new facility.”


Prior to commencing the consolidation strategy, IT services were provisioned through myriad server rooms and small data centres spread around the campus. Ensuring adequate power and cooling for the IT equipment had proven to be a challenge, as there were issues with ‘dirty’ mains.

Investment for the new data centre was provided by Salix Finance Ltd, but to access the funding, Woodward and his team first had to audit the energy consumption of the ad hoc server rooms and associated equipment. Actual energy use was compared with that forecast after consolidation, making a strong case for the project, with a return on investment well within the required five-year period.
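Business cases of this kind typically rest on a simple payback calculation: one-off capital cost divided by the annual energy saving the audit predicts. A minimal, hypothetical sketch follows; the figures are illustrative assumptions, as the article does not publish MMU’s actual numbers:

```python
# Hypothetical simple-payback sketch for an energy-efficiency business case.
# All figures are illustrative assumptions, not MMU's actual numbers.

capital_cost = 250_000.0        # assumed one-off project cost (GBP)
energy_cost_before = 120_000.0  # assumed audited annual energy cost of the old server rooms (GBP)
energy_cost_after = 55_000.0    # assumed forecast annual energy cost after consolidation (GBP)

annual_saving = energy_cost_before - energy_cost_after
payback_years = capital_cost / annual_saving

print(f"Annual saving: £{annual_saving:,.0f}")
print(f"Simple payback: {payback_years:.1f} years")
print("Within 5-year funding requirement" if payback_years <= 5
      else "Exceeds 5-year requirement")
```

With these assumed inputs the payback comes out at roughly 3.8 years, comfortably inside a five-year funding window.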


It was also decided to locate the new data centre within an existing space on the campus, which saved the cost of a new building and reduced the emissions associated with demolition and removal of old buildings.


Built by APT, an Elite Partner to Schneider Electric, the new 120kW N+1 scalable, modular primary data centre uses APC InfraStruxure with a Hot Aisle Containment System (HACS), together with StruxureWare for Data Centers software. It provides a minimum of one hour’s fire and water resistance and includes a raised floor for services. The HACS increases the efficiency and effectiveness of the cooling solution and enables higher-density IT to be accommodated.


A Symmetra PX UPS provides 15 minutes’ autonomy for the IT equipment in the event of an outage. Physical security is also important: the room is protected by CCTV monitoring and swipe-card access, which can be extended to individual racks.
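For context, the battery energy needed for a given autonomy follows directly from the supported load. A rough, illustrative calculation, assuming the UPS rides through the full 120kW design load (the article does not state the actual protected load or battery configuration):

```python
# Back-of-envelope UPS autonomy estimate.
# Assumptions (not from the article): the full 120 kW design load is on the
# UPS, and the battery-to-load path is ~95% efficient.

load_kw = 120.0       # assumed protected IT load (kW)
autonomy_min = 15.0   # required ride-through time (minutes)
efficiency = 0.95     # assumed inverter/battery efficiency

energy_needed_kwh = load_kw * (autonomy_min / 60.0) / efficiency
print(f"Usable battery energy required: {energy_needed_kwh:.1f} kWh")
# -> roughly 32 kWh of usable battery capacity for 15 minutes at full load
```

In practice the protected load would be lower than the design maximum, and a modular UPS such as the Symmetra PX lets battery capacity scale with it.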


“In operation, the use of StruxureWare for Data Centers software enables us to plan the way racks are utilised and ensure that we have capacity for new service deployments,” says Woodward. “We can also monitor energy consumption, which is helping to ensure the ongoing efficiency and resilience of the data centre.”

Woodward adds: “Our new primary data centre has had a significant impact on MMU’s carbon footprint, reducing our overall emissions by 4% and taking a big stride towards our target of a 25% reduction. We’re also seeing annual savings in energy costs exceeding 30%, at the same time as gaining better control over our data centre capacity utilisation.” ET
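Those two figures fit together with the caption’s claim that the data centre accounts for 20% of MMU’s total carbon savings. A quick illustrative cross-check of the arithmetic:

```python
# Cross-check of the reported carbon figures (illustrative arithmetic only).
baseline_tonnes = 24_797      # 2007-08 footprint, from the article
dc_share_of_savings = 0.20    # caption: data centre = 20% of total savings
dc_overall_reduction = 0.04   # article: data centre cut overall emissions by 4%

# If the data centre's 4% is 20% of the savings achieved so far, total
# savings to date are about 4% / 20% = 20% of the baseline footprint.
total_reduction_so_far = dc_overall_reduction / dc_share_of_savings
print(f"Implied total reduction to date: {total_reduction_so_far:.0%} "
      f"of {baseline_tonnes:,} tonnes")
print(f"That is roughly {baseline_tonnes * total_reduction_so_far:,.0f} tonnes, "
      f"against the 25% target")
```

On those numbers MMU’s total savings to date come to about 20% of the baseline, around 4,959 tonnes, consistent with the stated progress towards the 25% target.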

