Data centres: Jim Hearnden, IEEE member and enterprise technologist at Dell


Europe is leading the green charge and, while eco-consciousness is certainly a part of that, it is mainly a result of the fact that utility prices have risen over the years. The drive to economise has become a necessity rather than an ideal. Energy prices are also now much higher in the United States, and our American colleagues are embracing the trend and looking to EMEA for a steer.

Another defining factor is that hardware is now running at a higher temperature. Traditionally, IT hardware would run at around 19°C, but x86 hardware has stretched that by quite a margin and, even by generic standards, data centre operators are looking at 25-27°C. That said, there are a vast number of people out there with hardware that doesn't run at those temperatures; they're running at 21°C and that's costing a considerable amount of money. There are companies, such as Dell, that have hardware built to the telecoms NEBS (Network Equipment-Building System) standard, which means it can run at 35°C all the time. This is a significant change, and various influencing factors, some legislative and some behavioural, also seem to be pushing the market in this direction. In northern Europe, this will mean many data centres will be able to run without any cooling infrastructure; they simply need to move enough cool air around the facility. This ability to run without chillers enables organisations to save not only on capital expenditure for the infrastructure, but on operating expense as well. I do suspect that data centres in the US will follow this example, as not only does it make financial sense, it also reduces the major issue of downtime as a result of cooling failure.

Power is another determining factor for data centres. Power and cooling are very strongly interrelated, because every watt of power put into a server comes back out as heat that the cooling infrastructure has to remove. Most companies are looking very closely at their power budgets and are painfully aware of how much power is needed and what it will take to provide adequate cooling. From that point of view, the power race has been replaced by a focus on performance per watt, as efficiency has become the main concern. The situation is very similar to what has occurred in the automotive industry: manufacturers previously aimed towards making the most powerful car, but now the focus is on developing the most economical, in keeping with market demands. The same is true of data centres.
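To make the performance-per-watt point concrete, here is a minimal sketch of how two servers might be compared on work delivered per watt, and on the cooling load and cost their power draw implies. Every figure in it (throughput, power draw, cooling overhead, electricity price, and the server names) is an invented assumption for illustration, not data from this article.

```python
# Hypothetical illustration of "performance per watt" and the power/cooling link.
# Every watt drawn by IT equipment ends up as heat that the facility must remove,
# so cutting IT power cuts cooling load (and cost) at the same time.

ELECTRICITY_PRICE = 0.15   # assumed price per kWh, in arbitrary currency units
COOLING_OVERHEAD = 0.5     # assumed: 0.5 W of cooling power per W of IT power
HOURS_PER_YEAR = 24 * 365

servers = {
    # name: (throughput in arbitrary work units per second, average power draw in watts)
    "older server": (1_000, 400),
    "newer server": (1_400, 300),
}

for name, (throughput, watts) in servers.items():
    perf_per_watt = throughput / watts
    # Total facility draw = IT power plus the cooling power needed to remove that heat.
    facility_watts = watts * (1 + COOLING_OVERHEAD)
    annual_kwh = facility_watts * HOURS_PER_YEAR / 1_000
    annual_cost = annual_kwh * ELECTRICITY_PRICE
    print(f"{name}: {perf_per_watt:.1f} units/W, "
          f"~{annual_kwh:,.0f} kWh/year, ~{annual_cost:,.0f} per year")
```

On these made-up numbers the newer machine does more work per watt and, because every watt avoided is also a watt that no longer needs cooling, the saving shows up twice in the annual figure.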


If you can come up with some sort of hot and cold separation, there are some very quick wins in the data centre, such as blanking panels, and all of them offer massive leaps in efficiency for a relatively small cost. These solutions have yet to be implemented on a wide scale because, while many people are aware of them, they aren't aware of their true implications in terms of the improvement to efficiency. The gains are viewed as marginal at best and, with mounting pressure simply to get the job done, these considerations are very often pushed to the side.

The sheer pressure of keeping the kit running should not be underestimated, and sufficient resource planning is absolutely vital. Maintaining good operational and data centre practices can offer a way of making steady improvements by ensuring that it's a continuous process; for example, every time a piece of hardware is taken out of a rack, a blanking panel should be put in.

Of course, another issue is that operators can become entrenched in a 'small data centre' attitude and will continue to do things in a certain way simply because it's how they have always been done. Essentially, they believe that if everything worked when there were one or two racks, it should keep working when there are more. The best advice is to take a step back and take advantage of all the available information. But be warned: there is a considerable amount of what I like to call 'green wash' out there. Things sound convincing, but when you get down to the nuts and bolts of the suggestions, they aren't valid by any stretch of the imagination, and some of the advice would have very limited payback. The best way to differentiate between the good and the bad is to talk with industry peers and find out what's working for them. Reducing energy costs is fundamental to everyone, and ISPs (internet service providers) have this down to a fine art. Most organisations are proud of what they have been able to achieve in terms of increasing the efficiency of their facilities and will be more than willing to discuss it.


Andy Watson, VP of Life Sciences EMEA at IntraLinks


There has been a real mood shift within pharmaceutical companies away from the view that all information should be stored and managed internally, to the realisation that software-as-a-service (SaaS) is a good solution. Companies don't want to sink lots of capital into servers and hardware, or spend time running validation routines, and so are turning to reputable vendors whom they can audit and rely upon to do all of that for them. The outsourcing approach can not only reduce costs, but also enables companies to be secure in the knowledge that their systems are moving forward in a validated environment. There are some organisations that may be concerned about a lack of control, but the current change of perception has led to the word 'vendor' quite often being replaced with the word 'partner'.

The main obstacle for organisations is that the distinction between a public cloud and SaaS, or a private cloud, is quite often misunderstood. The key is that we're not talking about something like a proprietary solution where movies and photos are hosted; we're talking about somewhere that data is safe and secure on an ongoing basis. We host in two locations in the US and two in Europe to offer our customers peace of mind in terms of where their data is residing, and we use SunGard to host the servers that we manage and maintain. It's important that we offer hosting in Europe, as many European organisations want the reassurance of knowing that their data is being retained here. We also have a robust process for encrypting the data, and we undergo penetration tests on a regular basis to maintain security.





