OPINION: Your views from across the built environment

SPOTTING THE BUG IN THE BMS


A building’s control systems are crucial to its safe and secure operation. But, asks David Fisk, do we worry enough about the computer viruses that could put these systems at risk?


A chance remark from Chris Hankin, who runs Imperial College London’s Institute for Security Science, set my head spinning. Just how well could our systems perform if they were subject to a cyber attack? The answer might be: not at all.

The story begins with a now infamous computer virus, Stuxnet. Most computer viruses steal passwords and personal data. But could one be written to infect the software of process controllers? It would not be easy: Stuxnet has 15,000 lines of code, and there seems to be no criminal advantage that would justify the effort. But Stuxnet stopped the Iranian nuclear enrichment programme for a while. The word is out that infecting process controllers can be done, and cyber-terrorists might want to do it.

Not every facilities manager is running a building that is under a high security threat. But, where they are, owners have often spent large sums of money on finding safe locations, installing blast-proof glass, providing security access, buying four days’ worth of fuel for standby power, and so on. The same attention is not always lavished on the humble building services system that heats, cools and lights all this investment.

Yet, ‘upstream’, people are taking the cyber threat to the smart grid seriously. The US will be spending several billion dollars a year on protecting it from cyber attack, and ASHRAE is looking at standards. But hold on. Isn’t the smart grid supposed to be linked to the smart meter, and the smart meter to the building management system (BMS)? And, if you are ambitious, isn’t the BMS linked to the enterprise’s platform, and that probably to the ‘Cloud’? Has not the testing group KEMA just criticised the all-singing, all-dancing UK smart meter specification for not addressing security threats?


We have had some clues to this threat. To name just one, in 2003 a bug in communications software allowed a single, unexceptional line fault to black out the whole north-east of the US for four days.

The first BMSs were hard-wired monitoring systems, giving oversight of a largely manual process. Digital data communication then opened up vast possibilities, but initially in a system safely quarantined from the enterprise itself. But the attractions of upgradeable control software, and of access to BMS data within the enterprise network, are clear, and no one wants to give those advantages up, if they can help it, for a threat that might never happen. So the risk level has been creeping up while IT security people have been worrying about other things.


The threat from cyberspace for building services is in danger of being neglected


Agreed, you can now insert firewalls and the like into these systems (see www3.imperial.ac.uk/lorsystemscentre/workingpapers), but IT is a compliance culture, and the protection will not be included unless it is asked for. Indeed, it is just as unwise to presume that IT systems will always be protected as it is for a householder to suppose that a determined burglar can be kept out.


System providers will do their very best – Stuxnet can be patched, for example – but the real services engineering question is: what would happen if the system protection was breached and the digital control was compromised? What is needed is some ‘fall-back’ engineering, on a simple principle: everyone takes the lift in a 20-storey building, but they are glad the back stairs are there when they can’t. So the simple question is: can the building’s services be run ‘hands on’ if the BMS has, for some reason or other, decided to take on a life of its own? Those neat, naturally ventilated, naturally lit, B-rated buildings look the least at risk, while the energy-guzzling, G-rated ones appear to be the most exposed.

Normal practice would be to program a device to go to ‘fail safe’ and shut down if it detects that something is awry. That is no great help if the system is a server farm or a hospital theatre. Sometimes, at the cost of a few pounds, a simple extra manual valve would be enough to get things back on stream – worth checking out if you are (justifiably) paranoid in a world where the Foreign Office is convinced that cyber warfare is here to stay.
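To make that fall-back principle a little more concrete, here is a minimal sketch, in Python, of the decision it implies. It is purely illustrative: every name, class and rule below is invented for the example rather than drawn from any real BMS product or standard.

    # Illustrative sketch only: names and thresholds are invented for this
    # example, not taken from any real BMS product or standard.
    from dataclasses import dataclass

    @dataclass
    class Plant:
        name: str
        critical: bool             # e.g. server farm or hospital theatre
        has_manual_fallback: bool  # e.g. a spare manual valve or local switch

    def on_suspected_compromise(plant: Plant) -> str:
        """Decide what to do when the digital control can no longer be trusted."""
        if not plant.critical:
            # Normal practice: fail safe, shut down and raise an alarm.
            return f"{plant.name}: fail safe - shut down and alarm"
        if plant.has_manual_fallback:
            # Critical load: isolate the BMS outputs and run hands-on instead.
            return f"{plant.name}: isolate BMS outputs, run hands-on via manual fallback"
        # Worst case: a critical load with no back stairs to take.
        return f"{plant.name}: no fallback available - escalate immediately"

    if __name__ == "__main__":
        for plant in (
            Plant("office AHU", critical=False, has_manual_fallback=True),
            Plant("theatre chiller", critical=True, has_manual_fallback=True),
            Plant("server-farm cooling", critical=True, has_manual_fallback=False),
        ):
            print(on_suspected_compromise(plant))

The only point of the sketch is that the ‘critical load, manual fallback available’ branch has to be designed and installed long before the day the digital control misbehaves.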


Professor David Fisk is director of the Laing O’Rourke Centre for Systems Engineering and Innovation at Imperial College London



