Interview


Everybody today talks about risk, but do we really understand it? And what level of risk is acceptable? To find out, Jon Severn discussed risk management with James Catmur, a director and the global head of the risk practice at Arthur D Little, and co-author of System Safety: HAZOP and Software HAZOP.






How do you manage risk when you do not understand it?


Given that it is unfeasible to make anything 100 per cent safe, residual risks posed by products and systems must be acceptably low. But what is an acceptable level of risk, and how do you manage the risks? James Catmur says that the first step in risk management is to recognise that you do not understand risk (Fig. 1). “Almost nobody understands risk, as it is a fairly difficult concept. We are also all poor at estimating risk, both day-to-day and in our work. I have worked in the field of risk assessment and risk management for over 20 years and I know I still get risk estimation wrong, both in day-to-day life and professionally. The only advantage I have is that I often spot myself doing it and can tell myself why I did it.”

There are several reasons why people do not understand risk and either over- or underestimate risk levels, as Catmur explains: “Risk judgement is often based on ‘gut feeling’, so any formal risk assessment process usually ends up justifying the gut feeling rather than being totally objective. Also, the ‘example rule’ says that when people estimate risk they will overestimate the risk of events that they have witnessed and underestimate risks they have not witnessed (so engineers in a company with a good safety record will tend to underestimate risk, while experienced risk experts will err the other way). There is also the ‘anchoring’ effect: for example, if you think about something totally unrelated that involves low numbers (giving a low ‘anchor point’), then when you try to estimate something you will subconsciously pitch low.”

Other reasons behind poor estimates of risk include the ‘rule of typical things’, which leads to misjudgements of the likelihood of events that people think are more plausible, and the ‘good-bad rule’, which says people rank good things as low risk but bad things as high risk. Catmur also warns us to watch out for ‘black swans’: “If you were to plot the distribution for the colour of swans you would find a Gaussian distribution of the different shades of white, and you would conclude that black swans would never occur – yet we know that black swans do exist. In the world of risk management, black swans are those events that nobody could have predicted, though when they do occur they can be catastrophic.”

Another point worth highlighting is the confusion that exists between hazards and risks. “People often use the term ‘risk’ when they talk about hazards (such as moving machinery),” says Catmur. “When you create more operating modes or features that have additional hazards associated with them, you may need to add control measures to reduce the existing risks. As a result, the overall risk can be reduced and the situation made safer, even though the total number of hazards has increased” (Fig. 2).
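Catmur's point that a design can gain hazards yet become safer overall can be sketched with a toy model. This is purely illustrative and not a method from the interview: it assumes each hazard is scored as likelihood × severity and that the scores are summed, a common simplification of quantitative risk assessment. All hazard names and numbers below are invented for the example.

```python
# Toy illustration of Catmur's point: adding a hazard can coincide with
# *lower* overall risk, provided control measures cut the likelihoods.
# Risk is scored here as likelihood x severity per hazard, then summed.
# This is a hypothetical sketch, not a method described in the interview.

def total_risk(hazards):
    """Sum of likelihood * severity over all (name, likelihood, severity) tuples."""
    return sum(likelihood * severity for _, likelihood, severity in hazards)

# Original design: two hazards, no extra controls (invented figures).
before = [
    # (hazard, likelihood per year, severity score 1-10)
    ("moving machinery", 0.10, 8),
    ("electric shock",   0.05, 9),
]

# A new operating mode introduces a third hazard, but the guarding and
# interlocks added at the same time reduce the existing likelihoods.
after = [
    ("moving machinery", 0.02, 8),   # guarded
    ("electric shock",   0.01, 9),   # interlocked
    ("hot surfaces",     0.04, 3),   # new hazard from the new mode
]

print(f"hazards: {len(before)} -> {len(after)}")
print(f"total risk: {total_risk(before):.2f} -> {total_risk(after):.2f}")
```

With these invented numbers the hazard count rises from two to three while the summed risk falls from 1.25 to 0.37, which is exactly the distinction Catmur draws between counting hazards and measuring risk.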


Where to start


Fig. 1. James Catmur says that the first step in risk management is to recognise that you do not understand risk.


6 www.engineerlive.com


Leaving aside the semantics, people's inability to understand risk leads Catmur to caution against starting a project by estimating the risks: “If people go straight into risk estimation they often get it wrong, which has implications for the design of the product or system. It is better to just focus on the outcomes and bear the following questions in mind as you work on the design: What could go wrong? What are the possible consequences? Are you building in enough barriers for the higher-consequence hazards? Will those barriers last throughout the life of the product/system? Will those barriers be effective however the product/system is used or abused?

“Only after addressing those points should you ask whether the risk is being controlled.”

Returning to the question of how to establish the acceptable level of risk, Catmur states: “You almost certainly cannot define a totally clear acceptable

