Figure 3: Characteristics of human decision-making mistakes. Errors may be deliberate (example reasons: peer pressure, conflicting demands, precedent, being unaware of the consequences) or inadvertent. Inadvertent errors are either accidental (example causes: distraction, fatigue, stress) or systematic, stemming from organisational deficiency (for example a lack of resources, a lack of training or conflicting responsibilities) or from built-in faults (for example an incorrect rule).
Almost anything that is random can be placed within the accidental errors category - for example, a numerical mistake. The factors that might contribute to an accidental mistake include:
● Distraction
● Fatigue
● Stress
● Even just how the person is feeling, or the general mood at the time.
Systematic errors are more pervasive and are often the result of organisational or system deficiencies. Factors that allow systematic errors to creep in might include:
● Complexity and distraction - in this case possibly a result of insufficient staff.
● Lack of training to undertake ever more technically complex or new tasks.
● Organizational constraints - such as confusion over limits of responsibility.
Unfortunately, a person's level of intelligence does not necessarily make them less prone to making mistakes, and some research even suggests that it might make them more prone in some circumstances. To help reduce errors, and to understand them better when they do occur, we need to be more aware of how our minds operate and of the influences, both internal and external, that cause people to make mistakes.
THE CONSEQUENCES OF AN INCIDENT
The most obvious immediate consequences that impact a project following an incident are that there is something to clean up, a problem to solve and an issue to rectify.
● This takes time, incurs cost and uses up resources - not only the correction or repair itself, with its labour and materials, but also the potential loss of revenue as a result of a delay.
● The error may introduce a functional weakness which, even if accepted by the project, might mean that the system operates at a sub-optimum level. This could increase frustration for the user and the owner, increase operational and maintenance costs, and even unintentionally compromise safety.
● There could be a loss of confidence in the project, even if only in the short term, and reputational damage could potentially impact everyone involved.
Disruption will be compounded when the hunt for the party at fault begins. In many, though not all, cases this leads to finger pointing and long battles in which every side employs its own teams of lawyers, solicitors and technical experts within an adversarial process to apportion blame.
AN ADVERSARIAL PROCESS
At its worst, our adversarial process follows the rationalization of finding and denigrating the persons at fault (losers) and sanctifying everyone else (winners). Winners and losers can be determined by the strongest, or possibly in some cases the most expensive, arguer. It has been shown that in a trial, for example, a jury is more convinced by the person who sounds most confident than by the facts themselves. And who is to say that this doesn't affect more than just jurors?
The process to attribute blame can be a major distraction from other work, and it doesn't necessarily achieve unequivocal acceptance from all parties as to the fairness of the result. Everyone becomes entrenched in their own opinion and takes up defensive positions, and there are numerous engineers, lawyers and solicitors for each party ratcheting up the cost.
The adversarial blame process often ends with results being kept confidential, judgements being slanted by contractual rather than technical arguments, or even the persecution of individuals for only-too-human mistakes. We seem to have created an adversarial process that thrives on stances of moral superiority, and we appear to have done so deliberately. Perhaps it is time to adopt a different approach to achieve more open results and enhance the learning opportunities.
A 'NO-BLAME' PROCESS
This type of culture is already supported by various bodies, including those within the aviation and maritime industries, and it is also the process behind the Rail Accident Investigation Branch. In the aviation industry it is generally known as a 'Just Culture', and its basis is that blame is not placed where honest mistakes have occurred within the limits of someone's training and experience.
A Just Culture approach concentrates on the facts, offers a more open learning opportunity and is more likely to disseminate the results to those who might potentially make similar mistakes. It doesn't focus on who to blame, and therefore avoids defensive positioning and accusations. This is not an easy option:
● It requires a conscious decision for adoption at the start of the project and high levels of discipline to accept the findings if invoked.