HEALTH & SAFETY
The 2010 Deepwater Horizon disaster, in which 11 people died, dozens were injured and around four million barrels of oil were spilled into the Gulf of Mexico, has gone down in history as one of the worst events of its kind. When the crew of Transocean’s Deepwater Horizon floating drill rig lost control of the Macondo oil well and the escaping gas and oil ignited, it was the culmination of a staggering catalogue of errors. It was, as Earl Boebert and James Blossom describe it, a failed engineering project.
The two senior systems engineers have written a book, Deepwater Horizon: A Systems Analysis of the Macondo Disaster, that provides a highly detailed account of the oil spill. In it, the authors challenge the commonly accepted explanation that the crew, operating under pressure to cut costs, made mistakes that were compounded by the failure of a key safety device. Instead, the book reveals that the blowout emerged from corporate and engineering decisions that combined to create the disaster. The result is a fascinating and thorough account that explores the complex interplay between technology and the people who operate it.
DESTINED TO FAIL
When planning the project, its operator BP had estimated that the well would cost $120.6 million and take 98 days to drill. Yet barely anything actually went to plan. The book explores how technical compromise and what the authors label ‘go fever’ united to create the disaster. It also highlights how both the original drilling plan and its peer review failed. Part of the reason for that failure is bluntly explained in the book: “‘Risk’ clearly meant the same thing to the [safety] reviewers as it did to those who produced the register: risks to the timely and economical completion of the project, not to the Horizon, the lives of those aboard or the ecology of the Gulf.”
It’s unsurprising that an insufficient plan didn’t work out well. But this was far from the only problem. The book also explores the delays that left the crew up against a far tighter deadline than planned, and describes equipment that was ageing, in need of repair or simply not fit for purpose. It also cites a more human factor: “the crew of the Horizon had a tendency toward complacency when not actually ‘making hole’.”
The book is well worth reading to get to grips with the immense technical detail it covers on the drilling operations side of things. It’s also worth asking what we can learn from this disaster to prevent such engineering failures in the future, which is where the book’s co-author, Earl Boebert, comes in. When asked to break down such a complex scenario into just a few bullet points on the key engineering lessons to be learned, Boebert proffers a plain-speaking answer: “Actually, we think there is just one lesson: over the decades engineers concerned with emergent properties such as safety have evolved a process that for lack of a better term can be called ‘engineering discipline’. This process involves planning, reviews, testing, management of change, updating of risk registers and other techniques to maintain control of the interactions between elements. The lesson of the Macondo disaster is that in high-consequence environments, ignoring this process places everybody in that environment in very great peril.”
In this case, though, there was a huge list of factors that contributed to the disaster.