MY 2 CENTS WORTH: A CASE STUDY

By Randy Mains


When I teach and facilitate my one-day crew resource management (CRM) course or my five-day CRM train-the-trainer course, attendees often find it difficult to bridge the gap between recognizing the human factors they learn about in the course and applying that knowledge in the real world to prevent an accident.


“See it. Say it. Fix it.” is the most succinct and easiest-to-understand definition of CRM that I have heard in the decades I have been a CRM instructor and facilitator.


While you read this article, I challenge you to practice your CRM skills on the excerpts that follow. They are red flags I’ve extracted from the recent 53-page NTSB report CEN19FA072, titled “Group Chairmen’s Factual Report (on) Operational Factors/Human Performance,” resulting from the NTSB’s investigation of the fatal Survival Flight crash that killed three air medical crew members in January 2019 in Ohio. (I wrote an article titled “Just Say No!” detailing this accident in the March/April 2019 issue of this magazine.)


Try to explain why the following are red flags. Why are they dangerous human factors that could be links in an error chain (or a hole in James Reason’s “Swiss cheese” model) leading to that accident? Your awareness — and everyone in your organization’s awareness — of human factors that can hurt you is the essence of what CRM is all about. Once identified, use that awareness to take steps to break that link in the error chain (or plug that “Swiss cheese” hole) before it leads to an accident. The errors can be either active or latent threats in the “Swiss cheese” model. After you successfully identify why those red flags are a danger, ask this important question: “Does this situation or culture exist in my organization?” If it does, it needs to be fixed right this minute.


Now, put on your CRM “antenna” (which means stay vigilant for an error chain forming) and act as an aviation forensic scientist, as I highlight events taken from the NTSB report of that fatal accident.




Helicopter shopping is the potentially deadly practice of calling a second flight program to see if it will accept a flight that another program has turned down. It was considered a contributing factor in the Ohio crash because two other flight programs had already turned down the request due to bad weather. The NTSB also uncovered a derivative of helicopter shopping, something called reverse helicopter shopping.


On page 16 under the heading “Reverse Helicopter Shopping,” one pilot stated:


They (the Operational Control Center) specifically told me, they were looking at weather turndown and there’s one that was turned down out of Pittsfield, Illinois. “We were going to call that hospital and see if you wanted to take it.”


Why is this a red flag?


On page 39, section 4.17 under the heading “Safety Culture,” it’s noted:


Several former employees had stated that they received multiple texts from current company pilots and med crew stating they were “scared to fly.” One nurse stated that she believed the pilots were safe but the company (administration and management) was unsafe. Several pilots highlighted a lack of transparency by the company on safety issues.


Why is this a red flag?


Section 4.17.1 “Pressure to Attempt Flights” reads:


A pilot that had relocated to open the Columbus bases said there was “an awful push to get numbers...it was like they created an environment that felt like a competition, especially when [base] 14 opened up.” He stated that the vice president (VP) of emergency medical service (EMS) stated their flight volume was going to be 150 flights a month, where this pilot considered 30-35 flights per month to be realistically achievable in the new environment. Company management motivated bases to conduct flights by purchasing a massage chair for the base if they flew 30 flights in one calendar month. The count of flights per month was kept on the safety board in the SF14 base. According to the company’s monthly summary, the accident flight was the 26th flight the base would have completed in January.


Why is this a red flag?


Section 4.17.1.3. “Expected Launch Time” reads:


Pilots and medical crew stated that the company management wanted pilots to be off the pad within 7 minutes of getting a call for a flight. If the aircraft was not off the ground in 7 minutes, pilots were expected to fill out an “occurrence log” to explain to the DO why they didn’t lift off within 7 minutes.


Why is this a red flag?


Section 4.17.1.4. “Pressure from management to accept flights” reads:


Numerous pilots and medical crew indicated incidents where they were the recipient of or witnessed a pilot being reprimanded or challenged for declining a flight. One medical crewmember said, “The chief pilot of the company...would call within about 10 minutes and would cuss out our pilots and belittle them...saying... we need to take these flights...he would yell so loud on the phone that you could hear it...just standing within earshot.” He continued to say that the chief pilot told the pilot that if the base failed, it would be his fault because he was turning down flights.


Why is this a red flag?


In another case, a pilot declined a flight due to instrument conditions:


The lead pilot confronted him about why the pilot declined and said that the reporting station that was indicating IFR was faulty and that the pilot should have attempted the flight.



