Written by Andy Smith, MTM publisher

Please pass this on to your colleagues with a safety, quality or risk job title, and to anyone you happen to know in the C-suite!

Human Factors and Healthcare

A couple of weeks ago I was privileged to attend a meeting of experts in surgical simulation. It was of excellent quality and represented the best of healthcare — until the subject of learning from the aviation experience was raised, for the third or fourth time.

At that point an eminently qualified surgeon made the usual healthcare excuse for ignoring outside expertise, saying, “the human body is far more complex than an airplane.” I suspect that many of you have heard the same comment. It is simply designed to close the conversation. After all, we may learn something that is inconvenient for us: namely, that working safely requires an admission that we are all human, that humans do make errors and need support, and, above all, that we all need to change our ways.

Three days later I was also privileged to attend a CAE Onboard Healthcare event demonstrating how airline procedures and human factors training — crew resource management, or CRM — are used to mitigate the effect of human error in the airline sector.

After a study of the Tenerife air disaster of 1977, we were asked to work out what went wrong and why. We then had a short briefing, and, armed with a laminated reminder sheet, three non-pilot healthcare innocents were put under some potentially error-inducing stress in a Full Flight Simulator.

We were scheduled to land at San Francisco, but that airport was closed as we approached, requiring us to divert. While managing that, a cabin fire with injuries was announced, requiring us to divert again. Yet despite us all being in a totally alien and new environment, we flew an Airbus A320 and landed it safely at LAX.

I doubt our flight instructors and air traffic controllers were overly impressed with the process we took to get there, but it was close enough to the instruction we had received for us to manage the situation and get the aircraft and its imaginary passengers safely onto the tarmac.

We all made errors during the ‘flight’ but acting as a true team we overcame them. The key difference is that the airlines expect errors to be made (to err is human, it’s what we all do) and have designed a safety system to nullify their effect. Healthcare still relies on everyone getting it right all the time.

Small wonder, then, that the frequency of airline incidents is around 1 per million, whilst healthcare runs at around 1 in 300.

Two additional things struck me at the time. First, when things require a decision to be made in the cockpit, there is a procedure to ensure the decision is made using all the resources available — and the most junior member of the team speaks first. That is to ensure that the authority gradient is managed and more than one voice is heard, i.e. not only the Captain’s (or, in healthcare, the Surgeon’s). Second, the looks of horror on the faces of the clinicians in the room when our two airline pilot instructors mentioned that all their actions and communication in the cockpit are recorded.

That, of course, is to protect them, by proving that they did all that they were required to do (whatever the outcome) and had followed carefully thought-out and tested procedures. There is no equivalent in healthcare, but there easily could be — and should be — to protect the operating room personnel should anything go wrong. If pilots perform ‘to the standard’ then they are legally protected. If not, then of course they are culpable.

The law, when it comes to healthcare, is currently a major barrier to improvement: it prevents the promulgation of best practice and stops us from learning from each other.

Finally, some days ago, in another conference room during a superb briefing on IPE (Inter-Professional Education), James Reason’s Swiss Cheese Model was raised. Only two of the audience of twenty-plus had heard of it. If you have not, please go here.

It lays out the role of procedures, and of the training that supports them, in ensuring the ‘holes in the Swiss cheese’ do not align. As you may have guessed, the holes represent potential errors during a complex procedure, and if we allow them to line up, the end result is catastrophe. This too was featured in our pilots’ briefing.

Recently I visited the Clinical Human Factors Group (CHFG) website, something I had been meaning to do since my last visit in December.

CHFG is a U.K. organization formed by airline Captain Martin Bromiley after the avoidable death of his wife Elaine during routine surgery. I had seen the video at the bottom of that page before, but this time it tied my experiences of the last weeks together.

To close, here are two other pieces of airline folklore that are regularly discussed when we all meet. Both were mentioned during the surgical meeting — the first verbatim, and the other by inference about the role of surgeons vis-à-vis pilots.

“If you think training is expensive try paying for an accident.”

“There are old pilots and there are bold pilots, but there are no old bold pilots.”