First of all, anyone who says they haven’t made a mistake in their professional capacity is most probably: a) lying, b) ignorant, c) lacking insight or d) all of the above (a terrifying condition called “terminal insight failure” – no known therapy, only palliative measures). It is from our failures that we learn. I wish that when I was a new graduate nurse in Intensive Care, someone had said:
“You will make mistakes. All that will define you as a health care professional is how you recover and correct the course of treatment, care or relationships endangered by these mistakes.”
(Injectable Orange, 2013 – new grads you can quote me on that)
Instead, I was left for my first six months thinking everyone I worked with was infallible and had knowledge that spanned every possible critical illness and therapy. A superiority complex seems to exist within many tribes – Intensive Care considers itself the fixer of all others’ mistakes. My question really was: who fixes Intensive Care’s mistakes? The family? Community health? The GP? Psychologists? Strangely enough, it was at the point that I realised my superiors were not omniscient, omnipotent beings that I began to find a degree of confidence in my own knowledge. It is a sad fact that we often like our juniors to see us as bulletproof, when in fact my best mentors shared their gaps in knowledge, their lessons learnt and their passion to drive their own acquisition of knowledge.
I was absolutely thrilled to see this concept emerging into the mainstream with the recent BBC Horizon broadcast of ‘How to Avoid Mistakes in Surgery’ – see Resus Room Mx – BBC Horizon and Kevin Fong for a great synopsis and video links. I did not find this documentary amazingly illuminating, as much of my day-to-day work is concerned with training and clinical governance activity around human factors, in the forms of team-based training and implementation of escalation procedures. What it did do, however, is bring the concept of ‘human factors’ into mainstream discussion. The point of greatest interest I took from the documentary was a small section on psychological testing relating to error recovery.
This is a concept we have been exploring in simulation at my hospital, particularly with anaesthetics registrar training. The scenario premise is that the registrar is brought in, without briefing other than a normal clinical handover, at a point when an error has been made, with the objective of recognising the error/failure and correcting or recovering from it. A framing used with success recently was a scenario in which a registrar was called to see a patient in Recovery following a shoulder arthroplasty with regional block. The patient was a young male, presenting with post-operative delirium, uncontrolled pain and dense paresis of the non-operative limb. The registrar promptly assessed the patient (I won’t bore you with the details) and determined probable wrong-site regional anaesthetic administration. The registrar called the consultant and then implemented an alternative analgesia strategy. The team then progressed with a full open disclosure discussion with the patient and the scenario ended. The debrief yielded a multitude of discussion points from participants and observers alike.
Stage two of the same session of simulations moved to an acute pain team review (with a different registrar and the same recovery nurse) the following day on the orthopaedic ward. The patient still complained of pain in the operative limb and numbness of the non-operative limb. More importantly, there was no recollection of the disclosure of error from the post-anaesthetic unit the day before. The registrar then proceeded to explain the mistakes, and the subsequent steps to correct these, to a now rather irate patient. Once again a robust discussion ensued in the debrief, drawing on personal/professional experiences from the entire team, ethical and philosophical considerations, organisational culture and current literature on the topic. The feedback from these sessions was the most positive of any anaesthetics training we have conducted. The phrase from one written session evaluation, ‘no-one ever teaches you this stuff’, resonated greatly with me. Why not?
The discussion about the innate neurological capability of an individual to bounce back from error and promptly correct is an underplayed component in the ‘human factors’ realm currently. We have gotten pretty good with checklists and protocols, which in Reason’s “Swiss Cheese Model” provide adequate if not excellent patches, but what of the individual who freezes and develops tunnel vision?
Cognitive aids, such as the brilliant Vortex approach, are definitely a great next step from checklists, giving the individual clarity of approach in the heat of crisis. The issue that cognitive aids do not overcome is the individuality of response when a mistake is made. One person may announce ‘I screwed up that induction dose, quick, let’s up the pressors’; another might collapse and rock in the corner saying ‘Propofol, propofol, propofol’.
I can only think of a couple of ways that work to address the individuality of error recovery (especially when it is our own error we are trying to recover from). Mindfulness training (nicely explained by Impacted Nurse) urges us to explore our reactions in every situation, from the benign to the sublime. Another potential aid is simulation training: mistakes made, or responses to mistakes, in simulation, coupled with skilled debriefing, can provide a schema for reflective practice.
For an interesting and succinct look at failure in a business model, have a look at economist Tim Harford’s 3 principles of failure. If this piques your curiosity, check out Tim’s book ‘Adapt – Why Success Always Starts with Failure’ and follow @timharford on Twitter for some really interesting commentary.