
The problem with the ‘bad apple’ fallacy

Published on 23 April 2019

When faced with a human error problem you may be tempted to ask ‘Why didn’t they watch out better? How could they not have noticed?’ You think you can solve your human error problem by telling people to be more careful, by reprimanding the miscreants, by issuing a new rule or procedure. They are all expressions of the ‘Bad Apple Theory’ where you believe your system is basically safe if it were not for those few unreliable people in it. This old view of human error is increasingly outdated and will lead you nowhere.

Sidney Dekker, The Field Guide to Understanding Human Error

How many of us can identify with this quote from Sidney Dekker? Have you ever been involved in an incident where the response was to discipline an individual (or individuals), or simply to change a process or guideline?

This article highlights the prevalence of this tendency in healthcare.

We have heard of many occasions in operating theatres, for example, where evidence-based safety check processes have been adapted, or additional duplicate checks added, in response to incidents. Are these changes always made with a full understanding of how they will work for front-line staff, or with a ‘work as imagined’ view? Atul Gawande, the original developer of what became the WHO Surgical Safety Checklist, took dozens of attempts before arriving at a process that actually worked. Many of his early attempts made work more difficult, for unanticipated reasons!

More information on how Atul Gawande and his team developed the now-famous safe surgery checks is given in his book, The Checklist Manifesto, which is well worth a read.

Consider the tragic and much-publicised death of Elaine Bromiley, in which clear protocols and guidelines weren’t followed. Was that because the clinicians were reckless? Or because the protocols themselves weren’t fit for purpose? Some of the key contributory factors identified in that case were:

  • Lack of clear leadership.

  • Ambiguous communication and lack of communication, leading to a lack of role clarity and team-working.

  • Lack of assertiveness from nursing staff, potentially linked to inter-professional hierarchies (and potentially to broader cultural hierarchy issues).

  • Task fixation, which led to a loss of situation awareness and poor task prioritisation.

In essence, protocols weren’t followed because the professionals involved lost awareness of what was happening around them (for multiple interdependent reasons), rather than because the protocols themselves were poor or misunderstood. In this instance, targeting the protocols or the individual(s) is highly unlikely to improve safety.

One of the challenges healthcare organisations face is the emotive reaction, both internally and externally, when things go wrong, and the (often subliminal) desire to be seen to be doing something. This can lead to knee-jerk reactions, which in turn have knock-on consequences:

‘Blame and Shame’

  • Potential loss of skilled professional(s), at a time when staff shortages abound

  • We don’t address the wider underlying systemic and/or cultural issues

  • Fear of future blame, which leads to less proactive reporting of risks, less willingness to disclose errors, and less willingness to engage with those affected by harm: patients and their families. This lack of openness then, understandably, leads to anger from those affected and a renewed search for the person or people to blame for what happened. And so the cycle continues…

‘Just change the rules’

(before conducting a thorough investigation with a Human Factors and Just Culture lens)

  • We don’t address the wider underlying systemic and/or cultural issues

  • We don’t consider exceptional circumstances that may have contributed

  • We end up with changes to existing rules, or new rules and guidelines, that at best have no impact on future safety and simply add more work for front-line staff, and at worst make things less safe. Without thorough analysis and an understanding of the multiple causal factors, even rules that make sense at face value can have unintended knock-on effects.

So what can we do?

  • We need to look beyond simplistic and linear incident analysis models that seek to find singular ‘root causes’, and consider the complexity of the socio-technical systems that make up healthcare.

  • Start with the assumption that people don’t come to work to cause harm or take unnecessary risks. Consider the question “I wonder why that made sense at the time?”

  • If changes to protocols, rules or guidelines ARE necessary (which of course they sometimes are), ensure that any change is complemented with high-quality education and adult-to-adult communication to gain understanding and buy-in, rather than a parental ‘just do it’ approach.

  • Break the cycle of blame and anger by being completely open and transparent at both a personal and organisational level with anyone affected by an incident of harm.

I look forward to hearing your thoughts!