
The James Reason Swiss Cheese Failure Model in 300 Seconds


James Reason Swiss Cheese Model. Source: BMJ 2000 Mar 18;320(7237):768-770

A while ago I was part of the Cardiff pilot of Practical Strategies for Learning from Failure (#LFFdigital). My job was to explain the James Reason Swiss Cheese Failure Model in 300 seconds (5 minutes).

This is what I did.

The Swiss Cheese Model of Accident Causation (to give it its full name) was developed by Professor James T. Reason at the University of Manchester about 25 years ago. The original 1990 paper, “The Contribution of Latent Human Failures to the Breakdown of Complex Systems”, published in the Philosophical Transactions of the Royal Society of London, makes clear that these are complex human systems, which is important.

Well worth reading is the British Medical Journal (BMJ) paper from March 2000, ‘Human error: models and management’. This paper gives an excellent explanation of the model, along with the graphic I’ve used here.

The Swiss Cheese Model, my 300-second explanation:

  • Reason compares Human Systems to Layers of Swiss Cheese (see image above).
  • Each layer is a defence against something going wrong (mistakes & failure).
  • There are ‘holes’ in the defence – no human system is perfect (we aren’t machines).
  • Something breaking through a hole isn’t a huge problem – things go wrong occasionally.
  • As humans we have developed to cope with minor failures/mistakes as a routine part of life (something small goes wrong, we fix it and move on).
  • Within our ‘systems’ there are often several ‘layers of defence’ (more slices of Swiss Cheese).
  • You can see where this is going…
  • Things become a major problem when failures follow a path through all of the holes in the Swiss Cheese – all of the defence layers have been broken because the holes have ‘lined up’ (there’s a small code sketch of this idea just after the list).
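
To make the ‘holes lining up’ idea concrete, here is a minimal sketch in Python (mine, not Reason’s; the number of layers, the hole probabilities and the function names are all illustrative assumptions). Each layer stops a hazard unless the hazard happens to hit a hole, and an accident only occurs when a hazard finds a hole in every layer:

import random

def hazard_breaches_all_layers(hole_probabilities, rng):
    # The hazard gets past a layer only if it happens to hit a hole in that layer.
    return all(rng.random() < p for p in hole_probabilities)

def estimate_accident_rate(hole_probabilities, trials=100_000, seed=1):
    # Monte Carlo estimate of how often the holes 'line up' across every layer.
    rng = random.Random(seed)
    breaches = sum(hazard_breaches_all_layers(hole_probabilities, rng) for _ in range(trials))
    return breaches / trials

# Illustrative (made-up) chance of a hole in each of four layers of defence.
layers = [0.30, 0.20, 0.25, 0.10]
print(estimate_accident_rate(layers))  # roughly 0.30 * 0.20 * 0.25 * 0.10 = 0.0015

The point of the sketch is simply that no single layer has to be perfect; the protection comes from needing every layer to fail at once, which is why quietly removing or weakening a layer raises the overall risk.
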
Source: Energy Global Oilfield Technology http://www.energyglobal.com/upstream/special-reports/23042015/Rallying-against-risk/

Who uses it? The Swiss Cheese Model has been used extensively in Health Care, Risk Management, Aviation, and Engineering. It is very useful as a way of explaining the concept of cumulative effects.

The idea of successive layers of defence being broken down helps you understand that things are linked within the system, and that intervention at any stage (particularly early on) could stop a disaster unfolding. In activities such as petrochemicals and engineering it provides a very helpful visual tool for risk management. The graphic from Energy Global, who cover Oilfield Technology, helpfully puts the model into a real-world context.

Other users of the model have gone as far as naming each of the Slices of Cheese / Layers of Defence, for example (a short worked example using these labels follows the list):

  • Organisational Policies & Procedures
  • Senior Management Roles/Behaviours
  • Professional Standards
  • Team Roles/Behaviours
  • Individual Skills/Behaviours
  • Technical & Equipment
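
Tying these example labels back to the ‘cumulative effects’ point above, here is a tiny extension of the earlier sketch. The layer names come from the list, but the hole probabilities are made-up numbers purely for illustration; if the holes were independent, the chance of a hazard lining up with all of them would simply be the product of the individual chances:

# Named layers of defence, each with an assumed (made-up) chance of a hole.
named_layers = {
    "Organisational Policies & Procedures": 0.05,
    "Senior Management Roles/Behaviours": 0.10,
    "Professional Standards": 0.04,
    "Team Roles/Behaviours": 0.08,
    "Individual Skills/Behaviours": 0.12,
    "Technical & Equipment": 0.02,
}

p_all_lined_up = 1.0
for layer, p_hole in named_layers.items():
    p_all_lined_up *= p_hole

print(f"Chance of every defence failing at once: {p_all_lined_up:.1e}")  # about 3.8e-08

Each extra layer multiplies the protection, and each weakened layer multiplies the risk.
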

What does this mean for Learning from Failure? In the BMJ paper Reason talks about the System Approach and the Person Approach:

  • Person Approach – failure is a result of the ‘aberrant mental processes of the people at the sharp end’, such as forgetfulness, tiredness, poor motivation etc. There must be someone ‘responsible’, or someone to ‘blame’, for the failure. Countermeasures are targeted at reducing this unwanted human behaviour.
  • System Approach – failure is an inevitable result of human systems – we are all fallible. Countermeasures are based on the idea that “we cannot change the human condition, but we can change the conditions under which humans work”. So, failure is seen as a system issue, not a person issue.

This thinking helpfully allows you to shift the focus away from the ‘Person’ to the ‘System’. In these circumstances, failure can become ‘blameless’ and (in theory) people are more likely to talk about it, and consequently learn from it. The paper goes on to reference research in the aviation maintenance industry (well known for its focus on safety and risk management) where 90% of quality lapses were judged as ‘blameless’ (system errors), and therefore as opportunities to learn from failure.

It’s worth looking at the paper’s summary of research into failure in high reliability organisations (below) and reflecting: do these organisations take a Person Approach or a System Approach to failure? Would failure be seen as ‘blameless’ or ‘blameworthy’?

High Reliability Organisations: Source BMJ 2000 Mar 18;320(7237):768-770

It’s not all good news. The Swiss Cheese Model has attracted a few criticisms, which I have written about previously in ‘Failure Models, how to get from a backwards look to real-time learning’.

It is worth looking at the comments on the post for a helpful analysis from Matt Wyatt. Some people feel the Swiss Cheese model represents a neatly engineered world. It is great for looking backwards at ‘what caused the failure’, but is of limited use for predicting failure. The suggestion is that organisations need to maintain a ‘consistent mindset of intelligent wariness’. That sounds interesting…

There will be more on this at #LFFdigital, and I will follow it up in another post.

So, What’s the PONT?

  1. Failure is inevitable in Complex Human Systems (it is part of the human condition).
  2. We cannot change the human condition, but we can change the conditions under which humans work.
  3. Moving from a Person Approach to a System Approach to failure helps move from ‘blameworthy’ to ‘blameless’ failure, and learning opportunities.

Responses to “The James Reason Swiss Cheese Failure Model in 300 Seconds”

  1. complexwales

    I can’t help thinking that when Reason says “complex human systems” he really means ‘complicated human designs’. After all, the whole premise of a complex system is that it’s non-linear. Things don’t have to line up and the relationship between cause and effect can be oblique. It would be like one layer of cheese being a baked Camembert (it just slows problems down), one an American Slice (bouncing problems off in all directions) and one a thin wedge of unbreakable Parmesan. What’s more, the line of failure could be the equivalent of a hot wire that simply slices through, holes or not. That’s enough of the metaphor.

    Cognitive Science has moved on in the past 25 years, and what were described as failures in human cognition are now more clearly recognised as contextual strengths, not failures. If you work in a widget factory full of machines designed for specific purposes, then we expect them to do exactly what they are supposed to do. In this context, it’s a big machine with a few annoying biological bits mucking up the teleological perfection. Health is not that.

    Health is an ecosystem, a Biology with the odd stupid inert mechanical bit doing the boring stuff. In this sense, we don’t want high reliability – quantitative efficiency – out of the qualitative context. For example, consistently giving every third person an infection is highly reliable. In health we’ve suffered from the bell curve effect. NICE set up most of their advice for the middle line on an efficient normal distribution of idealised patients. What that means is that the perfectly designed best practice works perfectly for a tiny proportion of the world. The job is actually more about tailoring every decision to fit the individual. Sounds mad, doesn’t it, but that’s why it takes 14 years to become a Doctor. Unlike factories and boats, in complex systems there are different outcomes, in different directions, for people with different wants and needs. In the end every body dies, so in Reason terms the whole health system is one massive failure.

    Health doesn’t need to be highly reliable, like a machine, albeit some parts like labs, radiology and theatres are more like the Ships and Power Stations of the research. The majority of health needs to be resilient. Going wrong is all part of being alive; the trick is, as you say, to be sensitive to the present, spot inevitable variations early and make a choice each time. It’s why zero harm campaigns don’t work. We’re in the business of harm: we exchange one harm (appendicitis) for a lesser harm (appendectomy). So harm can’t be a failure.

    Just stirring up your head ready for the conversation.
    Thanks for the mention, “axe wielding” made me laugh out loud.

  2. whatsthepont

    I haven’t got past the Camembert, American Cheese Slice and Parmesan metaphor for the minute.
    The Matt Wyatt Cheese of The World / Exotic Cheese Board Model of Failure could be the 21st Century version.
    You should work on the graphic.
    It would be brilliant.
    Welcome back from holidays, I’ve missed you.

    1. ComplexWales (@ComplexWales)

      Exotic Cheese Board of Failure graphic, coming up!

  3. Dysgu o Fethiant: Beth mae hyn yn ei olygu i archwilio? | Good Practice Exchange at The Wales Audit Office

    […] Chris Bolton’s presentation on the James Reason Swiss Cheese Failure Model, which compares human systems to layers of Swiss Cheese. Reason chose Swiss Cheese for […]

  4. Learning from Failure: What does this mean for audit? | Good Practice Exchange at The Wales Audit Office

    […] Bolton’s presentation was on the James Reason Swiss Cheese Failure Model, which compares human systems to layers of Swiss Cheese. Reason chose Swiss Cheese for a reason […]

  5. Practical Strategies for Learning from Failure is coming to Leeds! #LFFdigital | Connecting Social Care and Social Media

    […] @whatsthepont The James Reason Swiss Cheese Failure Model in 300 Seconds […]

  6. Sut wnaeth Cyngor Abertawe cynnal ymchwiliad craffu i’w diwylliant | Good Practice Exchange at The Wales Audit Office

    […] done quite a bit of work on failure over the last few years through our Manager Chris Bolton. This work has been a foundation for the information we share and our focus on […]

  7. How Swansea Council undertook a scrutiny inquiry into their culture | Good Practice Exchange at The Wales Audit Office

    […] done a fair bit of work around failure over the last couple of years through our Manager Chris Bolton. This work has underpinned a lot of our information sharing and our focus on improvement. So it’s […]

  8. Struggling with Learning from Failure? Just host a Cheese Fondue Party – What's the PONT

    […] Swiss Cheese and Failure. Previously I’ve written about the James Reason Swiss Cheese Model which is widely used to illustrate how failure happens in complex systems. I’ve even had a go at trying to explain it in 300 seconds (link here). […]

  9. Automating bugs – Automation Journal

    […] a legacy, big messy project prevails the Swiss cheese failure model. In software development, the holes in the defence are the unknown concepts and the assumptions. […]

  10. Chris Subbe

    Reblogged this on An Audible Patient Voice and commented:
    And a great summary about systems approach to failure.

  11. The Swiss Cheese Model: Human Error Puts Holes in IT Security – DevPro Journal

    […] complex systems” by University of Manchester professor James T. Reason. The model, explained in a What’s the Point article, compares layered defenses — whether intended to prevent a data breach, an accident, or […]

  12. The Regulation Culture Ladder. Moving from Pathological to Generative. – What's the PONT

    […] The James Reason Swiss Cheese Failure Model in 300 Seconds […]

  13. Swiss Cheese, Mishaps, Accidents and Family Bust-Ups

    […] Read another opinion […]

  14. Is anyone deploying 'Innovation and Learning' people alongside COVID-19 Response Teams? – What's the PONT

    […] Basically everything gets mixed together in a gloopy mess. This is an idea that builds upon the Swiss Cheese Model of Failure (explained here) where everything is mixed together for the review, like the cheese in a fondue. What is drawn out […]

  15. 2:00PM Water Cooler 12/27/2022 | naked capitalism

    […] (I love that “To make it easy to remember” part.) Here is James Reason’s version (from the NC archives). Reason is an “error management” scholar, and the developer of the model: […]
