
The evolution of mistakes and errors – an NHS perspective

In the first of a two-part series, Richard Fieldhouse attempts to interpret some aspects of Nobel Prize winner Daniel Kahneman's work in trying to make sense of human errors, biases and rational behaviour

The NASGP has always taken a particular interest in the management of mistakes within primary care. Often working in isolation, with little or no support and in environments of enforced underperformance, sessional GPs, and in particular GP locums, can find themselves at the sharp end of system errors and huge variability, with every practice differing from the next in up to 500 points of variable, practice-specific information. We are the veritable canaries in the coal mine.

Daniel Kahneman, who won the Nobel Prize for his work on human judgement and decision-making, published his highly acclaimed book Thinking, Fast and Slow in 2011. It is a distillation of his life's work in behavioural psychology and of the cognitive basis for the common human errors that arise from heuristics and biases.

System One

Kahneman breaks human thought down into two systems: the eponymous Thinking Fast, which he calls System One, and the Slow, which is System Two. System One (S1) thinking is the automatic, fast, stereotypical, intuitive and emotional thinking that we all do, almost all of the time, even though we are largely unaware of it. This type of thinking requires little or no energy, and enables us to perform simple tasks instantaneously: completing simple sentences, doing sums like 2+2, localising the source of a sound or reading large text on a poster.

S1 thoughts are like our own little team of people working in our heads for us 24/7, keeping an eye on everything going on around us, coming in from all our senses, so that we're not exhausted by the simple things in life and can focus our energy instead on more complex tasks.

The typical image of an angry face is S1 thinking in action. Kahneman talks about S1 as our "jumping to conclusions machine", and through elegant studies showed that when people are shown an image of a face with wide staring eyes, an open mouth and bared teeth, most of us automatically conclude that the person is angry, even though there is actually little else to go on; the person could simply be scared, or shocked. But because there is so little other information, the result is a narrative so simple that we don't look for any other evidence, and we stick with our first impression.

System Two

The slow System Two (S2) thinking, by contrast, is the much more effortful, logical, calculating and conscious thinking we do, albeit much less often. It's responsible for directing our attention towards something specific, recalling a particular memory, or sustaining a faster walking pace than feels natural.

S2 only gets involved when we encounter something unexpected that S1 can’t automatically process; it’s activated when an event is detected that violates the model of the world that S1 maintains.

23 x 17

Kahneman illustrates S2 in action by asking us to solve 23 x 17 in our heads. As we do so, our muscles tense, our blood pressure rises, our heart rate increases and our pupils dilate. It takes strain, effort and vigilance, and it eats into our finite energy reserves.

And so we rely as much as we can on fast S1 thinking to take the strain off S2. But our slower S2 is quite lazy; think of it as our 'boss mode', getting reassurance only as and when it needs it.

Problems arise when we are faced with a complex question that requires the deeper analytical thinking of S2 but instead unconsciously allow S1 biases and intuitions to cloud our thoughts, making us prone to errors in judgement.

Bat and ball problem

To illustrate how easy it is to confuse our S1 and S2 states, Kahneman uses the example of a bat and ball that together cost £1.10, with the bat costing £1 more than the ball: how much does the ball cost? Using our best-guess intuition to solve this problem, most of us, including around 50% of the Harvard University students studied, get it wrong.

This question is perceived as low stakes, with a seemingly obvious answer all but handed to us in the wording, so much so that S2 often isn't activated and we give the wrong answer. In fact the answer is 5 pence, whereas most subjects answer 10 pence.
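A quick worked check, writing the two facts in the question out as equations, shows why 5 pence is right and the intuitive 10 pence is not:

ball + bat = £1.10
bat = ball + £1.00
so ball + (ball + £1.00) = £1.10
2 x ball = £0.10
ball = £0.05, and the bat costs £1.05

If the ball really cost 10 pence, the bat would have to cost £1.10 and the pair would come to £1.20.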

This tendency to substitute simpler, more comfortable S1 conclusions that "feel right" for the deeper, more uncomfortable, nuanced and effortful S2 thinking that complex problems require is a major theme that Kahneman returns to, as it crops up in many settings.
His work describes many such errors in thinking, all beautifully evidence-based through clever studies and experiments, so much so that the book has been called a masterpiece and a pivotal moment in how we understand ourselves.

Cognitive load

Of the biases and thinking errors that affect us as GPs, cognitive load stands out as a particularly important one.

For a demonstration of our fixed pool of energy reserves, Kahneman asks us to imagine taking a gentle stroll with a friend while talking through a tricky clinical case. If we begin to climb a hill, our train of thought slows as the physical effort increases, because we have to divert more energy and self-control to the work of walking up the incline. Only when we slow down and reach a comfortable physical pace does our thinking clear again.

This, Kahneman explains, is because the energy for processing cognitive tasks comes from the same pool as that for emotional and physical tasks. Mental energy is not a metaphor: difficult cognitive reasoning and hard concentration cost as much glucose as a fast sprint. We find it patently obvious when our muscles are tired; much less so when our cognitive energy is depleted.

Energy levels during a clinic - the Israeli parole judge study

In this study, Kahneman looks at research done on the records of eight Israeli parole judges, who were required to record every parole decision they made as well as every rest break they took. The work of a parole judge invariably carries a high cognitive load with high stakes; the consequences of each decision have a potential impact on public safety, so the default position is that the prisoner will not be released early. On average, 35% of prisoners were granted parole, but what struck the researchers was the large variation in decisions to grant parole depending on the time of day and the interval since a rest break. The first prisoner of the day, or the first reviewed after a rest break, had a 65% chance of being released early. But for prisoners reviewed just before a break or at the end of the day, the chance of being released early was almost 0%.

The researchers concluded that with each decision the judges' cognitive load increased and, as their physical and mental energy became depleted, their decision-making drifted back towards the easier default of refusing parole.

So a group of highly educated professionals, defined by their rationality and impartiality, were affected by tiredness and hunger. And, amazingly, until this study they were probably totally unaware of these very human factors affecting their performance.

Similar findings have emerged from studies of our own profession. Clinicians performing colonoscopies identify 50% fewer abnormal polyps in the afternoon than in the morning, and GPs are 10% more likely to prescribe an antibiotic for a viral condition in the afternoon; the drop in performance is not too dissimilar to that of doctors who have drunk a glass of wine or a bottle of beer before the procedure.

What you see is all there is

Kahneman writes extensively about how people jump to conclusions on the basis of limited information and ignore absent evidence. He has an acronym for this phenomenon: WYSIATI, "what you see is all there is".

Our S1 is constantly seeking patterns and coherence, and when it finds them it gives us a sense of ease, even lightens our mood, and increases our confidence in our judgements, making them more firmly embedded.

As a result of WYSIATI, S1 is following its drive to create a coherent and believable story. In fact, the less evidence and the more one-sided the information presented, the simpler it is for S1 to jump to a conclusion, and the more confident and at ease we feel with using S2 to endorse this judgement because "it feels right".

Our brains are wired to be these machines for jumping to conclusions. On the most fleeting evidence, we can instantaneously infer and invent all sorts of conclusions about the causes of events and the intentions of the protagonists, even though the connections we make may be entirely spurious. Kahneman calls these "illusions of causality and intention".

Moving triangles

A small yet smart demonstration of these illusions is the moving-triangles study. When subjects were shown a large and a small triangle circling around a screen, a consistent finding was that they described the large triangle as either chasing or bullying the small one. Kahneman writes that "we see causality and intention as clearly as we see colour".

It's a brilliant evolutionary mechanism, helping us rapidly identify and respond to threats by spotting changes in patterns: the sudden swaying movement in the long grass that, for our evolutionary ancestors, could have been a predator. To survive, it was best not to ask too many questions but to decide quickly and act. But it can be off beam in our uncertain, complex, information-rich world.

On a personal and societal level, WYSIATI is a key mechanism in why people can jump to conclusions, assume bad intentions, give in to prejudices or biases, and buy into conspiracy theories.

Hindsight bias

Another robust cognitive illusion that is very applicable to us in healthcare is hindsight bias, where we literally "change our minds" and revise our beliefs in the light of an outcome. We are rarely aware of this happening, and the cost is that when outcomes are bad, decision-makers are often unfairly blamed, even though the decisions and actions taken were reasonable given the circumstances at the time. It's the "Well, I knew that would happen!" phenomenon.

A striking demonstration of this is the bridge study, in which two groups of people were given the full details of a bridge over a river in Minnesota where debris was known to be at risk of becoming caught underneath, putting the nearby town at risk of flooding. The subjects were asked whether the city should go to the expense of employing a full-time monitor to detect any blockage.

In the group given only the information available to the city's decision-makers at the time, 24% said that yes, a full-time monitor should be employed.

The other group were given additional information revealing the actual outcome: the river had flooded and caused significant damage. In this group, 56% said a monitor should have been employed, even though they had been instructed not to let the outcome affect their judgement. The subjects did not realise that they were simply unable to "un-know" the outcome of the terrible flooding.

The bad news for those of us working in healthcare, where the stakes are usually high, is that the worse the outcome, the greater the tendency to hindsight bias.

Implications

In general practice, we need to be aware of the evidence base for our psychological limits as human beings, and of how these limits can affect our judgement, our proneness to error and the quality of our care.

This is absolutely not about personal weakness or stupidity or incompetence. This is about powerful evolutionary forces of physiology and psychology. We can't beat them by going on a resilience course and then heading straight back to our overloaded jobs. All we can do is learn from the evidence of researchers like Daniel Kahneman and do our best to implement checks and balances.

We need cultural and system change, but we can all individually start the process from the grassroots by making small changes that protect ourselves, our colleagues and our patients.

This concept of cognitive load, and of how depleted we become during our clinics as we grapple with complex and varied clinical scenarios, has very real consequences and risks in our day-to-day work.

And as GP locums working in unfamiliar settings, often without an adequate induction into practice-specific processes, we face additional demands on our finite pool of mental resources: our S2 thinking is frequently activated for non-clinical, administrative tasks that practice-based GPs would be able to carry out effortlessly using S1.

We must also remember and resist "WYSIATI" and hindsight bias when we are involved in significant events, or when thinking about colleagues who have been.

Ending this first part of our two-part series on risk and mistakes in general practice, we all look forward to the day when there is a just, open and safer culture in medicine. Working towards that is where the second part, focusing on Black Box Thinking, will take us.

Richard has worked as a freelance GP locum since 1995 in around 100 different practices, living and working in West Sussex and Hampshire. He founded NASGP in 1997 and the UK’s first locum chambers in 2004.

He enjoys walking, reads too many books on behavioural economics and has an unhealthy obsession with his sourdough starter.
