Learning to live with cognitive bias

Ben Hoban lives and works in Exeter

You are probably familiar with the idea of cognitive bias: a trick of the mind that stops you seeing what’s in front of you or thinking clearly, something that’s a recognised cause of diagnostic error. There is a whole menagerie of biases with mundane or more exotic names like base-rate neglect, the gambler’s fallacy and the Pollyanna principle,1,2 and wandering through this psychological zoo for any length of time may make us question how much we ever get right. Of course, if you’re prone to exceptionalism, you might just smile and shake your head at how easily other people get in a muddle.


The idea of cognitive bias is connected with the idea that we have two complementary ways of thinking.3,4 What we usually understand as thinking – consciously processing information step-by-step in a way that we can readily explain – turns out to be quite slow. It can only handle small sets of data, and only if they’re readily available, limiting its usefulness. The other kind of thinking is far more powerful, integrating huge datasets from multiple sources below our level of conscious awareness, including memory and subtle environmental cues. It is so fast and so distinct from its slower cousin that we don’t even consider it thought, referring to it instead as intuition, gut-feeling or acting-in-the-moment. Its weakness, though, is that it is prone to cognitive bias. To some extent, of course, we can recognise this and make allowances. The tendency to bias in our fast thinking, however, is so fundamental that it points to a much bigger truth about how we perceive the world.


Fast thinking relies on heuristics: cognitive short-cuts and rules of thumb that work by association and extrapolation to bridge our mental gaps and fill in the blanks. We rely on these heuristics not just in our thinking, but in our day-to-day experience of reality. We all have a literal blind spot in our visual field of which we’re so unaware that identifying it feels like a party trick.5 Likewise, our perception of colour is limited to central vision, yet there is no point beyond which we suddenly experience our surroundings in black-and-white. Inattentional blindness prevents us from seeing something unexpected even when it stands in front of us and waves.6 In each of these cases, we are integrating incoming sensory data with extensive background knowledge of how things usually work to make sense of the world around us. We do this so smoothly that we don’t even realise: we think we’re simply observing reality, when in fact what we see depends at least partly on what our brain tells us to see. Cognitive bias affects our perception and thinking in a similar way during the consultation. It’s not just that we commit errors: our entire mental operating system is built in a way that makes them inevitable.

Can we ever hope to see things clearly, then, or are we doomed to founder in a sea of subjectivity and misdiagnosis? We all fall off the tightrope now and then, and a safety net is a sensible precaution, no matter how confident we are in our skills. More than this, though, being open about our propensity to error – with ourselves, our patients and our colleagues – makes it easier to learn from experience. Every diagnostic ‘failure’ becomes another piece of evidence to inform and refine our thinking next time. Patients tend to value doctors who are fallible but willing to help them through life’s difficulties by embracing their own part in the drama, as one person with another, rather than pursuing some false vision of objectivity. We will never be free from bias or error. Our security lies not in perfection, but in recognising and learning from our imperfections.


  1. Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Academic Medicine 2003; 78: 775–780
  2. Pohl R (ed). Cognitive Illusions: Intriguing Phenomena in Judgement, Thinking and Memory. 2nd edition. Psychology Press, 2016
  3. Kahneman D. Thinking, Fast and Slow. Penguin, 2012
  4. Lehrer J. The Decisive Moment: How the Brain Makes Up Its Mind. Canongate Books, 2010
  5. (accessed 20/6/22)
  6. (accessed 20/6/22)

Featured image by Osarugue Igbinoba on Unsplash
