Nada Khan is an Exeter-based GP and an NIHR Academic Clinical Lecturer in General Practice at the University of Exeter. She is also an Associate Editor at the BJGP.
Back when I was working in a hospital setting and probably more concerned about acute airway emergencies than diagnostic dilemmas, I attended a training session that left a lasting impression. The session was about what to do when trying to get airway access and things aren’t going quite to plan. Do we always stop and rethink what we’re doing if what we’re trying isn’t working?
We watched a harrowing video titled ‘Just a Routine Operation’, a film made by Martin Bromiley, an airline pilot, about the tragic death of his wife Elaine. Elaine had been scheduled for a routine sinus operation, but the anaesthetic team encountered a ‘can’t intubate, can’t ventilate’ emergency. Despite repeated failed attempts, the team fixated on trying to intubate her and failed to escalate to a surgical airway in time. Elaine suffered catastrophic hypoxia, leading to irreversible brain damage and her death.1
Elaine Bromiley’s death wasn’t a failure of knowledge or equipment. It was a human factors failure. Her husband Martin went on to establish the Clinical Human Factors Group (CHFG), promoting awareness of how the environment, systems, and cognitive processes affect safety in healthcare.1,2 There is a lot to unpack in the field of human factors, but one concept stands out in this case, and often in general practice too: fixation error.
Fixation error: A hidden threat in practice?
Fixation error happens when a clinician locks onto a particular diagnosis or route of action. We can become so focussed on one problem we think we can solve that we fail to see the mounting evidence that we might be on the wrong track, and lose sight of a more critical issue. It’s not caused by inexperience, but often the opposite, and is driven by cognitive shortcuts, or heuristics, that bypass deliberate reasoning and can lead to cognitive biases in decision making.3 The problem with relying solely on heuristics comes when those mental shortcuts harden into clinical blind spots.
Reflecting Martin’s background as a pilot, an article in the New Statesman on human factors and errors in medicine picks up this concept through the lens of aviation safety.4 In aviation, where similar errors have caused fatal crashes, crews are trained to ‘get unstuck’: to bypass fixation errors by pausing, reassessing, and listening to concerns from the wider team. These kinds of approaches can be, and are, applied in healthcare settings in the form of safety checklists and time-outs at critical moments in surgical environments, creating space and time for team-based reassessment. General practice has been slower to embrace these cognitive guardrails, especially as we often work in silos, alone in a consultation room with a patient.
This all reminds me of Daniel Kahneman’s book ‘Thinking, Fast and Slow’, which describes two modes of thinking: the fast, intuitive System 1, which helps us make rapid decisions but can be prone to error, and the slower, more analytical System 2.5 Fixation error is a classic example of System 1 going unchecked: we jump to a diagnosis or course of action and remain locked into it despite contradictory evidence. Slowing down and deliberately engaging System 2 matters because it prompts us to reflect and to challenge our heuristics. In general practice, that might mean resisting the urge to settle on an initial diagnosis too quickly, and instead asking, ‘Could this be something else?’
What can we do about fixation error?
In general practice, the pace of a busy clinic and its high cognitive load can lead to an increasing reliance on rapid pattern recognition. But overuse of System 1 thinking can leave us vulnerable to multiple cognitive biases.6
Fixation error is perpetuated by anchoring onto an initial piece of information, and then reinforcing that anchoring bias by selectively interpreting new information to support the initial impression (i.e. confirmation bias).7 It’s easy to focus on first impressions, especially when they fit common patterns. On the BJGP podcast I spoke to Dr Afrodita Marcu about diagnostic delays in cancer amongst pregnant women. Women felt that doctors anchored on the pregnancy, attributing symptoms such as abdominal pain or tiredness to the pregnancy itself without considering an alternative diagnosis. This research emphasises the value of pausing, reassessing, and thinking again about what else might be going on as symptoms persist, worsen, or evolve. That deliberate reflective moment shifts us from automatic to analytical mode, from Kahneman’s System 1 to System 2 thinking.5
Moving to System 2 thinking, however, is not the only way to overcome cognitive biases. And whilst the checklists and ‘time-outs’ used in operating theatres are unlikely to translate directly to general practice settings, it may be worth considering mental checklists to surface and minimise biases in decision making. Gopal and colleagues suggest asking questions such as, ‘Did I consider causes besides the obvious ones?’ and ‘Did I ask questions that would disprove my diagnosis?’ when coming back to a case.7 It is also worth considering how to practise diagnostic flexibility in the notes, not just as a prompt for ourselves but for other team members coming back to review the patient. This might involve recording possible alternatives alongside a working diagnosis, to be considered when something is not getting better, or is getting worse, despite initial management.
Being aware of cognitive biases
Managing fixation error isn’t about rejecting our clinical instincts, but about learning when to question them. In the pressurised environment of general practice, where speed and efficiency often dominate, creating space for reflection and uncertainty is not easy. But even small opportunities to pause deliberately, such as a few words in the notes to flag diagnostic uncertainty or running a case past a colleague, can create headspace to think again, to think slow, and to challenge our own assumptions. These simple acts of reflection and curiosity can help us step back from the noise, shift out of autopilot, and reduce the risk of fixation error in practice.
References
- Bromiley M. The husband’s story: from tragedy to learning and action. BMJ Qual Saf. 2015;24(7):425-7.
- Carayon P, Wetterneck TB, Rivera-Rodriguez AJ, Hundt AS, Hoonakker P, Holden R, et al. Human factors systems approach to healthcare quality and patient safety. Appl Ergon. 2014;45(1):14-25.
- Blumenthal-Barby JS, Krieger H. Cognitive biases and heuristics in medical decision making: a critical review using a systematic search strategy. Med Decis Making. 2015;35(4):539-57.
- Leslie I. How mistakes can save lives: one man’s mission to revolutionise the NHS. New Statesman. 2014. Available from: https://www.newstatesman.com/long-reads/2014/06/how-mistakes-can-save-lives
- Kahneman D. Thinking, Fast and Slow. Farrar, Straus and Giroux; 2011.
- Saposnik G, Redelmeier D, Ruff CC, Tobler PN. Cognitive biases associated with medical decisions: a systematic review. BMC Med Inform Decis Mak. 2016;16(1):138.
- Gopal DP, Chetty U, O’Donnell P, Gajria C, Blackadder-Weinstein J. Implicit bias in healthcare: clinical practice, research and decision making. Future Healthc J. 2021;8(1):40-8.
Featured Photo by JC Gellidon on Unsplash