Based on a transcript of the five-part podcast series on Diagnostic Error with Dr Mark Graber and Dr Owen Bradfield.

 

Original transcript:  Nicholas Malouf

Author:  Nathan Jamieson

Editor:  Elie Matar

 

You’re nearing the end of another busy ED shift in your second term as a junior doctor.  You pick up your last patient of the day: Jane, a 72-year-old woman brought in from a retirement village with a two-hour history of vomiting and headache.  Multiple residents from the retirement village have been unwell with vomiting and diarrhoea.  Jane is pleasantly confused, but has normal vital signs and is afebrile.  On examination she has dry mucous membranes, a soft abdomen and a grossly normal neurological exam.  She is anxious to get home to her husband as soon as possible.

You provisionally diagnose Jane with viral gastroenteritis and write her up for intravenous fluids and anti-emetics.  Come handover, she is still vomiting intermittently and complaining of a headache.  You draft her discharge letter and hand her case over to the evening staff.

During your shift the next day you have a heart-sink moment when a colleague asks, “Remember that patient you saw last night?”

You learn that during her stay in the department, Jane became increasingly confused and disoriented.  She was reviewed by another staff member and sent for an urgent non-contrast CT scan, which revealed a subarachnoid haemorrhage likely arising from a ruptured aneurysm.  She was transported overnight to a tertiary centre for ongoing neurosurgical management.

 

Diagnostic error is the failure to provide an accurate and timely explanation for a patient’s symptoms.  Research on diagnostic error suggests that our diagnoses are incorrect approximately 10-15% of the time, and a small but meaningful proportion of these errors results in serious harm to patients.  On that estimate, a clinician seeing twenty patients a day would make two to three diagnostic errors every shift.  We have all encountered patients who carry the burden of misdiagnosis, and may even have been involved in their initial care.

To understand how errors can occur, we can first consider the two distinct thought processes that underlie decision-making, and the cognitive biases that may interfere with our gathering and processing of information.

 

Thinking, fast and slow

Dual processing is a widely accepted model of clinical decision-making.  It describes two distinct modes of thought: type 1, or intuitive thinking, and type 2, or analytical thinking.

Intuitive thinking, also referred to as gestalt, relies on pattern recognition.  This type of thinking is automatic and rapid, and often employs heuristics (mental shortcuts for problem-solving) such as the availability heuristic: if you hear hoofbeats, think horses, not zebras.

Analytical thinking refers to the application of logic and reasoning to work through unfamiliar presentations.  This type of thinking is systematic, but resource-intensive and time-consuming.

 

 

As junior doctors we are frequently impressed by clinicians who diagnose aortic stenosis without so much as touching a stethoscope to the chest, or who can press an abdomen for a moment and refer the patient to the surgical team with appendicitis.  Such intuitive thinking is essential to rapid decision-making and an efficient workflow, but it can be flawed, particularly in situations where a clinician’s confidence outweighs their experience.

 

Cognitive bias

Cognitive bias refers to systematic errors in thinking.  In the clinical context, over 20 individual biases have been identified as contributing to diagnostic error.  Some examples include:

Availability bias: accepting a diagnosis based on how easily an example is called to mind, giving preference to events that are more recent or observed personally.   I’ve treated multiple patients with gastroenteritis today, and it is a very common cause of vomiting and dehydration.

Anchoring: fixating on an initial piece of information when making subsequent judgments. This patient is presenting from a retirement village where multiple residents have symptoms suggestive of gastroenteritis, which makes it likely that she has the same diagnosis.

Confirmation bias: the tendency to look for evidence that supports a diagnosis and to discount evidence that refutes it.  The patient has a headache and is disoriented, but I don’t know her cognitive baseline, and both are probably due to dehydration.

Visceral bias: the influence of the clinician’s negative or positive feelings towards a patient (counter-transference), which may result in incomplete information-gathering.  Jane is a pleasant lady and is worried about her husband at home.  I want to treat her symptomatically and get her home as soon as possible.

 

Cognitive bias can be compounded by contextual factors such as time pressure, emotional distress and hunger:  It’s the end of a shift, I’m hungry and tired, and ideally all of my patients will be sorted before I attend evening handover.
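
To see why anchoring and availability are so treacherous, it can help to make explicit the probability update they short-circuit.  Below is a minimal worked example using Bayes’ theorem in odds form; the figures (a 1% pre-test probability of a serious intracranial cause, and a likelihood ratio of 10 for new confusion with persistent headache) are invented purely for illustration and are not validated clinical estimates.

\[ \text{post-test odds} = \text{pre-test odds} \times \text{likelihood ratio} = \frac{0.01}{1 - 0.01} \times 10 \approx 0.10 \]

\[ P(\text{serious cause} \mid \text{red flags}) = \frac{0.10}{1 + 0.10} \approx 9\% \]

Even under these assumed figures, a seemingly negligible 1% pre-test probability climbs to roughly 9% once the red flags are weighed rather than explained away, a level of risk that clearly warrants further investigation before discharge.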

 

Strategies to encourage analytical thought and avoid cognitive bias

Junior doctors have a smaller pool of clinical experience to draw from than their senior colleagues, which makes it safer for them to rely on analytical thinking; intuition can then play a larger role over time as exposure and clinical acumen grow.  So how can junior doctors work towards avoiding diagnostic error, over-reliance on heuristics, and the influence of cognitive bias?

 

  • Make a differential diagnosis list: a common reason for missed diagnoses is failing to consider the correct diagnosis as a possibility in the first place
  • Practise reflectively: reflect on how you arrived at a diagnosis, try to identify biases that may have influenced you, and follow your patients’ progress over time where possible
  • Listen to your patient and their family members: many medicolegal claims involve a perception that the clinician listened poorly
  • Confer with senior colleagues and involve the patient in your decision-making: explaining your thought process to others allows explicit reflection on your diagnostic reasoning, and also helps to set realistic expectations for your patient, as diagnosis is rarely certain
  • Ensure effective follow-up: follow-up provides a safety net for the patient and feedback for the referring clinician, helping to refine their intuitive thinking
  • Document comprehensively and clearly, outlining important positive findings as well as relevant negatives
  • Learn about clinical decision-making and cognitive bias: awareness of how errors arise may help us minimise the resulting harm
  • Embrace simulation opportunities as part of your education: simulation scenarios allow trainees to gain exposure, proficiency and feedback in a safe environment

 

Even if we applied all of the above strategies, diagnostic error would remain a possibility because of atypical presentations and other confounding factors.  In Jane’s case, a more cautious assessment and a broader differential diagnosis list may still have led to the same initial management and outcome: perhaps she had baseline cognitive impairment, or her headache was relatively mild or difficult for her to describe.

However, consciously building awareness of our cognitive biases and limitations may help us recognise earlier when a patient is not behaving in an expected way, allowing potential complications to be anticipated promptly.  Forming habits around the above strategies and applying them throughout our careers will likely help us learn from our mistakes, reduce our medicolegal risk, make us more forgiving of our colleagues, and ultimately lead to better outcomes for our patients.

 

Useful resources

Part 1:  Diagnostic error – overview

Part 2:  Diagnostic error – decision making and bias

Part 3:  Diagnostic error – preventing diagnostic error

Part 4:  Diagnostic error – responding to diagnostic error

Part 5:  Diagnostic error – learning and teaching about diagnostic error