Based on a transcript of the five-part podcast series on Diagnostic Error with Dr Mark Graber and Dr Owen Bradfield.
You’re nearing the end of another busy ED shift in your second term as a junior doctor. You pick up your last patient of the day: Jane, a 72-year-old woman brought in from a retirement village with a two-hour history of vomiting and headache. Multiple residents of the same retirement village have been unwell with vomiting and diarrhoea. Jane is pleasantly confused but has normal vital signs and is afebrile. On examination she has dry mucous membranes, a soft abdomen and a grossly normal neurological examination. She is anxious to get home to her husband as soon as possible.
You provisionally diagnose Jane with likely viral gastroenteritis and write her up for some intravenous fluids and anti-emetics. Come handover, she is still vomiting intermittently and complaining of a headache. You draft her discharge letter and hand her case over to the evening staff.
During your shift the next day you have a heart-sink moment when a colleague says, “Remember that patient you saw last night?”
You learn that during her stay in the department, Jane became increasingly confused and disoriented. She was reviewed by another staff member and sent for an urgent non-contrast CT scan, which revealed a subarachnoid haemorrhage likely arising from a ruptured aneurysm. She was transported overnight to a tertiary centre for ongoing neurosurgical management.
Diagnostic error is the failure to provide an accurate and timely explanation for a patient’s symptoms. Research suggests that our diagnoses are incorrect approximately 10-15% of the time, and a small but important proportion of these errors result in serious harm to patients. We have all encountered patients who carry the burden of misdiagnosis, and may even have been involved in their initial care.
To understand how errors can occur, we can first consider the two distinct thought processes that underlie decision-making, and the cognitive biases that may interfere with our gathering and processing of information.
Thinking, fast and slow
Dual processing is a widely accepted model of clinical decision-making. It describes two distinct modes of thought: type 1, or intuitive thinking, and type 2, or analytical thinking.
Intuitive thinking, also referred to as gestalt, relies on pattern recognition. This type of thinking is automatic and rapid, and often employs heuristics (mental shortcuts for problem-solving) such as the availability heuristic, in which the diagnoses that come most readily to mind are favoured: if you hear hoof beats, think horses, not zebras.
Analytical thinking refers to the application of logic and reasoning to work through unfamiliar patterns. This type of thinking is systematic, resource-intensive and time-consuming.
As junior doctors we are frequently impressed by clinicians who diagnose aortic stenosis without so much as touching a stethoscope to the chest, or who can press an abdomen for a moment and refer the patient to the surgical team with appendicitis. Such intuitive thinking is essential to rapid decision-making and efficient workflow, but it can be flawed, particularly when a clinician’s confidence outweighs their experience.
Cognitive bias refers to systematic errors in thinking. In the clinical context, over 20 individual biases have been identified as contributing to diagnostic error. Common examples include anchoring (fixating on an initial impression despite new information), confirmation bias (seeking evidence that supports the working diagnosis while discounting evidence against it), premature closure (accepting a diagnosis before it has been fully verified) and the availability bias (favouring diagnoses that spring readily to mind).
Cognitive bias can be compounded by contextual factors such as time pressure, emotional distress and hunger: It’s the end of a shift, I’m hungry and tired, and ideally all of my patients will be sorted before I attend evening handover.
Strategies to encourage analytical thought and avoid cognitive bias
Junior doctors have a smaller pool of clinical experience to draw from than their senior colleagues, which makes it safer to rely on analytical thinking; intuition can then play a larger role over time with increasing exposure and clinical acumen. So how can junior doctors work towards avoiding diagnostic error, over-reliance on heuristics, and the impact of cognitive bias? Commonly suggested strategies include taking a brief diagnostic “time-out” to ask “what else could this be?”, actively seeking findings that do not fit the working diagnosis, deliberately considering the worst-case diagnosis and checking for its red flags, and safety-netting with clear follow-up so that the diagnosis can be revisited if the patient does not improve as expected.
Even if we were to apply all of the above strategies, diagnostic error will always remain a possibility due to atypical presentations and other confounding factors. In Jane’s case, a more cautious assessment and a broader list of differential diagnoses may still have resulted in the same initial management and outcome: perhaps she had baseline cognitive impairment, or her headache may have been relatively mild or difficult for her to describe.
However, consciously developing awareness of our cognitive biases and limitations may help us recognise earlier the cases in which a patient is not behaving in an expected way, allowing us to anticipate potential complications sooner. Forming habits around the above strategies and applying them throughout our careers will likely help us learn from our mistakes, reduce our medicolegal risk, make us more forgiving of our colleagues, and ultimately lead to better outcomes for our patients.