Cognitive bias

This article on cognitive biases (or usual cognitive dispositions) is intended to help the diagnostic process and enhance patient safety. It is written in the context that all medical decisions may be considered fundamentally biased: judgement heuristics, together with the cognitive-related and system-related factors that have developed over many years as most effective in health care and the cultures where that health care is delivered, limit physicians' rationality. Our brains appear to manage and process information principally in one of two modes[1]:

  • Automatic "intuitive" mode (Type 1 processes)
    • This is far more efficient, allowing rapid decision making
    • Hardwired or acquired through repetitive experience
    • The impressive Augenblick ("blink of an eye") process that is lifesaving for a tension pneumothorax, but is also the most common cause of medical error
  • Controlled "analytic" mode (Type 2 processes)
    • Inefficient
    • Uses cortical pathways, and even physiology, completely separate from those of the automatic mode
    • Decision systems such as structured clerking and presentation, formal ward rounds, multiple history taking and second opinions have evolved in medical culture to aid this mode. However, once the decision system itself becomes automatic, addressable medical error will again occur.

The balance of the two mechanisms that allows optimal function will differ between specialities and clinical presentations. De-biasing is also very difficult and carries a productivity cost that society or individual patients may not routinely tolerate, yet expect retrospectively.

Introduction

The aim of this article is to help in identifying "cognitive dispositions to respond" (CDRs)[2]. It examines predictable tendencies in how doctors react to contextual clues that are largely unconscious and may contribute to flaws in reasoning (e.g. faulty heuristics, biases, sanctions, fallacies and errors). Readers are encouraged to address a prevailing pessimism about improving cognitive performance through the de-biasing techniques outlined below, and thus to develop better insight. This of course starts with recognising that common scenarios such as triage cueing (where, for example, a patient with chest pain is more likely to have a cardiac cause considered and so may enter an inappropriate pathway of care) can actually be addressed by ensuring general diagnostic skills are available at initial assessment rather than later in the diagnostic process. It does not deal with cognitive biases as commonly seen outside medicine that can nonetheless influence healthcare, such as in political decision making, in group decisions such as committee decisions and health policy, or in the literature. It cannot deal fully with imposed biases, such as when triage is time limited or resources are not available to pursue particular avenues of diagnosis.

A good guide to the jargon in this field is found at The importance of cognitive errors in diagnosis and strategies to minimize them 2003[2]. Definitions exist for:
  • Aggregate bias
  • Anchoring
  • Ascertainment bias
  • Availability
  • Base-rate neglect
  • Commission bias
  • Confirmation bias
  • Countertransference
  • Diagnosis momentum
  • Feedback sanction
  • Framing effect
  • Fundamental attribution error
  • Gambler’s fallacy
  • Gender bias
  • Hindsight bias
  • Multiple alternatives bias
  • Omission bias
  • Order effects
  • Outcome bias
  • Overconfidence bias
  • Playing the odds
  • Posterior probability error
  • Premature closure
  • Psych-out error
  • Representativeness restraint
  • Search satisfying
  • Sutton’s slip
  • Sunk costs
  • Triage cueing
  • Unpacking principle
  • Vertical line failure (as with guidelines)
  • Visceral bias
  • Yin-Yang out

Diagnostic failure

The rate of diagnostic failure in diagnostically undifferentiated patients, such as those presenting to primary care and emergency medicine, approaches 10 to 15%. However, in visually orientated specialities such as plain film radiology and histopathology, routine error rates of 2% are obtainable. Interestingly, these error rates are higher when non-specialists do the reporting. This has resulted in several healthcare scandals, but is often an acceptable compromise in resource-constrained systems.

Clinical management failure

Clinical management does not stop at diagnosis, and cognitive bias can cause errors in selecting or presenting treatment options to a patient and in the evolution of clinical management. Many prescribing and medicines administration errors are due to cognitive bias. So too can be medical device selection, influenced by, say, a promoted product or a particular surgeon's skill set.

Management of uncertainty

The cognitive bias related to risk estimation is well known. Clinicians and patients handle the difficulties of prognosis, which is usually expressed in terms of risk for a population, with great difficulty, given individual issues such as psychological perception. Risk estimation by a physician taking a dopamine agonist might be expected to show, in some, a cognitive bias similar to that seen in some patients' gambling behaviour when on these drugs.
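
As a purely illustrative sketch (not from the source, and using hypothetical figures), the short Python snippet below shows how the same population-level treatment effect can be framed as a relative risk reduction, an absolute risk reduction, a number needed to treat, or natural frequencies; each framing can cue a different bias in both clinicians and patients.

  # Illustrative only: hypothetical figures showing how one treatment effect
  # can be framed in several ways, each inviting a different bias.
  def risk_framings(control_event_rate, treated_event_rate):
      arr = control_event_rate - treated_event_rate   # absolute risk reduction
      rrr = arr / control_event_rate                   # relative risk reduction
      nnt = 1 / arr                                    # number needed to treat
      return {
          "absolute risk reduction": f"{arr:.1%}",
          "relative risk reduction": f"{rrr:.0%}",
          "number needed to treat": f"{nnt:.0f}",
          "natural frequencies": (f"{treated_event_rate * 1000:.0f} vs "
                                  f"{control_event_rate * 1000:.0f} events per 1000 patients"),
      }

  # A 2% baseline risk reduced to 1%: "halves the risk" (50% relative reduction)
  # sounds far more impressive than "one fewer event per 100 treated" (NNT 100),
  # yet both describe the same data.
  print(risk_framings(0.02, 0.01))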

Common types of bias

Anchoring

Anchoring is focusing on one particular piece of information, such as a symptom, sign, or a particular diagnosis, early in the diagnostic process and failing to make any adjustments for other possibilities, whether by discounting or ignoring them.

Strategies to minimise anchoring

  • Gather sufficient information
  • Always have a differential diagnosis
    • Reconsider the most likely diagnosis if:
      • There are new symptoms or signs (a worked sketch follows this list)
      • The patient, without treatment, is not following the natural course of the assumed illness and is not improving
      • The patient is not improving as expected
  • Consider the worst case scenario - what you don’t want to miss
  • Consider the "sunk cost" rationally - just because you have personally invested in a diagnosis does not mean you were right
  • Consider that all of us are subject to confirmation bias (confirmatory bias or myside bias), the tendency to favour information that confirms our existing beliefs or hypotheses
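
As a minimal, purely illustrative sketch (the probability and likelihood ratio below are hypothetical, and this is not a method described in the source), the following Python snippet uses Bayes' theorem in odds form to show why even a confidently anchored diagnosis should shift when a new finding argues against it.

  # Illustrative only: Bayes' theorem in odds form, showing how much a new
  # finding should move a probability that has been anchored too firmly.
  def update_probability(prior_prob, likelihood_ratio):
      prior_odds = prior_prob / (1 - prior_prob)
      posterior_odds = prior_odds * likelihood_ratio
      return posterior_odds / (1 + posterior_odds)

  # A working diagnosis held with 80% confidence, then a new sign with a
  # likelihood ratio of 0.2 against it: the probability falls to about 44%,
  # which is no longer a safe anchor.
  print(f"Revised probability: {update_probability(0.80, 0.2):.0%}")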

Premature closure

Premature closure is the uncritical acceptance of an initial diagnosis and failing to search for information to challenge the provisional diagnosis or to consider other diagnoses.

Strategies to minimise premature closure

  • Address anchoring
  • Continuity of care or adequate communication of uncertainty and outstanding possibilities
  • Define a conditional management plan based on sufficient information
  • Identify any “red flag” symptoms and investigate appropriately
  • Consider consultation with a colleague or specialist
  • Remember "Sutton’s law": the diagnostic strategy of going for the obvious leads to "Sutton's slip" when possibilities other than the obvious are not given sufficient consideration

Search satisfaction

Search satisfaction refers to calling off the effort to further define the problem once one abnormality has been found, and so failing to look for other relevant information.

Strategies to minimise search satisfaction

  • Address anchoring
  • If information is available from more than one source, cross check it for reliability
  • Ensure you have a good understanding of specificity and sensitivity (see the sketch after this list)
  • Having identified one abnormality, ask yourself whether there is anything more going on
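
As an illustration only (the test characteristics and prevalence are hypothetical, not figures from the source), this short Python snippet shows how sensitivity, specificity and prevalence combine to give predictive values, and why a single positive finding in a low-prevalence setting should not end the search.

  # Illustrative only: predictive values from the standard definitions,
  # showing why prevalence matters as much as test performance.
  def predictive_values(sensitivity, specificity, prevalence):
      true_pos = sensitivity * prevalence
      false_pos = (1 - specificity) * (1 - prevalence)
      true_neg = specificity * (1 - prevalence)
      false_neg = (1 - sensitivity) * prevalence
      ppv = true_pos / (true_pos + false_pos)
      npv = true_neg / (true_neg + false_neg)
      return ppv, npv

  # A 90% sensitive, 95% specific test applied where prevalence is only 2%:
  # most positives are still false positives (PPV about 27%, NPV about 99.8%).
  ppv, npv = predictive_values(0.90, 0.95, 0.02)
  print(f"PPV {ppv:.0%}, NPV {npv:.1%}")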

Zebra retreat

Zebra retreat is the process of rejecting uncommon diagnoses or rare presentations of common conditions; in this context it means backing away from a rare diagnosis.

Strategies to address Zebra retreat

  • Consider the worst case scenario
  • Consider whether to address an outstanding issue even when it is inconvenient to do so

Authority bias

Authority bias is declining to disagree with an "expert".

Strategies to address authority bias

  • Do not assume the authority has all the information and is acting upon it
  • Promote open culture
  • Understand the decision making process of authority
  • Put safety of patient before personal discomfort at potential challenge to authority

Availability heuristic

The availability heuristic (availability short cut) is when recent, more frequent, or vivid patient diagnoses are more easily brought to mind and so are overemphasized in assessing the probability of the current diagnosis.

Strategies to address availability heuristic

  • Test a piece of information outside the current time frame
  • Look for red flags or symptoms or signs inconsistent with a common, less serious diagnosis that has been brought to the fore
  • Balance with respect to over-investigating or over-treating after a recent unexpected diagnosis (particularly one you personally missed)
