Clinicians’ cognitive biases: a potential barrier to implementation of evidence-based clinical practice
Claudia Caroline Dobler,1 Allison S Morrow,1 Celia C Kamath2

1 Evidence-Based Practice Center, Robert D and Patricia E Kern Center for the Science of Health Care Delivery, Mayo Clinic, Rochester, Minnesota, USA
2 Division of Health Care Policy and Research, Robert D and Patricia E Kern Center for the Science of Health Care Delivery, Mayo Clinic, Rochester, Minnesota, USA

Correspondence to Dr Claudia Caroline Dobler, Evidence-Based Practice Center, Mayo Clinic, Rochester MN 55905, USA; dobler.claudia@mayo.edu


The uptake of new evidence in healthcare relies on clinicians’ willingness to change their clinical practice by implementing an evidence-based clinical intervention or deimplementing an obsolete, non-evidence-based practice. A number of barriers to change among health professionals have been identified, including the way that clinicians make medical decisions. When clinicians judge situations, make decisions and solve problems, they routinely use cognitive shortcuts, also called ‘heuristics’, as well as internalised tacit knowledge (based on clinicians’ own experiences, exchanges with colleagues, reading, and input from opinion leaders, patients, pharmaceutical representatives, and so on).1 Mental shortcuts help clinicians process large amounts of information in a short time and are an important tool that lets experienced clinicians make a correct diagnosis by recognising internalised patterns of signs and symptoms. They also have the potential, however, to prevent evidence-based decisions.

Here, we outline a number of cognitive biases that constitute potential barriers to the practice of evidence-based medicine and suggest ways to address and overcome these biases. It is unknown to what extent cognitive biases influence clinicians’ decision-making, but some evidence suggests that cognitive biases in medical decision-making might be common.2 In a study of anaesthesiology practice, seven of nine types of cognitive errors selected for observation occurred in more than 50% of observed emergencies.2

Examples of cognitive biases in clinical decision-making

The following cognitive biases typically influence clinicians’ decision-making regarding treatment and management of patients and are thus potential barriers to evidence-based clinical practice (figure 1).

Figure 1. Illustrations of clinicians’ cognitive biases. COPD, chronic obstructive pulmonary disease.

Omission bias

Omission bias is the tendency to judge actions that lead to harm as worse or less moral than equally harmful non-actions (omissions). The inclination is thus towards inaction, following the principle of ‘do no harm’. Kahneman and Miller explained this bias in terms of norm theory: when an action leads to an adverse outcome, it is easy to envision the outcome that would have followed from inaction (‘if only I had not…, then…’), so the action is associated with a strong emotional reaction of regret. When no action is taken, it is harder to envision the outcome that action would have produced, and the emotional response is correspondingly weaker.3

Physicians may fail to prescribe evidence-based preventive treatments because they weigh the risk of potential adverse effects of treatment more heavily than the higher risks of morbidity and mortality associated with the untreated disease (eg, preventive tuberculosis treatment in patients with latent infection,4 anticoagulation for atrial fibrillation in high-risk patients5).

A randomised controlled trial assessed omission bias among 125 pulmonologists who reviewed case vignettes and had to indicate decision preferences for the evaluation of pulmonary embolism and treatment of septic shock.6 The physicians were randomised to different decision options for the same case scenario: in one version the clinicians had to do nothing (omission) for a management decision consistent with guidelines (eg, not ordering a computed tomography (CT) for patients at low risk of pulmonary emboli), in the other version they had to take action for guideline-consistent management (eg, cancelling an ordered, but not yet performed—and not indicated—CT). Physicians were significantly more likely to make a decision not consistent with guidelines when it was presented as an omission option compared with an option to take action (71% vs 53%, p=0.048 for evaluation of pulmonary emboli; 50% vs 29%, p=0.016 for treatment of septic shock).

Status quo bias

The ‘status quo bias’ is closely related to the omission bias. It is the preference for the current state of affairs and can be explained by loss aversion.7 Any change is associated with potential losses and discomfort, and because people are loss averse, the potential losses weigh more heavily than the potential gains.

This bias can drive clinical inertia, for example, when clinicians fail to step down or, conversely, to intensify treatment even when this is indicated (eg, stepping down asthma medication, intensifying treatment for type 2 diabetes mellitus). The status quo bias can additionally be reinforced by the omission bias, for example, when fear of what could happen if the treatment were stepped down (eg, exacerbation after stepping down inhaled corticosteroid therapy in asthma) or intensified (eg, episodes of hypoglycaemia) is an additional barrier to clinicians taking action. A study of real-life data from 7389 patients found that, despite clinical practice guidelines recommending escalation of antihyperglycaemic therapies until glycaemic targets are reached (HbA1c<7%), 53% of patients with HbA1c≥8% and 44% of patients with HbA1c≥9% (after having been on a stable regimen of two oral antihyperglycaemic drugs for at least 6 months) did not have their treatment intensified.8
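To make the guideline logic concrete, the following minimal Python sketch encodes the escalation rule described above as a simple decision-support check. This is an illustration only: the function name and parameters are ours, and the 6-month stability window is taken from the study description, not from any deployed system.

```python
def flag_for_intensification(hba1c_percent: float,
                             months_on_stable_regimen: int) -> bool:
    """Toy rule reflecting the guideline logic described above: escalate
    antihyperglycaemic therapy until the glycaemic target (HbA1c < 7%) is
    reached, once the patient has been on a stable regimen long enough
    (>= 6 months, as in the cited study) for its effect to be judged."""
    return hba1c_percent >= 7.0 and months_on_stable_regimen >= 6

# A patient whom status quo bias (and fear of hypoglycaemia) might leave
# on an inadequate regimen is flagged for review.
print(flag_for_intensification(hba1c_percent=8.4,
                               months_on_stable_regimen=9))  # True
```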

Commission bias

This is the opposite of the omission bias: the tendency to prefer action over inaction. Action is motivated by the wish to avoid regret over a missed opportunity when a treatment is not given or a procedure is not performed, even when the expected effectiveness of the intervention is minimal. The commission bias is potentially an important driver of low-value care, including overinvestigation and overtreatment9 (eg, overtreatment of chronic back pain,10 overscreening for various conditions such as carotid artery stenosis, pre-diabetes and thyroid cancer11).

Availability bias

This bias refers to the tendency to make likelihood predictions based on what can easily be remembered. When asked to judge the probability of an event, people will try to recall events in the past and will associate easily recalled events with a higher probability of occurrence than events that are difficult to remember.12 The problem is that not all easily remembered events occur frequently. The effect of availability bias was demonstrated in a prospective cohort study of 227 patients for whom physicians ordered one or more sets of blood cultures.13 Physicians were asked to intuitively estimate the numerical probability that at least one of the blood cultures from a patient would be positive (bacteraemia). When physicians recalled that they had frequently cared for bacteraemic patients, their probability estimates for bacteraemia were higher than if they did not recall this (36% vs 22%, p=0.0025).
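The mechanism can be illustrated with a toy simulation: if memorable (positive) cases are easier to recall than unremarkable (negative) ones, an estimate based on recalled cases will exceed the true base rate. All numbers below, including the assumed base rate and the ‘memorability’ weights, are invented purely for illustration.

```python
import random

random.seed(0)

TRUE_BACTERAEMIA_RATE = 0.10   # assumed base rate, purely illustrative
N_PATIENTS = 10_000

# Simulate patient outcomes (True = positive blood culture).
outcomes = [random.random() < TRUE_BACTERAEMIA_RATE for _ in range(N_PATIENTS)]

# Positive cases are assumed to be more memorable, so they are
# over-represented among the cases a clinician can easily recall.
RECALL_PROBABILITY = {True: 0.8, False: 0.2}
recalled = [o for o in outcomes if random.random() < RECALL_PROBABILITY[o]]

estimate = sum(recalled) / len(recalled)
print(f"True rate: {TRUE_BACTERAEMIA_RATE:.0%}")        # 10%
print(f"Estimate from recalled cases: {estimate:.0%}")  # ~31%
```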

Framing bias

This bias refers to the fact that people’s reaction to a particular choice varies depending on how it is presented, for example, as a loss or as a gain. If a physician tells a patient that the risk of a brain haemorrhage from oral anticoagulation is 2%, it is likely perceived very differently than if the physician informs the patient that there is a 98% chance that they will not have a brain haemorrhage on treatment. Framing relates not just to whether an event is presented from a positive or negative angle, but also to other aspects of how information is communicated, for example, whether absolute or relative risk estimates are provided. A trial that randomised 1431 doctors to differently framed risk presentations comparing an old and a new HIV drug showed that the proportion of doctors who rated the new HIV drug as more effective varied by risk presentation format (framing).14 When the relative mortality reduction was presented, 94% of doctors rated the new drug as more effective, compared with 52% when the absolute survival with the new and the old drug was presented (p<0.001).14
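The arithmetic behind such framing effects is simple, as the following sketch shows. The mortality figures are hypothetical, chosen only to illustrate how the same trial data can be framed as a relative reduction, an absolute reduction or a survival rate.

```python
# Hypothetical trial results: mortality with the old vs the new drug.
deaths_old, n_old = 80, 1000   # 8% mortality on the old drug
deaths_new, n_new = 60, 1000   # 6% mortality on the new drug

risk_old = deaths_old / n_old
risk_new = deaths_new / n_new

rrr = (risk_old - risk_new) / risk_old   # relative risk reduction
arr = risk_old - risk_new                # absolute risk reduction

print(f"Relative framing: the new drug cuts mortality by {rrr:.0%}")      # 25%
print(f"Absolute framing: mortality falls from {risk_old:.0%} to {risk_new:.0%}")  # 8% -> 6%
print(f"Gain framing: {1 - risk_new:.0%} of patients survive on the new drug")     # 94%
```

The same 2-percentage-point difference sounds far more impressive as a ‘25% reduction in mortality’, which is exactly the pattern the cited trial observed in doctors’ effectiveness ratings.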

Solutions to overcoming cognitive biases

The foundation for overcoming heuristic biases is the dual process theory of decision-making,15 16 which posits that medical decisions are a function of both (A) an intuitive process, which is fast, reflexive and requires minimal cognitive input, and (B) an analytical process, which requires more conscious, slow and deliberate effort. Intuitive processing, while valuable in making everyday decisions, generates the kinds of biases described above.

Solutions to this problem consist of a series of cognitive interventional steps, termed ‘cognitive debiasing’, a process of creating awareness of an existing bias and intervening to minimise it.

Cognitive debiasing strategies can be broadly categorised into educational strategies, real-time workplace strategies and real-time strategies for individual decision makers (Box 1).16

Educational strategies aim to make physicians aware of the risk of bias and to enhance their future ability to ‘debias’. For example, a small study among 15 emergency medicine residents used a simulation scenario designed to lead the residents into a cognitive error trap.17 After going through the scenario, they were debriefed and received instructions on how to mitigate the effect of their cognitive biases in future decision-making. While the residents ranked the value of the training highly, the study did not assess the impact of the intervention on future decision-making. More generally, owing to a lack of studies in this area, there is no evidence on whether educational debiasing strategies are effective in a medical context.

Workplace strategies aim to integrate debiasing strategies at a system level. Examples include slowing down strategies, such as a planned time out before a surgical procedure,18 19 decision support systems integrated in the electronic medical record20 and checklists (eg, checklist to decrease catheter-related bloodstream infections,21 surgical safety checklist22). While evidence of the effectiveness of a surgical time out (a pause immediately before the planned procedure to ascertain accurate patient identity, surgical site and planned procedure) from comparative trials is lacking, excellent results (eg, no wrong-site surgeries) have been documented in cohort studies after implementation of surgical time outs.18 19 The effectiveness of decision support systems is supported by randomised controlled trials. For example, a randomised controlled trial that evaluated clinical decision support for radiography in acute ankle injuries integrated into the physician order-entry system found that clinical decision support improved guideline adherence for ankle radiography (93% vs 62%, p=0.02) and foot radiography (81% vs 64%, p<0.01).20
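As an illustration of how such order-entry decision support can be wired in, here is a minimal sketch loosely modelled on a clinical decision rule for ankle radiography (in the spirit of the Ottawa ankle rules). The class, field names and the deliberately simplified criteria are ours, not those of the system evaluated in the cited trial.

```python
from dataclasses import dataclass

@dataclass
class AnkleInjuryFindings:
    # Field names are illustrative, not from the cited order-entry system.
    malleolar_zone_pain: bool
    malleolar_bone_tenderness: bool
    unable_to_bear_weight: bool

def xray_indicated(f: AnkleInjuryFindings) -> bool:
    """Simplified, Ottawa-style criterion: radiography only if there is
    malleolar zone pain AND either bone tenderness or inability to bear
    weight (a reduced version of the published rule, for illustration)."""
    return f.malleolar_zone_pain and (
        f.malleolar_bone_tenderness or f.unable_to_bear_weight
    )

# At ordering time, the system interrupts a reflexive "order the X-ray"
# default and forces a conscious check against the guideline.
case = AnkleInjuryFindings(malleolar_zone_pain=True,
                           malleolar_bone_tenderness=False,
                           unable_to_bear_weight=False)
if not xray_indicated(case):
    print("Decision support: ankle radiography not indicated for this presentation.")
```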

The use of a checklist for infection-control practices has been shown in a multicentre before-after study to significantly reduce the mean rate of catheter-related bloodstream infections per 1000 catheter-days (from 7.7 at baseline to 1.4 at 16–18 months of follow-up; p<0.002).21 Introduction of a surgical safety checklist in a multicentre before-after study has been found to decrease surgical mortality from 1.5% before the checklist was introduced to 0.8% afterwards (p=0.003).22 Inpatient complications decreased from 11.0% of patients at baseline to 7.0% after introduction of the checklist (p<0.001).

Real-time debiasing strategies used by individuals are based on cognitive forcing functions: the decision maker forces conscious attention to information before taking action.16 An example would be to seek evidence to support a decision opposite to one’s initial decision preference before making a final decision.16 There is, however, no evidence of the effectiveness of these strategies in a medical context.

Box 1

Potential solutions to overcoming cognitive biases (debiasing strategies)

Educational strategies

  • Definition: Learning about cognitive biases, developing insight into how these biases may affect your own decision-making (learning to think about how you think, ie, metacognition) and discussing strategies to mitigate the impact of biases, for example, forcing strategies (consciously examining your thought processes to override the initial intuitive response to a problem and forcing yourself to consider alternatives).

  • Examples: Cognitive tutoring, simulation training.

  • Evidence: There is no evidence of the effectiveness of educational strategies for debiasing in a medical context.

Real-time workplace strategies

  • Definition: Debiasing strategies are embedded in the healthcare system and force the decision maker to take a step back to override potential cognitive biases.

  • Examples: A planned time out in the operating room, decision support systems integrated in the electronic medical record, checklists (checklist to decrease catheter-related bloodstream infections, surgical safety checklist and others).

  • Evidence: There is evidence of the effectiveness of these interventions from randomised controlled trials (for clinical decision support) and cohort studies (for time out before a procedure, checklists).

Real-time strategies for individual decision makers

  • Definition: Deliberate real-time reflection by decision makers on their own thinking, forcing conscious attention to information before taking action. Unlike real-time workplace strategies, these strategies are not enforced by the healthcare system.

  • Example: Consider the opposite (seek evidence to support a decision opposite to your initial decision preference before making a final decision).

  • Evidence: There is no evidence of the effectiveness of these debiasing strategies in a medical context.

References

  1.
  2.
  3.
  4.
  5.
  6.
  7.
  8.
  9.
  10.
  11.
  12.
  13.
  14.
  15.
  16.
  17.
  18.
  19.
  20.
  21.
  22.

Footnotes

  • Contributors CCD conceived the idea. CCD, CCK and ASM drafted and revised the manuscript.

  • Funding This study was funded by the National Health and Medical Research Council (Fellowship for CCD (APP1123733)).

  • Competing interests None declared.

  • Patient consent Not required.

  • Provenance and peer review Not commissioned; externally peer reviewed.