
How to Make Better Clinical Decisions



How do we make clinical decisions? We would like to think it is through well-reasoned, logical decision making. Rationality can be defined as "acting in a way that helps us achieve our goals, which in the clinical setting typically means a desire to improve our health."[1] Meanwhile, clinical decision making (CDM) can be defined as "a highly complex, multi-faceted skill that is developmental and requires a substantial amount of practice with realistic patients to develop."[2] The two are supposed to go hand in hand. Yet our emotions are often the decision-makers. Consciously, we believe we are making the rational choice, but the research is clear: when it comes to decision making, rationality often falls short.


There are two primary models of clinical decision making supported in the medical field: Fast and Frugal and Dual Process. What follows is a breakdown of those decision-making strategies and how we can use them to our advantage. My goal with this article is to help you recognize the influence bias and emotion have on your clinical decision making. In turn, this will help our practice fulfill our core value of Live Clinically.


"The prize is the pleasure of finding things out, the kick in the discovery, the observation that other people use it." – The Pleasure of Finding Things Out by Richard Feynman

Fast and Frugal Heuristics

The Fast and Frugal system emphasizes quick decisions drawn from our intuition. Our gut reaction is often the correct one, even if we cannot explain why we made that decision. The theory was first proposed by Gigerenzer.[3] To better understand Fast and Frugal decision making, we first need to understand heuristics.


Heuristics are decision-making strategies you employ every day. The goal of heuristics is to improve our decision making by filtering out unnecessary information and taking shortcuts. Take a knee evaluation as an example. How do you pare down the tests you use in an examination? What leads you to use one educational approach versus another? Why do your educational strategies differ now from when you were a student on clinical rotations?


There are several answers to these questions. One is that your efficiency has improved. The 'soft skills' have been fine-tuned – or completely overhauled – and you can build rapport with your patients. Another is that you have learned which tests are credible and efficacious. In school, we are taught the tests and measures needed for our board examination. Once we start treating, we learn that many of those tests and measures lack validity and reliability, and that their diagnostic utility is limited. The same is true for interventions. That process never stops: we progress our evaluation and treatment approaches as new evidence comes to light. Lastly, a primary reason for changes in evaluation and treatment strategies is the development of intuition.


Intuition is nothing more than recognition. As we build experiences, we build intuition. Our memory banks fill with experiences we can draw upon at a later date. While no two patients look the same, there are similarities between patients and situations. Fast and Frugal heuristics depend on these memories.


When you develop expertise, gut reactions become more common. You can experience gut reactions in a job, sports, or a relationship. It is a feeling deep down that urges you to take a particular action. What is fascinating about gut reactions is we struggle to explain them. When asked why we made a particular decision, we are unable to provide a concrete answer. In his book Blink, Malcolm Gladwell describes many of these situations.


It is remarkable how little information is needed to make some complex decisions and inferences. John Gottman's research on divorce is a perfect example. Gottman can predict with roughly 90% accuracy whether a married couple will divorce after watching a single 15-minute conversation between them. He does this through thin-slicing – our unconscious ability to find patterns in situations based on short experiences. In other situations, mere seconds are needed.


Art curators can spot expert forgeries within seconds of viewing a piece from several feet away. When asked how they knew, they are often unable to articulate the exact reason; it was a "gut" feeling. Upon closer examination, given time, they can explain the specific signs of the forgery. Similar experiences can be found across the professional spectrum. As we gather experience and expertise, we build intuition. While these gut reactions can be impressive, they are far from foolproof. A key requirement is the correct context. Gottman can't predict divorce from just any 15-minute interaction between a married couple. Observing them while they watch a TV show or play tennis is different from watching a 15-minute one-on-one conversation. Fast and Frugal requires expertise and context.


In healthcare, Fast and Frugal heuristics have largely been explored in medicine, particularly where rapid decisions are necessary. In triage situations, delayed decision making can be the difference between life and death. ED physicians and paramedics often lack the luxury of time to reflect and weigh options. They need to make decisions based on a limited set of information. In some cases, more information can be detrimental. This is where the frugal component of Fast and Frugal comes into play.


Emergency situations benefit from decision trees. Take the following graphic as an example:[4]


[Graphic: fast-and-frugal decision tree for coronary care unit admission, from Green and Mehr]


This fast-and-frugal tree is used by emergency physicians to detect acute ischemic heart disease. Many other tests and measures could be used, and there are subtleties in a patient's presentation and history, but that extra information is white noise. The decision tree boils the information down to what is needed in the moment. This type of ED decision tree was first explored in 1982 by Lee Goldman.
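
To show how little information the tree actually uses, here is a minimal sketch of a three-question fast-and-frugal tree in Python. The cue names are adapted from Green and Mehr's published tree, but the criteria are simplified for illustration and are not meant for clinical use.

def ccu_decision(st_segment_change, chest_pain_chief_complaint, other_risk_factor_present):
    """Fast-and-frugal tree for coronary care unit (CCU) admission.

    Cues are checked one at a time, and each cue can trigger an exit.
    Cue names are adapted from Green and Mehr (1997); the logic is a
    simplified illustration, not a clinical tool.
    """
    if st_segment_change:                # first cue: ST-segment change on the ECG
        return "coronary care unit"
    if not chest_pain_chief_complaint:   # second cue: is chest pain the chief complaint?
        return "regular nursing bed"
    if other_risk_factor_present:        # third cue: any additional risk factor present?
        return "coronary care unit"
    return "regular nursing bed"

# Example: chest pain, but no ECG changes and no additional risk factors
print(ccu_decision(False, True, False))  # -> regular nursing bed

Note that the tree never weighs or combines cues; each question either settles the decision or passes the patient to the next question.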


If a patient is showing signs of an MI or a stroke, immediate action is needed. It is not the time for goal setting or determining the patient's understanding of psychosocial influences of pain. This may seem obvious, but what about more conflicting information? What if a patient has chest and arm pain but a normal ECG? What information is most valuable? The physician needs to filter out unnecessary or potentially misleading information and have a systematic way of reaching a decision quickly. Goldman developed and tested various decision trees to determine which questions allow physicians to quickly make decisions regarding an MI. When research compares physician decision making to the algorithm, there is no contest: Goldman's decision tree is 70% more effective at identifying heart attacks. Physicians accurately diagnosed an MI 75-89% of the time, while the algorithm was right 95% of the time.


As you can imagine, Goldman received an immense amount of resistance. How could a few questions predict whether a patient is having an MI better than expert physicians? It is important to remember the context. Frugal decision making works well when variables are confined. There can be many variables, but they are of similar types. Homing in on specific information works well for biomedical models. Psychosocial influences are not a concern in emergency situations. The Canadian C-Spine and Ottawa ankle rules are effective because they are simply screening for a fracture. Move to something as complex as low back pain and it becomes clear why clinical prediction rules continue to fail validation.


Now we start to see where the blending of fast and frugal becomes beneficial. The combination of intuition and heuristics is powerful. Unfortunately, there are significant limitations. What happens if you are in a complex situation, such as treating chronic low back pain, and you lack expertise? For any clinician with less than 5-10 years of experience, intuition is limited. You simply haven't built the memory bank of experiences. If you try to use simple heuristics, you will miss valuable information. If you rely on your intuition, your limited experience, cognitive biases, and emotions will lead you astray. We need another method for improving clinical decision making.


"Intelligence is not only the ability to reason; it is also the ability to find relevant material in memory and to deploy attention when needed." – Thinking, Fast and Slow by Daniel Kahneman

Dual Process Theory

In place of Fast and Frugal, the Dual Process model contends we have two systems of the mind.[5] System 1 is the quick decision-making system. It draws upon heuristics – rule-of-thumb decision-making strategies. While System 1 can help us make quicker decisions by filtering out unnecessary information (the premise of Fast and Frugal), it is prone to error. Bias is the error in a heuristic-driven decision.


To combat the errors made by System 1, we can lean on System 2. This system is the slow, critical thinking system. It is arduous but more accurate. It involves reflection and assessment of all the known details. It does not rely on heuristics. While System 2 is more accurate than System 1, it cannot be universally used as we often need to make snap decisions.

Pattern recognition is inherently biased. What follows is a handful of heuristic approaches that often fall victim to bias. I have included clinical examples.


Anchoring:

Anchoring is the tendency to focus heavily on the initial information received for a given situation.


Anchoring heavily influences patients and clinicians. A physical therapist may prescribe 12 visits of physical therapy over three months after the initial evaluation. The response of the patient will be determined by prior expectations. If the referring physician told the patient to expect three visits over three weeks, they will be resistant to the plan of care, regardless of the reasoning the therapist gives. If the referring physician told the patient to expect 8-10 visits, they will be more accepting of the 12-visit recommendation.


"If you want to persuade, appeal to interest not to reason." – Poor Richard’s Almanack by Benjamin Franklin

Clinicians may cling to the anchors set while developing the initial plan of care. We are all prone to the planning fallacy, the tendency to overestimate results and underestimate the time and effort needed to achieve them. We are resistant to updating the plan of care, even as new evidence comes to light. This builds off the representative and confirmatory heuristics that cause us to cling to initial patient assessments (more on this shortly).


The halo effect is a version of the anchoring bias. Our first impression of someone is hard to dismiss. We often ascribe unknown beliefs and characteristics to a patient based on the few we glean in the first interaction of the first visit, a type of representative bias. What is your response to someone who walks into the clinic wearing sunglasses during the day and has fibromyalgia on their referral? Do you make inferences about the entire plan of care at that moment? The visceral bias is when we form negative associations about a patient and then treat them differently. It may be subtle, such as spending less time with them during exercises, but the patient will likely notice. As with information, a first impression is difficult to shake.


"Scientific knowledge is a body of statements of varying degrees of uncertainty/certainty - some most unsure, some nearly sure, none absolutely certain." – The Pleasure of Finding Things Out by Richard Feynman

Availability:

The availability heuristic is the tendency to overestimate the frequency of things that easily come to mind.[6] Furthermore, it is assumed the more easily something comes to mind, the more important it must be. The availability heuristic changes the tests clinicians order, the treatments they provide, and the education they relay to patients. A clinician will display a bias towards the information they have most recently studied – the recency effect – or information they are most interested in. For example, the week after becoming dry needling certified, miraculously, all your patients become prime dry needling candidates. A clinician may alter perceived base rates – the established rate of prevalence in the research – to comply with their perceived frequency and importance of information.[7]


As stated, a common example of the availability bias is witnessed after a clinician completes a continuing education course. Having recently attended the course (recency effect) and invested time, money, and effort (sunk cost), the clinician will be apt to use the information gained. The investment and recent information increase the perceived importance, and the clinician will search for information confirming their beliefs (confirmation bias). A clinician will likely halt the examination after completing the recently learned assessments (satisfaction of search) and lean towards the newly acquired treatment skills, regardless of whether they are truly the best option (base-rate neglect). The availability bias is heavily influenced by many other biases and is a springboard for continued cognitive errors.[8]


"The world in our heads is not a precise replica of reality; our expectations about the frequency of events are distorted by the prevalence and emotional intensity of the messages to which we are exposed." – Thinking, Fast and Slow by Daniel Kahneman



Representativeness:

The representative heuristic is the "assumption that something that seems similar to other things in a certain category is itself a member of that category."[9] The purpose of the heuristic is to home in on the patient characteristics most likely associated with the patient's complaints and the best course of future action. However, the representative heuristic often limits the thoroughness of a clinician's search. Furthermore, if early findings do not match the representation in the clinician's head, they will be led further astray or simply delay making decisions (analysis paralysis).[10]


The representative heuristic manifests in multiple associated biases, primarily base-rate neglect. Base-rate neglect is ignoring established prevalence rates in favor of new information you perceive to be more important.[7] For example, 90% of acute low back pain resolves within six weeks without intervention.[11] An overweight patient may enter the clinic drinking a soda and explain they do not like to exercise. Drawing on past experiences, the clinician categorizes this patient as someone who will not recover without intervention. While the odds may no longer be 90%, they are certainly not 0%, yet the conversation and future decisions of the clinician will be structured as if they were. To combat base-rate neglect, clinicians can employ Bayesian reasoning. This involves starting with a base rate and updating that rate with new information. You make small changes in probability to account for the new information, but you do not ignore the established base rate.[12]
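
To make that updating concrete, here is a minimal sketch of Bayes' rule on the odds scale in Python. The 90% base rate is the low back pain figure cited above; the likelihood ratio attached to the patient's presentation is an invented illustrative number, not a value from the literature.

def update_probability(prior, likelihood_ratio):
    """Update a base rate with new evidence using Bayes' rule on the odds scale."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Base rate: ~90% of acute low back pain resolves within six weeks without intervention
base_rate = 0.90

# Hypothetical likelihood ratio for an unfavorable presentation (e.g., low activity level).
# The value 0.5 is purely illustrative.
updated = update_probability(base_rate, likelihood_ratio=0.5)
print(f"{updated:.0%}")  # roughly 82% - lower than 90%, but nowhere near 0%

The point of the arithmetic is the same as the point of the paragraph: new information should nudge the base rate, not replace it.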


"Human beings are quick to perceive patterns where they don't exist and to overestimate their strength where they do." – How Not to Be Wrong by Jordan Ellenburg

Confirmatory:

The confirmation bias is the tendency to accept information that confirms our current beliefs and reject information that refutes them. We even frame future inquiries to increase the likelihood of supporting previous judgments.[6] The confirmation bias feeds into many similar biases. Theory-induced blindness is the phenomenon in which, once you have accepted a theory and used it as a tool in your thinking, it is extraordinarily difficult to notice its flaws. The Semmelweis reflex is the tendency to reject new evidence that contradicts an established norm, belief, or paradigm. The outcome bias is the tendency to evaluate the quality of a decision after learning the outcome. Clinicians will often take credit for good outcomes and explain away poor outcomes. The emphasis on outcomes and devaluing of the process is common as well. It is the "ends justify the means" bias.


The sunk cost fallacy falls within the category of the confirmatory heuristic. Our current actions are used to justify our past investments of time, money, and effort. The sunk cost fallacy is an unwillingness to cut ties with a past investment – or sunk cost – and look forward to future influences. For example, a therapist may use outdated treatments learned in school, regardless of the treatment's efficacy, to justify the loan debt accrued. Similarly, we cling to investments in certification courses and newly developed techniques. The sunk cost fallacy is one of the drivers behind the high prevalence of modalities, kinesiotape, and dry needling – in addition to the outcome bias and placebo effects.


"Believe whatever you believe by day, but at night, argue against the propositions you hold most dear." – How Not to Be Wrong by Jordan Ellenburg


Influence of Environmental Factors

Our emotions and cognitive biases are not the only factors to consider when assessing our clinical decision making. For example, research shows that sleep deprivation impairs our ability to make decisions.[13] Even our ability to use System 2 – to slow down and critically think through a situation – will be impaired. We will be slow in drawing on past experiences and employing heuristics. We will be prone to additional errors and bias. Making matters worse, sleep deprivation impairs our technical skills.[14] This is only one of the many intrinsic and extrinsic factors that influence clinical decision making. There are two I want to focus on before closing with strategies to improve our clinical decision making and combat cognitive errors.


How Clinic Setting Influences CDM

Our work environment influences our cognitive load. If we are in a busy environment, we have many pieces of information to process at a given moment. This is one of the problems with treating multiple patients at once. I am not saying it cannot be done, but research is clear that a heavy cognitive load impairs decision making.[5] Experience and intuition play a role.


A novice clinician will struggle with the higher cognitive loads of busy environments, as they cannot rely on System 1 thinking. They do not have a vault of clinical experiences to fall back on, so Fast and Frugal decision making will be impaired. This leaves us with a few options: 1) we can provide more algorithms to ease decision making; 2) we can partner the novice clinician with a senior clinician who can review decisions and provide guidance; 3) we can lighten the novice clinician's caseload to allow time for reflection and System 2 thinking.


Even expert clinicians have a saturation point for cognitive load. We must consider all the tasks someone is trying to complete at a given time. Multi-tasking is not possible; our brains can only focus on a single task at a time. The perception of multi-tasking is a repeated, rapid switching of attention. The more tasks we try to complete at once, the higher the "switch cost" will be. Switch cost is the reduction in performance accuracy or speed that results from shifting between tasks.[15] Our ability to quickly process information and recognize situations improves our ability to handle multiple tasks, but we still need to consider the overall cost. Consider a manager who is trying to treat a full caseload, train new staff, answer questions for the front office employee, and write notes. All of this cannot be accomplished at once, at least not well. That is the key: tasks can be completed, but are they completed well?





At the heart of clinical decision making is ethics. What do we define as high-quality care? What is the lens by which we decide what cognitive load is appropriate? Completing a task does not mean it was completed well. Quality care is not dichotomous either. Service is not simply good or bad; there are many gradations of clinical quality. This brings me to the second environmental influencer of clinical decision making: culture.


How Culture Influences CDM

Expectations influence our experiences. Anchoring is a form of developing expectations. The culture a clinician works in will influence their clinical decision making. What are the expectations for caseload, speed of clinical decisions, clinical quality, customer service, and employee engagement? We prioritize our actions based on incentives.


Incentives are not strictly monetary. An incentive can be the fulfillment of completing a task. A thank-you note, a positive Google review, an ideal work shift, additional responsibilities, and structured mentorship are all versions of incentives. Incentives are rewards for behaviors we want repeated. Let's bring this back to clinical decision making.


What are the behaviors we want clinicians repeating in the clinic? What type of treatment do you expect to deliver? What about your colleagues? Reflect on what motivates you and the clinicians around you. Are the incentives aligned with your values and behaviors?


All clinical decisions should start with, "What is the best thing for the patient in front of me?" From that lens, we refine our decisions. It is like the Evidence-Based Medicine funnel.


Start with treatments that are supported in the evidence, then filter with your expertise and experience, and finish with patient values and expectations. All treatments should run through those filters, but it starts with objective evidence.


Our decisions should filter through our core values. This ensures we deliver patient-centric care and serve our communities to the best of our abilities. However, even the best intentions can be led astray by the aforementioned cognitive biases. I am going to close with some strategies for tackling bias and improving clinical decision making.





Tackling Bias and Improving Clinical Decision Making

Croskerry lists three requirements for clinicians to optimize clinical decision making. First, clinicians must acknowledge the influence of bias and susceptibility to cognitive errors when making clinical decisions. Second, clinicians must understand errors are not inevitable. Third, clinicians must remain vigilant and believe solutions to reduce bias are viable.[10]


This article and the resources included serve as step one: acknowledging bias. Take time to review the various types of bias that influence clinical practice. You will not eliminate bias, but you can learn to recognize it and navigate cognitive minefields.


One of the best tools for navigating biases is reflection. Effective reflection involves consciously considering alternative diagnoses and treatment options. Force yourself to list the alternatives, even if intuitively they seem far less likely. This cognitive forcing temporarily contains cognitive biases, particularly the confirmation and satisfaction of search biases. To enhance the reflection, gather external opinions from colleagues and mentors. We are limited by our perspective and experiences. Building a collaborative approach – while avoiding groupthink – can refine clinical decision making.


As you build experience and develop reflection habits, the time and effort required will decrease. Do not become complacent, however. No level of experience or study will ever make you immune to cognitive bias and decision-making errors.


"Scientists must be able to answer the question "What would convince me I am wrong?" If they can't, it's a sign they have grown too attached to their beliefs." – Superforecasting by Philip Tetlock and Dan Gardner


References


1. Djulbegovic B, Elqayam S. Many faces of rationality: Implications of the great rationality debate for clinical decision-making. J Eval Clin Pract. 2017;23(5):915-922.

2. Huhn K, Black L, Christensen N, Furze J, Vendrely A, Wainwright S. Clinical Reasoning: Survey of Teaching Methods and Assessment in Entry-Level Physical Therapist Clinical Education. Journal of Physical Therapy Education. 2018;32(3):241-247.

3. Raab M, Gigerenzer G. The power of simplicity: a fast-and-frugal heuristics approach to performance science. Front Psychol. 2015;6:1672.

4. Green L, Mehr DR. What alters physicians' decisions to admit to the coronary care unit? J Fam Pract. 1997;45(3):219-226.

5. Croskerry P. A universal model of diagnostic reasoning. Acad Med. 2009;84(8):1022-1028.

6. Gorini A, Pravettoni G. An overview on cognitive aspects implicated in medical decisions. Eur J Intern Med. 2011;22(6):547-553.

7. Croskerry P. Achieving quality in clinical decision making: cognitive strategies and detection of bias. Acad Emerg Med. 2002;9(11):1184-1204.

8. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 1: origins of bias and theory of debiasing. BMJ Qual Saf. 2013;22 Suppl 2:ii58-ii64.

9. Klein G. Naturalistic decision making. Hum Factors. 2008;50(3):456-460.

10. Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med. 2003;78(8):775-780.

11. van Tulder M, Becker A, Bekkering T, et al. Chapter 3. European guidelines for the management of acute nonspecific low back pain in primary care. Eur Spine J. 2006;15 Suppl 2:S169-191.

12. Hornberger J. Introduction to Bayesian reasoning. Int J Technol Assess Health Care. 2001;17(1):9-16.

13. Goel N, Rao H, Durmer JS, Dinges DF. Neurocognitive consequences of sleep deprivation. Semin Neurol. 2009;29(4):320-339.

14. Whelehan DF, McCarrick CA, Ridgway PF. A systematic review of sleep deprivation and technical skill in surgery. Surgeon. 2020;18(6):375-384.

15. Madore KP, Wagner AD. Multicosts of Multitasking. Cerebrum. 2019;2019.

