
The Art of the Clinical Debate




Punching People in the Face with Facts Doesn’t Work

Allow me to paint a picture of a typical clinical debate. After spending hours the previous weekend thoroughly reviewing the literature on the best approach for managing high ankle sprains, you are ready to put your colleague in their place. They have been using outdated approaches with little to no support in the literature. Every previous attempt has been met with anecdotal rebuttals and references to their vast experience and past success stories. Well, not today. This time you have the evidence on the tip of your tongue, and you are ready to punch them in the face with facts. Fast forward to the end of the day: your relationship is in shambles and nothing about their clinical practice has changed. What happened? As satisfying as the lead-up and the visualization of the event may be, the actual act of fact punching does not work (I have much experience in this area of failure).


We have our beliefs for a reason. Perhaps it is due to past experiences and education, how we were raised, our culture, our religious beliefs, or advice we have received from a mentor. Regardless, once something takes hold in our minds, it is quite challenging to dislodge it. This resistance is a result of a phenomenon called cognitive dissonance.



Cognitive dissonance in clinical practice


Cognitive dissonance describes the discomfort that arises when your attitudes, beliefs, or behaviors are in conflict. It typically results either when our behavior and beliefs do not align (someone continues smoking despite knowing it is harmful), or when a belief is challenged with sound reasoning, yet we guard that belief even if it is irrational.


People are motivated to defend their attitudes, beliefs, and behaviors. Attitudes are an individual's evaluation of an entity (e.g., ultrasound); beliefs are associations between an entity and an attribute or outcome (e.g., ultrasound is an effective treatment); and behaviors are overt actions performed in relation to an entity (e.g., I will use ultrasound in treatment).[1] We often experience inconsistencies between our perceptions and our values, commonly in interactions with patients, physicians, coaches, parents, and educators. Here, I will focus on speaking with a fellow therapist, but you will be able to see the vast application opportunities.


Debates are never fully objective



One of the primary complicating factors with “dropping knowledge” on someone is the investment they have already made in developing their current mindset. Keep in mind, investment comes in many forms: independent study they have completed, money poured into past schooling and courses, or time spent practicing and using a particular clinical approach. If I use a treatment methodology repeatedly, it is because I believe there is value in it. Understand, value is relative here.


We may see someone providing an outdated technique and think “they are simply lazy and don’t care about clinical quality.” The clinician, however, may genuinely think the treatment is sufficient and anything else is a waste of time and effort. They may have different priorities than you, but they still care about the improvement of their patients.


We also need to consider that we often tie the value of the interventions we choose to our perceived value as clinicians. If my intervention choices are attacked, then I am personally attacked. We become defensive when someone presents a contrasting viewpoint because the confrontation can be interpreted as an attack on us as individuals. People often respond more intensely to threatening information that disconfirms their desired view than to congenial information that confirms it.[2] While you may intend to critique only someone's choices, that person may feel their value as a professional is under fire. Perception is often reality, and once that conclusion is drawn, the conversation is dead in the water.


While one treatment may have more evidence to support it, take a step back and consider the attractiveness of the choices from the clinician's viewpoint. Perhaps their current, outdated choices have worked in the past (maybe via placebo), or perhaps they are heavily influenced by selective memory and confirmation bias; it doesn't matter. The clinician is comfortable using the treatment, and their patients come back for more sessions. If we assume they want to be a good clinician (most do), then while using an evidence-based treatment would be attractive, both approaches offer benefits.


When faced with a situation between two options that have justification, people tend to favor the option that supports their previous choices. We want to remain consistent with our previous actions and we will favor reasons that justify our choices. Throwing (or punching) facts at someone won’t change that approach.


At the 2019 San Diego Pain Summit, Dr. Jonathan Fass gave a talk comparing how we approach colleagues to how we approach patients with chronic pain. He started by providing the International Association for the Study of Pain (IASP) definition of pain: “an unpleasant sensory and emotional experience associated with actual or potential tissue damage, or described in terms of such damage.” He then proposed a similar definition for dissonance: “an unpleasant cognitive and emotional experience associated with actual or potential social or self-identity damage or described in terms of such damage.” We wouldn't tell a patient in pain “your pain is all in your head, here is all the evidence to support why it isn't actually tissue damage or a problem, just move and you will be fine,” yet we approach our colleagues with “your treatment approach is wrong, here is all the evidence to support why you should actually do this instead, once you make the change your patients will do much better.” We may not be quite that blunt and short, but the point remains the same.


Just as patients can be sensitized and guarded with respect to pain, clinicians can be sensitized and guarded with respect to their beliefs. If we play the long game, have a conversation, and talk to our colleagues the same way we would a patient or a loved one, we may find the conversations yield more positive outcomes.


What is the goal?



In the paper “Feeling validated versus being correct: a meta-analysis of selective exposure to information,” the authors explain the phenomenon of useful versus accurate information.[1] When presented with new information, we typically care far more about its usefulness than its accuracy. Let that sink in for a minute. Now think back to your past experiences and discussions and see if you can spot examples of this occurring. If someone is using a treatment approach that yields successful outcomes, they are unlikely to search for, or listen to, information on a “more successful” intervention. The authors concluded that we are more likely to display a confirmation bias when the information is useful to us and more likely to abandon that bias when it is no longer useful.


In their book Superforecasting, Philip Tetlock and Dan Gardner dive into the art of prediction and the qualities necessary to make accurate forecasts. It is chock-full of valuable insight on evaluating information and adjusting your mindset. At the end, they pose an important question that is quite applicable to the topic at hand: which do you want more, to be right or to know the truth? This leads us to one of the most important qualities needed to change behavior and clinical approaches: curiosity.


Knowledge vs. Curiosity

When comparing scientific knowledge to scientific curiosity, the latter stands out as more important for enhancing learning. Two quotes pop into my head when I think of learning versus curiosity and they happen to be by two of the most influential minds of the 20th century. Albert Einstein stated, “The more I learn, the more I realize how much I don’t know.” Richard Feynman continued the sentiment stating, “with more knowledge comes a deeper, more wonderful mystery, luring one on to penetrate deeper still.”


Both Einstein and Feynman were brilliant physicists who pushed the boundaries of science and our understanding of the universe. Both were incredibly smart and possessed a wealth of knowledge (fun fact: Richard Feynman's IQ was reportedly only 125, compared to Einstein's impressive, albeit best-guessed, value of 160). However, it was their curiosity that set them apart. Einstein was famous for the thought experiments (gedankenexperiments) he conducted throughout his career, which led to the theories that make him famous today (some of his greatest thought experiments and discoveries occurred while he was working at the patent office). The Pleasure of Finding Things Out, a collection of Feynman's most famous works and lectures, sums up his philosophy on curiosity well, and I highly recommend it for all clinicians. Both Einstein and Feynman understood that curiosity, not knowledge, was the foundation of their success. The same is true for practitioners.


It is scientific curiosity that drives our pursuit of improving our clinical practice. When we get frustrated with someone's clinical approach (I am guilty of this on many occasions), it is easy to think the individual lacks the skill or intelligence necessary to be a superior clinician. This may seem harsh, and by “lack of intelligence” I do not mean stupid; I simply mean there are different levels of ‘smart.’ However, it is often not intelligence that people lack, but curiosity. Many smart individuals quickly plateau in their progress as clinicians because they lack curiosity.


Why do some lack curiosity?



In her book Dare to Lead, Brené Brown proposes a reason not commonly considered in this arena: her research has led her to conclude that curiosity is an act of vulnerability and courage. That research shows curiosity is correlated with creativity, intelligence, improved learning and memory, and problem-solving, and my own literature reviews have yielded similar conclusions.[3]


In order to be curious, we must be aware of a need for information. If clinicians believe they have mastered their craft and have no need to advance their practice, they will not seek out new information and will resist any presented to them.


We are not curious about things we do not desire, just as we are not curious about information we are unaware of or know nothing about. This is where vulnerability plays a role. It is far easier to resist change than to admit there is a gap in our knowledge. If someone presents evidence for a novel treatment technique, we may be ashamed of our inability to perform it or of knowing nothing about it. Rather than admit we are novices and start from the bottom, it is easier to remain comfortable. Again, fact punching will not work here.


This does not mean all hope is lost, but rather that we must facilitate curiosity. Psychologist George Loewenstein wrote, “To introduce curiosity about a particular topic, it may be necessary to ‘prime the pump’.” We accomplish this by using intriguing information to lure the individual in. A key point here: the information must be intriguing to them, not to you. Presenting gaps in knowledge can motivate individuals to find the answers and become actively engaged in the subject; this applies to clinicians, educators, students, and patients. This ‘information gap’ theory has limitations, however.


Curiosity is maximized when the gap is relatively small. The more someone knows about a subject, the less likely they are to accept something radically different, as their inherent biases have a stronger grip on their beliefs. How do we address this? Play the long game. If you are met with resistance, that is okay. Whether a clinician is educating a patient with chronic pain or approaching a colleague about outdated treatments, we cannot expect dramatic changes from a single conversation.


There is a very important piece that I have purposefully left until the end.


No one is immune to bias and logical fallacies

At this point, you may have jotted down some ideas or visualized a conversation (or series of conversations) with a colleague or friend whose behavior you are trying to change. There is one little issue that needs to be addressed first: cognitive biases do not only happen to other people. They happen to all of us. Your colleague or patient may be thinking the same things about you. We often think “they are close-minded” or “they are suffering from confirmation bias.” We all are. We all have reactionary thoughts and actions. We all suffer from cognitive dissonance. We all want to believe our way is correct and that we are in the right. Once we recognize this, we can begin to move toward what really matters: improving the quality of care.


Stop trying to “win” conversations



Before concluding, I would like to return to Superforecasting and being right versus winning. There are many powerful insights from this book that can be instrumental in our personal clinical growth and ability to assimilate information for the betterment of our patients. Similar to Einstein and Feynman, the authors drill home the importance of doubt. They state:


It is of paramount importance, in order to make progress, that we recognize ignorance and doubt. Because we have doubt, we then propose looking in new directions for new ideas. The rate of development of science is not the rate at which you make observations alone but, much more important, the rate at which you create new things to test.

At any given time, we should be able to answer the question “What would convince me I am wrong?” as a way to stay vigilant and avoid the comfort of becoming too attached to our current beliefs. Jordan Ellenberg, author of How Not to Be Wrong: The Power of Mathematical Thinking (I am aware of the irony of the title), tries to prove his theorems right by day and wrong by night. What would happen if we all used that same approach?


The goal of any conversation around clinical decision making should not be “winning.” While we can have clinical debates and disagreements, and there may be a “best” approach, an attitude of “I was right and you were wrong” will severely hamper any chance of professional progress and will likely torpedo a relationship. We need high-quality communication and strong relationships to drive the profession forward and to improve the quality and efficiency of the care we provide. Differences of opinion are not only okay but often a good thing. Since we all suffer from cognitive biases, we all need to be challenged and to have our curiosity primed. Let your ideas be known, seek out differing ideas from others, and have respectful, engaging conversations rather than exchanging blows. I have utterly failed at this in the past, throwing my share of jabs and the occasional haymaker in an attempt to “win” the conversation. I can assure you it does not turn out well. Instead, let us work together to build our collective knowledge and curiosity.


To conclude, I want to point out that this phenomenon is not exclusive to debates between clinicians. While most of this post has been framed around clinical debate, similar dynamics play out in the treatment room, and both parties, the patient and the clinician, may be guilty. The patient seeking a quick fix may be immune to “the voice of reason” and have no interest in learning about long-term strategies. Meanwhile, clinicians who educate from a position of authority will have about as much success as they would communicating with a brick wall. All of us, professionals and laypeople alike, are subject to cognitive dissonance and would benefit from more curiosity.


References

  1. Hart, W., et al., Feeling validated versus being correct: a meta-analysis of selective exposure to information. Psychol Bull, 2009. 135(4): p. 555–88.

  2. Wood, W., Attitude change: persuasion and social influence. Annu Rev Psychol, 2000. 51: p. 539–70.

  3. Pluck, G. and Johnson, H.L., Stimulating curiosity to enhance learning. GESJ: Education Sciences and Psychology, 2011. 2(19).


*The book links are Amazon affiliate links.


