Read Time: 20 minutes
“Science is the belief in the ignorance of experts” – Richard Feynman
If we are to progress as a profession and provide our patients the best care, we must hold each other accountable clinically. Our world is flooded with information on best practices, and it’s hard to skim through LinkedIn without seeing “evidence-based” within the first couple of posts. Unfortunately, knowing the “best” and most updated treatment techniques is only half the equation. Our ability to think critically to assess the available information, identify and address judgments and biases, and accurately apply that information is necessary to truly Live Clinically and provide evidence-based treatment.
Finding information on how to assess and treat patients is not difficult; the trick is knowing how to filter and apply the information. This article will focus on critical thinking skills that are often neglected and misunderstood. Heuristics and cognitive biases are rarely discussed in school and clinical education, yet they influence every bit of information we process and apply in daily life. Many of these biases go unnoticed and unchecked. If we learn to recognize them, and make the necessary adjustments to our thinking, we can effectively navigate the world of evidence-based care.
What are heuristics?
As a physical therapist, I often seek the most efficient way of delivering high-quality care. I may achieve this efficiency through selective patient scheduling, timely completion of notes, clustering marketing and management duties, and applying treatments that provide the greatest outcome relative to the least amount of time required. I am not alone in this approach. By seeking more efficient means of patient care and completing my job duties, I am allotted more time in my day. This time may be used for any of the previously mentioned tasks – seeing more patients, more marketing, managerial duties – or other areas of interest, such as reading, exercise, or spending time with my family. Regardless of the choice I make, the added time is a bonus and I am motivated to obtain it. One of the methods many clinicians use to improve efficiency in daily tasks is applying heuristics.
Heuristics are mental shortcuts we subconsciously apply to improve efficiency. These ‘rule of thumb’ strategies are mental biases often referred to as common sense. The problem with the common-sense approach is that it typically sacrifices accuracy. A frequently employed strategy is substitution: our mind replaces a challenging question, to which we lack the answer, with an easier one. For example, a clinician may substitute the question “what is the most evidence-based treatment I could apply for this symptom presentation?” with “what type of treatment have I successfully used in the past for this symptom presentation?” The first requires significantly more work, as I need to scour the evidence and face the uncertainty of applying an intervention I am less familiar and practiced with. The second is safer, easier, and more predictable.
This blog will address the drivers behind treatment decision-making in the healthcare space. However, the drivers behind treatment decisions do not fall squarely on the shoulders of clinicians. Patients and researchers have a large role to play in the development, dissemination, and implementation of best practice. I have been fortunate to gather experiences as a patient, provider, manager, researcher, and educator. These experiences have allowed me to develop multiple perspectives and recognize those who handle cognitive biases well and those who do not. I have fallen on both sides of the aisle.
The idea for this blog started with internal communications to colleagues in my practice, PT Solutions. In 2018, I started sending weekly emails to all our therapists, which contained research studies from a breadth of fields. The intent was to broaden the scope of education provided and reduce the barriers to seeking out and obtaining evidence. I would add a snippet or two on ‘the why’ behind the chosen articles. These emails expanded to include athletic trainers, clinic front desk personnel, and physician liaisons in an effort to provide evidence for all our employees responsible for patient care and garnering new patients. While one barrier was removed, several more remained firmly in place. Chief among these was the act of implementing the evidence I provided into daily practice.
I was finding that seeking out information was not the primary issue. The greater challenge lies in digesting and applying the information. Reading and understanding literature brings a set of challenges, and overcoming our biases and cognitive fallacies brings additional ones. It was at this point I decided to expand the communications. I wanted to develop more digestible information and entice the reader. Of course, expanding the emails to a deep dive into literature (they were 2500-3500-word articles, rather than a typical blog post of 300-500 words) and adding well-chosen and hilarious – at least to me – memes to increase engagement and remove the “scholarly journal vibe” takes time. Thus, the ‘Weekly Evidence’ emails became known as ‘Weekly(ish) Evidence’ emails. To my relief, more clinicians reached out to me about the value of the content and using it during ‘Evidence-Based Days’ (PT Solutions’ version of clinical rounds where we discuss evidence and patient cases). As I delved into the assessment and treatment literature, the need for training and understanding of heuristics and biases became more apparent.
Applying Critical Thinking
As mentioned, a heuristic is a simple decision strategy that ignores part of the available information and focuses on the few relevant predictors. Heuristics are useful and often vital (they originally developed as a survival mechanism for situations when there is insufficient time to critically appraise all variables), but they sometimes lead to severe and systematic errors. Elstein defined heuristics as “mental shortcuts commonly used in decision making that can lead to faulty reasoning or conclusions” and cited them as a source of many errors in clinical reasoning. I will cover many of these systematic errors and biases in future posts – sunk cost fallacy, confirmation bias, theory-induced blindness, availability heuristic, planning fallacy, and halo effect to name a few – and how they relate to our clinical practice.
There is an overwhelming amount of literature available to all of us; however, if we don’t know how to critically appraise it, apply it, analyze its usage, and make revisions in our practice, then we will be limited in the quality of care we provide and revert to old habits. In addition, without the ability to analyze our own thoughts and actions, we become subject to the information relayed by others and our natural tendencies. Critical appraisal is difficult, as is challenging information we hear and see on a regular basis. Daniel Kahneman, winner of the 2002 Nobel Memorial Prize in Economics and author of Thinking, Fast and Slow (possibly the most influential book I’ve read), thoroughly addresses heuristics, biases, and critical thinking. At the beginning of the book, Kahneman sets the foundation for our decision-making process with a concept known as the two systems of the mind, system 1 and system 2. Here is how he defines them:
“System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control”
“System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration.”
Many of the biases that clinicians and patients suffer from can be traced back to the system of thinking employed. System 1 is largely responsible for our heuristics and biases. Our brains crave coherence and will search for – or even create – associations that link events and create a story that makes sense. As we drift toward cognitive fallacies, system 1 won’t alert us to a potential issue. When we acknowledge doubt and uncertainty, we enter the domain of system 2. However, if we do not train our system 2, laziness can result and we search for the path of least resistance, thereby still falling victim to our biases.
image source: upfrontanalytics.com
It is far more effortful for us to mobilize system 2 than to simply revert to system 1 and “common sense”. Herein lies the primary issue with deciding on treatment approaches. Keeping up with literature is a challenge. Tens of thousands of journals publish new articles monthly. Not only do we need to remain up to date in our field but in related fields as well. For example, reading Physical Therapy Journal and the Journal of Orthopaedic & Sports Physical Therapy only scratches the surface for my clinical care. I still need to remain updated in exercise, nutrition, psychology, medical rehabilitation, medical orthopedics, pain science, and other basic sciences. Not to mention keeping tabs on popular journalism and current trends patients will bring to the session. It is far easier for me to find my niche, develop a set of skills that deliver a positive outcome, and continue to hone those skills throughout my career. Why mess with a good thing?
“Admitting the role of heuristics confers no shame. Rather, the goal should be to formalize and understand heuristics so that their use can be effectively taught, which could lead to less practice variation and more efficient medical care.” 
Are we challenging our potential biases and intuitions? This is not to say we should act and speak with uncertainty in all we do. When making a decision, stick with it. However, when you have the opportunity to reflect, be critical of the information available and the additional information needed. This is reflected in the concept of ‘strong opinions, weakly held,’ developed by Stanford University professor Paul Saffo. Conviction is important, but a healthy dose of doubt and uncertainty allows us to grow as professionals.
Doubt and Uncertainty
“It is our capacity to doubt that will determine the future of civilization.”
These words were spoken by Richard Feynman, a brilliant physicist who won the Nobel Prize in Physics in 1965 and is often referred to as the founding father of the field of nanotechnology. His secret to being one of the great minds of the 20th century was his ability to embrace doubt and uncertainty. “The question of doubt and uncertainty is what is necessary to begin; for if you already know the answer there is no need to gather any evidence about it.”
Image source: https://www.bizarro.com/cartoons
Using doubt and uncertainty during the discovery process can be a difficult principle to live by, as it requires a lot of work. It means we never truly know, never have all the answers, never have it all figured out. Despite positive outcomes in the past or previous “best evidence,” we need to constantly assess the literature and constantly face the facts. Science is simply testing, observing, concluding – very objectively, minimizing bias – and repeating. Every day, we should strive to be a little better and learn something new.
While I can attest to the power of doubt and uncertainty in the worlds of physical medicine and rehabilitation, education, and research, we see it written about in other fields as well. Engebretsen et al. wrote a 2016 paper titled “Uncertainty and objectivity in clinical decision making: a clinical case in emergency medicine.” They tackle the problems of uncertainty in emergency situations. In emergency care, whether it be a paramedic responding to a 911 call or an ER physician triaging following a catastrophe, uncertainty is often viewed exclusively in the negative. We are instructed to train ourselves to overcome it and abolish it, potentially even ignore it. The authors of the paper disagree: “It is not by getting rid of or even by reducing uncertainty, but by attending systematically to it and by relating to it in a self-conscious way, that objective knowledge can be obtained.”
Doubt is a powerful tool for two primary reasons: 1) What we “know” often turns out to be wrong in the future and 2) We rarely have access to all the information. A healthy level of doubt leads us to better interpret information and reason through the answer. Take reading research as an example. The danger of simply reading the abstract – or the more enjoyable but heavily biased introduction and discussion sections – is that we don’t have all the information. Reading the methods and results allows us to gain a better understanding of the objective data and draw our own conclusions. I will cover strategies for reading and understanding research in a future post.
Doubt forces us to discover and build knowledge. It forces us to ask more questions, seek greater understanding, and refine our thought processes. Knowing ‘the why’ behind a technique or a patient’s experience allows not only a more tailored treatment approach by clinicians, but an ability to navigate unexpected obstacles.
The “gold standard” or “best practice” is constantly evolving and at times performing an about-face. This does not mean clinicians should shoot from the hip and ignore evidence because it is bound to change eventually; instead, we should challenge our treatment approaches every day. This view can be simplified with the aforementioned concept of ‘strong opinions, weakly held’. Having doubt does not mean you have weak convictions and are apprehensive to act, but instead that you are constantly open to new ideas and make decisions based on strong evidence. Rather than simply taking something at face value, do a little research yourself to determine its validity and seek the necessary amount of information to make a sound decision.
Uncertainty and doubt lead to a yearning for greater understanding and are vital to becoming an expert in any subject matter. They develop a habit of critical assessment and reflection on past actions. They help establish the mindset of never being satisfied with an answer and lead individuals to perpetually adopt the role of a student.
Becoming a Credible Hulk
The amount of information available today is both exciting and terrifying. At any given moment, with a couple of swipes and clicks you can access a wealth of information on nearly any topic imaginable. Access to information is a wonderful thing, but it has its drawbacks. A primary concern is the credibility of the information. There is an abundance of incomplete information, exaggeration, preying on people’s fears or misunderstandings, and incorrect “facts” provided on a daily basis by so-called “experts.” It can be difficult to truly know if the information provided is accurate and beneficial. This applies to both lay individuals and providers. Fortunately, not all information is poor. To determine which is worthwhile and which can be ignored, we must channel our inner ‘Credible Hulk.’
A ‘Credible Hulk’ is an individual who backs up their rage with facts and documented sources. By rage, I don’t mean we should flip tables and throw waiting room chairs while conveying our differing opinions with facts and documented sources. However, there are certainly moments where throwing a chair feels desirable. Instead, we should all be steadfast and anchor our beliefs and approaches to assessment and treatment strategies with high quality evidence.
Often, we put most of our trust in expert opinion, but this leads us down a dangerous path. Many judgments and biases can cause well-meaning assessments to be incomplete and potentially flat-out wrong. Research has the benefit of controlling variables, reducing biases (if the study is well constructed), and objectifying results. If a colleague, mentor, or physician recommends a course of action that lacks adequate support in the research, the recommendation should be respectfully challenged. Ultimately, I hope this blog series can provide some resources and strategies to help with those discussions. At the very least, I hope it leads to more questions being asked.
By no means do I have it all figured out. Discovering I am wrong and changing my approach is a regular experience. Some level of discomfort in the moment will always remain, but as you embrace doubt and uncertainty, challenge your assumptions daily, and practice critical thinking, your curiosity develops and you find yourself enjoying the experience.
“The way to block errors that originate in System 1 is simple in principle: recognize the signs that you are in a cognitive minefield, slow down, and ask for reinforcement from System 2”
– Daniel Kahneman
- Elstein, A.S., Heuristics and biases: selected errors in clinical reasoning. Acad Med, 1999. 74(7): p. 791-4.
- McDonald, C.J., Medical heuristics: the silent adjudicators of clinical practice. Ann Intern Med, 1996. 124(1 Pt 1): p. 56-62.
- Engebretsen, E., et al., Uncertainty and objectivity in clinical decision making: a clinical case in emergency medicine. Med Health Care Philos, 2016. 19(4): p. 595-603.
ABOUT THE AUTHOR
Zach has numerous research publications in peer-reviewed rehabilitation and medical journals. He has developed and taught weekend continuing education courses in the areas of plan of care development, exercise prescription, pain science, and nutrition. He has presented full education sessions at the APTA NEXT conference and the ACRM, PTAG, and FOTO annual conferences. He has also presented multiple platform sessions and posters at CSM. He is an active member of the Orthopedic and Research sections of the American Physical Therapy Association and the Physical Therapy Association of Georgia. He currently serves on the APTA Science and Practice Affairs Committee and the PTAG Barney Poole Leadership Academy.