The Dangers of Intuitive Thinking in the Diagnostic Process

July 1, 2024
Marlene Icenhower, BSN, JD, CPHRM, Senior Risk Specialist, Coverys

An estimated 795,000 people in the United States die or are permanently disabled every year across all clinical settings as a result of diagnostic errors. The diagnostic process is a complicated one that involves collecting, integrating, and interpreting data to arrive at a working diagnosis. Diagnosticians typically use two types of reasoning as they move through the diagnostic process—intuitive and analytical. Intuitive reasoning is rapid and based on unconscious correlation to previous experiences and examples stored in memory. Analytical reasoning, on the other hand, is methodical, deliberate, and rule based.

Intuitive reasoning is useful in daily activities like cooking a meal or navigating a difficult social situation. However, when practitioners use intuitive rather than analytical reasoning in the diagnostic process, it can introduce cognitive biases that cloud judgment, lead to inaccuracies, and result in diagnostic error.

Cognitive biases are errors in thinking that arise from the human mind's limited capacity to process information and its tendency to rely on mental shortcuts and inaccurate mental models. Coverys analyzed 800 claims closed from 2019 to 2023 that involved diagnostic error. The data revealed that 43% of those claims showed indicators of cognitive bias within the patient assessment process, specifically narrow diagnostic focus and lack of a thorough patient assessment.

Of the more than 100 identified types of cognitive bias, the four that most commonly affect the diagnostic process are:

      Confirmation bias—Selectively gathering and interpreting data to confirm one’s own beliefs or neglecting evidence that contradicts them. Common examples of confirmation bias are disregarding lab results that do not support the initial diagnosis or ignoring symptoms that are inconsistent with that diagnosis.

      Anchoring bias—Selecting and prioritizing data that supports the practitioner’s first impressions, even when those impressions are incorrect. An example of anchoring bias is attributing a 40-year-old patient’s repeated rectal bleeding to known hemorrhoids rather than searching for another cause.

      Affect heuristic bias—Allowing one’s actions to be swayed by emotions rather than rational deliberation. An example of affect heuristic bias is attributing a patient’s persistent abdominal pain to alcoholism rather than searching for another cause.

      Outcomes bias—Believing that outcomes are attributable to the practitioner’s prior decisions even when there is no evidence to support that connection. An example of outcomes bias is the belief that a prescribed medication resolved a patient’s symptoms when, in fact, the symptoms may have resolved on their own without any treatment.

Risk Recommendations:

The formation of cognitive biases in humans is natural—it is the way our brains attempt to efficiently process all the information we encounter. In the medical setting, however, these biases can lead to diagnostic error and patient harm. Consider the following when evaluating the impact of cognitive biases on diagnostic error trends at your organization:

      Examine organizational culture. Assess your organization to understand its diagnostic culture and other influences that create overconfidence in a medical diagnosis. Analyze whether production pressures, diagnostic uncertainty, and threats to a practitioner’s self-image may lead to overconfidence and erroneous, delayed, or missed diagnoses. Work to eliminate cultural barriers that may be interfering with diagnostic accuracy.

      Educate practitioners. The first step in combating biases that negatively impact medical decision-making is to educate practitioners about the different types of cognitive biases, how they relate to diagnostic error, and how to avoid them. Debiasing strategies, such as guided reflection and conscious consideration of alternative diagnoses, may help to reduce biases by compelling close examination of one’s thinking patterns and processes.

      Provide tools. Standardized tools that prompt analytical rather than intuitive thinking, such as clinical reasoning tools, checklists, and evidence-based protocols, can help diagnosticians consider and address biases as they work through the diagnostic process.

      Leverage and optimize technology. If clinical information is hard to find, it’s useless. Implement technology that centralizes clinical information so the entire care team can easily access it. Integrate clinical decision support tools into the EHR to help guide practitioners through the diagnostic process. Some organizations are exploring the use of artificial intelligence (AI) to enhance clinical decision-making and the diagnostic process. While AI cannot replace human interaction and professional expertise in interpreting clinical information, the technology shows promise. Monitor it as it continues to evolve to determine how it can be safely used to enhance the diagnostic process at your organization.

      Communicate uncertainty. If a degree of uncertainty remains once a diagnosis is reached (e.g., the patient is presenting in an unusual way, their symptoms don’t quite match what one would expect, or diagnostic tests are inconclusive), document it to alert other team members that the diagnosis is still evolving. This may prevent subsequent clinicians from anchoring to an uncertain diagnosis.

      Be alert for indicators of diagnostic error. Educate providers to pay attention to clues that may signal the need for further diagnostic inquiry or a diagnostic time-out, such as symptoms that do not respond to treatment, worsening symptoms, or repeated requests for medication refills. These clues can indicate that the provider needs to reconsider the initial diagnosis or pursue other avenues of inquiry.

      Include the patient. An engaged patient is a safer patient, and one who is less likely to pursue litigation when something goes wrong. Find ways to include the patient in the decision-making that leads to a differential diagnosis. Present your initial impressions as a “working diagnosis” rather than a “definitive diagnosis,” and encourage patients to actively examine test results and other information in online portals. Remember that the diagnostic process is collaborative. Your transparency and disclosure can quell patient anxieties and frustrations when a complete or timely diagnosis is elusive.

Cognitive biases impact practitioners’ decision-making ability and can lead to diagnostic error, patient harm, and claims. Fortunately, there are strategies that can be implemented to reduce the impact of cognitive biases and enhance patient safety.
