
Overconfidence in Clinical Decision Making

Within medicine, there are more than a dozen major disciplines and a variety of further subspecialties. They have evolved to deal with >10,000 specific illnesses, all of which must be diagnosed before patient treatment can begin. This commentary is confined to orthodox medicine; diagnosis using folk and pseudo-diagnostic methods occurs in complementary and alternative medicine (CAM) and is described elsewhere.[1,2]
The process of developing an accurate diagnosis involves decision making. The patient typically enters the system through 1 of 2 portals: either the family doctor's office/a walk-in clinic or the emergency department. In both arenas, the first presentation of the illness is at its most undifferentiated. Often, the condition is diagnosed and treated, and the process ends there. Alternatively, the general domain where the diagnosis probably lies is identified, and the patient is referred for further evaluation. Generally, uncertainty progressively decreases during the evaluative process. By the time the patient is in the hands of subspecialists, most of the uncertainty is removed. This is not to say that complete assurance ever prevails; in some areas (e.g., medicine, critical care, trauma, and surgery), considerable further diagnostic effort may be required due to the dynamic, evolving nature of the patient's condition and further challenges arising during the course of management.
      For the purposes of the present discussion, we can make a broad division of medicine into 2 categories: one that deals with most of the uncertainty about diagnosis (e.g., family medicine [FM] and emergency medicine [EM]) and the other wherein a significant part of the uncertainty is removed (e.g., the specialty disciplines). Internal medicine (IM) falls somewhere between the two in that diagnostic refinement is already underway but may be incomplete.
Benchmark studies in patient safety found that diagnostic failure was highest in FM, EM, and IM,[3-5] presumably reflecting the relatively high degree of diagnostic uncertainty in these disciplines. These settings, therefore, deserve the closest scrutiny. To examine this further, we need to look at the decision-making behaviors that underlie the diagnostic process, particularly the biases that may be involved. Overconfidence is one of the most significant of these biases. This paper expands on the article by Drs. Berner and Graber[6] in this supplement with regard to modes of diagnostic decision making and their relationship to the phenomenon of overconfidence.

Dual-process approach to decision making

Effective problem solving, sound judgment, and well-calibrated clinical decision making are considered to be among the highest attributes of physicians. Surprisingly, however, this important area has been actively researched for only about 35 years. The main epistemological issues in clinical decision making have been reviewed.[7]
Much current work in cognitive science suggests that the brain utilizes 2 subsystems for thinking, knowing, and information processing: System 1 and System 2.[8-12] Their characteristics are listed in Table 1, adapted from Hammond[9] and Stanovich.[13]
Table 1. Characteristics of System 1 and System 2 approaches in decision making

Characteristic | System 1 (intuitive) | System 2 (analytic)
Cognitive style | Heuristic | Systematic
Operation | Associative | Rule based
Processing | Parallel | Serial
Cognitive awareness | Low | High
Conscious control | Low | High
Automaticity | High | Low
Rate | Fast | Slow
Reliability | Low | High
Errors | Normative distribution | Few but significant
Effort | Low | High
Predictive power | Low | High
Emotional valence | High | Low
Detail on judgment process | Low | High
Scientific rigor | Low | High
Context | High | Low

Adapted from Concise Encyclopedia of Information Processing in Systems and Organizations[9] and The Robot's Rebellion: Finding Meaning in the Age of Darwin.[13]
What is now known as System 1 corresponds to what Hammond[9] described as intuitive, referring to a decision mode that is dominated by heuristics such as mental shortcuts, maxims, and rules of thumb. The system is fast, associative, inductive, frugal, and often primed by an affective component. Importantly, our first reactions to any situation often have an affective valence.[14]
      Blushing, for example, is an unconscious response to specific situational stimuli. Though socially uncomfortable, it often is very revealing about deeper beliefs and conflicts. Generally, under conditions of uncertainty, we tend to trust these reflexive, associatively generated feelings.
Stanovich[13] adopted the term “the autonomous set of systems” (TASS), emphasizing the autonomous and reflexive nature of this style of responding to salient features of a situation (Table 2) and providing further characterization of System 1 decision making. TASS is multifarious. It encompasses processes of emotional regulation and implicit learning. It also incorporates Fodorian modular theory,[15] which proposes that the brain has a variety of modules that have undergone Darwinian selection to deal with different contingencies of the immediate environment. TASS responses are, therefore, highly context bound. Importantly, repeated use of analytic (System 2) outputs can allow them to be relegated to the TASS level.[13]
Table 2. Properties of the autonomous set of systems (TASS)

• Processing takes place beyond conscious awareness
• Parallel processing: each “hard-wired” module can independently respond to the appropriate triggering stimulus, and more than 1 can respond at a time; therefore, many different subprocesses can execute simultaneously
• An accepting system that does not consider opposites: the tendency is to focus only on what is true rather than what is false, and to believe rather than take the skeptic's position, and therefore to confirm rather than disconfirm (the analytic system, in contrast, is able to undo acceptance)
• Higher cognitive (intellectual) ability appears to be correlated with the ability to use System 2 to override TASS and produce responses that are instrumentally rational
• Typically driven by social, narrative, and contextualizing styles, whereas the style of System 2 requires detachment, decoupling, and decontextualization

Adapted from The Robot's Rebellion: Finding Meaning in the Age of Darwin.[13]
Thus, the effortless pattern recognition that characterizes the clinical acumen of the expert physician is made possible by the accretion of vast experience (the repetitive use of a System 2 analytic approach) that eventually allows the process to devolve to an automatic level.[16,17]
Indeed, it is the apparent effortlessness of the method that invites some disparaging discounting; physicians often dismiss diagnosis based on System 1 thinking as “just pattern recognition.” The process is viewed as simply a transition to an automatic way of thinking, analogous to the acquisition of the complex skills required for driving a car; eventually, after considerable practice, one arrives at the destination with little conscious recollection of how one got there.
The essential characteristic of this “nonanalytic” reasoning is that it is a process of matching the new situation to 1 of many exemplars in memory,[18] which are apparently retrievable rapidly and effortlessly. As a consequence, it may require no more mental effort for a clinician to recognize that the current patient is having a heart attack than it does for a child to recognize that a dog is a four-legged beast. This strategy of reasoning based on similarity to a prior learned example has been described extensively in the literature on exemplar models of concept formation.[19,20]
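The cited exemplar models are considerably more elaborate, but a minimal sketch conveys the idea. In the toy Python below (our illustration; the features, stored cases, and similarity function are invented assumptions, loosely in the spirit of exemplar models), a new case is categorized by its summed similarity to previously stored cases rather than by explicit rules:

```python
# Minimal exemplar-model sketch (illustrative only): a new case is categorized
# by similarity to previously stored cases, with no explicit diagnostic rules.
import math

# Hypothetical stored exemplars: (feature vector, diagnostic label)
exemplars = [
    ((1.0, 0.9, 0.2), "MI"),    # e.g., crushing chest pain, diaphoresis, ...
    ((0.9, 0.8, 0.1), "MI"),
    ((0.2, 0.1, 0.9), "GERD"),
    ((0.1, 0.2, 0.8), "GERD"),
]

def similarity(a, b, sensitivity=2.0):
    """Exponential-decay similarity, as in classic exemplar models."""
    return math.exp(-sensitivity * math.dist(a, b))

def categorize(case):
    """Sum similarity to each category's exemplars; pick the largest."""
    scores = {}
    for features, label in exemplars:
        scores[label] = scores.get(label, 0.0) + similarity(case, features)
    return max(scores, key=scores.get)

print(categorize((0.95, 0.85, 0.15)))  # -> "MI": matches prior heart-attack cases
```

The point of the sketch is that retrieval is a cheap lookup over accumulated cases, which is why recognition feels effortless once the case library is large.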
Overall, although generally adaptive and often useful for our purposes,[21,22] System 1 approaches may fail in some clinical situations. When the signs and symptoms of a particular presentation do not fit into TASS, the response will not be triggered,[16] and recognition failure will result in System 2 being engaged instead. The other side of the coin is that occasionally people act against their better judgment and behave irrationally: it may be that under certain conditions, despite a rational judgment having been reached using System 2, the decision maker defaults to System 1. This is not an uncommon phenomenon in medicine; despite being aware of good evidence from painstakingly developed practice guidelines, clinicians may still overconfidently choose to follow their intuition.
In contrast, System 2 is analytical, i.e., deductive, slow, rational, rule based, and low in emotional investment. Unlike the hard-wired, parallel-processing capabilities of System 1, System 2 is a linear processor that follows explicit computational rules. It corresponds to the software of the brain, i.e., our learned, rational reasoning power. According to Stanovich,[13] this mode allows us “to sustain the powerful context-free mechanisms of logical thought, inference, abstraction, planning, decision making, and cognitive control.”
Whereas it is natural to think that System 2 thinking, being coldly logical and analytical, is likely superior to System 1, much depends on context. A series of studies[23,24] has shown that “pure” System 1 and System 2 thinking are both error prone; a combination of the 2 is closer to optimal. A simple example suffices: the first time a student answers the question “what is 16 × 16?”, System 2 thinking is used to compute the answer slowly and methodically by long multiplication. If the question is posed again soon after, the student recognizes the problem and volunteers the answer quickly and accurately (assuming it was done correctly the first time) using System 1 thinking. Therefore, it is important for decision makers to be aware of which system they are using and of its overall appropriateness to the situation.
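The arithmetic example can be caricatured in code. In the sketch below (an analogy of our own devising, not a claim about cognitive mechanism), a slow, rule-following routine stands in for System 2, and a cache of previously solved problems stands in for System 1 recognition:

```python
# Dual-process caricature (illustrative analogy only).
answer_cache = {}  # "System 1": previously compiled solutions

def system2_multiply(a, b):
    """Slow, serial, rule-based multiplication via repeated addition."""
    total = 0
    for _ in range(b):
        total += a
    return total

def solve(a, b):
    """Try fast recognition first; fall back to effortful analysis."""
    key = (a, b)
    if key in answer_cache:          # System 1: instant recall of an exemplar
        return answer_cache[key]
    result = system2_multiply(a, b)  # System 2: slow but general
    answer_cache[key] = result       # repeated System 2 output devolves to System 1
    return result

print(solve(16, 16))  # computed analytically the first time -> 256
print(solve(16, 16))  # recognized instantly the second time -> 256
```

Note that the cache faithfully reproduces whatever was stored, right or wrong, which mirrors the caveat above: fast recall is only as reliable as the original analysis.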
Certain contexts do not allow System 1. We could not use this mode, for example, to put a man on the moon; only System 2 would have worked. In contrast, adopting an analytical System 2 approach in an emergent situation, where rapid decision making is called for, may be paradoxically irrational.[16] In such a situation, the rapid cognitive style known popularly as “thin-slicing”[25] that characterizes System 1 might be more expedient and appropriate. Recent studies suggest that making unconscious snap decisions (the deliberation-without-attention effect) can outperform more deliberate “rational” thinking in certain situations.[26,27]
Perhaps the mark of good decision makers is their ability to match Systems 1 and 2 to their respective optimal contexts and to consciously blend them into their overall decision making. Although TASS operates at an unconscious level, its output, once seen, can be consciously modulated by adding a System 2 approach. Engagement of System 2 may occur when it “catches” an error in System 1.[28]

      Overconfidence

Overconfident judgment by clinicians is 1 example of the many cognitive biases that may influence reasoning and medical decision making. This bias has been well demonstrated in the psychology literature, where it appears as a common, but not universal, finding.[29,30] Ethnic cross-cultural variations in overconfidence have been described.[31] Further, we appear to be consistently overconfident when we express extreme confidence.[29] Overconfidence also plays a role in self-assessment, where it is axiomatic that relatively incompetent individuals consistently overestimate their abilities.[32,33] In some circumstances, overconfidence would qualify as irrational behavior.
Why should overconfidence be a general feature of human behavior? First, this trait usually leads to definitive action, and cognitive evolutionists would argue that, in our distant past, definitive action under certain conditions conferred a selective advantage. For example, to have been certain of the threat of danger in a particular situation, and to have acted accordingly, increased the chances of that decision maker's genes surviving into the next generation. Equivocation might have spelled extinction. The “false alarm” cost (taking evasive action) was presumably minimal, although some degree of signal-detection trade-off was necessary so that the false-positive rate was not too high and wasteful. Indeed, error management theory suggests that some cognitive biases have been selected because of such cost/benefit asymmetries between false-negative and false-positive errors.[34]
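The cost/benefit asymmetry at the heart of error management theory can be made concrete with a small expected-cost calculation. In the sketch below (our illustration; all probabilities and costs are invented), a trigger-happy policy beats a skeptical one whenever misses are far more expensive than false alarms:

```python
# Expected-cost view of error management theory (illustrative numbers only).
P_THREAT = 0.05          # prior probability that a threat is present
COST_MISS = 100.0        # cost of a false negative (ignoring a real threat)
COST_FALSE_ALARM = 1.0   # cost of a false positive (needless evasive action)

def expected_cost(p_act_given_threat, p_act_given_safe):
    """Expected cost of a policy, given its hit and false-alarm rates."""
    misses = P_THREAT * (1 - p_act_given_threat) * COST_MISS
    false_alarms = (1 - P_THREAT) * p_act_given_safe * COST_FALSE_ALARM
    return misses + false_alarms

# A jumpy policy (acts on weak evidence) vs. a skeptical one.
jumpy = expected_cost(p_act_given_threat=0.99, p_act_given_safe=0.30)
skeptical = expected_cost(p_act_given_threat=0.60, p_act_given_safe=0.01)
print(f"jumpy: {jumpy:.2f}, skeptical: {skeptical:.2f}")
# jumpy: 0.34, skeptical: 2.01 -- the "overconfident" policy wins
# when misses are catastrophic relative to false alarms.
```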
Second, as has been noted, System 1 intuitive thinking may be associated with strong emotions such as excitement and enthusiasm. Such positive feelings, in turn, have been linked with an enhanced level of confidence in the decision maker's own judgment.[35]
Third, from a TASS perspective, it is easy to see how some individuals would be more accepting of, and overly confident in, the apparent solutions and conclusions they have reached, rather than take a more skeptical stance and look for disconfirming evidence to challenge their assumptions. This is referred to as confirmation bias,[36] one of the most powerful of the cognitive biases. Not surprisingly, it takes far more mental effort to contemplate disconfirmation (the clinician can only be confident that something isn't disease A by considering all the other things it might be) than confirmation. Harking back to the self-assessment literature, one can only assess how much one knows by accurately assessing how much one doesn't know, and as Frank Keil[37] says, “How can I know what I don't know when I don't know what I don't know?”
Importantly, overconfidence appears to be related to the amount and strength of supporting evidence people can find for their viewpoints.[38] Thus, overconfidence itself may depend upon confirmation bias. People's judgments were better calibrated (there was less overconfidence) when they were obliged to take account of disconfirming evidence.[38] Hindsight bias appears to be another expression of overconfidence; it could similarly be debiased by forcing a consideration of alternative diagnoses.[39] This consider-the-opposite strategy appears to be one of the more effective debiasing strategies. Overall, it appears to be the biased fashion in which evidence is generated during the development of a particular belief or hypothesis that leads to overconfidence.
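Calibration has a concrete meaning in this literature: across many judgments, stated confidence should match the observed hit rate. A minimal sketch (ours; the data are invented) computes a simple overconfidence score as mean confidence minus accuracy:

```python
# Measuring calibration from (stated confidence, was the judgment correct?)
# pairs. All data are invented for illustration.
judgments = [
    (0.90, True), (0.90, False), (0.80, True), (0.95, False),
    (0.70, True), (0.99, True), (0.85, False), (0.90, True),
]

mean_confidence = sum(c for c, _ in judgments) / len(judgments)
accuracy = sum(ok for _, ok in judgments) / len(judgments)
overconfidence = mean_confidence - accuracy  # > 0 means overconfident

print(f"mean confidence: {mean_confidence:.2f}")  # 0.87
print(f"accuracy:        {accuracy:.2f}")         # 0.62
print(f"overconfidence:  {overconfidence:+.2f}")  # +0.25
```

A well-calibrated clinician would score near zero; the gap only becomes visible when outcomes are systematically fed back, which anticipates the feedback problem discussed in the conclusions.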
Other issues regarding overconfidence may have their origins within the culture of medicine. Generally, it is considered a weakness and a sign of vulnerability for clinicians to appear unsure. Confidence is valued over uncertainty, and there is a prevailing censure against disclosing uncertainty to patients.[40] There are good reasons for this. Shamans, the progenitors of modern clinicians, would have suffered short careers had they equivocated about their cures.[41] In the present day, the charisma of physicians and the confidence they have in their diagnosis and management of illness probably go some way toward effecting a cure. The same would hold for CAM therapists, perhaps more so. The memeplex[42] of certainty, overconfidence, autonomy, and all-knowing paternalism has propagated extensively within the medical culture even though, as is sometimes the case with memes, it may not benefit clinicians or patients over the long term.
Many variables are known to influence overconfidence behaviors, including ego bias,[43] gender,[44] self-serving attribution bias,[45] personality,[46] level of task difficulty,[47] feedback efficacy,[48,49] base rate,[30] predictability of outcome,[30] and ambiguity of evidence,[30] and presumably there are others. A further complication is that some of these variables are known to interact with each other. It would be expected, too, that the level of critical thinking[1,50] would influence overconfidence behavior. To date, the literature on medical decision making has paid relatively scant regard to the impact of these variables. Because scientific rigor appears lacking for System 1, the prevailing research emphasis in both medical[51] and other domains[52] has been on System 2.

      Solutions and conclusions

Overconfidence often occurs when determining a course of action and, accordingly, should be examined in the context of judgment and decision making. It appears to be influenced by a number of factors related to the individual as well as the task, some of which interact with one another. Overconfidence is associated in particular with confirmation bias and may underlie hindsight bias. It seems to be especially dependent on the manner in which the individual gathers evidence to support a belief. In medical decision making, overconfidence frequently is manifest in the context of delayed and missed diagnoses,[6] where it may exert its most harmful effects.
There are a variety of explanations for why individual physicians exhibit overconfidence in their judgment. Overconfidence is recognized as a common cognitive bias; additionally, it may be propagated as a component of a prevailing memeplex within the culture of medicine.
Numerous approaches may be taken to correct failures in reasoning and decision making.[2] Berner and Graber[6] outline the major strategies; Table 3 expands on some of these and suggests specific corrective actions. At present, no 1 strategy has demonstrated superiority over another, although, as noted earlier, several studies[38,39] suggest that overconfidence is reduced when the generation of evidence is unbiased, i.e., when competing hypotheses are given as much attention as the preferred hypothesis. Inevitably, the solution will probably require multiple paths.[16]
Table 3. Sources of overconfidence and strategies for correction

Source | Correcting strategy
Lack of awareness of and insight into decision theory | Introduce specific training in current decision theory approaches at the undergraduate level, emphasizing context dependency as well as the particular vulnerabilities of different decision-making modes
Cognitive and affective bias | Provide specific training at the undergraduate level in the wide variety of known cognitive and affective biases; create files of clinical examples illustrating each bias, with appropriate correcting strategies
Limitations in feedback | Identify speed and reliability of feedback as a major requirement in all clinical domains, both locally and systemically
Biased evidence gathering | Promote adoption of cognitive forcing strategies that take account of disconfirming evidence and competing hypotheses, including the consider-the-opposite strategy
Denial of uncertainty | Provide specific training to overcome personal and cultural barriers against admission of uncertainty, acknowledging that certainty is not always possible; encourage use of “not yet diagnosed”
Base rate neglect | Make current incidence and prevalence data for common diseases readily available for particular clinical groups in specific geographic areas (a worked example follows this table)
Context binding | Promote awareness of the impact of context on the decision-making process; advance metacognitive training to detach from the immediate pull of the situation and decontextualize the clinical problem
Limitations on transferability | Illustrate how biases work in a variety of clinical contexts; adopt universal debiasing approaches with applicability across multiple clinical domains
Lack of critical thinking | Introduce courses early in the undergraduate curriculum that cover the basic principles of critical thinking, with iteration at higher levels of training
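The base-rate row in Table 3 deserves a worked example. The sketch below (our numbers are hypothetical) applies Bayes' theorem to show how the same positive finding supports very different posterior probabilities under different base rates:

```python
# Base rates matter: posterior probability of disease after a positive test.
# All numbers are hypothetical.
def posterior(prevalence, sensitivity, specificity):
    """P(disease | positive test) via Bayes' theorem."""
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * (1 - specificity)
    return true_pos / (true_pos + false_pos)

# The same test (95% sensitive, 95% specific) in two practice settings.
print(f"{posterior(0.001, 0.95, 0.95):.3f}")  # rare disease: ~0.019
print(f"{posterior(0.200, 0.95, 0.95):.3f}")  # high-prevalence clinic: ~0.826
```

A clinician who neglects the base rate will be equally, and unjustifiably, confident in both settings; the calculation shows why local prevalence data belong at the point of decision.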
Prompt and reliable feedback about decision outcomes appears to be a prerequisite for calibrating clinician performance, yet it rarely exists in clinical practice.[41] From the standpoint of clinical reasoning, it is disconcerting that clinicians often are unaware of, or have little insight into, their own thinking processes. As Epstein[53] observed, experienced clinicians are “less able to articulate what they do than others who observe them”; or, if articulation were possible, it might amount to no more than a credible story about what they believe they might have been thinking, and no one (including the clinician) can ever be sure that the account is accurate.[16] But this is hardly surprising, as it is a natural consequence of the dominance of System 1 thinking that emerges as one becomes an expert. As noted earlier, conscious practice of System 2 strategies can become compiled in TASS and eventually shape TASS responses. A problem once solved is no longer a problem; experts are expert in part precisely because they have solved most problems before and need only recognize and recall a previous solution. This means that much of expert thinking is, and will remain, an invisible process. Often, the best we can do is make inferences about what thinking might have occurred in light of the events that subsequently transpired. It would be reassuring to think that the development of expertise brings a reduction in overconfidence, but this is not always the case.[54]
Seemingly, clinicians would benefit from an understanding of the 2 types of reasoning, which would give them greater awareness of the overall process and perhaps allow them to explicate their decision making. Whereas System 1 thinking is unavailable to introspection, it is available to observation and metacognition. Such reflection might facilitate greater insight into the overall blend of decision-making modes typically used in the clinical setting.
Educational theorists in the critical thinking literature have expressed long-standing concerns about the need to introduce critical thinking skills into education. As van Gelder and colleagues[50] note, a certain level of competence in informal reasoning normally develops through the processes of maturation, socialization, and education, but few people actually progress beyond an everyday working level of performance to genuine proficiency.
This issue is especially relevant for medical training. The implicit assumption is that, by the time students have arrived at this tertiary level of education, they will have achieved appropriate levels of competence in critical thinking skills, but this is not necessarily so.[1] Though some will become highly proficient thinkers, the majority probably will not, and there is a need for the general level of reasoning expertise to be raised. In particular, we require education in detachment, overcoming belief bias effects, perspective switching, decontextualizing,[13] and a variety of other cognitive debiasing strategies.[55] It would be important, for example, to raise awareness of the many shortcomings and pitfalls of uncritical thinking at the medical undergraduate level and to provide clinical cases that illustrate them. At a more general level, consideration should be given to introducing critical thinking training in the undergraduate curriculum so that many of the ∼50 cognitive and affective biases in thinking[28] could be known and better understood.
Theoretically, it should be possible to improve clinical reasoning through specific training and thus reduce the prevalence of biases such as overconfidence; however, we should harbor no delusions about the complexity of the task. Reducing cognitive bias in clinical diagnosis requires far more than a brief session on cognitive debiasing. Instead, successful educational strategies will likely require repeated practice, and failure with feedback, so that the limitations of transfer can be overcome. While some investigators have enjoyed success in demonstrating improved reasoning expertise with training,[30,56-59] to date there is little evidence that these skills transfer to the clinical setting.[60]
Nevertheless, it is a reasonable expectation that training in critical thinking,[1,61] together with an understanding of the nature of cognitive[55] and affective[62] bias, as well as of the informal logical fallacies that underlie poor reasoning,[28] would collectively lead to an overall improvement in decision making and a reduction in diagnostic failure.

      Author disclosures

      The authors report the following conflicts of interest with the sponsor of this supplement article or products discussed in this article:
      Pat Croskerry, MD, PhD, has no financial arrangement or affiliation with a corporate organization or a manufacturer of a product discussed in this article.
      Geoff Norman, PhD, has no financial arrangement or affiliation with a corporate organization or a manufacturer of a product discussed in this article.

      References

1. Jenicek M, Hitchcock DL. Evidence-Based Practice: Logic and Critical Thinking in Medicine. American Medical Association Press; 2005:118-137.
2. Croskerry P. Timely recognition and diagnosis of illness. In: MacKinnon N, Nguyen T, eds. Safe and Effective: The Eight Essential Elements of an Optimal Medication-Use System. Ottawa, Ontario: Canadian Pharmacists Association; 2007.
3. Brennan TA, Leape LL, Laird NM, et al. Incidence of adverse events and negligence in hospitalized patients: results of the Harvard Medical Practice Study I. N Engl J Med. 1991;324:370-376.
4. Wilson RM, Runciman WB, Gibberd RW, Harrison BT, Newby L, Hamilton JD. The Quality in Australian Health Care Study. Med J Aust. 1995;163:458-471.
5. Thomas EJ, Studdert DM, Burstin HR, et al. Incidence and types of adverse events and negligent care in Utah and Colorado. Med Care. 2000;38:261-271.
6. Berner ES, Graber ML. Overconfidence as a cause of diagnostic error in medicine. Am J Med. 2008;121:S2-S23.
7. Norman GR. The epistemology of clinical reasoning: perspectives from philosophy, psychology, and neuroscience. Acad Med. 2000;75:S127-S133.
8. Bruner JS. Actual Minds, Possible Worlds. Cambridge, MA: Harvard University Press; 1986.
9. Hammond KR. Intuitive and analytic cognition: information models. In: Sage A, ed. Concise Encyclopedia of Information Processing in Systems and Organizations. Oxford: Pergamon Press; 1990:306-312.
10. Sloman SA. The empirical case for two systems of reasoning. Psychol Bull. 1996;119:3-22.
11. Evans J, Over D. Rationality and Reasoning. Hove, East Sussex, UK: Psychology Press; 1996.
12. Stanovich KE, West RF. Individual differences in reasoning: implications for the rationality debate? Behav Brain Sci. 2000;23:645-665 (discussion 665-726).
13. Stanovich KE. The Robot's Rebellion: Finding Meaning in the Age of Darwin. Chicago: University of Chicago Press; 2005.
14. Zajonc RB. Feeling and thinking: preferences need no inferences. Am Psychol. 1980;35:151-175.
15. Fodor J. The Modularity of Mind. Cambridge, MA: MIT Press; 1983.
16. Norman G. Building on experience—the development of clinical reasoning. N Engl J Med. 2006;355:2251-2252.
17. Norman GR, Brooks LR. The non-analytic basis of clinical reasoning. Adv Health Sci Educ Theory Pract. 1997;2:173-184.
18. Hatala R, Norman GR, Brooks LR. Influence of a single example upon subsequent electrocardiogram interpretation. Teach Learn Med. 1999;11:110-117.
19. Medin DL. Concepts and conceptual structure. Am Psychol. 1989;44:1469-1481.
20. Brooks LR. Decentralized control of categorization: the role of prior processing episodes. In: Neisser U, ed. Concepts and Conceptual Development. Cambridge, UK: Cambridge University Press; 1987:141-174.
21. Gigerenzer G, Todd P, ABC Research Group. Simple Heuristics That Make Us Smart. New York: Oxford University Press; 1999.
22. Eva KW, Norman GR. Heuristics and biases: a biased perspective on clinical reasoning. Med Educ. 2005;39:870-872.
23. Kulatunga-Moruzi C, Brooks LR, Norman GR. Coordination of analytic and similarity-based processing strategies and expertise in dermatological diagnosis. Teach Learn Med. 2001;13:110-116.
24. Ark TK, Brooks LR, Eva KW. Giving learners the best of both worlds: do clinical teachers need to guard against teaching pattern recognition to novices? Acad Med. 2006;81:405-409.
25. Gladwell M. Blink: The Power of Thinking Without Thinking. New York: Little, Brown and Co; 2005.
26. Dijksterhuis A, Bos MW, Nordgren LF, van Baaren RB. On making the right choice: the deliberation-without-attention effect. Science. 2006;311:1005-1007.
27. Zhaoping L, Guyader N. Interference with bottom-up feature detection by higher-level object recognition. Curr Biol. 2007;17:26-31.
28. Risen J, Gilovich T. Informal logical fallacies. In: Sternberg RJ, Roediger HL, Halpern DF, eds. Critical Thinking in Psychology. New York: Cambridge University Press; 2007.
29. Baron J. Thinking and Deciding. 3rd ed. New York: Cambridge University Press; 2000.
30. Griffin D, Tversky A. The weighing of evidence and the determinants of confidence. In: Gilovich T, Griffin D, Kahneman D, eds. Heuristics and Biases: The Psychology of Intuitive Judgment. New York: Cambridge University Press; 2002:230-249.
31. Yates JF, Lee JW, Sieck WR, Choi I, Price PC. Probability judgment across cultures. In: Gilovich T, Griffin D, Kahneman D, eds. Heuristics and Biases: The Psychology of Intuitive Judgment. New York: Cambridge University Press; 2002:271-291.
32. Eva KW, Regehr G. Self-assessment in the health professions: a reformulation and research agenda. Acad Med. 2005;80:S46-S54.
33. Kruger J, Dunning D. Unskilled and unaware of it: how difficulties in recognizing one's own incompetence lead to inflated self-assessments. J Pers Soc Psychol. 1999;77:1121-1134.
34. Haselton MG, Buss DM. Error management theory: a new perspective on biases in cross-sex mind reading. J Pers Soc Psychol. 2000;78:81-91.
35. Tiedens L, Linton S. Judgment under emotional certainty and uncertainty: the effects of specific emotions on information processing. J Pers Soc Psychol. 2001;81:973-988.
36. Nickerson RS. Confirmation bias: a ubiquitous phenomenon in many guises. Rev Gen Psychol. 1998;2:175-220.
37. Rozenblit LR, Keil FC. The misunderstood limits of folk science: an illusion of explanatory depth. Cognit Sci. 2002;26:521-562.
38. Koriat A, Lichtenstein S, Fischhoff B. Reasons for confidence. J Exp Psychol Hum Learn. 1980;6:107-118.
39. Arkes H, Faust D, Guilmette T, Hart K. Eliminating the hindsight bias. J Appl Psychol. 1988;73:305-307.
40. Katz J. Why doctors don't disclose uncertainty. In: Elstein A, Dowie J, eds. Professional Judgment: A Reader in Clinical Decision Making. Cambridge, UK: Cambridge University Press; 1988:544-565.
41. Croskerry PG. The feedback sanction. Acad Emerg Med. 2000;7:1232-1238.
42. Blackmore S. The Meme Machine. Oxford: Oxford University Press; 1999.
43. Detmer DE, Fryback DG, Gassner K. Heuristics and biases in medical decision making. J Med Educ. 1978;53:682-683.
44. Lundeberg MA, Fox PW, Punccohar J. Highly confident but wrong: gender differences and similarities in confidence judgments. J Educ Psychol. 1994;86:114-121.
45. Deaux K, Farris E. Attributing causes for one's own performance: the effects of sex, norms, and outcome. J Res Pers. 1977;11:59-72.
46. Landazabal MG. Psychopathological symptoms, social skills, and personality traits: a study with adolescents [Spanish]. Span J Psychol. 2006;9:182-192.
47. Fischhoff B, Slovic P. A little learning…: confidence in multicue judgment. In: Nickerson R, ed. Attention and Performance VIII. Hillsdale, NJ: Erlbaum; 1980.
48. Lichtenstein S, Fischhoff B. Training for calibration. Organ Behav Hum Perform. 1980;26:149-171.
49. Arkes HR, Christensen C, Lai C, Blumer C. Two methods of reducing overconfidence. Organ Behav Hum Decis Process. 1987;39:133-144.
50. van Gelder T, Bissett M, Cumming G. Cultivating expertise in informal reasoning. Can J Exp Psychol. 2004;58:142-152.
51. Croskerry P. The theory and practice of clinical decision making. Can J Anaesth. 2005;52:R1-R8.
52. Dane E, Pratt MG. Exploring intuition and its role in managerial decision making. Acad Manage Rev. 2007;32:33-54.
53. Epstein RM. Mindful practice. JAMA. 1999;282(9):833-839.
54. Henrion M, Fischhoff B. Assessing uncertainty in physical constants. Am J Phys. 1986;54:791-798.
55. Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med. 2003;78:775-780.
56. Krantz DH, Fong GT, Nisbett RE. Formal Training Improves the Application of Statistical Heuristics in Everyday Problems. Murray Hill, NJ: Bell Laboratories; 1983.
57. Lehman DR, Lempert RO, Nisbett RE. The effects of graduate training on reasoning: formal discipline and thinking about everyday-life events. Am Psychol. 1988;43:431-442.
58. Nisbett RE, Fong GT, Lehman DR, Cheng PW. Teaching reasoning. Science. 1987;238:625-631.
59. Chi MT, Glaser R, Farr MJ, eds. The Nature of Expertise. Hillsdale, NJ: Lawrence Erlbaum Associates; 1988.
60. Graber MA. Metacognitive training to reduce diagnostic errors: ready for prime time? [abstract]. Acad Med. 2003;78:781.
61. Lehman D, Nisbett R. A longitudinal study of the effects of undergraduate education on reasoning. Dev Psychol. 1990;26:952-960.
62. Croskerry P. The affective imperative: coming to terms with our emotions. Acad Emerg Med. 2007;14:184-186.