Open Access. Published by De Gruyter, January 8, 2014, under the Creative Commons BY-NC-ND 3.0 license.

Figure and ground in physician misdiagnosis: metacognition and diagnostic norms

  • Robert M. Hamm
From the journal Diagnosis

Abstract

Metacognitive awareness, or self-reflection informed by the “heuristics and biases” theory of how experts make cognitive errors, has been offered as a partial solution for diagnostic errors in medicine. I argue that this approach is neither as easy nor as effective as one might hope. We should also promote mastery of the basic principles of diagnosis in medical school, in continuing medical education, and through routine reflection and review. While it may seem difficult to attend to both levels simultaneously, there is more to be gained from attending to both than from focusing on only one.

Campaigns for fundamental change need to push on all fronts. The movement to reduce diagnostic error in medicine is correct to address simultaneously the cognition of the physician, the coordination of the medical team, and institutional support for the provision of accurate and timely information. Yet on the topic of the physician’s diagnostic thinking, much more attention has been devoted to recognizing and correcting mental shortcuts and strategic shortcomings than to the mastery and maintenance of basic diagnostic competency. These are equally essential.

Physicians’ thinking is the cause of too many diagnostic errors. A recent review of hospital patient records found that 84% of adverse diagnostic events were due at least in part to cognitive error [1]. The diagnostic error rate revealed by autopsies has been stable over recent decades (though the proportion of deaths deemed to need autopsy is decreasing), with 10% of autopsies showing a misdiagnosis that contributed to the patient’s outcome and another 35% showing incidental diagnostic error [2]. Though much misdiagnosis may be attributed to system failures [3, 4], and promising work has been done on reorganizing practice at the system level, it is still worth exploring how to reduce the number of diagnostic errors attributable to physicians’ cognitive errors.

On the ImproveDx email discussion listserv of the Society to Improve Diagnosis in Medicine, there is abiding interest in how to prevent cognitive error from causing misdiagnosis. Drawing on psychology’s account of heuristics and biases [5] and of intuitive and analytic cognition [6, 7], it is suggested that physicians can learn to reflect on a patient’s situation, remind themselves that like all humans they use cognitive strategies that are vulnerable to error, look for how they might make such errors with this patient (metacognition), and follow corrective strategies that make such errors less likely [8]. The promise of this vision is well expressed in this summary statement:

“…knowing how the metacognitive process works allows us to train students and inexperienced clinicians in its use, ultimately helping them minimize or avoid diagnostic errors” [8].

To focus on the potential for reducing misdiagnosis by reducing cognitive strategy error is implicitly to assume that physicians have adequately mastered the basic principles of diagnosis. Diagnosis is one of the essential competencies and a focus of training and assessment at all levels of medical education [9]. But in diagnosing, both students and experienced clinicians often fail to make full use of what is known about disease prevalence, the informativeness of findings [10], or the utility of appropriate treatments [11, 12], and they are resistant to instruction in the relevant principles [13, 14].

If indeed many physicians do not know how to apply diagnostic reasoning appropriately, one might ask why we focus on the heuristic cognitive strategies that may bias diagnosis when we could as profitably focus on physicians’ basic understanding of the principles of diagnosis and of the impact of particular signs and symptoms on diagnoses’ likelihood. One could substitute “diagnosis” for “metacognition” in the statement above:

“… knowing how the diagnostic process works allows us to train students and inexperienced clinicians in its use, ultimately helping them minimize or avoid diagnostic errors.”

That too is a plausible vision. Yet it seems impossible to think simultaneously about both the basic diagnostic skill and the typical human cognitive strategies that can produce errors in applying that skill. In the visual figure/ground illusion (see Figure 1), sometimes we see the columns, other times the spaces between the columns. Our discussions of improving physicians’ diagnoses have recently been addressing only one part of the problem. The project of eliminating cognition-caused misdiagnosis should not focus solely on the figure, the awareness and suppression of heuristic strategies, and neglect the ground. I say this not because the metacognitive approach is wrong, but because it is not easy, not guaranteed to be effective, and not enough.

Figure 1: Angel Columns, by David Barker. (Photograph by Sha Sha Chu, http://www.fotopedia.com/items/flickr-443215138).

If it were easy to stop a habit with metacognitive insight, why do so many still struggle with alcohol, tobacco, anorexia, and obesity? Why is it so difficult for physicians to change their own behaviors that evidence shows are ineffective or harmful [15]? While insight is a legitimate component of psychotherapy, it is known to involve hard work and slow progress.

Why is metacognition cognitively demanding? Like diagnosis, it requires knowing categories and activating them for the present situation through a pattern recognition process. Two distinct modes of applying metacognitive knowledge are implied by the “general” and “specific” levels of metacognition (table 4 of Croskerry’s exposition of the theory [8]). A general-level understanding of the way cognitive heuristics can produce misdiagnoses involves: knowing the categories of error-producing strategy (such as judging a diagnosis’s probability by the representativeness of the patient’s presentation); knowing corresponding strategies to prevent those errors; allocating attention and effort to monitor one’s diagnostic thinking for the possibility that one has used one or more of the general strategies with this particular patient; recognizing which general strategy one has used, in its particular manifestation, and recalling an appropriate general corrective strategy; and modifying that corrective for application in the context of the particular cognitive processes used with this patient.

Metacognition is demanding because the physician would need to learn to recognize many general heuristic strategies. Hogarth [16] described 29, Croskerry [17, 18] listed 33, and Wikipedia’s current “List of cognitive biases” has 61 that affect memory and 95 that distort decision making, belief, and behavior. Clinicians in the ImproveDx conversations have asserted that they will not be learning the full set. Perhaps one could reduce the study load by concentrating on the most common or troublesome categories of cognitive bias and on mastering the most effective debiasing techniques. For example, these cognitive errors are often mentioned: stopping diagnostic search too soon (e.g., premature closure, anchoring on the first hypothesis compounded by confirmation bias), failure to consider prevalence (e.g., base rate neglect), and failure to appreciate evidence’s impact (e.g., pseudodiagnosticity). Correspondingly, there may be just a few powerful corrective strategies: for example, to prevent stopping search too soon, make oneself look at the case differently (take an outsider’s perspective, consider the opposite, use the competing hypotheses heuristic) [19–21].

Besides learning the types of cognitive bias, metacognition at the general level also requires monitoring of one’s everyday diagnosing, whether one is being vigilant for the full Wikipedia list or for a reduced set of the most important. This moment of reflection about each patient, though brief, still adds time and effort to the physician’s day. Though one might be inspired to reflect after hearing a lecture on metacognition, how long will such a commitment persist?

The specific-level mode of applying metacognitive knowledge [8] recognizes that physicians usually encounter familiar patient presentations, and that their knowledge (illness scripts [22, 23] or internalized clinical algorithm trees) is shaped by past study and experience so that strategies are ready to apply once activated through pattern recognition. In Croskerry’s example of the patient bitten by a dog (p. 116 of [8]), the animal bite script should include the pitfall of failing to recognize that a patient is immunocompromised, together with a cognitive forcing strategy for avoiding that pitfall: asking whether any of a list of causes of immunocompromise (e.g., splenectomy) is present. When one has learned the particular pitfalls and corrective strategies of every presentation, there is no perceived effort in the application of metacognition. The work has been done before, learned explicitly and then made automatic through practice, turning explicit System 2 strategies into intuitive System 1 responses [21, 23, 24].

To estimate the burden of learning specific metacognitive knowledge, adding pitfalls and correctives to every script, assume that the physician needs to be able to diagnose 120 presenting complaints [25]. To search systematically for all possible pitfalls using the Wikipedia list, there would be 156 × 120 = 18,720 checks. Surely expert clinical teachers would identify a much smaller set of relevant pitfalls. Still, the student will need to study each explicitly, in medical school or residency, and practice it enough that the corrective strategy comes to mind whenever needed. Whether physicians practice metacognition by monitoring their diagnostic thinking for signs that they may be using heuristics that can produce misdiagnoses, or by learning a large number of specific pitfalls and correctives as part of their education, the project requires a large amount of work.

Some discussions of the promise of metacognition imply that it can control cognitive errors of diagnosis very effectively. This is overly optimistic. Diagnosis is intrinsically hard because the world and our knowledge of it are uncertain [26, 27]. We do not yet know how much the rate of misdiagnosis can be reduced through awareness of cognitive errors [28]. There are wonderful success stories, such as the one related by a resident on ImproveDx: In her third year of medical school, her patient with psychiatric symptoms died; the team had missed a medical illness. Grieving and seeking understanding, she read about cognitive diagnostic errors. In residency she encountered another patient with a psychiatric diagnosis complaining of physical symptoms. This time, the resident did not accept the team’s judgment that “it is not medical,” worked up the patient as appropriate, and found two major medical problems. The resident’s anecdote affirms the value of teaching metacognition: Croskerry [8] had listed some particular pitfalls, including “failing to look for medical problems once a psychiatric diagnosis has been found, unwillingness to accept that patients might have more than one diagnosis,” and even accurately described how availability (a heuristic strategy) can promote the metacognitive recognition of cognitive errors (due to heuristic strategies). Thus the resident was primed, by the trauma of her earlier patient and by her reading on cognitive error, to recognize when the team failed to address the later patient’s medical problems once they had a psychiatric diagnosis.

But here is another anecdote. In my family medicine department’s mortality and morbidity rounds, the residents described a 27-year-old woman with abdominal pain and a mass near the adnexa. The FM residents feared cancer, but the OB/GYN residents declared it an abscess; abscesses are the more common cause of such complaints, even though the configuration of symptoms looked very much like cancer. The patient could not have an investigational procedure until an unrelated condition was resolved. I recognized this as having the same structure as the classic demonstrations of the representativeness heuristic – the case looks more similar to the prototype of the rare disease, though the competitor is more prevalent [29]. I gave a lecture explaining this cognitive error to the residents, though we had to leave open whether our team had been right. Months later, the patient’s investigational procedure revealed she had an endometrioma, neither a cancer nor an abscess. In the language of heuristics and biases, this was a case of “premature closure,” not the “representativeness heuristic.”

The metacognitive theory, just like medical diagnosis, involves fallible pattern recognition. After we know the patient diagnosis, we can accurately categorize the cognitive error. Prospectively, we may misidentify the heuristic and apply an irrelevant corrective strategy. The strategies appropriate for protecting oneself from the neglect of base rate that the representativeness strategy might produce may not be appropriate for preventing premature closure.

If our “figure,” metacognitive awareness, is not easy enough or effective enough, should we instead focus on our “ground,” improving physicians’ mastery of the cognitive mechanics of diagnosis per se? Unfortunately, the same arguments may be made, as many have noted before: learning and applying the principles of diagnosis is not easy and teaching them has had small impact. As with metacognition, there are two distinct modes by which physicians may apply diagnostic principles as they think about their patients (see Table 1). They can apply the general principles to the particular patient, or they can recall and use specific facts and strategies appropriate for the particular patient’s presentation that they have learned before.

Table 1: Categorization of general and specific modes of applying diagnostic norms and metacognition to physicians’ diagnostic reasoning.

| Mode | Formal norms of diagnosis: principles for diagnosing accurately and efficiently | Metacognitive monitoring for cognitive errors of diagnosis |
| --- | --- | --- |
| Learning of general principles (done previously); application work (done currently) | Bayes’ theorem; decision theory based diagnosis | Croskerry’s [8] universal and general levels: cognitive heuristics and debiasing strategies |
| Learning of specific patterns (done previously); current application (may be conscious or automatic) | Scripts; evidence based algorithms | Croskerry’s specific level: pitfalls and cognitive forcing strategies for particular clinical presentations |

Education for the first mode would teach the formal principles of the optimal diagnostic process, training students to: identify relevant diagnoses and assess their prevalence; seek the information that best discriminates among competing disease categories; adjust probabilities for new findings, using Bayes’ theorem (for one finding at a time) or prediction rules (combining multiple findings); and act in light of the disease probabilities to maximize treatment utility, by choosing the most beneficial treatment or by treating all diseases whose probability is over a utility-based threshold. Learning these principles may impose an even greater burden than learning general metacognition. Even physicians who understand the principles well enough to teach them do not often apply them to individual patients, for the numerical inputs for Bayes’ theorem are not easily accessible, prediction rules are hard to find, and it is hard to carry out the calculations in the clinic. A believable case has been made, however, that just knowing the general rules provides useful insight [30, 31].
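The single-finding Bayesian update and the utility-based treatment threshold described above can be sketched in a few lines of code. This is an illustration only; the prevalence, sensitivity, specificity, and threshold values are hypothetical, not clinical recommendations.

```python
# A minimal sketch of the formal diagnostic norms: update the probability
# of a disease with Bayes' theorem for one binary finding, then compare
# the posterior to a utility-based treatment threshold.
# All numbers below are hypothetical illustrations.

def posterior_probability(prior, sensitivity, specificity, test_positive=True):
    """Bayes' theorem for a single binary finding (positive or negative)."""
    if test_positive:
        p_finding_given_disease = sensitivity
        p_finding_given_no_disease = 1 - specificity
    else:
        p_finding_given_disease = 1 - sensitivity
        p_finding_given_no_disease = specificity
    numerator = p_finding_given_disease * prior
    denominator = numerator + p_finding_given_no_disease * (1 - prior)
    return numerator / denominator

# A disease with 2% prevalence; a positive test with 90% sensitivity
# and 95% specificity:
post = posterior_probability(0.02, 0.90, 0.95)
print(f"posterior = {post:.3f}")  # about 0.27, despite the "positive" test

# Act only if the posterior exceeds a (hypothetical) treatment threshold:
TREATMENT_THRESHOLD = 0.20
print("treat" if post > TREATMENT_THRESHOLD else "keep testing")
```

The example also shows why base rate neglect matters: with a 2% prior, even a fairly accurate positive test leaves the disease more likely absent than present.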

To improve diagnostic reasoning through the specific-level mode would require that physicians master particular diagnostic strategies for each clinical presentation, strategies built previously through the application of diagnostic principles to the data, but integrated into the physician’s illness scripts and not requiring current calculations. Acquired through study and supervised case experience, these diagnostic scripts would allow the physician to seek information and draw conclusions based on accurate knowledge of diagnosis-finding relations. To make the principles of diagnosis more accessible to physicians, a number of thinking aids have been proposed, including the use of natural frequencies rather than probabilities [32], graphic representations [33], re-expressions of Bayes’ theorem [34, 35], and their combination (Hamm and Beasley, under review).
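Two of the thinking aids mentioned above can be illustrated concretely: the odds-likelihood re-expression of Bayes’ theorem and the same update framed as natural frequencies. The numbers are hypothetical assumptions, not drawn from the article; both framings yield the same posterior probability.

```python
# Sketch of two proposed thinking aids (hypothetical numbers).

def posterior_odds_form(prior, likelihood_ratio):
    """Odds-likelihood Bayes: pretest odds x LR = post-test odds."""
    pretest_odds = prior / (1 - prior)
    posttest_odds = pretest_odds * likelihood_ratio
    return posttest_odds / (1 + posttest_odds)

def natural_frequencies(prior, sensitivity, specificity, n=1000):
    """Natural-frequency framing: imagine n patients and count outcomes."""
    with_disease = prior * n
    true_pos = sensitivity * with_disease
    false_pos = (1 - specificity) * (n - with_disease)
    return true_pos / (true_pos + false_pos)  # P(disease | positive test)

# Positive test with 90% sensitivity, 95% specificity: LR+ = 0.90/0.05 = 18
lr_positive = 0.90 / (1 - 0.95)
print(f"odds form:           {posterior_odds_form(0.02, lr_positive):.3f}")
print(f"natural frequencies: {natural_frequencies(0.02, 0.90, 0.95):.3f}")
```

In the natural-frequency framing the arithmetic becomes countable: of 1000 patients, 20 have the disease, 18 of them test positive, and about 49 of the 980 without the disease also test positive, so only 18 of 67 positives are true positives.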

I have argued that metacognition is more work than conventionally recognized, and less effective than initially hoped. Based on this, I urged those seeking to improve physician diagnostic reasoning not to neglect improving physicians’ mastery of diagnosis, per se. That too takes hard and sustained work, and may have only a moderate effect [28]. Presumably focusing on both aspects of physician diagnosis, figure and ground, will produce more improvement than focusing on either alone. So I propose another revision of Croskerry’s [8] sentence as an appropriate vision for reducing physician misdiagnosis:

“… knowing how the diagnostic and metacognitive processes work allows us to train students and inexperienced clinicians in their use, ultimately helping them minimize or avoid diagnostic errors.”


Corresponding author: Robert M. Hamm, Department of Family and Preventive Medicine, University of Oklahoma Health Sciences Center, 900 NE 10th Street, Oklahoma City, OK 73104, USA.

The author thanks Jasmina Stevanov for assistance in identifying the source of the figure/ground illustration. The work by David Barker is at the San Francisco Exploratorium, http://exs.exploratorium.edu/exhibits/angel-columns/. The photograph was originally published at http://www.flickr.com/photos/shashachu/443215138/ and is reproduced with kind permission from the photographer, Sha Sha Chu.

Conflict of interest statement: The author declares no conflict of interest.

References

1. Zwaan L, de Bruijne M, Wagner C, Thijs A, Smits M, van der Wal G, et al. Patient record review of the incidence, consequences, and causes of diagnostic adverse events. Arch Intern Med 2010;170:1015–21. doi: 10.1001/archinternmed.2010.146

2. Kirch W, Schafii C. Misdiagnosis at a university hospital in 4 medical eras. Medicine (Baltimore) 1996;75:29–40. doi: 10.1097/00005792-199601000-00004

3. Graber M, Gordon R, Franklin N. Reducing diagnostic errors in medicine: what’s the goal? Acad Med 2002;77:981–92. doi: 10.1097/00001888-200210000-00009

4. Weed LL, Weed L. Medicine in denial. CreateSpace Independent Publishing Platform, 2011.

5. Kahneman D, Slovic P, Tversky A, editors. Judgment under uncertainty: heuristics and biases. New York: Cambridge University Press, 1982. doi: 10.1017/CBO9780511809477

6. Gilovich T, Griffin DW, Kahneman D, editors. Heuristics and biases: the psychology of intuitive judgment. Cambridge, UK: Cambridge University Press, 2002. doi: 10.1017/CBO9780511808098

7. Evans JS, Stanovich KE. Dual-process theories of higher cognition: advancing the debate. Perspect Psychol Sci 2013;8:223–41. doi: 10.1177/1745691612460685

8. Croskerry P. Cognitive forcing strategies in clinical decision making. Ann Emerg Med 2003;41:110–20. doi: 10.1067/mem.2003.22

9. Bergus G, Vogelgesang S, Tansey J, Franklin E, Feld R. Appraising and applying evidence about a diagnostic test during a performance-based assessment. BMC Med Educ 2004;4:20. doi: 10.1186/1472-6920-4-20

10. Beckstead JW, Beckie TM. How much information can metabolic syndrome provide? An application of information theory. Med Decis Making 2011;31:79–92. doi: 10.1177/0272989X10373401

11. Fasoli A, Lucchelli S, Fasoli R. The role of clinical “experience” in diagnostic performance. Med Decis Making 1998;18:163–7. doi: 10.1177/0272989X9801800205

12. Poses RM, Cebul RD, Centor RM. Evaluating physicians’ probabilistic judgments. Med Decis Making 1988;8:233–40. doi: 10.1177/0272989X8800800403

13. Noguchi Y, Matsui K, Imura H, Kiyota M, Fukui T. A traditionally administered short course failed to improve medical students’ diagnostic performance. A quantitative evaluation of diagnostic thinking. J Gen Intern Med 2004;19:427–32. doi: 10.1111/j.1525-1497.2004.30257.x

14. Basinga P, Moreira J, Bisoffi Z, Bisig B, Van den Ende J. Why are clinicians reluctant to treat smear-negative tuberculosis? An inquiry about treatment thresholds in Rwanda. Med Decis Making 2007;27:53–60. doi: 10.1177/0272989X06297104

15. Hamm RM. Irrational persistence in belief. In: Kattan MW, editor. Encyclopedia of medical decision making. Thousand Oaks, CA: Sage Publications, 2009:640–4.

16. Hogarth RM. Judgement and choice: the psychology of decision. New York: John Wiley & Sons, 1980.

17. Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med 2003;78:775–80. doi: 10.1097/00001888-200308000-00003

18. Campbell SG, Croskerry P, Bond WF. Profiles in patient safety: a “perfect storm” in the emergency department. Acad Emerg Med 2007;14:743–9.

19. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 2: impediments to and strategies for change. BMJ Qual Saf 2013. Epub 2013/09/03. doi: 10.1136/bmjqs-2012-001713

20. Arkes HR. Costs and benefits of judgment errors: implications for debiasing. Psychol Bull 1991;110:486–98. doi: 10.1037/0033-2909.110.3.486

21. Larrick RP. Debiasing. In: Koehler DJ, Harvey N, editors. Blackwell handbook of judgment and decision making. Malden, MA: Blackwell Publishers, 2004:316–37.

22. Schmidt HG, Norman GR, Boshuizen HP. A cognitive perspective on medical expertise: theory and implication. Acad Med 1990;65:611–21. doi: 10.1097/00001888-199010000-00001

23. Abernathy CM, Hamm RM. Surgical intuition. Philadelphia, PA: Hanley and Belfus, 1995.

24. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 1: origins of bias and theory of debiasing. BMJ Qual Saf 2013;22:ii58–64. doi: 10.1136/bmjqs-2012-001712

25. Woloschuk W, Harasym P, Mandin H, Jones A. Use of scheme-based problem solving: an evaluation of the implementation and utilization of schemes in a clinical presentation curriculum. Med Educ 2000;34:437–42. doi: 10.1046/j.1365-2923.2000.00572.x

26. Gordon R, Franklin N. Cognitive underpinnings of diagnostic error. Acad Med 2003;78:782. doi: 10.1097/00001888-200308000-00005

27. Bursztajn H, Feinbloom RI, Hamm RM, Brodsky A. Medical choices, medical chances. New York: Delacorte, 1981.

28. Graber ML, Kissam S, Payne VL, Meyer AN, Sorensen A, Lenfestey N, et al. Cognitive interventions to reduce diagnostic error: a narrative review. BMJ Qual Saf 2012;21:535–57. doi: 10.1136/bmjqs-2011-000149

29. Bar-Hillel M. The base rate fallacy in probability judgments. Acta Psychol 1980;44:211–33. doi: 10.1016/0001-6918(80)90046-3

30. Fong GT, Krantz DH, Nisbett RE. The effects of statistical training on thinking about everyday problems. Cognit Psychol 1986;18:253–92. doi: 10.1016/0010-0285(86)90001-0

31. Brown RV. Decision theory as an aid to private choice. Judgm Decis Mak 2012;7:207–23. doi: 10.1017/S1930297500003041

32. Gigerenzer G, Hoffrage U. How to improve Bayesian reasoning without instruction: frequency formats. Psychol Rev 1995;102:684–704. doi: 10.1037/0033-295X.102.4.684

33. Sedlmeier P. Improving statistical reasoning: theoretical models and practical implications. Mahwah, NJ: Lawrence Erlbaum Associates, 1999. doi: 10.4324/9781410601247

34. Van Puymbroeck H, Remmen R, Denekens J, Scherpbier A, Bisoffi Z, Van den Ende J. Teaching problem solving and decision making in undergraduate medical education: an instructional strategy. Med Teach 2003;25:547–50. doi: 10.1080/0142159031000137508

35. Van den Ende J, Bisoffi Z, Van Puymbroek H, Van der Stuyft P, Van Gompel A, Derese A, et al. Bridging the gap between clinical practice and diagnostic clinical epidemiology: pilot experiences with a didactic model based on a logarithmic scale. J Eval Clin Pract 2007;13:374–80. doi: 10.1111/j.1365-2753.2006.00710.x

Received: 2013-9-13
Accepted: 2013-11-12
Published Online: 2014-01-08
Published in Print: 2014-01-01

©2014 by Walter de Gruyter Berlin/Boston

