
Progress understanding diagnosis and diagnostic errors: thoughts at year 10

Mark L. Graber
From the journal Diagnosis

“You can observe a lot by just watching” – Yogi Berra

Baseball Umpire #1: “I call ‘em as I see ‘em”

Baseball Umpire #2: “I call ‘em as they are”

Baseball Umpire #3: “They ain’t nothin’ till I call ‘em”

– Richard E. Nisbett, Mindware – Tools for Smart Thinking

A decade ago, diagnostic errors represented a blind spot in health care – a problem that had somehow been overlooked in the early discussions of patient safety [1]. Thankfully, the past few years have seen an explosion of interest in the diagnostic error problem, and in understanding the nature of diagnosis itself. In the decade from 2000 to 2010, at best a handful of articles on these topics appeared each year; now, 10 years later, hundreds of publications appear annually, including the landmark 2015 report from the National Academy of Medicine, Improving Diagnosis in Health Care [2]. Diagnostic error has clearly emerged from the shadows: the National Academy report designated diagnostic error an urgent national concern, and for the past 3 years the ECRI Institute has ranked it the #1 patient safety problem in healthcare today [3].

At this 10-year milestone, we are delighted to publish this special edition of our DIAGNOSIS journal, which presents a unique and very different view of diagnosis from the information-processing framework we have been using extensively for the past decade (Figure 1). The articles in this special issue approach diagnosis from a sociological perspective. Annemarie Jutel and Kevin Dew define this perspective as one “that considers how individuals function within a wider social context” [4]. In this vein, diagnosis is a social activity with important social consequences, and fully understanding diagnosis requires that we examine how the process evolves in a particular social context. In particular, we will be looking at diagnosis from the perspective of ‘situativity’, referring to a family of related social theories focusing specifically on cognition and the clinical reasoning process [5], [6]. From this “situated” perspective, observing how diagnosis evolves is a key methodological approach to studying the diagnostic process.

Figure 1: Information-processing theory clarifies how a diagnosis is made through both the intuitive (System 1) and conscious, rational (System 2) pathways. These components of clinical decision-making make up the central elements in the situativity perspective of diagnosis, which portrays how the clinical reasoning steps take place in a particular context and are influenced by other elements of cognition, the environment, the work system, and the many actors and interactions that can be involved in the process.

From the perspective of information-processing theory, diagnosis was understood to be a sense-making, clinical reasoning process ‘in the head’ of the physician. In contrast, the situativity perspective emphasizes that diagnosis takes place ‘in the world’, and that our cognition is shaped by our interactions with others and with the resources in our environment. The clinical reasoning steps (‘in the head’) are still there and remain a critical part of the diagnostic process, but they are embedded within the situation, meaning that the environment, the other participants, and their interactions are key to arriving at the diagnosis. Thus the clinical reasoning steps retain their central importance as the place where the diagnosis will ultimately be declared, but situativity explores in much greater depth how this process actually works, and identifies and clarifies how the various steps are connected to, and influenced by, many other elements and factors. Situativity envisions the mind as “embodied (i.e., interacting with the body), embedded (i.e., interacting with the environment) and extended (i.e., interacting with other people and artefacts in larger systems)” [5].

Observing and characterizing diagnosis from this situativity perspective is important for at least three reasons: First, it changes our world view of diagnosis and how we understand the process. From this new perspective we consider diagnosis in a more holistic sense, taking into account how the ‘in the head’ clinical reasoning steps are influenced by the contextual and other social factors in play at the time. Second, it uses different methods to study the diagnostic process and its outcomes. And finally, this family of social cognitive theories emphasizes a different and much broader set of interventions to improve diagnosis and address diagnostic error. It asks us to consider all of the system-related factors that are relevant to diagnostic quality and safety, as well as the many opportunities to improve the cognitive aspects of diagnosis through its connections to the broader world.

The situativity perspective expands our world view of diagnosis and helps us understand the diagnostic process in its entirety

For starters, the situativity perspective opens up and unpacks the clinical reasoning process per se to allow a much more detailed examination. Michael Soh and colleagues describe 26 discrete elements of this one ‘step’, and study how clinicians move through this sequence during their clinical reasoning [7].

The National Academy of Medicine used a socio-technical perspective to portray diagnosis as a process, embedded in a particular environment (Figure 2) [2], [8], [9].

Figure 2: The diagnostic process, according to the National Academy of Medicine [2].

The situativity perspective emphasizes how each of these elements interacts in ultimately determining the success or failure of the process, how easy or hard it is, and how quickly or slowly it proceeds. If this is the map, the situativity theories provide the details: what each step involves, and what it all means.

The information-processing theory perspective focuses on just two parties, the doctor and the patient. In contrast, situativity portrays diagnosis as taking place in a ‘crowded room’ [10], with a wide range of individuals and influential factors potentially involved. The #1 recommendation to address diagnostic error in the National Academy of Medicine report was to improve teamwork in the diagnostic process [2]. This begins by involving the patient and family, making them full partners in establishing the diagnosis, and extends to all of the other individuals who touch the patient [11], [12], [13] (Figure 3). The article by Andrew Olson and colleagues in this issue explores the teamwork-in-diagnosis concept from the situativity perspective. This new concept visualizes diagnosis as something that emerges from the team; the clinical reasoning process happens collectively, not individually [14].

Figure 3: The importance of teams in the diagnostic process. Image by Pascale Carayon.

The work environment also plays a critical role, as diagnosis always takes place in a particular setting with its own particular resources and constraints (Figure 4). As examples, diagnosis will be constrained if the appropriate testing or imaging modalities aren’t readily available, if the ‘next available’ appointment to see a specialist is months away, if the time allocated for an appointment is too short, etc. Diagnosis will be enhanced if communication skills and pathways are well established, if the ‘team’ is functioning well, if electronic knowledge resources are easily accessed, if the local culture and learning pathways are favorable, etc. The electronic medical record is one of the most influential factors in this regard; many functionalities in the EMR greatly enhance the timeliness and reliability of diagnosis, and at the same time there are design flaws and unintended consequences that can greatly inhibit the process, or lead directly to lapses in care [15], [16].

Figure 4: Elements of the environment that influence diagnosis. Image by Pascale Carayon.

A fascinating aspect of viewing diagnosis from the situativity perspective is the realization that the diagnostic encounter as experienced by the doctor and as experienced by the patient can be two very different things, and neither one corresponds exactly to what actually transpired. The two parties perceive events from their own frames of reference, and these may differ markedly. This is the medical version of ‘selective perception’. An excellent illustration of this is Jonathan Howard’s description of a Princeton-Dartmouth football game in 1951 [17].

The Princeton vs. Dartmouth football game of 1951

The two schools have a longstanding rivalry, and the game was hotly contested and rough, with multiple penalties and injuries. After the game, naturally, Dartmouth fans blamed Princeton for the violence, and Princeton fans blamed Dartmouth. Hastorf and Cantril conducted an interesting experiment to better document the phenomenon of selective perception: Students from the two schools were asked to review the exact same clips from the game and provide their opinions on whether it was “rough and dirty” or “clean and fair,” and who was to blame for the offenses. Despite viewing the exact same footage, students from Princeton overwhelmingly faulted Dartmouth for the game’s violence, and vice versa, confirming the phenomenon of selective perception [18].

In the diagnostic process, a related phenomenon can be found in how differently the doctor and the patient perceive the same diagnostic encounter [4], or in how differently two radiologists may read the same X-ray images. In the language of phenomenology and ethnography, “reality is defined as what individuals perceive or understand about objects or events, not the properties they may have on their own” [19]. Getting back to Umpire #3, in this world view a diagnosis is not just the disease that a patient ‘has’; it is the co-created name assigned by the physician to the patient’s illness, emphasizing its sociological nature [20].

One of the most important aspects of situativity is its emphasis on ‘context’. Pat Croskerry describes two types of contextual influence: adjacent and situational. Situational factors include things like the setting of the diagnostic process, the experience of the provider, the number of interruptions, etc. Adjacent factors include patient-specific elements, such as age, gender, or ethnic background, or the past history and associated symptoms. The exact same complaint can mean two entirely different things depending on the context in which it presents: A patient complaining of right-sided flank pain will be perceived and diagnosed very differently if he also complains of hematuria (he has a kidney stone) vs. having a dermatomal vesicular eruption (he has shingles). Pat Croskerry’s paper on this phenomenon, ‘Context is Everything’, should be required reading for all future clinicians [21]. In their article in this issue, Alan Charney and Jordan Dourmashkin emphasize the critical need to consider context when interpreting lab results [22], and several other articles touch on the importance of contextual issues in diagnosis more generally. Context is everything.

The situativity perspective has major implications for both education and assessment. A central goal of healthcare education is to achieve competency in clinical reasoning, and most current assessment instruments focus on just the ‘in the head’ aspects of this process. However, recent recommendations envision an expanded conceptualization of competency in diagnosis that includes team-based and system-relevant elements [23]. The situativity perspective includes these elements, and many others. As Dario Torre and colleagues point out in their article, teaching clinical reasoning from the situativity perspective will be very different from learning from a printed case scenario [24]. Assessment will need to evolve accordingly; the articles in this issue by Joe Rencic and colleagues and by Lambert Schuwirth and colleagues begin to explore this challenge: How should we assess clinical reasoning from a situativity perspective [25], [26], [27]? Direct observation of clinical reasoning abilities, as demonstrated in the article by Brian Garibaldi and colleagues, is likely to be much more informative than the multiple-choice, paper-based evaluations that we rely on today [28].

The social cognitive theory (situativity) perspective uses different methods to study the diagnostic process: observing in place of listening, and learning from stories

Arthur Elstein’s work formed the foundation for our current understanding of ‘how doctors think’, based on the ‘think aloud’ protocol he used as physicians solved diagnostic cases [29]. In most cases when their diagnosis was correct, they simply recognized it from the features in the case; constructing a differential diagnosis was the exception. Similarly, the remarkable series of case discussions by Jerome Kassirer and Stephen Pauker in the New England Journal of Medicine provided complementary insights into how doctors think, by walking the reader, step by step, through the clinical reasoning process that culminated in the correct diagnosis [30]. Behavioral scientists had already clarified the cognitive underpinnings of decision-making, and thanks largely to the pioneering efforts of Pat Croskerry, we now understand that diagnostic reasoning uses the same two cognitive capabilities: the ‘fast’, intuitive (System 1) pathway, and the ‘slow’, conscious, rational (System 2) pathway [31]. The dual-pathway model is the central element of the information-processing theory of diagnosis. It is easy to grasp, it makes sense, and it has been especially valuable in explaining many aspects of both clinical reasoning and the diagnostic errors seen in clinical practice.

The information-processing theory framework evolved primarily from listening to what doctors said about diagnosis. In contrast, this family of social cognitive theories focuses on observing what transpires during the diagnostic process. No less an authority than Albert Einstein espoused the value of learning about cognition from observation. In a lecture on the methods of theoretical physics, he advised: “If you want to find out anything from the theoretical physicists about the methods they use, I advise you to stick closely to one principle: don’t listen to their words, fix your attention on their deeds” [32].

Gary Klein’s foundational studies on what he terms naturalistic decision-making were based on observation. He studied firefighters, aircraft-carrier commanders, nurses and others, but instead of asking people to verbalize their problem-solving process, he studied what they actually did [33]. Although physicians weren’t studied directly, his model of intuitive, ‘recognition-primed’ decision-making seemed immediately applicable to diagnosis, and in sync with Arthur Elstein’s observation of how clinicians solve diagnostic problems intuitively, or by “System 1” processing in the lingo of the dual-processing paradigm. Gigerenzer and colleagues, in their descriptions of ‘bounded rationality’, reached similar conclusions from their experimental studies of decision making, finding that ‘satisficing’ (originally described by Simon [34]) was common, typically produced correct answers, and that many decisions were reached without full consideration of other possibilities [35].

A criticism of the ‘learn from listening’ approach is that in fact we may not be capable of knowing how our own thoughts arise. What physicians say about their clinical reasoning may be a post-hoc effort to justify their decisions. It is reassuring, however, that the ‘listening’ approach and the observational approach seem to agree that the diagnosis, at the expert level, very often arises from pattern recognition at an intuitive level.

Vineet Chopra in his article describes the value of using ethnography, and focused ethnography, to study diagnosis [36]: “In ethnography, researchers embed themselves (over an extended period of time) into the social world of participants so as to better understand behaviors, organizations, communities, (sub)cultures, and society.” Focused ethnography examines a particular topic or question, and the article by Mindaugas Briedis in this issue, which looks at diagnosis in radiology, is a fascinating example [19].

Small learning – from stories

Another thing to like about situativity is being able to learn about the diagnostic process from patient and provider stories. Many of the stories are tragic, some are lighter, but they are all engaging. In contrast to the nature of information emerging from the study of ‘big data’, each of these stories is a ‘small lesson’ in how to improve diagnosis, learning from each patient, one at a time [37], [38]. Some of the most interesting are the stories that physicians tell about their own diagnostic errors [39], or their encounters with the healthcare system in their own diagnostic journey [40].

With autopsies having largely disappeared as a way to learn from error, stories may be the next-best thing. Each one is real, credible, and compelling. Relating to situativity, patients have been able to map out where along the diagnostic process the breakdowns in their care occurred, and to explore the lessons that should be learned from these exercises by asking “What if … things had been done differently?” [41].

The Society to Improve Diagnosis in Medicine (SIDM) has a growing collection of patient stories [42], and many hundreds more can be found through an online search. On our ‘wish list’ of ways to improve diagnosis would be some better way to promote learning from patient stories. Their value could be greatly enhanced if they were aggregated somehow, so that the lessons could be more easily found and incorporated into practice. Gordy Schiff and colleagues are beginning this effort in their “PRIDE” (Primary Care Research in Diagnosis Errors) project, which will collect, analyze, and categorize cases for storage in a central repository [43]. An unmet need is how to bring these lessons back into healthcare education – each case provides a little gem of knowledge, a pitfall to be avoided, but that value will be lost unless the cases are somehow incorporated into learning.

Situativity can help us understand and address diagnostic errors

Direct observation of care, and video-recordings of care [44], have already produced invaluable insights into the quality of diagnostic care in the real world, and this information might not have surfaced any other way. Andrews et al. found that 18% of observed hospitalized patients experienced a serious adverse safety event [45]. Similar findings were observed in another study, including instances of diagnostic error [46]. Carl Berdahl and colleagues observed 240 diagnostic encounters involving 12 different Emergency Department physicians; roughly twice as many physical findings and elements of the ‘review of systems’ were documented as were actually performed (or asked about) during the visit [47]. Because diagnosis depends so critically on the facts in a particular case, this demonstration of errors in clinical documentation is particularly alarming.

A related approach is to use ‘unannounced standardized patients’ (USPs) to learn about the diagnostic process [48], [49], [50]. These are individuals trained to portray a particular condition, relate their story in a reproducible way, and observe the response. Quoting Schwartz et al.: “sending standardized patients into clinical practice settings incognito is the gold standard (for assessment) of physician performance” [51], [52], [53]. As an example, Alan Schwartz, Saul Weiner and Amy Binns-Calvey identified clear instances of ‘contextual error’ using standardized patients to observe outpatient care: Physicians in practice commented on or pursued a patient’s mentioning “Boy, it’s been tough since I lost my job” less than half of the time in cases where this might have been relevant to determining the diagnosis [51]. Their article in this issue used USPs to study depression screening in actual practice, and whether feedback to the providers on their own performance and the performance of their group had any impact [54]. Their results showed that screening was performed only half of the time, but improved to nearly 70% after feedback. In addition, the study found that elements of the depression screen were often performed but not documented, and, in an even larger fraction of cases, documented but never performed. A separate report from Jeffrey Wilhite and colleagues used USPs to study the behaviors of internal medicine residents asked to evaluate a patient with recurring exacerbations of asthma [55]. Over a third of the residents failed to elicit the fact that the patient lived in public housing with a severe mold problem, and only 10% addressed all four of the desirable elements of probing for social determinants of health (eliciting the appropriate information, exploring and acknowledging these issues, responding by providing resources or referrals, and documenting appropriately). Consistent with the many other studies using USPs, documentation of the care encounter was sub-optimal: Even when the relevant social history was obtained, it was documented in less than half of cases.

These kinds of studies are eye-popping in their impact, and will forever change the way we look at data derived from surveys or from administrative data, which are many steps removed from what actually transpires in practice. The shortcomings of documentation, in particular, are noteworthy, and the article by Carl Berdahl and David Schriger in this issue explores the implications for future research work [56].

Several articles in this special edition illustrate the kinds of research that situativity enables. The studies by Abby Konopasky and colleagues provide illustrative examples [57], including an exploration of how contextual factors change the language of diagnosis [58], and how contextual factors change the likelihood of arriving at the correct diagnosis (the phenomenon of context specificity). A study by Divya Ramani and colleagues explores how uncertainty interacts with contextual factors in the clinical reasoning process [59]. Marcia Docherty and colleagues report their initial findings from observing senior ER residents [60]. The results suggest that even brief observation periods may be able to distinguish minimal from more advanced levels of competency by examining how effectively contextual issues are explored.

Conclusions

In his address to the Diagnostic Error in Medicine conference in 2011, “What Physicians Can Learn from Firefighters”, Gary Klein suggested that we could improve diagnosis by both reducing errors and increasing insights and expertise (Figure 5). Increasing expertise is a matter for education, but the study of situativity seems to provide unique opportunities for advancing the twin goals of reducing mistakes and increasing insights. The next decade will hopefully see a great deal of new work in this direction.

Figure 5: Performance as a function of increasing insights and expertise while reducing mistakes. Image by Gary Klein.


Corresponding author: Mark L. Graber, Society to Improve Diagnosis in Medicine, Evanston, IL, USA, E-mail:

  1. Research funding: None declared.

  2. Author contributions: The author has accepted responsibility for the entire content of this manuscript and approved its submission.

  3. Competing interests: Author states no conflict of interest.

References

1. Graber, M. Diagnostic errors in medicine: a case of neglect. Jt Comm J Qual Patient Saf 2005;31:106–13. https://doi.org/10.1016/S1553-7250(05)31015-4.

2. Balogh, E, Miller, B, Ball, J. Improving diagnosis in health care. Washington, DC: National Academy of Medicine; 2015. https://doi.org/10.17226/21794.

3. ECRI Institute. Top 10 patient safety concerns 2020; 2020. Available from: https://www.ecri.org.

4. Jutel, A, Dew, K. Social issues in diagnosis: an introduction for students and clinicians. Baltimore, MD: Johns Hopkins University Press; 2014.

5. Merkebu, J, Battistone, M, McMains, K, McOwen, K, Witkop, C, Konopasky, A, et al. Situativity: a family of social cognitive theories for clinical reasoning and error. Diagnosis (Berl) 2020;7:169–76. https://doi.org/10.1515/dx-2019-0100.

6. Daniel, M, Wilson, E, Seifert, C, Durning, S, Holmboe, E, Rencic, J, et al. Expanding boundaries: a transtheoretical model of clinical reasoning and diagnostic error. Diagnosis (Berl) 2020;7:333–5. https://doi.org/10.1515/dx-2019-0102.

7. Soh, M, Konopasky, A, Durning, S, Ramani, D, McBee, E, Ratcliffe, T, et al. Sequence matters: patterns in task-based clinical reasoning. Diagnosis (Berl) 2020;7:281–9. https://doi.org/10.1515/dx-2019-0095.

8. Carayon, P, Schoofs Hundt, A, Karsh, B-T, Gurses, AP, Alvarado, CJ, Smith, M, et al. Work system design for patient safety: the SEIPS model. Qual Saf Health Care 2006;15(Suppl 1):i50–8. https://doi.org/10.1136/qshc.2005.015842.

9. Sittig, D, Singh, H. A new sociotechnical model for studying health information technology in complex adaptive healthcare systems. Qual Saf Health Care 2010;19:i68–74. https://doi.org/10.1136/qshc.2010.042085.

10. Ebeling, M. The promotion of marketing-mediated diagnosis: turning patients into consumers. In: Jutel, A, Dew, K, editors. Social issues in diagnosis. Baltimore, MD: Johns Hopkins University Press; 2014:134–50.

11. McDonald, K. The diagnostic field’s players and interactions: from the inside out. Diagnosis (Berl) 2014;1:55–8. https://doi.org/10.1515/dx-2013-0023.

12. McDonald, K, Bryce, C, Graber, M. The patient is in: patient involvement strategies for diagnostic error mitigation. BMJ Qual Saf 2013;22:ii33–9. https://doi.org/10.1136/bmjqs-2012-001623.

13. Graber, M, Rusz, D, Jones, M, Farm-Franks, D, Jones, B, Gluck, JC, et al. The new diagnostic team. Diagnosis (Berl) 2017;4:225–38. https://doi.org/10.1515/dx-2017-0022.

14. Olson, A, Durning, S, Fernandez, B, Sick, B, Lane, K, Rencic, J. Teamwork in clinical reasoning – cooperative or parallel play? Diagnosis (Berl) 2020;72:1595–3602. https://doi.org/10.1515/dx-2020-0020.

15. Graber, M, Byrne, C, Johnston, D. The impact of electronic health records on diagnosis. Diagnosis (Berl) 2017;4:211–23. https://doi.org/10.1515/dx-2017-0012.

16. El-Kareh, R, Hasan, O, Schiff, G. Use of health information technology to reduce diagnostic error. BMJ Qual Saf 2013;22:ii40–4. https://doi.org/10.1136/bmjqs-2013-001884.

17. Howard, J. Cognitive errors and diagnostic mistakes. Springer International Publishing; 2019. https://doi.org/10.1007/978-3-319-93224-8.

18. Hastorf, A, Cantril, H. They saw a game; a case study. J Abnorm Psychol 1954;49:129–34. https://doi.org/10.4324/9780203496398-30.

19. Briedis, M, Briediene, R. Phenomenological analysis of diagnostic radiology: description and relevance to diagnostic errors. Diagnosis (Berl) 2020;7:215–25. https://doi.org/10.1515/dx-2019-0073.

20. Jutel, A. Putting a name to it. Baltimore, MD: Johns Hopkins University Press; 2011.

21. Croskerry, P. Context is everything or how could I have been that stupid? Special issue: understanding decision-making in healthcare and the law. Healthc Q 2009;12:167–73. https://doi.org/10.12927/hcq.2009.20945.

22. Charney, A, Dourmashkin, J. Interpreting clinical and laboratory tests: importance and implications of context. Diagnosis (Berl) 2020. https://doi.org/10.1515/dx-2019-0009 [Epub ahead of print].

23. Olson, A, Rencic, J, Cosby, K, Rusz, D, Papa, F, Croskerry, P, et al. Competencies for improving diagnosis: an interprofessional framework for education and training in healthcare. Diagnosis (Berl) 2019;6:335–41. https://doi.org/10.1515/dx-2018-0107.

24. Torre, D, Durning, S, Rencic, J, Lang, V, Holmboe, E, Daniel, M. Widening the lens on teaching and assessing clinical reasoning: from “in the head” to “out in the world”. Diagnosis (Berl) 2020;7:181–90. https://doi.org/10.1515/dx-2019-0098.

25. Schuwirth, L, Durning, S, King, S. Assessment of clinical reasoning: three evolutions of thought. Diagnosis (Berl) 2020;7:191–6. https://doi.org/10.1515/dx-2019-0096.

26. Rencic, J, Schuwirth, L, Gruppen, L, Durning, S. Clinical reasoning performance assessment: using situated cognition theory as a conceptual framework. Diagnosis (Berl) 2020;7:241–9. https://doi.org/10.1515/dx-2019-0051.

27. Rencic, J, Schuwirth, L, Gruppen, L, Durning, S. A situated cognition model for clinical reasoning performance assessment: a narrative review. Diagnosis (Berl) 2020;7:227–40. https://doi.org/10.1515/dx-2019-0106.

28. Bennett, C, Niessen, T, Desai, S, Garibaldi, B. Assessing physical examination skills using direct observation and volunteer patients. Diagnosis (Berl) 2020. https://doi.org/10.1515/dx-2019-0089 [Epub ahead of print].

29. Elstein, AS. Clinical reasoning in medicine. In: Higgs, J, Jones, M, editors. Clinical reasoning in the health professions. Oxford, England: Butterworth-Heinemann; 1995:49–59.

30. Kassirer, JP. Teaching clinical reasoning: case-based and coached. Acad Med 2010;85:1118–24. https://doi.org/10.1097/ACM.0b013e3181d5dd0d.

31. Croskerry, P. A universal model of diagnostic reasoning. Acad Med 2009;84:1022–8. https://doi.org/10.1097/ACM.0b013e3181ace703.

32. Einstein, A. On the method of theoretical physics. Philos Sci 1933;1:163–9. https://doi.org/10.1086/286316.

33. Klein, G. Sources of power: how people make decisions. Cambridge, MA: The MIT Press; 1998.

34. Simon, HA. Invariants of human behavior. Annu Rev Psychol 1990;41:1–20. https://doi.org/10.1146/annurev.ps.41.020190.000245.

35. Gigerenzer, G, Goldstein, DG. Reasoning the fast and frugal way: models of bounded rationality. Psychol Rev 1996;103:650–69. https://doi.org/10.1093/acprof:oso/9780199744282.003.0002.

36. Chopra, V. Focused ethnography: a new tool to study diagnostic errors? Diagnosis (Berl) 2020;7:211–4. https://doi.org/10.1515/dx-2020-0009.

37. Dhaliwal, G, Shojania, K. The data of diagnostic error: big, large, and small. BMJ Qual Saf 2018;27:499–501. https://doi.org/10.1136/bmjqs-2018-007917.

38. Sacristan, J, Dilla, T. No big data without small data: learning health care systems begin and end with the individual patient. J Eval Clin Pract 2015;21:1014–17. https://doi.org/10.1111/jep.12350.

39. Smulowitz, P. The illusion of perfection. BMJ Qual Saf 2019. https://doi.org/10.1136/bmjqs-2019-010501.

40. Reilly, B. The spy who came in with a cold. N Engl J Med 2019;380:292–5. https://doi.org/10.1056/NEJMms1810861.

41. Sheridan, S, Merryweather, P, Rusz, D, Schiff, G. What if? Transforming diagnostic research by leveraging a diagnostic process map to engage patients in learning from error. Washington, DC; 2020. https://doi.org/10.31478/202002a.

42. Patient stories; 2020. Available from: https://www.improvediagnosis.org/stories/.

43. Betsy Lehman Center. Improving diagnosis in primary care one case at a time; 2020. Available from: https://betsylehmancenterma.gov/news/case-reports-anchor-a-learning-network-for-better-diagnosis-in-primary-care.

44. Brogaard, L, Uldbjerg, N. Filming for auditing of real-life emergency teams: a systematic review. BMJ Open Quality 2019;8:e000588. https://doi.org/10.1136/bmjoq-2018-000588.

45. Andrews, L, Stocking, C, Krizek, T, Gotlieb, L, Krizek, C, Vargis, T, et al. An alternative strategy for studying adverse events in medical care. Lancet 1997;349:309–13. https://doi.org/10.1016/S0140-6736(96)08268-2.

46. Lamba, AR, Linn, K, Fletcher, K. Identifying patient safety problems during team rounds: an ethnographic study. BMJ Qual Saf 2014;24:667–9. https://doi.org/10.1136/bmjqs-2013-002324.

47. Berdahl, C, Moran, G, McBride, O, Santini, A, Verzhbinsky, B, Schriger, D. Concordance between electronic clinical documentation and physicians’ observed behavior. JAMA Netw Open 2019;2:e1911390. https://doi.org/10.1001/jamanetworkopen.2019.11390.

48. Zabar, S, Gillespie, C, Hanley, K, Kalet, A. Directly observed care: can unannounced standardized patients address a gap in performance measurement? J Gen Intern Med 2014;29:1439. https://doi.org/10.1007/s11606-014-3004-9.

49. Weiner, S, Schwartz, A. Contextual errors in medical decision making: overlooked and understudied. Acad Med 2016;91:657–62. https://doi.org/10.1097/ACM.0000000000001017.

50. Weiner, S, Schwartz, A. Directly observed care: can unannounced standardized patients address a gap in performance measurement? J Gen Intern Med 2014;29:1183–7. https://doi.org/10.1007/s11606-014-2860-7.

51. Schwartz, A, Weiner, S, Binns-Calvey, A. Comparing announced with unannounced standardized patients in performance assessment. Jt Comm J Qual Patient Saf 2013;39:83–8. https://doi.org/10.1016/S1553-7250(13)39012-6.

52. Glassman, P, Luck, J, O’Gara, E, Peabody, J. Using standardized patients to measure quality: evidence from the literature and a prospective study. Jt Comm J Qual Patient Saf 2000;26:644–53. https://doi.org/10.1016/S1070-3241(00)26055-0.

53. Peabody, JW, Luck, J, Glassman, P, Jain, S, Hansen, J, Spell, M, et al. Measuring the quality of physician practice by using clinical vignettes: a prospective validation study. Ann Intern Med 2004;141:771–80. https://doi.org/10.7326/0003-4819-141-10-200411160-00008.

54. Schwartz, A, Peskin, S, Spiro, A, Weiner, S. Direct observation of depression screening: identifying diagnostic error and improving accuracy through unannounced standardized patients. Diagnosis (Berl) 2020. https://doi.org/10.1515/dx-2019-0110 [Epub ahead of print].

55. Wilhite, J, Hardowar, K, Fisher, H, Porter, B, Wallach, A, Altshuler, L, et al. Clinical problem solving and social determinants of health: a descriptive study using unannounced standardized patients to directly observe how resident physicians respond to social determinants of health. Diagnosis (Berl) 2020;7:313–24. https://doi.org/10.1515/dx-2020-0002.

56. Berdahl, C, Schriger, D. Study design and ethical considerations related to using direct observation to evaluate physician behavior: reflections after a recent study. Diagnosis (Berl) 2020;7:205–9. https://doi.org/10.1515/dx-2020-0029.

57. Konopasky, A, Artino, A, Battista, A, Ohmer, M, Hemmer, P, Torre, D, et al. Understanding context specificity: the effect of contextual factors on clinical reasoning. Diagnosis (Berl) 2020;7:257–64. https://doi.org/10.1515/dx-2020-0016.

58. Konopasky, A, Durning, S, Artino, AR, Ramani, D, Battista, A. The linguistic effects of context specificity: exploring affect, cognitive processing, and agency in physicians’ think-aloud reflections. Diagnosis (Berl) 2020;7:273–80. https://doi.org/10.1515/dx-2019-0103.

59. Ramani, D, Soh, M, Merkebu, J, Durning, S, Battista, A, McBee, E, et al. Examining the patterns of uncertainty across clinical reasoning tasks: effects of contextual factors on clinical reasoning process. Diagnosis (Berl) 2020;7:299–305. https://doi.org/10.1515/dx-2020-0019.

60. Docherty, M. Sociocultural learning in emergency medicine: a holistic examination of competence. Diagnosis (Berl) 2020;7:325–32. https://doi.org/10.1515/dx-2020-0001.

Received: 2020-04-23
Accepted: 2020-05-02
Published Online: 2020-07-06
Published in Print: 2020-08-27

© 2020 Walter de Gruyter GmbH, Berlin/Boston
