

Official Journal of the Society to Improve Diagnosis in Medicine (SIDM)

A workshop to train medicine faculty to teach clinical reasoning

Verity Schaye (corresponding author: Department of Medicine, New York University School of Medicine, 462 1st Avenue, New York, NY 10016, USA; NYC Health and Hospitals/Bellevue, New York, NY, USA) / Michael Janjigian / Kevin Hauck / Neil Shapiro / Daniel Becker / Penelope Lusk / Khemraj Hardowar / Sondra Zabar / Anne Dembitzer
Published Online: 2019-03-09 | DOI: https://doi.org/10.1515/dx-2018-0059



Background: Clinical reasoning (CR) is a core competency in medical education. Few studies have examined efforts to train faculty to teach CR and lead CR curricula in medical schools and residencies. In this report, we describe the development and preliminary evaluation of a faculty development workshop, grounded in CR theory, to train faculty to teach CR.


Methods: Twenty-six medicine faculty (nine hospitalists and 17 subspecialists) participated in a workshop that introduced a framework for teaching CR through an interactive, case-based didactic followed by role-play exercises. Faculty participated in pre- and post-workshop Group Observed Structured Teaching Exercises (GOSTEs), completed retrospective pre-post assessments (RPPs), and made commitment to change statements (CTCs).


Results: In the post-GOSTE, participants significantly improved in their use of problem representation and illness scripts to teach CR. RPPs revealed that faculty were more confident in their ability to teach CR and more likely to do so using educational strategies grounded in CR theory. At 2-month follow-up, 81% of participants reported partially implementing these teaching techniques.


Conclusions: After participating in this 3-h workshop, faculty demonstrated increased ability to use these teaching techniques and expressed greater confidence and an increased likelihood to teach CR. The majority of faculty reported implementing these newly learned educational strategies in practice.

Keywords: clinical reasoning; diagnostic reasoning; dual process theory; faculty development; script theory


In 2015, the National Academies of Sciences, Engineering, and Medicine put forth several calls to action to reduce diagnostic errors, given their significant impact on the healthcare system, including a call to “enhance health care professional education and training in the diagnostic process” [1]. There is much debate about which educational strategies can improve the diagnostic process. Strategies grounded in script theory and dual process theory (DPT), focusing on knowledge reorganization and guided reflection, have shown promise [2], [3], [4]. DPT stipulates that diagnostic reasoning arises from two interacting cognitive processes – the intuitive, pattern-recognition System 1 and the slow, analytical System 2; the latter can be engaged through diagnostic time outs and reflection on the diagnostic process [5]. Educational strategies related to script theory focus on knowledge organization and include teaching about problem representations (one-sentence summaries of the patient’s presentation in abstract terms) and illness scripts (organized mental representations of diseases stored in memory) [2], [4], [6].

Many medical schools lack an explicit curriculum in clinical reasoning (CR), citing limited faculty expertise among the barriers [7]. Learners value faculty who can explain their CR in a structured way, but without a framework faculty struggle to demonstrate their own reasoning [8], [9], [10], [11], [12]. Faculty development efforts are therefore needed to train faculty to teach CR, yet few reports of such programs exist [10], [11], [12]. In this report, we describe the development and preliminary evaluation of a workshop to train faculty to teach CR using educational strategies from the CR literature.


Methods

This workshop was one of three in a year-long faculty development program (FDP) (November 2017 to November 2018) to improve teaching skills. Each workshop focused on a different skill (CR, observation and feedback, and assessing the struggling learner). Topics were based on a needs assessment of key stakeholders, including interviews with medical school deans and program directors and a survey of program participants. This work met the New York University School of Medicine’s criteria for certification as a quality improvement project rather than human subjects research: a self-certification process confirmed that the data were not collected for research purposes, that there was minimal risk of harm, and that the primary goal was to improve teaching performance.

Group Observed Structured Teaching Exercise

To assess baseline performance, faculty participated in a Group Observed Structured Teaching Exercise (GOSTE) in November 2017. Based on the work of Holmboe et al., we chose a GOSTE rather than an OSTE to foster collaborative and peer-assisted learning [13], [14]. The GOSTE case simulated ward round teaching: participants led a discussion with three standardized learners (SLs) (a medical student, an intern, and a resident), focusing on the learners’ CR in their approach to a case. SLs evaluated faculty performance on a scoring rubric using a three-point Likert scale with behavioral anchors for “not done”, “partially done”, and “well done” across four skills: use of problem representation and illness scripts, approaches to broadening the differential diagnosis, strategies to prioritize the differential diagnosis, and use of diagnostic time outs. All SLs underwent two 3-h training sessions, including one focused on rubric scoring and calibration of scores. One year after the pre-GOSTE, participants completed a post-GOSTE (the same case) to assess change in performance.
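The rubric design above (four skills, each rated on a three-point anchored scale) can be sketched as a small data model. The encoding below is hypothetical: the study does not publish its instrument, so the field names, rating order, and scoring function are illustrative assumptions.

```python
# Hypothetical encoding of a GOSTE-style rubric: four skills, each rated
# on a three-point scale with behavioral anchors. Not the actual instrument.
ANCHORS = {1: "not done", 2: "partially done", 3: "well done"}

SKILLS = [
    "problem representation and illness scripts",
    "broadening the differential diagnosis",
    "prioritizing the differential diagnosis",
    "diagnostic time outs",
]

def score_faculty(ratings):
    """Map one SL's numeric ratings (assumed to arrive in SKILLS order)
    to their behavioral-anchor labels."""
    assert len(ratings) == len(SKILLS), "one rating per skill"
    return {skill: ANCHORS[r] for skill, r in zip(SKILLS, ratings)}

# A hypothetical SL evaluation of one participant:
print(score_faculty([3, 2, 2, 1]))
```

Keeping ratings numeric while labeling them via a lookup table makes later analysis steps (such as the dichotomization described under Data analysis) a simple transformation of the same records.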


Clinical reasoning workshop

We developed the content for this 3-h workshop using our needs assessment, the educational literature on teaching CR, and our years of experience implementing a CR curriculum in our medicine residency program (Table 1) [4], [15], [16]. We introduced participants to the impact of diagnostic errors on the healthcare system and to evidence-based educational strategies for teaching CR. We reviewed the concepts of problem representation, illness scripts, and diagnostic time outs in an interactive, case-based didactic. Participants then engaged in role-play exercises on teaching CR on ward rounds and received guided feedback from trained facilitators and fellow participants.

Table 1:

Clinical reasoning workshop outline.

At the end of the workshop, participants completed (1) retrospective pre-post assessments (RPPs) that measured change in confidence and likelihood of using their newly acquired skills on an eight-point Likert scale (ranging from “not at all likely/confident” to “very likely/confident”) and (2) commitment to change statements (CTCs). After 2 months, participants self-reported whether they had fully, partly, or not yet implemented (without anchoring statements) their CTCs as well as barriers to implementation.
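RPP responses were subsequently compared with a Wilcoxon signed-rank test (see Data analysis). As a rough illustration, here is a minimal pure-Python sketch of the signed-rank statistic on hypothetical paired eight-point Likert ratings; a real analysis would use a statistics package that also computes the p-value.

```python
# Minimal sketch of the Wilcoxon signed-rank statistic: rank the absolute
# paired differences (zeros dropped, ties given average ranks) and sum the
# ranks of the positive differences. Data below are hypothetical.
def wilcoxon_w(pre, post):
    diffs = [b - a for a, b in zip(pre, post) if b != a]  # drop zero differences
    order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(order):
        j = i
        # extend j across a run of tied absolute differences
        while j + 1 < len(order) and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg_rank = (i + j) / 2 + 1  # average of 1-based ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    return sum(r for d, r in zip(diffs, ranks) if d > 0)

pre_conf  = [3, 4, 2, 5, 4, 3]   # hypothetical "before" confidence ratings (1-8)
post_conf = [6, 6, 5, 5, 7, 6]   # hypothetical "after" confidence ratings (1-8)
print(wilcoxon_w(pre_conf, post_conf))  # prints 15.0: every nonzero shift is upward
```

When all nonzero differences point the same way, as in this toy sample, the statistic equals the maximum possible rank sum, which is the pattern a uniformly positive confidence shift would produce.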

Data analysis

Results of the RPPs were analyzed using a Wilcoxon signed-rank test. GOSTE ratings were dichotomized into “not done/partly done” and “well done” to accommodate rater bias, and results were reported as percent “well done”. An independent-samples t-test, adjusted according to Levene’s test of homogeneity of variance, compared pre- and post-GOSTE performance on each domain of the scoring rubric.
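A minimal sketch of this pipeline, using illustrative ratings rather than the study's data: dichotomize three-point rubric scores, report percent "well done", and compare groups with Welch's t statistic (the unequal-variance form one would fall back to after a significant Levene's test).

```python
# Sketch of the dichotomization and pre/post comparison described above.
# The rating data are illustrative, not the study's actual data.
from math import sqrt
from statistics import mean, variance

def dichotomize(ratings):
    """Map 3-point rubric scores (1=not done, 2=partly done, 3=well done)
    to 1 for "well done" and 0 for "not done/partly done"."""
    return [1 if r == 3 else 0 for r in ratings]

def welch_t(a, b):
    """Welch's t statistic for two independent samples
    (does not assume equal variances)."""
    na, nb = len(a), len(b)
    return (mean(a) - mean(b)) / sqrt(variance(a) / na + variance(b) / nb)

pre  = dichotomize([1, 2, 2, 3, 1, 2, 2, 1, 2, 2])   # hypothetical pre-GOSTE ratings
post = dichotomize([3, 3, 2, 3, 2, 3, 3, 2, 3, 1])   # hypothetical post-GOSTE ratings

pct_pre, pct_post = 100 * mean(pre), 100 * mean(post)
print(f"well done: pre {pct_pre:.0f}%, post {pct_post:.0f}%, t={welch_t(post, pre):.2f}")
```

Dichotomizing first means the t-test operates on 0/1 indicators, so the group means are exactly the "percent well done" figures the paper reports.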


Results

Twenty-six medicine faculty (nine hospitalists and 17 subspecialists), practicing for 1–17 years (median 5), completed the CR workshop in December 2017. Twenty faculty completed both the pre- and post-GOSTE. Participants significantly improved in their use of problem representation and illness scripts from pre- to post-GOSTE: 10% [standard deviation (SD)=30.5%] received a rating of “well done” in the pre-GOSTE compared to 38% (SD=49.8%) in the post-GOSTE (p=0.028). There was no significant difference in “well done” ratings between the pre- and post-GOSTE for use of diagnostic time outs, approaches to broadening the differential diagnosis, or strategies to prioritize the differential diagnosis.

After participating in this workshop, faculty were significantly more confident in their ability and more likely to teach CR using problem representations, illness script theory, and diagnostic time outs with a large effect size in all domains (Figure 1).

Figure 1:

Retrospective pre-post assessment results. PR, problem representation; IS, illness script.

In CTCs, participants created specific plans for incorporating CR teaching strategies into their educational settings. At 2-month follow-up, 81% of participants had partially enacted their CTCs. Frequently cited barriers to implementation were insufficient time, logistical barriers, and forgetting; only 4% of faculty cited insufficient skills.

Representative comments from CTCs include:

“When with a fellow who struggles with organizing their thoughts, [I will] focus on using (and teaching) the concept of problem representation as a means to systematically think through a case and create a differential.”

“In pivotal moments in case presentations, I will take a diagnostic time-out to discuss what fits/doesn’t fit.”


Discussion

Consistent with national reports, our faculty lacked a framework for teaching CR [7], [8], [9], [10], [11], [12]. After participating in this 3-h workshop, faculty demonstrated increased ability to use problem representation and illness scripts in their CR teaching in the post-GOSTE. Faculty also expressed greater confidence and an increased likelihood of teaching CR using a framework. Through CTCs, participants created specific plans for incorporating CR strategies into their educational settings, and at 2-month follow-up the majority had partially implemented them.

Although educational strategies to teach CR are widely reported in the literature, few reports describe successful interventions to train faculty in these methods [2], [10], [12]. Our study adds to the findings of Dhaliwal (2013), who reported on a similarly structured workshop that increased faculty’s understanding of CR teaching methods and their intent to apply those methods in practice [10]. More recently, Addy et al. (2016) described training faculty to teach CR using a modified Bayesian approach: they observed five faculty, evaluated implementation of CR skills by reviewing video recordings of teaching sessions, and reported variable degrees of implementation with barriers similar to ours, including lack of time, forgetting, and insufficient skills [12].

One of our study limitations is that we have not directly observed whether faculty are implementing skills learned in this workshop in the real-world clinical environment. This is a challenge shared by many FDPs where learning occurs in workshops [2], [10], [12]. Other limitations of this study include a small sample size, a self-selected group of faculty, and the use of self-reported data. Future evaluations of this program aim to include learner evaluations and direct observations of faculty workplace teaching skills. We plan to reevaluate the workshop structure and content for the next iteration to bring about improvement in all skills taught, including broadening the differential diagnosis, strategies to prioritize the differential diagnosis, and use of diagnostic time outs.

There is debate on how best to teach CR. Issues of context specificity raise questions such as: can we teach learners to reason generally, or is reasoning best learned through repeated exposure to cases that build illness scripts [8], [9]? Providing learners with an array of strategies to approach cases, including guided reflection and knowledge reorganization, can help them navigate the diagnostic process [2], [3]. Training faculty to teach CR with these methods can reinforce learner efforts and build a shared language for giving learners feedback on their CR.

In conclusion, a faculty development workshop on teaching CR increased faculty performance, confidence, and likelihood of teaching CR. Participants demonstrated improved use of problem representation and illness scripts as observed in the post-GOSTE. Participants also reported implementing changes in their CR teaching in their educational environments. A theory-based CR workshop can be an effective strategy to build a cadre of faculty empowered with the skills and the confidence to teach the next generation of physicians this core skill.


References

  • 1. National Academies of Sciences, Engineering, and Medicine. Improving diagnosis in health care. Washington, DC: National Academies Press, 2016.
  • 2. Lambe KA, O’Reilly G, Kelly BD, Curristan S. Dual-process cognitive interventions to enhance diagnostic reasoning: a systematic review. BMJ Qual Saf 2016;25:808–20.
  • 3. Norman GR, Monteiro SD, Sherbino J, Ilgen JS, Schmidt HG, Mamede S. The causes of errors in clinical reasoning: cognitive biases, knowledge deficits, and dual process thinking. Acad Med 2017;92:23–30.
  • 4. Bowen JL. Educational strategies to promote clinical diagnostic reasoning. N Engl J Med 2006;355:2217–25.
  • 5. Kahneman D. Thinking, fast and slow. New York: Farrar, Straus and Giroux, 2011.
  • 6. Atkinson K, Ajjawi R, Cooling N. Promoting clinical reasoning in general practice trainees: role of the clinical teacher. Clin Teach 2011;8:176–80.
  • 7. Rencic J, Trowbridge RL, Fagan M, Szauter K, Durning S. Clinical reasoning education at US medical schools: results from a national survey of internal medicine clerkship directors. J Gen Intern Med 2017;32:1242–6.
  • 8. Eva KW. What every teacher needs to know about clinical reasoning. Med Educ 2005;39:98–106.
  • 9. Trowbridge RL, Olson AP. Becoming a teacher of clinical reasoning. Diagnosis (Berl) 2018;5:11–4.
  • 10. Dhaliwal G. Developing teachers of clinical reasoning. Clin Teach 2013;10:313–7.
  • 11. Audetat MC, Dory V, Nendaz M, Vanpee D, Pestiaux D, Junod Perron N, et al. What is so difficult about managing clinical reasoning difficulties? Med Educ 2012;46:216–27.
  • 12. Addy TM, Hafler J, Galerneau F. Faculty development for fostering clinical reasoning skills in early medical students using a modified Bayesian approach. Teach Learn Med 2016;28:415–23.
  • 13. Holmboe ES, Hawkins RE, Huot SJ. Effects of training in direct observation of medical residents’ clinical competence: a randomized trial. Ann Intern Med 2004;140:874–81.
  • 14. Ludwig AB, Raff AC, Lin J, Schoenbaum E. Group observed structured encounter (GOSCE) for third-year medical students improves self-assessment of clinical communication. Med Teach 2017;39:931–5.
  • 15. Trowbridge RL. Twelve tips for teaching avoidance of diagnostic errors. Med Teach 2008;30:496–500.
  • 16. Croskerry P. Achieving quality in clinical decision making: cognitive strategies and detection of bias. Acad Emerg Med 2002;9:1184–204.

About the article

Corresponding author: Verity Schaye, MD, MHPE, Assistant Professor, Department of Medicine, New York University School of Medicine, 462 1st Avenue, New York, NY 10016, USA; and NYC Health and Hospitals/Bellevue, New York, NY, USA, Phone: +212-562-5081

Received: 2018-08-05

Accepted: 2019-02-08

Published Online: 2019-03-09

Published in Print: 2019-06-26

Author contributions: All the authors have accepted responsibility for the entire content of this submitted manuscript and approved submission.

Conflicts of interest: None of the authors on this paper have conflicts of interest to disclose.

Research funding: This faculty development program was funded by the generous donation of an anonymous donor.

Competing interests: The funding organization(s) played no role in the study design; in the collection, analysis, and interpretation of data; in the writing of the report; or in the decision to submit the report for publication.

Citation Information: Diagnosis, Volume 6, Issue 2, Pages 109–113, ISSN (Online) 2194-802X, ISSN (Print) 2194-8011, DOI: https://doi.org/10.1515/dx-2018-0059.

©2019 Walter de Gruyter GmbH, Berlin/Boston.
