
Diagnosis

Official Journal of the Society to Improve Diagnosis in Medicine (SIDM)


Development and evaluation of a clinical reasoning curriculum as part of an Internal Medicine Residency Program

Shwetha Iyer
  • Corresponding author
  • Albert Einstein College of Medicine/Montefiore Medical Center, Bronx, NY, USA
  • Assistant Professor, Department of General Internal Medicine and Department of Family and Social Medicine, 3544 Jerome Avenue, Bronx, NY 10467, USA
/ Erin Goss / Casey Browder / Gerald Paccione / Julia Arnsten
Published Online: 2019-03-22 | DOI: https://doi.org/10.1515/dx-2018-0093

Abstract

Background

Errors in medicine are common and often tied to diagnosis. Educating physicians about the science of cognitive decision-making, especially during medical school and residency when trainees are still forming clinical habits, may enhance awareness of individual cognitive biases and has the potential to reduce diagnostic errors and improve patient safety.

Methods

The authors aimed to develop, implement and evaluate a clinical reasoning curriculum for Internal Medicine residents. The authors developed and delivered a clinical reasoning curriculum to 47 PGY2 residents in an Internal Medicine Residency Program at a large urban hospital. The clinical reasoning curriculum consists of six to seven sessions with the specific aims of: (1) educating residents on cognitive steps and reasoning strategies used in clinical reasoning; (2) acknowledging the pitfalls of clinical reasoning and learning how cognitive biases can lead to clinical errors; (3) expanding differential diagnostic ability and developing illness scripts that incorporate discrete clinical prediction rules; and (4) providing opportunities for residents to reflect on their own clinical reasoning (also known as metacognition).

Results

Forty-seven PGY2 residents participated in the curriculum (2013–2016). Self-assessed comfort in recognizing and applying clinical reasoning skills increased in 15 of 15 domains (p < 0.05 for each). Resident mean scores on the knowledge assessment improved from 58% pre-curriculum to 81% post-curriculum (p = 0.002).

Conclusions

A case vignette-based clinical reasoning curriculum can effectively increase residents’ knowledge of clinical reasoning concepts and improve residents’ self-assessed comfort in recognizing and applying clinical reasoning skills.

This article offers supplementary material which is provided at the end of the article.

Keywords: clinical reasoning; medical education; metacognition

Introduction

Diagnostic errors in medicine are estimated to account for 40,000–80,000 deaths per year. Of all medical errors, 10–20% are tied to errors in diagnosis [1]. Prior studies have shown that cognitive errors such as faulty information gathering and faulty synthesis of information are as likely to lead to misdiagnosis as system-related errors [2], [3]. Interventions to identify and prevent cognitive errors are still being studied.

Common cognitive errors are referred to as biases [4]. Though over 100 types of biases have been described, discussion of the impact of cognitive bias is not routine in medical education or practice, due in part to difficulty measuring or monitoring cognitive errors [1]. In addition, physicians are often unaware of their own cognitive processes, and may not appreciate the effect of cognitive bias on their own clinical decision-making [5]. Educating physicians about the science of cognitive decision-making, especially during medical school and residency when trainees are still forming clinical habits, may enhance awareness of individual cognitive biases and has the potential to reduce diagnostic errors and improve patient safety [6], [7]. However, few studies have evaluated the impact of a cognitive decision-making curriculum on trainees’ understanding of clinical reasoning concepts.

We sought to fill this gap by developing and implementing a clinical reasoning curriculum for second year Internal Medicine residents. In this paper, we describe the development and implementation of the curriculum and the effect of the curriculum on residents’ knowledge and self-assessed understanding of clinical reasoning.

Methods

Settings and participants

We implemented the curriculum in December 2013 for PGY-2 residents in the Internal Medicine Residency Program at Montefiore Medical Center, Bronx, New York. From December 2013 through May 2016, 47 of 150 residents enrolled in the elective curriculum. The majority (29/47) of residents who enrolled were from the primary care track of our program, with the remainder (18/47) from the categorical track. The curriculum was initially delivered in eight 4-h sessions over the course of 1 month to groups of five to eight residents, but after the first year and in response to feedback it was condensed to six to seven sessions over a 2-week ambulatory block. Faculty facilitators included two members of the group that developed the curriculum.

Program description

Our clinical reasoning curriculum was developed by a group of five academic general internists (two senior faculty members with more than 20 years’ experience in medical education and formal roles in curriculum development at our institution, and three junior faculty members with less than 5 years’ experience in clinical education) using a theory-guided, literature-based framework [8]. Over the course of 1 year, we reviewed literature from cognitive psychology and medical education, developed curriculum aims and chose teaching strategies. Our curriculum was created with the following specific aims: (1) educate residents on cognitive steps and strategies used in clinical reasoning; (2) learn how cognitive biases can lead to clinical errors; (3) expand differential diagnostic ability and learn to develop illness scripts that incorporate discrete clinical prediction rules; and (4) provide opportunities for residents to reflect on their own clinical reasoning (metacognition).

We devised six to eight clinical reasoning sessions (Table 1), each highlighting a cognitive step (hypothesis generation, refinement and verification) or reasoning strategy (e.g. probabilistic reasoning, causal reasoning). For each session, we developed two to three clinical case vignettes that illustrated a cognitive step or reasoning strategy (Aim 1), demonstrated a cognitive error (Aim 2) and expanded knowledge of illness scripts to facilitate differential diagnostic ability (Aim 3). This case-based approach is well described as an effective method for teaching clinical reasoning [9].
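
To make the probabilistic reasoning strategy concrete, the sketch below applies Bayes’ theorem in its likelihood-ratio (odds) form, the calculation that underlies the clinical prediction rules discussed in the sessions; the pretest probability and likelihood ratios used here are hypothetical illustrative values, not figures taken from the curriculum or its cases.

def post_test_probability(pretest_prob, likelihood_ratio):
    """Convert a pretest probability to a posttest probability via odds."""
    pretest_odds = pretest_prob / (1.0 - pretest_prob)
    posttest_odds = pretest_odds * likelihood_ratio
    return posttest_odds / (1.0 + posttest_odds)

# Hypothetical example: a test with a positive likelihood ratio of 8
# applied to a 20% pretest probability of disease.
print(round(post_test_probability(0.20, 8.0), 2))   # ~0.67
# The same pretest probability with a negative result (hypothetical LR- = 0.2).
print(round(post_test_probability(0.20, 0.2), 2))   # ~0.05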

Table 1:

Components of the curriculum matched to each curricular session and aim.

Clinical case vignettes were selected from published problem-solving cases and from a textbook on clinical reasoning [8]. The cases were modified to include reflective questions after each new history or data element, to prompt problem representation after the initial presentation, and to elicit reflection on possible cognitive errors at the end of the case. Clinical case vignettes were finalized under the guidance of senior faculty for accuracy, vetted with other members of the group, and piloted with third year residents (Example Clinical Vignette in Supplementary Index-S1).

Each 4-h clinical reasoning session utilized a modified form of the flipped classroom, in which residents are first exposed to new material (clinical vignettes and readings on clinical reasoning processes) outside of class and then use class time to assimilate new knowledge through problem-solving and discussion [10]. Residents were asked to prepare for sessions by reading articles on cognitive psychology, medical education and clinical prediction rules, and by completing the first half of the clinical case vignettes before the session (List of articles for sessions in Supplementary Index-S2). This was approximately 1–2 h of pre-session work, which was built into the residents’ schedule. During the sessions, residents worked through the clinical case vignettes in small groups, identified key clinical features of each case and proposed a unifying diagnosis. The faculty facilitator reinforced all four curricular aims through facilitated discussion and question prompts that highlighted cognitive steps taken and reasoning strategies employed (Aim 1), cognitive errors the case may elicit (Aim 2) and evidence-based approaches to the clinical scenarios (Aim 3).

As a final exercise, residents utilized a worksheet that we adapted from ambulatory morning reports to reflect on their clinical reasoning on a case from their own ambulatory clinical experience (Aim 4) (Self reflection worksheet in Supplementary Index-S3). Facilitators reviewed these worksheets the night prior to the next clinical reasoning session and provided feedback to the residents on their self-reflections during the ambulatory morning report at the beginning of the clinical reasoning session. Time was also allotted for residents to reflect on any cognitive errors that occurred on inpatient wards by critiquing intern or medical student presentations that were created by the facilitator. This allowed residents to discuss how to incorporate clinical reasoning concepts into teaching medical students and interns.

Program evaluation

We developed survey instruments to measure (1) residents’ self-assessed ability to recognize and apply clinical reasoning concepts and (2) residents’ clinical reasoning knowledge. The latter assessed knowledge using clinical reasoning scenarios. The survey instruments were developed 1 and 2 years, respectively, after the curriculum was first delivered. Both survey instruments were completed by the residents on the first day of the curriculum (pre) and on the final day of the curriculum (post).

Residents’ self-assessed ability to recognize and apply clinical reasoning concepts was measured with a 15-item questionnaire using a four-point Likert scale (1=not at all capable, 2=slightly capable, 3=somewhat capable, 4=very capable). We then compared the percentage of residents who felt “somewhat capable” or “very capable” prior to the curriculum to the percentage of residents who felt “somewhat capable” or “very capable” after the curriculum, using chi-square (χ2) tests for statistical significance. Residents’ knowledge of clinical reasoning steps, cognitive errors and application of Bayesian analysis was assessed with a 10-item multiple-choice questionnaire, and mean scores pre- and post-curriculum were compared using the Wilcoxon test. The purpose of the surveys was described to the residents prior to the start of the curriculum. This study was approved on expedited review by the Albert Einstein College of Medicine/Montefiore Medical Center Institutional Review Board (IRB # 2018-9863).
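
As a rough illustration of the two comparisons described above (the authors’ analysis code and raw data are not published with the article), the following sketch runs a chi-square test on pre/post proportions and a Wilcoxon signed-rank test on paired scores using scipy; every count and score below is a hypothetical placeholder chosen only to mirror the reported pattern, not study data.

import numpy as np
from scipy import stats

# Self-assessment: residents rating themselves "somewhat" or "very capable"
# out of 30 respondents, before and after the curriculum (hypothetical counts).
n = 30
capable_pre, capable_post = 8, 24
table = np.array([[capable_pre, n - capable_pre],
                  [capable_post, n - capable_post]])
chi2, p_chi2, dof, expected = stats.chi2_contingency(table)

# Knowledge test: paired percent-correct scores for the same residents.
# Individual values are invented, chosen so the means match the reported 58% and 81%.
pre_scores = np.array([50, 60, 55, 70, 60, 50, 65, 55, 60, 55])
post_scores = np.array([80, 85, 75, 90, 80, 70, 85, 80, 85, 80])
w_stat, p_wilcoxon = stats.wilcoxon(pre_scores, post_scores)

print(f"chi-square p = {p_chi2:.3f}, Wilcoxon p = {p_wilcoxon:.3f}")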

Results

Between 2013 and 2016, we offered the curriculum three times and 47 residents completed it. Because the survey instruments were developed at different times after the first year of the curriculum, 30 residents were offered and completed the resident skills self-assessment, and 25 residents were offered the clinical reasoning knowledge test. Twenty-one residents completed both the pre- and post-test knowledge surveys (84% of those who were offered the knowledge survey).

Resident self-assessed skills following completion of the curriculum increased in all 15 domains (p<0.05 for each). Domains with the greatest increase in self-reported comfort included: comparing analytic and nonanalytic reasoning strategies, describing working memory, understanding heuristics, applying Bayesian analysis to clinical scenarios, identifying cognitive errors, and identifying errors in diagnostic verification. In each of these six domains, fewer than 30% of residents reported feeling “somewhat” or “very capable” before the course compared to more than 75% afterward (Table 2). Resident mean scores on the knowledge assessment improved from 58% pre-curriculum to 81% post-curriculum (p=0.002). Knowledge questions with the greatest change reflected some of the same areas in which residents reported an increase in comfort (i.e. identifying a cognitive error and employing a clinical reasoning strategy to answer a case-based question).

Table 2:

Resident skills self-assessment.

Discussion

Our clinical reasoning curriculum, using a combination of case vignettes, clinical prediction rules, and reflection exercises, improved residents’ self-assessed ability to recognize and apply clinical reasoning concepts. Residents also demonstrated improvement in their ability to identify and apply knowledge of clinical reasoning concepts on a knowledge test after completing the curriculum. Our curriculum followed several best practice approaches for teaching clinical reasoning: it was case-based, grounded residents in clinical reasoning theory with a shared vocabulary and methods of clinical problem solving, and provided immediate feedback on residents’ cognitive errors [9]. We suspect that our curriculum’s impact was maximized by incorporating exercises in which residents practiced their new skills by reflecting on their own ambulatory clinical experiences.

Residents reported the least comfort on the pre-curriculum assessment in comparing analytical and non-analytical reasoning strategies, understanding the cognitive psychology of how memory and clinical knowledge are stored and identifying cognitive errors. Even in domains that are more commonly incorporated into formal and informal inpatient teaching (such as utilizing epidemiology and pathophysiology in clinical reasoning), we saw improvements in self-reported comfort and knowledge. It is likely that both repeated exercises in self-reflection and a session focused on assessing the learner helped residents improve their comfort and skills in diagnosing shortcomings in their learners’ clinical reasoning and providing feedback to their learners on clinical presentations.

In informal discussions weeks to months after the curriculum, residents who had completed it reported that they were able to identify and name cognitive errors generated by medical students and interns, as well as to identify their own potential sources of error. This indicates that residents were able to engage in metacognition, a recommended strategy to reduce cognitive error [4], [11]. Our findings are consistent with other studies [12], [13] that have found that clinical reasoning curricular interventions can improve physicians’ awareness of their own cognitive biases.

There are limitations to our approach. There is currently no standardized tool to measure the effects of a clinical reasoning curriculum on resident skills, and only one other published study has assessed resident knowledge of clinical reasoning concepts [12]. While we could consider additional assessment methods, such as script concordance testing, observed standardized clinical exams, or completion of cases from the Human Diagnosis Project, results from studies using these methods have been mixed and none have shown a reduction in diagnostic errors [14], [15], [16]. One prior study describing a curriculum centered on illness scripts for medical students used a case-based assessment that showed an increase in diagnostic performance [17], which may be an area for future expansion of our curriculum. Finally, we were not able to measure diagnostic error rates in the residents’ clinical practice. To strengthen the evaluation of our curriculum, in future iterations we plan to evaluate the sustainability of resident knowledge and application of clinical reasoning concepts by conducting additional surveys 1 and 6 months after completion of the curriculum. We also plan to assess intermediate outcomes, such as impact on residents’ use of critical thinking and metacognition.

Conclusions

A case-based clinical reasoning curriculum can effectively increase residents’ knowledge of clinical reasoning concepts and their self-assessed ability to recognize and apply clinical reasoning concepts.

References

Supplementary Material

The online version of this article offers supplementary material (https://doi.org/10.1515/dx-2018-0093).

About the article

Corresponding author: Shwetha Iyer, MD, Albert Einstein College of Medicine/Montefiore Medical Center, Bronx, NY, USA; Assistant Professor, Department of General Internal Medicine and Department of Family and Social Medicine, 3544 Jerome Avenue, Bronx, NY 10467, USA


Received: 2018-09-18

Accepted: 2019-02-25

Published Online: 2019-03-22

Published in Print: 2019-06-26


Author contributions: All the authors have accepted responsibility for the entire content of this submitted manuscript and approved submission.

Research funding: None declared.

Employment or leadership: None declared.

Honorarium: None declared.

Competing interests: The funding organization(s) played no role in the study design; in the collection, analysis, and interpretation of data; in the writing of the report; or in the decision to submit the report for publication.


Citation Information: Diagnosis, Volume 6, Issue 2, Pages 115–119, ISSN (Online) 2194-802X, ISSN (Print) 2194-8011, DOI: https://doi.org/10.1515/dx-2018-0093.

©2019 Walter de Gruyter GmbH, Berlin/Boston.
