Diagnostic errors in medicine are estimated to account for 40,000–80,000 deaths per year. Of all medical errors, 10–20% are tied to errors in diagnosis. Prior studies have shown that cognitive errors, such as faulty information gathering and faulty synthesis of information, are as likely to lead to misdiagnosis as system-related errors. Interventions to identify and prevent cognitive errors are still being studied.
Common cognitive errors are referred to as biases. Though over 100 types of biases have been described, discussion of the impact of cognitive bias is not routine in medical education or practice, due in part to the difficulty of measuring or monitoring cognitive errors. In addition, physicians are often unaware of their own cognitive processes and may not appreciate the effect of cognitive bias on their own clinical decision-making. Educating physicians about the science of cognitive decision-making, especially during medical school and residency when trainees are still forming clinical habits, may enhance awareness of individual cognitive biases and has the potential to reduce diagnostic errors and improve patient safety. However, few studies have evaluated the impact of a cognitive decision-making curriculum on trainees’ understanding of clinical reasoning concepts.
We sought to fill this gap by developing and implementing a clinical reasoning curriculum for second-year Internal Medicine residents. In this paper, we describe the development and implementation of the curriculum and its effect on residents’ knowledge and self-assessed understanding of clinical reasoning.
Settings and participants
We implemented the curriculum in December 2013 for PGY-2 residents in the Internal Medicine Residency Program at Montefiore Medical Center, Bronx, New York. From December 2013 through May 2016, 47 of 150 residents enrolled in the elective curriculum. The majority (29/47) of residents who enrolled were from the primary care track of our program, with the remainder (18/47) from the categorical track. The curriculum was initially delivered in eight 4-h sessions over the course of 1 month to groups of five to eight residents, but after the first year and in response to feedback it was condensed to six to seven sessions over a 2-week ambulatory block. Faculty facilitators included two members of the group that developed the curriculum.
Our clinical reasoning curriculum was developed by a group of five academic general internists (two senior faculty members with more than 20 years’ experience in medical education and formal roles in curriculum development at our institution, and three junior faculty members with less than 5 years’ experience in clinical education) using a theory-guided, literature-based framework. Over the course of 1 year, we reviewed literature from cognitive psychology and medical education, developed curriculum aims and chose teaching strategies. Our curriculum was created with the following specific aims: (1) educate residents on the cognitive steps and strategies used in clinical reasoning; (2) demonstrate how cognitive biases can lead to clinical errors; (3) expand differential diagnostic ability and teach residents to develop illness scripts that incorporate discrete clinical prediction rules; and (4) provide opportunities for residents to reflect on their own clinical reasoning (metacognition).
We devised six to eight clinical reasoning sessions (Table 1), each highlighting a cognitive step (hypothesis generation, refinement and verification) or reasoning strategy (e.g. probabilistic reasoning, causal reasoning). For each session, we developed two to three clinical case vignettes that illustrated a cognitive step or reasoning strategy (Aim 1), demonstrated a cognitive error (Aim 2) and expanded knowledge of illness scripts to facilitate differential diagnostic ability (Aim 3). This case-based approach is well described as an effective method for teaching clinical reasoning.
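The probabilistic reasoning strategy mentioned above rests on Bayesian updating: combining a pre-test probability with a test’s likelihood ratio, via odds, to obtain a post-test probability. The short sketch below illustrates the arithmetic; the function name and the numbers are illustrative assumptions, not material from the curriculum itself.

```python
def post_test_probability(pre_test_prob, likelihood_ratio):
    """Bayes' theorem in odds form: post-test odds = pre-test odds x LR."""
    pre_odds = pre_test_prob / (1 - pre_test_prob)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# Hypothetical scenario: 20% pre-test probability of disease,
# positive test result with a likelihood ratio of 6.
# Pre-test odds 0.25, post-test odds 1.5, post-test probability 60%.
p = post_test_probability(0.20, 6)
print(f"Post-test probability: {p:.0%}")  # prints "Post-test probability: 60%"
```

A worked example like this makes the lever of likelihood ratios concrete: the same positive test moves a 20% prior to 60%, but would move a 2% prior to only about 11%.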
Clinical case vignettes were selected from published problem-solving cases and from a textbook on clinical reasoning. The cases were modified to include reflective questions after each new history or data element, to prompt problem representation after the initial presentation, and to elicit reflection on possible cognitive errors at the end of the case. Clinical case vignettes were finalized under the guidance of senior faculty for accuracy, vetted with other members of the group, and piloted with third-year residents (Example Clinical Vignette in Supplementary Index-S1).
Each 4-h clinical reasoning session utilized a modified form of the flipped classroom, in which residents are first exposed to new material (clinical vignettes and readings on clinical reasoning processes) outside of class and then use class time to assimilate new knowledge through problem-solving and discussion. Residents were asked to prepare for sessions by reading articles on cognitive psychology, medical education and clinical prediction rules, and by completing the first half of the clinical case vignettes before the session (List of articles for sessions in Supplementary Index-S2). This was approximately 1–2 h of pre-session work, which was built into the residents’ schedule. During the sessions, residents worked through the clinical case vignettes in small groups, identified key clinical features of each case and proposed a unifying diagnosis. The faculty facilitator reinforced all four curricular aims through facilitated discussion and question prompts that highlighted cognitive steps taken and reasoning strategies employed (Aim 1), cognitive errors the case may elicit (Aim 2) and evidence-based approaches to the clinical scenarios (Aim 3).
As a final exercise, residents utilized a worksheet that we adapted from ambulatory morning reports to reflect on their clinical reasoning on a case from their own ambulatory clinical experience (Aim 4) (Self reflection worksheet in Supplementary Index-S3). Facilitators reviewed these worksheets the night prior to the next clinical reasoning session and provided feedback to the residents on their self-reflections during the ambulatory morning report at the beginning of the clinical reasoning session. Time was also allotted for residents to reflect on any cognitive errors that occurred on inpatient wards by critiquing intern or medical student presentations that were created by the facilitator. This allowed residents to discuss how to incorporate clinical reasoning concepts into teaching medical students and interns.
We developed survey instruments to measure (1) residents’ self-assessed ability to recognize and apply clinical reasoning concepts and (2) residents’ clinical reasoning knowledge, the latter assessed using clinical reasoning scenarios. The two instruments were developed 1 and 2 years, respectively, after the curriculum was first delivered. Both were completed by residents on the first day of the curriculum (pre) and on the final day of the curriculum (post).
Residents’ self-assessed ability to recognize and apply clinical reasoning concepts was measured with a 15-item questionnaire with responses entered on a four-point Likert scale (1=not at all capable, 2=slightly capable, 3=somewhat capable, 4=very capable). We then compared the percentage of residents who felt “somewhat capable” or “very capable” prior to the curriculum to the percentage who felt “somewhat capable” or “very capable” after the curriculum, using chi-square (χ²) tests for statistical significance. Residents’ knowledge of clinical reasoning steps, cognitive errors and application of Bayesian analysis was assessed with a 10-item multiple-choice questionnaire, and mean scores pre- and post-curriculum were compared using the Wilcoxon test. The purpose of the surveys was described to the residents prior to the start of the curriculum. This study was approved on expedited review by the Albert Einstein College of Medicine/Montefiore Medical Center Institutional Review Board (IRB # 2018-9863).
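The two comparisons described above can be sketched with standard SciPy routines. All counts and scores below are made-up placeholders chosen only to show the shape of the analysis; they are not the study data.

```python
# Illustrative sketch of the two statistical comparisons, using SciPy.
# Counts and scores are hypothetical placeholders, not the study's data.
import numpy as np
from scipy import stats

# Chi-square test on a 2x2 contingency table:
# rows = pre/post, columns = "somewhat/very capable" vs. less capable.
table = np.array([[8, 22],    # pre-curriculum (hypothetical)
                  [26, 4]])   # post-curriculum (hypothetical)
chi2, p_chi, dof, expected = stats.chi2_contingency(table)
print(f"chi-square p = {p_chi:.4f}")

# Wilcoxon test on paired pre/post knowledge scores (hypothetical, out of 10).
pre = [5, 6, 4, 7, 6, 5, 6, 5]
post = [8, 8, 7, 9, 8, 7, 9, 8]
w_stat, p_w = stats.wilcoxon(pre, post)
print(f"Wilcoxon p = {p_w:.4f}")
```

The Wilcoxon test is appropriate here because paired pre/post scores from a small cohort cannot be assumed normally distributed, so a rank-based paired test is a safer choice than a paired t-test.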
Between 2013 and 2016, we offered the curriculum three times, and 47 residents completed it. Because the survey instruments were developed at different times after the first year of the curriculum, 30 residents were offered and completed the resident skills self-assessment, and 25 residents were offered the clinical reasoning knowledge test. Twenty-one residents completed both the pre- and post-curriculum knowledge surveys (84% of those offered the knowledge survey).
Resident self-assessed skills following completion of the curriculum increased in all 15 domains (p<0.05 for each). Domains with the greatest increase in self-reported comfort included: comparing analytic and nonanalytic reasoning strategies, describing working memory, understanding heuristics, applying Bayesian analysis to clinical scenarios, identifying cognitive errors, and identifying errors in diagnostic verification. In each of these six domains, fewer than 30% of residents reported feeling “somewhat” or “very capable” before the course compared to more than 75% afterward (Table 2). Resident mean scores on the knowledge assessment improved from 58% pre-curriculum to 81% post-curriculum (p=0.002). Knowledge questions with the greatest change reflected some of the same areas in which residents reported an increase in comfort (i.e. identifying a cognitive error and employing a clinical reasoning strategy to answer a case-based question).
Our clinical reasoning curriculum, using a combination of case vignettes, clinical prediction rules, and reflection exercises, improved resident self-assessed ability in recognizing and applying clinical reasoning concepts. Residents also demonstrated improvement in their ability to identify and apply knowledge of clinical reasoning concepts on a knowledge test after completing the curriculum. Our curriculum followed several best-practice approaches for teaching clinical reasoning: it was case-based, provided a foundation in clinical reasoning theory with a vocabulary and methods for clinical problem solving, and provided immediate feedback on residents’ cognitive errors. We suspect that our curriculum’s impact was maximized by incorporating exercises in which residents practiced their new skills by reflecting on their own ambulatory clinical experiences.
Residents reported the least comfort on the pre-curriculum assessment in comparing analytic and nonanalytic reasoning strategies, understanding the cognitive psychology of how memory and clinical knowledge are stored, and identifying cognitive errors. Even in domains that are more commonly incorporated into formal and informal inpatient teaching (such as utilizing epidemiology and pathophysiology in clinical reasoning), we saw improvements in self-reported comfort and knowledge. It is likely that both the repeated exercises in self-reflection and a session focused on assessing the learner helped residents improve their comfort and skill in diagnosing shortcomings in learners’ clinical reasoning and in providing feedback on clinical presentations.
In informal discussion with the residents weeks to months after the curriculum, we noted that residents who received the curriculum reported that they were able to identify and name cognitive errors generated by medical students and interns, as well as to identify their own potential sources of error. This indicates that residents were able to engage in metacognition, a recommended strategy to reduce cognitive error. Our findings are consistent with other studies that have found that clinical reasoning curricular interventions can improve physician awareness of their own cognitive biases.
There are limitations to our approach. There is currently no standardized tool to measure the effects of a clinical reasoning curriculum on resident skills, and only one other published study has assessed resident knowledge of clinical reasoning concepts. While we could consider additional assessment methods, such as script concordance testing, observed standardized clinical exams, or completion of cases from the Human Diagnosis Project, results from studies using these methods have been mixed and none have shown a reduction in diagnostic errors. One prior study describing a curriculum centered on illness scripts for medical students used a case-based assessment that showed an increase in diagnostic performance, which may be an area for future expansion of our curriculum. Finally, we were not able to measure diagnostic error rates in the residents’ clinical practice. To strengthen the evaluation of our curriculum, in future iterations we plan to evaluate sustainability of resident knowledge and application of clinical reasoning concepts by conducting additional surveys 1 and 6 months after completion of the curriculum. We also plan to assess intermediate outcomes, such as impact on residents’ use of critical thinking and metacognition.
A case-based clinical reasoning curriculum can effectively increase residents’ knowledge of clinical reasoning concepts and their self-assessed ability to recognize and apply clinical reasoning concepts.
Abrami PC, Bernard RM, Wade EB, Surkes MA, Tamim R, Zhang D. Instructional interventions affecting critical thinking skills and dispositions: a stage 1 meta-analysis. Rev Educ Res 2008;78:1102–34.
Kassirer JP, Wong JB, Kopelman RI. Learning clinical reasoning, 2nd ed. Lippincott Williams & Wilkins, 2009.
Brame C. Flipping the classroom. Vanderbilt University Center for Teaching. [cited 2013]. Available from: http://cft.vanderbilt.edu/guides-sub-pages/flipping-the-classroom/.
Reilly JB, Ogdie AR, Von Feldt JM, Myers JS. Teaching about how doctors think: a longitudinal curriculum in cognitive bias and diagnostic error for residents. BMJ Qual Saf 2013;22:1044–50.
Humbert AJ, Besinger B, Miech EJ. Assessing clinical reasoning skills in scenarios of uncertainty: convergent validity for a Script Concordance Test in an emergency medicine clerkship and residency. Acad Emerg Med 2011;18:627–34.
Park WB, Kang SH, Lee YS, Myung SJ. Does objective structured clinical examinations score reflect the clinical reasoning ability of medical students? Am J Med Sci 2015;350:64–7.
The online version of this article offers supplementary material (https://doi.org/10.1515/dx-2018-0093).
About the article
Published Online: 2019-03-22
Published in Print: 2019-06-26
Author contributions: All the authors have accepted responsibility for the entire content of this submitted manuscript and approved submission.
Research funding: None declared.
Employment or leadership: None declared.
Honorarium: None declared.
Competing interests: The funding organization(s) played no role in the study design; in the collection, analysis, and interpretation of data; in the writing of the report; or in the decision to submit the report for publication.