

Official Journal of the Society to Improve Diagnosis in Medicine (SIDM)


Implementation of a clinical reasoning curriculum for clerkship-level medical students: a pseudo-randomized and controlled study

Eliana Bonifacino (ORCID: https://orcid.org/0000-0001-8797-2061) / William P. Follansbee / Amy H. Farkas / Kwonho Jeong
  • Center for Research on Healthcare Data Center, Division of General Internal Medicine, University of Pittsburgh, Pittsburgh, PA, USA
/ Melissa A. McNeil
  • Professor of Medicine, Department of Medicine, University of Pittsburgh School of Medicine, Pittsburgh, PA, USA
  • VA Pittsburgh Healthcare System, Department of Medicine, Pittsburgh, PA, USA
/ Deborah J. DiNardo
  • VA Pittsburgh Healthcare System, Department of Medicine, Pittsburgh, PA, USA
  • Clinical Instructor in Medicine, Department of Medicine, University of Pittsburgh School of Medicine, Pittsburgh, PA, USA
Published Online: 2019-03-28 | DOI: https://doi.org/10.1515/dx-2018-0063



Background

The National Academies of Sciences, Engineering, and Medicine report Improving Diagnosis in Health Care highlighted the need for better training in medical decision-making, but most medical schools lack formal education in clinical reasoning.


Methods

We conducted a pseudo-randomized and controlled study to evaluate the impact of a clinical reasoning curriculum in an internal medicine clerkship. Students in the intervention group completed six interactive online modules focused on reasoning concepts and a skills-based workshop. We assessed the impact of the curriculum on clinical reasoning knowledge and skills and perception of education by evaluating: (1) performance on a clinical reasoning concept quiz, (2) demonstration of reasoning in hospital admission notes, and (3) awareness of attending physician utilization of clinical reasoning concepts.


Results

Students in the intervention group demonstrated superior performance on the clinical reasoning knowledge quiz (67% vs. 54%, p < 0.001). Students in the intervention group demonstrated superior written reasoning skills in the data synthesis (2.3 vs. 2.0, p = 0.02) and diagnostic reasoning (2.2 vs. 1.9, p = 0.02) portions of their admission notes, and reported more discussion of clinical reasoning by their attending physicians.


Conclusions

Exposure to a clinical reasoning curriculum was associated with superior reasoning knowledge and superior written demonstration of clinical reasoning skills by third-year medical students on an internal medicine clerkship.


Keywords: clinical reasoning; medical education – curriculum development/evaluation; medical education – undergraduate


Introduction

Diagnostic errors are important contributors to patient morbidity and mortality [1], and rank among the top patient safety concerns for healthcare organizations. In 2015, the National Academies of Sciences, Engineering, and Medicine called for enhanced training in the diagnostic process for all health care professionals [2]. Within undergraduate medical education (UME), the Association of American Medical Colleges (AAMC) has also identified clinical reasoning as a key competency [3]. However, a recent cross-sectional study of internal medicine clerkship directors demonstrated that most US medical schools lack formal educational sessions dedicated to clinical reasoning [4].

The most effective educational strategies to improve trainee clinical reasoning have not been established. A recent systematic review regarding dual process theory-based interventions found six published educational curricula in addition to studies of workplace interventions like checklists, cognitive forcing strategies, guided reflection, and instruction to use a particular type of reasoning [5]. Of these six curricula, four were performed within UME and only two included teaching on individual reasoning steps and skills [6], [7]. The interventions in the latter studies consisted of a case-based reasoning seminar for senior medical students. The first study focused on self-reflection and knowledge of the steps of the reasoning process [6], while the second highlighted the application of Bayes theorem in diagnosis [7]. Both studies demonstrated that a case-based curriculum grounded in dual process theory can result in improved clinical reasoning as assessed by standardized patient cases and written examinations. Only one study identified in the systematic review involved assessment of reasoning in a hospital setting. This study, performed among staff-level emergency medicine physicians, assessed the impact of a checklist on extensiveness of differential diagnosis and resource utilization. There was a trend toward more expansive differential diagnoses and increased numbers of tests and consults ordered when providers used the checklist. However, the difference was not statistically significant, and the clinical impact of these changes was uncertain [8].

It is unknown if clinical reasoning instruction can impact student clinical reasoning within the context of authentic patient care. Therefore, we developed and implemented a clinical reasoning curriculum for medical students on an internal medicine clerkship. We hypothesized that structured clinical reasoning education would improve knowledge of clinical reasoning concepts, application of clinical reasoning concepts in patient care, and students’ perception of their own clinical reasoning education.


Methods

We conducted a pseudo-randomized and controlled study to evaluate the impact of a clinical reasoning curriculum on third-year medical students’ knowledge of clinical reasoning concepts, skills, and perception of clinical reasoning education at a single allopathic US medical school.


Participants

Third-year medical students at the University of Pittsburgh School of Medicine (UPSOM) who were enrolled in their internal medicine clerkship between January and June 2017 were eligible to participate. The students who participated in the curriculum from January to April were at the end of their third year and were referred to as “experienced”, and students who participated during May and June were in the first 2 months of their third year and were referred to as “novices”.

Existing educational structure

At UPSOM, students are randomly assigned to one of four clinical training sites for each half of their 8-week inpatient internal medicine clerkship. Three of the training sites are academic medical centers associated with UPSOM, and one site is the Veterans Administration hospital associated with the medical school and residency training programs. All hospital sites are within 3 miles of each other. Students at all sites rotate on general medicine wards and participate in patient care on a team that includes one attending physician, one senior resident, two interns, and two third-year medical students. The faculty at all sites have similar qualifications, participate in a centralized faculty development program, and are evaluated according to a uniform set of expectations for faculty promotion and teaching evaluations. The residents at each site are all part of the same training program. The students at all sites have the same set of expectations for the clerkship, including the same number of admissions or patients to follow per week, the same educational conferences, and the same expectations for feedback. All third-year students also participate in daily Student Teaching Attending (STA) sessions, which are 1-h case-based clinical reasoning discussions with six to eight students led by a faculty member.

Study design

Students who were assigned to site 1 in the first half of their rotation formed the intervention group, and those assigned to sites 2, 3, and 4 formed the control group. Students in the intervention group participated in the curriculum on the first afternoon of their clerkship, completing the modules in a computer lab and then participating in the skills workshop. In order to ensure educational equity, students in the control group completed the modules outside of the study period. Students in both intervention and control groups otherwise had similar educational experiences throughout the internal medicine clerkship, including the previously described ward teams and the STA sessions.

Institutional review

The project met all necessary criteria for an exemption under section 45 CFR 46.101(b)(1) for educational strategies, curricula, or classroom management methods, as designated by the University of Pittsburgh Institutional Review Board (PRO16110049). Participants were given the opportunity to request that their work not be used for research purposes.

Curriculum content

The curriculum consisted of two components: six interactive online modules and a case-based workshop.

Online modules

We used online modules that were developed by faculty within the Department of Medicine at the University of Pittsburgh Medical Center (UPMC). The modules taught clinical reasoning topics that are ranked highly by expert educators [9], [10]. The topics were: (1) diagnostic error, (2) cognitive psychology of decision-making, (3) specific clinical reasoning skills, including use of semantic qualifiers and problem representation, and (4) cognitive biases and heuristics (including those related to overconfidence, probability estimation, and affective biases). The modules included didactic videos, simulated clinical cases, and interactive prompts for open-ended and multiple-choice questions. Each of the modules was completed in 20 min, amounting to a total of 2 h of online content.

Clinical reasoning workshop

The second component consisted of a 1-h skills-based workshop. We provided students with a written case and prompted them to practice four specific skills: (1) identification of key clinical findings, (2) translation of clinical information into semantic qualifiers, (3) generation of a problem representation/summary statement, and (4) creation of a prioritized differential diagnosis. Students first practiced each skill in groups of three to four. A facilitated large-group discussion regarding each of the four skill-based tasks followed the small groups.

We developed the content for this workshop based on principles for curricular content described in Teaching Clinical Reasoning [11], including cases with prototypical illness scripts, an analysis of clinical reasoning processes during case presentations, a discussion of cognitive psychology principles that pertain to clinical reasoning, dual process thinking, and script theory. Specific content and delivery modalities were derived from reviewing the literature for common errors in medical student reasoning [12], and the previously mentioned case-based educational interventions [5], [6], [7]. The workshop did not include instruction regarding how to document clinical reasoning in a hospital admission note.


Assessment

We evaluated participants according to three domains: knowledge of clinical reasoning concepts, clinical reasoning skills, and student perception of clinical reasoning concepts in their clerkship experience.


Knowledge of clinical reasoning concepts

We administered a 20-question post-curricular quiz to the control and intervention groups at the end of the 4-week intervention period. The quiz addressed definitions and concepts that were introduced in the online modules (Supplementary Material). The study authors developed the quiz and piloted it among medical education fellows and third-year medical students outside of the study to optimize clarity, content, and difficulty of the questions.


Clinical reasoning skills

We used the Interpretive summary, Differential diagnosis, Explanation of reasoning and Alternatives (IDEA) tool [13] to assess clinical reasoning skills. IDEA was designed to assess clinical reasoning in third-year medical student hospital admission notes. The tool has three sections: (1) a data gathering section, for assessment of comprehensiveness and contextualization of the history of present illness, historical features, and physical exam; (2) a data synthesis section, for evaluation of reasoning in the written assessment and plan; and (3) a global rating of the reporting, diagnostic reasoning, and decision-making skills. The tool utilizes a three-point scale for each section.
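As a hypothetical sketch (not part of the published tool), one IDEA-style rating per note could be represented as a small record type that enforces the three-point scale; all field and class names here are illustrative.

```python
from dataclasses import dataclass


@dataclass
class IdeaRating:
    """One rater's IDEA-style scores for a single admission note.

    Illustrative names only; the published tool rates each section
    on a three-point scale.
    """
    data_gathering: int  # HPI, historical features, physical exam
    data_synthesis: int  # reasoning in the written assessment and plan
    global_rating: int   # reporting, diagnostic reasoning, decision-making

    def __post_init__(self):
        # Reject any score outside the tool's three-point scale.
        for name, score in vars(self).items():
            if score not in (1, 2, 3):
                raise ValueError(f"{name} must be on the 1-3 scale, got {score}")


note = IdeaRating(data_gathering=3, data_synthesis=2, global_rating=2)
print(note)
```

Encoding the scale constraint in the record keeps downstream averaging code (mean scores per student, per item) from silently ingesting out-of-range values.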

Evaluation of written justification for medical decisions is congruent with educational expectations in the UME literature [14], [15]. According to AAMC’s core entrustable professional activities for UME, students are expected to be able to accurately document their clinical decision-making [3]. Furthermore, because student hospital admission notes are included in the medical record and are evaluated by the attending physicians as part of the student’s grade (and are thus high stakes), we feel confident that the notes represented each student’s best attempt to illustrate their reasoning.

As part of the existing clerkship STA sessions, students prepared one hospital admission note per week for grading and feedback by their STA physician. One investigator (DD), who was not involved in evaluation of the notes, collected these notes from all students on the medicine clerkship during the intervention period. The same investigator de-identified the notes by removing student names, identifying patient information, and study site information. Two independent investigators (EB, AF) evaluated 30% of the notes in a blinded fashion. The investigators did not provide feedback to the students and did not see the feedback provided by the STA physician. Percent agreement was calculated by item, with overall agreement of 69%, consistent with the fair to moderate agreement reported in the tool’s validation study [13]. After calculation of percent agreement, one investigator (EB) assessed the remainder of the notes.
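The inter-rater check described above is simple arithmetic: the fraction of items on which the two raters gave identical scores. A minimal sketch (variable names and the example scores are illustrative, not study data):

```python
def percent_agreement(rater_a, rater_b):
    """Fraction of items on which two raters gave identical scores."""
    if len(rater_a) != len(rater_b):
        raise ValueError("raters must score the same set of items")
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)


# Example: two raters scoring 10 note items on a 1-3 scale.
a = [3, 2, 2, 1, 3, 2, 3, 1, 2, 2]
b = [3, 2, 1, 1, 3, 3, 3, 1, 2, 1]
print(percent_agreement(a, b))  # 7 of 10 items match -> 0.7
```

Raw percent agreement does not correct for chance agreement; chance-corrected statistics such as Cohen's kappa are a common complement when raters use a small ordinal scale.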

Perception of educational experiences

The students were surveyed regarding their attending physicians’ use of clinical reasoning terminology, their attending physicians’ delineation of their own clinical reasoning, and the utility of the clinical reasoning concepts in their daily work on the rotation. Responses were graded on a five-point Likert-style scale.

Data analysis

We conducted a priori power analyses to assure at least 80% power to detect a 10-point difference in average skill scores between groups, and a four-point difference in knowledge scores between groups. For analysis of quiz and Likert-scale data, we compared mean group scores using t-tests for two independent samples. For analysis of IDEA tool ratings, we calculated mean scores per student, overall and per item, and compared groups using t-tests for two independent samples.
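The core comparison here, the two-independent-samples t-test, can be sketched outside of Stata. Below is a minimal Welch (unequal-variance) t statistic in Python; whether the authors pooled variances is not stated, so treating Welch's form as the default is an assumption, and the group scores are toy numbers, not study data.

```python
from statistics import mean, variance


def welch_t(x, y):
    """Welch's two-sample t statistic (robust to unequal variances)."""
    vx, vy = variance(x), variance(y)  # sample variances (n - 1 denominator)
    se = (vx / len(x) + vy / len(y)) ** 0.5
    return (mean(x) - mean(y)) / se


# Toy quiz scores for two groups (illustrative only).
intervention = [12, 14, 13, 15]
control = [10, 11, 9, 10]
print(round(welch_t(intervention, control), 2))
```

In practice one would use a library routine (e.g. SciPy's `ttest_ind`) that also returns degrees of freedom and a p-value; the sketch only shows where the statistic comes from.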

In secondary subgroup analyses, we compared mean IDEA tool ratings by study group assignments within each of the experienced student cohorts. We also compared mean IDEA tool ratings by study group using only admission notes submitted during week 1 of the intervention period. We performed all analyses using Stata SE 14.1 for Windows (Stata Corp, College Station, TX, USA).


Results

Participant characteristics

A total of 67 of 68 (99%) eligible students participated in the study. One student from the control group requested to be excluded from the assessments. Overall, 40 (60%) students were in the experienced cohort, while 27 (40%) students were in the novice cohort. The intervention group consisted of 34 (51%) students, while 33 (49%) students formed the control group; this distribution of intervention and control students was the same in the experienced and novice cohorts.


Knowledge of clinical reasoning concepts

At the end of the 4-week intervention period, 66 of the 67 (99%) students completed the 20-point clinical reasoning concept quiz. Students in the intervention group, on average, scored higher than students in the control group (13.30±2.30 vs. 10.88±3.04, p<0.001).


Clinical reasoning skills

We collected 250 of 268 (93%) eligible hospital admission notes. The remaining 7% of the notes were not turned in to study investigators. There were no significant differences in mean total IDEA scores by study group assignments or by comparison of novice and experienced students. However, students in the intervention group, on average, scored higher by item in the data synthesis and the diagnostic reasoning portions of the IDEA assessment tool (Table 1). This superior performance was also seen when we limited our analysis to admission notes collected during week 1 (Appendix 1). Data stratified by experience cohort are displayed in Appendix 2.

Table 1:

Mean hospital admission note scores per item using IDEA assessment tool.

Perception of educational experiences

At the end of the study period, 66 of 67 (99%) students completed a post-curriculum survey. Students in the intervention group reported high levels of agreement with a statement regarding use of clinical reasoning skills learned during the curriculum in their clinical work (3.83±0.70 out of 5), with experienced students reporting higher ratings than novice students (4.16±0.50 vs. 3.27±0.65, p<0.01).

Students in the intervention group reported higher agreement with the statement, “My attending explicitly discussed common clinical reasoning terminology”, in reference to both ward and STA physicians. Students in the intervention group also reported higher ratings of agreement with the statement, “My attending explicitly outlined his/her clinical reasoning on most cases”, in reference to STA physicians (Table 2).

Table 2:

Mean Likert ratings of agreement with statements regarding clinical reasoning education.


Discussion

In this pseudo-randomized and controlled study of a clinical reasoning curriculum for third-year medical students on an internal medicine clerkship, we observed superior knowledge of clinical reasoning concepts, superior utilization of clinical reasoning skills in admission notes, and increased awareness of attending physician utilization of clinical reasoning concepts in the clinical and conference room settings. To our knowledge, this study represents the first investigation of a curricular intervention’s impact on clinical reasoning skills in the context of patient care, as opposed to simulated or written clinical cases.

Clinical reasoning vocabulary and concepts can serve as a shared foundation for improvement by informing assessment methods, remediation plans, and reflection on one’s own cognitive performance [16]. Medical students value instruction on clinical reasoning concepts [17] because these concepts provide an intuitive structure for reasoning, and students find the subject helpful for improving their own clinical reasoning [18], [19]. Our demonstration of knowledge durability at 4 weeks contrasts with previous studies, which demonstrated only immediate knowledge changes [20], [21]. Whether gaining such knowledge contributes to the development and refinement of reasoning skills is unknown.

The ultimate goal of clinical reasoning interventions is the enhancement of reasoning skills at the point of patient care. Our evaluation of student reasoning skills in the context of patient care distinguishes our study from previous investigations and suggests that the introduction of clinical reasoning concepts may contribute to improvements in reasoning skills in medical students on an internal medicine clerkship. While the absolute difference in IDEA skill ratings between our intervention and control groups was small (0.3 points on a three-point scale), this difference arose in the context of multiple factors that were likely to bias results toward the null, including the brief, one-time nature of the intervention, the well-established standard educational experiences provided to both intervention and control groups, and the ample opportunities for feedback from clinical supervisors for both groups. At the time of a 2016 systematic review, no published educational intervention had evaluated an outcome higher than 2b (acquisition of knowledge or skills) on Kirkpatrick’s adapted hierarchy. Our study is the first to evaluate a Kirkpatrick Level 3 (change in behavior with transfer of knowledge to the workplace) outcome for clinical reasoning education.

The optimal methods for teaching clinical reasoning skills have not been established. A 2017 cross-sectional survey of internal medicine clerkship directors revealed that the most common settings for clinical reasoning education within existing UME internal medicine programs are attending rounds and morning report. A small percentage of programs reported sessions or online modules specifically devoted to clinical reasoning education [4]. Our findings suggest that use of online programs coupled with clinical reasoning-specific sessions may enhance clinical reasoning education within UME.

We also found that students who were exposed to our curriculum reported an increased recognition of an explicit clinical reasoning approach in their role models. This suggests an increased awareness and identification of clinical reasoning concepts by students in the intervention group. This finding lends support to the idea that education can prime students to identify and learn from reasoning by their clinical teachers that they may have otherwise missed.

Our study has several limitations. First, it was conducted at a single US allopathic medical school, limiting generalizability. As we randomized students by site, it is possible that differences in the teachers, learning environment, or other site-specific characteristics confounded the relationship between our intervention and outcomes. For instance, it is possible that attending physicians at the intervention site were more aware of or more facile with clinical reasoning terminology because of their proximity to the curriculum than faculty at the control sites. However, in addition to the previously mentioned similarities in attending physicians between sites, the training sites share the same population of residents and similar patient populations. Additionally, we observed superior clinical reasoning skills in the intervention group compared to controls even during the first week of the intervention period, when exposure to their clinical training site was limited. We also found that intervention group students had superior performance in areas that were targeted by our curriculum (data synthesis, diagnostic reasoning), but not in areas left unaddressed (data gathering). Collectively, these findings suggest that the curriculum (and not the site) was responsible for the outcomes.

Lastly, we relied on written documentation in hospital admission notes as a proxy for the students’ clinical reasoning. It is possible that this mode of assessment failed to capture aspects of reasoning that were not written. We also cannot exclude the possibility that students may have received feedback on the clinical reasoning documented in their notes from their attending physicians or residents who were working with them. However, we would not expect these effects to differ by site.

In conclusion, we have demonstrated that structured clinical reasoning education can lead to improved knowledge of clinical reasoning concepts and superior written clinical reasoning skills in third-year medical students on an internal medicine clerkship. Our findings suggest that online modules coupled with a clinical reasoning workshop are effective and can prime students to more readily identify reasoning processes in their clinical role models. Next steps include development and evaluation of clinical reasoning curricula for pre-clinical medical students. Medical educators should consider addition of online modules and/or workshop sessions to introduce clinical reasoning concepts and facilitate clinical reasoning skill development.


Appendix 1:

Mean week 1 hospital admission note scores per item using IDEA assessment tool.

Appendix 2:

Mean hospital admission note scores per item using IDEA assessment tool by experience cohort.


References

1. Makary MA, Daniel M. Medical error: the third leading cause of death in the US. Br Med J 2016;353:i2139.
2. Balogh EP, Miller BT, Ball JR. Improving diagnosis in health care. Washington, DC: National Academies Press, 2015.
3. Flynn T. Core entrustable professional activities for entering residency: curriculum developers’ guide. Washington, DC: Association of American Medical Colleges, 2014.
4. Rencic J, Trowbridge RL Jr, Fagan M, Szauter K, Durning S. Clinical reasoning education at US medical schools: results from a national survey of internal medicine clerkship directors. J Gen Intern Med 2017;32:1242–6.
5. Lambe KA, O’Reilly G, Kelly BD, Curristan S. Dual-process cognitive interventions to enhance diagnostic reasoning: a systematic review. BMJ Qual Saf 2016;25:808–20.
6. Nendaz MR, Gut AM, Louis-Simonet M, Perrier A, Vu NV. Bringing explicit insight into cognitive psychology features during clinical reasoning seminars: a prospective, controlled study. Educ Health (Abingdon) 2011;24:496.
7. Round AP. Teaching clinical reasoning: a preliminary controlled study. Med Educ 1999;33:480–3.
8. Graber ML, Sorensen AV, Biswas J, Modi V, Wackett A, Johnson S, et al. Developing checklists to prevent diagnostic error in Emergency Room settings. Diagnosis (Berl) 2014;1:223–31.
9. Musgrove JL, Morris J, Estrada CA, Kraemer RR. Clinical reasoning terms included in clinical problem solving exercises? J Grad Med Educ 2016;8:180–4.
10. Croskerry P. A universal model of diagnostic reasoning. Acad Med 2009;84:1022–8.
11. Trowbridge RL, Rencic JJ, Durning SJ, editors. Teaching clinical reasoning. Philadelphia, PA: American College of Physicians, 2015.
12. Friedman MH, Connell KJ, Olthoff AJ, Sinacore JM, Bordage G. Medical student errors in making a diagnosis. Acad Med 1998;73(10 Suppl):S19–21.
13. Baker EA, Ledford CH, Fogg L, Way DP, Park YS. The IDEA assessment tool: assessing the reporting, diagnostic reasoning, and decision-making skills demonstrated in medical students’ hospital admission notes. Teach Learn Med 2015;27:163–73.
14. Smith S, Kogan JR, Berman NB, Dell MS, Brock DM, Robins LS. The development and preliminary validation of a rubric to assess medical students’ written summary statements in virtual patient cases. Acad Med 2016;91:94–100.
15. Williams RG, Klamen DL. Examining the diagnostic justification abilities of fourth-year medical students. Acad Med 2012;87:1008–14.
16. Dhaliwal G, Ilgen J. Clinical reasoning: talk the talk or just walk the walk? J Grad Med Educ 2016;8:274–6.
17. Goss JR. Teaching clinical reasoning to second-year medical students. Acad Med 1996;71:349–52.
18. Gay S, Bartlett M, McKinley R. Teaching clinical reasoning to medical students. Clin Teach 2013;10:308–12.
19. Lee A, Joynt GM, Lee AK, Ho AM, Groves M, Vlantis AC, et al. Using illness scripts to teach clinical reasoning skills to medical students. Fam Med 2010;42:255–61.
20. Nendaz MR, Bordage G. Promoting diagnostic problem representation. Med Educ 2002;36:760–6.
21. Reilly JB, Ogdie AR, Von Feldt JM, Myers JS. Teaching about how doctors think: a longitudinal curriculum in cognitive bias and diagnostic error for residents. BMJ Qual Saf 2013;22:1044–50.

Supplementary Material

The online version of this article offers supplementary material (https://doi.org/10.1515/dx-2018-0063).

About the article

Corresponding author: Eliana Bonifacino, MD, MS, Assistant Professor of Medicine, Department of Medicine, University of Pittsburgh School of Medicine, 200 Lothrop Street 9 South, Pittsburgh, PA 15213, USA

Received: 2018-08-07

Accepted: 2019-03-07

Published Online: 2019-03-28

Published in Print: 2019-06-26

Author contributions: All the authors have accepted responsibility for the entire content of this submitted manuscript and approved submission.

Research funding: Funding support was provided by a grant from the Thomas H. Nimick, Jr. Competitive Research Fund of UPMC Shadyside Hospital and the Shadyside Foundation. Support for development of online modules was provided by a grant from the Hearst Foundations awarded to Dr. William Follansbee.

Employment or leadership: None declared.

Honorarium: None declared.

Competing interests: The funding organization(s) played no role in the study design; in the collection, analysis, and interpretation of data; in the writing of the report; or in the decision to submit the report for publication.

Citation Information: Diagnosis, Volume 6, Issue 2, Pages 165–172, ISSN (Online) 2194-802X, ISSN (Print) 2194-8011, DOI: https://doi.org/10.1515/dx-2018-0063.


©2019 Walter de Gruyter GmbH, Berlin/Boston.
