
Uptake and impact of a clinical diagnostic decision support tool at an academic medical center

John S. Barbieri, Benjamin French and Craig A. Umscheid
From the journal Diagnosis

Abstract

Background: Use of differential diagnosis (DDX) generators may reduce the incidence of misdiagnosis-related harm, but there is a paucity of research examining the use and impact of such systems in real-world settings.

Methods: In September 2012, the DDX generator VisualDx was made available across our entire academic healthcare system. We examined the use of VisualDx by month for the 18 months following its introduction. In addition, we compared the number of inpatient dermatology consults requested per month at the flagship hospital of our healthcare system for the 12 months before versus 18 months after VisualDx introduction.

Results: Across our entire academic healthcare system, there were a median of 474 (interquartile range 390–544) unique VisualDx sessions per month. VisualDx was accessed most frequently through mobile devices (35%) and the inpatient electronic health record (34%). Prior to VisualDx introduction, the number of inpatient dermatology consultations requested per month at the flagship hospital of our healthcare system was increasing non-significantly (by 1.0 consults per month, 95% CI –2.5 to 4.6, p=0.54), and this rate remained 1.0 consults per month (95% CI –0.9 to 2.9, p=0.27) following its introduction (p=0.99 comparing post- versus pre-introduction rates).

Conclusions: The DDX generator VisualDx was regularly used, primarily on mobile devices and inpatient workstations, and was not associated with a change in inpatient dermatology consultation requests. Given the interest in DDX generators, it will be important to evaluate further the impact of such tools on the quality and value of care delivered.

Introduction

Differential diagnosis (DDX) generators have been suggested as a potential tool to reduce the incidence of misdiagnosis-related harm, which is estimated to result in 40,000–80,000 deaths in US hospitals annually [1]. By providing access to a broad differential, DDX generators can potentially help users overcome cognitive biases (such as availability bias) [2, 3] and fund-of-knowledge deficiencies by directing them to diagnoses they might not otherwise have considered. In fact, simulations suggest that DDX generators have the potential to improve diagnostic accuracy [4–7]. However, there is a paucity of research examining the use of such systems in real-world settings [8], and even less is known about the impact of DDX generator use on patient care and resource utilization [9, 10]. These issues are important because barriers such as provider overconfidence and workflow constraints could impede the use of DDX tools in real-world settings. Moreover, DDX generators by design suggest numerous potential diagnoses, which could increase unnecessary testing and specialty consultation, along with their associated costs and harms, particularly in the hands of less experienced clinicians.

One area where DDX generators offer particular promise for reducing diagnostic errors is dermatology. Research suggests dermatologic conditions are often misdiagnosed [5, 11, 12], which can be exacerbated when access to dermatologists is limited [13]. VisualDx is a commercially available DDX generator that can help providers diagnose the causes of skin findings, and preliminary studies suggest it may have the potential to reduce diagnostic error [5]. It can be accessed through internet browsers, a mobile app, and an integration with UpToDate (an electronic clinical care resource covering over 20 specialties). VisualDx includes over 1,300 diseases with more than 28,000 associated images, with new disease summaries and images added over time to ensure coverage of both common and uncommon dermatologic conditions. It is currently used by the US Department of Veterans Affairs and a number of other large healthcare systems. The VisualDx interface allows users to either build a differential based on history and exam findings or browse clinical summaries and images of select diseases as a point-of-care clinical reference.

This study describes the use of the DDX generator VisualDx when made available across our entire academic healthcare system and examines its impact on inpatient dermatology consult requests at the flagship hospital of our healthcare system.

Materials and methods

Study setting

The DDX generator was implemented and evaluated in a quaternary care academic healthcare system comprising three acute care teaching hospitals with a combined capacity of more than 1,600 beds, and with 75,000 annual inpatient admissions and 2.1 million annual outpatient visits. All inpatient units within the healthcare system use Sunrise Clinical Manager version 5.5 (Allscripts, Chicago, IL, USA) as their electronic health record (EHR), all outpatient clinics use EpicCare (Epic Systems Corporation, Verona, WI, USA), and all emergency departments use a locally developed EHR, EMTRAC. Along with VisualDx, a number of other subscription point-of-care evidence summary resources were available at our institution during the study timeframe, including UpToDate, DynaMed and MDConsult.

Implementation

In September 2012, VisualDx was made available across our entire academic healthcare system to all healthcare system trainees and staff through dropdown tool menus of the inpatient, outpatient and emergency department EHRs, the library website, the search function of UpToDate, and a downloadable mobile app. It was advertised through broadcast emails, screensavers on inpatient computer terminals, and select in-person demonstrations. These advertisements and demonstrations occurred predominantly in September 2012 during the initial implementation and were repeated in Spring 2013 after the first 6 months of use.

Data collection

Data were collected on the use of VisualDx across our entire academic healthcare system during the first 18 months after its introduction. In addition, data were obtained on the number of inpatient dermatology consults requested at the flagship hospital of our healthcare system in the 12 months preceding and the 18 months following the introduction of VisualDx. Only 12 months of data prior to the introduction of VisualDx were included in the analysis given major changes in inpatient team structures implemented on July 1, 2011 following new housestaff duty hour regulations.

Statistical analysis

The number, duration, and source of VisualDx sessions were summarized using standard descriptive statistics. A linear regression model was used to compare the number of inpatient dermatology consults requested per month at the flagship hospital of our academic healthcare system before versus after the introduction of VisualDx [14, 15]. The model included an interaction term between calendar month and an indicator for VisualDx availability, which allowed us to determine whether implementation of VisualDx was associated with an immediate change in the absolute number of consults requested, as well as with a change in the rate of consults over time. In addition, because of significant month-to-month fluctuation in inpatient dermatology consult requests at the flagship hospital of our healthcare system, we constructed a model that included month as a categorical variable to assess for seasonal effects. Finally, we constructed a model that added the number of VisualDx sessions per month, to explore whether the volume of VisualDx usage was associated with changes in consult requests. This quasi-experimental analysis [16] allowed us to control for secular trends and to assess the pre-implementation trend, the difference in consult rates immediately before and after implementation, and the post-implementation trend. Statistical analyses were performed using Microsoft Excel (Microsoft Corporation, Redmond, WA, USA) and Stata 13 (StataCorp, College Station, TX, USA).
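To make the model structure concrete, the following is a minimal sketch of this segmented (interrupted time series) regression in Python with statsmodels; the actual analysis was performed in Stata, and the monthly consult counts and variable names below are simulated placeholders rather than study data.

```python
# Sketch of the segmented regression described above, assuming 12 pre- and
# 18 post-implementation months. Counts are simulated, not study data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

n_pre, n_post = 12, 18
rng = np.random.default_rng(0)

df = pd.DataFrame({
    "month": np.arange(n_pre + n_post),   # calendar time index (0 = 9/2011)
    "post": [0] * n_pre + [1] * n_post,   # indicator for VisualDx availability
})
# Months elapsed since implementation (0 throughout the pre-period)
df["months_post"] = np.maximum(df["month"] - (n_pre - 1), 0)
# Simulated counts: baseline ~60 consults/month, upward trend of 1.0/month
df["consults"] = 60 + 1.0 * df["month"] + rng.normal(0, 10, len(df))

# consults = b0 + b1*month + b2*post + b3*months_post, where:
#   b1 = pre-implementation trend (consults per month)
#   b2 = immediate level change at implementation
#   b3 = change in trend after implementation (the month-by-post interaction)
fit = smf.ols("consults ~ month + post + months_post", data=df).fit()
print(fit.summary())

# Seasonality check, analogous to the model with month as a categorical
# variable: add calendar month-of-year as a fixed effect.
df["month_of_year"] = (df["month"] + 8) % 12 + 1  # series starts in September
fit_seasonal = smf.ols(
    "consults ~ month + post + months_post + C(month_of_year)", data=df
).fit()
```

In this parameterization, the p-value on the months_post coefficient corresponds to the comparison of post- versus pre-introduction consult rates reported in the Results.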

The study received expedited approval and a HIPAA waiver from the University of Pennsylvania Institutional Review Board.

Results

There were a median of 474 [interquartile range (IQR) 390–544] unique VisualDx sessions per month across our entire academic healthcare system (Figure 1). VisualDx use increased following initial broadcast emails and in-person demonstrations in September 2012 and a second set of broadcast emails and in-person demonstrations in Spring 2013. Overall, VisualDx was accessed through mobile devices (35%), the inpatient (34%), outpatient (11%), and emergency department (1%) EHRs, and via searches in UpToDate (19%) (Table 1). The ten most common diagnoses viewed (12% of total views) were: atopic dermatitis, scabies, allergic contact dermatitis, bedbug bites, tinea corporis, herpes zoster, psoriasis, folliculitis, erythema multiforme minor, and pityriasis rosea.

Figure 1: Number of VisualDx sessions per month following introduction across the entire University of Pennsylvania Health System (9/2012–2/2014). Percentages inside bars represent each location's contribution to total use for that month. For September and October it was not possible to identify the specific source of use, so only the total number of sessions is reported.


Table 1

VisualDx usage characteristics by location.

| Location | Total sessions, % | Median session length, minutes (IQR) | Topic views per session: 1, % | 2–5, % | 6–10, % | >10, % |
|---|---|---|---|---|---|---|
| Mobile | 35 | 1.18 (0.22–4.37) | 56 | 36 | 6 | 2 |
| Inpatient EHR | 34 | 1.53 (0.23–5.37) | 44 | 41 | 9 | 6 |
| UpToDate | 19 | 0.28 (0.07–1.35) | 81 | 15 | 3 | 1 |
| Outpatient EHR | 11 | 1.65 (0.57–3.52) | 50 | 43 | 6 | 1 |
| ED EHR | 1 | 2.70 (1.12–5.18) | 47 | 43 | 9 | 1 |
| Combined | 100 | 1.10 (0.18–4.00) | 57 | 35 | 6 | 2 |

ED, emergency department; EHR, electronic health record; IQR, interquartile range.

The median VisualDx session length was 1.10 min (IQR 0.18–4.00). Forty-nine percent of sessions lasted <1 min, 30% lasted 1–5 min, 9% lasted 5–10 min, and 12% lasted more than 10 min. Session length was longest when VisualDx was accessed from the emergency department EHR (median 2.70 min, IQR 1.12–5.18), followed by the outpatient EHR (median 1.65 min, IQR 0.57–3.52), the inpatient EHR (median 1.53 min, IQR 0.23–5.37), and mobile devices (median 1.18 min, IQR 0.22–4.37) (Table 1). Session length was shortest when VisualDx was accessed through UpToDate (median 0.28 min, IQR 0.07–1.35). The majority of sessions (57%) involved only one topic view; 34% involved 2–5 topic views, 6% involved 6–10, and 3% involved more than ten. Of note, 81% of sessions accessed through UpToDate involved only one topic view.

Prior to the introduction of VisualDx, the number of inpatient dermatology consults requested at the flagship hospital of our healthcare system increased at a rate of 1.0 per month (95% CI –2.5 to 4.6, p=0.54), and this rate remained 1.0 per month (95% CI –0.9 to 2.9, p=0.27) following the introduction of VisualDx (Figure 2); the difference between these rates was not statistically significant (p=0.99). In addition, the introduction of VisualDx was not associated with a statistically significant change in the absolute number of inpatient dermatology consults requested per month (4.6 fewer consults per month after introduction, 95% CI –33.7 to 24.6, p=0.75). Models constructed to assess for seasonality, as well as for an association between the number of VisualDx sessions and the number of inpatient consults requested per month, demonstrated no significant effects.

Figure 2: Number of new dermatology consults requested per month at the Hospital of the University of Pennsylvania before and after the introduction of VisualDx (9/2011–2/2014). Diamonds represent months in the year prior to the introduction of VisualDx and squares represent months in the year and a half following its introduction. The dashed line represents the projected increase in inpatient consults based on the growth rate in consults before the introduction of VisualDx.


Discussion

This study is one of the first to describe the use of a DDX generator in a real-world setting and to examine its impact on patient care. The DDX generator VisualDx was regularly used across our entire academic healthcare system, primarily on mobile devices and inpatient workstations, and was not associated with a change in inpatient dermatology consult requests at the flagship hospital of our healthcare system. Messaging to staff and trainees was associated with increases in use, which subsequently waned to a steady state, potentially reflecting initial adoption followed by abandonment among a subset of users. The high level of use on mobile devices underscores the importance of mobile versions of point-of-care resources such as DDX generators.

The majority of VisualDx sessions were of short duration, with 79% lasting <5 min and 49% lasting <1 min. These results likely reflect the frequent use of this tool on mobile devices, and suggest that users either found what they were looking for or abandoned their search within a relatively short time frame. Interestingly, the shortest sessions with the fewest topic views were those accessed through UpToDate. Because UpToDate is a popular clinical resource, these users may have been reading about a particular topic in UpToDate and turning to VisualDx only to view the images associated with that topic, which could explain why these sessions were so short and most often involved a single topic view. Given these patterns of real-world use, DDX interfaces that allow rapid generation of differential diagnoses, with easily accessible clinical reference content, may integrate more effectively into clinical settings.

Our study has several limitations. First, our quasi-experimental design may not fully account for secular changes in inpatient dermatology consultations over time. In addition, our design lacked a concurrent control group, which would have allowed us to examine changes in inpatient dermatology consultations in a setting without VisualDx. However, our use of an interrupted time series methodology should protect against most secular changes in inpatient dermatology consultations not associated with VisualDx. Moreover, we constructed additional models to examine potential confounders such as seasonality.

Our study was also limited to a single academic healthcare system; thus, our experience may not be generalizable to healthcare systems with different EHRs and staff. However, the integration of our DDX generator into commercially available EHRs serving a diverse array of patient populations, clinical services, and service models throughout our entire academic healthcare system may improve the generalizability of our experience to other settings. Lastly, due to limitations in the available data, we were unable to assess whether the introduction of VisualDx had an impact on individual patient-level outcomes or quality of care.

In conclusion, the DDX generator VisualDx was regularly used across our entire academic healthcare system, primarily on mobile devices and inpatient workstations, and was not associated with a change in inpatient dermatology consultation requests at the flagship hospital of our healthcare system. Given the interest in using DDX generators in the healthcare setting, future research should continue to evaluate the impact of such tools on patient care quality and the value of care delivered.


Corresponding author: Craig A. Umscheid, MD, MS, Perelman School of Medicine at the University of Pennsylvania, 3535 Market Street, Mezzanine, Suite 50, Philadelphia, PA 19104, USA, Phone: +215-349-8098, Fax: +215-349-5829, E-mail:

Acknowledgments

We would like to acknowledge Diana Greene and Kyle Aaronson of the Department of Dermatology at the University of Pennsylvania Perelman School of Medicine for their assistance with acquiring dermatology consultation data, and Donna Reinhart and Art Papier, MD of VisualDx for their assistance with acquisition of VisualDx usage data at Penn Medicine.

  1. Author contributions: All the authors have accepted responsibility for the entire content of this submitted manuscript and approved submission. Study concept and design: Barbieri, Umscheid; Analysis and interpretation of data: Barbieri, French, Umscheid; Drafting of the manuscript: Barbieri; Critical revision of the manuscript for important intellectual content: Barbieri, French, Umscheid; Statistical analysis: Barbieri, French; Obtained funding: Umscheid; Administrative, technical, or material support: Barbieri, French, Umscheid; Study supervision: Umscheid. The content of this paper is solely the responsibility of the authors and does not necessarily represent the official views of the NIH.

  2. Research funding: VisualDx had no role in the data analysis or interpretation related to their program, nor did they provide funding for the evaluation. Dr. Barbieri is the author of two disease summaries for VisualDx. He is not employed by VisualDx and does not receive any financial compensation for his work. Dr. French has no potential conflicts to disclose. Dr. Umscheid’s contribution to this project was supported in part by the National Center for Research Resources, Grant UL1RR024134, which is now at the National Center for Advancing Translational Sciences, Grant UL1TR000003.

  3. Employment or leadership: None declared.

  4. Honorarium: None declared.

  5. Competing interests: The funding organization(s) played no role in the study design; in the collection, analysis, and interpretation of data; in the writing of the report; or in the decision to submit the report for publication.

  6. Ethical Approval: University of Pennsylvania Institutional Review Board: Protocol #818265.

References

1. Graber ML, Trowbridge R, Myers JS, Umscheid CA, Strull W, Kanter MH. The next organizational challenge: finding and addressing diagnostic error. Jt Comm J Qual Patient Saf 2014;40:102–10.

2. Schiff GD, Kim S, Abrams R, Cosby K, Lambert B, Elstein AS, et al. Diagnosing diagnosis errors: lessons from a multi-institutional collaborative project. In: Henriksen K, Battles JB, Marks ES, Lewin DI, editors. Advances in Patient Safety: From Research to Implementation (Volume 2: Concepts and Methodology) [Internet]. Rockville (MD): Agency for Healthcare Research and Quality (US); 2005. Accessed March 6, 2015.

3. Mamede S, van Gog T, van den Berge K, Rikers RM, van Saase JL, van Guldener C, et al. Effect of availability bias and reflective reasoning on diagnostic accuracy among internal medicine residents. J Am Med Assoc 2010;304:1198–203.

4. Segal MM, Williams MS, Gropman AL, Torres AR, Forsyth R, Connolly AM, et al. Evidence-based decision support for neurological diagnosis reduces errors and unnecessary workup. J Child Neurol 2014;29:487–92.

5. David CV, Chira S, Eells SJ, Ladrigan M, Papier A, Miller LG, et al. Diagnostic accuracy in patients admitted to hospitals with cellulitis. Dermatol Online J 2011;17:1.

6. Graber ML, Mathew A. Performance of a web-based clinical diagnosis support system for internists. J Gen Intern Med 2008;23(Suppl 1):37–40.

7. Bond WF, Schwartz LM, Weaver KR, Levick D, Giuliano M, Graber ML. Differential diagnosis generators: an evaluation of currently available computer programs. J Gen Intern Med 2012;27:213–9.

8. Ramnarayan P, Winrow A, Coren M, Nanduri V, Buchdahl R, Jacobs B, et al. Diagnostic omission errors in acute paediatric practice: impact of a reminder system on decision-making. BMC Med Inform Decis Mak 2006;6:37.

9. Umscheid CA, Hanson CW III. A follow-up report card on computer-assisted diagnosis – the Grade: C+. J Gen Intern Med 2012;27:142–4.

10. Bright TJ, Wong A, Dhurjati R, Bristow E, Bastian L, Coeytaux RR, et al. Effect of clinical decision-support systems: a systematic review. Ann Intern Med 2012;157:29–43.

11. Bauer J, Maroon M. Dermatology inpatient consultations: a retrospective study. J Am Acad Dermatol 2010;62:518–9.

12. Hepburn MJ, Dooley DP, Ellis MW. Alternative diagnoses that often mimic cellulitis. Am Fam Physician 2003;67:2471.

13. Resneck J Jr, Pletcher MJ, Lozano N. Medicare, Medicaid, and access to dermatologists: the effect of patient insurance on appointment access and wait times. J Am Acad Dermatol 2004;50:85–92.

14. French B, Heagerty PJ. Analysis of longitudinal data to evaluate a policy change. Stat Med 2008;27:5005–25.

15. Penfold RB, Zhang F. Use of interrupted time series analysis in evaluating health care quality improvements. Acad Pediatr 2013;13(6 Suppl):S38–44.

16. Wagner AK, Soumerai SB, Zhang F, Ross-Degnan D. Segmented regression analysis of interrupted time series studies in medication use research. J Clin Pharm Ther 2002;27:299–309.

Received: 2014-9-10
Accepted: 2015-2-20
Published Online: 2015-3-25
Published in Print: 2015-6-1

©2015, Craig A. Umscheid et al., published by De Gruyter

This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 3.0 License.