Open Access. Published by De Gruyter, February 14, 2020, under a CC BY 4.0 license.

Assessing the utility of a differential diagnostic generator in UK general practice: a feasibility study

  • Sudeh Cheraghi-Sohi, Rahul Alam, Mark Hann, Aneez Esmail, Stephen Campbell and Nicholas Riches
From the journal Diagnosis



Despite growing positive evidence supporting the potential utility of differential diagnostic generator (DDX) tools, uptake has been limited in terms of geography and settings and calls have been made to test such tools in wider routine clinical settings. This study aims to evaluate the feasibility and utility of clinical use of Isabel, an electronic DDX tool, in a United Kingdom (UK) general practice setting.


Mixed methods. Feasibility and utility were assessed prospectively over a 6-month period via usage statistics, surveys, and interview data generated from clinicians before and after Isabel was available for clinical use. Normalisation process theory (NPT) was utilised as a sensitising concept in the collection and analysis of the qualitative data.


Usage was extremely limited (n = 18 searches). Most potential users did not utilise the program, and among those who did (n = 6), usage was restricted and did not alter subsequent patient management. Baseline interview findings indicated some prior awareness of DDX tools and ambivalent views regarding their potential utility. Post-use interviews supported the analytics data and indicated low usage due to a range of endogenous (professional) and exogenous (organisational) factors.


This small exploratory study suggests that, in its current form, Isabel is unlikely to be utilised on a routine basis in primary care, but that it may have potential utility for diagnostic support in (1) education/training and (2) rare and diagnostically complex cases.


Differential diagnosis (DDX) generators are electronic diagnostic aids. Their user interfaces allow clinicians to input clinical data (e.g. symptoms), and their outputs consist of retrieved lists of differential diagnoses. Such tools can rapidly search large electronic medical databases and are predominantly web-based, facilitating easy access and flexibility in use while being continuously updated to reflect current evidence. The tools are designed to be interactive: they support the clinical diagnostic process by providing a list of potential diagnoses, and are therefore intended as a helpful adjunct to clinical acumen rather than as a source of a single ‘correct’ or most probable diagnosis. Evidence to date suggests that DDX tools can provide accurate diagnostic suggestions [1], [2], [3] and potentially improve practitioners’ diagnostic performance in multiple ways (e.g. information gathering, generating differential diagnoses, weighting diagnoses) [3], [4]. They have also been discussed as a potential means of improving diagnosis in the recent National Academies of Sciences, Engineering, and Medicine (NAM) report [5], which also highlights that evidence on the performance of DDX tools in routine clinical practice is currently lacking.

Despite growing positive evidence supporting the potential utility of DDX tools in supporting diagnostic decision-making, uptake has been limited in terms of geography and settings, with evidence and usage stemming predominantly from the United States and hospital (secondary care) settings [1]. The challenges for diagnostic decision-making in primary care differ from those in secondary care because of the need to take into account presentations with undifferentiated symptoms, whereas patients in secondary care often present with a working diagnosis. It has been suggested, for example, that in 25–50% of patient visits to family physicians, no disease-specific diagnosis is possible [6]. In primary care, the final diagnosis may not emerge until time has passed or further symptoms have developed, depending on the natural history of the presenting complaint. Furthermore, diagnoses within primary care are made in increasingly time-pressured and multi-agenda consultations [7]. It is within this complex context that there is increasing focus on the role of DDX tools as a useful aid to the primary care clinician in reaching a working diagnosis so that treatment and appropriate management can ensue. We sought to test whether such tools would have utility in aiding the diagnostic process of primary care clinicians in the UK by selecting one DDX generator, Isabel, for evaluation within a UK general practice setting.

Isabel (see Table 1 for a summary of key features) was selected for this study because, in comparison with other available DDX tools, it has been shown to provide the highest accurate diagnosis retrieval rates [1] as well as clinical utility, in terms of users reporting that they found it ‘useful’ and that it offered diagnoses not previously considered by the clinician [8], [9]. Conversely, a survey of registered users found that Isabel was not considered useful or convenient by 90% of users [10]. Only one study to date has sought to assess Isabel for use in a routine UK primary medical care setting, focusing on Isabel’s utility and uptake among clinicians. However, its outcomes were limited by poor uptake among potential users [10], with only 16 unique patient searches conducted across four general practices over the 3-month evaluation period. Explanations for this may include a lack of regular reminders about the software’s availability, a lack of local implementation champions and a relatively short evaluation period. There were also concerns about a lack of familiarity with the tool and the time taken to use it [5].

Table 1:

Isabel – key features.

Web based diagnosis checklist system that can operate as a stand-alone tool or as an integrated component of the electronic health record (EHR)
Interface requires patient’s age, gender, and positive clinical features plus travel history
The results, initially 10 diagnoses (more can be viewed with a single click), are displayed within seconds (<2 on average) including flagged ‘Don’t Miss Diagnoses’
Can be customised to suit local requirements
Powered by statistical natural language processing software

Materials and methods


In order to ascertain the feasibility of Isabel, a DDX tool, for routine primary medical care use in the UK, this study aimed to conduct a process evaluation of Isabel’s implementation within a UK general practice.


A large inner-city UK general practice in Greater Manchester, England was selected. The practice served approximately 18,000 patients, and its team included 17 general practitioners and two nurse practitioners, who were invited to take part in the study.


In relation to the second aim, a mixed-methods approach was taken. All clinicians with diagnostic responsibility (see Table 2 for sample details) were invited for interview before and 6 months after Isabel was made available for use. In total, 20 semi-structured interviews (11 before and nine after) were conducted by SCS and RA, both experienced qualitative health services researchers. The interviews were digitally recorded and professionally transcribed. In addition to the interview data, both post-use survey and usage data were recorded by Isabel. All clinicians were trained and provided with free access to Isabel for a 6-month period (October 2014–March 2015). Fortnightly reminders about the availability of the system were sent via the practice’s electronic health records (EHRs), so that a reminder was visible when the GP logged into the system, and one of the authors (AE), a senior GP with 23 years of service within the practice who still worked three clinical sessions per week there, acted as the local practice champion. At the start of the study period, AE provided all clinical colleagues (during a 1-h internal practice meeting) with exemplars from the literature of where Isabel had previously been found to have clinical utility. AE also discussed his personal usage of Isabel prior to the study period (both positive and negative) as evidence of potential (dis)utility, and reminded colleagues of its availability for use at regular fortnightly practice meetings where all staff were present.

Table 2:

Sample characteristics.

ID | Role | Pre and post data | Experience in primary care, years
  1. GP, general practitioner; NP, nurse practitioner.

Qualitative data collection and analysis were guided by the use of normalisation process theory (NPT) as a sensitising concept [11], [12]. NPT was initially developed as a model for assessing the likelihood of complex interventions becoming implemented (i.e. workable and integrated) into routine health care settings, in relation to the work that is involved. The model was later expanded into a more general theory and provides a way to assess or predict whether new types of practices, in this case the use of a DDX tool, may become embedded into existing routines/work, in this case diagnostic work within general practice. NPT has four main underlying constructs that influence implementation: coherence (i.e. meaning and sense-making by the participants); cognitive participation (i.e. engagement and commitment by the participants); collective action (i.e. the work needed to make the intervention function); and reflexive monitoring (i.e. reflection and appraisal of the intervention by participants) [13]. We used NPT to create a framework to examine the context of implementation and combined this with a thematic approach to examine and refine emerging themes [14]. Two researchers (SCS and RA) read the transcripts independently and, through comparative analysis, independently noted the main themes and sub-themes that emerged. Key themes and quotes were circulated to all the authors for comments and discussion, and a final set of themes and sub-themes was agreed by all authors. ATLAS.ti 6.2 qualitative analysis software (Berlin, Germany) was used to support the analysis.

Ethics approval and consent to participate

Institutional ethical approval was granted by the University of Manchester Research Ethics Committee 2 (Ref. 14148) and, in accordance with this approval, written consent was taken from all study participants.

Consent for publication

Written consent was taken from all participants regarding the use of their data for analysis and publication. All quotes are, however, anonymised in accordance with ethical procedures.


Results

Isabel analytics

Despite frequent reminders for clinicians to use Isabel on a regular basis, its usage was limited, and no meaningful quantitative analysis could be conducted on the basis of the usage statistics. The program was accessed 32 times and 18 searches were conducted (see Figure 1). The post-use survey was completed 11 times (see Table 3). Results indicate that Isabel broadened the clinician’s differential in 64% (7/11) of cases; however, in terms of wider utility, in the vast majority of cases Isabel did not affect referral or management decisions. The reasons for this low usage are confirmed (i.e. the two types of data were triangulated) and elaborated upon in the qualitative element of the study.

Figure 1: Isabel usage frequencies for 6-month study period (October 2014–March 2015).

Table 3:

Post-use Isabel Survey results.

1. How did Isabel PRO help with your differential diagnosis?
   (1) Confirmed my differential – 2
   (2) Broadened my differential – 7
   (3) No impact on my differential – 1
2. Did Isabel PRO influence your referral decisions? If so, how?
   (1) No longer needed to refer (go to 4) – 0
   (2) No longer needed to use the consultant advice line (go to 4) – 0
   (3) No impact on my referral – 9
3. If you did refer the patient, did Isabel PRO:
   (1) Help you refer more appropriately – 3
   (2) Have no impact – 7
4. Isabel PRO impacted your management plans by: (check all that apply)
   (1) Ordering a diagnostic test – 3
   (2) Cancelling a diagnostic test – 0
   (3) Ordering a medication – 1
   (4) Cancelling a medication – 0
   (5) No impact – 6

Qualitative results

Overall, just over half of the participants (6/11) showed some awareness of the existence of DDX tools, and these six participants all expressed some initial enthusiasm and understanding regarding their potential utility, particularly for supplementing and aiding the diagnostic process in areas in which they felt less proficient, as well as for rare or complex cases:

“I think it could be really helpful at times, I think particularly on areas that I [sic] aren’t my…I don’t feel very confident in orthopaedic medicine, so if I had somebody with knee pain and little toe pain, to put it in and come out with a diagnosis it would be interesting, yes”. (GP01, baseline interview)

“[Isabel] could be really helpful, in highlighting overlooked things. And there are, I’m sure, some rarer conditions that are more often overlooked, which is a double whammy, in terms of impact for the patient.” (GP04, baseline interview)

On the other hand, some participants with no prior awareness of DDX tools or their potential application to primary care were ambivalent about their use once the interviewer described the tools and their potential utility. Any initial enthusiasm did not, however, translate into routine adoption of the Isabel tool during the evaluation period by any participant, with only six participants describing utilising Isabel on one or more occasions. This poor uptake was a consequence of a range of factors that negatively influenced implementation. These ranged from endogenous individual and professional factors (including concerns over time for use within the consultation, not perceiving it as a routine tool, the impact on the patient relationship, and the impact on the diagnostic process in terms of the cognitive work clinicians undertake) to exogenous organisational factors around contextual integration, such as a lack of integration of Isabel into the EHR and no collective engagement. These factors are discussed in relation to the broad NPT constructs to illustrate why Isabel, in its current form, is unlikely to become normalised within primary care consultations for routine diagnostic work.

Sense-making: prior-, post-use and actual use

Although participants indicated mostly positive or, at a minimum, ambivalent views on the potential utility of Isabel before they were able to use it, Isabel was not perceived to bring additional utility to the diagnostic process by those who actually utilised it. It was felt to be too time-consuming and cumbersome, producing too many outputs, including improbably rare conditions, which then required further time for the clinician to sift through. When requiring diagnostic support, participants described either asking for a colleague’s opinion or using existing electronic resources with which they were already familiar:

“But there are two aspects of the use: firstly online, I don’t have the time, if I’m here every second counts. Every second I’m using my time. Secondly, …I have other means, [and] use it [sic].” (GP05_6-month interview)

“…there are two things that I think that I do need, one is something like a differential diagnosis and the other is drug interactions, because that’s another thing that I go to different websites for. So this is another good thing because the one I’m using at the moment that’s more text based, so it doesn’t give me too much of a list to go through”. (GP09_6-month interview)

“I think it was a rash and I wasn’t sure what it was and I used it and the list that I came up with, I didn’t find it very helpful. I still went through it. I ended up referring back to teledermatology because it’s a very useful service and they came up with the answer”. (GP11_6-month interview)

The typical consultation was described as containing multiple competing agendas, and only tools or methods which would facilitate the processing of that work were valued and utilised. Isabel, however, was not seen as helpful to the processing of diagnostic work within the consultation, nor good enough to replace any pre-existing electronic resources or methods.

Professional reasons were also cited for the lack of utilisation, such as concerns about potential interference with the clinician-patient relationship, hindrance of the cognitive processes involved in making a diagnosis, and uncertainty about when to use the tool:

“I wouldn’t use it when the patient was there, because I think it would bring up things that would perhaps frighten them, and you get this list of… hundreds of different things, that could mean death… I don’t think that’s always helpful. And I would look at it afterwards, perhaps when I’m mulling things over on one of those patients I really wasn’t quite sure…” (GP05_6-month interview)

“I think, on the whole, I’m not sure where it’s going to sit in the mental processes, given that most of what we do is pattern recognition and then testing that out. And we do it incredibly early on, in the thinking process. A few seconds in. So I’m not quite sure, at what point, people would choose to try it out”. (GP03_6-month interview)

Among those who held wholly negative initial opinions of Isabel (n = 5), exposure to Isabel via training and free access to the program did not appear to subsequently influence their views:

P: “I will be honest, maybe I was a bit negative about Isabel to start with, if you remember? So that might have influenced how I used it or how I looked at it.”

I: “And now that you have used it, has your perception changed at all?”

P: “I still will go with the same opinion that I had initially”. (GP11_6-month interview)

Finally, some participants felt that Isabel may have a place in supporting diagnostic work, but in specific areas such as training or helping with rare or diagnostically complex cases (which would be difficult to define a priori), or that the technology was more suited to secondary care settings:

“It was great to look at stuff that we will never, ever see in general practice, but…or rarely see. If we do, it’s one in a million. But it’s good for that…when you’re looking at that stage, is it here that that patient is going to be? Or is it the hospital? Because our…you know, you get to the level of your diagnostic tests available. You know, we can’t do CTs and things. We can’t get more detail. So is Isabel for here or is it for here and also for secondary care?” (GP03_6-month interview)

“You know it will probably help you learn, help me collaborate my knowledge in the skills…” (GP06_6-month interview)

“You see, I’m not completely against it…because if I don’t know anything I’m going to use it as a learning tool as well at the same time…So if I think it is generating things that I didn’t know which is prompting me to learn more so it’s good, isn’t it?” (GP11_6-month interview)

“I think Isabel is a good programme, it’s primarily more orientated to a hospital setting. But again you can modify it to [be] user friendly…” (GP05_6-month interview)

Embedding Isabel into the consultation

Technology use is a routine part of UK primary care consultations in the form of EHRs. However, as described earlier, clinicians already hold some concerns about the impact of technology on clinician-patient interactions, and they expressed further concerns about introducing other electronic resources that might impinge further upon these interactions, especially resources that do not fit seamlessly into their workflow and/or add no value over existing resources. Those who did utilise Isabel did so after or outside of the consultation:

I: “Did you use any…Isabel within those consultations?”

R: “Yes, I would say probably, rarely actually when I had the patient with me. It was probably with cases which are very complex and uncertain. And it would be at a later stage, that yeah, you’d have the time maybe, to go in and just sort of put in, to bring up the differential really, and to see if there’s anything that maybe, you’re not looking at really, in that respect.” (NP09_6-month interview)

A minority of participants felt that uptake might be improved if Isabel were integrated into the EHR, as there were no prompts or other apparent incentives to utilise it. Thus, technological logistics hindered potential and actual Isabel use even among participants who felt they required diagnostic support and who were initially positive regarding DDX use:

“I suppose if you put it on computers as icons, we could just bring it up. That never happens as far as I’m aware, so…that would have been easy. I could just click on it. But then have to go, log in…remember my logins…if you’re not using something every day, it…you just tend not to use it unless it’s really easy to do. So if it on was, like…or perhaps I could link to it from EMIS and all the patient demographics automatically went through…That would be…that would save time. That would probably make it more useful”. (GP03_6-month interview)

“I think the other thing, if I’m honest, I think it probably, sort of, falls off my thought list, if you know what I mean. I don’t…I, sort of, forget it’s there and just don’t use it, you know”. (GP01_6-month interview)

Collective engagement and resourcing for Isabel implementation

Despite the presence of a local Isabel implementation champion (AE), the provision of training on Isabel, free access to the tool and regular reminders at practice meetings of the tool’s availability, it was clear from participants’ accounts that Isabel had not become normalised for use within individual consultations or as a valued tool more widely within the organisation. Clinicians described little collective enthusiasm for the tool and no positive feedback from colleagues that could have acted to positively affect uptake:

“I guess if a few people had come back, you know, if we talked about it and they said, oh, you know, I typed it in to Isabel and it came up with this and sure enough, that’s…sort of, like that…I suppose there hasn’t been, sort of, any…and I just, sort of think, oh I must remember, you know…I must try that next time. And there hasn’t been that…or I haven’t been involved in any, sort of, positive discussions, I suppose to remind me to continue to use it”. (GP01_6-month interview)

Finally, once Isabel was available for use, there was little requirement for any other practice resources to aid implementation/normalisation, except for the time required for the individual clinician to use the tool. As described previously, suggestions to improve this aspect of utility centred on the tool’s integration into the existing EHR.


Discussion

This small, pragmatic and exploratory study aimed to evaluate the feasibility and utility of Isabel, an electronic DDX generator, in a UK general practice setting.

Our process evaluation indicates that Isabel is unlikely to become normalised within UK primary care in its current form, despite implementation measures to enhance the likelihood of adoption. Triangulation of the analytics data and the qualitative findings indicated both low usage and a range of complex underlying factors, both endogenous and exogenous, contributing to this. GP views on the lack of ‘added value’ over existing diagnostic support mechanisms, the time needed to use the tool, concerns for the clinician-patient relationship and the lack of integration into the EHR/diagnostic workflow dominated. Similar concerns are reported elsewhere in the literature on introducing diagnostic support tools into UK primary medical care [15]. Potential utility was suggested for rare or diagnostically complex cases, as well as for highlighting knowledge gaps and thereby providing an impetus for directed learning. However, even these uses would constitute limited rather than routine use of the tool, and even then only when other existing electronic resources did not meet clinicians’ diagnostic support needs. Individual/professional views, or ‘sense-making’, with regards to Isabel were therefore the largest barrier to its implementation. In terms of feasibility and utility, only one other study [10] has sought to assess Isabel for use in a routine UK primary medical care setting; our findings support that study and suggest that Isabel is unlikely to become normalised in this setting in its current form.

These findings suggest that primary care clinicians in the UK do not currently see a place for Isabel as a routine part of their diagnostic work within the consultation. Primary care in the UK is a challenging environment for the introduction of a new diagnostic support tool. Firstly, the typical primary care consultation is complex, with multiple competing agendas [7]. Secondly, diagnostic work forms only part of the work within that consultation, and therefore a tool which appears cumbersome, producing a huge range of outputs for the clinician to work through, many of which are rare diagnoses with low probability, is unlikely to be valued. This is potentially why study participants suggested that the tool was more appropriate to a secondary care environment. We suggest this is because the primary care diagnostic environment is typified by patients with undifferentiated symptoms, leading to a wider range of potential diagnostic possibilities, whereas the secondary care clinician has more capability and capacity to work through and refine their diagnostic thinking alongside the potentially expansive outputs of a tool like Isabel. For example, clinicians in secondary care can order and receive diagnostic tests and gather other diagnostic information rapidly, which allows for rapid assimilation and decision-making. Finally, secondary care clinicians have the added benefit that their patients have been filtered by a primary care referral, which has already narrowed down the diagnostic possibilities. Our propositions are supported by the fact that tools like Isabel were initially developed for a secondary care environment.
Finally, a major barrier to introducing DDX tools in any clinical context is that clinicians are often unable to self-assess their diagnostic capabilities [16] and to identify if and when they need decision support; a tool intended to support their diagnostic capabilities is therefore unlikely to become normalised. Based on our findings, we highlight in Table 4 some key recommendations for improving the utility and uptake of DDX tools in general practice.

Table 4:

Key recommendations based on study results to enhance the general utility of DDX tools to clinicians in UK primary care.

Endogenous (professional) factors
Lack of awareness of DDX tools – Medical education needs to include the potential benefits of DDX tools to raise awareness amongst clinicians and provide/support DDX access during training and continuing medical education
Concerns over impact on patient-clinician relationship – Evidence is required to design effective consultation skills training to manage DDX tools and minimise concerns for both clinicians and patients
Uncertainty and variability as to when to use DDX tools – Medical education should utilise current evidence to guide clinicians as to when to utilise DDX tools in the cognitive/diagnostic process
Exogenous (organisational/technology) factors
Lack of integration of DDX tools into the EHR – Integration and pull-through of patient data from the EHR to the DDX tool would allow for improved workflow and improved uptake
Time – Lengthening of the consultation and refinement of the diagnostic lists/output would enhance the likelihood of clinical use
  1. DDX, differential diagnostic generator; EHR, electronic health record.

Utilising our study results, and in the context of the existing literature, we propose that DDX tools like Isabel may have more utility, and are therefore more likely to be adopted, in two specific clinical areas. The first relates to rare diseases and/or diagnostically challenging cases. Rare diagnoses were not part of our study sample but were suggested by participants as an area of use. By definition, such usage would be extremely limited in a primary care context, and identifying when to use a tool in the case of a rare disease is itself problematic. Nonetheless, tools specific to rare disease diagnosis are being developed [17], [18], and centres specialising in rare disease diagnosis, such as the University Clinic Marburg, Germany, are using such tools.

Secondly, there may be potential in terms of education and/or training, as our study shows that where Isabel was used, it was used outside the consultation as a means of learning rather than for ‘real-time’ help with a diagnostic problem. This may be explained by the concerns for the clinician-patient relationship suggested in our findings and by the literature studying the implementation of other new initiatives/innovations into primary care [19], including the introduction of computers themselves into the consultation. Despite this, some clinicians did refer to Isabel being useful in highlighting knowledge gaps, which means that any such gaps can be explored in more detail through directed learning outside of the consultation. Furthermore, in an increasingly atomised medical environment, where primary care physicians are working on their own under increasing workload and time pressures, access to a tool like Isabel can act as an additional and independent source of advice.

Thirdly, there is the issue of when such tools should be used in the diagnostic process. Diagnostic errors occur most commonly due to cognitive errors [20] and problems in the diagnostic process such as premature closure (once a plausible diagnosis is identified, alternatives are no longer considered), availability bias (where a more likely [common] or more familiar diagnosis is preferred or comes to mind more easily), anchoring bias (relying on an initial impression and failing to adjust it in light of subsequent information) and the framing effect (where a diagnostic thought pathway is influenced in subtle ways by the way the problem is presented) [21]. To mitigate such sources of error, some recent work suggests that DDX tools should be used early in the diagnostic process, before GPs start gathering information to test hypotheses [22]. For Isabel and other DDX tools to be used in this manner, clinicians need to recognise their requirement for diagnostic support, which our findings and other studies [16] suggest they do not at present. As our study suggests, clinicians currently prefer to rely on their own clinical acumen, occasionally referring to other decision support tools of which they have prior experience and which may only serve in a confirmatory manner. The reliance on one’s own clinical knowledge is also reinforced by the fact that GPs typically operate in an individualistic manner within their own consultation rooms; they receive little diagnostic feedback, and when feedback does occur it is often delayed, and confounding prevents the attribution of outcomes to clinical actions [23]. Until these issues are recognised, and the practical issue of pressurised consultations is addressed, we suggest that DDX tools like Isabel will not be normalised into routine diagnostic work in UK general practice.

Finally, since the study was conducted, and given the increasing pace of technological change and adoption, we comment briefly on this changing context for any future implementation studies of Isabel and DDX tools more widely in routine UK general practice. Firstly, the ‘big data’ revolution has spread across industries, and learning health systems are now discussed as part of national policy and research agendas within health care [24], [25] in the UK and globally. These developments have translated into real-world improvements in prescribing and preventative medical interventions [23], including widespread use of clinical risk prediction tools [25]. Diagnosis, however, remains a key aspect of clinical work within primary care for which the reach of decision support tools currently remains limited [23]. How long this will remain the case is yet to be seen. Improvements in the accuracy and filtering of differential lists to allow a focus on more likely diagnoses (Jason Maude, CEO of Isabel, personal communication) will go some way towards addressing the utility issues highlighted in our findings. The integration of Isabel into EHRs (currently not the case for the specific EHR in our study practice, although feasible in others) would help with workflow. Furthermore, if the system were automated to commence operation at the start of the consultation (upon the clinician accessing the individual patient’s record), as is now indicated [22], [26], in order to align with and enhance the diagnostic cognitive process, these actions could also potentially aid the ‘embedding’ of the technology. Despite these developments, however, other barriers remain, including the perceived implications for the patient-clinician relationship and an increasingly busy and pressurised primary care environment [27].

Strengths and limitations

This study provided an in-depth examination of the implementation of Isabel into routine primary care consultations; it is only the second study of Isabel's usage and potential for implementation within the UK. Like Henderson and Rubin [10], we adopted a theoretically guided approach to our data collection and analysis. However, we adapted our study design to allow a longer evaluation period (twice as long as Henderson and Rubin's), enabling us to assess whether 'steady state' usage was achieved, and we also utilised a practice champion who was senior and well embedded in the practice's organisational management to promote the technology locally. These measures may account for the higher usage seen in our study compared with Henderson and Rubin's. Our single-practice sample, and therefore limited number of participants, may limit the transferability of our findings. However, their validity is supported by the fact that, taken together, the various sources of data (survey, analytics and qualitative data) triangulated to the same conclusion, i.e. poor uptake of the current form of Isabel in UK general practice, a conclusion also supported by the only other evaluation study of Isabel in the UK [10]. We did not undertake EHR integration testing, which would require additional fees and an integration process; this could clearly be done to see whether it improved perceived utility, as highlighted in our discussion, but our results suggest it may have limited impact on uptake, as clinicians held multiple concerns beyond the design features of the product. Finally, previous experimental studies of Isabel have been undertaken in controlled clinical contexts and have used outcome measures such as rates of diagnostic omission errors, unfavourable outcomes and 'correct' diagnoses, with more experienced clinicians as the gold standard reference [10].
It is arguably better, as the NAM report suggests [5], to test the system with its intended users within their working environment, as we have done here, and to assess the tool's potential utility for those users. On this basis, our evaluation study suggests that in its current form, uptake would continue to be limited and the tool would not be normalised within UK primary care.


This small exploratory study suggests that the use of Isabel, a differential diagnostic generator tool, in its current form is unlikely to become normalised within routine primary care consultations in the UK. The reasons for this finding are multiple and complex, involving both endogenous and exogenous factors, some of which may be specific to Isabel, others to DDX tools in general, and others to the diagnostic challenges of working within an increasingly pressurised primary care environment.

Corresponding author: Dr. Sudeh Cheraghi-Sohi, NIHR Greater Manchester Primary Care Patient Safety Translational Research Centre, University of Manchester, Manchester M13 9PL, UK


We would like to thank all the participants in the study who generously gave their time to make this research possible.

  1. Author contributions: SCS, NR and AE designed the study. SCS, RA, NR and AE collected and analysed the data. SCS drafted the manuscript and all authors read, contributed to and approved the final manuscript. All the authors have accepted responsibility for the entire content of this submitted manuscript and approved submission.

  2. Availability of data and material: The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.

  3. Research funding: This research was funded by the National Institute for Health Research (NIHR) Greater Manchester Primary Care Patient Safety Translational Research Centre. The views expressed are those of the author(s) and not necessarily those of the NHS, the NIHR or the Department of Health. Grant numbers: gmpstrc-2012-1; Programme Grants for Applied Research, PSTRC-2016-003.

  4. Employment or leadership: None declared.

  5. Honorarium: None declared.

  6. Competing interests: The funding organization(s) played no role in the study design; in the collection, analysis, and interpretation of data; in the writing of the report; or in the decision to submit the report for publication.


1. Riches N, Panagioti M, Alam R, Cheraghi-Sohi S, Campbell S, Esmail A, et al. The effectiveness of electronic differential diagnoses (DDX) generators: a systematic review and meta-analysis. PLoS One 2016;11:e0148991. doi: 10.1371/journal.pone.0148991.

2. Bond WF, Schwartz LM, Weaver KR, Levick D, Giuliano M, Graber ML. Differential diagnosis generators: an evaluation of currently available computer programs. J Gen Intern Med 2012;27:213–9. doi: 10.1007/s11606-011-1804-8.

3. Garg AX, Adhikari NK, McDonald H, Rosas-Arellano MP, Devereaux PJ, Beyene J, et al. Effects of computerized clinical decision support systems on practitioner performance and patient outcomes: a systematic review. J Am Med Assoc 2005;293:1223–38. doi: 10.1001/jama.293.10.1223.

4. El-Kareh R, Hasan O, Schiff GD. Use of health information technology to reduce diagnostic errors. Br Med J Qual Saf 2013;22(Suppl 2):ii40–ii51. doi: 10.1136/bmjqs-2013-001884.

5. National Academies of Sciences, Engineering, and Medicine. Improving diagnosis in health care. Washington, DC: NASEM; 2015.

6. McWhinney IR, Freeman T. Textbook of Family Medicine. New York, NY: Oxford University Press; 2009. p. 140–92.

7. Davies P. The crowded consultation. Br J Gen Pract 2012;62:648–9. doi: 10.3399/bjgp12X659367.

8. Amy LR, Borowitz SM, Brown PA, Mendelsohn MJ, Lyman JA. Impact of a web-based diagnosis reminder system on errors of diagnosis. AMIA Annu Symp Proc 2006;2006:843.

9. Maffei FA, Nazarian EB, Ramnarayan P, Thomas NJ, Rubenstein JS. Use of a web-based tool to enhance medical student learning in the pediatric intensive care unit and inpatient wards: 27. PedsCCM 2005;6:109. doi: 10.1097/00130478-200501000-00079.

10. Henderson EJ, Rubin GP. The utility of an online diagnostic decision support system (Isabel) in general practice: a process evaluation. JRSM Short Rep 2013;4:31.

11. May C, Finch T. Implementing, embedding, and integrating practices: an outline of normalization process theory. Sociology 2009;43:535–54. doi: 10.1177/0038038509103208.

12. Blakeman T, Protheroe J, Chew-Graham C, Rogers A, Kennedy A. Understanding the management of early-stage chronic kidney disease in primary care: a qualitative study. Br J Gen Pract 2012;62:e233–42. doi: 10.3399/bjgp12X636056.

13. Mair FS, May C, O’Donnell C, Finch T, Sullivan F, Murray E. Factors that promote or inhibit the implementation of e-health systems: an explanatory systematic review. Bull World Health Org 2012;90:357–64. doi: 10.2471/BLT.11.099424.

14. Ziebland S, McPherson A. Making sense of qualitative data analysis: an introduction with illustrations from DIPEx (personal experiences of health and illness). Med Educ 2006;40:405–14. doi: 10.1111/j.1365-2929.2006.02467.x.

15. Porat T, Delaney B, Kostopoulou O. The impact of a diagnostic decision support system on the consultation: perceptions of GPs and patients. BMC Med Inform Decis Mak 2017;17:79. doi: 10.1186/s12911-017-0477-6.

16. Meyer AND, Payne VL, Meeks DW, Rao R, Singh H. Physicians’ diagnostic accuracy, confidence, and resource requests: a vignette study. J Am Med Assoc Intern Med 2013;173:1952–8. doi: 10.1001/jamainternmed.2013.10081.

17. Svenstrup D, Jørgensen HL, Winther O. Rare disease diagnosis: a review of web search, social media and large-scale data-mining approaches. Rare Dis 2015;3:e1083145. doi: 10.1080/21675511.2015.1083145.

18. Alves RPM, Vilaplana J, Teixidó I, Cruz J, Comas J, Vilaprinyo E, et al. Computer-assisted initial diagnosis of rare diseases. PeerJ 2016;4:e2211. doi: 10.7717/peerj.2211.

19. Pearce C, Arnold M, Phillips C, Trumble S, Dwan K. The patient and the computer in the primary care consultation. J Am Med Inform Assoc 2011;18:138–42. doi: 10.1136/jamia.2010.006486.

20. Graber ML, Franklin N, Gordon R. Diagnostic error in internal medicine. Arch Intern Med 2005;165:1493–9. doi: 10.1001/archinte.165.13.1493.

21. Lee CS, Nagy PG, Weaver SJ, Newman-Toker DE. Cognitive and system factors contributing to diagnostic errors in radiology. Am J Roentgenol 2013;201:611–7. doi: 10.2214/AJR.12.10375.

22. Kostopoulou O, Rosen A, Round T, Wright E, Douiri A, Delaney B. Early diagnostic suggestions improve accuracy of GPs: a randomised controlled trial using computer-simulated patients. Br J Gen Pract 2015;65:e49–54. doi: 10.3399/bjgp15X683161.

23. Delaney BC, Kostopoulou O. Decision support for diagnosis should become routine in 21st century primary care. Br J Gen Pract 2017;67:494–5. doi: 10.3399/bjgp17X693185.

24. Foley TJ, Vale L. What role for learning health systems in quality improvement within healthcare providers? Learn Health Syst 2017;1:e10025. doi: 10.1002/lrh2.10025.

25. Bardsley M, Steventon A, Fothergill G. Untapped potential: investing in health and care data analytics. The Health Foundation; 2019. Accessed: 25 Mar 2019.

26. Kostopoulou O, Porat T, Corrigan D, Mahmoud S, Delaney BC. Diagnostic accuracy of GPs when using an early-intervention decision support system: a high-fidelity simulation. Br J Gen Pract 2017;67:e201. doi: 10.3399/bjgp16X688417.

27. Hobbs FDR, Bankhead C, Mukhtar T, Stevens S, Perera-Salazar R, Holt T, et al. Clinical workload in UK primary care: a retrospective analysis of 100 million consultations in England, 2007–14. Lancet 2016;387:2323–30. doi: 10.1016/S0140-6736(16)00620-6.

Received: 2019-04-25
Accepted: 2020-01-06
Published Online: 2020-02-14
Published in Print: 2021-02-23

©2020 Sudeh Cheraghi-Sohi et al., published by De Gruyter, Berlin/Boston

This work is licensed under the Creative Commons Attribution 4.0 International License.
