
Diagnosis

Official Journal of the Society to Improve Diagnosis in Medicine (SIDM)


Bias: a normal operating characteristic of the diagnosing brain

Pat Croskerry
  • Corresponding author
  • Dalhousie University – Critical Thinking Program, DME, 5849 University Avenue, PO Box 15000, Halifax, Nova Scotia, Canada
Published Online: 2014-01-08 | DOI: https://doi.org/10.1515/dx-2013-0028

Abstract

People diagnose themselves or receive advice about their illnesses from a variety of sources, ranging from family and friends to alternative medicine to conventional medicine. In all cases, the diagnosing mechanism is the human brain, which normally operates under the influence of a variety of biases. Most, but not all, biases reside in intuitive decision making, and no individual or group is immune from them. Two biases in particular, bias blind spot and myside bias, have presented obstacles to accepting the impact of bias on medical decision making. Nevertheless, there is now a widespread appreciation of the important role of bias in the majority of medical disciplines. The dual process model of decision making now seems well accepted, although a polarization of opinions has arisen, with some arguing the merits of intuitive approaches over analytical ones and vice versa. We should instead accept that it is not one mode or the other that enables well-calibrated thinking but the discriminating use of both. A pivotal role for analytical thinking lies in its ability to allow decision makers the means to detach from the intuitive mode to mitigate bias; it is the gatekeeper for the final diagnostic decision. Exploring and cultivating such debiasing initiatives should be seen as the next major research area in clinical decision making. Awareness of bias and strategies for debiasing are important aspects of the critical thinker’s armamentarium. Promoting critical thinking in undergraduate, postgraduate and continuing medical education will lead to better calibrated diagnosticians.

Keywords: clinical decision making; cognitive bias; critical thinking; diagnostic error; patient safety; undergraduate medical education

Introduction

Diagnosing and treating illness has been going on for many thousands of years, probably well back into the Paleolithic period. One of the functions of shamans was to heal ailments by extracting the disease spirit from the body of the afflicted [1]. Allopathic (orthodox) medicine evolved from those early beginnings, but other groups have also evolved to expand the options of those seeking diagnostic services. Modern-day shamans continue to practice, along with faith healers, naturopaths, homeopaths, and many others. It is estimated that over 38% of Americans used some form of alternative medicine in 2007 [2]. Notwithstanding what is available through allopathic medicine, and what these alternative approaches offer, the majority of common illnesses and complaints are diagnosed and managed through self-diagnosis, often through the internet, or through diagnosis by what Freidson calls the “lay referral system” of family, friends and acquaintances [3]. Generally speaking, no group, society or culture suffers a shortage of diagnosticians; however, their main instrument of diagnosis, the brain, operates under an inherent restraining characteristic – bias.

At its heart, the diagnostic process involves an appreciation of cause and effect. When signs and symptoms are manifest, the diagnostician makes a connection between them using some frame of understanding that is distributed along a spectrum from evidence-based to faith-based “knowledge.” Understanding cause and effect is not as easy as it sounds. We are ever vulnerable to seeing correlation as causation, to misinterpreting events simply because of their temporal relationships, to being taken in by logical fallacies, to seeing meaningful patterns where none exist, and to being completely fooled by illusions and magicians. These are failings in the way we process our perceptions, i.e., in the ways that our brains work when making decisions. The problem may lie in insufficient knowledge (a declarative shortcoming) or in the process of reasoning and deciding (a procedural shortcoming). Studies suggest that procedural shortcomings – the ways in which we think – are principal factors determining diagnostic calibration [4, 5]; these are generally referred to as cognitive factors. Although a distinction is sometimes made between cognitive and affective factors [6], affect is involved in nearly all of cognition, so cognitive factors are generally taken to include affective factors.

Dual process decision making

It is now widely accepted that there are two modes of decision making. The intuitive mode is fast, autonomous and frugal. Cognitive psychologists refer to it as System 1 or Type 1 processing, and it is where we make most of our decisions in the course of our daily affairs. Kahneman notes that many of our decisions are made through mindless serial association [7] – not “mindless” in a negative, thoughtless sense, but simply the way in which particular patterns elicit particular responses, such that it is possible to get through much of the day simply by moving from one association to another. Performing a well-learned act such as driving a car is a good example. Very little active reasoning is involved in getting the car from A to B. However, if parallel parking is required at B, then reasoning and judgment are essential to maneuver the car successfully into the parking space. Much of serial association is guided by heuristics – maxims, abbreviated ways of thinking, rules of thumb, educated guesses, or often just plain common sense – which reduce the cognitive effort of decision making. They are also effective in situations where information is incomplete or there is uncertainty. They have their origins in the concept of “bounded rationality” first described by Herbert Simon [8]. They are either learned or hard-wired, and range from the very simple to the more complex, but all can be executed in a fairly mindless fashion, i.e., without deliberate thinking. All of us, in all walks of life, live by these heuristics, which mostly serve us well. The second mode of decision making involves analytical thinking, referred to as System 2 or Type 2 processing, which involves deliberate reasoning; it is generally slower and more resource intensive.

Given that we spend an estimated 95% of our time in System 1 [9], and given how often we are rewarded for the time spent there, it is not surprising that we have come to rely upon and trust our intuitions. On the face of it, Type 1 decision making is a very good deal – little cognitive effort is required and it is mostly effective. Simply having biases often removes interference and distractions and allows us to cut to the chase with a set response. However, the dark side of heuristics is that biased ways of looking at things will occasionally fail. This is the price we pay for low-resource decision making. Failure can also occur in System 2, usually when analytical reasoning is not properly conducted – through deficiencies in reasoning, reliance on incorrect information, or failure to apply the correct rules. There are occasions, too, when analytical reasoning may not be appropriate, e.g., in an emergency where a very fast response is required, or when too much time is spent on the process, leading to “analysis paralysis.”

Cognitive and affective bias

Lists of cognitive and affective biases are now abundant. Wikipedia currently lists 95 cognitive biases, as well as a variety of social biases and memory errors and biases [10]. Jenicek lists over 100 cognitive biases [11]. Dobelli reviews about 100 in his recent book [12], along with practical advice on how to deal with them in everyday life. Bias is so widespread that we need to consider it a normal operating characteristic of the brain.

Initially, perhaps, physicians may have quietly nursed the hope that they are immune to such defects in decision making. However, cognitive biases, reasoning failures, and logical fallacies are universal and predictable features of human brains in all cultures, and there is no reason to believe that physicians as a group are much different from any other. Some will persist with the notion that they are not vulnerable to these phenomena, which itself constitutes the bias blind spot [13]. The difficulty some people have in recognizing their own biases is one of several obstacles to a more widespread awareness of the problem (Table 1). Nevertheless, the impact of cognitive bias on clinical decision making has now been recognized in most disciplines in medicine – Anesthesia [14], Dermatology [15], Emergency Medicine [16], Medicine [17, 18], Neurology [19], Obstetrics [20], Ophthalmology [21], Pathology [22, 23], Pediatrics [24], Psychiatry [25], Radiology [26], Surgery [27] – as well as in medical education [28] and in specialty environments such as the Intensive Care Unit [29] and Dentistry [30].

Table 1

Impediments to awareness and understanding of cognitive biases in clinical judgment.

A false dichotomy

In the early 1970s, the publication of work by two psychologists, Tversky and Kahneman [31, 32], heralded the heuristics and biases approach to human judgment and decision making, which challenged simplistic rational models. Since then, a voluminous literature has emerged providing widespread support for the existence of many heuristics and biases in human decision making. Generally, but not exclusively, they exert their influence in the intuitive mode (System 1, or Type 1 processing), and cognitive psychologists have provided numerous experimental demonstrations of the flawed decision making that results from them [33–36]. Over the last couple of decades many books on the subject, often with a view towards correcting the effects of biases, have been published in the lay press. It is a subject that has captured the public imagination.

Others have argued directly against the existence of such biases [37], or that intuitive reasoning is the preferred mode of decision making [38–40]. Kahneman and Klein came to be seen as polar opposites on the issue and eventually attempted to resolve it through a joint publication [41], but both appear to have since reverted to their original positions [7, 42]. Polemics and a degree of rancor have arisen over this issue, much of which in retrospect appears not to have been the best investment of time. Some of the polarization appears to have arisen through misunderstandings of basic psychological theory, and a concomitant myside bias [41].

A dichotomy certainly exists in the sense that there are two distinct ways in which people arrive at decisions, but it is false to say that human decision making has to be one or the other. There are circumstances in which intuitive decision making is entirely appropriate – for the generation of ideas, when split-second decisions are required, or when non-quantifiable decisions need to be made around such issues as esthetics or creative endeavors. Discoveries and new ways of thinking about issues often arise from inspirations that are not derived analytically. In contrast, building a linear particle accelerator or staging a cancer depends upon purposeful, deliberate, scientific reasoning, and nothing else will do. The beauty of the human mind is that it can toggle between the two systems and select whichever mode is most appropriate to the task at hand; it is a dynamic process. In diagnosing illness, much of our initial contact will be with the intuitive mode, allowing various possibilities to come and go. One’s intuitions and biases will vary with experience, which, as Smallberg nicely puts it, is “the thumb on the scale” when weighing the options [43]. But the proper selection and verification of the final diagnosis rests with the analytic system, if indeed we follow the prudent clinician and choose to use it. Even with as simple a diagnosis as constipation, several other possibilities need to be offered up to the analytic mind before prescribing a laxative.
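To make the gatekeeper role of the analytic system concrete, the following is a minimal, purely illustrative sketch in Python; it is not part of the original article, and the patterns, red-flag rules and findings it uses are hypothetical placeholders rather than clinical guidance. A fast pattern match (Type 1) proposes a diagnosis, and a deliberate check (Type 2) must approve it before it is accepted.

from typing import Optional

# Hypothetical Type 1 knowledge: the first stored pattern that fits yields an intuitive diagnosis.
INTUITIVE_PATTERNS = {
    frozenset({"infrequent stools", "straining"}): "constipation",
}

# Hypothetical Type 2 knowledge: red flags that should veto the intuitive answer.
RED_FLAG_CHECKS = {
    "constipation": ["weight loss", "rectal bleeding", "new onset after age 50"],
}

def intuitive_guess(findings: set[str]) -> Optional[str]:
    """Type 1: return the first stored pattern wholly contained in the findings."""
    for pattern, diagnosis in INTUITIVE_PATTERNS.items():
        if pattern <= findings:
            return diagnosis
    return None

def analytic_gatekeeper(diagnosis: str, findings: set[str]) -> bool:
    """Type 2: deliberately test the intuition; veto it if any red flag is present."""
    return not any(flag in findings for flag in RED_FLAG_CHECKS.get(diagnosis, []))

def diagnose(findings: set[str]) -> str:
    candidate = intuitive_guess(findings)          # fast, frugal first pass
    if candidate is None:
        return "no intuitive match - work the problem up analytically"
    if analytic_gatekeeper(candidate, findings):   # analytic verification before acceptance
        return candidate
    return f"'{candidate}' vetoed by analytic check - broaden the differential"

if __name__ == "__main__":
    print(diagnose({"infrequent stools", "straining"}))                 # accepted: constipation
    print(diagnose({"infrequent stools", "straining", "weight loss"}))  # vetoed by the gatekeeper

The point of the sketch is only the structure: the intuitive step is cheap and usually right, but the final decision passes through the analytic check, mirroring the toggle described above.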

Going forward

A little over a decade ago, talking about cognitive bias was rare despite some notable efforts to draw attention to its role in clinical reasoning [44]. Now, we seem to have arrived at a sufficient level of awareness to examine its impact on the process of making a diagnosis, and need to consider the next steps.

We need to know more about the psychological processes that underlie human decision making and, in particular, those involved in undoing bias [45–47]. Clinical medicine needs to fully embrace recent gains in cognitive psychology in order to develop effective techniques for debiasing, and to pursue collaborations with psychologists who specialize in decision making. A significant obstacle is that psychologists do not see patients, and clinicians are usually not trained in the research methods of psychology. Importantly, the two will need to work closely with each other to ensure the ecological validity of their research designs and the applicability of their findings [48].

In the meantime, we need to stay focused on how we train clinicians in decision making. Historically, medicine has put more emphasis on content knowledge than on how to think. There is now an added danger that modern technology might further reduce physicians’ cognitive calibration. However, some recent developments in medical education are encouraging, with an increasing emphasis on critical thinking (CT) [49], and on the influence of bias [50] and mindfulness [51] on clinical reasoning [52]. In education research, by far the most significant gains lie in CT interventions. A meta-analysis showed that such interventions in the age range 5–17 years had an overall positive effect size of 24 percentile points – the equivalent of moving a class from the 50th up to the 74th percentile in terms of their reasoning and problem-solving skills [53]. While medical undergraduates and postgraduates are outside this age range, they are still well within the period of rapid development of CT skills [54, 55]. Critical thinking improves generally during undergraduate education [56], and other studies have shown marked improvements with specific CT interventions in post-secondary learners [57–59].

We need to accept that cognitive and affective biases are a normal part of brain function. Being able to recognize and deal with bias is the mark of a critical thinker [60], but there is more to CT than that. It involves a variety of cognitive skills, such as developing accuracy, precision, relevance, depth, breadth and logic in one’s thinking, together with an awareness of deception, propaganda and other distorting influences on one’s thinking [61]. Given that a significant proportion of diagnostic failure may be laid at the door of cognition, any initiative aimed at improving the thinking skills of clinicians would appear worthwhile.

References

1. Available from: http://en.wikipedia.org/wiki/Shamanism. Accessed on 8 September 2013.
2. Barnes PM, Bloom B, Nahin RL. Complementary and alternative medicine use among adults and children: United States, 2007. National health statistics reports; no 12. Hyattsville, MD: National Center for Health Statistics, 2008.
3. Freidson E. Client control and medical practice. Am J Sociol 1960;65:374–82.
4. Graber ML, Franklin N, Gordon R. Diagnostic error in internal medicine. Arch Intern Med 2005;165:1493–9.
5. Zwaan L, de Bruijne M, Wagner C, Thijs A, Smits M, van der Wal G, Timmermans DR. Patient record review of the incidence, consequences, and causes of diagnostic adverse events. Arch Intern Med 2010;170:1015–21.
6. Croskerry P, Abbass A, Wu A. Emotional issues in patient safety. J Patient Safety 2010;6:1–7.
7. Kahneman D. Thinking, fast and slow. Toronto, ON: Doubleday, 2011.
8. Simon H. A behavioral model of rational choice. In: Models of man, social and rational: mathematical essays on rational human behavior in a social setting. New York: Wiley, 1957.
9. Lakoff G, Johnson M. Philosophy in the flesh: the embodied mind and its challenge to western thought. New York: Basic Books, 1999.
10. Available from: http://en.wikipedia.org/wiki/List_of_cognitive_biases. Accessed on 7 September 2013.
11. Jenicek M. Medical error and harm: understanding, prevention and control. New York: Productivity Press, 2011.
12. Dobelli R. The art of thinking clearly. New York: HarperCollins, 2013.
13. Pronin E, Lin DY, Ross L. The bias blind spot: perceptions of bias in self versus others. Pers Soc Psychol Bull 2002;28:369–81.
14. Stiegler MP, Neelankavil JP, Canales C, Dhillon A. Cognitive errors detected in anaesthesiology. Br J Anaesth 2012;108:229–35.
15. David CV, Chira S, Eells SJ, Ladrigan M, Papier A, Miller LG, Craft N, et al. Diagnostic accuracy in patients admitted to hospital with cellulitis. Dermatology Online J 2011;17:1.
16. Croskerry P. Achieving quality in clinical decision making: cognitive strategies and detection of bias. Acad Emerg Med 2002;9:1184–204.
17. Croskerry P. The importance of cognitive errors in diagnosis and strategies to prevent them. Acad Med 2003;78:1–6.
18. Redelmeier D. The cognitive psychology of missed diagnoses. Ann Intern Med 2005;142:115–20.
19. Vickrey BG, Samuels MA, Ropper AH. How neurologists think: a cognitive psychology perspective on missed diagnoses. Ann Neurol 2010;67:425–33.
20. Dunphy BC, Cantwell R, Bourke S, Fleming M, Smith B, Joseph KS, et al. Cognitive elements in clinical decision-making: toward a cognitive model for medical education and understanding clinical reasoning. Adv Health Sci Educ 2010;15:229–50.
21. Margo CE. A pilot study in ophthalmology of inter-rater reliability in classifying diagnostic errors: an under investigated area of medical error. Qual Saf Health Care 2003;12:416–20.
22. Foucar E. Error in anatomic pathology. Am J Clin Pathol 2001;116:S34–46.
23. Crowley RS, Legowski E, Medvedeva O, Reitmeyer K, Tseytlin E, Castine M, Jukic D, Mello-Thomas C. Automated detection of heuristics and biases among pathologists in a computer-based system. Adv Health Sci Educ 2013;18:343–63.
24. Singh H, Thomas EJ, Wilson L, Kelly PA, Pietz K, Elkeeb D, Singhal G. Errors of diagnosis in pediatric practice: a multisite survey. Pediatrics 2010;126:70–9.
25. Crumlish N, Kelly BD. How psychiatrists think. Adv Psychiat Treat 2009;15:72–9.
26. Sabih D, Sabih A, Sabih Q, Khan AN. Image perception and interpretation of abnormalities; can we believe our eyes? Can we do something about it? Insights Imaging 2011;2:47–55.
27. Shiralkar U. Smart surgeons, sharp decisions: cognitive skills to avoid errors and achieve results. Shropshire, United Kingdom: TFM Publishing, 2010.
28. Hershberger PJ, Markert RJ, Part HM, Cohen SM, Finger WW. Understanding and addressing cognitive bias in medical education. Adv Health Sci Educ 1997;1:221–6.
29. Gillon SA, Radford ST. Zebra in the intensive care unit: a metacognitive reflection on misdiagnosis. Crit Care Resusc 2012;14:216–21.
30. Hicks EP, Kluemper GT. Heuristic reasoning and cognitive biases: are they hindrances to judgments and decision making in orthodontics? Am J Orthod Dentofacial Orthop 2011;139:297–304.
31. Tversky A, Kahneman D. Belief in the law of small numbers. Psychol Bull 1971;76:105–10.
32. Tversky A, Kahneman D. Judgment under uncertainty: heuristics and biases. Science 1974;185:1124–31.
33. Kahneman D, Slovic P, Tversky A. Judgment under uncertainty: heuristics and biases. Cambridge, UK: Cambridge University Press, 1982.
34. Plous S. The psychology of judgment and decision making. New York: McGraw-Hill, 1993.
35. Baron J. Thinking and deciding, 3rd ed. New York: Cambridge University Press, 2000.
36. Gilovich T, Griffin D, Kahneman D. Heuristics and biases: the psychology of intuitive judgment. New York: Cambridge University Press, 2002.
37. Klein G, Orasanu J, Calderwood R, Zsambok CE. Decision making in action: models and methods. Norwood, NJ: Ablex Publishing Co., 1993.
38. Klein G. The power of intuition. New York: Doubleday, 2004.
39. Gladwell M. Blink: the power of thinking without thinking. New York: Little, Brown and Company, 2005.
40. Gigerenzer G. Gut feelings: the intelligence of the unconscious. New York: Viking Penguin, 2007.
41. Kahneman D, Klein G. Conditions for intuitive expertise: a failure to disagree. Am Psychol 2009;64:515–26.
42. Klein G. What physicians can learn from firefighters. Keynote presentation at the Diagnostic Error in Medicine (DEM) Annual Conference, October 23–26, 2011, Chicago, IL, USA.
43. Smallberg G. Bias is the nose for the story. In: Brockman J, editor. This will make you smarter. New York: Harper Perennial, 2012:43–5.
44. Perkins DN. Postprimary education has little impact on informal reasoning. J Educ Psychol 1985;77:562–71.
45. Graber ML, Kissam S, Payne VL, Meyer AN, Sorensen A, Lenfestey N, et al. Cognitive interventions to reduce diagnostic error: a narrative review. BMJ Qual Saf 2012;21:535–57.
46. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 1: origins of bias and theory of debiasing. BMJ Qual Saf 2013;22:ii58–64.
47. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 2: impediments to and strategies for change. BMJ Qual Saf 2013;22:ii65–72.
48. Croskerry P, Petrie D, Reilly M, Tait G. Deciding about fast and slow decisions. Acad Med (accepted for publication).
49. Novella S. Your deceptive mind: a scientific guide to critical thinking skills. Chantilly, VA: The Great Courses, 2012.
50. Epstein R. Mindful practice. J Am Med Assoc 1999;282:833–9.
51. Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimise them. Acad Med 2003;78:775–80.
52. Gay S, Bartlett M, McKinley R. Teaching clinical reasoning to medical students. The Clinical Teacher 2013;10:308–12.
53. Higgins S, Hall E, Baumfield V, Moseley D. A meta-analysis of the impact of the implementation of thinking skills approaches on pupils. In: Research evidence in education library. London: EPPI-Centre, Social Science Research Unit, Institute of Education, University of London, 2005.
54. Friend CM, Zubek JP. The effects of age on critical thinking ability. J Gerontol 1958;13:407–13.
55. Denney NW. Critical thinking during the adult years: has the developmental function changed over the last four decades? Exp Aging Res 1995;21:191–207.
56. Lehman DR, Nisbett RE. A longitudinal study of the effects of undergraduate training on reasoning. Dev Psychol 1990;26:952–60.
57. Solon T. Generic critical thinking infusion and course content learning in introductory psychology. J Instructional Psychol 2007;34:95–109.
58. Bensley D, Crowe DS, Bernhardt P, Buckner C, Allman AL. Teaching and assessing critical thinking skills for argument analysis in psychology. Teach Psychol 2010;37:91–6.
59. Butler HA, Dwyer CP, Hogan MJ, Franco A, Rivas SF, Saiz C, Almeida LS, et al. The Halpern Critical Thinking Assessment and real-world outcomes: cross-national applications. Thinking Skills and Creativity 2012;7:112–21.
60. West RF, Toplak ME, Stanovich KE. Heuristics and biases as measures of critical thinking: associations with cognitive ability and thinking dispositions. J Educ Psychol 2008;100:930–41.
61. Elder L, Paul R. Critical thinking development: a stage theory with implications for instruction. Tomales, CA: Foundation for Critical Thinking, 2010.

About the article

Corresponding author: Pat Croskerry, Dalhousie University – Critical Thinking Program, DME, 5849 University Avenue, PO Box 15000, Halifax, Nova Scotia, Canada, Phone: +1-902-494-4147, Fax: +1-902-494-2278, E-mail:


Received: 2013-09-20

Accepted: 2013-10-16

Published Online: 2014-01-08

Published in Print: 2014-01-01


Conflict of interest statement: The author declares no conflict of interest.


Citation Information: Diagnosis, ISSN (Online) 2194-802X, ISSN (Print) 2194-8011, DOI: https://doi.org/10.1515/dx-2013-0028.


©2014 by Walter de Gruyter Berlin/Boston. This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 3.0 License (CC BY-NC-ND 3.0).

