Diagnosing and treating illness has been going on for many thousands of years, probably well back into the Paleolithic period. One of the functions of shamans was to heal ailments by extracting the disease spirit from the body of the afflicted. Allopathic (orthodox) medicine evolved from those early beginnings, but other groups have also evolved to expand the options of those seeking diagnostic services. Modern-day shamans continue to practice, along with faith healers, naturopaths, homeopaths, and many others. It is estimated that over 38% of Americans used some form of alternative medicine in 2007. Notwithstanding what is available through allopathic medicine and what these alternative approaches offer, the majority of common illnesses and complaints are diagnosed and managed through self-diagnosis, often via the internet, or through what Friedson calls the “lay referral system” of family, friends and acquaintances. Generally speaking, no group, society or culture suffers a shortage of diagnosticians; however, their main instrument of diagnosis, the brain, operates under an inherent restraining characteristic – bias.
At its heart, the diagnostic process involves an appreciation of cause and effect. When signs and symptoms are manifest, the diagnostician makes a connection between them using some frame of understanding that lies along a spectrum from evidence-based to faith-based “knowledge.” Understanding cause and effect is not as easy as it sounds. We are ever vulnerable to mistaking correlation for causation, misinterpreting events simply because of their temporal relationships, being suckered by logical fallacies, seeing meaningful patterns where none exist, and being completely fooled by illusions and magicians. These are failings in the way we process our perceptions, i.e., in the ways our brains work when making decisions. The problem may lie in insufficient knowledge (a declarative shortcoming) or in the process of reasoning and deciding (a procedural shortcoming). Studies suggest that procedural shortcomings, the ways in which we think, are principal factors determining diagnostic calibration [4, 5]; these are generally referred to as cognitive factors. Although a distinction is sometimes made between cognitive and affective factors, affect is involved in nearly all of cognition, so cognitive factors generally include affective factors.
Dual process decision making
It is now widely accepted that there are two modes of decision making. The intuitive mode is fast, autonomous and frugal. Cognitive psychologists refer to it as System 1 or Type 1 processing, and it is where we make most of our decisions in the course of our daily affairs. Kahneman notes that many of our decisions are made through mindless serial association – not “mindless” in a negative, thoughtless sense, but simply the way in which particular patterns elicit particular responses, such that it is possible to get through much of the day simply by moving from one association to another. Performing a well-learned act such as driving a car is a good example. Very little active reasoning is involved in getting the car from A to B. However, if parallel parking is required at B, then reasoning and judgment are essential to maneuver the car successfully into the parking space. Much of serial association is guided by heuristics: maxims, abbreviated ways of thinking, rules of thumb, educated guesses, or often just plain common sense, which reduce the cognitive effort of decision making. They are also effective in situations where information is incomplete or there is uncertainty. They have their origins in the concept of “bounded rationality” first described by Herbert Simon. They are either learned or hard-wired and range from the very simple to the more complex, but all can be executed in a fairly mindless fashion, i.e., without deliberate thinking. All of us, in all walks of life, live by these heuristics, which mostly serve us well. The second mode of decision making involves analytical thinking, referred to as System 2 or Type 2 processing, which involves deliberate reasoning; it is generally slower and more resource intensive.
Given that we spend an estimated 95% of our time in System 1, and given how often we are rewarded for the time spent there, it is not surprising that we have come to rely upon and trust our intuitions. On the face of it, Type 1 decision making is a very good deal – little cognitive effort is required and it is mostly effective. Simply having biases often removes interference and distractions and allows us to cut to the chase with a set response. However, the dark side of heuristics is that biased ways of looking at things will occasionally fail. This is the price we pay for low-cost decision making. Failure can also occur in System 2, usually when analytical reasoning is not properly conducted: through deficiencies in reasoning, the use of incorrect information, or the incorrect application of correct rules. There are occasions, too, when analytical reasoning may not be appropriate, e.g., in an emergency where a very fast response is required, or when too much time is spent on the process, leading to “analysis paralysis.”
Cognitive and affective bias
Lists of cognitive and affective biases are now abundant. Wikipedia currently lists 95 cognitive biases, as well as a variety of social biases and memory errors and biases. Jenicek lists over 100 cognitive biases. Dobelli reviews about 100 in his recent book, along with practical advice on how to deal with them in everyday life. Bias is so widespread that we need to consider it a normal operating characteristic of the brain.
Physicians may once have quietly nursed the hope that they were immune to such defects in decision making. However, cognitive biases, reasoning failures, and logical fallacies are universal and predictable features of human brains in all cultures, and there is no reason to believe physicians as a group are much different from any other. Some will persist with the notion that they are not vulnerable to these phenomena, which itself constitutes the bias blind spot. The difficulty some people have in recognizing their own biases is one of several obstacles to a more widespread awareness of the problem (Table 1). Nevertheless, the impact of cognitive bias on clinical decision making has now been recognized in most disciplines in medicine: Anesthesia, Dermatology, Emergency Medicine, Medicine [17, 18], Neurology, Obstetrics, Ophthalmology, Pathology [22, 23], Pediatrics, Psychiatry, Radiology, and Surgery, as well as in medical education and specialty environments such as the Intensive Care Unit, and Dentistry.
Table 1: Impediments to awareness and understanding of cognitive biases in clinical judgment.
| Impediment | Description |
|---|---|
| Clinical relevance | Medical undergraduates are not explicitly exposed to cognitive training in decision making. Historically, this area has not been seen as relevant to clinical performance and calibration. |
| Lack of awareness | Although the lay press has heavily promoted the impact of cognitive and affective biases on everyday decision making, clinicians are generally unaware of their potential impact on medical decision making. |
| Invulnerability | Even where awareness does exist, physician hubris, bias blind spot, overconfidence and lack of intellectual humility may deter clinicians from accepting that they are just as vulnerable as others to impaired judgment through bias. |
| Myside bias | A tendency to evaluate and favor information that supports one’s preconceptions and beliefs. Also known as “one-sided bias,” it is a form of confirmation bias that appears to gain strength as issues become more polarized. |
| Status quo bias | It is always easier for clinicians to continue to make decisions as they have done in the past. There is a prevailing tendency against learning and executing de-biasing strategies, as this requires considerably more cognitive effort and time. |
| Vivid–pallid dimension | Cognitive and affective processes are mostly invisible and, at present, can only be inferred from outcomes or the clinician’s behavior. Descriptions of them are invariably dry, pallid, abstract and uninteresting. They typically lack the vividness and concrete nature of clinical disease presentations, which are far more meaningful and appealing to the medically trained mind. |
A false dichotomy
In the early 1970s, the publication of work by two psychologists, Tversky and Kahneman [31, 32], heralded the heuristics and biases approach to human judgment and decision making that challenged simplistic rational models. Since then, a voluminous literature has emerged providing widespread support for the existence of many heuristics and biases in human decision making. Generally, but not exclusively, they exert their influence in the intuitive mode (System 1, or through Type 1 processing), and cognitive psychologists have provided numerous experimental demonstrations of the flawed decision making that results from them [33–36]. Over the last couple of decades many books on the subject, often with a view towards correcting the effects of biases, have been published in the lay press. It is something that has captured the public imagination.
Others have argued either directly against the existence of such biases, and/or that intuitive reasoning is the preferred mode of decision making [38–40]. Kahneman and Klein came to be seen as polar opposites on the issue and eventually attempted to resolve it through a joint publication, but both appear to have since reverted to their original positions [7, 42]. Polemics and a degree of rancor have arisen over this issue, much of which in retrospect appears not to have been the best investment of time. Some of the polarization appears to have arisen through misunderstandings of basic psychological theory, and a concomitant myside bias.
A dichotomy certainly exists in the sense that there are two distinct ways in which people arrive at decisions, but it is false to say that human decision making must be one or the other. There are circumstances in which intuitive decision making is entirely appropriate – for the generation of ideas, when split-second decisions are required, or when non-quantifiable decisions need to be made around issues such as esthetics or creative endeavors. Discoveries and new ways of thinking about issues often arise from inspirations that are not derived analytically. In contrast, building a linear particle accelerator or staging a cancer depends upon purposeful, deliberate, scientific reasoning, and nothing else will do. The beauty of the human mind is that it can toggle between the two systems and select whichever mode is most appropriate to the task at hand; it is a dynamic process. In diagnosing illness, much of our initial contact will be with the intuitive mode, allowing various possibilities to come and go. One’s intuitions and biases will vary with experience which, as Smallberg nicely puts it, is “the thumb on the scale” when weighing the options. But the proper selection and verification of the final diagnosis rests with the analytic system, if indeed we follow the prudent clinician and choose to use it. Even with as simple a diagnosis as constipation, several other possibilities need to be offered up to the analytic mind before prescribing a laxative.
A little over a decade ago, talk of cognitive bias was rare despite some notable efforts to draw attention to its role in clinical reasoning. Now, we seem to have arrived at a sufficient level of awareness to examine its impact on the process of making a diagnosis, and we need to consider the next steps.
We need to know more about the psychological processes that underlie human decision making and, in particular, those involved in undoing bias [45–47]. Clinical medicine needs to fully embrace recent gains in cognitive psychology to develop effective debiasing techniques, and to pursue collaborations with psychologists who specialize in decision making. A significant obstacle is that psychologists do not see patients, and clinicians are usually not trained in the research methods of psychology. Importantly, the two will need to work closely with each other to ensure the ecological validity of their research designs, and the applicability of their findings.
In the meantime, we need to stay focused on how we train clinicians in decision making. Historically, medicine has put more emphasis on content knowledge than on how to think. With modern technology there is now an added danger that physicians’ cognitive calibration might be further reduced. However, some recent developments in medical education are encouraging, with an increasing emphasis on critical thinking (CT), and on the influence of bias and mindfulness on clinical reasoning. In education research, by far the most significant gains lie in CT interventions. A meta-analysis showed that such interventions in the age range 5–17 years had an overall positive effect size of 24 percentile points – the equivalent of moving a class from the 50th up to the 74th percentile in terms of their reasoning and problem solving skills. While medical undergraduates and postgraduates are outside this age range, they are still well within the period of rapid development of CT skills [54, 55]. Critical thinking improves generally during undergraduate education, and other studies have shown marked improvements with specific CT interventions in post-secondary learners [57–59].
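To give a rough sense of scale, a percentile-point gain of this kind can be related to a standardized effect size through the normal distribution. This is only an illustrative sketch, not a calculation from the meta-analysis itself: it assumes the effect is reported as a standardized mean difference (Cohen’s d) on an approximately normally distributed outcome.

```python
from statistics import NormalDist

def percentile_gain(d: float) -> float:
    """Percentile points gained by the median student when the class
    mean shifts up by d standard deviations on a normal outcome."""
    return (NormalDist().cdf(d) - 0.5) * 100

# Under these assumptions, a shift of about 0.64 SD corresponds to
# roughly a 24-point gain, i.e., the median student moves from the
# 50th to about the 74th percentile.
print(round(percentile_gain(0.64)))  # ≈ 24
```

The conversion simply evaluates the standard normal CDF at d and measures the distance from the 50th percentile, which is the usual way percentile-point gains are derived from standardized effect sizes.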
We need to accept that cognitive and affective biases are a normal part of brain function. Being able to recognize and deal with bias is the mark of a critical thinker, but there is more to CT than that. It involves a variety of cognitive skills, such as developing accuracy, precision, relevance, depth, breadth and logic in one’s thinking, and an awareness of deception, propaganda and other distorting influences on it. Given that a significant proportion of diagnostic failure may be laid at the door of cognition, any initiative aimed at improving thinking skills in clinicians would appear worthwhile.
Shamanism. Available from: http://en.wikipedia.org/wiki/Shamanism. Accessed 8 September 2013.
Barnes PM, Bloom B, Nahin RL. Complementary and alternative medicine use among adults and children: United States, 2007. National health statistics reports; no 12. Hyattsville, MD: National Center for Health Statistics, 2008.
Friedson E. Client control and medical practice. Am J Sociol 1960;65:374–82.
Graber ML, Franklin N, Gordon R. Diagnostic error in internal medicine. Arch Intern Med 2005;165:1493–9.
Zwaan L, de Bruijne M, Wagner C, Thijs A, Smits M, van der Wal G, Timmermans DR. Patient record review of the incidence, consequences, and causes of diagnostic adverse events. Arch Intern Med 2010;170:1015–21.
Croskerry P, Abbass A, Wu A. Emotional issues in patient safety. J Patient Safety 2010;6:1–7.
Kahneman D. Thinking fast and slow. Toronto, ON: Doubleday, 2011.
Simon H. A behavioral model of rational choice, in models of man, social and rational: mathematical essays on rational human behavior in a social setting. New York: Wiley, 1957.
Lakoff G, Johnson M. Philosophy in the flesh: the embodied mind and its challenge to western thought. New York: Basic Books, 1999.
List of cognitive biases. Available from: http://en.wikipedia.org/wiki/List_of_cognitive_biases. Accessed 7 September 2013.
Jenicek M. Medical error and harm: understanding, prevention and control. New York: Productivity Press, 2011.
Dobelli R. The art of thinking clearly. New York: HarperCollins, 2013.
Pronin E, Lin DY, Ross L. The bias blind spot: perceptions of bias in self versus others. Pers Soc Psychol Bull 2002;28:369–81.
Stiegler MP, Neelankavil JP, Canales C, Dhillon A. Cognitive errors detected in anaesthesiology. Br J Anaesth 2012;108:229–35.
David CV, Chira S, Eells SJ, Ladrigan M, Papier A, Miller LG, Craft N, et al. Diagnostic accuracy in patients admitted to hospital with cellulitis. Dermatology Online J 2011;17:1.
Croskerry P. Achieving quality in clinical decision making: cognitive strategies and detection of bias. Acad Emerg Med 2002;9:1184–204.
Croskerry P. The importance of cognitive errors in diagnosis and strategies to prevent them. Acad Med 2003;78:1–6.
Redelmeier D. The cognitive psychology of missed diagnoses. Ann Intern Med 2005;142:115–20.
Vickrey BG, Samuels MA, Ropper AH. How neurologists think: a cognitive psychology perspective on missed diagnoses. Ann Neurol 2010;67:425–33.
Dunphy BC, Cantwell R, Bourke S, Fleming M, Smith B, Joseph KS, et al. Cognitive elements in clinical decision-making. Toward a cognitive model for medical education and understanding clinical reasoning. Adv Health Sci Educ 2010;15:229–50.
Margo CE. A pilot study in ophthalmology of inter-rater reliability in classifying diagnostic errors: an under investigated area of medical error. Qual Saf Health Care 2003;12:416–20.
Foucar E. Error in anatomic pathology. Am J Clin Pathol 2001;116:S34–46.
Crowley RS, Legowski E, Medvedeva O, Reitmeyer K, Tseytlin E, Castine M, Jukic D, Mello-Thomas C. Automated detection of heuristics and biases among pathologists in a computer-based system. Adv Health Sci Educ 2013;18:343–63.
Singh H, Thomas EJ, Wilson L, Kelly PA, Pietz K, Elkeeb D, Singhal G. Errors of diagnosis in pediatric practice: a multisite survey. Pediatrics 2010;126:70–9.
Crumlish N, Kelly BD. How psychiatrists think. Adv Psychiat Treat 2009;15:72–9.
Sabih D, Sabih A, Sabih Q, Khan AN. Image perception and interpretation of abnormalities; can we believe our eyes? Can we do something about it? Insight Imag 2011;2:47–55.
Shiralkar U. Smart surgeons, sharp decisions: cognitive skills to avoid errors and achieve results. Shropshire, United Kingdom: TFM Publishing, 2010.
Hershberger PJ, Markert RJ, Part HM, Cohen SM, Finger WW. Understanding and addressing cognitive bias in medical education. Adv Health Sci Educ 1997;1:221–6.
Gillon SA, Radford ST. Zebra in the intensive care unit: a metacognitive reflection on misdiagnosis. Crit Care Resuscitation 2012;14:216–21.
Hicks EP, Kluemper GT. Heuristic reasoning and cognitive biases: are they hindrances to judgments and decision making in orthodontics? Am J Orthod Dentofacial Orthop 2011;139:297–304.
Tversky A, Kahneman D. Belief in the law of small numbers. Psychol Bull 1971;76:105–10.
Tversky A, Kahneman D. Judgment under uncertainty: heuristics and biases. Science 1974;185:1124–31.
Kahneman D, Slovic P, Tversky A. Judgment under uncertainty: heuristics and biases. Cambridge, UK: Cambridge University Press, 1982.
Plous S. The psychology of judgment and decision making. New York: McGraw-Hill, 1993.
Baron J. Thinking and deciding, 3rd edn. New York: Cambridge University Press, 2000.
Gilovich T, Griffin D, Kahneman D. Heuristics and biases: the psychology of intuitive judgment. New York: Cambridge University Press, 2002.
Klein G, Orasanu J, Calderwood R, Zsambok CE. Decision making in action: models and methods. Norwood, NJ: Ablex Publishing Co., 1993.
Klein G. The power of intuition. New York: Doubleday, 2004.
Gladwell, M. Blink: the power of thinking without thinking. New York: Little, Brown and Company, 2005.
Gigerenzer G. Gut feelings: the intelligence of the unconscious. New York: Viking Penguin, 2007.
Kahneman D, Klein G. Conditions for intuitive expertise: a failure to disagree. Am Psychol 2009;64:515–26.
Klein G. What physicians can learn from firefighters. Keynote presentation at the Diagnostic Error in Medicine (DEM) Annual Conference, October 23–26, 2011, Chicago, Illinois, US.
Smallberg G. Bias is the nose for the story. In: Brockman J, editor. This will make you smarter. New York: Harper Perennial, 2012:43–5.
Perkins DN. Postprimary education has little impact on informal reasoning. J Educ Psychol 1985;77:562–71.
Graber ML, Kissam S, Payne VL, Meyer AN, Sorensen A, Lenfestey N, et al. Cognitive interventions to reduce diagnostic error: a narrative review. BMJ Quality and Safety 2012;21: 535–57.
Croskerry P, Singhal G, Mamede S. Cognitive debiasing 1: origins of bias and theory of debiasing. BMJ Quality and Safety 2013;22:ii58–64.
Croskerry P, Singhal G, Mamede S. Cognitive debiasing 2: impediments to and strategies for change. BMJ Quality and Safety 2013;22:ii65–72.
Croskerry P, Petrie D, Reilly M, Tait G. Deciding about fast and slow decisions. Acad Med (accepted for publication).
Novella S. Your deceptive mind: a scientific guide to critical thinking skills. Chantilly VA: The Great Courses, 2012.
Epstein R. Mindful practice. J Am Med Assoc 1999;282:833–9.
Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimise them. Acad Med 2003;78:775–80.
Gay S, Bartlett M, McKinley R. Teaching clinical reasoning to medical students. The Clinical Teacher 2013;10:308–12.
Higgins S, Hall E, Baumfield V, Moseley D. A meta-analysis of the impact of the implementation of thinking skills approaches on pupils. In: Research evidence in education library. London: EPPI-Centre, Social Science Research Unit, Institute of Education, University of London, 2005.
Friend CM, Zubek JP. The effects of age on critical thinking ability. J Gerontol 1958;13:407–13.
Denney NW. Critical thinking during the adult years: has the developmental function changed over the last four decades? Exp Aging Res 1995;21:191–207.
Lehman DR, Nisbett RE. A longitudinal study of the effects of undergraduate training on reasoning. Dev Psychol 1990;26:952–60.
Solon T. Generic critical thinking infusion and course content learning in introductory psychology. J Instructional Psychol 2007;34:95–109.
Bensley D, Crowe DS, Bernhardt P, Buckner C, Allman AL. Teaching and assessing critical thinking skills for argument analysis in psychology. Teach Psychol 2010;37:91–96.
Butler HA, Dwyer CP, Hogan MJ, Franco A, Rivas SF, Saiz C, Almeida LS, et al. The Halpern Critical Thinking Assessment and real-world outcomes: cross-national applications. Thinking Skills and Creativity 2012;7:112–21.
West RF, Toplak ME, Stanovich KE. Heuristics and biases as measures of critical thinking: associations with cognitive ability and thinking dispositions. J Educ Psychol 2008;100:930–41.
Elder L, Paul R. Critical thinking development: a stage theory with implications for instruction. Tomales, CA: Foundation for Critical Thinking, 2010.