Cognitive biases

CRITICAL THINKING IN CLINICAL DECISION MAKING

DR. CHEW KENG SHENG

Question #1

•  Jack is looking at Anne, but Anne is looking at George. Jack is married, but George is not. Is a married person looking at an unmarried person?

A.  Yes
B.  No
C.  Cannot be determined

Disjunctive Reasoning

•  Would you have answered differently if the only options were Yes and No?

•  This thought process is called fully disjunctive reasoning – reasoning that considers all the possibilities (see the sketch below)

•  Most people can carry out fully disjunctive reasoning when they are explicitly told that it is necessary, but most do not do so automatically.
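As a minimal sketch, the fully disjunctive reasoning for Question #1 can be written out as an exhaustive enumeration of Anne's possible marital status (the only unknown):

```python
# Fully disjunctive reasoning: enumerate every possibility for the unknown.
# Jack is married, George is not; Anne's status is unknown.
JACK_MARRIED, GEORGE_MARRIED = True, False

for anne_married in (True, False):  # consider BOTH cases for Anne
    # Jack looks at Anne, and Anne looks at George.
    answer = (JACK_MARRIED and not anne_married) or (anne_married and not GEORGE_MARRIED)
    print(f"If Anne is {'married' if anne_married else 'unmarried'}: "
          f"married person looking at unmarried? {answer}")

# Both branches print True, so the correct answer is A (Yes):
# no further information about Anne is needed.
```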

What if this were a clinical case? Does it make a difference to your decision-making process if you have only options A and B, compared with being given option C as well (which is essentially permission, or an excuse, not to make a definite choice on the basis of “inadequate information given”)?

Discuss further

“Humans are cognitive misers because our basic tendency is to default to the processing mechanisms that require less computational effort, even if they are less accurate” – Keith Stanovich, cognitive psychologist

Question #2

•  Suppose you want to buy a book and a pencil. The book and the pencil cost RM1.20 in total. If the book costs RM1.00 more than the pencil, how much does the pencil cost?
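The intuitive System 1 answer is RM0.20, but it does not satisfy the stated conditions; here is a short worked check of the System 2 algebra:

```python
# Let p be the pencil's price. Then the book costs p + 1.00, and:
#   (p + 1.00) + p = 1.20   =>   2p = 0.20   =>   p = 0.10
pencil = (1.20 - 1.00) / 2
book = pencil + 1.00

assert abs((book + pencil) - 1.20) < 1e-9   # total really is RM1.20
assert abs((book - pencil) - 1.00) < 1e-9   # book really costs RM1.00 more
print(f"Pencil: RM{pencil:.2f}, Book: RM{book:.2f}")   # RM0.10 and RM1.10

# The intuitive answer fails: pencil 0.20 + book 1.00 totals RM1.20,
# but the difference is then only RM0.80, not RM1.00.
```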

Discuss further

•  Discuss: intelligence vs. rationality

•  “We often assume intelligence and rationality go together, but we shouldn’t be surprised when smart people do foolish things” – Keith Stanovich

•  Dysrationalia – the inability to think and behave rationally despite having adequate intelligence

What does the middle character look like?

Which line is longer?

How do we make decisions?

•  Decision making is one of the most important things we do; it is the engine that drives our behavior.

•  We make many decisions continuously in the course of our waking hours, and these decisions vary in complexity.

•  Some are relatively simple, automatic and well-rehearsed. Others have consequential implications – like choosing our life partners.

“What we are, and how we live our lives, are largely determined by the decisions we make”

“We first make our choices, then our choices make us”

How do we make decisions?

•  One of the major developments in cognitive psychology over the last 20 years is the dual process theory (DPT) of reasoning.

•  The DPT of reasoning has emerged as the dominant theory of reasoning, particularly through the work of Epstein, Tversky and Kahneman, Stanovich and West, and Evans.

Dual-process thinking

•  According to the DPT of reasoning, there are two modes of decision making, i.e., System 1 and System 2.

•  System 1 is the fast, intuitive, reflexive, automatic and frugal mode of thinking, and it is where we spend most of our time making most of our decisions. Driving a car, for someone who has been driving for a long time, is an example of System 1 thinking.

Dual-process thinking

•  System 2, on the other hand, is a deliberate, analytical, purposeful or effortful form of thinking that is usually slower.

•  Discuss: give further examples of decisions you make in your daily life that are largely based on System 1, and of those based on System 2

Dual-process thinking

System 1 (Intuitive)           System 2 (Analytical)
Experiential-inductive         Hypothetico-deductive
Heuristic                      Systematic
Pattern recognition            Robust decision making
Unconscious thinking theory    Deliberate, purposeful thinking
Fast                           Slow
High capacity                  Limited capacity
High emotional attachment      Low emotional attachment
Low scientific rigor           High scientific rigor

Case illustration #1

This child develops a rash after 5 days of antibiotics for fever and cough. The resident takes a quick glance at the child and diagnoses him with Stevens-Johnson syndrome (SJS). He says that he has seen a similar case before, when he was a house officer, and he remembers that case very well because the child died later on.

Case illustration #2

Was the resident right?

•  The resident employed System 1 thinking
•  Quick, intuitive pattern recognition based on what he has seen before
•  High emotional association – his previous patient died following a ‘similar case’
•  But was he right?
•  SJS often has extensive mucosal involvement; staphylococcal scalded skin syndrome (SSSS) usually does not
•  Nikolsky’s sign is usually present in SSSS

Heuristics

•  Although System 1 is the fast, reflexive thinking mode that we commonly use, inherent to its intuitive nature is that it often relies on heuristics.

•  Heuristics are mental shortcuts – “rules of thumb” or “gut feeling” – used to help us make decisions rapidly, without formal analysis.

Heuristics

•  Two heuristics considered essential for a clinician faced with an emergency situation are the “rule-out-worst-case-scenario” approach and the sick/not-sick dichotomy.

[Diagram: System 1 and System 2 in play – a patient presentation enters a pattern processor. Recognized presentations are handled by System 1 (pattern recognition); unrecognized ones by System 2. Repetition moves System 2 processing toward System 1, an executive override lets System 2 check System 1, a dysrationalia override lets System 1 overrule System 2, and both systems feed through calibration into the diagnosis.]

Cognitive biases

•  While heuristics are helpful cues for System 1, given their intuitive nature they are at times prone to cognitive biases and errors.

•  Cognitive biases, or cognitive dispositions to respond, are our predictable tendencies to respond in a certain way to the contextual clues of the moment.

•  These biases are often committed unconsciously and may result in flawed reasoning.

Availability bias

•  Availability bias – this refers to our tendency to judge things as more likely, or more frequently occurring, if they readily come to mind.

•  Therefore, a recent experience with a particular disease – for example, thoracic aortic dissection – may inflate a clinician’s estimate of its likelihood, tempting him to diagnose it every time he sees a case of chest pain.

Anchoring

•  Anchoring – this refers to our tendency to fixate on salient features in the patient’s initial presentation at an early point in the diagnostic process, so much so that we fail to adjust our initial impression in light of later information.

Confirmation bias

•  Confirmation bias – this refers to our tendency to look for confirming evidence to support the diagnosis we are “anchored” to, while downplaying, ignoring or not actively seeking evidence that may point to the contrary.

Confirmation bias

•  Confirmation bias often goes together with anchoring. For example, if a clinician has anchored on the diagnosis of myocardial infarction, he will tend to look for evidence to support it – say, ST-segment elevation on electrocardiography – even if the elevation is minimal.

Confirmation bias

•  In contrast, if the patient’s chest X-ray demonstrates a widened mediastinum, with unequal pulses on examination and high blood pressure, the clinician may ignore these important cues pointing to the life-threatening condition of thoracic aortic dissection.

Search satisficing

•  This refers to our tendency to stop looking – to call off the search for a second diagnosis – once we have found the first one.

•  This bias can prove detrimental in polytrauma cases.

Search satisficing

•  A classic example of this bias is the physician calling off the search for a second fracture once he is “sufficiently satisfied” with finding the first fracture at the medial malleolus, when in fact the patient may have sustained a Maisonneuve fracture with a second fracture of the proximal fibula.

Case illustration #3

This patient claimed to have twisted his left ankle and complained of severe ankle pain. The medical officer in the A&E ordered an X-ray of that ankle, saw some abnormalities over the medial malleolar region and referred the case to the orthopedic team.

Question: Do you agree with his plan of management? Give your comments.

Normal mortise view

•  The entire mortise joint space should be of uniform width, ≤ 4 mm (light gray).

•  The distal tibiofibular joint (dark gray) should be only slightly wider than the mortise joint space, ≤ 5.5 mm.

•  The tibiofibular overlap should be > 1 mm on the mortise view.

A Maisonneuve fracture should be suspected whenever there is a fracture of the medial aspect of the ankle or widening of the distal tibiofibular joint.
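As an illustrative sketch only – a hypothetical helper, not a validated clinical tool – the radiographic thresholds above can be read as a simple checklist:

```python
def mortise_view_flags(mortise_width_mm: float,
                       distal_tibiofibular_mm: float,
                       tibiofibular_overlap_mm: float,
                       medial_ankle_fracture: bool) -> list[str]:
    """Flag mortise-view findings against the thresholds above (illustrative only)."""
    flags = []
    if mortise_width_mm > 4.0:
        flags.append("mortise joint space > 4 mm")
    if distal_tibiofibular_mm > 5.5:
        flags.append("distal tibiofibular joint > 5.5 mm")
    if tibiofibular_overlap_mm <= 1.0:
        flags.append("tibiofibular overlap not > 1 mm")
    if medial_ankle_fracture or distal_tibiofibular_mm > 5.5:
        flags.append("suspect Maisonneuve fracture -- image the proximal fibula")
    return flags

# Example: medial malleolar fracture with a widened distal tibiofibular joint.
print(mortise_view_flags(4.5, 6.0, 0.5, medial_ankle_fracture=True))
```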

An example of search satisficing

Always remember the adage in X-rays of fractures (#):

“One joint below, and one joint above”

Triage cueing

•  This is basically a form of anchoring: once a triage tag has been attached to a patient, the tendency is to look at the patient only from the perspective of the discipline to which he has been tagged.

Diagnostic momentum

•  Once diagnostic labels are attached to patients, they tend to become stickier and stickier. Through intermediaries (patients, paramedics, nurses, physicians), what might have started as a possibility gathers increasing momentum until it becomes definite and all other possibilities are excluded.

Sunk cost fallacy/bias

•  The more a clinician invests in a particular diagnosis, the less likely he is to release it and consider alternatives. This form of entrapment is common in financial investment. In the clinical setting, the time, the mental energy and, for some, the ego may be too precious an investment to let go. Confirmation bias may be a manifestation of such unwillingness to let go of a failing diagnosis.

Ego bias

•  This refers to our tendency to overestimate the prognosis of our own patients compared with that of a population of similar patients under the care of other physicians.

Blind spot bias

•  This refers to the belief many people hold that they are less susceptible to error than others. It has some similarities with ego bias.

Hindsight bias

•  This bias typically occurs during morbidity and mortality meetings, where the outcome of the case is already known.

•  With hindsight bias, a case with a bad outcome is judged negatively, as if the sequence of decisions leading up to the outcome must have been bad as well.

Hindsight bias

•  However, it is not necessarily true that because the outcome is bad, the decisions were bad too; people generally do not deliberately make bad decisions.

•  The decisions taken at the time must have made sense to them.

Hindsight bias

•  Furthermore, the process of cognitive autopsy during morbidity and mortality meetings is devoid of the ambient context (e.g. a busy emergency department) and the affective dispositions (e.g. a stressed, sleep-deprived or depressed doctor) in which the decision was made at that particular time.

Overconfidence bias

•  It refers to our universal tendency to believe that we know more than we do.

•  Overconfidence reflects a tendency to act on incomplete information, intuitions or hunches.

Gambler’s fallacy

•  The concept of this bias is borrowed from gambling: suppose a coin is tossed ten times and comes up heads on every toss.

•  A person with gambler’s fallacy will say that if the coin is tossed an 11th time, there must be a greater chance of tails.

Gambler’s fallacy

•  However, the coin has no memory: it has a 50-50 chance of showing tails on each toss, independent of the previous outcomes.
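A small worked computation makes the independence concrete: conditioning on ten heads does not change the probability of the eleventh toss.

```python
from fractions import Fraction

p_heads = Fraction(1, 2)

# P(10 heads in a row) -- rare (1/1024), but it happened.
p_ten_heads = p_heads ** 10

# P(11th toss is tails | first ten were heads)
#   = P(10 heads then a tail) / P(10 heads)
#   = (1/2)^10 * (1/2) / (1/2)^10
p_tail_given_ten_heads = (p_ten_heads * (1 - p_heads)) / p_ten_heads
print(p_tail_given_ten_heads)   # 1/2 -- the coin has no memory
```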

Gambler’s fallacy

•  An example of this fallacy can occur when a clinician sees five cases of shortness of breath in the course of a working shift, and each patient turns out to have pneumonia.

Gambler’s fallacy

•  When the 6th patient with shortness of breath arrives in the emergency department, a clinician with this fallacy will probably think that this time, the patient must have a condition other than pneumonia, such as an asthmatic attack.

Posterior probability error

•  This is the opposite of gambler’s fallacy. In this bias, if a clinician sees five patients with shortness of breath in the course of a working shift, all of whom turn out to have pneumonia, then when the 6th patient with shortness of breath arrives in the emergency department, the tendency is to believe that this patient must have pneumonia as well.
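Both fallacies answer the same question wrongly. Under the simplifying (and here entirely assumed) premise that cases arrive independently with a fixed prevalence, a sketch:

```python
# Assumed, for illustration only: 30% of shortness-of-breath presentations
# are pneumonia, and patients arrive independently of one another.
P_PNEUMONIA = 0.30
last_five = ["pneumonia"] * 5          # the streak seen during the shift

# Gambler's fallacy:           "the 6th must be something OTHER than pneumonia"
# Posterior probability error: "the 6th must be pneumonia AS WELL"
# If cases are independent, the streak changes nothing:
p_sixth_is_pneumonia = P_PNEUMONIA     # still 0.30, whatever last_five holds
print(p_sixth_is_pneumonia)
```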

Summary of common cognitive biases (1)

Availability bias – “I remember seeing a similar patient with diagnosis X. Therefore this patient must have diagnosis X”

Anchoring bias – “From the very outset, it seems that this patient has diagnosis X, so he must have diagnosis X”

Confirmation bias – “Since this patient has diagnosis X, I must look for evidence to support that this patient has diagnosis X”

Search satisficing – “I have found diagnosis X in this patient and I am happy with it!”

Summary of common cognitive biases (2)

Triage cueing – “The triage officer found that the patient has diagnosis X. Let’s treat the patient as having diagnosis X”

Diagnostic momentum – “The HO says the patient has diagnosis X. The MO says the patient has diagnosis X. The specialist says the patient has diagnosis X. And nobody is challenging it”

Sunk cost fallacy – “I have invested so much of my time and energy in managing this patient as having diagnosis X. What else could it be?”

Summary of common cognitive biases (3)

Gambler’s fallacy – “I have seen the last 5 patients with diagnosis Y. This time, this patient must have diagnosis X”

Posterior probability error – “I have seen the last 5 patients with diagnosis Y. This time, this patient must have diagnosis Y as well”

Ego bias – “Statistically speaking, my patients often do better than patients from the other team!”

Blind spot bias – “This kind of mistake often happens to Dr. X’s patients. I wouldn’t have made such a mistake”

Cognitive biases categories

•  Biases due to over-attachment to a particular diagnosis – anchoring, confirmation bias

•  Biases due to failure to consider other diagnoses – search satisficing

•  Biases due to inaccurate estimation of prevalence – availability bias, gambler’s fallacy, posterior probability error

Cognitive biases categories

•  Biases due to the way the patient is presented – Triage cueing

•  Biases due to inheriting someone else’s thinking – Diagnostic momentum

•  Biases due to the physician’s personality, affect and decision style – ego bias, blind spot bias

Critical Thinking (1)

1.  Knowing and understanding System 1 and System 2 thinking

2.  Recognizing the distracting stimuli, biases and irrelevance affecting our decisions

3.  Identifying, analyzing and challenging assumptions in arguments

4.  Being aware of cognitive fallacies and poor reasoning

Critical Thinking (2)

5.  Recognizing deceptions – deliberate or otherwise

6.  Having the capacity to assess the credibility of information

7.  Understanding the need to monitor and control our own thinking processes

8.  Being aware of the critical impact of fatigue and sleep deprivation on decision making

Critical Thinking (3)

9.  Understanding the importance of monitoring and controlling our own affective states, which influence the quality of our decisions

10. Understanding the context in which decisions are made

11. Having the capacity to anticipate the consequences of our decisions

Pre-dispositional factors

•  Further compounding the difficulty of clinical decision making is the undeniable fact that the quality of our clinical decisions is also influenced by the ambient or environmental conditions under which they are made.

•  For example, when faced with a potential clinical emergency situation, physicians are often expected to make diagnostic decisions within a limited time frame.

Affective state of the decision maker

•  Other factors, such as the affective state of the clinician, general fatigue, interruptions, distractions and sleep deprivation, can influence the quality of our decisions too. For example, sleep deprivation (in the course of a long working shift) can have a negative impact not only on the quality of decision making but also on the general health of the clinician.

Sleep deprivation

•  Sleep deprivation and circadian dyssynchrony can impair performance across many aspects of human capability, including reduced attention and vigilance, impaired memory, impaired decision making, lagged reaction time, impaired hand-eye coordination and disrupted communication.

Sleep deprivation

•  For example, it has been shown that after 17 hours of continuous wakefulness, performance on hand-eye coordination tasks declines to a level equivalent to a blood alcohol concentration of 0.05%; at 24 hours of sustained wakefulness, the impairment in psychomotor function is equivalent to a blood alcohol concentration of 0.1%.

Sleep deprivation

•  Furthermore, a fatigued worker will also tend to slow down his work processes in order to maintain accuracy (known as the “speed-accuracy trade-off”).

De-biasing strategies

•  One of the tremendous challenges with cognitive biases is finding ways to de-bias them. A commonly used de-biasing approach is cognitive forcing strategies: deliberate, systematic, self-regulatory cognitive mechanisms that provide a check and balance to minimize biases.

Metacognition

•  An example of a cognitive forcing strategy is metacognition. Metacognition is an individual’s ability to stand apart from his own thinking in order to be aware of his own preferred learning approaches and, ultimately, to manipulate his own cognitive processes to his advantage.

Metacognition

•  In short, metacognition is “thinking about thinking.” It allows one to ask questions like: “How well did I do?” and “What could I have done differently if given another chance?”

De-biasing strategies

•  Suppose one has the necessary mindware; the next question, Stanovich argues, is whether one actually perceives a need to de-bias. And even if the person perceives the need, the further question is whether the required de-biasing effort can be sustained.

De-biasing strategies

•  If sustained effort is required but the person lacks the capacity for it, the natural tendency is still to fall back on System 1 reasoning. This is because, when it comes to choosing a cognitive strategy for solving a problem, we generally choose the fast, computationally inexpensive one (System 1).

Cognitive forcing strategies (1)

•  One way to minimize the risk of committing cognitive biases is to force ourselves to ask these few questions whenever we have made a clinical decision (especially if our decision is to discharge the patient):

1  What is/are the possible life/limb threats in this patient? Why did the patient come?

Cognitive forcing strategies (2)

2  What if I am wrong? What else could it be?

3  Do I have evidence for/against this decision/diagnosis that I've made?

4  What are the ambient/affective factors that are influencing my decisions?

Cognitive forcing strategies (3)

5  In the unfortunate event that this case lands as a medico-legal case 10 years down the road, is what I've documented defensible? (In other words, have I documented what needs to be documented? Is my writing legible enough? Are the date and time written?)

•  Download a free article on ‘Making decision better’ here:

•  http://tinyurl.com/cbjvjof

Authority gradient

•  Another issue that may hamper the learning and practice of critical thinking is authority gradient.

•  Authority gradient is defined as the gradient that may exist between two individuals’ professional status, experience, or expertise that contributes to difficulty exchanging information or communicating concern.

Authority gradient

•  Authority gradient is especially prevalent in our Asian culture, which may be heavily influenced by Asian philosophies of respect for seniors.

•  Such a noble value is of course vitally important in maintaining societal harmony, but it can be dangerous if taken to the extreme, with the junior doctor adopting an unhealthy, pessimistic attitude.