
Normalized Deviance

Paul Barach, MD, MPH

Normalized Deviance

• By a deviant organizational behavior, we refer to “an event, activity or circumstance, occurring in and/or produced by a formal organization, that deviates from both formal design goals and normative standards or expectations, either in the fact of its occurrence or in its consequences.”

• Once a community normalizes a deviant organizational practice, it is no longer viewed as an aberrant act that elicits an exceptional response; instead, it becomes a routine activity that is commonly anticipated and frequently used.

– Diane Vaughan, 1999: 273.

Normalized Deviance Spread

• Research into organizational misconduct has demonstrated that deviant behavior
– may not only grow within an organization but also spread between organizations that work closely with each other (Vaughan, 1996; Zey, 1993, 1998)
– may spread between organizations that operate in the same industry (Geis, 1977; Baucus and Near, 1991; Simpson, 1986).

• Based on a similar logic, we hypothesize that, all else being equal, managers operating in communities with a higher prevalence of deviance will be more likely to engage in deviant behavior than will managers operating in communities with a lower prevalence.

Variation/Normalized Deviance in Healthcare

• US Institute of Medicine: To Err Is Human and Crossing the Quality Chasm
• Bristol Royal Infirmary
• Mid Staffordshire
• Dutch Radboud Hospital investigation
• National Health and Hospital Reform Commission Report
• NSW Garling Report

The Normalization of Deviance: Do We (Un)Knowingly Accept Doing the Wrong Thing?

• Failure to wash hands before and after patient contact
• Failure to follow recognized isolation procedures and protocols
• Disconnecting patients from monitors on transfer
• Incomplete or incorrect documentation
• Handoffs of care at vital times (emergence, induction, separation from cardiopulmonary bypass, etc.)
• Wearing hospital scrubs home

How does it start?

• The normalization literature distinguishes between factors that lead to the genesis of organizational deviance and factors that cause deviance to become routine, rather than idiosyncratic, behavior.

• A permissive ethical climate, an emphasis on financial goals at all costs, and the opportunity to act amorally or immorally all contribute to managerial decisions to initiate deviance.

DOES THE DAY OF WEEK MATTER?

Operations performed on Fridays were associated with a higher 30-day mortality rate than those performed on Mondays through Wednesdays: 2.94% vs. 2.18% (odds ratio, 1.36; 95% CI, 1.24–1.49).
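As a quick arithmetic check (not an additional result), the odds ratio follows directly from the two mortality rates quoted above:

OR = [0.0294 / (1 - 0.0294)] / [0.0218 / (1 - 0.0218)] = 0.0303 / 0.0223 ≈ 1.36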

Variation Caused by Trainees – The July Effect

• Anesthesia registrars in their first 4 months at the Alfred Hospital had worse patient outcomes than in the subsequent 8 months of the year.
• This relationship held for 1st- to 5th-year registrars.

Haller G, et al. BMJ 2009; Barach, Johnson. BMJ 2009.

CPR Quality during Cardiac Arrest

Two companion studies of CPR quality:

• Chest compressions were not delivered half of the time, and compressions were too shallow (“out-of-hospital”).

• Quality of multiple CPR parameters was inconsistent and often did not meet published guidelines (“in-hospital”).

Abella BS, Alvarado JP, Myklebust H, et al. Quality of cardiopulmonary resuscitation during in-hospital cardiac arrest. JAMA. 2005;293(3):305-310.

FEB 1, 2003, 8:59 EST

Space shuttle Columbia, re-entering Earth’s atmosphere at 10,000 mph, disintegrates:

– All 7 astronauts are killed
– $4 billion spacecraft is destroyed
– Debris scattered over 2,000 sq miles of Texas
– NASA grounds shuttle fleet for 2-1/2 years

Columbia – The Physical Cause

• Insulating foam separates from the external tank 81 seconds after lift-off
• Foam strikes the underside of the left wing, breaching thermal protection system (TPS) tiles
• Superheated air enters the wing during re-entry, melting aluminum struts
• Aerodynamic stresses destroy the weakened wing

A Flawed Decision Process

• Foam strike detected in launch videos on Day 2
• Engineers requested inspection by the crew or remote photo imagery to check for damage
• Mission managers discounted the significance of the foam strike
• No actions were taken to confirm shuttle integrity or prepare contingency plans

Seventeen Years Earlier…

• January 28, 1986, the shuttle Challenger explodes 73 seconds into its launch, killing all seven crew members
• Investigation reveals that a solid rocket booster (SRB) joint failed, allowing flames to impinge on the external fuel tank

Challenger…

• Liquid hydrogen tank explodes, rupturing the liquid oxygen tank
• The resulting massive explosion destroys the shuttle

Columbia – The Organizational Causes

“In our view, the NASA organizational culture had as much to do with this accident as the foam.”
– CAIB Report, Vol. 1, p. 97

• NASA had received painful lessons about its culture from the Challenger incident
• CAIB found disturbing parallels remaining at the time of the Columbia incident… these are the topic of this presentation

STS-107 Columbia Space Shuttle

• February 1, 2003: Space Shuttle Columbia and its 7-member crew are lost re-entering the Earth’s atmosphere
• The Columbia Accident Investigation Board’s independent assessment takes seven months

Columbia Accident Investigation Board

“Cultural norms tend to be fairly resilient… The norms bounce back into shape after being stretched or bent. Beliefs held in common resist alteration… This culture acted over time to resist externally imposed changes. By the eve of the Columbia accident, institutional practices that were in effect at the time of the Challenger accident had returned to NASA.”

Stages in Deviance

• Institutionalization refers to the process by which initial deviant decisions or acts become embedded in organizational structures and processes;
• Rationalization refers to the process by which new ideologies develop to justify and perhaps even valorize corruption; and
• Socialization refers to the process by which newcomers come to accept deviance as permissible, if not desirable.
• Each process “reinforces and in turn is reinforced by the other two.”

– Ashforth and Anand (2003)

Stakeholder Reactions

• Social control agents, whistle-blowers, and workers have all been shown to successfully challenge or reverse normalization processes.
• Institutionalized deviance typically continues until stopped from inside or outside the organization.
• Internally, whistle-blowers may step forward with accusations and evidence of wrongdoing.
• Externally, the media, prosecutors, or victims may challenge organizational actions.
• Actions of “social control agents”: the identification of external challengers to organizational deviance, such as the media and prosecutors.
• If community leaders and regulators do not forcibly respond to organizational deviance, then organizational members are likely to conclude that there are few regulatory consequences or normative improprieties in violating formal standards of behavior.

– Ermann and Lundman (2002)

The Role of Managers

• Normative standards of behavior are not simply imposed on managers by more powerful organizations such as the state or professional organizations.
• Managers themselves are participants in the construction of the commonly accepted standards of behavior under which they operate.
• A process of social learning and observation moves an organizational practice from an innovation that requires active efforts of sense-making to a routine behavior that operates as a habitual response to common organizational problems.

An incomplete governance process: cardiac surgery at UMC St Radboud

Report of the Onderzoeksraad voor de Veiligheid (Dutch Safety Board), 2008


Key Organizational Culture Findings – What NASA Did Not Do

1. Maintain Sense of Vulnerability
2. Combat Normalization of Deviance
3. Establish an Imperative for Safety
4. Perform Valid/Timely Hazard/Risk Assessments
5. Ensure Open and Frank Communications
6. Learn and Advance the Culture

Establish An Imperative for Safety

“When I ask for the budget to be cut, I’m told it’s going to impact safety on the Space Shuttle… I think that’s a bunch of crap.”
– Daniel S. Goldin, NASA Administrator, 1994

• The shuttle safety organization, funded by the programs it was to oversee, was not positioned to provide independent safety analysis
• The technical staff for both Challenger and Columbia were put in the position of having to prove that management’s intentions were unsafe
– This reversed their normal role of having to prove mission safety

Ensure Open and Frank Communications

I must emphasize (again) that severe enough damage… could present potentially grave hazards… Remember the NASA safety posters everywhere around stating, “If it’s not safe, say so”? Yes, it’s that serious.

– Memo that was composed but never sent

• Management adopted a uniform mindset that foam strikes were not a concern and was not open to contrary opinions.
• The organizational culture:
– Did not encourage “bad news”
– Encouraged 100% consensus
– Emphasized only “chain of command” communications
– Allowed rank and status to trump expertise

… An Epilog

• Shuttle Discovery was launched on 7/26/05
• NASA had formed an independent Return To Flight (RTF) panel to monitor its preparations
• 7 of the 26 RTF panel members issued a minority report prior to the launch
– Expressing concerns about NASA’s efforts
– Questioning whether Columbia’s lessons had been learned

… An Epilog

• During launch, a large piece of foam separated from the external fuel tank but fortunately did not strike the shuttle, which landed safely 14 days later
• The shuttle fleet was once again grounded, pending resolution of the problem with the external fuel tank insulating foam

Piper Alpha

• On 7/6/1988, a series of explosions and fires destroyed the Piper Alpha oil platform
• 165 platform workers and 2 emergency responders were killed
– 61 workers survived by jumping into the North Sea

Flixborough

• On 6/1/1974, a massive vapor cloud explosion (VCE) destroyed a UK chemical plant
• Consequences:
– 28 employees died and 36 were injured
– Hundreds of off-site injuries
– Approx. 1,800 homes and 170 businesses damaged

Piper Alpha – The Physical Cause

• Other interconnected platforms continued production, feeding the leaks on Piper Alpha
• Ensuing fires breached high-pressure natural gas inlet lines on the platform
• The enormity of the resulting conflagration prevented any organized evacuation

Parallels to NASA and Columbia

• Each organizational cause can be mapped to one or more of the NASA lessons:
– Maintain Sense of Vulnerability
– Combat Normalization of Deviance
– Establish an Imperative for Safety
– Perform Valid/Timely Hazard/Risk Assessments
– Ensure Open and Frank Communications
– Learn and Advance the Culture

Could this contribute to patient harm?

• Complacency due to our superior safety performance
• Normalizing our safety-critical requirements
• Ineffective risk assessments of our systems
• Reversing the burden of proof when evaluating safety of operations
• Employees not speaking freely about their safety concerns
• Business/productivity pressures at odds with safety priorities
• Failure to learn and apply learnings to improving our culture

Human Factors Contributing to Mishaps

• Normalization of deviance
• Poor communication
• Production pressure
• Fatigue and stress
• Emergency operations
• Inadequate provider experience
• Inadequate familiarity with equipment, devices, the surgical procedure, or the anesthetic technique
• Lack of skilled assistance or supervision
• Afferent overload (excess stimuli or noise)
• Normalcy bias (assuming alarms are ‘false alarms’)
• Faulty or absent policies and procedures

Prielipp R. Anesthesia & Analgesia. 2010;110(5):1499-1502.

Indicators Of Organizational Culture Weaknesses

The following slides provide examples of indicators that emerged in the inquiries around Bristol, Mid Staffordshire, Garling, and Nijmegen…

…NOT Maintaining a Sense of Vulnerability

• Safety performance has been good… and you do not recall the last time you asked “But what if…?”
• Assuming your safety systems are good enough
• Treating critical alarms as operating indicators
• Allowing backlogs in addressing RCAs and FMEAs of critical equipment
• Actions are not taken when trends of similar deficiencies are identified.

…NOT Preventing the Normalization of Deviance

• Allowing operations outside established safe operating limits without detailed risk assessment
• Willful, conscious violation of an established procedure is tolerated without investigation or consequences for the persons involved
• Staff cannot be counted on to strictly adhere to safety policies and practices when supervision is not around to monitor compliance
• Tolerating practices or conditions that would have been deemed unacceptable a year or two ago

…NOT Establishing An Imperative for Safety

• Staff monitoring safety-related decisions are not technically qualified or sufficiently independent
• Key process safety management positions have been downgraded over time or left vacant
• Recommendations for safety improvements are resisted on the grounds of cost or schedule impact
• No system is in place to ensure an independent review of major safety-related decisions
• Audits are weak, not conducted on schedule, or are regarded as negative or punitive and, therefore, are resisted

…NOT Performing Valid/Timely Hazard/Risk Assessments

• Availability of experienced resources for hazard or risk assessments is limited
• Assessments are not conducted according to schedule
• Assessments are done in a perfunctory fashion, or seldom find problems
• Recommendations are not meaningful and/or are not implemented in a timely manner
• Bases for rejecting risk assessment recommendations are mostly subjective judgments or are based upon previous experience and observation.

…NOT Ensuring Open and Frank Communications

• The bearer of “bad news” is viewed as “not a team player”
• Safety-related questioning is “rewarded” by requiring the questioner to prove that he or she is correct
• Communications get altered, with the message softened, as they move up or down the management chain
• Safety-critical information is not moving laterally between work groups
• Employees cannot speak freely, to anyone else, about their honest safety concerns without fear of career reprisals.

…NOT Learning and Advancing the Culture

• Recurrent problems are not investigated, trended, and resolved
• Investigations reveal the same causes recurring time and again
• Staff express concerns that standards of performance are eroding
• Concepts once regarded as organizational values are now subject to expedient reconsideration

Whistle-Blowing Dangers

• Whistle-blowing is so “fraught with career-threatening outcomes, that when three individuals did it in one year, they were named Time Magazine’s Persons of the Year in 2002.”

EXTRA FOR DISCUSSION


Conforming the Model to Real Life

[Figure: Systemic Migration to Boundaries. Practice begins in the expected safe space of action defined by professional standards (safety regulations and good practices, certification/accreditation standards). Under pressure from market demand, technology, individual benefits, and individual concerns (time on duty, quality of life), everyday performance drifts into an “illegal normal” zone of real-life standards and borderline tolerated conditions of use (BTCUs), and can migrate further into a very unsafe, “illegal illegal” space that ends in an accident. The countermeasures noted are detecting deviance and adapting the barriers to real conditions.]

ADOPTING A SYSTEMS APPROACH

[Figure: Fatal risk per exposure across activities, from roughly 10^-2 (very unsafe) to 10^-6 (ultra safe), with no system beyond that point. Himalaya mountaineering, microlight flying, and professional fishing sit near 10^-2; road safety, chartered flights, and the chemical industry occupy the middle of the scale; civil aviation, the nuclear industry, and French railways approach 10^-6. Total medical risk and fatal iatrogenic adverse events lie in between, with cardiac surgery in ASA 3–5 patients toward the unsafe end and anesthesiology in ASA 1 patients toward the ultra-safe end. The figure contrasts two safety models, described below.]
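To make the 10^-2 to 10^-6 “fatal risk per exposure” scale concrete, here is a minimal, illustrative Python sketch (the numbers in the example are hypothetical, not taken from any study cited in this talk) showing how a raw count of deaths and exposures maps onto these order-of-magnitude bands:

    import math

    def risk_band(fatalities, exposures):
        # Fatal risk per exposure, e.g. deaths per operation or per flight.
        risk = fatalities / exposures
        # Order-of-magnitude band: 3e-5 falls in the 10^-5 band.
        exponent = math.floor(math.log10(risk))
        return risk, exponent

    # Hypothetical illustration only: 3 deaths in 100,000 exposures.
    risk, exponent = risk_band(3, 100_000)
    print(f"risk = {risk:.1e} per exposure (10^{exponent} band)")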

Professional fishing

Context: taking up the gauntlet, facing new dangers and unstable environments are inherent traits of the job (and its rewards). Deaths occur here and there and concern people who expect to benefit from risk taking.

Safety model: give me the best chances and the safest tools to survive in these adverse conditions. For example, a fisherman in a stormy sea faces an increased risk of collision with tankers; a safety improvement consists in providing him with a traffic collision avoidance system.

Ultra-safe systems

Context: catastrophes cause numerous deaths among people not willing to take risk.

Safety model: de-expose to risk; bet on the organization to keep workers and users unexposed to risk. For example, a flight to Miami will remain grounded at its departure airport until the storm over Miami ends.

Amalberti R, Auroy Y, Berwick D, Barach P. Five system barriers to achieving ultrasafe health care. Ann Intern Med. 2005;142(9):756-764.

Toward a strategic view on medical safety – a tentative mapping exercise

[Figure: A tentative mapping of clinical services onto three safety models, plotted from unsafe to ultra safe (fatal risk roughly 10^-2 to 10^-6) and from ultra adaptive to market demands and non-standard cases (learning systems) to non-adaptive (poor learning systems). The resilient model bets on individuals’ competence and expertise; the HRO model bets on sense-making, cognitive maps, a global vision, and procedures and team regulations; the ultra-safe model bets on systems supervision but is incompatible with strong market demands, while the most adaptive end is incompatible with social risk acceptance. Services such as emergency triage, post-op care, ICU, radiotherapy, imaging, biology, and dialysis are spread along this range, shifted by technical progress, “cherry picking” (the percentage of patients excluded), and the choice between optimizing within a model and shifting models, each with its own range of local optimization. Performance shaping factors (PSFs) characterize the environment along the patient journey: unstable environments with a high frequency of surprises (crisis), cooperative environments where early recovery is the priority, and planned environments where protocol and compliance are the priority.]

Provided you are not able to change the PSFs, safety improvements are greater when you bet on consistency and local improvement of the most appropriate model, rather than on a potentially higher-performing model that is inadequate to the demand. The worst strategy consists in mixing surface traits from different models.


Impact of safety strategies