
My Measured Experience

Brent C. James, MD, MStat

Chief Quality Officer; Executive Director, Intermountain Institute for Health Care Delivery Research, Intermountain Healthcare; Salt Lake City, Utah

Objectives:
• Discuss the gap between current care delivery and the transition results
• Describe the difference between improvements and research
• Discuss the need to move from managed care to managing processes of care
• Explain the application of process control theory to clinical processes and a multidisciplinary team approach

My Measured Experience

Brent C. James, M.D., M.Stat.
Executive Director, Institute for Health Care Delivery Research
Intermountain Healthcare
Salt Lake City, Utah, USA

Intermountain Healthcare
Excellence in Trauma and Critical Care

Hyatt Escala Lodge -- The Canyons Resort, Park City, Utah
Friday, 19 September 2014 -- 3:00p - 4:30p

Disclosures

Neither I, Brent C. James, nor any family members have any relevant financial relationships with anything discussed, referred to, or illustrated, directly or indirectly, with or without recognition, within this presentation.

I have no financial relationships beyond my employment at Intermountain Healthcare.

Outline

1. A framework for practice: the Craft of Medicine

2. Two "inconvenient truths":(1) The rate of increase of new medical knowledge(2) Limitations of the "unaided expert eye" - limited ability to see

cause and effect (effect size) and inability to estimate rates

3. The Learning Health Care System:Clinical transparency and knowledge management as the foundation for excellent care delivery practice

The healing professions

We police our own ranks -- acting on behalf of patients, we assure that all members of the healing profession respect our fiduciary trust and are competent (a social contract; the 'official' definition of "professional autonomy")

We maintain a special body of knowledge -- as clinicians, (1) we practice - we apply knowledge not generally available outside of the professions (information disparity). (2) We teach - we transmit that knowledge to the next generation. And (3) we learn - we improve the knowledge we ourselves received.
(e.g., Geisinger Health System mission: Heal. Teach. Discover. Serve; Stanford Hospital & Clinics mission: To Care, To Educate, To Discover)

We put our patients first -- as clinicians, we place our patients' health needs before any other end or goal; we act as our patients' advocates. We accept, promote, and honor a fiduciary trust on behalf of our patients.

The craft of medicine

An individual physician,

placing her patient's health care needs before any other end or goal,

drawing on extensive clinical knowledge gained through formal education and experience,

can craft a unique diagnostic and treatment regimen

customized for that particular patient.

This approach will produce the best result possible for each patient.

Medicine's promise:

"Our minds are interpreters of evidence. We can accurately convert all forms of evidence (formal evidence, observations, experiences, colleague's experiences) into conclusions, which in turn determine our actions."

The Craft's core assumption

"Therefore, no one has to tell us what to do. Just give us the evidence and we will figure it out. Besides, there are lots of other factors that need to be considered. This can only be done with clinical judgment."

Evidence → Our minds → Conclusions → Actions

Dr. David Eddy

Growth rate for Level I evidence

Bastian H, Glasziou P, Chalmers I. Seventy-five trials and 11 systematic reviews a day: how will we ever keep up? PLoS Med 2010; 7(9):e1000326 (Sep).

Shaneyfelt TM. Building bridges to quality. JAMA 2001; 286(20):2600-2601 (Nov 28).

Exploding knowledge base

To maintain current knowledge, a general internist would need to read 20 articles per day, 365 days of the year ...

an impossible task ...

3 to 4 years after board certification, internists -- both generalists and subspecialists -- begin to show "significant declines in general medical knowledge" ...

14 to 15 years postcertification, ~68% of internists would not have passed the American Board of Internal Medicine certifying exam ...

Until now, we have believed that the best way to transmit knowledge from its source to its use in patient care is to first load the knowledge into human minds … and then expect those minds, at great expense, to apply the knowledge to those who need it. However, there are enormous ‘voltage drops' along this transmission line for medical knowledge.

Lawrence Weed

Weed LL. New connections between medical knowledge and patient care. BMJ 1997; 315(7102):231-5 (Jul 26).

Parachutes reduce the risk of injury after gravitational challenge, but their effectiveness has not been proved with randomised controlled trials

Smith G & Pell J. Parachute use to prevent death and major trauma related to gravitational challenge: systematic review of randomised controlled trials. BMJ 2003; 327:20-7 (Dec 20).

Data systems: BMJ, 2003

What is already known about this topic
Parachutes are widely used to prevent death and major injury after gravitational challenge

Parachute use is associated with adverse effects due to failure of the intervention and iatrogenic injury

Studies of free fall do not show 100% mortality

What this study adds
No randomised controlled trials of parachute use have been undertaken

The basis for parachute use is purely observational, and its apparent efficacy could potentially be explained by a "healthy cohort" effect

Individuals who insist that all interventions need to be validated by a randomised controlled trial need to come down to earth with a bump

Smith G & Pell J. Parachute use to prevent death and major trauma related to gravitational challenge: systematic review of randomised controlled trials. BMJ 2003; 327:20-7 (Dec 20).

Effect size

The aim is to establish cause and effect --to accurately perceive real results.

Sometimes, results are visible to the naked eye.

Most times, they are not ...

Measurement acts as our eyes; it provides our ability to see.

You manage what you measure.
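To make that point concrete, here is a minimal sketch (not from the talk, standard library only) of how many patients it takes before a modest but real improvement becomes detectable at all; the 8% and 6% complication rates and the alpha/power settings are illustrative assumptions, using the standard normal approximation for two proportions.

```python
# A minimal sketch (illustrative assumptions) of why a modest but real effect
# is invisible to the unaided eye: the sample needed to detect it reliably
# dwarfs any single clinician's experience.
from statistics import NormalDist

def patients_per_group(p1: float, p2: float,
                       alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate patients per group to detect a change from rate p1 to p2
    (two-sided test, normal approximation for two independent proportions)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # critical value for the test
    z_beta = z.inv_cdf(power)            # quantile for the desired power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return int((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2) + 1

if __name__ == "__main__":
    # Hypothetical example: a complication rate falling from 8% to 6%.
    print(patients_per_group(0.08, 0.06))
```

With these made-up numbers the answer is on the order of 2,500 patients per group - far more than any unaided clinician could "see" in routine practice.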

Expert consensus?

[Figure: distribution of the experts' estimates; x-axis: Probability of Outcome (%), 0 - 100]

"The practitioners, all experts in the field, were then asked to write down their beliefs about the probability of the outcome" ... "that would largely determine his or her belief about the proper use of the health practice, and the consequent recommendation to a patient."

Eddy. A Manual for Assessing Health Practices & Designing Practice Policies: The Explicit Approach. Philadelphia, PA: The American College of Physicians, 1992; pg. 14.

Eddy: "You can find a physician who honestly believes (and will testify in court to) anything you want."

(chance of a spontaneous rupture of a silicone breast implant - 58 expert physicians)

Expert consensus?

Experts' estimates of outcomes following implantation of artificial heart valves:

Outcome           Xenograft    Mechanical
Valve failures    3% - 95%     0% - 25%
Death             3% - 40%     2% - 50%
Hemorrhage        0% - 30%     1% - 65%
Embolization      0% - 30%     2% - 50%

c/o David Eddy

Expert consensus?

[Figure: distribution of the experts' estimates; x-axis: Reduction in Incidence (%), 0 - 100]

Eddy. Variations in physician practice: the role of uncertainty. Health Affairs. 1984; 3:74.

"At a recent meeting of experts in colorectal cancer detection ... the attendees were asked ... `What is the overall reduction in cancer incidence and mortality that could be expected ...' The answer to this question is obviously central to any estimate of the value of fecal occult blood testing ..." and flexible sigmoidoscopy.

[Figure: distribution of the experts' estimates; x-axis: Reduction in Mortality (%), 0 - 100]

Deploying guidelines?

"agreed with content":"changed practice":

Carefully-developed repeat C-section guidelineDistributed by professional society

87- 94%33%

Physician survey:

test on guideline contents:actual repeat C-section rates:"slight change in actual practice"

67%15- 49%

Measurement:

above reported

Lomas et. al. Do practice guidelines guide practice? NEJM 1989; 321(19):1306-11 (Nov 9)

understood

What would we find if ...

we applied the rigorous measurement tools of clinical research

to

routine care delivery performance?

Quality, Utilization, & Efficiency (QUE)

Six clinical areas studied over 2 years:
- transurethral prostatectomy (TURP)
- open cholecystectomy
- total hip arthroplasty
- coronary artery bypass graft surgery (CABG)
- permanent pacemaker implantation
- community-acquired pneumonia

Pulled all patients treated over a defined time period across all Intermountain inpatient facilities - typically 1 year

Identified and staged (relative to changes in expected utilization):
- severity of presenting primary condition
- all comorbidities on admission
- every complication
- measures of long term outcomes

Compared physicians with a meaningful # of cases (low volume physicians included in parallel analysis, as a group) - see the sketch below
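As a hedged illustration of that comparison step (not Intermountain's actual analysis code), the sketch below groups cases by attending physician, computes median surgery time and cost per physician, and pools low-volume physicians into a single parallel group. The field names and the 10-case threshold are assumptions.

```python
# A minimal sketch (not Intermountain's analysis code) of the QUE comparison:
# per-physician medians, with low-volume physicians pooled into one parallel
# group. Field names and the MIN_CASES threshold are illustrative assumptions.
from collections import defaultdict
from statistics import median

MIN_CASES = 10  # hypothetical cutoff for a "meaningful" number of cases

def compare_physicians(cases):
    """cases: iterable of dicts with 'physician', 'surgery_minutes', 'cost'."""
    by_physician = defaultdict(list)
    for case in cases:
        by_physician[case["physician"]].append(case)

    def summarize(group):
        return {"n": len(group),
                "median_minutes": median(c["surgery_minutes"] for c in group),
                "median_cost": median(c["cost"] for c in group)}

    results, low_volume_pool = {}, []
    for physician, physician_cases in by_physician.items():
        if len(physician_cases) >= MIN_CASES:
            results[physician] = summarize(physician_cases)
        else:
            low_volume_pool.extend(physician_cases)  # analyzed as a group
    if low_volume_pool:
        results["low-volume group"] = summarize(low_volume_pool)
    return results
```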

IHC TURP QUE Study: Median Surgery Minutes vs. Median Grams Tissue

[Figure: median surgical time and median grams of tissue removed, plotted for each of 16 attending physicians; y-axis: grams tissue / surgery minutes, 0 - 100]

Intermountain TURP QUE Study: Average Hospital Cost

[Figure: average hospital cost for each of 16 attending physicians (A - P), ranging from $1,164 to $2,233; y-axis: dollars, 0 - 2,500]

Poor evidence for most practices
The inherent complexity of modern medicine, versus
well-known limitations of the unaided expert mind

lead to

Huge variations in beliefs
Well-documented, massive variations in practices
High rates of inappropriate care
Unacceptable rates of preventable patient injury
A striking inability to "do what we know works"
Wasted resources on a large scale (= decreased access to care)

The core assumption is untenable

Dr. David Eddy

Other factors affect our decisions

Evidence → Our minds → Conclusions → Actions

Dr. David Eddy

If our minds can't do the work very well, there are all sorts of other things to fill the void:

Professional interests
Financial interests
Clinician preferences and personal tastes
Desire to have something to offer (Rule of Rescue)
Love for the work
Wishful thinking
Selective memory
Pressure from patients and family (direct-to-consumer advertising)
Legal considerations (defensive medicine)

Limited, complex evidence → huge ranges of uncertainty → massive variation, inappropriate care

"Managed care" at the bedside

1996: Key process analysis (strategic)
(Education programs: A learning organization)
(A shared vision for a future state)

1997: Integrated management information systems (an outcomes tracking system)

1998: Integrated clinical / operations management structure

1999: Integrated incentives (aligned)
- cost structure vs. net income
- integrated facility / medical expense budgets
(mediated by payment mechanisms)

2000: Full roll-out and administrative integration

The idea of "patient-centered care"

Institute of Medicine, 2001 (Crossing the Quality Chasm)

It's not "health care," but "disease treatment"patients approach the care system around a health need

Health care system typically organize around:- physicians- technologies (e.g., MRI scanner)- buildings (e.g., a hospital or clinic)

What if we instead organized around patients?- condition-specific disease treatment processes

Often called "continuum of care"

Identifies patients before the disease starts (primary prevention)

Tracks them as the disease develops (screening, detection)

Follows them through treatment (the complete care delivery process)

To eventual resolution (health problem resolution or patient death)

Key clinical process analysis

Within Intermountain, we initially identified > 1400 inpatient and outpatient "work processes" that corresponded to clinical conditions (e.g., "pregnancy, labor, and delivery;" "management of ischemic heart disease;" "management of Type II diabetes mellitus")

Pareto Principle
The "80/20 rule," or "the Vital Few":

Italian economist Vilfredo Pareto, 1848-1923


The IOM Chasm report: Design for the usual, but recognize and plan for the unusual.

104 of those work processes (~7%) accounted for 95% of all of our inpatient and outpatient care delivery.
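A minimal sketch of the key process analysis arithmetic: rank work processes by volume and keep the smallest set that covers a target share of care. The volumes and the helper name vital_few are hypothetical; the talk's reported result is that 104 of the ~1,400 processes (~7%) covered 95% of care.

```python
# A minimal sketch of the Pareto-style key process analysis: rank processes by
# volume and keep the smallest set reaching the target share of all care.
# The volumes below are hypothetical placeholders, not Intermountain data.
def vital_few(process_volumes, coverage=0.95):
    """Return the highest-volume processes whose combined volume first
    reaches `coverage` (e.g. 0.95) of the total."""
    total = sum(process_volumes.values())
    running, selected = 0, []
    for name, volume in sorted(process_volumes.items(),
                               key=lambda item: item[1], reverse=True):
        selected.append(name)
        running += volume
        if running / total >= coverage:
            break
    return selected

# Usage sketch: feeding in ~1,400 condition-level processes is the kind of
# analysis that, per the talk, selected 104 processes covering 95% of care.
example = {"pregnancy, labor, and delivery": 31000,
           "management of ischemic heart disease": 24000,
           "management of Type II diabetes mellitus": 18000,
           "rare condition X": 40}
print(vital_few(example))
```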

Three ways to design a data system

1. Availability - use whatever data we currently collect
- financial data (time-driven activity-based costing data (TD-ABC), insurance claims)
- regulatory compliance data
- electronic medical record (EMR) data (computable form = encoded concepts)

(Danger: "availability bias")

2. Expert opinion - gather a group of experts in a conference room, ask them what they need / what would be useful

(Danger: O'Connor's "recreational data collection")

3. Clinical aim - patient entry and exclusion criteria, stratification criteria, intermediate and final clinical and cost outcome data arranged along clinical workflows (strong evidence originally derived from data system design for randomized trials)
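To show what a "clinical aim" specification might look like in practice, here is a minimal sketch under assumed field names (the pneumonia criteria are illustrative, not an Intermountain specification): the data system is defined from entry/exclusion, stratification, and outcome requirements outward, the way a trial's data system would be.

```python
# A minimal sketch of a "clinical aim" data specification: the data system is
# defined from entry/exclusion, stratification, and outcome requirements
# arranged along the clinical workflow, as in trial data-system design.
# The pneumonia criteria and field names are hypothetical examples.
from dataclasses import dataclass

@dataclass
class ClinicalAimSpec:
    condition: str
    entry_criteria: list          # who belongs in the denominator
    exclusion_criteria: list      # who must be removed from it
    stratification: list          # severity / risk groups for fair comparison
    intermediate_outcomes: list   # collected along the clinical workflow
    final_outcomes: list          # clinical and cost endpoints

pneumonia_spec = ClinicalAimSpec(
    condition="community-acquired pneumonia",
    entry_criteria=["age >= 18", "radiographically confirmed infiltrate"],
    exclusion_criteria=["hospital-acquired infection", "immunosuppression"],
    stratification=["severity class on admission"],
    intermediate_outcomes=["time to first antibiotic dose"],
    final_outcomes=["30-day mortality", "length of stay", "total cost"],
)
```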

Designing data for clinical performance

1. Build a conceptual model - usually a conceptual flow chart

2. Generate a list of desired reports
- use conceptual model plus outcomes heuristic
- format: annotated run charts / SPC charts (see the sketch after this list)
- test with target end users

3. Generate a list of necessary data elements
- use list of desired reports; think numerators/denominators, entry/exclusion, etc.
- format: coding manual --> self-coding data sheets
- test (crosswalk) final self-coding data sheets against report list
- test manually, at front lines

4. Negotiate what you want with what you have
- identify data sources for each element: existing/new, automated/manual
- consider value of final report vs. cost of getting necessary data

5. Design EDW structure (data mart, data flows, manual data, etc.)

6. Program analytic routines, display subsystems

7. Test final reporting system
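As a sketch of the annotated run chart / SPC report format named in step 2 (not Intermountain's reporting code), the example below computes individuals-chart (XmR) control limits from the moving range and flags points outside them; the monthly complication rates are made-up data.

```python
# A minimal sketch of the run chart / SPC format: individuals-chart (XmR)
# limits computed from the average moving range, with out-of-limit points
# flagged for investigation. The monthly rates are made-up illustration data.
from statistics import mean

def xmr_limits(values):
    """Return (center line, lower limit, upper limit) for an XmR chart."""
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    center = mean(values)
    sigma_hat = mean(moving_ranges) / 1.128   # d2 constant for subgroups of 2
    return center, center - 3 * sigma_hat, center + 3 * sigma_hat

monthly_rate = [4.1, 3.8, 4.4, 4.0, 3.9, 4.2, 4.3, 3.7, 4.1, 4.0, 7.5, 4.2]
center, lcl, ucl = xmr_limits(monthly_rate)
for month, value in enumerate(monthly_rate, start=1):
    note = "  <-- outside limits: investigate" if not lcl <= value <= ucl else ""
    print(f"month {month:2d}: {value:4.1f}"
          f"  (CL {center:.2f}, limits {lcl:.2f} to {ucl:.2f}){note}")
```

With these made-up data, the month-11 spike falls outside the upper limit and would be annotated for follow-up.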

Measuring for clinical management

We already had "sophisticated" automated data:
- financial systems (claims data)
- time-based Activity Based Costing (since 1983)
- clinical data for government reporting (JCAHO, CMS Core Measures, etc.)
- other automated data (lab, pharmacy, blood bank, etc.)
(Danger! Availability bias!)

Still missing 30 - 50% of data elements essential for clinical management (the reason that the 2 initial Intermountain initiatives for clinical management failed)

We deployed a methodology to identify critical data elements for clinical management, then built them into clinical workflows (Danger! Recreational data collection!)

Enterprise Data Warehouse (EDW)

Currently tracks 58 clinical processes representing about 80% of all care delivered within Intermountain

Follows every patient longitudinally over time: condition-specific clinical, cost, and service process and outcomes

about 2 petabytes (million gigabytes) of storage

primary use: routine clinical management
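The sketch below illustrates, under assumed (hypothetical) record fields, the kind of longitudinal, condition-specific rollup such a warehouse supports: encounters are grouped per patient within each clinical process, ordered in time, and summarized into cost and outcome measures for routine management. It is an illustration, not Intermountain's EDW schema.

```python
# A minimal sketch, under assumed record fields, of a longitudinal,
# condition-specific rollup: each patient's encounters are kept in time order
# within a clinical process, then summarized for routine clinical management.
from collections import defaultdict
from statistics import mean

def rollup(encounters):
    """encounters: iterable of dicts with 'process', 'patient_id', 'date',
    'cost', and 'outcome_met' (bool). Returns one summary per clinical process."""
    by_process = defaultdict(lambda: defaultdict(list))
    for e in encounters:
        by_process[e["process"]][e["patient_id"]].append(e)

    summary = {}
    for process, patients in by_process.items():
        costs, outcomes = [], []
        for visits in patients.values():
            visits.sort(key=lambda v: v["date"])           # longitudinal order
            costs.append(sum(v["cost"] for v in visits))   # total cost per patient
            outcomes.append(visits[-1]["outcome_met"])     # most recent status
        summary[process] = {"patients": len(patients),
                            "mean_cost_per_patient": mean(costs),
                            "outcome_rate": sum(outcomes) / len(outcomes)}
    return summary
```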

The clinician as a "trusted advisor"

True transparency:
a situation in which those involved in health care choices (patients, health professionals, payers) have sufficiently accurate, complete, and understandable information about expected clinical results to make wise decisions.

Such choices involve not just the selection of a health plan, a hospital, or a physician, but also the series of testing and treatment decisions that patients routinely face as they work their way through diagnosis and treatment.

Most clinicians don't know (don't measure, or have easy access to) their own short- and long-term clinical outcome results.

As a result, they cannot accurately advise patients regarding treatment choices.


Managing clinical knowledge

Core work group (knowledge expert) responsibility - build and maintain the Care Process Model:

Initial development phase
1. Generate initial evidence-based best practice guideline (flowchart)
2. Blend the guideline into clinical workflow (clinical flow sheets, standing order sets, etc.)
3. Design outcomes tracking reports (using the electronic data warehouse)
4. Design and coordinate decision support (electronic medical record)
5. Design patient and professional education materials

Maintenance phase
6. Keep the Care Process Model current (protocol variations; patient outcomes; external research findings; improvement suggestions)
7. Academic detail front-line teams (Clinical Learning Days)
8. Run the referral clinic (last step in treatment cascade)
9. Manage specialist care managers
10. Conduct Level I research

The Learning Health Care System

1. Build a system to manage care

2. Justify the required major financial investment on the basis of care delivery performance -- "the best clinical result at the lowest necessary cost"

3. Use the resulting clinical management data system to:
(a) Generate true transparency at the clinician-patient level, rolling up to the national level
(b) "Learn from every patient" - integrate clinical effectiveness research into front-line care

5 levels of clinical research

1. Rapid impact on care delivery performance (best medical result at lowest necessary cost)
- internally funded - patient care dollars
- publication, external grant funding = "icing on the cake"

2. Investigator-initiated research
- traditional academic model
- external grant funding

3. Collaborations with external investigators
- multi-center trials
- local universities
- requires an internal "champion"

4. Industry-based groups (pharma, device manufacturers)

5. "Research" done by affiliated medical staff independent of Intermountain

2013 "Level 1" learning productionNICU Development Team: 23 peer-reviewed articles

Cardiovascular Clinical Program (3 Development Teams):64 peer-reviewed articles67 abstracts15 "other" - book chapters, editorials, etc.

Other Clinical Development Teams also published (just not as prolific as Women & Newborn and CV)

Presently calculating direct impact on cost of operations

Goal: 1,000 peer-reviewed Level 1 publications in a single year (sometime before I retire)

Better has no limit ...

an old Yiddish proverb