
Improving the Jobs of Direct Care Workers in Long Term Care: Findings from the Better Jobs Better Care Demonstration

Penn State Evaluation Team: Peter Kemper, Diane Brannon, Brigitt Heier, Amy Stott, Monika Setia, Joseph Vasey, Jungyoon Kim, and Candy Warner

June 8, 2008

Presented at the annual meeting of AcademyHealth. We thank The Atlantic Philanthropies, The Robert Wood Johnson Foundation, and the Office of the Assistant Secretary for Planning and Evaluation for funding.

(Contract No. HHSP23320044303EC)

BJBC Demonstration

• Goal: Improve direct care workers’ job quality and reduce turnover
• Direct care workers: Provide help with personal care
• Interventions: BJBC training and technical assistance intended to improve management practices
• Where:
  – Five state projects
  – 124 providers
  – Skilled nursing, assisted living, home care, adult day service

BJBC’s Intended Effects: Basic Framework

BJBC Interventions
→ Providers
  • Implementation
  • Management Practices
→ Direct Care Workers
  • Job outcomes
  • Turnover

BJBC Interventions: Technical Assistance and Training

Interventions offered, by state project (IA, NC, OR, PA, VT):
• Top Management Training (2 of 5 states)
• Supervisor Training (3 of 5 states)
• Team Building (4 of 5 states)
• DCW Career Development (4 of 5 states)
• Caregiving Skill Development (3 of 5 states)

Approach to Evaluation

• Evaluation not designed with a control group
  – Before-after evaluation design and data
  – Sought methods of strengthening design
• “Let a thousand flowers bloom”: demonstration interventions not standardized or known
  – Measured range of management practices
  – Developed measures of extent of implementation

Methods for Estimating Effects

• Basic approach
  – Before-after comparison of means
  – Post-intervention trends compared with national trends
• Difference-in-difference: compare changes in
  – States with and without specific interventions
  – Providers that did and did not implement
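As a rough, hypothetical sketch of the two-period difference-in-difference comparison (not the evaluation's actual code), the snippet below fits the standard interaction model with statsmodels; the data file and the column names outcome, post, treated, and provider_id are assumptions made for illustration.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical panel: one row per worker (or provider) per period, with
#   outcome     - a job-quality or turnover measure
#   post        - 1 if observed after the intervention period, else 0
#   treated     - 1 if the state (or provider) had the specific intervention, else 0
#   provider_id - identifier used to cluster standard errors
df = pd.read_csv("bjbc_panel.csv")  # placeholder file name

# The coefficient on post:treated is the difference-in-difference estimate:
# (change among treated) minus (change among untreated).
model = smf.ols("outcome ~ post * treated", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["provider_id"]}
)
print("DiD estimate:", round(model.params["post:treated"], 3))
```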

Analyses Presented

BJBC Interventions
→ Providers
  1. Implementation
  2. Management Practices
→ Direct Care Workers
  3. Job outcomes
  4. Turnover

Methods Used in Impact Analyses

Methods: (a) before-after or trend comparison; (b) states with and without the intervention; (c) providers that did and did not implement

• Management Practices: √ √
• Job Outcomes: √ √
• Turnover: √ √

Data

• Telephone interviews with project managers
• Survey of clinical managers
• Survey of frontline supervisors
• Survey of direct care workers
• Hiring and termination information system

Data Used in Analyses

Data sources: practice manager interviews; clinical manager survey; frontline supervisor survey; direct care worker survey; hiring and termination information system

• Implementation: √ √ √
• Management Practices: √
• Job Outcomes: √
• Turnover: √

Measuring Implementation in Formative Evaluations: Using Data from Multiple Perspectives

Peter Kemper, Brigitt Heier, Joe Vasey, and Diane Brannon

June 8, 2008

Presented at the annual meeting of AcademyHealth. The authors are grateful for support from The Atlantic Philanthropies, The Robert Wood Johnson Foundation, and the Office of the Assistant Secretary for Planning and Evaluation (Contract No. HHSP23320044303EC).

Motivation

• Variation in implementation observed in early site visits

• Mid-course correction: Add implementation measures

• Goal: Develop a summary index of extent of provider implementation for use in impact analysis

Measures from Three Perspectives

• Practice Manager (state project level)

• Clinical Manager (provider level)

• Frontline Supervisors (provider level)

Practice Manager Perspective

• “Make a mark on the scale that best describes this provider’s current degree of implementation”
  – 0: Implementation of interventions has not yet started
  – 100: Interventions are fully implemented and sustainable

Clinical Manager Perspective

• “Indicate the level of progress your organization has made in implementing the most important intervention”
  – 0: Implementation of the intervention has not yet started
  – 10: The intervention is fully implemented and sustainable
• “The programs that are part of BJBC have been well executed in your organization”
  – Five-point scale from strongly disagree to strongly agree

Frontline Supervisor Perspective

• “The programs that are part of BJBC have been well executed in your organization”
  – Five-point scale from strongly disagree to strongly agree

• Averaged across supervisors in each provider
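As a minimal sketch of that averaging step (the file name and the column names provider_id and execution_rating are hypothetical):

```python
import pandas as pd

# Hypothetical supervisor survey extract: one row per frontline supervisor, with
#   provider_id      - the provider the supervisor works for
#   execution_rating - 1-5 agreement that the BJBC programs were well executed
supervisors = pd.read_csv("supervisor_survey.csv")  # placeholder file name

# Provider-level measure: mean supervisor rating within each provider
provider_execution = supervisors.groupby("provider_id")["execution_rating"].mean()
```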

Methods

• Exploratory factor analysis (a rough code sketch follows below)
  – Principal components extraction method
  – Extracted the component with an eigenvalue greater than 1
  – Included items if the factor loading was .6 or greater
• Imputed values when one or two items were missing, using a maximum likelihood procedure
• Sample size: 92 providers
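As a rough illustration of this extraction step (not the evaluation's actual code), the sketch below standardizes the four implementation items, extracts principal components from their correlation matrix, keeps components with eigenvalues above 1, and reports loadings; the input file and the array `items` are assumptions for illustration.

```python
import numpy as np

# Hypothetical input: rows = 92 providers, columns = the four implementation items
# (practice manager score, clinical manager score, and the two execution ratings),
# already imputed where one or two items were missing.
items = np.loadtxt("implementation_items.csv", delimiter=",")  # placeholder file

Z = (items - items.mean(axis=0)) / items.std(axis=0)   # standardize each item
R = np.corrcoef(Z, rowvar=False)                        # item correlation matrix

eigvals, eigvecs = np.linalg.eigh(R)                    # eigen-decomposition (ascending)
order = np.argsort(eigvals)[::-1]                       # sort descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

keep = eigvals > 1.0                                    # Kaiser criterion: eigenvalue > 1
loadings = eigvecs[:, keep] * np.sqrt(eigvals[keep])    # principal-component loadings
scores = Z @ eigvecs[:, keep]                           # provider-level component scores

print("eigenvalues:", np.round(eigvals, 2))
print("loadings on first component:", np.round(loadings[:, 0], 2))
print("share of variance explained:", round(eigvals[0] / len(eigvals), 2))
```

Items whose loading on the retained component is .6 or higher would be kept, matching the inclusion rule above.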

Factor Loadings

Measure Loading

Implementation score (Practice Manager) .70

Implementation score (Clinical Manager) .76

Execution of BJBC programs (Clinical Manager) .80

Execution of BJBC programs (Frontline Supervisor) .76

The factor has an eigenvalue of 2.2 and explains 55% of the variance (2.2 of the 4 items' total standardized variance).

Distribution of Factor Scores

Mean: .00
Median: -.05
Minimum: -2.84
Maximum: 2.18
Skewness: -.19

Factor Score Re-scaled to 0-1 Range

Mean: 0.56
Median: 0.56
Minimum: 0.00
Maximum: 1.00
Skewness: -.23
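The re-scaled values are consistent with a simple min-max transformation of the factor score (an inference from the reported minimum, maximum, and median, not stated explicitly on the slides):

$$\tilde{s}_i = \frac{s_i - \min_j s_j}{\max_j s_j - \min_j s_j}, \qquad \text{e.g.}\ \frac{-0.05 - (-2.84)}{2.18 - (-2.84)} \approx 0.56\ \text{for the median.}$$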

Implementation Index Is Related to Underlying Measures

Y = -1.85 + 0.032X

How We’ll Use Index in Analyzing Effects

• Difference-in-difference approach (see the sketch after this list)
  – Divide providers into two groups: above and below the median implementation index
  – Compare the difference between the two groups in the change from Time 1 to Time 2
• Extend to a continuous measure in regression
• Note: The method does not identify effects but may identify the absence of effects
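A minimal sketch of the median-split comparison, assuming a long-format provider panel with hypothetical columns wave (1 = Time 1, 2 = Time 2), impl_index (the 0-1 implementation index), and outcome; the regression extension mentioned above would instead interact the continuous index with the wave indicator.

```python
import pandas as pd

def did_by_median_split(df: pd.DataFrame) -> float:
    """Change from Time 1 to Time 2 among high-implementation providers
    minus the same change among low-implementation providers."""
    median_impl = df["impl_index"].median()
    df = df.assign(high_impl=df["impl_index"] > median_impl)
    means = df.groupby(["high_impl", "wave"])["outcome"].mean()
    change_high = means.loc[(True, 2)] - means.loc[(True, 1)]
    change_low = means.loc[(False, 2)] - means.loc[(False, 1)]
    return change_high - change_low

# Example usage with a made-up panel:
# df = pd.DataFrame({"provider": [...], "wave": [1, 2, ...],
#                    "impl_index": [...], "outcome": [...]})
# print(did_by_median_split(df))
```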

Summary

• Assessments of implementation from three perspectives are similar

• Summary index was developed successfully
• Uses of the implementation index:
  – Will be used to strengthen analysis of BJBC effects
  – Most useful in confirming absence of effects
  – Could be used to analyze factors affecting implementation