Scaling up and Sustaining Evidence-based Practices
Glen Dunlap, George Sugai, Tim Lewis,
Steve Goodman, Rob Horner
www.pbis.org
Goals
- Define the features and procedures for moving evidence-based educational practices from demonstrations to large-scale adoption.
- Use School-wide Positive Behavior Support (SWPBS) as one example of large-scale implementation.
Main Themes
To take educational innovations to scale, begin with valued outcomes:
- The outcomes need to be valued.
- The outcomes need to be comprehensive.
Phases in Scaling of Evidence-based Practices
- Emergence
- Establishing Demonstrations/Capacity
- Elaboration
- Systems Adoption
Phases of Implementation: Emergence
- Define the innovation with precision
- Define supporting systems
- Define the implementation process
- Awareness dissemination

What is the innovation?
- Is it evidence-based?
- Is it conceptually coherent? Why is it effective?
- How is it more efficient than what we currently do?
Phases of Implementation: Demonstration
- Documentation that the innovation can be implemented locally with (a) fidelity and (b) effect on valued outcomes.
- Provide demonstrations (1-50). Repeated demonstrations in multiple contexts (parts of the state, urban centers, different grade levels) may be needed.
- Demonstrations typically are done at greater expense than is sustainable or scalable, but are justified as examples that the innovation "can be done here."
- Build infrastructure for scaling: state policy, state training and support capacity, information systems.
Phases of Implementation: Elaboration
- Shift from demonstration to broad implementation
- Use local trainers; presentations by local demonstration sites
- Many distributed (more cost-effective) trainings
- Training at multiple organizational levels: administrators, school boards, instructional staff, specialists (e.g., behavior specialists, school psychologists, social workers, counselors), families
- Disseminate outcome data
- Conduct and disseminate comparative cost data
Phases of Implementation System Adoption/Sustainability
Innovation is integrated into policy Job descriptions Hiring announcements Annual personnel orientation
Regular reporting of data Are we implementing evidence-based practices Are we producing the effects we want for children
Investment in continuous regeneration Implement evaluate adapt
Sustaining SWPBS Implementation
Jennifer Doolittle, University of Oregon
2006
Method
285 schools that had been implementing SWPBS for at least 3 years:
- 71 had not yet met criterion (Non-Imp)
- 74 met the 80%/80% criterion on the SET but did NOT sustain it for two years (Non-Main)
- 140 met the 80%/80% criterion on the SET and did sustain it (Main)
Measure: School-wide Evaluation Tool (SET). Analysis: logistic regression.
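The Method slide names logistic regression as the analysis. As a quick illustration of what that model estimates, here is a minimal sketch: the log-odds of a school sustaining implementation are modeled as a linear function of a predictor such as a SET subscale score. The intercept and slope below are hypothetical, not values from the Doolittle study.

```python
import math

# Logistic regression: log(p / (1 - p)) = b0 + b1 * x, so the
# predicted probability of sustaining is the logistic of b0 + b1 * x.
# A slope b1 corresponds to an odds ratio of exp(b1) per one-unit
# increase in the predictor.
def predicted_probability(b0, b1, x):
    return 1 / (1 + math.exp(-(b0 + b1 * x)))

# Hypothetical coefficients: intercept -2.0, slope 4.0 on a SET
# subscale score between 0 and 1.
p = predicted_probability(-2.0, 4.0, 0.9)
print(round(p, 2))  # 0.83: higher subscale scores raise the predicted probability
```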
Doolittle (2006). SET subscale scores by sustainability status, with pair-wise comparison effect sizes¹

SET Subscale | Non-Imp (N=71) M (SD) | Non-Main (N=74) M (SD) | Main (N=140) M (SD) | F-value | Non-Imp vs. Non-Main | Non-Imp vs. Main | Non-Main vs. Main
Expectations defined | 0.58a (0.28) | 0.81b (0.21) | 0.89c (0.16) | 51.13*** | 0.94 | 1.41 | 0.43
Behavioral expectations taught | 0.47 (0.28) | 0.81 (0.19) | 0.90 (0.13) | 119.15*** | 1.45 | 2.10 | 0.56
On-going behavioral reward system | 0.53 (0.37) | 0.81 (0.24) | 0.95 (0.12) | 76.29*** | 0.92 | 1.71 | 0.78
System responding to behavioral violations | 0.63 (0.21) | 0.71 (0.20) | 0.81 (0.14) | 26.13*** | 0.39 | 1.03 | 0.59
Monitoring and decision making | 0.68 (0.25) | 0.89 (0.17) | 0.95 (0.11) | 59.13*** | 1.00 | 1.50 | 0.43
Management | 0.66 (0.26) | 0.84 (0.15) | 0.93 (0.10) | 60.39*** | 0.88 | 1.50 | 0.72
District-level support | 0.75 (0.27) | 0.83 (0.25) | 0.87 (0.23) | 6.07** | 0.31 | 0.48 | 0.17

*p < .05; **p < .01; ***p < .001. ¹Effect size used is the d-statistic, interpreted as .2 = small, .5 = medium, .8 = large (Cohen, 1988).
Notes. SET = School-wide Evaluation Tool; M = mean; SD = standard deviation. Means in the same row with different subscripts differ significantly at p < .05.
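The effect sizes in the table can be reproduced from the reported means and standard deviations. The reported values appear to use the simple average of the two group SDs as the denominator (rather than an n-weighted pooled SD); the sketch below makes that assumption explicit.

```python
# Cohen's d between two groups. The denominator here is the simple
# average of the two group standard deviations, which is the variant
# that reproduces the table's reported effect sizes.
def cohens_d(mean_a, sd_a, mean_b, sd_b):
    return (mean_b - mean_a) / ((sd_a + sd_b) / 2)

# "Expectations defined" row: Non-Imp M=0.58, SD=0.28; Main M=0.89, SD=0.16
print(round(cohens_d(0.58, 0.28, 0.89, 0.16), 2))  # 1.41, the Non-Imp vs. Main entry
```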
Results: Predictors of Initial Implementation
[Figure: Predictors of Initial SWPBS Implementation. Bar chart of effect sizes (d, 0 to 2.0) for the SET subscales: Expectations Defined, Taught, Reward, Data, Admin, District. Effect size used is the d-statistic, interpreted as .2 = small, .5 = medium, .8 = large (Cohen, 1988).]
Results: Predictors of Sustained Implementation
[Figure: Predictors of Sustained SWPBS Implementation. Bar chart of effect sizes (d, 0 to 0.9) for the SET subscales: Expectations Defined, Taught, Reward, Data, Admin, District. Effect size used is the d-statistic, interpreted as .2 = small, .5 = medium, .8 = large (Cohen, 1988).]
Summary
The variables that were most relevant for initial implementation were DIFFERENT from the variables that affected sustainability.
Summary for Sustaining and Scaling
- Begin with the valued outcomes.
- Innovations need to be more than effective: comprehensive, efficient, research-based, and a dramatic improvement over what already exists.
- The process of implementation changes as the scale increases: increased efficiency, increased emphasis on local capacity.
- Large-scale implementation requires sustained effect: continuous regeneration.