Tools for Developing a Comprehensive Evaluation Template
Transcript of Tools for Developing a Comprehensive Evaluation Template
Heather Peshak George, Ph.D. & Karen Elfner Childs, M.A., University of South Florida
Cynthia Anderson, Ph.D., University of Oregon
Skill-Building Workshop: March 27, 2010
7th International Conference on Positive Behavior Support: St. Louis, MO
Purpose
• Familiarize participants with the new Benchmarks for Advanced Tiers (BAT) and other tools to develop a comprehensive evaluation template addressing behavior support across tiers with application at state, district, and/or school levels.
Objectives
• Purpose of a comprehensive evaluation system
• Administration and completion – what is it?
• Using the results to boost implementation and validate outcomes – how do you use it?
  – School, district, state
• Future considerations
Agenda
8:30-9:00 Introduction and Rationale
9:00-10:00 Implementation Monitoring: TIC, PIC
10:00-10:30 Implementation Integrity: BoQ
10:30-10:45 BREAK
10:45-11:45 Implementation Integrity: BAT
11:45-12:15 Implementation Research: SET, ISSET
12:15-12:30 Wrap-up
Purpose of Evaluation
• To examine the extent to which teams are accurately selecting and implementing PBS systems and practices
• Allows teams to determine the extent to which target student outcomes are being and/or likely to be achieved
• To determine if teams are accurately and consistently implementing activities and practices as specified in their individualized action plan
(PBIS Blueprint, 2005)
PBIS Evaluation Blueprint: A Work in Progress…
• Context – what was provided, who provided it, who received it
• Input – professional development, value, perspective
• Fidelity – implemented as designed, with fidelity; process evaluation
• Impact – behavior change, other schooling changes
• Replication, Sustainability and Improvement – capacity, practice, policy; expanding implementation, allocating resources
(PBIS Blueprint, 2010)
Factors to Consider in Developing Comprehensive Evaluation Systems
1) Systems Preparation – readiness activities
2) Service Provision – training and technical assistance
3) Identification and Assessment of Behavior Problems – possible data sources
4) Evaluation Process – timelines, data systems
5) Evaluation Data (across all three tiers) – implementation fidelity, impact on students, attrition, client satisfaction
6) Products and Dissemination – reports, materials, presentations, etc.
(modified from Childs, Kincaid & George, in press)
Florida's Evaluation Model (diagram)
Flow: FLPBS → Districts → Coaches → Schools, supported by training and on-going technical assistance.
• Systems Preparation: District Action Plan; District Readiness Checklist; School Readiness Checklist; New School Profile (includes ODR, ISS, OSS)
• Identification/Assessment: discipline records; ESE referrals; surveys; walkthroughs; PIC; Classroom Assessment Tool; student rank/rating; teacher requests; lack of response; BAT; Behavior Rating Scale; Daily Progress Report charts
• Evaluation Process: Mid-Year I, Mid-Year II, End-Year
• Evaluation Data:
  – Implementation Fidelity: PBS Implementation Checklist (PIC); Benchmarks of Quality (BoQ); Benchmarks for Advanced Tiers (BAT); School Demographic Data; School-wide Implementation Factors; Tier 3 plan fidelity checklist; BEP fidelity checklist
  – Impact: outcome data (ODR, ISS, OSS); FL Comprehensive Assessment Test; Benchmarks of Quality; School Demographic Data; PBS Walkthrough; Daily Progress Reports; Behavior Rating Scales; Climate Surveys
  – Project Impact: Attrition Survey/attrition rates; District Action Plans
  – Client Satisfaction: School-Wide Implementation Factors; District Coordinator's Survey; training evaluations
• Products and Dissemination: annual reports; revisions to the training and technical assistance process; national, state, district, and school dissemination activities; website; on-line training modules
Comprehensive Evaluation Blueprint:
Implementation Monitoring
• TIC (Tier 1) – Team Implementation Checklist; Sugai, Horner & Lewis-Palmer (2001)
• PIC (Tiers 1, 2, 3) – PBS Implementation Checklist for Schools; Childs, Kincaid & George (2009)
Implementation Integrity
• BoQ (Tier 1) – Benchmarks of Quality; Kincaid, Childs & George (2005)
• BAT (Tiers 2, 3) – Benchmarks for Advanced Tiers; Anderson, Childs, Kincaid, Horner, George, Todd, Sampson & Spaulding (2009)
Implementation Research
• SET (Tier 1) – School-wide Evaluation Tool; Sugai, Lewis-Palmer, Todd & Horner (2001)
• ISSET (Tiers 2, 3) – Individual Student Systems Evaluation Tool; Anderson, Lewis-Palmer, Todd, Horner, Sugai & Sampson (2008)
Implementation Monitoring
Team Implementation Checklist (TIC)
PBS Implementation Checklist (PIC)
Progress Monitoring Measures
• designed to assess the same core features as the research and annual self-assessment measures
• used by school teams (typically with the support of their coach) on a frequent basis (e.g. monthly, every two months, or quarterly) to guide action planning during the implementation process
• require 15-20 minutes to complete online and are used by the team, coach and trainer to tailor actions, supports, and training content associated with assisting the school to implement with high fidelity
(PBIS Blueprint, 2010)
Team Implementation Checklist (TIC), Version 3.0
Team Implementation Checklist
• Utility
  • Initial planning for implementation
  • Progress monitoring early implementation
• Completed quarterly by Tier I team
• Checklist 1:
Components of the TIC
Checklist 1:
• Commitment
• Team
• Self-assessment
• Expectations
• Information system
• Capacity for Tier III
Checklist 2:
• Monitor ongoing activity
Use of the Team Checklist
• Who completes the Team Checklist?
  • The school team (individually or together)
• When is the Team Checklist completed?
  • At least quarterly; best if done monthly
  • Web-based data entry: www.pbssurveys.org
• Who looks at the data?
  • Team
  • Coach
  • Trainers/state evaluation
• Action planning
Action Planning with the Team Checklist
• Define items Not in place or Partially in place
• Identify the items that will make the biggest impact
• Define a task analysis of activities to achieve items
• Allocate tasks to people, time, and a reporting event
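As a minimal sketch (in Python, with hypothetical item names and statuses), the first two planning steps above amount to filtering the checklist for items that are not yet fully in place:

```python
# Sketch of TIC-based action planning: list the items rated "Not in place"
# or "Partially in place" so the team can task-analyze and assign them.
# The item names and statuses below are hypothetical, not actual TIC content.

def action_plan_candidates(items):
    """Return (item, status) pairs that are not yet fully in place."""
    targets = {"Not in place", "Partially in place"}
    return [(name, status) for name, status in items.items() if status in targets]

tic_ratings = {
    "Administrator support secured": "In place",
    "Team established with representation": "In place",
    "School-wide expectations defined": "Partially in place",
    "Reward system established": "Not in place",
}

for name, status in action_plan_candidates(tic_ratings):
    print(f"{status}: {name}")
```

From the resulting list the team would pick the highest-impact items and allocate tasks, people, and timelines.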
[Chart: Iowa Checklist 01-05, PK-6 – % of items fully and partially implemented (0-100%) across Team Checklist administrations from 2002 through 2005; series: Start Up Full Implementation, Start Up Partial Implementation]
[Chart: Iowa Elementary Schools, Team Checklists 02-04 – percent of items fully and partially implemented across repeated administrations for Adams ES-D, Douds ES*, Iowa Valley ES*, Jackson ES-D, MLK ES-D, Monroe ES-D, Park Ave. ES-D, Prescott ES*, Stockport ES-P*, and Stowe ES-D; series: % Implemented, % Partially Implemented]
[Chart: R. V. Traylor Elementary School, Team Checklist 03-04 – percent implemented (Oct. '03, Dec. '03, Mar. '04) for Commitment, Team, Self-Assessment, Expectations Defined, Expectations Taught, Rewards System, Violations System, Information, Function, % Items Implemented, and % Total Points]
Putting Your School in Perspective
• Use % of total items or % of points
• Compare multiple schools
  – Messages:
    • It is possible
    • You don't need to be perfect immediately
[Charts: Team Checklist total scores for schools A through N, shown first with full-implementation scores only and then with % Implemented and % Partially Implemented series]
PBS Implementation Checklist (PIC)
PIC Purpose
• Provide school teams a “snapshot” of where they are in implementation of PBS– Implementation of critical elements at Tier 1– Implementation of Tiers 2 and 3
• Completed 3 and 6 months into the school year
• Guides action planning and team activities
PBS Implementation Checklist
Tier 1 Critical Element Implementation Level chart
PBS Implementation Level chart
Using PIC Results
• Use the results of PIC to guide your PBS team towards implementation with fidelity at all three tiers.
Implementation Monitoring Tools
• Will you progress monitor your school(s)?
  – If so, how often?
  – Who is responsible to administer, collect, and synthesize the data?
  – How will it be reported back to the team?
• Which tool will you utilize?
• How will you use the results?
  • At the school, district, or state/project level?
  • As it relates to fidelity? Outcomes? Other?
Implementation Integrity
Benchmarks of Quality (BoQ)
Benchmarks for Advanced Tiers (BAT)
Annual Self-Assessment Measures
• Designed to document the same content as the research measures but to do so more efficiently
• Most available online and provide a school team/coach with the ability to determine once a year if a school is implementing SWPBS practices at a level that would be expected to affect student outcomes
• Always guide development of action planning to assist in efficient and continuous improvement of systems used in the school
(PBIS Blueprint, 2010)
Benchmarks of Quality (BoQ)
Creation: Based on Needs
• Reliably assess teams' implementation
• Distinguish model schools
• Easy for coaches to complete with little training
• Quick to complete
• Provide feedback to the team
• Clarify outcomes as related to implementation
Benchmarks of Quality
• Identified items aligned with the SWPBS training process
• 53 items addressing areas of:
  • Faculty commitment
  • Effective procedures for dealing with discipline
  • Data entry and analysis plan established
  • Expectations and rules developed
  • Reward/recognition program established
  • Lesson plans for teaching
  • Implementation plan
  • Crisis plan
  • Evaluation
BoQ Validation Process
• Expert review
• Pilot study
• Florida & Maryland schools – elementary, middle, high, center/alternative
• Reliability – test-retest, inter-rater both >.01
• Concurrent validity – SET/ODRs
• For more details, see JPBI – Fall 2007
Use of the School-Wide Evaluation Tool (SET)
• SET is a validated research tool that combines multiple assessment approaches (interviews, observations, product reviews) to arrive at an implementation score
• Concerns:
  – Time
  – High scores
  – Potential for "practice effect"
  – May not reflect current activities
  – Not as useful for action planning
• Results of correlation with BoQ
  – Overall r = .51 (p < .01)
[Chart: scatterplot of SET scores (y-axis, 0-100) against BoQ scores (x-axis, 0-100)]
BoQ Factor Analysis
• Exploratory and confirmatory analysis
  – Most items "hang together" within a critical element but fit better within a 5-factor structure
  – All but 4 of the 53 items were found to have internal consistency (strong items)
  – Item/total correlations indicated that 46 of the 53 items were highly correlated with total score
    • The 4 items without strong internal consistency were also found to lack item/total correlation
    • All 3 crisis items
Utility of the BoQ
• BoQ is reliable, valid, efficient, and useful
• Moderate correlation with SET
• Data regarding association with ODRs
• Ease of use
  – Little training
  – Little time from team and coach
• Areas not unique to one training approach
• Assists states that are rapidly expanding PBS efforts
• Specific team feedback: celebration/planning
Benchmarks Review
• Describe the Benchmarks of Quality (what is it?)
• Describe the psychometric properties of the Benchmarks of Quality (can we trust it?)
• Share your answers to these questions with your neighbor
Administration and Completion
3 Elements of the Benchmarks of Quality
• Team Member Rating Form
  • Completed by team members independently
  • Returned to coach/facilitator
• Scoring Form
  • Completed by coach/facilitator using the Scoring Guide
  • Used for reporting back to the team
• Scoring Guide
  • Describes the administration process
  • Rubric for scoring each item
Method of Completion
• Coach/facilitator uses Scoring Guide to ascertain the appropriate score for each item, collects Team Member Rating forms, resolves any discrepancies, and reports back to team
• Alt. Option – Scoring Form is completed at a team meeting with all members reaching consensus on the appropriate score for each item using the Scoring Guide rubric. The team identifies areas of strength and need.
Completion of BoQ, Step 1 – Coach's Scoring
• The Coach/facilitator will use his or her best judgment based on personal experience with the school and the descriptions and exemplars in the Benchmarks of Quality Scoring Guide to score each of the 53 items on the Benchmarks of Quality Scoring Form (p.1 & 2). Do not leave any items blank.
Benchmarks Practice: Scoring Form, Scoring Guide
(Scoring Form excerpt – columns: Step 1 score, Step 2 team rating (++, +, or –), Step 3 final score)
Critical element: PBS Team
1. Team has broad representation (possible scores: 1, 0)
2. Team has administrative support (possible scores: 3, 2, 1, 0)
3. Team has regular meetings (at least monthly) (possible scores: 2, 1, 0)
4. Team has established a clear mission/purpose (possible scores: 1, 0)
Completion of BoQ, Step 2 – Team Member Rating
• The coach/facilitator will give the Benchmarks of Quality Team Member Rating Form to each SWPBS team member to be completed independently and returned to the coach upon completion. Members should be instructed to rate each of the 53 items according to whether the component is "In Place," "Needs Improvement," or "Not in Place." Some of the items relate to product and process development, others to action items; in order to be rated as "In Place," the item must be developed and implemented (where applicable). Coaches will collect and tally responses and record on the Benchmarks of Quality Scoring Form the team's most frequent response, using ++ for "In Place," + for "Needs Improvement," and – for "Not In Place."
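The tally the coach performs in this step (record the team's most frequent response per item as ++, +, or –) can be sketched as follows; the ratings shown are hypothetical:

```python
from collections import Counter

# Sketch of the BoQ Step 2 tally: for each item, find the team's most
# frequent rating and record its Scoring Form symbol:
# "In Place" -> ++, "Needs Improvement" -> +, "Not In Place" -> -.
SYMBOLS = {"In Place": "++", "Needs Improvement": "+", "Not In Place": "-"}

def modal_symbol(ratings):
    """Return the symbol for the most frequent rating across team members.

    (An exact tie would fall to the first-seen rating here; in practice
    the coach would resolve ties by judgment.)
    """
    most_common_rating, _count = Counter(ratings).most_common(1)[0]
    return SYMBOLS[most_common_rating]

# Hypothetical ratings from three team members for one item:
print(modal_symbol(["In Place", "In Place", "Needs Improvement"]))  # -> ++
```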
Benchmarks Practice: Scoring Form, Team Member Rating Form
[Example: three team members' rating forms (In Place ++, Needs Improvement +, Not In Place –) for items 1-2, tallied by the coach into the Step 2 column of the Scoring Form]
Benchmarks Team Member Tally Form
Completion of BoQ, Step 3 – Team Report
• The coach will then complete the Team Summary on p. 3 of the Benchmarks of Quality Scoring Form, recording areas of discrepancy, strength, and weakness.
• Discrepancies – If there were any items for which the team's most frequent rating varied from the coach's rating based upon the Scoring Guide, the descriptions and exemplars from the guide should be shared with the team. This can happen at a team meeting or informally. If, upon sharing areas of discrepancy, the coach realizes that there is new information that according to the Scoring Guide would result in a different score, the item and the adjusted final score should be recorded on the Scoring Form.
Completion of BoQ, Step 4 – Reporting Back to Team
• After completing the remainder of the Benchmarks of Quality: Scoring Form, the coach will report back to the team using the Team Report page of the Benchmarks of Quality: Scoring Form. If needed, address items of discrepancy and adjust the score. The coach will then lead the team through a discussion of the identified areas of strength (high ratings) and weakness (low ratings). This information should be conveyed as "constructive feedback" to assist with action planning.
Benchmarks Team Summary: Scoring Form
[Example Team Summary page – areas of discrepancy (item #, team response, coach's score, Scoring Guide description), areas of strength by critical element, and areas in need of development by critical element]
Alternative Option* for Completion of BoQ
*Statistically validated as an alternative option
Alternative Option, Step 1 – Team Member Scoring
• Each team member uses personal experience with PBS and the descriptions and exemplars in the Benchmarks of Quality Scoring Guide to score each of the 53 items on the Benchmarks of Quality Scoring Form (pp. 1-2). The team will then meet and reach consensus on the appropriate score for each item.
Alternative Option, Step 2 – Team Summary
• After completing the Benchmarks of Quality: Scoring Form, the team should use the Team Report page of the Benchmarks of Quality: Scoring Form to guide a discussion of the identified areas of strength (high ratings) and weakness (low ratings). This information should be used as "constructive feedback" to assist with action planning.
Submitting Your Evaluation
• Step 5 – Reporting/Entering Data
  • The coach/facilitator will enter the data from the Benchmarks of Quality: Scoring Form on www.pbssurveys.org
  • See the PBS Surveys Users Manual for specific instructions.
  • District/state coordinators may establish due dates for completion of the BoQ annually, or more frequently as needed.
PBS Surveys
www.pbssurveys.org
Using the BoQ Results to Boost Implementation and Validate Outcomes
Using the BoQ Results
• Action plan to increase fidelity of implementation
  – School
  – District
  – State/project
• Outcome reporting
• Model school identification
School
[Chart: BoQ maximum scores per critical element]
[Chart: PBS Surveys – BoQ Report, Critical Elements: Jones Middle School]
Are our Benchmarks scores above 70 and rising? Scores have never been over 70 and dropped 15 points last year.
District
[Chart: PBS Surveys – BoQ Report, Overall Scores]
District
[Chart: District PBS Implementation Levels – average % of possible points earned per Benchmark category (Team, Faculty Commitment, Discipline Procedures, Data Entry, Expectations, Rewards, Teaching, Implementation Plan, Crisis, Evaluation) for 2004-2005 (11 schools) and 2005-2006 (15 schools)]
Are our schools implementing PBS with fidelity? Average BoQ scores are over 70% and increasing in all 10 domains.
District
[Chart: District Average Referrals by Implementation Level – average # of ODRs per 100 students for low implementers* and high implementers, 2004-2005 vs. 2005-2006 school years]
*(Implementation level based upon score on the School-Wide PBS Benchmarks of Quality: >70 or <70 of a possible 100 points)
Is there a difference in ODR outcomes for schools? Low implementers have many more ODRs, but the number is decreasing.
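The grouping behind these district comparisons can be sketched as below; the 70-of-100-points BoQ cutoff comes from the slides, while the school figures are hypothetical:

```python
# Sketch: classify schools as high/low implementers by BoQ score
# (>= 70 of 100 possible points = high implementer) and compare average
# office discipline referrals (ODR) per 100 students. Figures hypothetical.

def odr_per_100(odr_count, enrollment):
    """Office discipline referrals per 100 students."""
    return odr_count / enrollment * 100

def average_odr_by_level(schools, cutoff=70):
    """Average ODR/100 students for high vs. low implementers."""
    groups = {"high": [], "low": []}
    for school in schools:
        level = "high" if school["boq"] >= cutoff else "low"
        groups[level].append(odr_per_100(school["odr"], school["enrollment"]))
    return {level: sum(rates) / len(rates) for level, rates in groups.items() if rates}

schools = [
    {"boq": 82, "odr": 450, "enrollment": 600},   # high implementer
    {"boq": 55, "odr": 1800, "enrollment": 750},  # low implementer
]
print(average_odr_by_level(schools))  # high: 75.0, low: 240.0
```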
District
[Chart: District Average ISS Days by Level of Implementation – average # of ISS days per 100 students for low implementers* and high implementers, 2004-2005 vs. 2005-2006 school years]
Is PBS impacting ISS in our schools? High-implementing schools have 70% fewer ISS days, and the number decreased by 50%.
State
[State-level PBS Surveys report screenshots]
State
Academic Achievement: Students at Level 3+ in Reading on Florida's Comprehensive Assessment Test
[Chart: average percentage scoring Level 3+ –
2004-2005: All FL Schools 53, Low (BoQ<70) 60, High (BoQ>=70) 67;
2005-2006: All FL Schools 57, Low 59, High 68;
2006-2007: All FL Schools 57, Low 58, High 67]
Using Benchmarks Results
• How will you use the results of the Benchmarks?
  • At the school, district, or state/project level?
  • As it relates to fidelity of implementation?
  • As it relates to outcomes?
  • As it relates to identifying model schools?
  • Other?
Benchmarks for Advanced Tiers (BAT)
Benchmarks for Advanced Tiers (BAT)
• The Benchmarks for Advanced Tiers (BAT) allows school teams to self-assess the implementation status of Tier 2 (secondary, targeted) and Tier 3 (tertiary, intensive) behavior support systems within their school and is designed to answer three questions:
  1. Are the foundational (organizational) elements in place for implementing secondary and tertiary behavior support practices?
  2. Is a Tier 2 support system in place?
  3. Is a Tier 3 system in place?
BAT Organization
Tier 1: Implementation of School-wide PBS
Tier 2-3 Foundations
• Commitment
• Student Identification
• Monitoring and Evaluation
Tier 2: Support Systems
Main Tier 2 Strategy
• Strategy Implementation
• Strategy Monitoring and Evaluation
Tier 3: Intensive Support Systems
Tier 3: Assessment and Plan Development
Instructions for Completing
Who: The team(s) or individuals involved with Tier 2 and Tier 3 behavior support.
How: As a group, or each member independently. If completed independently, the team reconvenes to review scores on each item. The team (or individuals involved with Tiers 2 and 3 behavior support) must reach consensus on the score for each item.
Scoring: After reviewing the rubric for each item, select the score that most closely matches the state of affairs at the school. Rate each item as "2" (fully in place), "1" (partially in place), or "0" (not yet started).
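Under the 2/1/0 rubric above, a section's status is often summarized as a percent of possible points; the sketch below uses hypothetical section names and item scores (the BAT's own score reports may aggregate differently):

```python
# Sketch of BAT-style section scoring: each item is rated 2 (fully in
# place), 1 (partially in place), or 0 (not yet started), so a section
# can be summarized as percent of possible points (2 per item).
# Section names and item scores below are hypothetical.

def section_percent(item_scores):
    """Percent of possible points for one section (each item worth 2)."""
    return 100 * sum(item_scores) / (2 * len(item_scores))

bat_sections = {
    "Tier 2-3 Foundations": [2, 2, 1, 0],
    "Tier 2 Support Systems": [1, 1, 2],
}
for name, scores in bat_sections.items():
    print(f"{name}: {section_percent(scores):.1f}%")
```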
Additional Tips
• Before starting the first administration, read through the items to determine who on campus will be likely to have knowledge of the topic(s).
• Since the BAT covers several topic areas and usually requires input from multiple people, it is best to work from a paper copy until all items have been scored.
Tier 1: (A) SWPBS
Tiers 2-3: (B) Foundations
Tiers 2-3: (D) Monitoring/Eval
Tier 2: (E) Tier 2 Support System
Tier 2: (F) Main Tier 2 Strategy Implementation
Tier 2: (G) Main Tier 2 Strategy Monitoring/Evaluation
Additional Tier 2 Interventions
• Items 18-31 may be repeated for other Tier 2 strategies in use at your school for evaluation purposes. However, only the scores associated with the most commonly used Tier 2 strategy will be counted in your Benchmarks for Advanced Tiers (BAT) score.
Tier 3: (H) Intensive Support Systems
Tier 3: (I) Assessment & Planning
Tier 3: (J) Monitoring/Eval
Using the BAT Results
• School teams should use the BAT to build an action plan to define next steps in the implementation process.
• The BAT can also assess progress over time, as scores on each area can be tracked on a year-to-year basis.
Benchmarks for Advanced Tiers
Using the Data for Action Planning
Using BAT Results
• How will you use the results of the Benchmarks for Advanced Tiers (BAT)?
  • At the school, district, or state/project level?
  • As it relates to fidelity of implementation?
  • As it relates to outcomes?
  • As it relates to identifying model schools?
  • Other?
Implementation Integrity Tools
• Will you self-assess implementation fidelity for your school(s)?
  – If so, who is responsible to administer, collect, and synthesize the data?
  – How will it be reported back to the team?
• How will you use the results?
  • At the school, district, or state/project level?
  • As it relates to fidelity? Outcomes? Other?
Implementation Research
School-wide Evaluation Tool (SET)
Individual Student Systems Evaluation Tool (ISSET)
Research Measures
• Designed to have high validity and reliability; typically involve external observers assessing procedures during a multi-hour evaluation process
• Used in formal evaluation and research analyses to allow unequivocal documentation of the extent to which SWPBS universal, secondary, and tertiary practices are being used as intended
(PBIS Blueprint, 2010)
SCHOOL-WIDE EVALUATION TOOL (SET)
Todd, Lewis-Palmer, Horner, Sugai, Sampson, & Phillips (2005)
SET
• Developed as a research tool
• What the SET does
  • Discriminates schools that are and are not implementing Tier I
• What the SET does NOT do
  • Discern level/degree of implementation
  • Give information about the extent of implementation
  • Lead to action planning
SET Subscales
1. Expectations defined (2 items)
2. Expectations taught (5 items)
3. Acknowledgement procedures (3 items)
4. Correction procedures (4 items)
5. Monitoring and evaluation (4 items)
6. Management (8 items)
7. District support (2 items)
SET Activities
• Interviews
  • Administrator
Administrator Questions: Discipline System
1. Do you collect and summarize office discipline referral information? Yes / No. If no, skip to #4.
2. What system do you use for collecting and summarizing office discipline referrals? (E2)
  a. What data do you collect?
  b. Who collects and enters the data?
3. What do you do with the office discipline referral information? (E3)
  a. Who looks at the data?
  b. How often do you share it with other staff?
4. What type of problems do you expect teachers to refer to the office rather than handling in the classroom/specific setting? (D2)
5. What is the procedure for handling extreme emergencies in the building (i.e., stranger with a gun)? (D4)
SET Activities
• Interviews
  • Administrator
  • 15 randomly selected students
  • 15 randomly selected staff
  • PBIS team members
• Observations
  • School rules
  • Crisis procedures
• Permanent product review
  • SIP
  • Action plan and implementation plan
  • ODR form
Feature: A. Expectations Defined – Evaluation Questions
1. Is there documentation that staff has agreed to 5 or fewer positively stated school rules/behavioral expectations? (0 = no; 1 = too many/negatively focused; 2 = yes)
2. Are the agreed-upon rules & expectations publicly posted in 8 of 10 locations? (See interview & observation form for selection of locations.) (0 = 0-4; 1 = 5-7; 2 = 8-10)
Feature: B. Behavioral Expectations Taught – Evaluation Questions
1. Is there a documented system for teaching behavioral expectations to students on an annual basis? (0 = no; 1 = states that teaching will occur; 2 = yes)
2. Do 90% of the staff asked state that teaching of behavioral expectations to students has occurred this year? (0 = 0-50%; 1 = 51-89%; 2 = 90-100%)
3. Do 90% of team members asked state that the school-wide program has been taught/reviewed with staff on an annual basis? (0 = 0-50%; 1 = 51-89%; 2 = 90-100%)
4. Can at least 70% of 15 or more students state 67% of the school rules? (0 = 0-50%; 1 = 51-69%; 2 = 70-100%)
5. Can 90% or more of the staff asked list 67% of the school rules? (0 = 0-50%; 1 = 51-89%; 2 = 90-100%)
Feature: G. District-Level Support – Evaluation Questions
1. Does the school budget contain an allocated amount of money for building and maintaining school-wide behavioral support? (0 = no; 2 = yes)
2. Can the administrator identify an out-of-school liaison in the district or state? (0 = no; 2 = yes)
Scoring the SET
1. Calculate percentage of points earned for each subscale
2. Graph scores on each subscale
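The two scoring steps above can be sketched as follows; the subscale names match the SET, but the item scores are hypothetical (SET items are scored 0-2):

```python
# Sketch of SET scoring: compute the percentage of points earned for
# each subscale (items scored 0-2), ready for graphing. Item scores
# below are hypothetical examples.

def subscale_percent(item_scores, max_per_item=2):
    """Percentage of points earned on one subscale."""
    return 100 * sum(item_scores) / (max_per_item * len(item_scores))

set_item_scores = {
    "Expectations defined": [2, 2],          # 2 items
    "Expectations taught": [2, 1, 2, 0, 2],  # 5 items
    "District support": [2, 0],              # 2 items
}
percentages = {name: subscale_percent(s) for name, s in set_item_scores.items()}
print(percentages)
```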
[Chart: SET, Elementary School K, pre/post – % of features implemented in fall '98 vs. fall '99 for Expectations defined, Expectations taught, Acknowledgement, Corrections, Evaluation, Leadership, District Support, and the mean]
[Chart: SET, Middle School T, years 3 to 4 – % of features implemented for Expectations defined, Expectations taught, Acknowledgement, Corrections, Monitoring, Leadership, District Support, and the mean]
Can Schools Adopt School-Wide PBS Systems?
[Chart: SET total scores (0-100) for 18 schools in Oregon and Hawaii at Pre, Post 1, and Post 2]
Individual Student Systems Evaluation Tool (ISSET)
Anderson, Lewis-Palmer, Todd, Horner, Sugai, and Sampson (2008)
ISSET
• Developed as a research tool
• What the ISSET does
  • Discriminates schools that are and are not implementing Tiers II and III
  • Provides in-depth analysis of the extent to which tiers are in place
• What the ISSET does NOT do
  • Lead to action planning
ISSET
• 36 questions across 3 sub-scales
• Measurement
  • Interview questions: staff, students
  • Permanent product review (FBAs, BSPs, intervention manuals for Tier II)
• Use
  • Administered by a trained ISSET evaluator (external to the school)
  • Takes about 2 hours to administer
  • Scoring requires about 30 minutes
What the ISSET Measures
Foundations
– Commitment
– Team-based Planning
– Student Identification
– Monitoring and Evaluation
Targeted Interventions
– Implementation
– Evaluation and Monitoring
Intensive Individualized Interventions
– Assessment
– Implementation
– Evaluation and Monitoring
Components of the ISSET
• Data collection protocol
• Interview questions
  – Administrator
  – Behavior support team leader
  – 5 randomly selected teachers
• Permanent product review
• ISSET scoring guide
  – Organization of the scoring guide
• Score summary page
Sample Item from Commitment
Summary Score Page
Overall Scores
In-Depth Scores
Implementation Research Tools
• Will you research implementation fidelity for your school(s)?
  – If so, who is responsible to administer, collect, and synthesize the data?
  – How will it be reported back to the team?
• How will you use the results?
  • At the school, district, or state/project level?
  • As it relates to fidelity? Outcomes? Other?
Back to the Big Picture
[Florida's Evaluation Model diagram repeated, with callouts: Evaluation in Readiness, Evaluation in Training, Evaluation in Identification]
How Do These Evaluation Tools Fit into Your Big Picture?
• How will you integrate the necessary tools into your overall evaluation system?
Implementation Monitoring: TIC, PIC
Implementation Integrity: BoQ, BAT
Implementation Research: SET, ISSET
Data-Based Improvements Made
1. Increased emphasis on BoQ results for school- and district-level action planning
2. Increased training to district coordinators and coaches; T.A. targeted areas of deficiency based upon data
3. Academic data used to increase visibility and political support
4. Specialized training for high schools
5. Identifying critical team variables impacted via training and T.A. activities
6. Revised Tier 1 PBS training to include classroom strategies and a problem-solving process within an RtI framework
7. Enhanced monthly T.A. activities
In Summary…
1. Know what you want to know
2. Compare fidelity of implementation with outcomes – this presents a strong case for implementing Tier 1 PBS with fidelity
3. Additional sources of data can help a state determine not only whether the Tier 1 PBS process is working, but also why it is or is not working
4. Address state, district, and school systems issues that may impact implementation success
Some Resources
• Algozzine, B., Horner, R. H., Sugai, G., Barrett, S., Dickey, S. R., Eber, L., Kincaid, D., et al. (2010). Evaluation blueprint for school-wide positive behavior support. Eugene, OR: National Technical Assistance Center on Positive Behavior Interventions and Support. Retrieved from www.pbis.org
• Childs, K., Kincaid, D., & George, H.P. (in press). A Model for Statewide Evaluation of a Universal Positive Behavior Support Initiative. Journal of Positive Behavior Interventions.
• George, H.P. & Kincaid, D. (2008). Building District-wide Capacity for Positive Behavior Support. Journal of Positive Behavioral Interventions, 10(1), 20-32.
• Cohen, R., Kincaid, D., & Childs, K. (2007). Measuring School-Wide Positive Behavior Support Implementation: Development and Validation of the Benchmarks of Quality (BoQ). Journal of Positive Behavior Interventions.
Evaluation Instruments
• PBIS website:– http://www.pbis.org/evaluation/default.aspx
• FLPBS:RtIB Project Coach’s Corner:– http://flpbs.fmhi.usf.edu/coachescorner.asp
• PBS Surveys– http://www.pbssurveys.org/pages/Home.aspx
Contact
Heather George, Ph.D. & Karen Childs, M.A.
University of South Florida
Email: [email protected]
Web: http://flpbs.fmhi.usf.edu

Cynthia Anderson, Ph.D.
University of Oregon
Email: [email protected]