FASP Annual Conference


Evaluating School Psychologists Within a Multi-Tiered System of Supports Delivery Model: A New Era of Accountability (SSPEM)


David Wheeler and George Batsche

Student Support Services Project


Overview
Student Success Act (S.B. 736): Setting the stage for professional personnel evaluations in Florida
Multi-Tiered System of Supports: Common Language/Common Understanding
Alignment Between the MTSS Model and Student Services Delivery in Florida
Why MTSS? Why Now?
School Psychologists’ Skills/Role in a Multi-Tiered Support System
Florida’s New Evaluation System
Overview of the Student Services Personnel Evaluation Model (SSPEM)


Florida’s New Evaluation System

The Student Success Act (SB 736), s. 1012.34, F.S.


Purpose of Student Success Act

For the purpose of increasing student learning growth by improving the quality of instructional, administrative, and supervisory services in public schools of the state.

1012.34(1), F.S.


Evaluation System Requirements

Designed to support effective instruction and student learning growth & must be used in developing School Improvement Plans.
Provide appropriate instruments, procedures, and criteria for continuous quality improvement of professional skills & results must be used when identifying professional development.

Include a mechanism to examine performance data from multiple sources including parents when appropriate.

Identify teaching fields for which special evaluation procedures and criteria are necessary.

Non-classroom Instructional Personnel (Student Services)

Student learning growth data (50%) assigned over three years, OR
Combination of student learning growth data (30%) & measurable student outcomes specific to the assigned position
Instructional practice based on FEAPs and specific job expectations
Professional & job responsibilities

1012.01(3)(a)1.b, F.S.


Non-classroom Instructional Personnel (Student Services)

Student performance (50%), per s. 1012.01(3)(a)1.b, F.S.

Student learning growth as assessed by statewide or district assessments, OR
Combination of student learning growth data (30%) & other measurable student outcomes specific to the assigned position

Instructional practice (non-classroom instructional personnel)

Florida Educator Accomplished Practices (FEAPs)
May include specific job expectations related to student support
Professional and job responsibilities


Challenges for Student Services Personnel Evaluations

FEAPs do not adequately reflect the job responsibilities & practices of student services personnel

Impact on student performance is indirect

Student services personnel are typically assigned to multiple schools

Measuring student outcomes related to the job


What Informs Instructional Practice?

NASP (and other student services) Professional Practice Standards

Florida Educator Accomplished Practices

School Psychologist Competency Areas

Multi-Tiered System of Supports Delivery System in Florida

Domains of Practice that incorporate professional standards and Florida Multi-tiered System of Supports


Domains of Practice
Data-based Decision Making and Evaluation
Instruction/Intervention Planning & Design
Instruction/Intervention Delivery & Facilitation
Learning Environment
Professional Learning, Responsibility, & Ethics


Florida Educator Accomplished Practices (FEAPs)

Quality of Instruction
Instructional Design and Lesson Planning
The Learning Environment
Instructional Delivery and Facilitation
Assessment

Continuous Improvement, Responsibility and Ethics
Continuous Professional Improvement
Professional Responsibility and Ethical Conduct


School Psychologist Competencies: Florida (Selected)

Data-Based Decision-Making and Accountability

Knowledge of Curricula and Instruction

Knowledge of Evidence-Based Interventions

Consultation, Collaboration and Problem-Solving

Professional School Psychology and Ethical Decision-Making


MTSS: Common Language/Common Understanding


MTSS

A Multi-Tiered System of Supports (MTSS) is a term used to describe an evidence-based model of schooling that uses data-based problem-solving to integrate academic and behavioral instruction and intervention.

The integrated instruction and intervention is delivered to students in varying intensities (multiple tiers) based on student need.

“Need-driven” decision-making seeks to ensure that district resources reach the appropriate students (schools) at the appropriate levels to accelerate the performance of ALL students to achieve and/or exceed proficiency.


Why Organize an Evaluation System Around an MTSS Model?

Research supports that an integrated (academic/behavioral/social-emotional) service delivery system has a greater impact on student performance than separate systems

Services and personnel in schools already are organized by levels of intensity of service delivery

Tier 1—What everybody gets—typically general education teacher led
Tier 2—What “some” get—typically more intensive, smaller groups
Tier 3—What “few” get—typically most intensive, specialized


Why Organize an Evaluation System Around an MTSS Model?

Existing and proposed statutes, regulations, and practices support a multi-tiered system

IDEIA
NCLB
Learn Act
Achievement Through Prevention (PBIS) Act (SB 541)
Florida Educator Accomplished Practices (FEAPs) and School Psychologist Competencies
NASP Model

Evaluation systems require clear responsibility for levels of service delivery and “stakeholders” who are one focus of the evaluation process


Why Organize an Evaluation System Around an MTSS Model?

Instructional support staff of all types typically provide instruction/intervention at all levels (Tiers 1, 2, and 3) in a school and/or district

School-based research that identifies evidence-based practices is conducted at levels aligned with the Tiers

School-wide (e.g., PBIS, Crisis Prevention)
Classroom level (e.g., Group Procedures, Instructional Strategies)
Group level (e.g., academic instruction, social skills training, group work)
Very Small Group/Individual (e.g., therapeutic, intense psychological skills training, academic skills)


MTSS: Critical Elements

The Four Corners of the “Frame”

Parts of the “Frame”
• 3 Tiers of service delivery into which all academic and behavioral instruction/intervention “fit”
– Content is not defined by the model

• A structured Problem-Solving Process used to develop, implement, and monitor instruction/interventions

Parts of the “Frame”

Instruction/interventions are modified, intensified, and/or dropped based on student performance data

Instruction is integrated and systematically planned across the tiers


MTSS & the Problem-Solving Process

ACADEMIC and BEHAVIOR SYSTEMS

Tier 3: Intensive, Individualized Interventions & Supports.

The most intense (increased time, narrowed focus, reduced group size) instruction and intervention based upon individual student need, provided in addition to and aligned with Tier 1 & 2 academic and behavior instruction and supports.

Tier 2: Targeted, Supplemental Interventions & Supports.

More targeted instruction/intervention and supplemental support in addition to and aligned with the core academic and behavior curriculum.

Tier 1: Core, Universal Instruction & Supports.

General academic and behavior instruction and support provided to all students in all settings.


Problem Solving Process

Identify the Goal: What do we want students to know and be able to do?

Problem Analysis: WHY are they not doing it? Identify variables that contribute to the lack of desired outcomes.

Implement Plan: Implement as intended; progress monitor; modify as necessary.

Evaluate: Response to Intervention (RtI)

Steps in the Problem-Solving Process

1. Problem Identification
Identify replacement behavior
Data: current level of performance
Data: benchmark level(s)
Data: peer performance
Data: GAP analysis

2. Problem Analysis
Develop hypotheses (brainstorming)
Develop predictions/assessment

3. Intervention Development
Develop interventions in those areas for which data are available and hypotheses verified
Proximal/Distal
Implementation support

4. Response to Intervention (RtI)
Frequently collected data
Type of response: good, questionable, poor
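To make steps 1 and 4 concrete, here is a minimal sketch in Python. The scores, the eight-week window, and the response-classification thresholds are hypothetical illustrations, not values prescribed by the problem-solving model.

```python
# Minimal sketch of steps 1 and 4 of the problem-solving process.
# All numbers and thresholds below are hypothetical illustrations.

def gap_analysis(current_level, benchmark):
    """Step 1 (Problem Identification): gap between current performance and the benchmark."""
    return benchmark - current_level

def classify_response(baseline, goal, observed, weeks):
    """Step 4 (Response to Intervention): compare observed vs. expected rate of growth."""
    expected_rate = (goal - baseline) / weeks
    observed_rate = (observed - baseline) / weeks
    ratio = observed_rate / expected_rate if expected_rate else 0.0
    if ratio >= 1.0:        # hypothetical cut points for good/questionable/poor
        return "good"
    if ratio >= 0.5:
        return "questionable"
    return "poor"

# Step 1: a student performs at 45 against a benchmark of 80 (gap of 35)
print(gap_analysis(current_level=45, benchmark=80))

# Steps 2-3 (problem analysis, intervention development) generate and verify
# hypotheses with data, then plan the intervention and its implementation support.

# Step 4: after 8 weeks of frequent progress monitoring, the student is at 65
print(classify_response(baseline=45, goal=80, observed=65, weeks=8))  # "questionable"
```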


Why MTSS? Why Now?

MTSS: Integrating Two Evidence-Based Models to Improve the Academic and Behavior Outcomes for ALL Students

• Challenging Times In Which to Educate America’s Children and Youth
– Performance Evaluations Tied to Student Growth
– Economic Crises Resulting in Reduction of Resources
– Alternatives to Public K-12 Education
– AYP Projections and Expectations
– Recruitment and Retention of Qualified Professionals
– Common Language/Common Understanding with Educators, Parents, and the Community

Strategies for Successfully Addressing these Challenges

Align and allocate effective resources with student needs (Return on Investment [ROI] Model)

Anticipate the Future: prevention is cost-effective

Use of Highly Effective Practices: identify them, reward them

Efficient Delivery of those Practices

Data to Evidence Effectiveness of Practices

Strong Professional Development and Support to Sustain Effective Practices aligned with district priorities

Communicating Clearly and Frequently with Stakeholders

Use of professional personnel evaluation models that demonstrate impact of evidence-based practices aligned with district mission

Highly Effective Practices: Research

High-quality academic instruction (e.g., content matched to student success level, frequent opportunity to respond, frequent feedback) by itself can reduce problem behavior (Filter & Horner, 2009; Preciado, Horner, Scott, & Baker, 2009; Sanford, 2006)

Implementation of school-wide positive behavior support leads to increased academic engaged time and enhanced academic outcomes (Algozzine & Algozzine, 2007; Horner et al., 2009; Lassen, Steele, & Sailor, 2006)

“Viewed as outcomes, achievement and behavior are related; viewed as causes of the other, achievement and behavior are unrelated.” (Algozzine et al., 2011)

Children who fall behind academically will be more likely to find academic work aversive and also find escape-maintained problem behaviors reinforcing (McIntosh, 2008; McIntosh, Sadler, & Brown, 2010)


School-wide Behavior & Reading Support

The integration/combination of the two:
Are critical for school success

Utilize the three-tiered prevention model

Incorporate a team approach at school level, grade level, and individual level

Share the critical feature of data-based decision making

Produce larger gains in literacy skills than the reading-only model

(Stewart, Benner, Martella, & Marchand-Martella, 2007)


School Psychologists’ Role in a Multi-Tiered Support System


Emerging Leadership Themes

Multi-tiered Systems of Support
Evidence-based practices
Implementation science

Rob Horner, Futures of School Psychology Conference 2012

Professional Development: Core Skill Areas for ALL Staff

• Data-Based Decision Making Process

• Coaching/Consultation

• Problem-Solving Process

• Collection, Management and Use of Integrated Data Systems

• Instruction/Intervention Development, Support and Evaluation

• Instruction/Intervention Fidelity

• Staff Training

• Effective Interpersonal Skills


Student Services Role in an MTSS System

Academic Performance of students (educator appraisal factor) is influenced significantly by social, emotional and behavior factors—the professional practices of student services personnel

Combining evidence-based instructional strategies with evidence-based strategies to enhance student engagement results in the most dramatic student gains (LESSON STUDY)

Enhancing student engagement (at all levels) is a primary role of student services personnel


Student Services Role in an MTSS System

The continued viability and importance of student services personnel is influenced strongly by the impact of their practices on student performance, particularly academic performance

Services provided by student services personnel have a strong, evidence-based relationship with student academic performance

A blueprint for a clear, explicit relationship between the provision of evidence-based student services practices and positive student outcomes is critical in the context of school accountability

Student services personnel must PLAN in such a way as to demonstrate ACCOUNTABILITY and COMMUNICATE those outcomes.


Multi-tier System of Student Supports (MTSSS): Response to Instruction/Intervention (RtI)

An Overview of Data-based Problem-solving within a Multi-tier System of Student Supports in Florida’s Public Schools

Intensive, Individualized Supports
• Intensive interventions based on individual student needs
• Students receiving prolonged interventions at this level may be several grade levels behind or above the one in which they are enrolled
• Progress monitoring occurs most often to ensure maximum acceleration of student progress
• If more than approximately 5% of students are receiving support at this level, engage in Tier 1 and Tier 2 level, systemic problem-solving

Targeted, Supplemental Supports
• Interventions are based on data revealing that students need more than core, universal instruction
• Interventions and progress monitoring are targeted to specific skills to remediate or enrich, as appropriate
• Progress monitoring occurs more frequently than at the core, universal level to ensure that the intervention is working
• If more than approximately 15% of students are receiving support at this level, engage in Tier 1 level, systemic problem-solving

Core, Universal Supports
• Research-based, high-quality, general education instruction and support
• Screening and benchmark assessments for all students
• Assessments occur for all students
• Data collection continues to inform instruction
• If fewer than approximately 80% of students are successful given core, universal instruction, engage in Tier 1 level problem-solving
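The approximate 80/15/5 guidelines above can be read as simple decision rules. The sketch below is illustrative only; the enrollment counts are hypothetical and the thresholds are the approximate percentages named in the slides.

```python
# Illustrative check of the approximate 80/15/5 guidelines described above.
# Enrollment counts are hypothetical.

def tier_review_flags(n_students, n_tier2, n_tier3, n_successful_core):
    flags = []
    if n_successful_core / n_students < 0.80:
        flags.append("Core: engage in Tier 1 level problem-solving")
    if n_tier2 / n_students > 0.15:
        flags.append("Tier 2: engage in Tier 1 level, systemic problem-solving")
    if n_tier3 / n_students > 0.05:
        flags.append("Tier 3: engage in Tier 1 and Tier 2 level, systemic problem-solving")
    return flags

# Example school of 500 students
for flag in tier_review_flags(n_students=500, n_tier2=90, n_tier3=20, n_successful_core=360):
    print(flag)
```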


Critical Role in Addressing Barriers to Learning

Engage in collaborative problem-solving at district, school, and individual levels.
Provide culturally competent services (academic, social-emotional, behavioral) to students, schools, and families within a multi-tier model of service delivery.
Develop and implement evidence-based interventions at each tier.
Conduct assessments that inform instruction (screening, progress monitoring, diagnostic).
Assess fidelity and effectiveness of instruction and intervention.


Critical Role in Addressing Barriers to Learning

Assist in the design and use of data systems (data collection, display, and interpretation).
Provide leadership in implementing policies and practices that result in effective and equitable outcomes.
Provide services and supports to reengage disconnected students.
Engage families.
Advocate for evidence-based and culturally competent practices.


Florida’s New Evaluation System

The Student Success Act


Purpose for Personnel Evaluations

As set forth in the Student Success Act and Race to the Top, teacher evaluations are:

Designed to support effective instruction and student learning growth.

Used when developing district and school level improvement plans.

Used to identify professional development.


Purpose for Personnel Evaluations (cont.)

Measure, based on sound educational principles and research in effective practice, three major areas:
Performance of students
Instructional Practice (FEAPs)
Professional & job responsibilities

Evaluations must differentiate among 4 levels of performance:
Highly Effective
Effective
Needs Improvement or Developing (first 3 years)
Unsatisfactory


Major Components of the Evaluation System

Instructional Practice (50%): measured by the District’s Instructional Practice Framework

Performance of Students (50%): measured by student learning growth

Instructional Practice

Section 1012.34, F.S., requires that the instructional practice component evaluate the following:

For classroom teachers, excluding substitutes:
Florida Educator Accomplished Practices (FEAPs)

For instructional personnel who are not classroom teachers:
FEAPs
May include specific job expectations related to student support

Instructional Framework goal: an expectation that all teachers can increase their expertise from year to year, producing gains in student achievement from year to year with a powerful cumulative effect


Performance of Students

At least 50% of a performance evaluation must be based upon data and indicators of student learning growth assessed annually and measured by statewide assessments or, for subjects and grade levels not measured by statewide assessments, by district assessments as provided in s. 1008.22(8), F.S.

Section 1012.34(3)(a)1., Florida Statutes


Performance of Students

The performance of students represents 50% of a teacher’s evaluation, with performance based on student learning growth:
– Growth data for 3 years of students assigned to the teacher.
– If fewer than 3 years of data are available, the years for which data are available must be used, and the percentage of the evaluation based on growth may be reduced to not less than 40%.

To meet the above requirement, the development of a fair and transparent measure of student growth is essential.
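As a rough illustration of the weighting arithmetic, the sketch below assumes a 1–4 scoring scale and a simple per-year reduction schedule; only the 50% default and the 40% floor come from the requirement above; everything else is a hypothetical district choice.

```python
# Illustrative weighting arithmetic for the student growth component.
# The 50% default and 40% floor come from the requirement above; the
# per-year reduction schedule and the 1-4 scale are hypothetical.

def growth_weight(years_of_data):
    if years_of_data >= 3:
        return 0.50
    return max(0.40, 0.50 - 0.05 * (3 - years_of_data))  # never below 40%

def composite_score(growth_score, practice_score, years_of_data):
    w = growth_weight(years_of_data)
    return w * growth_score + (1 - w) * practice_score

# Two years of growth data available: growth weighted at 45%, practice at 55%
print(composite_score(growth_score=3.2, practice_score=3.6, years_of_data=2))  # ~3.42
```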

Florida’s Value-Added Model

A value-added model measures the impact of a teacher on student learning by accounting for other factors that may impact the learning process.

These models do not:
Evaluate teachers based on a single year of student performance or proficiency (status model).
Evaluate teachers based on a simple comparison of growth from one year to the next (simple growth).


Advantages of Value-Added Models

Teachers teach classes of students who enter with different levels of proficiency and possibly different student characteristics.

Value-added models “level the playing field” by accounting for differences in the proficiency and characteristics of students assigned to teachers.

Value-added models are designed to mitigate the influence of differences among the entering classes so that schools and teachers do not have advantages or disadvantages simply as a result of the students who attend a school or are assigned to a class.

Value-Added Example

[Chart: prior, current, and predicted performance for Student E (Teacher X), plotted on a 0–500 score scale]

The difference between the predicted performance and the actual performance represents the value-added by the teacher’s instruction.

The predicted performance represents the level of performance the student is expected to demonstrate after statistically accounting for factors through a value-added model.
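In numeric terms (hypothetical scores on the 0–500 scale used in the chart):

```python
# Hypothetical scores on a 0-500 scale
predicted_performance = 310   # expected score after accounting for model factors
actual_performance = 325      # score the student actually earned
value_added = actual_performance - predicted_performance
print(value_added)            # 15 points attributed to the teacher's instruction
```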


Florida’s Value-Added Model

Begins by establishing expected growth for each student based on historical data each year.
Represents the typical growth seen among students who have earned similar test scores the past two years and share the other characteristics identified by the committee.
Accounts for student, classroom, and school characteristics (factors outside the control of the teacher).


Factors Identified to “Level the Playing Field”

Student Characteristics
Up to two prior years of achievement scores (the strongest predictor of student growth)
The number of subject-relevant courses in which the student is enrolled
Students with Disabilities (SWD) status
English Language Learner (ELL) status
Gifted status
Attendance
Mobility (number of transitions)
Difference from modal age in grade (as an indicator of retention)

Classroom Characteristics
Class size
Homogeneity of students’ entering test scores in the class

Factors Identified by the SGIC to “Level the Playing Field”

The model recognizes that there is an independent factor related to the school that impacts student learning – a school component.

Statistically, it is simply the factors already controlled for in the model, measured at the school level by grade and subject.
May represent the impact of the school’s leadership, the culture of the school, or the environment of the school on student learning.
Acts as another covariate, just like all other factors.
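A minimal sketch of this kind of calculation is shown below, assuming a plain ordinary-least-squares prediction with only two covariates and synthetic data; Florida's actual model uses the fuller covariate set listed above and a more elaborate specification.

```python
# Illustrative value-added calculation: fit an expected-score model on a large
# reference sample, then attribute the mean residual (actual minus predicted)
# for one teacher's class to that teacher. Covariates and data are synthetic.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical statewide reference data: prior-year score and attendance rate
n = 1000
prior = rng.normal(300, 25, n)
attendance = rng.uniform(0.80, 1.00, n)
current = 0.9 * prior + 40 * attendance + rng.normal(0, 10, n)

# Fit the expected-growth model on the reference population
X = np.column_stack([np.ones(n), prior, attendance])
beta, *_ = np.linalg.lstsq(X, current, rcond=None)

# Apply it to one teacher's class: predicted vs. actual performance
class_prior = np.array([280.0, 305.0, 260.0, 320.0, 295.0])
class_attend = np.array([0.95, 0.90, 0.88, 0.97, 0.92])
class_actual = np.array([300.0, 318.0, 270.0, 338.0, 310.0])

class_X = np.column_stack([np.ones(len(class_prior)), class_prior, class_attend])
predicted = class_X @ beta
value_added = (class_actual - predicted).mean()   # mean of actual minus predicted
print(f"teacher value-added estimate: {value_added:.1f} scale-score points")
```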


Overview of SSPEM: Developing a State Model for Student Support Services Personnel Evaluations


Student Support Services


Fundamental Principles

Fundamental Purpose: Improve academic & behavioral outcomes for students

Reflect a Multi-tiered System of Support (MTSS) framework.

Align with evidence and research-based practices and professional standards linked to positive student outcomes.

Integrate common practice standards across student services professions.

Support professional growth and continuous improvement.


Fundamental Principles (cont.)

Offer a state-approved evaluation framework that is dynamic (flexible & fluid) and complies with the Student Success Act, for districts to adopt, adapt, or use as a guide.


Developing the Model

Focus on the “practices” component

Crosswalk Professional Practice Standards with FEAPs, Professional Competencies, & Teacher/Principal models

Identify domains of practice, practices, and indicators for each practice (levels of performance/proficiency), with research/evidence supporting each practice

Develop an evaluation rubric

Vet the model rubric with key stakeholders


Relevance of Professional Standards

Purpose
Establishes foundation
Authenticates
Lends credibility and agreement

Integrates
Research linked to positive student outcomes
Evidence-based strategies
Best practices


Contributors/Partners

Bureau of Exceptional Education and Student Services (BEESS)
Task: Develop process and model

Core Development Workgroup (District Coordinators from each of the Student Services disciplines)
Task: Develop draft model

Focus Group (Student Services Directors; Coordinators from Student Services disciplines; Administrators; Other stakeholders)
Task: Feedback on draft model


Conceptual Model

Domains (5 Domains) – broad categories used to organize professional practices and help structure the evaluation.

Practices (25 Practices) – standards of practice within a domain related to a specific area of professional skill.

Indicators – continuum of descriptive statements that assist in differentiating levels of performance for each practice (Highly Effective, Effective, Emerging, Ineffective).


Domains of Practice
Data-based Decision Making and Evaluation
Instruction/Intervention Planning & Design
Instruction/Intervention Delivery & Facilitation
Learning Environment
Professional Learning, Responsibility, & Ethics


Scoring Rubric for Indicators

Highly Effective – practice has broader, systemic impact (school-wide/district-wide) OR facilitates effective practice of others through mentoring and/or training
Effective – demonstrates the essential elements of the practice competently and independently with individual students and groups
Emerging – developing practice competence but requires additional supervision, support, or training
Ineffective – does not demonstrate practice or demonstrates practice poorly
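One way to picture how indicator ratings might roll up into a domain score: the 4-point values and the averaging rule below are illustrative assumptions; the actual scoring procedure is the Evaluation Rubric Scoring Protocol (Form 1) in the guide.

```python
# Illustrative roll-up of indicator ratings into a domain score.
# The point values and averaging rule are assumptions, not the SSPEM protocol.

RATING_POINTS = {
    "Highly Effective": 4,
    "Effective": 3,
    "Emerging": 2,
    "Ineffective": 1,
}

def domain_score(indicator_ratings):
    """Average the indicator ratings within one domain of practice."""
    return sum(RATING_POINTS[r] for r in indicator_ratings) / len(indicator_ratings)

ratings = ["Effective", "Highly Effective", "Effective", "Emerging"]
print(domain_score(ratings))   # 3.0 for this hypothetical set of indicators
```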


Evaluation Rubric

[Evaluation rubric pages shown across several slides]

Resources/Tools in Guide

Research Support for Model (Appendix B)

Crosswalks
Professional Practice Standards (Table 2)
FEAPs, Marzano, & Danielson (Table 3)

Methods and Sources of Evidence for Evaluating Professional Practice (Table 1)

Scoring Protocols
Evaluation Rubric Scoring Protocol (Form 1)
Student Growth Protocol
Summative Evaluation (Form 3)


Recommendations for District Use

The Evaluation Cycle Process
SSPEM and the District Framework
Student Growth Component


SSPEM is designed to:

Establish practices and expectations that are linked to student outcomes (academic & behavioral) and based on research.

Develop evaluation procedures that align with professional standards and accomplished educator practices (FEAPs).

Provide feedback to the professional that recognizes effective performance, identifies areas for improvement, and directs professional growth activities.

Provide support to supervisees/practitioners not meeting performance expectations.


Evaluation Cycle Process


Student Growth Component

Student learning growth component must account for 50% of the evaluation (modified if less than three years of data).

Must be based on students assigned to the professional.

Measurable student outcomes – up to 20% of the student learning growth component may be based on measurable student outcomes specific to the position.
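A minimal sketch of how these pieces might combine, assuming a common 1–4 scale for both measures; only the 20% cap on measurable outcomes comes from the requirement above, and the rest is an illustrative choice.

```python
# Illustrative assembly of the student growth component.
# Only the 20% cap comes from the requirement above; the 1-4 scale is assumed.

def student_growth_component(learning_growth, measurable_outcomes, outcomes_share=0.20):
    outcomes_share = min(outcomes_share, 0.20)   # at most 20% from position-specific outcomes
    return (1 - outcomes_share) * learning_growth + outcomes_share * measurable_outcomes

# e.g., 3.1 on learning growth for assigned students, 3.5 on position-specific outcomes
print(student_growth_component(3.1, 3.5))        # ~3.18
```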


Problem-Solving Process: Step 1

Desired Outcome: What Do We Want the Student(s) to Know and Be Able to Do? (Tier 1 Goal – Impact on Academic Growth?)

Current Level of Performance (Prior/History)
Metrics

Desired Level of Performance (Expected)
Metrics
Peer Comparisons, Other Data

Rate of Growth

Actual (Value Added?)


Table Top Activity

Setting | Service | Expected Outcome | Personnel Involved | Data Collection | Multiple? (Behavior/Academic)
Tier 1  |         |                  |                    |                 |
Tier 2  |         |                  |                    |                 |
Tier 3  |         |                  |                    |                 |