Transcript of: Learning Models, Personalized Instruction, and Within Year Assessment for Low Performing SWD: Implications for Next Generation Comprehensive Assessment System. OSEP Project Director’s Meeting, Washington DC, July 15-17, 2013.

Page 1: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

Learning Models, Personalized Instruction, and Within Year Assessment for Low Performing SWD: Implications for Next Generation Comprehensive Assessment System

OSEP Project Director’s Meeting

Washington DC, July 15-17, 2013

Page 2: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

This Breakout Session

TOPIC / PRESENTER
Welcome: Susan Weigert (Moderator), OSEP
GSEG Dissemination: Edynn Sato (Lead), WestEd
WiYA Symposium: Renee Cameto, SRI International
Learning Model Exemplar: Sue Bechard, Inclusive Educ. Asmt.
Formative Instruction: Karen Erickson (Lead), UNC
Discussion: Patricia Almond, Univ. of OR

Page 3: OSEP Project Director’s Meeting Washington DC July 15-17, 2013


Contributing Sponsors

OSEP Project Directors’ Meeting 2013

Page 4: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

SUSAN WEIGERT, OSEP

Welcome & Orientation

Page 5: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

OSEP US-DOE Projects

General Supervision Enhancement Grants (GSEG)
Learning Models and Learning Progressions
Role of LMs and Formative Assessments To Promote Learning and Inform Teaching

Page 6: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

EDYNN SATO, WESTED, WASHINGTON, D.C., OCTOBER 26, 2012

Learning Models Colloquium

Page 7: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

Colloquium on Learning Models, Instruction, and Next Generation Assessments that Include Special Populations

October 26, 2012
Washington Marriott at Metro Center
Washington, DC

Page 8: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

Background: General Supervision Enhancement Grant

WestEd, the Kansas State Department of Education, and the Louisiana Department of Education (H373X070002)

Project Officer: Susan Weigert

Focus of grant (general):
Technical Assistance on Data Collection
Priority A: Modified Academic Achievement Standards
Dissemination


Page 9: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

Colloquium Purpose

Learning Models: in development in the U.S.; proposed as the foundation for designing comprehensive next generation assessment systems, both formative and summative

Colloquium will involve discussion of: what is known, what is in the works, and what needs to be known or investigated


Page 10: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

Learning Models: Foundation for Assessment

Learning progressions have been proposed for use in both large-scale and classroom assessments. In both cases, they may provide more detailed information about student thinking than more traditional models of assessment. This detailed information is particularly important in the classroom, where it can be used as the first step in a formative assessment process, to impact instructional decisions and provide feedback to students, ultimately improving student learning (Alonzo & Steedle, 2008, p. 419).


Page 11: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

Organizing Themes

Instruction/Formative Assessment
Who Are the Students in Special Populations?
Technical Considerations & Learning Analytics
Race-to-the-Top
General Supervision Enhancement Grants
National Perspective

Page 12: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

Formative versus Summative?

ONE VIEW: A common learning progression for both formative and summative assessments.

“In CBAL they are integrated and based on a common set of models. There are strong reasons to use a common foundation for formative and summative assessment. How do you take the evidence from the formative assessment and use it in the classroom?”

Page 13: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

Formative versus Summative?

ANOTHER VIEW: The foundation is the standards; the CCSS assumed some progressions, an embedded scope and sequence.

“But, the formative and summative could be different. The standards are common, but the learning progression may not be the same for both assessments. I see it halfway between both of you. A summative [assessment] requires a definition of the scope and sequence.”

Page 14: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

Selected Discussant Comments

Next generation assessments need high correspondence between interpretations of how students progress given the model and the outcomes measurement model.

If there are variations in student pathways, there should be correspondence in the measurement model.

Few learning models have specifically addressed learning and progress for students with disabilities.

Learning models need to support both instruction and assessment to inform instruction.

Page 15: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

Additional Considerations

In order to account for student thinking and learning across our heterogeneous student population, what range of models should be established to inform/support the effective instruction and valid assessment of our diverse learners?

How do we best reconcile the “tension” between the number of models and pathways our heterogeneous student population likely necessitates and a viable number of models and pathways that can be appropriately applied (generalized) across the population?

Page 16: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

RENEE CAMETO, SRI INTERNATIONAL, FEBRUARY 21-22, 2013

Within Year Assessments (WiYA) for Students with Disabilities Symposium

Page 17: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

Organizers

Renee Cameto, SRI International
Sue Bechard, Inclusive Educational Assessment
Patricia Almond, CATE, University of Oregon
Jose Blackorby, SRI International
Mary Brownell, University of Florida
Steve Elliot, National Center on Assessment and Accountability for Special Education
Neal Kingston, University of Kansas
Sheryl Lazarus, National Center on Educational Outcomes
Edynn Sato, WestEd
Jerry Tindal, National Center on Assessment and Accountability for Special Education
Martha Thurlow, National Center on Educational Outcomes
Susan Weigert, OSEP

Page 18: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

Sponsors

SRI International
WestEd
National Center on Educational Outcomes (NCEO)

Center for Educational Testing and Evaluation (CETE), University of Kansas

National Center on Assessment and Accountability for Special Education (NCAASE)


Page 19: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

Purpose of the Symposium

Better understand how learning progressions/maps apply to students in special populations

Discuss how learning progressions/maps will be used to develop assessments based on CCSS

Address three interrelated topics:
Why use learning progressions/maps for assessment?
How will all students be validly and reliably included?
What are the technical issues that must be addressed?

Provide support for researchers in these areas

Page 20: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

Within Year Assessments (WiYA) . . . formative assessment . . .

Purposes:
Monitor progress
Diagnose strengths and weaknesses of students
Inform instruction
Support personalized learning and instruction
Lead to improved achievement

NOT: primarily designed as a summative assessment for accountability

Page 21: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

Low Performing Students With Disabilities

Not AA-AAS eligible
Could be AA-MAS eligible in the states where there is an AA-MAS
Low performing students with disabilities in states that do not have an AA-MAS

Low performing students with 504 plans

Low performing students but not eligible for special education

Page 22: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

Persistently Low Performing

". . . students were defined as those who scored at the 10th percentile or below for all three years . . ."

Perie, Fincher, Payne, & Swaffield (2013)

An example of a data-based definition of low-performing students.
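To make that kind of data-based definition concrete, here is a minimal sketch in Python that applies the quoted rule to invented percentile data: a student is flagged only if they scored at or below the 10th percentile in each of three consecutive years. The data, names, and cutoff handling are illustrative assumptions, not the procedures used by Perie, Fincher, Payne, & Swaffield (2013).

```python
# Hypothetical percentile ranks for three consecutive years (invented data).
scores = {
    "student_01": [8, 9, 7],
    "student_02": [12, 9, 10],
    "student_03": [5, 10, 9],
}

def persistently_low(percentiles_by_year, cutoff=10):
    """True only if the student scored at or below `cutoff` in every year."""
    return all(p <= cutoff for p in percentiles_by_year)

flagged = [sid for sid, pcts in scores.items() if persistently_low(pcts)]
print(flagged)  # ['student_01', 'student_03']
```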

Page 23: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

Symposium Participants will…

Share relevant experiences, research, and practices. Presentations on:
Characteristics of low performing SWD
Opportunity for students to learn
Opportunity for teacher professional development
WiYA features

Identify common experiences, challenges, and lessons learned

Identify key issues and recommendations

Page 24: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

Outcomes of the Symposium

Inform current consortia and the broader educational community about issues and research recommendations related to key issues

Move the field forward with greater understanding of the challenges and important considerations

Present and publish in various venues

Page 25: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

SUE BECHARD, INCLUSIVE EDUCATIONAL ASSESSMENT

Learning Model Exemplar: Dynamic Learning Maps

OSEP Project Directors’ Meeting 2013

Page 26: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

What are Learning Maps?

Network of connected learning targets (nodes)

Maps students’ “knowledge terrain”
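As a minimal sketch of what a "network of connected learning targets" can look like in code, the Python below stores a toy map as a dictionary of nodes and outgoing connections. The node names and links are invented for illustration; they are not actual Dynamic Learning Maps nodes.

```python
# Toy learning map: each key is a learning target (node); each directed edge
# points from a supporting skill to a skill that builds on it.
# Node names are hypothetical, not actual DLM nodes.
LEARNING_MAP = {
    "count objects": ["add within 10"],
    "add within 10": ["add within 100"],
    "skip count": ["add within 100"],
    "add within 100": ["multiply single digits"],
    "multiply single digits": [],
}

def prerequisites(target):
    """Nodes with a direct connection into `target`."""
    return [node for node, successors in LEARNING_MAP.items() if target in successors]

print(prerequisites("add within 100"))  # ['add within 10', 'skip count']
```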

Page 27: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

Making Nodes

1. Review of Literature
2. Node Development and Placement
3. Connection Placement

Page 28: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

Dynamic Learning Maps help us visualize

The skills a student has acquired
The path they took to get there
And where they're going next. (A small sketch follows the examples below.)

12 X 3 = 36

22 X 5 = 110

123 X 3 = 369

224 X 3 = 672
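Continuing the toy map from the earlier sketch, the code below answers the "where are they going next" question with a simple assumed rule: a node is a next target when it is not yet mastered but all of its direct prerequisites are. This is only an illustration of the idea, not the statistical model the DLM system actually uses.

```python
# Same hypothetical map as in the earlier sketch (not actual DLM nodes).
LEARNING_MAP = {
    "count objects": ["add within 10"],
    "add within 10": ["add within 100"],
    "skip count": ["add within 100"],
    "add within 100": ["multiply single digits"],
    "multiply single digits": [],
}

def next_targets(learning_map, mastered):
    """Unmastered nodes whose direct prerequisites are all mastered (a toy rule)."""
    def prereqs(target):
        return {node for node, succs in learning_map.items() if target in succs}
    return [node for node in learning_map
            if node not in mastered and prereqs(node) <= set(mastered)]

mastered = {"count objects", "add within 10", "skip count"}
print(next_targets(LEARNING_MAP, mastered))  # ['add within 100']
```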

Page 29: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

Multiple Paths


Page 30: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

Alternate Paths


Page 31: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

Identifying Conceptual Areas


Page 32: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

Inference

[Diagram: map nodes labeled "mastered," "mastered," and "not mastered," illustrating inference across connected nodes.]
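The diagram behind this slide pairs tested nodes with inferred ones. The operational DLM assessment draws such inferences with statistical models; the sketch below shows only the simplest rule-based version of the idea, under my own assumptions: mastery is propagated back to prerequisites, and non-mastery is propagated forward to skills that depend on the node.

```python
from collections import deque

def infer(edges, mastered, not_mastered):
    """Toy inference over (prerequisite, target) edges: mastery propagates
    backward to prerequisites; non-mastery propagates forward to dependents.
    (An illustrative assumption, not the DLM scoring model.)"""
    inferred_mastered = set(mastered)
    queue = deque(mastered)
    while queue:
        node = queue.popleft()
        for pre, tgt in edges:
            if tgt == node and pre not in inferred_mastered:
                inferred_mastered.add(pre)
                queue.append(pre)

    inferred_not = set(not_mastered)
    queue = deque(not_mastered)
    while queue:
        node = queue.popleft()
        for pre, tgt in edges:
            if pre == node and tgt not in inferred_not:
                inferred_not.add(tgt)
                queue.append(tgt)

    return inferred_mastered, inferred_not

edges = [("A", "B"), ("B", "C"), ("C", "D")]
print(infer(edges, mastered={"B"}, not_mastered={"C"}))
# ({'A', 'B'}, {'C', 'D'})  -- set printing order may vary
```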

Page 33: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

Validation

Reviews: Internal, Teacher, Expert

Cognitive labs
Pilot study
Field tests

Page 34: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

KAREN ERICKSON, UNIVERSITY OF N. CAROLINA AT CHAPEL HILL, CTR FOR LITERACY & DISABILITY STUDIES

Formative Instruction based on Learning Models

Page 35: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

Whole-to-Part Model of the Constructs Underlying Silent Reading Comprehension

Cunningham (1993)

Page 36: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

Language Comprehension

Word Identification

Print Processing Beyond Word Identification

Silent Reading Comprehension

Page 37: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

The Process (a rough sketch of this loop follows the list)

Identify all 3rd, 4th, and 5th grade students who are struggling in reading based on:
End of Grade & Benchmark Performance
Teacher Referral

Complete a Whole-to-Part Reading Assessment

Assign the student to an appropriate intervention group based on WTP results

Monitor progress and reassign as needed
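Here is a very rough sketch of the screening-and-assignment loop described on this slide. The data structures, thresholds, and the "weakest component" grouping rule are all assumptions made for illustration; the actual Whole-to-Part assessment and the district's grouping criteria are not specified in these slides.

```python
from dataclasses import dataclass

@dataclass
class WTPResult:
    """Hypothetical Whole-to-Part profile: grade-level estimates for the
    three contributing constructs named on the earlier slide."""
    word_identification: float
    language_comprehension: float
    print_processing: float

def referred_for_wtp(end_of_grade_pass, benchmark_pass, teacher_referral):
    """Screening step: flag a student on test performance or teacher referral."""
    return teacher_referral or not (end_of_grade_pass and benchmark_pass)

def intervention_group(result):
    """Stand-in grouping rule: target the weakest of the three components."""
    components = {
        "word identification": result.word_identification,
        "language comprehension": result.language_comprehension,
        "print processing beyond word identification": result.print_processing,
    }
    return min(components, key=components.get)

if referred_for_wtp(end_of_grade_pass=False, benchmark_pass=True, teacher_referral=False):
    print(intervention_group(WTPResult(2.0, 3.5, 2.8)))  # word identification
```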

Page 38: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

Evidence that it works

Began district-wide in fall of 2009
In 2011-12 a total of 593 students were served

Raising the floor:
Average Intervention Level: 2009: 1.6; 2011: 2.0
Average Silent Reading Comprehension Level: 2009: 2.4; 2011: 3.2

Page 39: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

[Bar chart: Elementary and Middle School Students Served in WTP, showing percentages for 2010, 2011, and 2012 for four groups: Elementary Proficient (437), Middle Proficient (393), Elementary Level 1 (437), and Middle Level 1 (393).]

Page 40: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

PATRICIA ALMOND, CATE, UNIV. OF OREGON

Questions for Discussion

Page 41: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

Who is with us today?

State Dept. of Educ.
Researchers
Organizations
Family Members
Professional Development

Page 42: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

Summary

ASMT for INSTRUCTION

Learning Models

Low Perf. SWD

Next Gen Asmt

Page 43: OSEP Project Director’s Meeting Washington DC July 15-17, 2013

DISCUSSION QUESTIONS

How might these considerations influence your work?

What are next steps OSEP and the field of special education should consider taking?

What do you see as our greatest strengths moving forward?

Biggest challenges?