
Formative Experimentation

What Role Can Experimental Research

Play in Program Development?

Jonathan Supovitz

University of Pennsylvania

Consortium for Policy Research in Education

AERA Research Conference

Rochester, NY

November 3, 2011

Presentation Organization

Quick overview of the intervention

Fostering communities of inquiry between researchers and program developers

Linking Teaching and Learning

Hypothesis: Feedback on teaching examined in conjunction with learning is a more powerful learning condition for teachers than feedback on learning alone.

Testing this hypothesis in an experimental, mixed-methods framework

Close partnership between district developers and researchers

Distinctions in the Literature

Experiments are typically not used for program improvement, since they require stability of the intervention.

By contrast, there is a tradition of 'design experiments', which test design theory through application in practice and feature partnerships between researchers and practitioners.

How the Intervention Works

[Flow diagram: volunteer teachers are randomly assigned to treatment and control groups; each cycle spans one math unit into the next math unit]

A lesson is observed within the unit for all participating teachers.

Treatment teachers receive customized feedback (AR and AT) on their lesson via email, then have a facilitated conversation about teaching in conjunction with end-of-unit test data in a PLC meeting.

Control group teachers (as usual) examine end-of-unit test data in a PLC meeting.

The process repeats twice during the 2010-11 school year.
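As a minimal sketch of the assignment step described above, here is how such a random split might look in code; the teacher IDs, seed, and function name are illustrative, not from the study:

```python
import random

def assign_groups(volunteers, seed=42):
    """Randomly split a list of volunteer teachers into treatment and control."""
    rng = random.Random(seed)        # fixed seed makes the split reproducible
    pool = list(volunteers)
    rng.shuffle(pool)
    half = len(pool) // 2
    return pool[:half], pool[half:]  # (treatment, control)

# Hypothetical volunteer roster
treatment, control = assign_groups(["T01", "T02", "T03", "T04", "T05", "T06"])
print("Treatment:", treatment)
print("Control:", control)
```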

Linking Study Data Collection Scheme

[Diagram: data collected across each cycle includes lesson video, emailed feedback, the PLC conversation, a pre-survey and post-survey, end-of-unit (EoU) test data, exit slips, participant interviews, a second lesson video, and facilitator interviews; the sequence repeats each cycle]

The Task: Where does the fraction 6/9 go on the number line?

Ms. Watson's 4th grade math class; students Oliver and Lee

Oliver: "If 3/4 is here, then 2/3 goes here" (to the right)

Watson: "Are you sure 2/3 is greater than 3/4? How do you know that?"

Lee: "Hmm . . . I can't really tell" (Strategy #1)

Oliver: "Let's try it this way" (Strategy #2)

Watson: "Are there any other ways you could know? Maybe with equivalent fractions and like denominators?"

Watson: "So which is greater?" (Oliver draws in the > sign)

Lee: (Works it out) "Oh . . . so this (9/12) is greater than this (8/12)" (Strategy #3)
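Written out, the equivalent-fractions comparison Lee works through looks like this; the arithmetic is implied by the dialogue rather than shown on the slide:

```latex
\frac{2}{3} = \frac{2 \times 4}{3 \times 4} = \frac{8}{12}, \qquad
\frac{3}{4} = \frac{3 \times 3}{4 \times 3} = \frac{9}{12}, \qquad
\frac{9}{12} > \frac{8}{12} \;\Rightarrow\; \frac{3}{4} > \frac{2}{3}
```

And since 6/9 reduces to 2/3, the fraction in the task sits to the left of 3/4 on the number line.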

Ensuing PLC Discussion

1. How did the Accountable Talk interaction begin?

2. What do the student responses and the strategies they use reveal about their understanding of the math?

3. What was the teacher's follow-up (if any)?

4. Were there any missed opportunities here? How might you have changed the interaction to learn even more about the students' understanding?


Teacher Perceptions of Their Experience

[Bar chart: Spring (Cycle 2) PLC exit slip data on a 0-5 scale, Learning Alone vs. Teaching & Learning. Learning about Instruction: 3.50 vs. 3.61*; Learning about Students: 3.53 vs. 3.74*; PLC Group Interaction: 3.16 vs. 3.47. * p < .05]
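For readers who want to see the shape of the comparison behind the asterisks, a hedged sketch: a two-sample t-test on the group means, with invented exit-slip ratings as placeholders (the study's actual analysis may have used a different model):

```python
from scipy import stats

# Invented 0-5 exit-slip ratings for one dimension (placeholders, not study data)
learning_alone    = [3.2, 3.6, 3.4, 3.8, 3.5, 3.6]
teaching_learning = [3.7, 3.9, 3.5, 4.0, 3.8, 3.6]

# Two-sample t-test comparing group means; p < .05 would earn an asterisk
result = stats.ttest_ind(teaching_learning, learning_alone)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
```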

Impacts on Teaching Practice

[Bar chart: classroom ratings on a 0-5 scale, Fall 2010 to Spring 2011. Learning Alone: Academic Rigor 3.04 to 2.75, Accountable Talk 3.19 to 2.89. Teaching & Learning: Academic Rigor 3.07 to 3.40*, Accountable Talk 3.28 to 3.45*. * p < .05]
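Because both groups are rated in fall and spring, one natural way to frame this contrast is as a difference in gain scores (a difference-in-differences). A minimal sketch with invented ratings, since the slide does not specify the estimation approach:

```python
from statistics import mean

# Invented fall/spring Academic Rigor ratings per teacher (not study data)
fall_alone, spring_alone = [3.1, 2.9, 3.1], [2.8, 2.7, 2.8]
fall_treat, spring_treat = [3.0, 3.1, 3.1], [3.4, 3.5, 3.3]

# Average within-group change from fall to spring
gain_alone = mean(s - f for s, f in zip(spring_alone, fall_alone))
gain_treat = mean(s - f for s, f in zip(spring_treat, fall_treat))

# Treatment gain relative to control gain
print(f"difference-in-differences = {gain_treat - gain_alone:+.2f}")
```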

Impacts on Student Test Performance

[Chart: student test performance results for Round 1 and Round 2]

Summary Comments on the Intervention

Getting teachers to open up their practice to feedback is a tough sell, both with peers and in today's accountability environment.

Looking at data on instructional practice in conjunction with student test data is a rich experience for teachers.

Preliminary results suggest that looking at data on teaching and learning is a more powerful learning experience than looking at data on learning alone.

Examining the Relationship Between Research and Program Improvement

Can experimentation be used as a mechanism for intervention improvement?

More generally, in what ways do different forms of data inform program improvement?

In what ways are the contrasts provided by an experiment helpful in providing feedback and adjusting program design?

Do program changes really compromise the experimental situation? In essence, can experiments be used for program learning?

Program Adjustments & Their Sources

1. Issue: Treatment untargeted. Data: Teacher interviews, facilitator focus groups. Change: Increased focus on video clips as examples of instructional data examined by the treatment group.

2. Issue: PLCs seen as inefficient. Data: Teacher interviews, facilitator focus groups. Change: Videotaping and discussing the same lesson across teachers in the PLC.

3. Issue: Teachers want earlier feedback. Data: Teacher interviews. Change: Refining the sequence of treatment feedback.

4. Issue: (Treatment expansion). Data: Classroom ratings, achievement results. Change: Increasing cycles of feedback from 2 to 3.

5. Issue: (Concern that treatment sessions are too packed). Data: Exit slip data. Change: Further efforts to streamline the treatment PLC.

Adjustments 1-3 were problem-driven changes; the parenthesized adjustments 4-5 were data-driven changes.

Conclusions

Experiments can provide a different kind of formative feedback to program designers. They are not a substitute for other kinds of data.

The 'leverage' of formative experimentation is data-driven rather than problem-driven: it comes from 'looking backwards' from effects in the data (with greater confidence) to the reasons this might be so.

Is the experiment compromised by mid-course treatment adjustments? That depends on how you think about implementation.
