
Three-Year Effects of a Benchmark Assessment System on Student Achievement

Presented at the CCSSO National Conference on Student Assessment

June 29, 2012

Shana Shaw, Ph.D. (shana.shaw@sedl.org)

Jeffrey C. Wayman, Ph.D. (jwayman@austin.utexas.edu)

Cindy Bochna, Ph.D. (crbochna@mpsaz.org)

Joe O’Reilly, Ph.D. (joreilly@mpsaz.org)

Today’s Talk

Part I: District Context (Cindy)

Part II: Research Study (Shana)

District Context

Mesa Public Schools (MPS), Mesa, Arizona

• Suburban with an inner city core and serving two Native American communities

• 65,000 students
• 65% free & reduced lunch rate
• 50% Anglo, 40% Latino, 5% American Indian
• Generally slightly above state averages on academic indicators

Acuity Implementation in MPS

Goal 1: Implement an online formative assessment system.
• Initial focus on giving predictive tests and using the results to predict state assessment performance

Goal 2: Help teachers incorporate Acuity into ongoing instruction and use it as a tool for PLCs.
• Creating custom formative tests and using instructional resources
• Acuity Unwired

Acuity Timeline in MPS

MPS Acuity Training Model

Acuity Training Characteristics

The district hired an experienced Acuity trainer to help with implementation.

1. Training by school request
• Principals set up 1-, 2-, and 4-hour sessions for their staff.
• When possible, training is scheduled during the school day with substitute coverage.

2. Training by teacher request
• 4-hour workshops are held through professional development.

Acuity Training Characteristics

Initial Training
• Navigating the system, accessing reports

Subsequent Trainings
• Using reports to understand student achievement
• Assigning instructional resources based upon results from predictive tests

Advanced Uses
• Teachers learn to design custom tests for classroom use, as well as to triangulate Acuity data with other district data sources.

Acuity Training Characteristics

Communication
Continually updated website with relevant information and a news feed.

http://www.mpsaz.org/acuity

Teacher Attitudes Toward Acuity

Survey Method

Online surveys (2009–2012)
• All teachers were asked to complete the Survey of Educator Data Use online (Wayman et al., 2009).
• Incentives were provided for school-level response rates (e.g., one case of paper for a 50% response rate).

• Teachers were also asked to complete an electronic survey via the Acuity newsletter (2011).
• The survey asked about Acuity likes, dislikes, and desired additional training, as well as Acuity items from the Survey of Educator Data Use.

Teachers Are Comfortable Using Acuity

Survey Item                          Agreement
Acuity is easy to use.                 67%
I know how to use Acuity.              89%
I don’t have to wait for data.         66%
The data is accurate.                  77%
Acuity is dependable.                  71%
The data is always up to date.         83%

Source: 2012 Data Use Survey

Teachers Felt Acuity Helped Their Teaching

Data Use Survey Item: “Acuity makes my work easier.”

             2009   2010   2011   2012
Agree         59%    58%    70%    69%
Disagree      41%    42%    30%    31%

Teachers Felt Acuity Helped Improve Instruction

Data Use Survey Item: “Acuity helps me improve my instruction.”

             2009   2010   2011   2012
Agree         68%    75%    84%    79%
Disagree      32%    25%    16%    22%

Attitudes Toward Acuity by Teaching Experience

Category               0-5 Yrs   6-10 Yrs   11-15 Yrs   16+ Yrs
Easy to Use              74%       69%        66%        66%
Improves Instruction     80%       76%        78%        79%
Makes Work Easier        78%       72%        64%        67%

Source: 2012 Data Use Survey

District Context Wrap-Up

We have a system that teachers feel:
• they know how to use,
• is accurate and timely,
• helps them in their work,
• improves instruction.

How well did that transfer into academic achievement?

Research Study

National Context: Benchmark Assessment

Research: Studies generally show small, positive effects on achievement, but results are inconsistent:

• Effects vary by content area and student subgroup.

• Achievement gains are higher when:
  • Initiatives are sustained.
  • Data are used to select effective interventions.

Sources: Carlson et al., 2011; Henderson et al., 2007; May & Robinson, 2007; Quint et al., 2008; Slavin et al., 2011

Current Study

Study Context:
• Year 3 of Acuity implementation (2011)
• Small-group training at schools and increased district support

Variables:
• Student state test scores (AIMS) linked to teachers’ instructional use of Acuity (measured by “use logs”)
• Characteristics of students, teachers, and schools

Current Study

Purposes:
• Explore what Acuity use looks like after three years.
• Assess the cumulative impact of Acuity use on achievement.

Analytic Method

Cross-classified HLMs: Explore associations between Acuity use (2010 and 2011) and 2011 student achievement (grades 4-8).

• Disaggregated by content area (reading/math) and school level (elementary/junior high)

Three HLM outcomes of interest:
1. Standardized regression coefficients
2. Statistical significance
3. Relative impact of Acuity use on achievement

Cross-Classified HLM Example

Student (Level-1):

\[
\text{2011 AIMS Reading}_{(j_1 j_2)k} = \pi_{0(j_1 j_2)k}
+ \pi_{1(j_1 j_2)k}(\text{gender})_{(j_1 j_2)k}
+ \pi_{2(j_1 j_2)k}(\text{free lunch status})_{(j_1 j_2)k}
+ \pi_{3(j_1 j_2)k}(\text{ethnicity})_{(j_1 j_2)k}
+ \pi_{4(j_1 j_2)k}(\text{2009 AIMS Reading score})_{(j_1 j_2)k}
+ e_{(j_1 j_2)k}
\]

Teachers (Level-2):

\[
\pi_{0(j_1 j_2)k} = \beta_{00k}
+ \beta_{01k}(\text{2011 teacher experience})_{(j_1 j_2)k}
+ \beta_{02k}(\text{2010 teacher experience})_{(j_1 j_2)k}
+ \beta_{03k}(\text{2011 teacher Acuity use})_{(j_1 j_2)k}
+ \beta_{04k}(\text{2010 teacher Acuity use})_{(j_1 j_2)k}
+ u_{j_1 k} + u_{j_2 k}
\]

Here j1 and j2 index a student’s 2010 and 2011 teachers (the two crossed classifications), and k indexes schools.

Source: Raudenbush & Bryk, 2002
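The presentation does not show analysis code, but for orientation, here is a minimal sketch of how an analogous cross-classified model could be fit in Python with statsmodels. All column names (aims_reading_2011, teacher_2011, acuity_use_2011, and so on) are hypothetical placeholders, not the study's actual variables, and this is not the presenters' analysis.

```python
# Minimal sketch, not the presenters' analysis code: an analogous cross-classified
# model fit with statsmodels. Column names below are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("students.csv")  # hypothetical file: one row per student

# statsmodels handles crossed random effects by treating the whole data set as a
# single group and declaring each classification as a variance component.
df["everyone"] = 1
vc = {
    "teacher_2011": "0 + C(teacher_2011)",  # random effect for 2011 teacher
    "teacher_2010": "0 + C(teacher_2010)",  # random effect for 2010 teacher
    "school": "0 + C(school)",              # school effect, treated as crossed here for simplicity
}

model = smf.mixedlm(
    "aims_reading_2011 ~ gender + free_lunch + C(ethnicity) + aims_reading_2009"
    " + experience_2011 + experience_2010 + acuity_use_2011 + acuity_use_2010",
    data=df,
    groups="everyone",   # one all-encompassing group, so the effects above are crossed
    vc_formula=vc,
)
result = model.fit()
print(result.summary())
```

Two simplifications to note: continuous predictors would need to be z-scored before fitting to obtain standardized coefficients like those reported on the following slides, and schools are modeled here as a crossed component rather than as a true third level above teachers.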

Research Study Results

What Does Acuity Use Look Like After 3 Years?

After large increases from 2009 to 2010, Acuity use plateaued in 2011.
• Teachers averaged about 10 weeks and 150 instructional uses of Acuity.
• The same Acuity functions were used in both years.

HLM Results (2011): Teaching experience and school level were associated with teachers’ Acuity use.

What Did Teachers’ Instructional Use of Acuity Look Like in 2011?

Acuity use spiked after predictive assessments were given in August (Form A), October (Form B), and January (Form C).

Teachers’ Acuity Use & Elementary Achievement

Current teachers’ Acuity use (2011):
• Significantly related to reading (β = .05, p < .01)
• Marginally related to math (β = .03, p = .06)

Previous teachers’ Acuity use (2010): Not significantly related to achievement.

Notes:
• Student background factors were significant.
• Teaching experience was not significant.

Teachers’ Acuity Use & Junior High Achievement

Neither current (2011) nor previous (2010) teachers’ Acuity use was significantly associated with junior high achievement in 2011.

Notes:
1. Student background factors were significant.
2. Teaching experience was not significant.

Impact of Teachers’ Acuity Use Relative to Other Factors

Magnitude of Acuity use impact: An extra 6 weeks of Acuity use (2011) was associated with:
• a 1- to 2-point increase in elementary reading
• a 1-point increase in elementary math
(A rough conversion from a standardized coefficient to score points is sketched below.)

Acuity use vs. student factors: Acuity use (2011) had a weaker relationship with elementary achievement than student factors did (e.g., free/reduced-price lunch status).

Acuity use vs. teacher experience: Acuity use had stronger associations with achievement than teaching experience did.
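As a purely illustrative aid, the snippet below shows the arithmetic by which a small standardized coefficient translates into scale-score points. The standard deviations used are hypothetical placeholders; the presentation does not report the actual SDs of Acuity use or AIMS scores.

```python
# Back-of-the-envelope conversion from a standardized coefficient to score points.
# The SD values are hypothetical placeholders, not figures from the study.
beta_std = 0.05        # standardized coefficient for 2011 Acuity use (elementary reading)
sd_use_weeks = 6.0     # hypothetical SD of Acuity use, in weeks
sd_aims_points = 40.0  # hypothetical SD of AIMS reading scale scores, in points

extra_weeks = 6.0      # the "extra 6 weeks of use" scenario from the slide
predicted_gain = beta_std * (extra_weeks / sd_use_weeks) * sd_aims_points
print(f"Predicted gain: {predicted_gain:.1f} AIMS points")  # 2.0 with these inputs
```

Swapping in the real standard deviations would change the number; this only illustrates the kind of calculation behind point estimates such as the 1- to 2-point figure above.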

What Do These Results Mean?

Possibility #1: Acuity use doesn’t have much of an impact on achievement.
• Small, positive associations with elementary achievement in both 2010 and 2011.
• Teachers report that the system is useful and that it helps their instruction.

We don’t think the data support this conclusion.

What Do These Results Mean?

Possibility #2: More time is needed to significantly impact achievement in MPS.
• Three years may not be enough to build a critical mass of teacher skill in using Acuity, and thus to realize subsequent achievement benefits.

Possibility #3: Teachers’ Acuity use needs to move to the next level.
• Acuity use hasn’t changed much.
• Virtually the same type of Acuity use as in 2010, and virtually the same associations with achievement.

Recommendations

To take Acuity’s impact to the next level, we recommended that MPS focus on these areas:
• Explore how and when teachers’ notions about data use are compatible with Acuity.
• Highlight new areas of compatibility between Acuity resources and teachers’ acknowledged data needs.

Source: Cho & Wayman (2012)

Contact & Follow-Up Information

If you have questions about the research conducted for this presentation, please contact Jeff or Shana.

•jwayman@austin.utexas.edu

•shana.shaw@sedl.org

For more information on this and similar studies conducted by these researchers, please go to:

•http://edadmin.edb.utexas.edu/datause/

•http://www.facebook.com/datause

References

Carlson, D., Borman, G. D., & Robinson, M. (2011). A multistate district-level cluster randomized trial of the impact of data-driven reform on reading and mathematics achievement. Educational Evaluation and Policy Analysis, 33(3), 378–398.

Cho, V. & Wayman, J. C. (2012, April). Districts' efforts for data use and computer systems. Paper presented at the 2012 Annual Meeting of the American Educational Research Association, Vancouver, Canada.

Henderson, S., Petrosino, A., Guckenburg, S., & Hamilton, S. (2007). Measuring how benchmark assessments affect student achievement (Issues & Answers Report, REL 2007 No. 039). Washington, DC: U.S. Department of Education, Institute of Education Sciences.

May, H., & Robinson, M.A. (2007). A randomized evaluation of Ohio’s Personalized Assessment Reporting System (PARS). Madison, WI: Consortium for Policy Research in Education.

Quint, J., Sepanik, S., & Smith, J. (2008). Using student data to improve teaching and learning: Findings from an evaluation of the Formative Assessments of Students Thinking in Reading (FAST-R) program in Boston elementary schools. New York: MDRC.

Raudenbush, S. W., & Bryk, A. S. (2002). Hierarchical linear models: Applications and data analysis methods (2nd ed.). Thousand Oaks, CA: Sage.

Slavin, R. E., Holmes, G., Madden, N. A., Chamberlain, A., Cheung, A., & Borman, G. D. (2010). Effects of a data-driven district-level reform model (Working paper). Baltimore, MD: Center for Data-Driven Reform, Johns Hopkins University.

Wayman, J. C., Cho, V., & Shaw, S. (2009). Survey of Educator Data Use. Unpublished document.