Student Assessment and Data Analysis Oakland Schools MAEDS 2005 Tammy L. Evans.


Student Assessment and Data Analysis

Oakland Schools

MAEDS 2005
Tammy L. Evans

MAEDS, 6 October 2005

Superintendents ask…

Why are educators so fired up about data?

How do we know if teachers are teaching our curriculum?

How do we maximize the value of dollars spent for assessment and data management?

Are all of our students achieving at acceptable levels?


Professional learning communities ask

What is it we want our students to know and be able to do?

How will we know when they have learned it?

What will we do when students are not learning?


Why are educators so fired up about “data”?

Improving Student Achievement!


Creating some common language about data in schools

What are the major systems?

How are they related?

What have districts done?

Where do we want to go?


4 Major Data & Technology Systems in Schools

Assessment systems
Student information systems
Data analysis systems
Data warehouse

Oakland Schools’ focus is on assessment and analysis. (See Data Warehouse PP on CD.)


SAS-DAT Purpose: Student Assessment System & Data Analysis Tool

Improve teaching and increase learning for all

Useful reports for teachers, principals and district administration

Common assessments tied to GLCEs
Item banks tied to GLCEs
Multiple district on-ramps


What is an Assessment System?

Tool for gathering achievement information

It assesses what is going on in classrooms.


Who needs what data?

Large grain size – administrators, public, legislators:
– Evaluation
– Accountability
– Long-range planning
e.g., What percent met standards on 4th grade MEAP math? Are students doing better this year than they were doing last year?

Fine grain size – teachers, parents, students:
– Diagnosis
– Prescription
– Placement
– Short-range planning
– Very specific achievement information
e.g., Who understood this concept? Why is Becky having trouble reading?

A single assessment cannot meet all needs.
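The same assessment records can serve both audiences. A minimal sketch, with invented student names, concepts, and thresholds (none of these come from the actual SAS-DAT system), of answering one large-grain and one fine-grain question from a single set of item-level scores:

```python
# Hypothetical item-level records: one row per student per test item,
# tagged with the concept the item measures. All values are illustrative.
records = [
    {"student": "Becky", "concept": "fractions", "correct": False},
    {"student": "Becky", "concept": "fractions", "correct": False},
    {"student": "Jamal", "concept": "fractions", "correct": True},
    {"student": "Jamal", "concept": "place value", "correct": True},
    {"student": "Becky", "concept": "place value", "correct": True},
]

# Large grain (administrators): what percent of items were answered correctly?
pct_correct = 100 * sum(r["correct"] for r in records) / len(records)

# Fine grain (teachers): which students struggled with a specific concept?
struggling = sorted({
    r["student"] for r in records
    if r["concept"] == "fractions" and not r["correct"]
})

print(f"{pct_correct:.0f}% of items correct")   # the accountability view
print("Needs help with fractions:", struggling)  # the diagnostic view
```

The point is that grain size is a reporting choice, not a data choice: the fine-grain records can always be rolled up, but a large-grain summary cannot be drilled back down.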


Oakland Schools’ Path to Student Achievement

Fall 2004 – Meetings with focus groups, create RFP

Oct 2004 – Meeting with Assessment, Curriculum and Technology directors from Oakland districts to discuss requirements, including multiple “on ramps”

June 2005 deadline


The RFP

Input gathered from LEA focus groups in Curriculum, Assessment, Instruction and Technology.

RFP authored at Oakland Schools through a collaboration between Career Focused Education, Learning Services, Research Evaluation and Assessment, Purchasing, School Quality and Technology Services.

Draft copy provided to LEA Technology and Assessment Directors for input.

Click here for details of the RFP
Click here for details of the vendor pricing submitted


The Committee

OCSA charged Oakland Schools and LEAs to move forward on acquisition of an assessment and analysis system.

The RFP evaluation committee was formed, consisting of ISD and LEA staff representing Assessment, Curriculum and Technology.

Representatives from OCREAC, Teaching and Learning Council, Oakland County Technology Directors, and the OCSA Instruction & Technology subcommittee.

Committee members were from Berkley, Huron Valley, Lamphere, Lake Orion, Troy, Novi, South Lyon, Walled Lake and West Bloomfield.


ISD Collaboration

Jan 2005 – Oakland Schools and Wayne RESA met to review strategic goals around assessment and data analysis.

A joint RFP was created.
Wayne RESA joined the RFP evaluation committee.
Wayne RESA and Oakland Schools separated scoring and recommendations for individual needs and approvals.


The evaluation begins

10 vendors responded to the RFP.
The committee met to review the responses.
The committee chose three vendors for demonstrations.

Click here for the Debriefing Voting Results.


The demonstrations

• Vendors were asked to cover specific points.
• Half-day demonstrations for each vendor were held at Farmington Training Center on March 10 & 11, 2005.
• All Oakland Schools LEAs were invited to send representatives to the demonstrations.
• Over 100 participants reviewed the products and were asked to complete a survey.
– Click here for the Survey results.


Further evaluation

After the demonstrations, the committee met to discuss the products and created a pros/cons list for each vendor.

Using an audience response system, the group prioritized the functionality of the products and rated each vendor on those functional areas. (see SAS-DAT PP on CD for full presentation.)

Click here for the Functionality Summary


Vendor References

A subcommittee was formed to conduct reference interviews.

Included committee members from Huron Valley, South Lyon, Walled Lake, West Bloomfield and Oakland Schools.

Plato – two references; EduSoft – two references; Pearson – three references.

Click here for the Reference Questions

The reference information was synthesized and presented to the committee on April 11.

Click here for the Reference Call Summary


Further Analysis

Reviewed goals of the RFP
Reviewed priority & ranking from vendor demonstrations
Reviewed vendor reference calls
Reviewed pricing


The Evaluation

Filled out evaluation sheets
– Click here for the Evaluation Form

Results tallied:
– Plato 4680
– EduSoft 4350
– Pearson 5720
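Totals like those above come from summing the committee members' individual evaluation sheets per vendor. A minimal sketch of that tally; the per-sheet scores here are invented for illustration, and only the three vendor names come from the slide:

```python
# Each committee member's evaluation sheet assigns a score per vendor
# (illustrative values, not the actual 2005 sheet data).
sheets = [
    {"Plato": 120, "EduSoft": 110, "Pearson": 150},
    {"Plato": 130, "EduSoft": 115, "Pearson": 140},
    {"Plato": 125, "EduSoft": 120, "Pearson": 145},
]

# Sum every sheet's score into a per-vendor total.
totals = {}
for sheet in sheets:
    for vendor, score in sheet.items():
        totals[vendor] = totals.get(vendor, 0) + score

# The highest total ranks first in the recommendation.
winner = max(totals, key=totals.get)
print(totals)
print("Top-ranked vendor:", winner)
```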


Site Visit

May 4, 2005 – Putnam City Schools, OK
Met with the Curriculum Director and principals to review the product in use.


Facilitated Product Demonstration

May 5, 2005 – Oakland Schools
SAS-DAT Committee members were invited to participate in a test drive of Benchmark and Inform.


Principal’s Dashboard

Key Feature: At school & classroom levels, every bar in a graph links to student names and information.

Pearson Inform helps you target assistance – provide early intervention.


Teacher’s Dashboard

Key Feature: Pearson Inform provides “Concept Analysis” at District, School, and Class Views…

A big help in planning instruction, aligning curriculum, and identifying student needs.
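A concept-analysis roll-up like this amounts to computing per-concept percent correct over progressively narrower slices of the same item-level data. A hedged sketch of that idea; the data model, school and class names, and function are assumptions for illustration, not Pearson Inform's actual internals:

```python
from collections import defaultdict

# Hypothetical district-wide item results, each tagged with school,
# class, and the concept tested (1 = correct, 0 = incorrect).
rows = [
    {"school": "Elm", "klass": "3A", "concept": "fractions", "correct": 1},
    {"school": "Elm", "klass": "3A", "concept": "fractions", "correct": 0},
    {"school": "Elm", "klass": "3B", "concept": "fractions", "correct": 1},
    {"school": "Oak", "klass": "3A", "concept": "fractions", "correct": 0},
]

def pct_by_concept(rows):
    """Percent correct per concept over the given rows."""
    hit, total = defaultdict(int), defaultdict(int)
    for r in rows:
        hit[r["concept"]] += r["correct"]
        total[r["concept"]] += 1
    return {c: 100 * hit[c] / total[c] for c in total}

# District, School, and Class views are just narrower filters
# over the same rows before the same roll-up.
district = pct_by_concept(rows)
school = pct_by_concept([r for r in rows if r["school"] == "Elm"])
klass = pct_by_concept(
    [r for r in rows if r["school"] == "Elm" and r["klass"] == "3A"]
)
```

Because every view is derived from the same item-level rows, a teacher's class view and the superintendent's district view cannot disagree about the underlying results.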


Parent’s / Student’s Dashboard

Pearson Inform’s Parent Access & Family Views


Oakland Schools Support

Models defined to support diverse needs of districts and multiple on-ramps

Monetary support

Curriculum, Item Banks, and Assessments delivered to all districts


The Partnership Created

Benchmark “Lite”
– Host for Oakland Schools’
• Standard curriculum
• Units / Lesson plans
• Assessments
– MCF – Michigan Curriculum Framework
– Common assessments tied to GLCEs
– Item banks tied to GLCEs
– Allows districts to create assessments

Benchmark “Full”
– Administer tests (scan or web based)
– Report scoring

Inform
– Analyzes test responses down to the individual student
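Tying item banks to GLCEs is what makes student-level analysis possible: because each test item maps to a GLCE, a student's raw item responses can be rolled up into per-expectation mastery. A minimal sketch under assumed data shapes; the item IDs, GLCE codes, student names, and `glce_mastery` helper are all hypothetical:

```python
# Hypothetical item bank: each test item is tagged with the GLCE it
# measures (codes shown are illustrative, not actual GLCE assignments).
item_bank = {"Q1": "N.ME.04.01", "Q2": "N.ME.04.01", "Q3": "M.UN.04.01"}

# Raw responses: student -> item -> answered correctly?
responses = {
    "Becky": {"Q1": True, "Q2": False, "Q3": True},
    "Jamal": {"Q1": True, "Q2": True, "Q3": False},
}

def glce_mastery(answers):
    """Fraction of items correct per GLCE for one student's answers."""
    hit, total = {}, {}
    for item, correct in answers.items():
        glce = item_bank[item]
        hit[glce] = hit.get(glce, 0) + correct
        total[glce] = total.get(glce, 0) + 1
    return {g: hit[g] / total[g] for g in total}

# Roll one student's item responses up to per-GLCE mastery.
becky = glce_mastery(responses["Becky"])
print(becky)
```

The same mapping supports the other direction as well: a district building a common assessment can pull items from the bank by GLCE code to guarantee coverage of the expectations being taught.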


Where we are now…

Conversion for 27 of 29 districts
Training
Implementation! August 2005+

Sharing experience with other MI districts.
– Contract allows for state purchase
– Increased participation reduces cost for all


MACUL 2006

Presentation will cover…
Success stories
Lessons Learned
Examples of classroom assessment
Examples of analysis
Website and demonstration


Questions