
Transcript of SMART Data 2011

Page 1: SMART Data 2011

USING SMART2: Focus on NAPLAN

ESL Workshop, August 2011

Page 2: SMART Data 2011

SESSION OVERVIEW:

• Clarifying purposes for using data

• Knowing the data – considering perspectives

• Principles to guide the use of data

• Using SMART2

• Navigating SMART2

• Analysis tools, tips and tricks

• NAPLAN – one part of the picture

Page 3: SMART Data 2011

Purposes of data

Providing evidence for:

• assessing and confirming the levels of student achievement

• diagnosing areas for development of students’ knowledge, understanding and skills

• school program evaluation

• school planning, self-evaluation and reporting

• assessing school performance

Page 4: SMART Data 2011

Data analysis maxims

• No judgements without context

• No excuses without reflection

• Data raises more questions than it provides answers

• Raw data is a piece of the puzzle

• The more pieces, the more reliable the picture

• Be cautious of generalisations for small cohorts

• Don’t over-interpret the evidence.

Page 5: SMART Data 2011

Know your data well! NAPLAN


Page 6: SMART Data 2011

TRIANGULATION

Utilising multiple indicators


Page 7: SMART Data 2011

Know your data well! ESSA


Page 8: SMART Data 2011

[Chart: NAPLAN bands 0–10 for Years 3, 5, 7 and 9 plotted against the NAPLAN score scale (approximately 218–738), with the national minimum standard (NMS) band marked for each year level – Band 2 for Year 3, Band 4 for Year 5, Band 5 for Year 7 and Band 6 for Year 9 – and example percentages of students in selected bands.]

What are our NAPLAN . . .

• expected growth data?

• average growth data?

• scale means?

• PIBs (percentages in bands)?

Page 9: SMART Data 2011

[Chart: blank NAPLAN band chart – bands 0–10 for Years 3, 5, 7 and 9 against the score scale (approximately 218–738).]

What are our NAPLAN data characteristics?

Page 10: SMART Data 2011

[Chart: percentage of students in each NAPLAN band for Years 3, 5, 7 and 9 – 2010 Reading data, SMART Demo school (Year 9 data includes a three-year average).]

What are our NAPLAN data characteristics?
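To make the percentages-in-bands idea concrete, here is a minimal Python sketch. It assumes band cut-points read off the chart axis above (270, 322, …, 738); these cut-points and the sample scores are illustrative only, not official NAPLAN values.

    # Band cut-points approximated from the chart axis above (upper edge of bands 1-10).
    # Illustrative only - not official NAPLAN band boundaries.
    BAND_CUTS = [270, 322, 374, 426, 478, 530, 582, 634, 686, 738]

    def band_for(score):
        # Return the band (1-10) that a NAPLAN scale score falls into.
        for band, upper in enumerate(BAND_CUTS, start=1):
            if score <= upper:
                return band
        return len(BAND_CUTS)  # scores above the top cut stay in the highest band

    def percentage_in_bands(scores):
        # Percentage of students in each band, as in a 'percentages in bands' view.
        counts = {}
        for s in scores:
            b = band_for(s)
            counts[b] = counts.get(b, 0) + 1
        return {b: round(100 * n / len(scores), 1) for b, n in sorted(counts.items())}

    # Hypothetical Year 5 reading scale scores for a small cohort
    year5_reading = [402, 455, 471, 498, 503, 519, 540, 566, 481, 433]
    print(percentage_in_bands(year5_reading))
    # -> {4: 10.0, 5: 30.0, 6: 40.0, 7: 20.0}

Note the maxim about small cohorts: with only a handful of students, a single child can shift a band percentage by ten points or more.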

Page 11: SMART Data 2011

The NAPLAN framework

Page 12: SMART Data 2011

The NAPLAN framework


Page 13: SMART Data 2011

Using SMART

What can we use in SMART?

• WHOLE SCHOOL LEVEL – focus on mean trends, percentages in bands, and growth.

• CLASSROOM TEACHER LEVEL – focus on individual performance in bands, individual growth, and item analysis.
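As a rough illustration of the growth measures mentioned above, the Python sketch below computes each matched student's growth between two NAPLAN tests and the group's average growth. The student names, scores and the flat expected-growth figure are hypothetical placeholders, not real SMART values.

    # Hypothetical matched reading scale scores for the same students in Year 3 and Year 5.
    year3 = {"student_a": 410, "student_b": 356, "student_c": 438, "student_d": 392}
    year5 = {"student_a": 486, "student_b": 451, "student_c": 497, "student_d": 455}

    # Growth = later scale score minus earlier scale score, per matched student.
    growth = {s: year5[s] - year3[s] for s in year3 if s in year5}
    average_growth = sum(growth.values()) / len(growth)

    # SMART reports an expected growth value per student; this flat figure is an
    # assumed placeholder for the sketch.
    EXPECTED_GROWTH = 80

    for student, g in sorted(growth.items()):
        flag = "at/above expected" if g >= EXPECTED_GROWTH else "below expected"
        print(f"{student}: growth {g} ({flag})")
    print(f"Average growth for the group: {average_growth:.1f}")

In SMART itself the expected-growth comparison is made per student from the published figures; the point here is simply the arithmetic behind "average growth" and "expected growth".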

Page 14: SMART Data 2011

Tracking achievement

[Chart: NAPLAN band chart (bands 0–10 for Years 3, 5, 7 and 9, with NMS bands marked, against the score scale) shown alongside school-based assessment and external measures – ESSA, the School Certificate (SC) and the HSC – as sources for tracking achievement.]

Page 15: SMART Data 2011

Connections

[Diagram: syllabus, T&L program and internal assessment shown as connected elements.]

Page 16: SMART Data 2011

Valuing school-based data


Page 17: SMART Data 2011

Strengthening connections

[Diagram: syllabus, T&L program and internal assessment, now linked to external assessment.]

Page 18: SMART Data 2011

Data available – school-based and external, academic and non-academic:

Student records
Teacher assessments
Faculty/Stage records
Student reports
A-E tracking
Syllabus outcome records
Assessment results
ESL scales
Attendance norms
Retention norms
Course participation
Participation in competitions
NAPLAN results
ESSA results
SC / HSC results
Best Start
Growth/Value added
[Electronic – SMART & BOS RAP]
Local-State-National competitions
Early school leavers
Post school destinations
Merit award records
Student mobility
Attendance rates
Retention rates
Behaviour records
Suspension rates
Surveys
Focus Groups

Concept credit: Eric Jamieson

Page 19: SMART Data 2011

Some key ideas, goals and principles:

• Connect and align syllabuses, T&L programs, and internal and external assessment

• Identify strengths, and deeply investigate areas where we can make the biggest improvement

• View and use data over time

• Describe targets in terms of student outcomes

• Identify starting points (baselines) for T&L and for measuring improvements

• Use multiple indicators

Page 20: SMART Data 2011

Towards a balanced picture

Qualitative data – Quantitative data

SSG comparisons – State comparisons

Student level data – School level data

School-based data – External data

Social data – Academic data

Areas for improvement – Strengths

Page 21: SMART Data 2011

Educational Measurement and School Accountability Directorate (EMSAD)

Page 22: SMART Data 2011


SURFING the MENU

Page 23: SMART Data 2011


Action/Stage plan for Yr #

• These overall results of students’ achievement are/are not consistent with our in-school assessment results

• Results indicate that school groupings are/are not consistent with student achievement

• To improve the literacy achievement of Yr # students, the action/stage plan needs to be amended to include strategies to support: high achievers, low achievers, boys, girls, ATSI, LBOTE, ESL

• These groups have/have not been identified in the School Targets for 2009.

Page 24: SMART Data 2011


Programming

• The students’ results indicated for these items are/are not consistent with our in-school assessment results.

• The structure of our stage/faculty programs needs to be evaluated in relation to the following: Grammar and Punctuation, Reading, Writing

• These items of learning have/have not been identified in the School Targets for 2010.

Page 25: SMART Data 2011


Teaching Strategies
https://detwww.det.nsw.edu.au/directorates/schoimpro/EMD/naplan/pubs/Naplan08CL/index.htm

Page 26: SMART Data 2011


Essential Support Documents 7-12

• Curriculum Support Literacy 7-12
http://www.curriculumsupport.education.nsw.gov.au/secondary/english/index.htm

• ELLA Publications
https://detwww.det.nsw.edu.au/directorates/schoimpro/EMD/ella_publications.htm

• Curriculum Support Mathematics 7-12
http://www.curriculumsupport.education.nsw.gov.au/secondary/mathematics/index.htm

• SNAP Publications
https://detwww.det.nsw.edu.au/directorates/schoimpro/EMD/snap_publications.htm

Page 27: SMART Data 2011


Focus Learning Areas – Implications

• The students’ results related to key learning areas are/are not consistent with our in-school assessments.

• The content of our stage/faculty program needs to be evaluated in relation to the following key learning areas:

Page 28: SMART Data 2011


Reading Skills

• The students’ results in relation to literacy skills are/are not consistent with our in-school assessments.

• Teaching strategies and classroom practice need to be evaluated in relation to the following:

Page 29: SMART Data 2011


Programming implications for Reading

Our results show that Yr # students are strong in these areas:

Further skills development is needed in these areas:

Page 30: SMART Data 2011


Possible Considerations

• Consider whole school plans/strategies

• Use the NAPLAN stimulus magazine as a teaching resource; identify areas of student interest

• Investigate teaching strategies within SMART and school-based strategies

• Plan across-curriculum units of work with a focus on skills needing attention, e.g. punctuation

• Consult and investigate professional learning networks, curriculum support and Regional support

• NAPLAN questions – examine the type of question, e.g. in Year 9, identifying grammar, punctuation and spelling errors in a piece of text. Use this as a classroom teaching/assessment strategy.

Page 31: SMART Data 2011

Acknowledgments

• Eric Jamieson, Rel/Director

• Gerry McCloughan, Ass/Director EMSAD

• Dr Geoff Barnes, EMSAD