Introduction to CEM Secondary Pre-16 Information Systems
Nicola Forster & Neil Defty
Secondary Systems Programme Managers
London, June 2011
Ensuring Fairness
Principles of Fair Analysis:
1. Use an Appropriate Baseline
2. Compare ‘Like’ with ‘Like’
3. Reflect Statistical Uncertainty
CEM Systems
Principle of Fair Analysis No. 1: Use an Appropriate Baseline
The Projects

• MidYIS – Years 7, 8 (+ additional) and 9: Computer Adaptive Baseline Test or paper-based test; outcomes at KS3
• INSIGHT – Year 9: combines curriculum tests with developed ability
• Yellis – Years 10 and 11: Computer Adaptive Baseline Test or paper-based test; outcomes at GCSE (KS4)
• Alis – Years 12 and 13: GCSE / Computer Adaptive Baseline Test (paper test also available); outcomes at A / AS / BTEC / IB etc.
The Assessments

MidYIS
• Year groups: 7, 8, 9
• When: Term 1 + catch-ups
• Delivery: paper or computer adaptive, 1 session
• Includes: Developed Ability – Vocabulary, Maths, Non-verbal, Skills

Yellis
• Year groups: 10, 11
• When: Term 1 + catch-ups
• Delivery: paper or computer adaptive, 1 session
• Includes: Developed Ability – Vocabulary, Maths, Non-verbal

INSIGHT
• Year group: 9
• When: 4-week testing window, mid-April to mid-May, + catch-ups
• Delivery: computer adaptive, 3 or 4 sessions
• Includes: curriculum-based Reading, Maths and Science, + Attitudes & Developed Ability
The Assessments

INSIGHT – Curriculum-based Assessment
• Maths: Number & Algebra; Handling Data; Space, Shapes & Measures
• Science: Biology, Chemistry, Physics (+ Attitudes to Science)
• Reading: Speed Reading, Text Comprehension, Passage Comprehension

Additionally for INSIGHT:
• Developed Ability: Vocabulary, Non-verbal, Skills
• Attitudinal measures
What is Computer Adaptive Assessment?
• Questions adapt to the pupil
• Efficient
• No time wasting
• Wider ability range
• More enjoyable
• Green
Computer-adaptive vs Paper-based Testing

Number of students per test session
• Computer adaptive: limited by the number of computers available – multiple testing sessions
• Paper-based: can test all students in a single session (in hall or in form groups) or in more than one session

Cost
• Computer adaptive: roughly 30% cheaper than the paper-based test
• Paper-based: standard cost

Processing of baseline feedback
• Computer adaptive: baseline feedback available within a couple of hours of testing
• Paper-based: takes around 2–4 weeks for papers to be marked

Preparation
• Computer adaptive: must be able to install the software or access the internet version of the test
• Paper-based: no pre-test set-up

Student experience
• Computer adaptive: "tailored" assessment
• Paper-based: all students see all questions, irrespective of suitability
The Analysis

[Scatter plot: Subject X – outcome (GCSE points, 0–8) against baseline score (70–130), with a fitted line through the points]

• Regression line (trend line, line of best fit)
• Outcome = gradient × baseline + intercept
• Correlation coefficient (~0.7)
• Residuals: vertical distances from the line – above the line is positive value added (+ve VA), below is negative (-ve VA)
• Fitted by linear least-squares regression
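The fit described above can be sketched in a few lines. The baseline and outcome values below are invented for illustration; they are not CEM data.

```python
import math

def fit_line(xs, ys):
    """Ordinary least-squares fit: returns (gradient, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    gradient = sxy / sxx
    return gradient, my - gradient * mx

def correlation(xs, ys):
    """Pearson correlation coefficient between xs and ys."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

# Invented example: standardised baseline scores vs GCSE points achieved
baseline = [85, 92, 100, 104, 111, 120]
outcome = [3.0, 4.0, 4.5, 5.5, 5.0, 7.0]

g, c = fit_line(baseline, outcome)
# Residual = achieved - predicted: +ve VA above the line, -ve VA below
residuals = [y - (g * x + c) for x, y in zip(baseline, outcome)]
```

With an intercept in the model the residuals always sum to zero, which is why value added is judged against the trend line rather than the raw mean.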
Making Predictions

[Scatter plot: Subject X – the fitted line converts a baseline score into a predicted outcome]

• Baseline: e.g. MidYIS, INSIGHT, Yellis standardised scores
• Outcome: e.g. GCSE grades (A*, A, B, C, D, E, F, G, U)
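A prediction is just the regression line applied to a pupil's baseline score, with the resulting point score mapped back to a grade. The gradient and intercept below are invented placeholders, not CEM's national coefficients.

```python
# Grade points: U=0, G=1, F=2, E=3, D=4, C=5, B=6, A=7, A*=8
GRADES = ["U", "G", "F", "E", "D", "C", "B", "A", "A*"]

def predict_points(baseline_score, gradient=0.09, intercept=-4.0):
    """Outcome = gradient * baseline + intercept (illustrative coefficients)."""
    return gradient * baseline_score + intercept

def nearest_grade(points):
    """Map a predicted point score to the closest whole grade."""
    return GRADES[max(0, min(8, round(points)))]
```

With these made-up coefficients, a pupil on the national mean score of 100 would be predicted 5.0 points, i.e. a grade C.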
Some Subjects are More Equal than Others…

Principle of Fair Analysis No. 2: Compare 'Like' with 'Like'

[Figure: trend segments for 2009 GCSEs, drawn over 68% of the Yellis intake (mean ± 1 standard deviation). X-axis: Yellis test score (%); Y-axis: GCSE point score (A* = 8, A = 7, B = 6, …). Subjects shown: English, Maths, Art & Design, Geography, Biology, French, Business Studies, Core Science, Additional Science, Additional Applied Science]

At the same baseline score there can be up to 1.5 grades' difference between subjects!
The Assessments
Developed Ability - Maths
Developed Ability - Vocabulary
Developed Ability - Non-verbal
Developed Ability - Skills
INSIGHT - Maths
INSIGHT - Science
INSIGHT - Reading
Baseline Assessment and Predictive Feedback
Baseline Feedback
Nationally-Standardised Feedback
• How did your pupils perform on the assessment?
• What strengths and weaknesses do they have?
• As a group, how able are they?

Predictions
• Given their performances on the test, how well might they do at KS3 or GCSE?
Baseline Feedback
Feedback can be used at the pupil, class and cohort level:
• to guide individuals
• to monitor pupil progress
• to monitor subject-wide and department-level progress

It serves classroom teachers, head teachers and SMT as a quality-assurance tool. Data can also be aggregated at other levels; we support and provide software tools (e.g. the Paris software) to help schools do this.
Baseline Feedback – Test Scores
• National mean = 100, standard deviation = 15
• 4 performance bands: A, B, C, D
• Averages & band profiles for the cohort
• 95% of scores lie between 70 and 130
• No ceiling at 130+ or floor at 70
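The standardisation above can be sketched as follows. The four-way band split is an illustrative quartile interpretation of a Normal(100, 15) distribution, not CEM's published boundaries.

```python
import statistics

def standardise(raw_scores):
    """Rescale raw scores to mean 100, standard deviation 15.
    Mean +/- 2 SD then gives the 70-130 range holding ~95% of scores."""
    mean = statistics.mean(raw_scores)
    sd = statistics.pstdev(raw_scores)
    return [100 + 15 * (score - mean) / sd for score in raw_scores]

def band(score):
    """Assign one of four bands, A = top. Boundaries are assumed
    quartiles (roughly 90 and 110), not CEM's specification."""
    if score >= 110:
        return "A"
    if score >= 100:
        return "B"
    if score >= 90:
        return "C"
    return "D"
```

Note there is no clipping at 70 or 130: the scale has no artificial floor or ceiling, so very able pupils can still be distinguished from each other.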
Baseline Feedback – Band Profile Graphs
Baseline Feedback – Gifted Pupils
Standardised Test Score
Baseline Feedback – Individual Pupil Record Sheets (IPRs)
Predictive Feedback
Predictions = the average performance by similar pupils in past examinations.
Predictive Feedback

Percentage chance of each GCSE grade in English Language, by baseline band (in the feedback each band's row appears as a bar chart, with percent on the vertical axis):

Grade    U    G    F    E    D    C    B    A    A*
Band D   2   10   23   31   24    9    1    0    0
Band C   1    2    6   20   35   29    7    1    0
Band B   0    0    2    7   24   40   21    5    0
Band A   0    0    0    1    8   26   35   23    7
Predictive Feedback – Chances Graphs
Predictive Feedback – Individual Chances Graphs
• 30% chance of a grade D – the most likely single grade; 70% chance of a different grade
• Point prediction = 3.8
• Chances graphs are based on the pupil's actual test score, NOT the band
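A point prediction is the probability-weighted average of the grade points in a chances graph. The sketch below applies that to the band-D English Language chances shown earlier; a pupil's real point prediction (such as the 3.8 above) comes from the chances for their exact test score, not their band.

```python
# Chances (%) of each grade, as in the band-D English Language graph
chances = {"U": 2, "G": 10, "F": 23, "E": 31, "D": 24,
           "C": 9, "B": 1, "A": 0, "A*": 0}
# Grade points: U=0, G=1, ..., A*=8
points = {"U": 0, "G": 1, "F": 2, "E": 3, "D": 4,
          "C": 5, "B": 6, "A": 7, "A*": 8}

# Point prediction: probability-weighted average of the grade points
point_prediction = sum(chances[g] * points[g] for g in chances) / 100

# The single most likely grade (its chance is still well under 50%)
most_likely = max(chances, key=chances.get)
```

For this band the most likely single grade is E at 31%, yet there is a 69% chance of some other grade, which is exactly why a full chances graph is more informative than one target grade.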
Chances Graphs
• The chances graphs show that, from almost any baseline score, students can end up with almost any grade; there are just different probabilities for each grade depending on the baseline score.
• In working with students, these graphs are more useful than a single predicted or target grade.
• Chances graphs serve as a warning for top-scoring students and a motivator for low-scoring students.
Value Added Feedback
For each subject, the analysis answers two questions:
• Given their abilities, have pupils done better or worse than expected?
• Can we draw any conclusions at the department level?
Value Added Feedback
For each pupil in each subject:
• Raw residual = achieved − predicted
• Standardised residual – allows fair comparison between different subjects and years

At subject level:
• Confidence bounds are narrower with more pupils
• If the average standardised residual lies within the bounds, you cannot draw any conclusions
• If the average standardised residual lies outside the bounds, you can be confident that something significant is happening in that subject
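A minimal sketch of that subject-level test, assuming standardised residuals have SD of roughly 1, so the 95% and 99.7% bounds on the average shrink as 2/sqrt(n) and 3/sqrt(n):

```python
import math

def subject_signal(standardised_residuals):
    """Compare the average standardised residual for a subject with
    confidence bounds that narrow as the number of pupils (n) grows."""
    n = len(standardised_residuals)
    avg = sum(standardised_residuals) / n
    bound_95 = 2 / math.sqrt(n)    # ~2 SDs of the mean
    bound_997 = 3 / math.sqrt(n)   # ~3 SDs of the mean
    if abs(avg) > bound_997:
        return "outside 99.7% bounds: something significant"
    if abs(avg) > bound_95:
        return "outside 95% bounds: likely significant"
    return "within bounds: no conclusion"
```

With 100 pupils the 95% bound is 0.2 of a standardised grade, so an average residual of +0.1 proves nothing, while +0.5 is a clear signal.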
Burning Question: What is my Value-Added Score?
Better Question: Is it Important?
Principle of Fair Analysis No. 3: Reflect Statistical Uncertainty
Value Added Feedback
Value Added Feedback – Scatter Plot

[Scatter plot: GCSE English – GCSE points equivalent against baseline score]

Look for patterns:
• General under- or over-achievement?
• Do any groups of students stand out?
  – high ability vs low ability?
  – male vs female?
Value Added Feedback – Year 7 Pupil-Level Residuals to GCSE
Value Added Feedback – Standardised Residuals Graph
Standardised Residuals shown with confidence limits at 2 (95%) and 3 (99.7%) standard deviations
Standardised Residuals can be compared fairly between subjects and over years
Value Added Feedback – Statistical Process Control (SPC) Chart
Subject: X
Attitudinal Surveys
Attitudinal Feedback
Your data is above the average
Your data is below the average
Your data is about the same as the average
Attitudinal Feedback