CAN Conference TESTA Programme Assessment


Workshop using TESTA data from many UK programmes to show how modular programme design may have unintended consequences for student learning.


Zoom to wide-angle lens: a programme approach to assessment and feedback

Dr Tansy Jessop

TESTA Project Leader

University of Winchester

CAN Conference: 18 and 19 February 2014

HEA-funded research project (2009-12)

Seven programmes in four partner universities

Maps programme-wide assessment

Evidence-informed approach

About TESTA: Transforming the Experience of Students through Assessment

TESTA ‘Cathedrals Group’ Universities

Edinburgh

Edinburgh Napier

Greenwich

Canterbury Christ Church

Glasgow

Lady Irwin College, University of Delhi

University of the West of Scotland

Sheffield Hallam

TESTA

“…is a way of thinking about assessment and feedback”

Graham Gibbs

Time-on-task

Challenging and high expectations

Internalising goals and standards – ‘self-regulation’

Prompt feedback

Detailed, high quality, developmental feedback

Dialogic cycles of feedback

Deep learning

Based on assessment principles

TESTA Research Methods (drawing on Gibbs and Dunbar-Goddet, 2008, 2009)

ASSESSMENT EXPERIENCE QUESTIONNAIRE

FOCUS GROUPS

PROGRAMME AUDIT

Programme Team Meeting

What’s going on?

What’s going wrong?

What’s working?

Definitions:

Formative assessment

Summative assessment

TESTA Case Studies

Case Study X: what’s going on?

Mainly full-time lecturers

Plenty of varieties of assessment, no exams

Reasonable amount of formative assessment (14x)

33 summative assessments

Masses of written feedback on assignments (15,000 words)

Learning outcomes and criteria clearly specified

…looks like a ‘model’ assessment environment

But students:

Don’t put in a lot of effort, and distribute their effort across few topics

Don’t think there is a lot of feedback or that it is very useful, and don’t make use of it

Don’t think it is at all clear what the goals and standards are

…are unhappy

Case Study Y: what’s going on?

35 summative assessments

No formative assessment specified in documents

Learning outcomes and criteria wordy and woolly

Marking by global, tacit, professional judgements

Teaching staff mainly part-time and hourly paid

…looks like a problematic assessment environment

But students:

Put in a lot of effort and distribute their effort across topics

Have a very clear idea of goals and standards

Are self-regulating and have a good idea of how to close the gap

Teach less, learn more

Transmission model: the expert, full of knowledge, transmits or ‘pours’ knowledge and information into the mug of the novice, whose head is empty.

Workshop 1: Looked at disciplinary statistics from Programme Audit data and discussed the implications in pairs.

Workshop 2: Looked at student voice data on four themes, presented on A3 sheets, and identified problems and solutions in pairs. The themes and following slides reflect the workshop materials.

Workshop Elements

Eight Humanities degrees in five universities: the typical student experience of A&F over three years

Category                    Theology  History  History  Philosophy  Politics  English  American Studies  Media
Total assessments                 49       50       40          72        52       54                63     53
Summative                         45       45       39          47        49       26                52     34
Formative                          4        5        1          25         3       28                11     19
Variety                            9       17        7           7        13       10                13     14
Exam %                          17.7       20        5           0      26.5     23.5               9.6   11.7
Time to return (days)             21       21       22          35        14       26                21     21
Oral feedback (minutes)          175      295      290         135        59       50               212    359
Written feedback (words)       7,378    4,995    5,920       3,060     7,527   11,865            10,972  7,344

Audit data: Workshop 1

1) Any interesting patterns?

2) Anything particularly striking?

3) Any dangling questions, curiosities, scepticisms?

4) Any predictions, hunches, thoughts about what other data might throw up?

TESTA Audit data

Challenges and solutions

Workshop 2: Student voice data

If there weren’t loads of other assessments, I’d do it.

If there are no actual consequences of not doing it, most students are going to sit in the bar.

It’s good to know you’re being graded because you take it more seriously.

I would probably work for tasks, but for a lot of people, if it’s not going to count towards your degree, why bother?

Theme 1: Formative is a great idea but…

We could do with more assessments over the course of the year to make sure that people are actually doing stuff.

We get too much of this end or half way through the term essay type things. Continual assessments would be so much better.

So you could have a great time doing nothing until like a month before Christmas and you’d suddenly panic. I prefer steady deadlines, there’s a gradual move forward, rather than bam!

Theme 2: Assessment isn’t driving and helping to distribute effort

The feedback is generally focused on the module.

It’s difficult because your assignments are so detached from the next one you do for that subject. They don’t relate to each other.

Because it’s at the end of the module, it doesn’t feed into our future work.

Theme 3: Feedback is disjointed and modular

Assessment criteria can make you take a really narrow approach.

It’s such a guessing game.... You don’t know what they expect from you.

I don’t have any idea of why it got that mark.

They read the essay and then they get a general impression, then they pluck a mark from the air.

It’s a shot in the dark.

Theme 4: Students are not clear about goals and standards

Assessment Design

Feedback Practice

Paper to people

Smart Changes

www.testa.ac.uk

References

Gibbs, G. & Simpson, C. (2004) Conditions under which assessment supports students’ learning. Learning and Teaching in Higher Education 1(1): 3-31.

Gibbs, G. & Dunbar-Goddet, H. (2009) Characterising programme-level assessment environments that support learning. Assessment & Evaluation in Higher Education 34(4): 481-489.

Hattie, J. & Timperley, H. (2007) The power of feedback. Review of Educational Research 77(1): 81-112.

Jessop, T. & Maleckar, B. (in press) The influence of disciplinary assessment patterns on student learning: a comparative study. Studies in Higher Education.

Jessop, T., El Hakim, Y. & Gibbs, G. (2014) The whole is greater than the sum of its parts: a large-scale study of students’ learning in response to different assessment patterns. Assessment & Evaluation in Higher Education 39(1): 73-88.

Jessop, T., McNab, N. & Gubby, L. (2012) Mind the gap: an analysis of how quality assurance processes influence programme assessment patterns. Active Learning in Higher Education 13(3): 143-154.

Jessop, T., El Hakim, Y. & Gibbs, G. (2011) Research inspiring change. Educational Developments 12(4).

Nicol, D. (2010) From monologue to dialogue: improving written feedback processes in mass higher education. Assessment & Evaluation in Higher Education 35(5): 501-517.

Sadler, D.R. (1989) Formative assessment and the design of instructional systems. Instructional Science 18: 119-144.