Transcript of A Look Ahead for the Multi-State Collaborative to Advance Learning Outcomes Assessment/VALUE Initiative

Page 1

A Look Ahead for the Multi-State Collaborative to Advance Learning

Outcomes Assessment/VALUE Initiative

Terrel L. Rhodes, Vice President for Quality, Curriculum and Assessment and Executive Director of VALUE, AAC&U

Julie Carnahan, Vice President, State Higher Education Executive Officers

Kate McConnell, Senior Director of Research and Assessment, AAC&U

October 17, 2016

Page 2

Why Are We Doing This Work?

Page 3

Purpose and Vision for the Multi-State Collaborative

Change the dialogue currently focused on Access and Completion to one focused on Quality and Success.

Page 4

A Collaborative Vision for Student Learning

Valid Assessment of Learning in Undergraduate Education

Launched in 2007

Campus-based

Authentic assessment of student work

Privileges role/importance of faculty as authors of assignments and arbiters of quality

An advocate for state policy leadership

A liaison between states and the federal government,

A vehicle for learning from and collaborating with peers,

A source of information and analysis on educational and public policy issues.

Emphasis on success

Page 5

Taking the Vision to Scale in Twelve States

MSC Participants: CT, HI, IN, KY, MA, ME, MO, MN, OR, RI, TX, and UT, plus the GLCA and the Minnesota Collaborative

Steering Committee: state point persons from each state plus representatives from SHEEO and AAC&U

Institution Point Persons: a person from each campus in each state

Goals:
- Root assessment of learning in authentic work and the expertise of faculty
- Establish benchmarks for essential learning outcomes
- Develop transparency of shared standards of learning to assist with transfer

Page 6

The Current VALUE/Multi-State Collaborative Project

• Purpose
– Sea Change in Assessment
– Reliability
– Validity
– Local value
– Policy debate = learning

Participating collaboratives: Multi-State Collaborative (MSC), Minnesota Collaborative (MN), Great Lakes Colleges Association (GLCA)

Page 7

Multi-State Collaborative (MSC) to Advance Learning Outcomes Assessment

Demonstration Year

Page 8

MSC Demonstration Year by the Numbers (2015-16)

• MSC states: Connecticut, Hawaii, Indiana, Kentucky, Maine, Massachusetts, Minnesota, Missouri, Oregon, Rhode Island, Texas, Utah

• 79 public institutions uploaded artifacts

• By sector:
– 37 four-year, including 8 research institutions
– 42 two-year

These results are not generalizable across participating states or the nation in any way. Please use appropriately.


Page 9

MSC Demonstration Year by the Numbers

• 10,948 pieces of student work were submitted [number of pieces of work approximates number of student participants]

– Students had to be 75% of the way to completion of institutional degree requirements

– 3,031 artifacts scored twice (28%) in order to measure inter-rater reliability

• 1,573 assignments were submitted [number of assignments approximates number of faculty participants]

• More than 190 scorers

• Three rubrics – critical thinking, quantitative literacy, and written communication – plus an optional civic engagement rubric

These results are not generalizable across participating states or the nation in any way. Please use appropriately.


Page 10

VALUE Rubric Approach Assumptions

• Learning is a process that occurs over time

• Student work is the best representation of students' motivated learning

• Focus on what the student does in terms of key dimensions of learning outcomes

• Faculty and educator expert judgment

• Results are useful and actionable for improvement of learning (and accountability)


Page 11

Inherent Challenge for VALUE: Navigating Methodological Complexity

Page 12

LEARNING OUTCOMES

• State and Sector Coverage of Outcomes
• Mean Scores by Learning Outcome
• Score Distribution by Criterion
• State Comparisons for Overall Mean Scores

Page 13

VALUE & Validity

Page 14

90% of participating faculty believed that the VALUE rubric was a useful tool for evaluating the quality of student work

VALUE & Validity in the Pilot Year

Page 15

>80%:

Sufficient range

Descriptors were understandable

Descriptors were relevant for making judgments about levels of learning

VALUE Rubrics

Page 16

~75%: Dimensions encompassed the core meaning of the learning outcome

VALUE Rubrics

Page 17

DEMONSTRATION YEAR PRELIMINARY DATA

Page 18

CRITICAL THINKING – 2 YEAR
[Chart: score distribution; vertical axis 0%–50%]

Page 19

CRITICAL THINKING – 4 YEAR
[Chart: score distribution; vertical axis 0%–50%]

Page 20

Page 21

QUANTITATIVE LITERACY – 2 YEAR
[Chart: score distribution; vertical axis 0%–50%]

Page 22

QUANTITATIVE LITERACY – 4 YEAR
[Chart: score distribution; vertical axis 0%–50%]

Page 23

WRITTEN COMMUNICATION – 2 YEAR
[Chart: score distribution; vertical axis 0%–50%]

Page 24

WRITTEN COMMUNICATION – 4 YEAR
[Chart: score distribution; vertical axis 0%–50%]

Page 25

• Methodological

• Philosophical

• Pedagogical

Nature, Implications of Complexity

Page 26

Quantitative Terms → Qualitative Terms
Validity → Credibility, Transferability
Reliability → Dependability, Confirmability

VALUE

From "Qualitative Analysis on Stage: Making the Research Process More Public" by Vincent A. Anfara, Jr., Kathleen M. Brown, & Terry L. Mangione, Educational Researcher, October 2002, vol. 31, no. 7, pp. 28-38.

Page 27

• Comparing the validity & reliability of the VALUE process to standardized tests will always be an “apples” to “oranges” proposition.

• That said, establishing the validity (credibility & transferability) and reliability (trustworthiness & confirmability) of the VALUE process in its own right is a key priority for AAC&U.

Nature, Implications of Complexity

Page 28

Purpose = Discuss Reliability in Relation to Inherent Complexity of VALUE

Scores (rubrics)

Assignments

Scorers

Page 29

Order of Operations for Discussion:

1. Reliability vis-à-vis Demonstration Year Data
• Discussion of preliminary analyses
• Discussion of "in the weeds" plans for investigating reliability and generalizability more deeply

2. Reliability & MSC's Multiple Moving Parts
• Scores
• Scorers
• Assignments

Page 30

• Percent agreement and the intraclass correlation coefficient (ICC) were used to assess agreement beyond chance

• ICC is often used for ordered categorical data (data that "behaves" as ordinal data), like the VALUE rubric scores

• ICC(1) – used when raters are not consistent/fully crossed across all artifacts (a computational sketch follows this slide)

• Most conservative/smallest of the ICC variants

Preliminary Pass at Interrater Reliability
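
To make the ICC(1) calculation described in the bullets above concrete, here is a minimal sketch in Python using only NumPy. It assumes a fully crossed artifacts-by-raters score matrix, which the MSC data are not (artifacts were double scored by varying pairs of raters), so the function icc1 and the example ratings are purely illustrative.

```python
import numpy as np

def icc1(scores: np.ndarray) -> tuple[float, float]:
    """One-way random-effects ICC(1) from an artifacts-by-raters score matrix.

    Returns (single-measure ICC(1,1), average-measure ICC(1,k)).
    Illustrative sketch: assumes every artifact has the same number of scores.
    """
    n, k = scores.shape                       # n artifacts, k scores per artifact
    grand_mean = scores.mean()
    row_means = scores.mean(axis=1)

    # One-way ANOVA mean squares (between artifacts vs. within artifacts)
    ms_between = k * np.sum((row_means - grand_mean) ** 2) / (n - 1)
    ms_within = np.sum((scores - row_means[:, None]) ** 2) / (n * (k - 1))

    icc_single = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
    icc_average = (ms_between - ms_within) / ms_between
    return float(icc_single), float(icc_average)

# Hypothetical example: five artifacts, each scored 0-4 by two raters
ratings = np.array([[2, 3], [1, 1], [3, 4], [0, 1], [2, 2]], dtype=float)
print(icc1(ratings))
```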

Page 31

Commonly Used Cutoffs for Qualitative Ratings of Agreement Based on ICC Values:

<.40 = Poor

.40-.59 = Fair

.60-.74 = Good

.75-1.0 = Excellent

What does the ICC value mean in very “lay” terms?

An ICC(1) Average Measures Score of .449 means that 44.9% of the variance of the mean of the raters is “real” – not simply attributable to chance.
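
As a small illustration, the hypothetical helper below maps an ICC value onto the qualitative labels used on this slide; the thresholds are taken directly from the list above.

```python
def icc_rating(icc: float) -> str:
    """Map an ICC value to the qualitative agreement labels used on this slide."""
    if icc < 0.40:
        return "Poor"
    if icc < 0.60:
        return "Fair"
    if icc < 0.75:
        return "Good"
    return "Excellent"

print(icc_rating(0.449))  # -> "Fair"
```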

Page 32

Start w/ the Big Picture

Page 33

ICC(1) for Average Total Scores

Quantitative Literacy: .59
Written Communication: .63
Critical Thinking: .62
Civic Engagement: .62

Page 34

Scores (rubrics)

Assignments

Scorers

Page 35

• More robust interrogation of reliability data

• Fall 2016 study with Gary Pike (IUPUI): more thorough consideration of IRR and of appropriate tests (e.g., Gwet's AC1; see the sketch after this slide)

• Broader generalizability study

• Look at literal source of scores – rubrics themselves

• Consider the possibility that not all criteria and performance levels are created equal from a reliability perspective

Scores (Rubrics)
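
Gwet's AC1 is named above only as a candidate statistic for the Pike study; the sketch below shows the standard two-rater form of the coefficient for categorical ratings. The function gwets_ac1 and the example scores are illustrative assumptions, not the planned MSC analysis.

```python
from collections import Counter

def gwets_ac1(r1: list, r2: list, categories: list) -> float:
    """Gwet's AC1 chance-corrected agreement for two raters of categorical data.

    Illustrative sketch: r1 and r2 are parallel lists of rubric levels.
    """
    n = len(r1)
    q = len(categories)

    # Observed proportion of exact agreement
    pa = sum(a == b for a, b in zip(r1, r2)) / n

    # Chance agreement based on average marginal proportions of each category
    counts = Counter(r1) + Counter(r2)
    pe = sum((counts.get(c, 0) / (2 * n)) * (1 - counts.get(c, 0) / (2 * n))
             for c in categories) / (q - 1)

    return (pa - pe) / (1 - pe)

# Hypothetical example: two raters scoring eight artifacts on a 0-4 rubric
rater1 = [2, 3, 1, 4, 2, 0, 3, 2]
rater2 = [2, 3, 2, 4, 1, 0, 3, 2]
print(gwets_ac1(rater1, rater2, categories=[0, 1, 2, 3, 4]))
```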

Page 36

• Mine the existing data to identify outliers, more “expert” scorers, disciplinary differences

• Enhanced training (in what ways?)

• Enhanced criteria for participation (how stringent?)

• Different approach to scoring (e.g., scoring to a “4”, etc.)

Scorers

Page 37

• Mine the existing data (part of our planned work for 2016-2017) – what should we be looking at?

• Improve assignment design – how?

• Establish – and enforce? – assignment design parameters. What would that look like?

Assignments

Page 38

A Careful Balancing Act

Page 39

Methodological ↔ Philosophical/Pedagogical

Page 40

Lessons Learned – Overall

Page 41

Lessons Learned

• Increased focus on campus capacity

• Assessment as a high impact practice

• Importance of assignment for demonstration of learning

• Faculty development, collaboration, and engagement for learning

• Increased attention to equity – based on data


Page 42

Lessons Learned - MSC Structure, Organization, Leadership, Process

• SHEEO agency – coordinating role, financial support, outreach to legislators, governor, accreditors, K-12, business leaders, public

• Institutional leadership – Provost, CAO, Assessment & IR Directors, Faculty, champions for the work

• Statewide Assessment Council – voluntary association of experts, inter-institutional communication, technical support, collegial support

• Statewide & campus convenings, assignment design workshops, norming/calibration training


Page 43

Lessons Learned – Campus Capacity and Engagement

Page 44

4-Year Institutions

Learning Outcome          9-State Avg Score   Connecticut Avg Score
Critical Thinking                2.01                 1.99
Quantitative Reasoning           2.33                 2.12
Written Communication            2.53                 2.53

*Rubrics – 4-point scale

Reflections on the Pilot Year

Page 45

(Columns First Year through MSC Total report numbers of artifacts)

Learning Outcome                     Faculty (N)  First Year  Soph  Junior  Senior  CCSU Total  MSC Total
Critical Thinking (33 majors)            12           16       21     58      130      225         119
Quantitative Reasoning (19 majors)        5            0        6     29       82      117          78
Written Communication (28 majors)        13           13       19     62       97      191          87
Grand Total (45 majors – 75% of majors;
  27 faculty – 45% of dept)              27           29       46    140      318      533         283

Reflections on the Pilot Year

Page 46

Questions and Follow-Up

Julie Carnahan, [email protected]

Kate McConnell, AAC&U, [email protected]

Terrel L. Rhodes, AAC&U, [email protected]
