Page 1

Assessing for Program Improvement

Presented at the University of Arizona, February 11, 2009

Victor M. H. Borden
Associate Professor of Psychology (IUPUI)

Associate Vice President for University Planning, Institutional Research, and Accountability (IU)

Page 2

Overview

What I think you think I might talk about

What I think you need to think about

Page 3

How to Assess Programs for Improvement

A range of methods from simple to complex

The Core Idea
Simple models
Quality improvement models
Program review
More complex models

Page 4

The Core Idea: The Planning-Evaluation-Improvement Cycle

Plan → Implement → Assess → Improve (and back to Plan)

Plan → Do → Check → Act (and back to Plan)
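
As a purely illustrative sketch, the plan-implement-assess-improve loop can be expressed in code; the indicators, targets, and improvement increments below are invented for the example and are not from the presentation.

```python
# Illustrative sketch of the plan-implement-assess-improve (PDCA-style) cycle.
# All indicator names and values are hypothetical; the point is the loop structure:
# each pass plans a change, tries it, checks the evidence, and decides whether to continue.

def plan(current_results, target):
    """Pick the outcome with the largest gap to its target."""
    gaps = {name: target[name] - value for name, value in current_results.items()}
    return max(gaps, key=gaps.get)  # the outcome most in need of improvement

def implement(change):
    print(f"Implementing change aimed at: {change}")

def assess(previous_results):
    # In practice this step would gather new evidence (surveys, records, rubrics).
    # Here we simply simulate a small improvement for demonstration purposes.
    return {name: value + 0.02 for name, value in previous_results.items()}

def needs_another_cycle(results, target):
    """Improve step: decide whether another pass through the cycle is needed."""
    return any(results[name] < target[name] for name in target)

targets = {"first_year_retention": 0.85, "course_pass_rate": 0.90}   # assumed goals
results = {"first_year_retention": 0.78, "course_pass_rate": 0.84}   # assumed baseline

while needs_another_cycle(results, targets):  # Improve/Act: keep cycling until targets met
    focus = plan(results, targets)             # Plan
    implement(focus)                           # Implement/Do
    results = assess(results)                  # Assess/Check
```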

Page 5

Toward a Spiral of Improvement (adapted from Norman Jackson)

1. THINK ABOUT PROGRAM ISSUES

2. ENGAGE WITH THE PROBLEM CALLED "HOW DO WE IMPROVE THE PROGRAM?"

3. DEVELOP RESOURCES/STRATEGIES TO IMPROVE

4. IMPLEMENT CHANGES
* experiment

5. EVALUATE IMPACT ON OUTCOMES
* did it work as I intended?
* how did people respond?
* what were the results?

6. PLAN TO IMPROVE

Back to the drawing board / On to something else

Page 6

Core Evaluation Cycle Questions

What are you trying to achieve?

Needs assessment

What are you doing to achieve it?

Process assessment

How will you know when you get there?

Outcomes assessment

What can you do with the results?

Improvement

Page 7

Why the Fixation on Outcomes?

We haven’t paid sufficient systematic attention to this in the past

We look at inputs (resources) and processes (curricula and programs) fairly systematically

We tend to look at outcomes one student at a time

The link to accountability

Page 8

Simple Models of Assessment

Advantages
Easy to communicate, use, and learn from

Can be built into everyday work

Helps build and maintain culture of evidence

Models
The evaluation cycle (or spiral)

The assessment matrix template

Page 9

The Assessment Matrix
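
The matrix itself is not reproduced in this transcript; an assessment matrix is essentially a grid crossing intended outcomes with the evidence used to assess them and the planned use of results. A minimal sketch of how such a template might be represented follows; the outcomes, courses, and measures shown are invented examples, not the presenter's.

```python
# Hypothetical assessment matrix: each row pairs an intended outcome with
# where it is addressed, what evidence is collected, who reviews it, and
# how the results will be used. All entries are illustrative assumptions.

assessment_matrix = [
    {
        "outcome": "Students can evaluate sources critically",
        "where_addressed": ["ENGL 102", "Capstone seminar"],
        "evidence": ["scored essay rubric", "capstone portfolio"],
        "who_reviews": "department assessment committee",
        "use_of_results": "revise research-paper assignment",
    },
    {
        "outcome": "Students can communicate findings orally",
        "where_addressed": ["Capstone seminar"],
        "evidence": ["presentation rubric"],
        "who_reviews": "capstone instructors",
        "use_of_results": "adjust presentation coaching",
    },
]

# A simple completeness check: every outcome should have at least one
# piece of evidence and a planned use of results.
for row in assessment_matrix:
    missing = [key for key in ("evidence", "use_of_results") if not row[key]]
    if missing:
        print(f"Outcome '{row['outcome']}' is missing: {', '.join(missing)}")
```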

Page 10

The Limits of Simple Models

Often overly simplistic relative to problems

Actual measures can be misguided

Implementation can be inconsistent across units

Not always easy to link outcome measures to “responsible” processes

doing the right thing vs. doing it right

Page 11

Quality Improvement Models

Advantages
Focus on process provides best chances for identifying points of improvement
Collaborative teams empower staff and help improve communication across units
Formulaic method and external staff support help guide the work and keep it on track

Sample methods
Penn State’s FAST TRACK
U of Wisconsin Accelerated Improvement

Page 12

PSU Fast Track

Page 13

Team #526 -- Food Sciences Measures Task Force, College of Agricultural Sciences, December 2002

Objective
Develop and implement a centralized system for the collection and reporting of key performance indicators and departmental reports.

Action Plan:
1. Evaluate current processes and data sources for gathering data for performance indicators and department reports.
2. With information from #1, work with the Dept. Head to define "key performance indicators" from the Department's Strategic Plan.
3. Working with the Dept. Head, define "departmental reports and other measures".
4. Develop feasible solutions for a collection process for the data/information defined in #2 and #3.
5. Present solutions to the Sponsor with cost/benefit analyses.
6. Sponsor to disseminate solution and plan to faculty.
7. Assist in implementation of solutions, including initial collection of information (test cycle for the new process), revising the process, flowcharting and writing procedures, and training stakeholders.

Results Achieved to Date
The team was disbanded at the end of 2003 after completing action plan items #1-7. Most of the expected outcomes were achieved for a fundamental centralized data collection system. Given the resource and budget constraints at the time, the sponsor decided not to pursue further automation of the centralized data system.

Page 14

http://www.wisc.edu/improve/improvement/accel.html

Page 15

UWisc Accelerated Improvement

Define
Goals and measures of success
Document process
Understand customer needs
Check/refine goals

Design
Develop potential solutions
Analyze solutions/options
Finalize solution; develop implementation plan

Implement
Inform affected people
Conduct training, if needed
Execute action plans w/ timeline

Follow-up
Collect data to track improvement
Review and refine process changes
Issue final report with results

Page 16

Limits of QI Models

Academicians wary of “business” models

Focus on process emphasizes doing it right over doing the right thing

Can be episodic rather than continuous

Page 17

Program Review

Program self-study, site visit by “peers”
Common method for academic programs
Increasing use for administrative programs

Fits well with accreditation framework
Guidelines shape tone and tenor
Content standards
Review team composition

Flexibility accommodates range of inquiry orientations

Page 18

Limits of Program Review

Expensive and time-consuming

Can be done with little participation
Or with a lot

Results not always directly useful for change

Memorandum of understanding helpful

Episodic nature not responsive to changing environment

Page 19

More Complex Models

Advantages
Handle true complexity

Provide in-depth insight into context

Academicians respect the scholarship (although not necessarily the particular approach)

Examples (from WMU’s evaluation center)

CIPP

Constructivist Evaluation

Page 20

More Complex Models

Page 21

The CIPP Model

1. Contractual Agreements
2. Context Evaluation
3. Input Evaluation
4. Process Evaluation
5. Impact Evaluation
6. Effectiveness Evaluation
7. Transportability Evaluation
8. Sustainability Evaluation
9. Metaevaluation
10. The Final Synthesis Report

Page 22

Page 23

Constructivist Evaluation

Guba & Lincoln (2001)
Two-stage process

Discovery - effort to describe “what’s going on here,” the “here” being the evaluand and its context
Assimilation - effort to incorporate new discoveries into the existing construction or constructions… so that the new construction will fit, work, demonstrate relevance, and exhibit modifiability.

Page 24

Limits of Complex Models

Too complex to be practical

Expensive

They require an… “evaluation unit as a staff operation at a high level of the organization in order to help insulate the unit from inappropriate internal influences and enhance its influence on decision making.”

Page 25

Take Home Points

There are many approaches to assessing for improvement

Virtually any method of inquiry can be accommodated

The point of all of them is to determine how well you are doing things and how they might be done better, then to try doing better and see whether that improves the outcomes
Each can be done well or poorly

Page 26

Doing Assessment Well

Being “data-” or “evidence-driven” is not, in and of itself, a good thing

e.g., “selective use” of evidence to support a foregone conclusion

Torture numbers long enough and they’ll confess to anything

Effective use of data requires sharing diverse and often divergent perspectives

It’s not what the data say, it’s what you say about the data
Some disagreement and dissent is important to learning and innovation

Page 27

Further Heresy

Building effective programs requires some level of irrationality and disorder

To learn from what we do requires that we unlearn some things that we often don’t want to unlearn

As if doing this by ourselves were not difficult enough, we must do this together

Page 28

From Data- to Learning-Driven

Data-driven implies…
Rational, systematic testing of ideas through inspection of facts
Sequential, often individual, decision-making process

Learning-driven implies…
Going beyond what we already know and can do to gain new competencies
Deconstruction and reconstruction of ideas and beliefs
Becoming irrational to become re-rational

Page 29

Single- and Double-Loop Learning

Learning is the detection and correction of error (unintended consequences)
“Governing variables” are those things we feel are important to keep within acceptable limits
“Action strategy” is what we do or plan to do to keep the governing variables within limits
“Consequences” are the intended and unintended outputs and outcomes

Intended: confirm our theory in use
Unintended: suggest error in our theory in use

Page 30

Single-Loop Learning

Governing variables not called into question

Adjustments made to action strategies at best

Defense mechanisms can readily arise to maintain single-loop learning

Governing Variables → Action Strategies → Consequences

Page 31

Double-Loop Learning

Questioning the role of the framing and learning systems which underlie actual goals and strategies

Reflection is fundamental
Basic assumptions are confronted

Hypotheses publicly tested

Falsification is sought

Ego is laid aside

Governing Variables → Action Strategies → Consequences
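
To make the contrast concrete, here is a small illustrative sketch of the two loops in code terms: single-loop learning only adjusts the action strategy when the consequences miss the mark, while double-loop learning also questions the governing variable itself. The class-size scenario and all numbers are invented assumptions, not content from the slides.

```python
# Illustrative contrast between single- and double-loop learning.
# A governing variable, an action strategy, and observed consequences are
# represented as simple values; everything here is hypothetical.

governing_variable = {"target_class_size": 25}   # the assumption held fixed
action_strategy = {"sections_offered": 4}        # what we adjust to hit the target

def consequences(strategy, enrollment=120):
    """Observed outcome: average class size given the current strategy."""
    return enrollment / strategy["sections_offered"]

def single_loop(strategy, governing):
    """Adjust the action strategy only; the governing variable is never questioned."""
    while consequences(strategy) > governing["target_class_size"]:
        strategy["sections_offered"] += 1         # keep adding sections until within limits
    return strategy

def double_loop(strategy, governing):
    """First reflect on whether the governing variable itself is right, then act."""
    if governing["target_class_size"] < 30:       # example reflection: is 25 the right goal?
        governing["target_class_size"] = 30       # revise the underlying assumption
    return single_loop(strategy, governing)

print(single_loop(dict(action_strategy), dict(governing_variable)))
print(double_loop(dict(action_strategy), dict(governing_variable)))
```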

Page 32

Model I and II Org Learning

Single- and double-loop learning at the organizational level

Model I: Organizational members subscribe to a common theory in use

Organizational policies and practices inhibit change

Model II: Governing values, policies, and practices promote double-loop learning

Page 33

A Model I Learning Organization

Governing Variables
Toe the line
Win at all costs
Suppress negative feelings
Emphasize rationality

Action Strategies
Control environment and task unilaterally
Protect self and others unilaterally
Discourage inquiry

Consequences
Defensive relationships
Low freedom of choice
Reduced production of valid information
Little public testing of ideas

Page 34

A Model II Learning Organization

Governing Variables
Valid information is most important
Free and informed choice
Shared internal commitment

Action Strategies
Shared control
Participation in design and implementation of action

Consequences
Minimally defensive relationships
High freedom of choice
Public testing of ideas

Page 35

Participatory Action Research/Inquiry

Systematic inquiry process
Can use any of the aforementioned methods

Stakeholder empowerment through active and ongoing participation
Dialogue throughout the process promotes collaboration
Active learning and discovery fostered by the critical reflection process
Action plans create shared responsibility for doing something with the results
Follow-up to action (checking results) maintains relationships and commitments

Page 36

Participatory Action Research/Inquiry

Quotes from Handbook of Action Research by Peter Reason http://www.bath.ac.uk/~mnspwr/Papers/HandbookIntroduction.htm

The aim of participatory action research is to change practices, social structures, and social media which maintain irrationality, injustice, and unsatisfying forms of existence.

(Robin McTaggart)

Participatory research is a process through which members of an oppressed group or community identify a problem, collect and analyse information, and act upon the problem in order to find solutions and to promote social and political transformation.

(Daniel Selener)

We must keep on trying to understand better, change and reenchant our plural world.

(Orlando Fals Borda)

Page 37

Participatory Action Research

Who…
Does what?
Decides what actions are taken?
Is responsible for effective implementation?
Can devise appropriate evaluation protocols?
Has access to or can collect appropriate evidence?
Reviews the results and decides what to do?

What can be done to get these people to work together and in concert?

Page 38

Example: Evaluation of New Student Orientation

Research Question and Evaluation Focus: reassessment of goals; incoming students’ needs; impacts on knowledge, attitudes, and behaviors

Data Collection: focus groups and questionnaires; sought perspectives of all major stakeholders

Data Reporting and Feedback: meetings with orientation leaders and faculty stakeholders

Development of Action Plans: facilitation of dialogue and data-driven proposals

Action: implementation of proposed changes

Assessment: ongoing formative evaluation; re-administration of process and outcome instruments
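
A hedged sketch of the kind of pre/post comparison the re-administration step implies; the questionnaire items and ratings below are invented for illustration and are not the actual orientation data.

```python
# Hypothetical pre/post comparison for re-administered orientation instruments.
# Scores are mean ratings on a 1-5 scale; all values are invented.

baseline = {"knows_advising_resources": 3.1, "feels_connected_to_campus": 3.4}
followup = {"knows_advising_resources": 3.8, "feels_connected_to_campus": 3.5}

for item, before in baseline.items():
    after = followup[item]
    change = after - before
    direction = "improved" if change > 0 else "declined" if change < 0 else "unchanged"
    print(f"{item}: {before:.1f} -> {after:.1f} ({direction}, {change:+.1f})")
```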

Page 39

Example: Indiana Project on Academic Success (IPAS)

Research-based inquiry for enhancing academic success
Four-stage method:

Assessment
Organizing
Action Inquiry
Evaluation

Supported by use of state and institutional student tracking records

Page 40

Stage 1: Assessment

Compare campus assessment information to statewide assessment results; identify possible challenges
Collect additional information from campus sources, such as prior reports and studies and focus group interviews
Organize teams of administrators, faculty, professional staff, and students to identify critical challenges on the campus
Prioritize the challenges, identifying two or three that merit special attention at a campus level
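
A small sketch of the first step, comparing campus figures to statewide results to flag candidate challenges; the indicator names, rates, and the one-percentage-point threshold are assumptions chosen for illustration.

```python
# Hypothetical Stage 1 screen: flag indicators where the campus lags the state.
# Indicator names, rates, and the flagging threshold are illustrative assumptions.

statewide = {"first_year_retention": 0.76, "four_year_graduation": 0.34, "gateway_course_pass": 0.71}
campus    = {"first_year_retention": 0.72, "four_year_graduation": 0.35, "gateway_course_pass": 0.64}

THRESHOLD = 0.01  # flag gaps larger than one percentage point

challenges = {
    name: statewide[name] - campus[name]
    for name in statewide
    if statewide[name] - campus[name] > THRESHOLD
}

# Prioritize the two or three largest gaps for campus-level attention.
for name, gap in sorted(challenges.items(), key=lambda kv: kv[1], reverse=True)[:3]:
    print(f"Possible challenge: {name} trails the statewide rate by {gap:.1%}")
```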

Page 41

Stage 2: Organizing

Coordinate the assessment and inquiry process with campus-level planning and budgeting; integrate the challenges with strategic plans; coordinate budgeting to provide necessary support.
Appoint workgroups to address critical, campus-wide challenges; consider providing release time to team leaders to work on tasks for the campus.
Coordinate the inquiry process (activities of the workgroups) with campus planning and budgeting.

Page 42

Stage 3: Action Inquiry

Build an Understanding of the Challenge
What solutions have been tried in the past, and how well did they work? What aspects of the challenge have not been adequately addressed? What aspects of the challenge require more study? Develop hypotheses about the causes of the challenges, using data to test the hypotheses. Do the explanations hold up to the evidence?

Look Internally and Externally for Solutions
Talk with people on campus about how they have addressed related challenges. Consider best practices for retention and how they might be adapted to meet local needs. Visit other campuses that have tried different approaches to the problem. How well would these alternatives address the challenge at your campus?

Assess Possible Solutions
Consider alternatives in relation to the understanding of the problem developed in Stage 3, step 1. Will the solutions address the challenge at your campus? How can the solution be pilot tested? If you tried out the solution, how would you know whether it worked? What information would you need to know how well it worked?

Page 43

Stage 3: Action Inquiry (cont.)

Develop Action Plans
Action plans should address the implementation of solutions that should be pilot tested. Consider solutions that can be implemented by current staff. If there are additional costs, develop budgets for consideration internally and externally. (Remember, seeking additional funds can slow down the change process.) Develop action plans with time frames for implementation and evaluation.

Page 44

Stage 4: Evaluate

Implement Pilot Test and Evaluate
Provide feedback to workgroups and the campus coordinating team. Use evaluation results to refine the solution. Also, evaluation can be used as a basis for seeking additional funding from internal and external sources, if needed.

Page 45

Building Trust – Lowering Resistance to Change

Do…
• Evaluate program effectiveness
• Provide incentive for using information (regardless of results)
• Raise expectations regarding quality and use of evidence
• Be patient with the learning curve
• Raise expectations for learning (for students and colleagues)

Don’t…
• Evaluate individual effectiveness
• Tie resource allocation directly to results
• Beat people over the head with findings
• Confuse anecdotes with evidence
• Keep changing direction based on initial findings
• Lower expectations for learning

Page 46

What’s the Point?

Assessment and evaluation are means, not ends
Other important ingredients include:

Bringing the right people together
A climate of trust and experimentation
Incentives and support

It’s not rocket science
An imprecise answer to the right question is much more useful than a precise answer to the wrong question