Introduction to the Systems Evaluation Protocol
William Trochim, Professor, Policy Analysis and Management
Director of Evaluation for Extension and Outreach
Cornell University
eXtension Webinar on Research and Evaluation
October 21, 2009
Overview
• Systems Thinking
• The Systems Evaluation Protocol (SEP)
• Integrating the SEP and Systems Thinking
• The “Netway” Cyberinfrastructure
• Research on Evaluation: The NSF Project
Evolutionary & Ecological Systems Thinking
Every program has a lifecycle (ontogeny), a progression through phases or stages…
…over time there is an evolution of programs and program theories…
…and we can see the emergent evolutionary family “tree” (phylogeny) of programs and their theories.
The Systems Evaluation Protocol
Systems Evaluation – Stakeholder Analysis
The evidence you want depends on your place in the network of stakeholders:
• Local (delivery/implementation): information that makes running the program easier
• Global (resources/policy): information for deciding how and whether to support this whole category of programs
[Figure: stakeholder network for Onondaga County Cooperative Extension, showing nodes such as government (especially county), board and staff, team leaders, the executive director, program partners (citizen groups and agencies such as Head Start), researchers (e.g. Cornell, SU, the SUNY system), funders (e.g. federal, foundation), future funders, program participants (e.g. associations by topic, geography, teens), citizens, local government, other counties, county legislators, the press, and the environment]
Systems Evaluation – Boundary Analysis
• Key Distinctions
  – part–whole
  – local–global
• Definitional Challenges
  – Program
  – Project
  – Activity
• Motivations
  – Narrow boundaries: program deliverers feel like they are doing more, but program funders feel like there’s “no there there”
  – Broader boundaries: funders feel like they are funding something important, but program deliverers feel the program description doesn’t capture the nuances
Systems Evaluation – Boundary Analysis
• The Goldilocks Criterion
  – Too narrow: you need separate descriptions, logic models, and evaluations for each program
  – Too broad: descriptions and models don’t describe the program in sufficient detail; the logic model is difficult to complete because there are too many activities and outcomes that pertain only to specific subprograms
  – Just right: the program description and logic model capture both the common nature of the programs and the important details
Systems Evaluation – Program Analysis
• Logic models – Components:
  • Assumptions
  • Context
  • Inputs/Resources
  • Activities
  • Outputs
  • Outcomes (short-, mid-, long-term)
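The components listed above can be held in a small data structure. A minimal Python sketch, assuming nothing about how the SEP or the Netway actually store models — all names and fields here are illustrative:

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Container for the logic model components listed above.

    The SEP prescribes the component categories, not any particular
    data format; this layout is an assumption for illustration.
    """
    assumptions: list = field(default_factory=list)
    context: list = field(default_factory=list)
    inputs: list = field(default_factory=list)      # resources
    activities: list = field(default_factory=list)
    outputs: list = field(default_factory=list)
    # Outcomes grouped by time horizon: short, mid, long
    outcomes: dict = field(
        default_factory=lambda: {"short": [], "mid": [], "long": []}
    )

# Populated with items from the NYC 4-H example in this deck
model = LogicModel(
    inputs=["Curricula", "Volunteers"],
    activities=["Youth Leadership Academy"],
    outputs=["Students complete engineering kits"],
)
model.outcomes["short"].append("Youth interest in science")
```

One design note: keeping the three outcome horizons as keys of one `outcomes` dict (rather than three separate fields) makes it easy to iterate over all outcomes when drawing a model.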
Developing a Program Model: NYC 4-H After-School Clubs and Special Interest Projects

[Figure: program logic model with columns for Context/Assumptions, Inputs, Activities, Outputs, and Short-, Medium-, and Long-Term Outcomes. Inputs include curricula, FTE Extension educators, partnerships (community, sponsors), state and national 4-H initiatives, volunteers, and Agilent Science & Tech. Activities include the Youth Leadership Academy, the Public Presentation Program, and volunteer orientation and trainings. Outputs include students making public presentations to a group and students completing engineering kits. Outcomes include youth interest in science, increased interest in science education, developing leadership skills, and increased rates of higher education and career goal attainment. Context/assumptions include funder mandates, time and budgetary constraints, and staff proficiency at train-the-trainer skills.]

A Program Logic Model: NYC 4-H After-School Clubs and Special Interest Projects
Systems Evaluation – Program Analysis
• Pathway models – Come from:
  • Causal Modeling (Path Analysis)
  • Program Theory
  • Theory of Change
• Show a network of causal linkages
• Primary pathways and nodes
• “Throughlines”
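A pathway model is, structurally, a directed graph, and a “throughline” is a complete causal path from an activity node through to a long-term outcome. A hedged sketch in Python: the dict-of-lists representation and the `throughlines` helper are illustrative, not part of the SEP, and the specific edges are a plausible reading of the NYC 4-H pathway model, not its exact links:

```python
# A pathway model as a directed graph of causal links (node -> successors).
edges = {
    "Agilent Science & Tech": ["Students complete engineering kits"],
    "Students complete engineering kits": ["Youth interest in science"],
    "Youth interest in science": ["Increased interest in science education"],
    "Increased interest in science education": [
        "Increased rates of higher education & career goal attainment"
    ],
}

def throughlines(graph, start):
    """Enumerate every complete causal path ('throughline') from a
    starting activity to a terminal node, by depth-first search."""
    paths, stack = [], [[start]]
    while stack:
        path = stack.pop()
        successors = graph.get(path[-1], [])
        if not successors:
            paths.append(path)          # reached a terminal (long-term) node
        for node in successors:
            stack.append(path + [node])
    return paths

for line in throughlines(edges, "Agilent Science & Tech"):
    print(" -> ".join(line))
```

The same traversal also surfaces primary nodes: a node that appears in many throughlines is a hub worth prioritizing in an evaluation plan.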
Systems Evaluation – Program Analysis

[Figure: program pathway model, with a legend distinguishing activities, outputs, and short-, medium-, and long-term outcomes. Causal links run from activities (Agilent Science & Tech, the Public Presentation Program, Volunteer Orientation & Trainings, the Youth Leadership Academy) through outputs (students complete engineering kits; students make public presentations to a group) to outcomes (youth interest in science, developing leadership skills, increased interest in science education, and increased rates of higher education and career goal attainment).]

A Program Pathway Model: NYC 4-H After-School Clubs and Special Interest Projects
Systems Evaluation – Lifecycle Analysis
• Program Lifecycle – Phases:
  • Initiation
  • Growth
  • Maturity
  • Translation/dissemination/regeneration
  – Similar to the system of phased clinical trials in medicine
• Evaluation Lifecycle – Phases:
  • Description and Rapid Feedback
  • Relationships and Change
  • Causation and Effectiveness
  • Generalizability and Meta-analysis
  – Similar to many taxonomies of research methods
• Determining the Lifecycles:
  – program phase
  – evaluation phase
  – synchronization of program and evaluation phases
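Synchronization of the two lifecycles can be pictured as an alignment of the two phase lists. A sketch, assuming a simple one-to-one pairing in the order the phases are listed above; the SEP’s actual alignment is a judgment made program by program, and `PHASE_ALIGNMENT` and `is_synchronized` are illustrative names:

```python
# Assumed pairing: each program phase matched with the evaluation phase
# listed in the same position above.  This one-to-one map is a
# simplification for illustration, not an SEP rule.
PHASE_ALIGNMENT = {
    "Initiation": "Description and Rapid Feedback",
    "Growth": "Relationships and Change",
    "Maturity": "Causation and Effectiveness",
    "Translation/Dissemination": "Generalizability and Meta-analysis",
}

def is_synchronized(program_phase, evaluation_phase):
    """True when the evaluation phase matches the program phase."""
    return PHASE_ALIGNMENT.get(program_phase) == evaluation_phase

print(is_synchronized("Maturity", "Causation and Effectiveness"))    # True
print(is_synchronized("Initiation", "Causation and Effectiveness"))  # False
```

The second call illustrates the common mismatch the SEP guards against: demanding causal-effectiveness evidence from a program still in its initiation phase.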
[Figure: lifecycle chart placing Farmer’s Market, EFNEP, and Project Bridge along the phases Initiation, Development, Maturity, and Translation & Dissemination]

Systems Evaluation – Lifecycle Analysis

[Figure: the same phase axis, comparing the program phase and the evaluation phase of Farmer’s Market, Project Bridge, and EFNEP]
The Netway Cyberinfrastructure
Netway = Networked Pathways
• The Netway is an internet-based database tool for monitoring programs and their evaluations
• Features of the Netway:
  – Stores program information
  – Enables logic models and pathway models
  – Provides a platform for developing evaluation plans
The Netway: Tracking Programs
The Netway: Logic Models
The Netway: Pathway Models
The Netway: Evaluation Plans
Research on Evaluation
[Timeline, 2006–2022:]
• Cornell Cooperative Extension funding
• National Science Foundation Grant No. EREC-0535492, Systems Evaluation and Evaluation Systems, plus Supplements
• NSF Award #0814364, A Phase II Trial of the Systems Evaluation Protocol for Assessing and Improving STEM Education Evaluation
Examples and Experiences with the Systems Evaluation Protocol