Transcript of Program Theory Lecture, University of San Diego, October 2006

Program Theory & The Theory-Driven Approach to Evaluation

Jeffrey Sheldon, M.A., Ed.M.
School of Behavioral & Organizational Sciences

Claremont Graduate University
The Claremont Colleges

25 October 2006


What does this model tell us?


Challenges in Program Evaluation

• Inadequate program conceptualization.

• Poor program implementation.

• Insensitive program evaluation.

• Poor stakeholder–evaluator relations.

• Scarcity of cumulative knowledge and wisdom.


Black Box Evaluation

• Evaluation of program outcomes without the benefit of an articulated program theory that provides insight into what is presumed to be causing those outcomes, and why.

Rossi, Lipsey & Freeman (2004)


Contingency View

• No single best way to conduct program evaluation.

• The choice of approaches and methods for program evaluation should be situational.

• The individual natures of programs and the uniqueness of evaluation purposes and contextual circumstances require use of a range of evaluation approaches and methods.

Chen (2005)


Theories Used in Evaluation

• Evaluation theory.
– Guides evaluation practice, e.g., empowerment evaluation, theory-driven evaluation, goal-free evaluation.

• Social science theory.
– Theory from the extant literature, e.g., Social Learning Theory, Theory of Reasoned Action.

• Program theory.
– Stakeholder theory; varies by program.


Program Theory

• The set of assumptions about the manner in which the program relates to the social benefits it is expected to produce, and the strategy and tactics the program has adopted to achieve its goals and objectives. Within program theory we can distinguish impact theory (the nature of the change in social conditions brought about by program action) and process theory (the program’s organizational plan and service utilization plan).

Rossi, Lipsey & Freeman (2004)


Program Theory

• Assumptions made by stakeholders about what action is required to solve a social problem and why the problem will respond to this action.

• Stakeholders’ perception of the nature of the problem comes from experience, conventional wisdom, discussions with peers, etc.

• The solution to the problem is assumed to be practicable.


Assumptions

• Prescriptive – the action that is required to solve a social problem.

– Explained by the Process Model

• Descriptive – why the problem will respond to the action.

– Explained by the Impact Model


Process Model

• Components and activities program designers and key stakeholders see as necessary for program success.

• Example: Wired With Wisdom parent recruitment:
– Letters from principals.
– Champion parents: phone, e-mail, personal contact.
– Children of target parents.


[Figure: Process model (or recruitment program) for the Wired With Wisdom parent recruitment effort. Nodes: WWK, project manager, principal, PTO/PTA officers, computer instructor, teachers, students, champion parents ("Champs"), and target parents. Recruitment actions: positive, negative, and incentive letters; phone, e-mail, and personal contact; follow-up contact and follow-up letter; Internet training. Paths lead either to effects on intrinsic motivation leading to use, or to no effect on intrinsic motivation leading to non-use. The legend distinguishes direct influence/communication, direct action, and two-way influence/communication.]


Components of Process Model

• Intervention and service delivery protocols.

• Implementing organization: assess, enhance, and ensure its capacity.

• Program implementers: recruit, train and maintain both competency and commitment.


Components of Process Model

• Associate organizations/community partners: establish collaborations.

• Ecological context: seek its support at micro and macro levels.

• Target population: identify, recruit, screen, serve.


Impact Model

• Assumptions about the causal processes through which the intervention is supposed to work.

• Example:

Parent recruitment methods → Increases or reduces intrinsic motivators → Successful recruitment, use of program


[Figure: Impact model of the Wired With Wisdom parent recruitment program. The recruitment program (letters, phone calls, e-mails, personal contact) operates through five determinants: reduced perceived barriers, increased perceived benefits, increased perceived susceptibility, increased perceived severity, and increased action cues. These lead to successful recruitment (= use of Wired With Wisdom), then to a well managed family internet environment and safety plan, and ultimately to safer children.]


Components of the Impact Model

• Intervention/treatment

• Determinants:
– Mediators.
– Moderators.

• Goals/outcomes:
– Distal.
– Intermediate.
– Proximal.
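To make these pieces concrete, the parts of an impact model can be jotted down as a small data structure. This is an editorial sketch, not material from the lecture; the field names simply mirror the bullets above, and the example values are drawn loosely from the Wired With Wisdom slides.

```python
from dataclasses import dataclass, field

# Editorial sketch: the components of an impact model as a data structure.
# Field names mirror the slide; example values paraphrase the Wired With
# Wisdom slides and are not an official specification of that program.
@dataclass
class ImpactModel:
    intervention: str
    mediators: list[str] = field(default_factory=list)     # determinants the program changes
    moderators: list[str] = field(default_factory=list)    # conditions that strengthen or weaken effects
    proximal_outcomes: list[str] = field(default_factory=list)
    intermediate_outcomes: list[str] = field(default_factory=list)
    distal_outcomes: list[str] = field(default_factory=list)

wired_with_wisdom = ImpactModel(
    intervention="Parent recruitment: letters, phone calls, e-mails, personal contact",
    mediators=["Perceived benefits", "Perceived barriers", "Action cues"],
    proximal_outcomes=["Successful recruitment, use of program"],
    intermediate_outcomes=["Well managed family internet environment & safety plan"],
    distal_outcomes=["Safer children"],
)
print(wired_with_wisdom)
```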


Logic Models & Program Theory
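The logic model figure for this slide did not survive the transcript. As a rough, hedged illustration of what a logic model lays out, here is the conventional inputs → activities → outputs → outcomes → impact chain; the stage names are the standard categories, and the example entries are invented for illustration, not taken from the lecture.

```python
# Generic logic model chain: each stage feeds the next. Stage names are
# the conventional logic model categories; the entries are invented
# examples, not content from the lecture.
logic_model = {
    "inputs":     ["funding", "staff", "curriculum"],
    "activities": ["recruit parents", "deliver Internet-safety training"],
    "outputs":    ["number of parents trained"],
    "outcomes":   ["safer family Internet practices"],
    "impact":     ["safer children"],
}
for stage, items in logic_model.items():
    print(f"{stage:>10}: {', '.join(items)}")
```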


Program Theory Considerations

• Parsimony (the core of the program).

• Precision of relationships.

• Bi-directional approach.

• Program-effect decay functions.

• Dose-response functions.

• Mediators.

• Moderators.


Program-effect Decay Functions
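The decay-function figure did not survive the transcript. As a minimal sketch of the idea, assuming a simple exponential form: a program effect that shrinks over time after the intervention ends. Both the functional form and the parameter values are editorial assumptions, not estimates from the lecture.

```python
import math

# Editorial sketch of a program-effect decay function:
# effect(t) = initial_effect * exp(-decay_rate * t).
# The exponential form and parameter values are illustrative assumptions.
def program_effect(months_after_exit, initial_effect=0.50, decay_rate=0.10):
    """Standardized effect size some months after program exit."""
    return initial_effect * math.exp(-decay_rate * months_after_exit)

for month in (0, 6, 12, 24):
    print(f"month {month:2d}: effect = {program_effect(month):.2f}")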


Dose-Response Functions
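The dose-response figure is likewise missing. A dose-response function relates the amount of program received (the "dose", e.g., sessions attended) to the size or likelihood of the response. A logistic curve is one plausible shape; the parameters below are invented for illustration.

```python
import math

# Editorial sketch of a logistic dose-response function: the probability
# of a positive outcome rises with sessions attended and flattens at
# higher doses. Parameter values are illustrative assumptions.
def response_probability(sessions, midpoint=8.0, steepness=0.5):
    """Probability of a positive outcome given sessions attended."""
    return 1.0 / (1.0 + math.exp(-steepness * (sessions - midpoint)))

for dose in (0, 4, 8, 12, 16):
    print(f"{dose:2d} sessions -> P(outcome) = {response_probability(dose):.2f}")
```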


Direct Effects Model


One Mediator Model
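As a hedged numerical sketch of the one mediator model, using synthetic data (all variable names and effect sizes are invented): treatment X changes mediator M (the a-path), M changes outcome Y (the b-path), and the indirect effect is the product a·b, alongside the direct effect c′.

```python
import numpy as np

# Editorial sketch of the one mediator model with synthetic data:
# X -> M -> Y. The indirect (mediated) effect is a * b; c' is the
# direct effect of X on Y holding M constant.
rng = np.random.default_rng(0)
n = 1_000
x = rng.integers(0, 2, n).astype(float)       # program (1) vs. control (0)
m = 0.6 * x + rng.normal(0, 1, n)             # a-path: program raises the mediator
y = 0.5 * m + 0.2 * x + rng.normal(0, 1, n)   # b-path plus direct effect

def ols(design, outcome):
    """Least-squares coefficients for a design matrix with intercept."""
    coef, *_ = np.linalg.lstsq(design, outcome, rcond=None)
    return coef

a = ols(np.column_stack([np.ones(n), x]), m)[1]
b, c_prime = ols(np.column_stack([np.ones(n), m, x]), y)[1:]
print(f"a = {a:.2f}, b = {b:.2f}, indirect a*b = {a * b:.2f}, direct c' = {c_prime:.2f}")
```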


Indirect Effects Model


Multiple Mediator Effects Model


Moderator of Mediator Effect Model


Moderator of Mediator-Outcome Relationship
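And a hedged sketch of this last case, again with synthetic data: the mediator M moves the outcome Y only when the moderator Z is present. The interaction term in the regression carries the moderation of the b-path; variable names and effect sizes are invented for illustration.

```python
import numpy as np

# Editorial sketch: a moderator Z of the mediator-outcome (b) path.
# In this synthetic example, M affects Y only when Z = 1.
rng = np.random.default_rng(1)
n = 2_000
m = rng.normal(0, 1, n)                  # mediator
z = rng.integers(0, 2, n).astype(float)  # moderator: 0 = absent, 1 = present
y = 0.6 * m * z + rng.normal(0, 1, n)    # b-path operates only when z = 1

# Regress Y on M, Z, and M*Z; the interaction coefficient is the
# change in the b-path as the moderator switches on.
design = np.column_stack([np.ones(n), m, z, m * z])
coef, *_ = np.linalg.lstsq(design, y, rcond=None)
print(f"b-path when z=0: {coef[1]:.2f}; when z=1: {coef[1] + coef[3]:.2f}")
```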


Small Group Exercise

Based on the following model, describe the program, its actions, and the changes that are expected to result. Use the questions below as a guide.

– What is the intervention/treatment?
– What are the mediators?
– What are the moderators?
– What are the proximal outcomes?
– What is the distal outcome?


Computers In Our Future


Theory Calls Evaluation Practitioner’s Attention To:

• Which stage or stages of the program cycle will be the focus of the evaluation?

• What do stakeholders want from the evaluation?

• What evaluation options potentially fit the given program’s context?

• What trade-offs among these options will be most profitable?


Using Program Theory to Design Evaluations

• Compels evaluators to be thoughtful before acting.

• Enhances understanding of program.

• Program assumptions used as scaffolding for the study.

• Informs method choices (qualitative, quantitative, or mixed) – that is, the contingency view!


Using Program Theory to Design Evaluations

• Highlights elements of program activity that deserve attention in the evaluation.

• Helps tailor evaluations to answer the most important questions (remember, parsimony).

• Heightens evaluation responsiveness and sensitivity.

• Increases validity – construct & internal.


Using Program Theory to Design Evaluations

• Fosters cumulative wisdom.

• Helps evaluators meet American Evaluation Association professional evaluation standards – Utility, Feasibility, Propriety, Accuracy.

• Can choose to collect data on linkage mechanisms assumed to be operative in one theory or in several theories.

• Can direct the evaluation toward investigating one link in the theory chain.


Is Theory-Driven Evaluation Methodologically Rigorous?

“The tie-in, or relationship, of the theory-driven approach with our best methodological work is impressive. Think about the way we establish the validity of constructs in experimental research. In essence, construct validation requires a theory, an understanding of the hypothetical network of causal associations and non-causal relationships among the variables that we might try to understand.”

Crano (2003)


Theory-Driven Evaluation: The CDC Framework

• Engage stakeholders – evaluability assessment.

• Describe the program through the action and change models.

• Formulate & prioritize evaluation questions.

• Focus the evaluation design.


Theory-Driven Evaluation: The CDC Framework

• Gather credible evidence through rigorous scientific methods.

• Justify conclusions.

• Ensure utilization and lessons learned.


CDC Evaluation Framework


Effective Theory-Driven Evaluations

1. Future action directedness.

– Useful to stakeholders.

– Assessing merit is a means rather than an end.

– Provides useful information for stakeholders to improve current or future programs.


Effective Theory-Driven Evaluations

2. Scientific and stakeholder credibility.

– Follows scientific methods and principles to optimize validity and reliability.

– Responds to stakeholders’ values, views, concerns, and needs.


Effective Theory-Driven Evaluations

3. Holistic approach.

– Intrinsic value.

– Context.


Explicating Program Theory: Basics

Facilitated by the evaluator:

• Stakeholders reflectively examine what they are doing.

• Stakeholders identify elements that are essential for achieving program goals.

• Stakeholders articulate causal relationships.


Explicating Program Theory: Process

• Face-to-face meetings with stakeholders (working group or intensive interview).

• Facilitating conceptualization of the program:
– “Tell me how your program works.”
– “What do you want your program to do?”
– “What circumstance does it mitigate, or what need does it meet?”
– “Who does it impact?”

• Theorizing methods: backward reasoning (from intended outcomes back to inputs), forward reasoning (from inputs to outcomes), or both.


Explicating Program Theory: Results

• Stakeholder buy-in and support of the evaluation.

• Systematic understanding of stakeholder views, needs, and values.

• Utilization of knowledge produced by the evaluation:
– Conceptual (understanding/education).
– Instrumental (decision support).
– Process (making use of the logic of the evaluation).
– Symbolic (justifying a priori decisions).
– Influence.


Evaluation Questions Hierarchy


Types of Theory-Driven Evaluations

• Action model: theory-driven process evaluation.

• Change model: theory-driven outcome evaluation.

• Action model + change model: integrated theory-driven process/outcome evaluation.


Theory-Driven Process Evaluation

• Systematically assess how the following major components of an action model are being implemented in the field:

– Intervention and service delivery protocols.
– Target populations.
– Implementing organization.
– Implementers.
– Associate organizations/partners.
– Ecological support.


Theory-Driven Outcome/Impact Evaluation

• Serves accountability and program improvement needs by investigating underlying causal mechanisms.

• Comments on construct validity.

• Increases internal validity.

• Generates two kinds of information:
– Assesses whether the program is achieving its predetermined goals.
– Investigates why and how the program succeeds or does not succeed.


Integrated Process – Outcome/Impact Evaluation

[Figure: Integrated model. Implementation of the parent recruitment action model feeds the causal chain: Parent recruitment methods → Increases or reduces intrinsic motivators → Successful recruitment, use of program. The first link represents the action theory of success; the second, the conceptual theory of success.]


References

• Bickman, L. (Ed.). (1987). Using program theory in evaluation. New Directions for Program Evaluation, No. 33. San Francisco, CA: Jossey-Bass.

• Birckmayer, J. D., & Weiss, C. H. (2000). Theory-based evaluation in practice: What do we learn? Evaluation Review, 24(4), 407–431.

• Chen, H. T. (2005). Practical Program Evaluation: Assessing and Improving Planning, Implementation, and Effectiveness. Thousand Oaks, CA: Sage.

• Chen, H. T. (1990). Theory-Driven Evaluations. Newbury Park, CA: Sage.


• Chen, H. T., & Rossi, P. H. (1983). Evaluating with sense: The theory-driven approach. Evaluation Review, 7, 283–302.

• Chen, H. T., & Rossi, P. H. (1987). The theory-driven approach to validity. Evaluation and Program Planning, 10, 95–103.

• Crano, W. D. (2003). Theory-driven evaluation and construct validity. In S. I. Donaldson & M. Scriven (Eds.), Evaluating Social Programs and Problems: Visions for the New Millennium. Mahwah, NJ: Erlbaum.

• Donaldson, S. I. (forthcoming). Program Theory-Driven Evaluation Science: Strategies and Applications. Mahwah, NJ: Erlbaum.


• Donaldson, S. I. (2002). Theory-driven evaluation in the new millennium. In S. I. Donaldson & M. Scriven (Eds.), Evaluating Social Programs and Problems: Visions for the New Millennium. Mahwah, NJ: Erlbaum.

• Donaldson, S. I., & Lipsey, M. W. (2006). Roles for theory in contemporary evaluation practice: Developing practical knowledge. In I. Shaw, J. C. Greene, & M. M. Mark (Eds.), The SAGE Handbook of Evaluation. Thousand Oaks, CA: Sage.

• Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. R. (2004). Program Evaluation: Alternative Approaches and Practical Guidelines (3rd ed.). Boston, MA: Pearson.

• Lipsey, M. W. (1988). Practice and malpractice in evaluation research. Evaluation Practice, 9(4), 5–24.


• Rossi, P. H., Lipsey, M. W., & Freeman, H. E. (2004). Evaluation: A Systematic Approach (7th ed.). Thousand Oaks, CA: Sage.

• Shadish, W. R., Cook, T. D., & Leviton, L. C. (1991). Foundations of Program Evaluation: Theories of Practice. Newbury Park, CA: Sage.

• Weiss, C. H. (1997). How can theory-based evaluation make greater headway? Evaluation Review, 21(4), 501–524.

• Weiss, C. H. (1998). Evaluation: Methods for Studying Programs and Policies (2nd ed.). Upper Saddle River, NJ: Prentice Hall.