CSAP’s CAPT Southwest Region Service to Science Academy

March 28-29, 2011
Oklahoma City, OK

Presented by
CSAP’s CAPT Southwest Regional Team
Southwest Prevention Center
The University of Oklahoma
1639 Cross Center Drive, Suite 254
Norman, OK 73019
(800) 853-2572

Approaches to Prevention Evaluation

• How Do You Feel About Evaluation?
  – What comes up for you when you think about it?

Learning Objectives

• Understand the purposes of evaluation
• Understand the key components of evaluation
• Use a logic model as a guide to create an evaluation plan
• Identify measures and sources of data for evaluation
• Identify the benefits of internal and external evaluation
• Use evaluation findings to assist with decision-making

Prevention Evaluation

Why is it important?

Evidence-Based Prevention

• A prevention activity is judged to be evidence-based if “good” evaluation – research that has been shown to be rigorous according to a set of carefully defined criteria – demonstrates that the activity is effective.

• The evaluation should demonstrate that:
  • the activity produces the expected positive results, and
  • these results can be attributed to the activity or program rather than to other factors.

Evaluation

The systematic collection and analysis of information about program activities, characteristics, and outcomes to reduce uncertainty, improve effectiveness, and make decisions.

It’s About Utility

• Planning programs
• Monitoring implementation of programs
• Improving or redesigning programs
• Advancing knowledge of innovative programs
• Providing evidence of program effectiveness

Definitions

Process Evaluation – Documenting program implementation

Outcome Evaluation – Documenting effects that you expect to achieve after the program is implemented

Traditional vs. Collaborative Evaluation


Traditional: Done to the program
Collaborative: Done with the program

Traditional: Evaluator operates apart from the program
Collaborative: Evaluator operates in concert with the program

Traditional: Evaluator decides
Collaborative: Evaluator advises

Traditional: Evaluator retrieves information from program staff as needed to plan and carry out the study
Collaborative: Program staff are participants in planning and carrying out the study

Traditional: Evaluator interacts relatively infrequently, mainly through the program director
Collaborative: Evaluator interacts regularly with program staff and other stakeholders

The Collaborative Model

The primary mechanism is an evaluation team made up of:

• Evaluator
• Program Staff
• Other Stakeholders (e.g., in a school-based program, stakeholders may include curriculum designers, school board members, teachers, parents, students)


Framework for Evaluation


1. Engage Stakeholders
2. Describe the Program
3. Focus the Evaluation Design
4. Select Appropriate Methods
5. Justify Conclusions
6. Ensure Use and Share Lessons Learned

Step 1: Engage Stakeholders


Stakeholders

Those organizations and individuals who care about either the program or the evaluation findings

In general, anyone who has something to gain or lose from the program


Activity 1: Who Are Your Stakeholders?

• Identify the stakeholders in your program
• Identify their interests
• Rank order from most to least important
  – Program stakeholders
  – Evaluation stakeholders


Activity 1: Things to Keep in Mind

• Did you put yourself on the list?
• Did you identify competing needs?
• Were the agendas of all stakeholders explicit?
• Were you clear about what could and couldn’t be accomplished?


Step 2: Describe the Program


Definition of a Logic Model

Description of what a program is expected to achieve and how it is expected to work

A map linking together a project’s goals, activities, and services, and making its underlying assumptions explicit


Benefits of a Logic Model

1. Develops understanding

2. Helps monitor progress

3. Serves as an evaluation framework

4. Helps expose assumptions

5. Helps restrain over-promising

6. Promotes communications


Activity 2: Let Purpose Be Your Guide

• Discuss among yourselves the purpose of your evaluation and its intended utility.

• Acquaint your S2S evaluator consultant and/or your invited evaluators with your program and the purpose you hope an evaluation will serve.


Blank Logic Model

Columns: GOALS | FOCUS POPULATION | STRATEGIES | “IF-THEN” STATEMENTS | SHORT-TERM OUTCOMES | LONG-TERM OUTCOMES

Column prompts:
A. In order to address the level of this risk or protective factor:
B. For these people:
C. We will do the following program activities/strategies (what, where, and how much):
D. We expect that this activity will lead to changes in these risk/protective factors, which in turn will lead to our program goal:
E. We will know these changes have occurred if:
F. We will know we are reaching our goals if:

Rows:
1. Logic Model:
2. Evaluation Questions:
3. Data Collection Sources:

Designing a Logic Model

A. Goals: What risk and protective factors will be addressed?

B. Focus Population: Who will participate in, or be influenced by, the program?

C. Strategies: What services and activities will be provided?

D. “If-Then” Statements: How will these activities lead to expected outcomes?

E. Short-Term Outcomes: What immediate changes are expected for individuals, organizations, or communities?

F. Long-Term Outcomes: What changes would the program ultimately like to create?
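For teams that would rather keep the worksheet in a file as well as on chart paper, the same six columns (A-F) and three rows can be represented as a small data structure. The Python sketch below is only an illustration; the class and field names are hypothetical and not part of the Academy materials.

    # Minimal sketch: the Blank Logic Model worksheet as a data structure.
    # Field names are illustrative only; they mirror columns A-F and Rows 1-3 above.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class LogicModelColumn:
        label: str                                      # e.g., "GOALS" or "STRATEGIES"
        row1_logic_model: str = ""                      # Row 1: the cell content itself
        row2_evaluation_questions: List[str] = field(default_factory=list)
        row3_data_sources: List[str] = field(default_factory=list)

    # The six columns of the worksheet (A-F)
    worksheet = [
        LogicModelColumn("GOALS"),
        LogicModelColumn("FOCUS POPULATION"),
        LogicModelColumn("STRATEGIES"),
        LogicModelColumn('"IF-THEN" STATEMENTS'),
        LogicModelColumn("SHORT-TERM OUTCOMES"),
        LogicModelColumn("LONG-TERM OUTCOMES"),
    ]

    # Example: fill in the GOALS column as in Sample Logic Model A below
    worksheet[0].row1_logic_model = "Reduction in Academic Failure"
    worksheet[0].row2_evaluation_questions.append("To what extent were academic failure rates reduced?")
    worksheet[0].row3_data_sources += ["School Advancement Records", "Program Reports"]

Completing one column this way corresponds to filling in one cell each of Rows 1, 2, and 3 during Activities 3, 5, and 6.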


Sample Logic Model A (Row 1: Logic Model)

GOALS: Reduction in Academic Failure


Sample Logic Model A (Row 1, continued)

GOALS: Reduction in Academic Failure
FOCUS POPULATION: Children in grades 1-3 at the local elementary school who are struggling academically, as identified by teachers


Sample Logic Model A (Row 1, continued)

GOALS: Reduction in Academic Failure
FOCUS POPULATION: Children in grades 1-3 at the local elementary school who are struggling academically, as identified by teachers
STRATEGIES: Academic tutoring, 3 hours per week for one school year, for 50 students

Assumptions

• Identify the assumptions underlying your program.

• Do your program activities lead logically to your goals?

• How and why do you expect your program to achieve your goals?

• What are the steps that will lead logically from your program activities to your goals?


“If-Then” Statements

• IF the program invests time and money to develop an inventory of the drug-free activities…THEN youth will be more informed about opportunities within the community.

• IF youth know what’s available…THEN they’ll be more likely to participate.

• IF youth participate in alternative drug-free activities…THEN they’ll be more likely to develop friendships with non-using peers and THEN be less likely to use ATOD themselves.


Sample Logic Model A (Row 1, continued)

GOALS: Reduction in Academic Failure
FOCUS POPULATION: Children in grades 1-3 at the local elementary school who are struggling academically, as identified by teachers
STRATEGIES: Academic tutoring, 3 hours per week for one school year, for 50 students
“IF-THEN” STATEMENTS:
* If tutoring is offered to students having academic problems, then students will have the opportunity to improve their academic skills.
* If identified students take the opportunity, they will improve their academic skills.
* If they improve their academic skills, they will not fail in school.
* If they don’t fail in school, they will be less likely to abuse alcohol, tobacco, and other drugs.

The Short & Long of It

Short-Term Outcomes – the immediate program effects that you expect to achieve (e.g., improving problem solving skills)

Long-Term Outcomes – the long-term or ultimate effects of the program (e.g., reducing drug use)


A Word about Outcomes

• There is no right number of outcomes
• The more immediate the outcome, the more influence the program has over its achievement
• The longer term the outcome, the less direct influence a program has over its achievement
• The fact that other forces affect an outcome doesn’t mean it shouldn’t be included
• Long-term outcomes shouldn’t go beyond the program’s purpose or focus population


Sample Logic Model A (Row 1, complete)

GOALS: Reduction in Academic Failure
FOCUS POPULATION: Children in grades 1-3 at the local elementary school who are struggling academically, as identified by teachers
STRATEGIES: Academic tutoring, 3 hours per week for one school year, for 50 students
“IF-THEN” STATEMENTS:
* If tutoring is offered to students having academic problems, then students will have the opportunity to improve their academic skills.
* If identified students take the opportunity, they will improve their academic skills.
* If they improve their academic skills, they will not fail in school.
* If they don’t fail in school, they will be less likely to abuse alcohol, tobacco, and other drugs.
SHORT-TERM OUTCOMES: Participants’ grades improve. Participants move to the next grade level on time.
LONG-TERM OUTCOMES: Participants do not begin using ATOD within five years of participating in the program (by which time participants will be in grades 6-8).

Activity 3: Generating a Logic Model

• Review the example, “Sample Logic Model A.”

• Complete Row 1 of the Blank Logic Model work sheet using your program.

• Record your Logic Model on chart paper and post.


Step 3: Focus the Evaluation Design


Activity 4: Logic Model Focus

• Focus your Logic Model on the area where you would like to direct your evaluation efforts. You may have brought a Program Logic Model (or you may need to develop one); review and assess your possibilities, recalling the purpose you identified earlier.

• Discuss this at your table.


Designing an Evaluation

Clarify the PURPOSE of the evaluation, which leads to QUESTIONS, which require INFORMATION and data, which are obtained through METHODS.


Points to Consider

• Keep in mind the purpose of the evaluation
• What’s going to be evaluated
• Who wants to know what
• When you need the information
• What you intend to do with the evaluation results
• Resources you have available for the evaluation (e.g., time, money, people)


Be realistic . . .

But think creatively.


Tips for Generating Evaluation Questions

• 3 to 5 questions are often adequate.

• Use open-ended questions, not “yes-no.”

• Avoid compound questions, i.e., questions that include multiple statements.

• A good rule of thumb is that the questions start with “To what extent . . .”


Sample Evaluation Questions

Process Evaluation
• How are resources allocated to various activities?
• To what extent was the program implemented as planned?
• What obstacles were encountered during program implementation?

Outcome Evaluation
Over the duration of the program, to what extent has:
• School attendance improved?
• Community-wide prevention awareness activities changed adult norms about substance use?
• Youth substance use decreased?


Beware of the “Black Box”!

Outcome evaluations that focus solely on program effects, without documenting how the program was actually implemented, are dangerous!


Fill in the Blanks


Activity 5: Developing Your Questions

• Complete Row 2 of your “Blank Logic Model” work sheet, developing 2-3 evaluation questions for each column.

• Select the top 3 evaluation questions you would like answered.


Sample Logic Model A (Row 2: Evaluation Questions)

GOALS: Have we achieved our goal? To what extent were academic failure rates reduced?
FOCUS POPULATION: Were teachers referring as intended? Is the method by which they refer helpful or a hindrance? Were there other barriers to reaching our students?
STRATEGIES: To what extent have I implemented my tutoring program as planned (3 hrs./wk. for 9 mos.)? What obstacles were encountered during the process? Did I reach the students I intended? Did delivery style, setting, or mentor type matter?
“IF-THEN” STATEMENTS: Does my logic hold up? What assumptions have I made that may need to be challenged?
SHORT-TERM OUTCOMES: Have participants’ grades improved? If not, why? Have participants moved to the next grade level? If not, why? How many improved? How many moved to the next grade?
LONG-TERM OUTCOMES: Have all of our students who received tutoring and moved up in grade remained drug-free? If not, why? Is there a difference in dosage between those who started using and those who did not?

Step 4: Select Appropriate Methods


Evaluation Methods


Quantitative: Counting; Checklists; Surveys; Pre-post tests (see the sketch below); Analysis of existing statistics; Analysis of existing files

Qualitative: Anecdotes; Case studies; Focus groups; Key informant interviews; Observations
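As a minimal, purely illustrative sketch of the pre-post tests listed among the quantitative methods above, the Python below compares matched pre- and post-program scores. The student IDs and scores are hypothetical, and on its own such a comparison cannot attribute change to the program, which usually requires a comparison condition.

    # Minimal sketch: matched pre/post test comparison (hypothetical data).
    pre_scores  = {"s01": 62, "s02": 55, "s03": 70, "s04": 48}
    post_scores = {"s01": 71, "s02": 60, "s03": 69, "s04": 59}

    # Change score for each participant present at both time points
    changes = {sid: post_scores[sid] - pre_scores[sid]
               for sid in pre_scores if sid in post_scores}

    average_change = sum(changes.values()) / len(changes)
    improved = sum(1 for delta in changes.values() if delta > 0)

    print(f"Matched participants: {len(changes)}")
    print(f"Average change in score: {average_change:+.1f}")
    print(f"Participants who improved: {improved} of {len(changes)}")

In practice the same calculation would be run on whatever instrument the evaluation team selects, such as the achievement tests identified as data sources for the sample logic model.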


Sample Logic Model A (Row 3: Data Collection Sources)

GOALS: School Advancement Records; Program Reports
FOCUS POPULATION: Teachers; Referral protocols; Interviews
STRATEGIES: Tutoring Aides; Questionnaires; Survey; Fidelity Instrument; Interviews; Focus group
“IF-THEN” STATEMENTS: Theory base; List of operating assumptions
SHORT-TERM OUTCOMES: GPAs; Student Achievement Tests; Archived Advancement Records
LONG-TERM OUTCOMES: Student Surveys; Dosage Logs (see the dosage-check sketch below)
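The dosage logs listed above feed directly into the Row 2 process question about whether tutoring was delivered as planned (3 hrs./wk. for 9 mos.). A minimal Python sketch, with hypothetical weekly hours for a single student, might look like this:

    # Minimal sketch: checking delivered tutoring hours against the planned dosage
    # in Sample Logic Model A (3 hours per week across roughly a 36-week school year).
    PLANNED_HOURS_PER_WEEK = 3

    # Hours of tutoring actually delivered to one student, by week (hypothetical)
    dosage_log = {1: 3.0, 2: 2.0, 3: 3.0, 4: 0.0, 5: 3.0}

    delivered = sum(dosage_log.values())
    planned_to_date = PLANNED_HOURS_PER_WEEK * len(dosage_log)
    fidelity = delivered / planned_to_date if planned_to_date else 0.0

    print(f"Hours delivered so far: {delivered:.1f} of {planned_to_date} planned")
    print(f"Dosage fidelity to date: {fidelity:.0%}")

Summed per participant, the same log would also support the dosage comparison suggested under the long-term outcomes column of Row 2.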

Activity 6: Identifying Data Collection Sources

• Complete Row 3 of your “Blank Logic Model” work sheet, identifying 2-3 data sources for each column.

• What gaps are you noticing? Does your logic model still link together?

• Are you beginning to favor a particular method? Quantitative? Qualitative? Or both?


Benefits of Quantitative Methods

• Standardized
• Succinct
• Easily aggregated for analysis
• Systematic
• Easily presented in short space
• The ability to generalize is widely accepted


Benefits of Qualitative Methods

• Detailed and variable
• Unanticipated benefits and/or concerns can be detected
• Offer explanations for short-term outcomes
• Help generate new ideas and/or theories


Benefits of Multi-Method Evaluation

• Understand program processes and outcomes from multiple perspectives

• Strengths of some methods compensate for weaknesses in others

• Results will be useful to a variety of audiences
• Results will be credible to a variety of audiences


Limitations of Multi-Method Evaluation

• Requires multiple evaluation skills and evaluation team members

• Cost is usually higher than single method evaluations

• Methodological rigor possible with single method evaluation may be sacrificed

• Contradictory or inconsistent findings may require additional analysis and increase complexity of reporting


Activity 7: Selecting Appropriate Methods

• Begin to think creatively about potential methods suitable to address some of your crafted evaluation questions.

• Report out on some of your ideas. What has led you to this decision?


Step 5: Justify Conclusions


Conclusions are based on:

• Analysis: Isolates important findings
• Synthesis: Combines sources of information to reach larger understandings
• Interpretation: Determines meaning of the findings


Step 6: Ensure Use and Share Lessons Learned


“Use is not simply determined by some configuration of abstract factors; it is determined in large part by real, live, caring human beings.”

Michael Quinn Patton, 1997


Evaluation Characteristics

• Report - nature and timing
• Relevance of findings
• Quality of evaluation
• Follow-up support and technical assistance


Consider Context

• Political climate
• Size of organization/program
• Competing information


Creating a Data Dissemination Plan

1. WHAT is the nature of the data you have collected?
2. WHY do you want to present the data?
3. TO WHOM are you presenting the data?
4. HOW are you going to present the data?
5. HOW MUCH data are you going to present?
6. WHO will present the data?
7. WHERE are you going to present the data?
8. WHEN are you going to present the data?


Choosing Appropriate Methods

Dissemination methods: Abstracts & Briefings; Annual/Evaluation Reports; Fact Sheets; Brochures & Posters; Exhibits; Press Conferences; Press Releases; Town Meetings

Audiences: Current/Potential Funders; New Potential Funders; Administrators; Board Members; Community Groups; General Public; Organizations; Media

(The slide presents these as a method-by-audience matrix, marking which methods are appropriate for each audience.)


Adapted from (1) Borden, DeBord, and Snipes and (2) Morris, Gibson, and Freeman.

Making Evaluation Results Useful

• Brief stakeholders throughout the project
• Help stakeholders understand data
• Create a dissemination plan
• Select the most useful media for reporting results


Framework Revisited


1. Engage Stakeholders
2. Describe the Program
3. Focus the Evaluation Design
4. Select Appropriate Methods
5. Justify Conclusions
6. Ensure Use and Share Lessons Learned

Thank You!

Questions, comments, observations?

Contact us:

CSAP’s CAPT Southwest Regional Team

[email protected]
