Transcript of Evaluating and Reporting Program Impacts: Contentious Public Issues

Presenters: Loretta Singletary, Professor & Extension Educator, and Staci Emm, University of Nevada Cooperative Extension

Page 1: Evaluating and Reporting Program Impacts: Contentious Public Issues

Evaluating and Reporting Program Impacts: Contentious Public Issues

Loretta Singletary, Professor & Extension Educator

University of Nevada Cooperative Extension

Page 2: Evaluating and Reporting Program Impacts: Contentious Public Issues

Common Assumptions
Can’t measure impacts of “process” programs because…
Handing out evaluations is awkward
Participants don’t like to complete forms
I don’t need to measure the impacts because I know when it is successful
I don’t want to report results of little to no impact, or negative impacts

Page 3: Evaluating and Reporting Program Impacts: Contentious Public Issues

Common Assumptions
Can’t measure impacts of “process” programs because…
These programs are just too complex
There is too much controversy
Some participants arrive angry, stay angry and leave angry
My role is limited to that of facilitator only
My role is that of educator only
My role is that of fact-finder and researcher only

Page 4: Evaluating and Reporting Program Impacts: Contentious Public Issues

Evaluation Dilemma
All affected stakeholders do not see collaboration as a better alternative
Stakeholders do not participate equally
Participants have different skills, commitment, and motives

Page 5: Evaluating and Reporting Program Impacts: Contentious Public Issues

Evaluation Dilemma
Playing field is uneven
Stakeholders lack sufficient time and receive varied “dosage”
External forces are rushing the process
Funding and time constraints are problems

Page 6: Evaluating and Reporting Program Impacts: Contentious Public Issues

Evaluation Required?
The LOGIC model is a systematic method for program planning that requires (handout):
1. Situational Analysis
2. Priority Setting
3. Action Plan
4. Evaluation

Page 7: Evaluating and Reporting Program Impacts: Contentious Public Issues

Benefits of Evaluation
Evaluations serve to strengthen programs over the short and long term
Evaluations determine which strategies and methods work best, under which conditions, and for which purposes

Page 8: Evaluating and Reporting Program Impacts: Contentious Public Issues

Benefits of Evaluation
Evaluations enable comparisons with traditional decision-making processes to build public understanding of consensus-based processes

Page 9: Evaluating and Reporting Program Impacts: Contentious Public Issues

Evaluating Impacts of Programs that Address Contentious Public Issues
WHAT? WHO? WHEN? HOW?

Page 10: Evaluating and Reporting Program Impacts: Contentious Public Issues

What to Evaluate?
Learning objectives = desired changes
Programs that address contentious public issues try to accomplish some basics:
Educate
Communicate
Explore options
Seek mutually satisfying agreements

Page 11: Evaluating and Reporting Program Impacts: Contentious Public Issues

What are Desired Changes?
Contentious issue efficiently addressed
Multiple options developed
Multiple options considered
Climate becomes less contentious
Climate supports public discourse
Conflict is resolved to mutual satisfaction

Page 12: Evaluating and Reporting Program Impacts: Contentious Public Issues

Primary Impacts (Gray, 1989)
Two objective criteria measure the impacts of a collaborative effort:
1. A negotiated agreement is reached
2. The agreement has been implemented

Page 13: Evaluating and Reporting Program Impacts: Contentious Public Issues

Secondary Impacts
Subjective criteria show the impacts of collaboration:
Alter individuals’ skills to participate
Alter perceptions of others
Alter perceptions of the issue
Alter attitudes
Alter behavior
Develop criteria to measure these changes

Page 14: Evaluating and Reporting Program Impacts: Contentious Public Issues

Secondary Impact Measures
Improved…
Communication skills
Networking skills
Relationship-building skills
Increased hope for resolving the issue

Page 15: Evaluating and Reporting Program Impacts: Contentious Public Issues

Secondary Impact Measures
Quality of interaction:
Volunteers learn how to share power
Treat one another fairly and with respect
Operational details:
Meetings follow timeline
Meetings accomplish goals
Participants stay informed (Gray, 1989)

Page 16: Evaluating and Reporting Program Impacts: Contentious Public Issues

Secondary Impacts…
First order vs. second order effects
Second order effects:
Increased knowledge about issues
Increased awareness of issues
Increased awareness of diverse views
New personal and working relationships
Stakeholders accept and understand scientific analyses
Respond to future conflict civilly (Innes, 1999)
Best when measured retrospectively

Page 17: Evaluating and Reporting Program Impacts: Contentious Public Issues

When to Evaluate?
Midcourse – partway through, to assess progress and improvements
End-of-process – at conclusion, to assess learning, satisfaction and first order effects
Retrospective – 12 to 18 months after the process, to identify the value and stability of the process and second order effects

Page 18: Evaluating and Reporting Program Impacts: Contentious Public Issues

Midcourse Evaluation
Conducted after a few months of the process to assess “how it is going”…
Are process criteria being met?
Are participants satisfied?
Are representatives communicating with their stakeholder groups?
Do all participants feel well informed?
Do all participants feel equally empowered?
Are participants engaged and interested?

Page 19: Evaluating and Reporting Program Impacts: Contentious Public Issues

End-of-Process Evaluation
Conducted within a month or two of the final meeting
Experience still “fresh” in everyone’s mind
Objectively assess accomplishments
Did participants feel their time was well spent?
Do participants plan to stay involved with implementing the agreement?
Would participants try this type of process again?

Page 20: Evaluating and Reporting Program Impacts: Contentious Public Issues

Retrospective Evaluation
Of a single process, takes place at least one year after the process is completed
Examines immediate and longer-term effects of the program
Traces program effects over time on participants and their relationships to one another

Page 21: Evaluating and Reporting Program Impacts: Contentious Public Issues

Who to Evaluate?
Program objectives determine WHO
Joint fact-finding
Policy dialogue
Public-private partnerships
Regulatory negotiation
Advisory committees
Public forums
Seeking negotiated agreement

Page 22: Evaluating and Reporting Program Impacts: Contentious Public Issues

Who to Evaluate?
Program objectives determine WHO
Process participants (stakeholders, those directly involved)
Agencies not directly involved
Citizens not directly involved
Funding sources
Program speakers, researchers, facilitators

Page 23: Evaluating and Reporting Program Impacts: Contentious Public Issues

How to Evaluate?

Focus groups (least expensive)
Face-to-face interviews
Postal surveys
Internet surveys

Page 24: Evaluating and Reporting Program Impacts: Contentious Public Issues

How to Evaluate?
Principles for Writing Questions
Simple wording, yet complete sentences
Avoid vague quantifiers
Avoid specificity (how many)
Equal scalar (positive and negative)
Provide “Don’t Know” category for each question*
State both ends of scale equally (Poor / Excellent)
Eliminate “check all that apply”
Avoid open-ended questions
Offer ordered responses (scale)
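As a concrete illustration, here is a minimal Python sketch (not from the original slides; the dict structure and field names are hypothetical) encoding one of the deck’s later example items so that it follows these principles: a balanced five-point scale labeled at both ends, with a “Don’t know” option kept separate from the scale itself.

```python
# Hypothetical encoding of one closed-ended, ordered-response item.
question = {
    "text": "All citizens have a responsibility to conserve water on a daily basis.",
    "scale": [                      # equal positive and negative options
        (1, "Strongly disagree"),
        (2, "Somewhat disagree"),
        (3, "Neither agree nor disagree"),
        (4, "Somewhat agree"),
        (5, "Strongly agree"),
    ],
    "dont_know": True,              # offered, but excluded from scale means
}

# Render the item the way it might appear on a printed questionnaire.
print(question["text"])
for code, label in question["scale"]:
    print(f"  {code}. {label}")
if question["dont_know"]:
    print("  Don't know")
```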

Page 25: Evaluating and Reporting Program Impacts: Contentious Public Issues

How to Evaluate?

Close-ended questions with ordered responses provide participants with a response scale; participants select one answer from a scale or fixed range of choices.

Page 26: Evaluating and Reporting Program Impacts: Contentious Public Issues

How to Evaluate?
All citizens have a responsibility to conserve water on a daily basis.
Strongly disagree
Somewhat disagree
Neither agree nor disagree
Somewhat agree
Strongly agree

Page 27: Evaluating and Reporting Program Impacts: Contentious Public Issues

How to Evaluate?

My ability to listen to others’ views has:
Improved very little
Improved little
Stayed about the same
Improved much
Improved very much

Page 28: Evaluating and Reporting Program Impacts: Contentious Public Issues

How to Evaluate?
My understanding of the effect of salinity on the Lahontan Cutthroat Trout fishery has:
Improved very little
Improved little
Stayed about the same
Improved much
Improved very much

Page 29: Evaluating and Reporting Program Impacts: Contentious Public Issues

How to Evaluate?

Close-ended questions with unordered responses present answers in no particular order, and participants must pick the one best response.

Page 30: Evaluating and Reporting Program Impacts: Contentious Public Issues

How to Evaluate?
Whose responsibility is it to see to it that watershed water quality improves? (check only one):
Area farmers
Environmental Protection Agency
My Regional Watershed Planning Group
Local elected officials
Environmental interest groups
Every citizen living in the watershed area

Page 31: Evaluating and Reporting Program Impacts: Contentious Public Issues

How to Evaluate?
An example of a partially closed-ended question with unordered responses is:
How do you prefer to receive ABC Program information? (check all that apply)
Printed material mailed to me
Small group meetings
Large public forums
Newspaper articles
Other (please describe) ________________

Page 32: Evaluating and Reporting Program Impacts: Contentious Public Issues

Reporting Quantitative Impacts
Ranked means using the Likert scale
Percentages, collapsing the high and low ends of the Likert scale
Regression analyses using participant characteristics to find impact patterns
Several other tests for relationships
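A minimal sketch of the first three approaches, assuming responses are coded 1 through 5 in a pandas DataFrame; the data and column names here are hypothetical, not from the deck.

```python
# Hypothetical Likert data: 1 = "Improved very little" ... 5 = "Improved very much"
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "listen_skill":    [4, 5, 3, 4, 2, 5, 4],
    "issue_knowledge": [3, 4, 4, 5, 3, 4, 2],
    "years_resident":  [2, 15, 8, 20, 1, 12, 5],
})

# 1. Ranked means: order the Likert items by mean score.
print(df[["listen_skill", "issue_knowledge"]].mean().sort_values(ascending=False))

# 2. Collapsed percentages: share of responses in the top two categories.
top_two = (df["listen_skill"] >= 4).mean() * 100
print(f"{top_two:.0f}% reported much or very much improvement")

# 3. Regression: do participant characteristics predict reported impacts?
model = smf.ols("listen_skill ~ years_resident", data=df).fit()
print(model.params)
```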

Page 33: Evaluating and Reporting Program Impacts: Contentious Public Issues

Reporting Qualitative Impacts
Frequency count by key words:
Verbs
Nouns
Direct objects
Categorize comments by class:
Liked most
Liked least
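A minimal sketch of a keyword frequency count over open-ended comments; the comments, stop-word list, and category rules are hypothetical. A fuller analysis might use a natural-language library to tag verbs, nouns, and direct objects.

```python
from collections import Counter
import re

comments = [
    "I liked the facilitator and learned a lot about water rights.",
    "Meetings ran too long, but I learned to listen to other views.",
    "The facilitator kept angry participants talking to each other.",
]

# Tokenize, drop common filler words, and count what remains.
stopwords = {"i", "the", "and", "a", "to", "of", "but", "too", "about", "lot", "other"}
words = [w for c in comments for w in re.findall(r"[a-z']+", c.lower())
         if w not in stopwords]
print(Counter(words).most_common(5))

# Categorize comments by class, e.g., "liked most" vs. "liked least".
liked_most = [c for c in comments if "liked" in c.lower() or "learned" in c.lower()]
liked_least = [c for c in comments if "too long" in c.lower()]
```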

Page 34: Evaluating and Reporting Program Impacts: Contentious Public Issues

Reporting Impacts
Citizen participants
Participating agencies
Your administrators
Land-grant universities (academic body)
Funding sources

Page 35: Evaluating and Reporting Program Impacts: Contentious Public Issues

Reporting Impacts
4-page fact sheets
In-depth Extension publications
PowerPoint presentations to citizens
Professional conferences (ANREP, NACAA, NACDEP, ESP, Galaxy)
Refereed journal articles (JOE)
News articles
Radio and television interviews

Page 36: Evaluating and Reporting Program Impacts: Contentious Public Issues

Bibliography
Bickman, L., Rog, D.J. & Hedrick, T.E. 1998. Applied Research Design: A Practical Approach. In L. Bickman and D.J. Rog (eds.), Handbook of Applied Social Research Methods. Thousand Oaks, CA: Sage Publications.
Dillman, D.A. 2000. Mail and Internet Surveys: The Tailored Design Method, 2nd edition. New York, NY: John Wiley and Sons.
Dillman, D.A. 1978. Mail and Telephone Surveys: The Total Design Method. New York, NY: John Wiley and Sons.
Gray, B. 1989. Collaborating: Finding Common Ground for Multiparty Problems. San Francisco, CA: Jossey-Bass.

Page 37: Evaluating and Reporting Program Impacts: Contentious Public Issues

Bibliography
Heron, J. 1996. Co-operative Inquiry: Research into the Human Condition. Thousand Oaks, CA: Sage Publications.
Innes, J.E. 1999. Evaluating Consensus Building. Thousand Oaks, CA: Sage Publications.
Patton, M.Q. 1978. Utilization-Focused Evaluation. Beverly Hills, CA: Sage Publications.
Salant, P. & Dillman, D.A. 1994. How to Conduct Your Own Survey. New York, NY: John Wiley and Sons.
Singletary, L. 2004. Mail Surveys. In L. Singletary (ed.), Conducting Community Situational Analysis: A Field Guide to Dynamic Extension Programming. Reno, NV: UNCE EB-04-02. Available online at www.unce.unr.edu.

Page 38: Evaluating and Reporting Program Impacts: Contentious Public Issues

Loretta Singletary, Ph.D.
Professor and Extension Educator
University of Nevada Cooperative Extension
Phone: (775) 463-6541
Fax: (775) 463-6545
Email: [email protected]