Planning Evaluation: Setting the Course
Source: Thompson & McClintock (1998). Demonstrating your program’s worth. Atlanta: CDC/National Center for Injury Prevention & Control. Online at www.cdc.gov/ncipc/pub-res/dypw/dypw.pdf?file=%2F2%2Fmodules%2Fmodule01.swf
What’s going on?
If these youth were part of a 4-H program, how would you show evidence of program quality and outcomes? What would they (or their parents, teachers, or peers) tell you about their experience?
Why Evaluate?
• Brainstorm reasons for evaluating programs
Reasons to Evaluate
• Prove (scientists “show evidence”)
  – Program impact (school/college/career success)
  – Program outcomes (knowledge, attitudes, skills, aspirations)
  – Program quality (best practices)
• Improve: guidance to reach the audience
• Approve: feedback for staff
Rationale for Evaluation
• Provide solid evidence of success
• Allow other programs to learn
• Monitor ongoing quality and outcomes
Summing up evaluation
“…the process of determining whether a program or certain aspects of a program are appropriate, adequate, effective, or efficient, and if not, how to make them so.”
Source: Thompson & McClintock (1998). Demonstrating your program’s worth. Atlanta: CDC/National Center for Injury Prevention & Control. Online at www.cdc.gov/ncipc/pub-res/dypw/dypw.pdf?file=%2F2%2Fmodules%2Fmodule01.swf
Evaluation may bring more than you expected
• People talk…and feel good that you listen
• You talk…stakeholders and media listen
• Problems become opportunities
• Programs are sometimes ‘better than expected’
Begin with the end in mind
• Clear and definite objectives
• Clearly defined target population
• Straightforward indicators of success
• Evaluation integrated with programming
• Appropriate, well-tested methods and tools
• Comparison data (population, control)
• Information about process and quality
Shakespeare evaluates
• Stage 1: Formative (Implementation)—Is it in place?
• Stage 2: Formative (Process/Progress)—Is it serving the target audience?
• Stage 3: Summative (Outcome)—Is it getting results?
• Stage 4: Summative (Impact)—Is it building results?
Formative: Implementation
Is the project being implemented according to plan? (e.g., participant selection and involvement; activities and strategies; adjustments matching the program plan; capable staff members hired, trained, and well managed; materials and equipment ready; timelines maintained; appropriateness of personnel; and the development and fulfillment of the management plan)
Formative: Progress
Is the project progressing toward planned results? (e.g., participant progress on key indicators; activities and strategies fostering progress)
Program Fidelity
• How can you say that changes in youth knowledge, attitudes, skills, or aspirations result from your program rather than some external factor?
Program Fidelity Keys
• Document pre- and post-project scores (sketched below)
• Monitor best practices and youth progress via
  – External observers
  – Youth participant feedback
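As one way to make the pre/post documentation concrete, here is a minimal sketch assuming one pre-score and one post-score per participant; the scores, variable names, and the choice of a paired t-test (via scipy) are illustrative assumptions, not part of the source materials.

```python
# Hypothetical sketch: documenting pre/post change for program participants.
# Scores and names are invented for illustration, not from the source.
from scipy.stats import ttest_rel

pre = [52, 61, 48, 70, 55, 63, 59, 66]    # pre-project scores (made up)
post = [60, 65, 55, 74, 58, 70, 61, 72]   # post-project scores (made up)

# Per-participant change (post minus pre) and its average.
changes = [b - a for a, b in zip(pre, post)]
mean_change = sum(changes) / len(changes)

# Paired t-test: is the average change larger than chance alone would suggest?
t_stat, p_value = ttest_rel(post, pre)

print(f"Mean change: {mean_change:+.1f} points")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

A paired test alone cannot rule out external factors; combining it with comparison-group data, external observers, and participant feedback, as listed above, makes the fidelity claim much stronger.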
Summative: (Short-term) Outcomes
At the completion of each/all “units,” how have participants changed? (e.g., knowledge, attitudes, skills, aspirations)
Summative: (Long-term) Impacts
As a result of program participation, what profound changes occurred in a youth (family, community)? (e.g., behavior, application of program lessons)
Outcome Expectations
• What kinds of changes are significant?
• How much change is enough?
• What if some participants don’t change?
• How long will changes “stick”?
Answers on Outcome Expectations
• It depends
Clarifying Expectations
• What kinds of changes are significant?
  – Depends on the factor (e.g., attitude toward reading vs. reading comprehension)
  – Depends on the audience (e.g., competent readers vs. struggling readers)
  – Depends on the program (e.g., one-time/short-term vs. all year/all summer)
  – Depends on the context (e.g., stage/pace-appropriate vs. constrained or chaotic)
Clarifying Expectations
• How much change is enough?
  – Depends on the above (reality, research)
  – Depends on funder expectations
…often critical first steps, or progress toward a goal, are the key indicators of continued success (think about staying up on your first bike)
Clarifying Expectations
• What if some participants don’t change?
  – See the above (clarify expectations first)
  – Critically examine threshold criteria (e.g., minimal health, safety, and education goals vs. substantial or optimal improvement)
  – Critically examine program potential (e.g., relative benefit for specific participants)
Clarifying Expectations
• How long will changes “stick”?
  – See the above (check research and reason)
  – Depends on the nature of the change
    • Interest in science or practice of healthy eating sustained through life (turning point)
    • Increasing involvement and growth in ongoing programming (cumulative benefits)
So where do we begin?
• Create a “logic model” that describes what results you want and how to get there (see the sketch after this list)
• Check the research to see what others have learned
• Get to know your audience so that you know what results are relevant for them
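To make the logic-model step concrete, here is a minimal sketch assuming only the generic inputs → activities → outputs → outcomes → impacts chain; the reading-club entries and the Python representation are invented for illustration, not from the source.

```python
# Hypothetical sketch of a logic model as a simple mapping; the reading-club
# entries are invented examples, not from the source.
logic_model = {
    "inputs":     ["trained staff", "curriculum", "meeting space"],
    "activities": ["weekly 4-H reading club sessions"],
    "outputs":    ["20 youth attend 12 sessions each"],
    "outcomes":   ["improved attitude toward reading",   # short-term (KASA)
                   "gains in reading comprehension"],
    "impacts":    ["sustained reading habit",            # long-term
                   "school success"],
}

# "Begin with the end in mind": read the model backwards, asking what each
# stage needs from the one before it.
for stage in reversed(list(logic_model)):
    print(f"{stage:>10}: {', '.join(logic_model[stage])}")
```

Reading the chain backwards mirrors “begin with the end in mind”: choose the impacts first, then work out what outcomes, outputs, activities, and inputs they require.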