Meta Evaluation and Evaluation Dissemination (Manual 3)

Dawit Wolde (MSc, Lecturer), ICME-JU
College of Health Sciences, Jimma University
E-mail: [email protected] or [email protected]
Cell phone: (+251)-922489558/967657712
P.O. Box 378, Jimma University, Jimma, Ethiopia

Presentation objectives

At the end of the presentations, participants will be able to:

o Define Meta Evaluation and recognize the historical basis for Meta Evaluation

o Differentiate between ethical rules, codes, standards, principles and theories

o Be familiar with standards for Evaluation and guiding principles for Evaluators

o Identify types and roles of Meta Evaluators

o Discuss the purpose of Evaluation reports and important factors in planning them

o Recognize techniques of Evaluation information dissemination

o Understand key components of a written Evaluation report

o Explain Evaluation information use

4/17/2016 Meta Evaluation and Evaluation dissemination 2

Presentation outline

• Historical origin and basic concepts of Meta Evaluation

• Standards of Evaluation

• Guiding principles for Evaluators

• Types and roles of Meta Evaluators

• Meaning and Rationale for Evaluation dissemination

• Considerations in report preparation

• Channels and formats of dissemination

• Structure of Evaluation report

• Evaluation use and factors affecting it

Training methods

• Interactive lectures

• Group discussions (exercises)

• Plenary presentations

Allocated time: 20 hours

Basic steps in Program M&E

1. Engage stakeholders

2. Describe the program

3. Focus the evaluation design

4. Gather credible evidence

5. Justify conclusions

6. Ensure use and share lessons learned

Source: CDC’s framework for program evaluation in public health, 1999.

Evaluation standards:

o Utility

o Feasibility

o Propriety

o Accuracy

The evaluation standards require gathering credible, correct information (Accuracy) and use of that information (Utility). To ensure that evaluation users perceive the findings as credible, one strategy is to use multiple sources of data.

Basic concepts of Meta Evaluation

• Meta Evaluation is Evaluation of an evaluation to determine the quality and/or value of an evaluation(Scriven,1991).

• It is a systematic and formal evaluation of evaluations, evaluation systems, or the use of specific evaluation tools in order to guide the planning and management of evaluations within organizations (Scriven, 2009).

• It is "the process of delineating, obtaining, and applying descriptive information and judgmental information about the utility, feasibility, propriety, and accuracy of an evaluation and its systematic nature, competent conduct, integrity/honesty, respectfulness, and social responsibility to guide the evaluation and/or report its strengths and weaknesses" (Stufflebeam, 2000).

Basic concepts of Meta Evaluation…

Purpose:

• To safeguard the quality of an evaluation and help it live up to its potential, so that:

• Formative Meta Evaluation (proactive) can improve an evaluation study before it is irretrievably too late, and

• Summative Meta Evaluation (retroactive) can add credibility to the final results.

Basic concepts of Meta Evaluation…

Evolution of Meta Evaluation:

• In an informal sense, Meta evaluation has been around as long as evaluation.

• Formal Meta evaluation, however, was introduced in the 1960s, when:

• Evaluators began to discuss formal Meta evaluation procedures and criteria.

• Writers began to suggest what constituted good and bad evaluations, and unpublished checklists of evaluation standards began to be exchanged informally among evaluators.

• Several evaluators published their proposed guidelines, or "Meta evaluation" criteria, for use in judging evaluation plans or reports.

• Authors attempted to make evaluation criteria useful to evaluation consumers, with the objective of avoiding unhelpful and wasteful evaluations.

Basic concepts of Meta Evaluation…

• Until the 1970s, however, evaluators and consumers debated over sets of evaluation criteria, disagreeing on which single set was best.

• In the late 1970s, an effort was made to develop a comprehensive set of standards explicitly tailored for use in educational evaluations and containing generally agreed-on standards for quality evaluation.

• Development of these standards began in 1975, under the direction of Daniel Stufflebeam and under the authorization of the Joint Committee on Standards for Educational Evaluation.

• The result of the Joint Committee's work was named: Standards for Evaluations of Educational Programs, Projects and Materials.

• The standards include 30 standards under four major categories: Utility, Feasibility, Propriety and Accuracy.
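As a rough illustration only (not part of the Joint Committee materials), the four categories and their sub-standard counts can be sketched as a simple checklist structure for a Meta Evaluation; the function names and the coverage score below are hypothetical, not drawn from the standards themselves.

```python
# Illustrative sketch: the four core Joint Committee categories, each with its
# count of sub-standards, used as a hypothetical meta-evaluation checklist.
CORE_STANDARDS = {
    "Utility": 7,       # U1-U7
    "Feasibility": 3,   # F1-F3
    "Propriety": 8,     # P1-P8
    "Accuracy": 12,     # A1-A12
}

def total_substandards(standards):
    """Total number of sub-standards across the four categories."""
    return sum(standards.values())

def coverage(standards, met):
    """Fraction of sub-standards an evaluation was judged to meet.
    `met` maps a category name to the count of sub-standards satisfied."""
    return sum(met.get(name, 0) for name in standards) / total_substandards(standards)

print(total_substandards(CORE_STANDARDS))                       # 30
print(coverage(CORE_STANDARDS, {"Utility": 7, "Accuracy": 8}))  # 0.5
```

A real Meta Evaluation would of course judge each sub-standard qualitatively rather than merely counting them; the sketch only shows how the 30 sub-standards group under the four categories.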

Basic concepts of Meta Evaluation…

Benefits of the standards:

1. A common language to facilitate communication and collaboration in evaluation

2. A set of general rules for dealing with a variety of specific evaluation problems

3. A conceptual framework by which to study the often-confusing world of evaluation

4. A set of working definitions to guide research and development on the evaluation process

5. A public statement of the state of the art in evaluation

6. A basis for self-regulation and accountability by professional evaluators, and

7. An aid to developing public credibility for the evaluation field.

EXERCISE 1

Be in your group and:

• Differentiate between ethical rules, codes, standards, principles and theories.

• Discuss: Why are Evaluation standards required in addition to fundamental ethical principles? (30 minutes)

Basic concepts of Meta Evaluation…

1. Ethical rules: specific statements about ethical behavior (in our context, related to professional settings or professional practices). These rules prescribe behavior in a relatively detailed fashion. For example, rules about obtaining informed consent from evaluation or research participants are frequently specific, although they may vary in the amount of detail.

2. Ethical codes: compilations of ethical rules. For example, the ethical code for educational researchers has six sections: responsibilities to the field, research populations, intellectual ownership, editing and reviewing research, and students and student researchers (American Educational Research Association, 1992).

Basic concepts of Meta Evaluation…

3. Standards: synonymous with rules, but they can also suggest model behavior. The evaluation standards focus on the overall quality of evaluations, not on specific rules. The word "ethics" is assiduously avoided in these standards, but embedded within them are guidelines that might be considered ethical in nature. A typical example is the Propriety evaluation standard.

4. Ethical principles: broader than rules or codes, and often serve as the foundation on which rules and codes are built. Principles such as "do no harm" and the Golden Rule provide guidance for many ethical decisions and are helpful when rules or codes conflict or do not provide specific guidance for our ethical concerns. The fundamental ethical principles include: Beneficence, Nonmaleficence, Autonomy, Justice and Fidelity.

Basic concepts of Meta Evaluation…

5. Ethical theory: refers to efforts to explain how people go about making ethical decisions; it is the science of ethical decision making. Ethical theories are general ways of determining what behavior is considered right or wrong. They can be classified into five groups, based on the criteria used to decide whether behavior is right or wrong:

• Consequences

• Duty

• Rights

• Social justice and

• Ethics of care

Basic concepts of Meta Evaluation…

Rules: Statements of specific do's and, most often, don'ts.
Example: Provide mean and median scores on all standardized tests used in an evaluation; take both oral and written consent from research participants.

Codes: Compilations of rules, usually adopted and endorsed by a professional organization.
Example: The ethical codes of the American Educational Research Association and the American Psychological Association.

Standards: Similar to rules but often suggest ideal behavior.
Example: Make sure that all stakeholders understand the technical terms used in an evaluation report.

Principles: Broader than rules or codes; provide guidance when rules conflict or when rules are not specific to the context.
Example: Evaluate programs as you would want your program to be evaluated.

Theories: Justifications or criteria for ethical decisions; the science and rationale for making ethical decisions.
Example: The consequences of an action are the determinants of what constitutes ethical or unethical behavior.

Relationship b/n ethical theories and principles

• Consequences (utilitarianism): What are the consequences of my choice? What would happen, for example, if every evaluator made the same decision?
  Related principles: Autonomy, Beneficence, Nonmaleficence

• Duty (deontological): What duties and obligations do I have as an evaluator?
  Related principles: Autonomy, Beneficence, Fidelity, Nonmaleficence

• Rights: What rights do my clients have? What rights do I have?
  Related principles: Autonomy, Justice, Nonmaleficence

• Social justice: What would be just or fair in this situation?
  Related principles: Justice, Beneficence, Nonmaleficence

• Caring (ethics of care): What would be the caring response or course of action?
  Related principles: Beneficence, Nonmaleficence

STANDARDS OF EVALUATION

Utility

Feasibility

Propriety

Accuracy

Standards for program Evaluation

• The Joint Committee standards comprise a set of 30 evaluation standards under four core categories.

• These core standards are:

(1) Utility,
(2) Feasibility,
(3) Propriety, and
(4) Accuracy.

Standards for program Evaluation…

Utility standards:

• The utility standards are intended to ensure that an evaluation will serve the information needs of its intended users.

• Intended users can be specific individuals, organizations, or any entity that receives and uses evaluation findings.

• Under this category there are seven (7) sub-standards.

Standards for program Evaluation…

U1: Stakeholder identification

• Persons involved in or affected by the evaluation should be identified, so that their needs can be addressed.

U2: Evaluator credibility

• The persons conducting the evaluation should be both trustworthy and competent to perform the evaluation, so that the evaluation findings achieve maximum credibility and acceptance.

U3: Information scope and selection

• Information collected should be broadly selected to address pertinent questions about the program and be responsive to the needs and interests of clients and other specified stakeholders.

Standards for program Evaluation…

U4: Values identification

• The perspectives, procedures, and rationale used to interpret the findings should be carefully described, so that the bases for value judgments are clear.

U5: Report clarity

• Evaluation reports should clearly describe the program being evaluated, including its context, and the purposes, procedures, and findings of the evaluation, so that essential information is provided and easily understood.

Standards for program Evaluation…

U6: Report timeliness and dissemination

• Significant interim findings and evaluation reports should be distributed to intended users, so that they can be used in a timely fashion.

U7: Evaluation impact

• Evaluations should be planned, conducted, and reported in ways that encourage follow-through by stakeholders, so that the likelihood that the evaluation will be used is increased.

Standards for program Evaluation…

Feasibility standards:

• The feasibility standards are intended to ensure that an evaluation will be realistic, prudent, diplomatic, and frugal.

• This category comprises three (3) sub-standards.

F1: Practical procedures

• The evaluation procedures should be practical, to keep disruption to a minimum while needed information is obtained.

Standards for program Evaluation…

F2: Political viability

• The evaluation should be planned and conducted with anticipation of the different positions of various interest groups, so that their cooperation may be obtained, and so that possible attempts by any of these groups to curtail evaluation operations or to bias or misapply the results can be averted or counteracted.

F3: Cost effectiveness

• The evaluation should be efficient and produce information of sufficient value, so that the resources expended can be justified.

Standards for program Evaluation…

Propriety standards:

• The propriety standards are intended to ensure that an evaluation will be conducted legally, ethically, and with due regard for the welfare of those involved in the evaluation as well as those affected by its results.

• This category comprises eight (8) sub-standards.

P1: Service orientation

• Evaluations should be designed to assist organizations to address and effectively serve the needs of the full range of targeted participants.

Standards for program Evaluation…

P2: Formal agreements

• Obligations of the formal parties to an evaluation (what is to be done, how, by whom, when) should be agreed to in writing, so that these parties are obligated to adhere to all conditions of the agreement or formally to renegotiate it.

P3: Rights of human subjects

• Evaluations should be designed and conducted to respect and protect the rights and welfare of human subjects.

P4: Human interactions

• Evaluators should respect human dignity and worth in their interactions with other persons associated with an evaluation, so that participants are not threatened or harmed.

Standards for program Evaluation…

P5: Complete and fair assessment

• The evaluation should be complete and fair in its examination and recording of strengths and weaknesses of the program being evaluated, so that strengths can be built upon and problem areas addressed.

P6: Disclosure of findings

• The formal parties to an evaluation should ensure that the full set of evaluation findings, along with pertinent limitations, are made accessible to the persons affected by the evaluation and to any others with expressed legal rights to receive the results.

Standards for program Evaluation…

P7: Conflict of interest

• Conflicts of interest should be dealt with openly and honestly, so that they do not compromise the evaluation process and results.

P8: Fiscal responsibility

• The evaluator's allocation and expenditure of resources should reflect sound accountability procedures and otherwise be prudent and ethically responsible, so that expenditures are accounted for and appropriate.

Standards for program Evaluation…

Accuracy standards:

• The accuracy standards are intended to ensure that an evaluation will reveal and convey technically adequate information about the features that determine the worth or merit of the program being evaluated.

• This category comprises a total of twelve (12) sub-standards.

A1: Program documentation

• The program being evaluated should be described and documented clearly and accurately, so that the program is clearly identified.

Standards for program Evaluation…

A2: Context analysis

• The context in which the program exists should be examined in enough detail that its likely influences on the program can be identified.

A3: Described purposes and procedures

• The purposes and procedures of the evaluation should be monitored and described in enough detail that they can be identified and assessed.

A4: Defensible information sources

• The sources of information used in a program evaluation should be described in enough detail that the adequacy of the information can be assessed.

Standards for program Evaluation…

A5: Valid information

• The information-gathering procedures should be chosen or developed and then implemented so that they will ensure that the interpretation arrived at is valid for the intended use.

A6: Reliable information

• The information-gathering procedures should be chosen or developed and then implemented so that they will ensure that the information obtained is sufficiently reliable for the intended use.

Standards for program Evaluation…

A7: Systematic information

• The information collected, processed, and reported in an evaluation should be systematically reviewed, and any errors found should be corrected.

A8: Analysis of quantitative information

• Quantitative information in an evaluation should be appropriately and systematically analyzed, so that evaluation questions are effectively answered.

A9: Analysis of qualitative information

• Qualitative information in an evaluation should be appropriately and systematically analyzed, so that evaluation questions are effectively answered.
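As a small illustration of analyzing quantitative information (A8), and of the earlier rule example about reporting both mean and median: the two can diverge when scores are skewed, which is why reporting both is good practice. The data below are made up for illustration.

```python
# Minimal sketch: report both mean and median, since a skewed distribution
# (here, one high outlier) pulls the mean away from the typical score.
from statistics import mean, median

scores = [52, 55, 58, 60, 61, 63, 95]  # hypothetical test scores; 95 is an outlier

print(round(mean(scores), 1))  # 63.4
print(median(scores))          # 60
```

Here the mean (63.4) overstates the typical score (median 60), so an evaluation report that gave only the mean would mislead its readers.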

Standards for program Evaluation…

A10: Justified conclusions

• The conclusions reached in an evaluation should be explicitly justified, so that stakeholders can assess them.

A11: Impartial reporting

• Reporting procedures should guard against distortion caused by the personal feelings and biases of any party to the evaluation, so that evaluation reports fairly reflect the evaluation findings.

Standards for program Evaluation…

A12: Meta evaluation

• The evaluation itself should be formatively and summatively evaluated against these and other pertinent standards, so that its conduct is appropriately guided and, on completion, stakeholders can closely examine its strengths and weaknesses.

Principles: Autonomy, Nonmaleficence, Beneficence, Justice, Fidelity

Standards related to the evaluator: Political viability (F2), Practical procedures (F1), Stakeholder identification (U1), Report dissemination (U6), Evaluator credibility (U2), Disclosure of findings (P6), Reliable measurement (A6), Information scope and selection (U3), Formal obligations (P2), Valid measurement (A5), Report clarity (U5), Report timeliness (U6), Conflict of interest (P7), Systematic information (A7), Analysis of QUAN information (A8), Rights of human subjects (P3), Complete and fair assessment (P5), Analysis of QUAL information (A9), Defensible information sources (A4), Impartial reporting (A11), Human interactions (P4), Justifiable conclusions (A10)

Standards related to stakeholders: Information scope and selection (U3), Values identification (U4), Evaluator credibility (U2), Report timeliness and dissemination (U6), Rights of human subjects (P3), Formal obligations (P2), Balanced reporting (P6)

Evaluators' roles and Ethical Dilemmas

1. Evaluator as consultant or administrator (Vignettes 1.1 and 1.2)

2. Evaluator as data collector/researcher (Vignettes 1.3 and 1.4)

3. Evaluator as reporter (Vignettes 1.5 and 1.6)

4. Evaluator as member of a profession (Vignettes 1.7 and 1.8)

5. Evaluator as member of a society (Vignettes 1.9 and 1.10)

EXERCISE 2: APPLICATION OF EVALUATION STANDARDS

Be in your previous group and conduct a Meta Evaluation of the evaluation reports provided to you (30 minutes).

Group I: PMTCT program

Group II: Nutrition Program/ICCM Program

Group III: Malaria program

Guiding Principles for Evaluators

• The AEA principles are lifestyle expectations for professional evaluators rather than a set of standards to be applied to any one specific study.

• They promote a lifestyle of systematic inquiry, professional development, honesty, respect, and concern for society.

• These guiding principles permeate the day-to-day activities of the evaluator over an entire career.

Guiding Principles for Evaluators…

• The AEA principles include:

1. Systematic inquiry: Evaluators conduct systematic, data-based inquiries about whatever is being evaluated.

2. Competence: Evaluators provide competent performance to stakeholders.

3. Integrity/honesty: Evaluators ensure the honesty and integrity of the entire evaluation process.

4. Respect for people: Evaluators respect the security, dignity, and self-worth of the respondents, program participants, clients, and other stakeholders with whom they interact.

5. Responsibilities for general and public welfare: Evaluators articulate and take into account the diversity of interests and values that may be related to the general and public welfare.

Relationship b/n AEA guiding principles for Evaluators and Kitchener's principles

• Systematic inquiry: Evaluators conduct systematic, data-based inquiries about whatever is being evaluated.
  Kitchener's principles: Beneficence, Nonmaleficence

• Competence: Evaluators provide competent performance to stakeholders.
  Kitchener's principles: Nonmaleficence

• Integrity/honesty: Evaluators ensure the honesty and integrity of the entire evaluation process.
  Kitchener's principles: Fidelity

• Respect for people: Evaluators respect the security, dignity, and self-worth of the respondents, program participants, clients and other stakeholders with whom they interact.
  Kitchener's principles: Autonomy, Beneficence

• Responsibilities for general and public welfare: Evaluators articulate and take into account the diversity of interests and values that may be related to the general and public welfare.
  Kitchener's principles: Justice

EXERCISE 3: WHO SHOULD CONDUCT META EVALUATION?

Discuss, and present the outcome of your discussion to the general class (30 minutes).

Types of Meta Evaluators

1. Meta Evaluation conducted by the original evaluator: sometimes evaluations are evaluated by the original evaluators themselves. The problem is the bias that can accrue from evaluating one's own work, so it is recommended that another evaluator review it.

2. Meta Evaluation conducted by the evaluation consumer: often the evaluation sponsor, client, or other stakeholders are left to judge the adequacy of an evaluation plan or report without assistance from a professional evaluator. The success of this approach depends heavily on the technical competence of the consumer, so consulting an expert is recommended.

3. Meta Evaluation conducted by competent evaluators: the best arrangement, in that it has both the advantage of technical competence and minimal bias.

DISSEMINATION AND USE OF EVALUATION FINDINGS

Section II

Basic concepts and terminologies

• Designing, preparing, and presenting an evaluation report is the very essence of evaluation.

• But if that report does not prove useful or influential, then that essence quickly evaporates into an empty exercise.

• Many programs collect valuable and informative data on their programs and services.

• But they may not know how to share the data with the public and with influential people at the local and state levels to ensure that the findings will be used and that programmatic approaches and interventions can be replicated.

Basic concepts and terminologies

What is dissemination?

• Dissemination is making information available and usable to various audiences through a wide variety of channels or formats. Here:

• A channel refers to a route of communication, such as a news conference or posters.

• A format refers to the actual layout for communicating the information. Information can be communicated through both oral and written formats.

EXERCISE 4: REASONS TO DISSEMINATE

Be in your previous group and discuss the rationale for disseminating monitoring and evaluation information. Then present the outcome of your discussion to the larger group (20 minutes).

Purpose of Evaluation reports

• The purpose of an evaluation report is directly linked to the use intended for the evaluation. For example:

• The overall purpose of formative evaluation is to improve the program, and the report should inform program staff early about how the program is functioning and what changes must be made to improve it.

• In summative evaluation, the report should provide information and judgment about a mature program's value to those who: (1) may wish to adopt it, (2) will determine resource allocation for its continuation, or (3) have a right to know about the program for other reasons.

Purpose of Evaluation reports…

• In general, evaluation reports can serve many different purposes, including:

• Demonstrating accountability

• Assisting in making a decision

• Bringing an issue to the attention of others

• Helping stakeholders elaborate or refine their opinion of an issue

• Convincing others to take action

• Exploring and investigating issues

• Involving stakeholders in program planning or policy development

• Gaining support for a program

• Promoting understanding of an issue

• Changing attitudes

• Changing individual behaviors

• Changing the nature of dialogue or interaction among groups

• Influencing policy

• Introducing those involved to new ways of thinking through evaluation.

Important factors in planning evaluation reports

• Identify the intended audiences for the evaluation report, bearing in mind that:

• Different stakeholders have different information needs.

• Different stakeholders prefer different channels of communication.

• Understand what information is demanded by stakeholders (the audience) and for what purposes. This helps tailor the report content to the audience's preferences.

• Tailor the report format, style, and language to the preferences of the evaluation audiences.

• Consider the timing of the evaluation information dissemination.

Important factors in planning evaluation reports…

• In matching information to particular audiences, answer the following questions:

1. Who is your audience? (e.g., directors and staff at the ministry or regional level, donors, customers, etc.)

2. What does your audience need to know, or what are their specific interests? (e.g., improvement in outcomes, such as increased knowledge, more positive attitudes, or healthy behavior change)

3. What do you hope to gain by disseminating this information or these results? (e.g., to justify the existence of the program or to leverage additional funding)

Important factors in planning evaluation reports…

4. How will you communicate about the ongoing program? (e.g., briefings at board meetings, progress reports with a summary, and verbal presentations)

5. How will you communicate about the program upon its completion? (e.g., final written report with a summary, verbal debriefing, videos, and oral presentation)

EXERCISE 5: CHANNELS AND TIMING OF REPORTING

Be in your previous group and discuss:

• The different channels of reporting (strengths and weaknesses), and

• The timing of reporting of evaluation results (30 minutes).

Channels of reporting

• Written reports

• Photo essays

• Audiotape reports

• Slide-tape presentations

• Film or video tape reports

• Multimedia presentations

• Dialogues/testimonies

• Hearings or mock trials

• Product displays

• Simulations

• Scenarios

• Portrayals

• Case studies

• Graphs and charts

• Test score summaries

• Question/answers

• E-mail reports

N.B: In choosing among channels, the recommendation is to discuss with stakeholders and identify the channels appropriate for them.

Timing of Evaluation reporting

• Timing of reporting is crucial for effective use of evaluation findings by intended users.

• Based on timing of reporting, Evaluation reports can be:

1. Scheduled interim reports: reports can be scheduled at milestones in either the evaluation or the program, or at regular intervals corresponding to routine meetings of clients or stakeholders.

2. Unscheduled interim reports: used when unexpected events or results pop up.

3. Final reports: reports prepared after the interim reports of the evaluation. Sometimes a preliminary final report is released for review and reaction by stakeholders, followed by the final report.

Timing of Evaluation reporting…

An example of an unscheduled interim report:

• In a formative evaluation, the evaluator may discover a major problem or impediment, such as the fact that video monitors used in an experimental program designed to train federal meat inspectors are too small for trainees beyond the third row to see the critical indicators of possible contamination. It would be a gross disservice to withhold that information until the next scheduled interim report, which might be weeks away, and then deliver the not-too-surprising message that a majority of the new generation of meat inspectors didn't seem to be learning much from the experimental program that would serve the cause of public health.

Key component of a written report

1. Title page

2. Executive Summary(2-6 pages)/ Executive Abstract(1-2 pages)

3. Introduction to the report

Purpose of the evaluation

Audiences for the evaluation report

Limitation of the evaluation and explanation of disclaimers(if any)

Overview of report contents

4. Focus of the evaluation

Description of the evaluation object

Evaluative questions or objectives used to focus the study

Information needed to complete the evaluation

5. Brief overview of evaluation plan and procedures

6. Presentation of evaluation results

Summary of evaluation findings

Interpretation of evaluation findings

Key component of a written report…

7. Conclusions and Recommendations

• Criteria and standards used to judge evaluation object

• Judgment about the evaluation object (strengths and weaknesses)

• Recommendations

8. Minority reports or rejoinders (if any)

9. Appendices

• Description of evaluation plan/ design, instruments, and data analysis and interpretation

• Detailed tabulations or analysis of quantitative data, and transcripts or summaries of qualitative data

• Other information, as necessary

Evaluation report outline examples

Example 1:

1. Title Page:

• Title and nature of evaluation

• Title of Program, phase, duration

• Identification of author, date of submission, commissioning service

2. Table of contents:

• Main headings and sub-headings

• Index of tables, figures, and graphs

3. Executive Summary:

• An overview of the entire report in no more than five pages

• A discussion of the strengths and weaknesses of the chosen evaluation design

Evaluation report outline examples…

4. Introduction:

• Description of the Program in terms of needs, objectives, delivery systems etc.

• The context in which the Program operates

• Purpose of the evaluation in terms of scope and main evaluation questions.

• Description of other similar studies which have been done

5. Research methodology:

• Design of research

• Implementation of research and collection of data

• Analysis of data

Evaluation report outline examples…

6. Evaluation results:

• Findings

• Conclusions

• Recommendations

7. Annexes:

• Terms of reference of the evaluation

• References and sources

• Names of evaluators and their companies (CVs should also be included, summarized and limited to one page per person).

• Methodology applied for the study (phases, methods of data collection, sampling, etc.).

• Logical framework matrices (original and improved/updated).

• List of persons and organizations consulted, literature and documentation other than technical annexes (e.g. statistical analyses)

Evaluation report outline examples…

Example 2:

• The basic elements of a final evaluation report might include the following:

• Title page

• Executive summary

• Intended use and users

• Program description

• Evaluation focus

• Data sources and methods

• Results, conclusions, and interpretation

• Use, dissemination, and sharing plan

• Tools for clarity

Evaluation report outline examples…

Example 3:

I. Executive Summary

II. Program Description

a. Implementation Process

b. Program Goals and Objectives

III. Evaluation Design and Methodology

IV. Results

a. Data

b. Process Report

c. Outcomes Report

V. Interpretation of Process and Outcomes Report

VI. Conclusions

VII. Recommendations

VIII. Appendices

Evaluation report outline examples…

Example 4:

I. Executive Summary

II. Problem Statement

III. Evaluation Design

IV. Evaluation Methodology

V. Results

a. Quantitative

b. Qualitative

VI. Conclusions

VII. Recommendations

VIII. Appendices

Evaluation report outline examples…

Example 5:

I. Executive Summary

II. Program Description

III. Project Objectives and Activity Plans

IV. Evaluation Plan and Methodology

V. Results

a. Process and Outcome

b. Successes and Barriers

VI. Conclusions

VII. Recommendations

VIII. Appendices

Key component of a written report…

• The evaluation report should follow a logical structure.

• What is important is that the structure of the report meets the needs of the donors of the evaluation as well as the principal stakeholders.

Basics of Good Evaluation Reporting (Tips)

1. Consider the needs of your target audience(s) even before you begin your evaluation.

2. Give your audience important details of the evaluation.

• What – Your central evaluation question

• Why – Your purpose for the evaluation

• Who – Your source(s) of information, including sample or census size and response rate

• How – Your data collection methods

• Where – The locations from which you collected data

• When – The time frame in which you collected the data

Basics of Good Evaluation Reporting (Tips)…

3. Use caution in reporting findings and drawing conclusions.

• Remember that the type of data you collected, and how you collected it, helps determine what you can eventually conclude.

• Do not claim that your program caused a specific result unless you have experimental evidence in support of that claim.

4. Have others read or listen to your report and give you feedback before you create the final version.

EXERCISE 6: UTILIZATION OF EVALUATION AND FACTORS AFFECTING IT

In your group, discuss the culture of information utilization in your organization and the factors affecting it.

Then present the outcome of your discussion to the larger group (30 minutes).

The evaluation theory tree

Adapted from: Marvin C. Alkin and Christina A. Christie, 2004

[Figure: the evaluation theory tree; labels include "Process Use" and "Findings Use"]

Utilization of Evaluation

• The utility of any evaluation is a prime criterion for judging its worth.

• Use is one of the factors that distinguish evaluation from research.

• Evaluation is intended to have an immediate, or at least a near-term, impact, while research is intended to add to knowledge and theory in a field, but the results it yields may not be used for some time.

• Evaluation use can be instrumental utilization, conceptual utilization, enlightenment, persuasive utilization, or process utilization.

• The first four are related to the use of the findings.

Utilization of Evaluation…

1. Direct (Instrumental) utilization: refers to the use of evaluation findings to make immediate decisions or judgments. Evaluators often tailor their evaluations to produce results that can have a direct influence on the improvement of the structure or process of a program, e.g. decisions for funding/no funding based on results.

2. Conceptual utilization: refers to the use of evaluation findings to enlighten and inform stakeholders about issues. Even if evaluation results do not have a direct influence on the re-shaping of a program, they may still be used to make people aware of the issues of concern. This can lead to instrumental use at a later time.

3. Enlightenment: when evaluation findings add knowledge to the field and thus may be used by anyone, not just those involved with the program or its evaluation.

Utilization of Evaluation…

4.Process use:

• Refers to ways in which being engaged in the process of evaluation can be useful quite apart from the findings that may emerge from the study

• Participating in an evaluation can cause changes to occur: in one's thinking about organizational management and programs, in drawing attention to new issues, and in creating dialogues between different stakeholders.

5.Persuasive Utilization:

• It is the enlistment of evaluation results in an effort to persuade an audience to either support an agenda or to oppose it.

• In this case, unless the 'persuader' is the same person who ran the evaluation, this form of utilization is not of much interest to evaluators, as they often cannot foresee possible future efforts of persuasion.

Utilization of Evaluation…

[Figure: level of influence of evaluation, process use]

[Figure: change processes for evaluation influence at the individual level]

Factors affecting Evaluation utilization

[Figure: results from the 2006 sample]

Factors affecting Evaluation utilization

Factors affecting evaluation utilization (findings utilization):

• Relevance of the evaluation to decision makers and/or other stakeholders

• Involvement of users in the planning and reporting stages of the evaluation

• Reputation or credibility of the evaluator

• Quality of the communication of findings: timeliness, frequency, method

• Development of procedures to assist in use, or recommendations for action

Factors affecting Evaluation utilization…

• Factors that affect process use:

(a) Facilitation of evaluation processes;

(b) Management support;

(c) Advisory group characteristics;

(d) Frequency, methods, and quality of communications; and

(e) Organization characteristics.


Recommended readings

1. Jody Fitzpatrick, James R. Sanders, Blaine R. Worthen. Program Evaluation: Alternative Approaches and Practical Guidelines. Third edition. Pearson Education, Inc. 2004.

2. Joint Committee on Standards for Program Evaluation. The Program Evaluation Standards. Second edition. Thousand Oaks, CA: Sage. 1994.

3. Kristin Olsen and Sheelagh O'Reilly. Evaluation Methodologies: A brief review of Meta-evaluation, Systematic Review and Synthesis Evaluation methodologies and their applicability to complex evaluations within the context of international development. June 2011.

4. Disseminating Program Achievements and Evaluation Findings to Garner Support. Evaluation Briefs. No. 9, February 2009.

Recommended readings…

5. Daniel A. McDonald, Pamela B. C. Kutara, Lucinda S. Richmond, Sherry C. Betts. Culturally respectful evaluation. December 2004, Vol. 9, No. 3. ISSN 1540-5273. https://ncsu.edu/ffci/publications/2004/v9-n3-2004-december/ar-1-cuturally.php

THANK YOU… A Good Journey!

Evaluation = Use + Methods + Valuing