Transcript of Program evaluation 20121016
Program Evaluation: Methods and Case Studies
Emil J. Posavac and Raymond G. Carey
7th Edition. 2007. New Jersey: Pearson, Prentice Hall.
Aung Thu Nyein
DA-8020 Policy Studies
Content
About the authors
Chapter 1: Program Evaluation: An Overview
Chapter 3: Selecting Criteria and Setting Standards
About the authors
Emil J. Posavac: Ph.D., University of Illinois; Professor Emeritus of Psychology at Loyola University of Chicago; Director of the applied social psychology graduate program. Awarded the Myrdal Award by the American Evaluation Association.
Raymond G. Carey: Ph.D., Loyola University of Chicago; principal of R. G. Carey Associates. Widely published in the field of health services and quality assurance.
An Overview
Evaluation is a natural, routine activity.
“Program evaluation is a collection of methods, skills, and sensitivities necessary to determine whether a human service is needed and likely to be used, whether the service is sufficiently intensive to meet the unmet needs identified, whether the service is offered as planned, and whether the service actually does help people in need at a reasonable cost without unacceptable side effects.”
An Overview (contd.)
But program evaluation differs from natural, automatic evaluation.
First, organizational efforts are carried out by teams. This specialization means that responsibility for program evaluation is diffused among many people.
Secondly, most programs attempt to achieve objectives that can be observed only sometime in the future rather than in a matter of minutes. This raises the question of which criteria to choose.
Third, when evaluating our own ongoing work, a single individual fills many roles: worker, evaluator, beneficiary, recipient of the feedback, etc.
Last, programs are usually paid for by parties other than clients of the program.
Evaluation tasks that need to be done
Program evaluation is designed to assist some audience in assessing a program’s merit or worth.
Verify that resources would be devoted to meeting unmet needs
Verify that implemented programs do provide services
Examine the outcomes
Determine which programs produce the most favorable outcomes
Select the programs that offer the most needed types of services
Provide information to maintain and improve quality
Watch for unplanned side effects.
Common Types of Program Evaluation
Assess the needs of the program participants
Identify and measure the level of unmet needs; consider some alternatives
Examine the process of meeting the needs
The extent of the implementation; the nature of the people being served; the degree to which the program operates as planned
Measure the outcomes of the program
Who received what? Do the program’s services make changes for the better? What are the different opinions of people on the outcomes?
Integrate the needs, costs, and outcomes
Cost-effectiveness
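Integrating costs and outcomes usually comes down to a cost-effectiveness ratio: total program cost divided by units of outcome achieved. A minimal sketch of that comparison, using entirely hypothetical program names and figures (not taken from the book):

```python
# Illustrative cost-effectiveness comparison; all names and figures are hypothetical.
# Cost-effectiveness ratio = total cost / units of outcome achieved.
# A lower ratio means less money spent per unit of benefit.

programs = {
    "Program A": {"cost": 50_000, "outcome_units": 200},  # e.g. clients successfully served
    "Program B": {"cost": 80_000, "outcome_units": 400},
}

def cost_effectiveness(cost: float, outcome_units: float) -> float:
    """Cost per unit of outcome achieved."""
    return cost / outcome_units

for name, p in programs.items():
    ratio = cost_effectiveness(p["cost"], p["outcome_units"])
    print(f"{name}: ${ratio:.2f} per unit of outcome")
# Program A costs $250.00 per unit and Program B $200.00 per unit,
# so B delivers more outcome per dollar despite its higher total cost.
```

This kind of ratio only integrates the three elements when the outcome units are comparable across programs; evaluators must still judge whether the needs being met are the same.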
Activities often confused with program evaluation
Basic research
Individual assessment
Program audit
Although these activities are valuable, program evaluation is different and more difficult to carry out.
Different Types of Evaluations for Different Kinds of Programs
No “one size fits all” approach.
Organizations needing program evaluations:
Health care
Criminal justice
Business and industry
Government
Time frame of needs:
Short-term needs
Long-term needs
Potential needs
Extensiveness of the programs
Some programs are offered to small groups of people with similar needs, but others are developed for use at many sites throughout the country. Complexities are involved.
Purpose of program evaluation
The overall purpose of program evaluation is to contribute to the provision of quality services to people in need.
Feedback mechanism: formative evaluations, summative evaluations, or evaluation for knowledge.
A Feedback Loop
The roles of evaluators
A variety of work settings:
Internal evaluators
External evaluators: governmental or regulatory agencies, private research firms
Comparison of internal and external evaluators
Factors related to competence:
Access and advantages
Technical expertise
Personal qualities
An evaluator’s personal qualities: objective, fair, and trustworthy.
Factors related to the purpose of an evaluation:
A formative, summative, or quality assurance evaluation?
Evaluation and service
The role of the social scientist: concerned with theory, the design of research, and the analysis of data.
The role of the practitioner: dealing with people in need.
Evaluation and related activities of organizations
Research
Education and staff development
Auditing
Planning
Human resources
Chapter 3:
Selecting Criteria and Setting Standards
Useful criteria and standards
Research design is important, but so are criteria and standards.
Criteria that reflect a program’s purposes
Immediate short-term effects, but marginal long-term ones.
Criteria that the staff can influence
An evaluation could meet with resistance if the program staff feel that their program will be judged on criteria that they cannot affect.
Criteria that can be measured reliably and validly
Repeated observations should give the same values.
Criteria that stakeholders participate in selecting
Chosen in consultation between the evaluator and stakeholders.
Developing Goals and Objectives
How much agreement on goals is needed? A number of issues need to be addressed.
Different types of goals:
Implementation goals
Intermediate goals
Outcome goals
Goals that apply to all programs:
Treating the subjects with respect
Personal exposure to the program
Depending on surveys and records to provide evaluations, etc.
Evaluation criteria and evaluation questions
Does the program or plan match the values of the stakeholders?
Does the program or plan match the needs of the people to be served?
Does the program as implemented fulfill the plans?
Do the outcomes achieved match the goals?
Using Program Theory
Why is a program theory helpful? How is a program theory developed? Implausible program theories.
Every program embodies a conception of the structure, functions, and procedures appropriate to attain its goals.
The conception constitutes the “logic” or plan of the program, which is called “Program Theory”.
Peter H. Rossi, Howard E. Freeman & Mark W. Lipsey. 1998. Evaluation: A Systematic Approach, 6th Ed., SAGE Publications, Inc., London.
Assessing program theory
Framework for assessing program theory:
In relation to social needs
Assessment of logic and plausibility:
Are the program goals and objectives well defined?
Are the program goals and objectives feasible?
Is the change process presumed in the program theory plausible?
Are the program procedures for identifying members of the target population, delivering service to them, and sustaining that service through completion well defined and sufficient?
Are the constituent components, activities, and functions of the program well defined and sufficient?
Are the resources allocated to the program and its various components and activities adequate?
Assessment through comparison with research and practice
Assessment via preliminary observation
Assessing program theory (2)
Program theory can be assessed in relation to the support for critical assumptions found in research or in documented program practice elsewhere. Sometimes findings are available for similar programs.
Assessment of program theory yields findings that can help improve the conceptualization of a program or affirm its basic design.
Source: Peter H. Rossi, Howard E. Freeman & Mark W. Lipsey. 1998. Evaluation: A Systematic Approach, 6th Ed., SAGE Publications, Inc., London.
More questions…
Is the program accepted?
Are the resources devoted to the program being expended appropriately?
Using program costs in the planning phase
Is offering the program fair to all stakeholders?
Is this the way the funds are supposed to be spent?
Do the outcomes justify the resources spent?
Has the evaluation plan allowed for the development of criteria that are sensitive to undesirable side effects?
Example: Program Theory
Example: Program Theory and theory failure
Example: Theory failure
Some practical limitations in selecting evaluation criteria
Evaluation budget: evaluation is not free.
Time available for the project
Criteria that are credible to the stakeholders
Overlap in terminology in program evaluation by Jane T. Bertrand
Bertrand, Jane T. May 2005. “Understanding the Overlap in Programme Evaluation Terminology.” The Communication Initiative Network.
Thanks for your attention.