RCS 6740 7/11/05

Needs Analysis, Program Evaluation, Program Implementation, Survey Construction, and the CIPP Model (RCS 6740, 7/11/05)


Transcript of RCS 6740 7/11/05

Page 1: RCS 6740 7/11/05

Needs Analysis, Program Evaluation, Program Implementation, Survey Construction, and the CIPP Model

RCS 6740
7/11/05

Page 2: RCS 6740 7/11/05

What is a Needs Analysis?

• According to McKillip (1987), "Needs are value judgments that a target group has problems that can be solved" (p. 7).

• Needs analysis, involving the identification and evaluation of needs, is a tool for decision making in the human services and education.

• Decisions can vary, including resource allocation, grant funding, and planning. In other words, needs analysis is a process of evaluating the problems and solutions identified for a target population. The process emphasizes the importance and relevance of the problems and solutions.

Page 3: RCS 6740 7/11/05

Models of Needs Analysis

• Discrepancy Model
• Marketing Model
• Decision-Making Model

Page 4: RCS 6740 7/11/05

Discrepancy Model

• This model is the most straightforward and widely used, especially in education.

• This model emphasizes normative expectations and involves three phases:
– Goal setting, identifying what ought to be
– Performance measurement, determining what is
– Discrepancy identification, ordering differences between what ought to be and what is
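The three phases can be sketched as a small computation: measure "what is" against "what ought to be," then order the gaps. This is only an illustrative sketch; the subject areas and scores below are invented, not taken from McKillip.

```python
# Hypothetical sketch of the Discrepancy Model's three phases.
# Subject areas and score values are invented for illustration.

# Phase 1: goal setting -- what ought to be (e.g., target test scores)
goals = {"reading": 80, "math": 75, "science": 70}

# Phase 2: performance measurement -- what is
performance = {"reading": 72, "math": 74, "science": 55}

# Phase 3: discrepancy identification -- order the differences
discrepancies = {area: goals[area] - performance[area] for area in goals}
ranked = sorted(discrepancies.items(), key=lambda item: item[1], reverse=True)

for area, gap in ranked:
    print(f"{area}: discrepancy of {gap}")
```

The largest discrepancy (here, science) would be the strongest candidate need under this model.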

Page 5: RCS 6740 7/11/05

Marketing Model

• This model defines needs analysis as a feedback process used by organizations to learn about and adapt to the needs of their client populations.

• A marketing strategy of needs analysis has three components:
– Selection of the target population, those actually or potentially eligible for the service and able to make the necessary exchanges
– Choice of competitive position, distinguishing the agency's services from those offered by other agencies and providers
– Development of an effective marketing mix, selecting a range and quality of services that will maximize utilization by the target population

Page 6: RCS 6740 7/11/05

Decision-Making Model

• This model is an adaptation of multi-attribute utility analysis (MAUA) to problems of modeling and synthesis in applied research.

• The Decision-Making Model has three stages:
– Problem modeling: In this stage, need identification takes place. The decision problem is conceptualized by options and decision attributes.
– Quantification: In this stage, measurements contained in the need identification are transformed to reflect the decision makers' values and interests.
– Synthesis: In this stage, an index that orders options on need is provided. This index also gives information on the relative standing of these needs.
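As a rough illustration of the three stages, a multi-attribute utility index can be computed as a weighted sum of attribute scores. The options, scores, and weights below are invented for illustration and are not part of McKillip's presentation.

```python
# Hypothetical multi-attribute utility sketch of the three stages.
# All options, attribute scores, and weights are invented.

# Stage 1: problem modeling -- options and decision attributes
options = ["job-placement program", "skills workshop", "peer mentoring"]
attributes = ["cost", "impact", "feasibility"]

# Stage 2: quantification -- attribute scores (0-1) reflecting the
# decision makers' values, plus importance weights that sum to 1
scores = {
    "job-placement program": {"cost": 0.4, "impact": 0.9, "feasibility": 0.6},
    "skills workshop":       {"cost": 0.8, "impact": 0.5, "feasibility": 0.9},
    "peer mentoring":        {"cost": 0.9, "impact": 0.4, "feasibility": 0.8},
}
weights = {"cost": 0.2, "impact": 0.5, "feasibility": 0.3}

# Stage 3: synthesis -- a single index that orders the options on need
def utility(option):
    return sum(weights[a] * scores[option][a] for a in attributes)

ranking = sorted(options, key=utility, reverse=True)
for opt in ranking:
    print(f"{opt}: {utility(opt):.2f}")
```

The resulting index both orders the options and shows their relative standing, which is the point of the synthesis stage.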

Page 7: RCS 6740 7/11/05

Steps In Needs Analysis

Step 1: Identify users and uses of the need analysis

– The users of the analysis are those who will act on the basis of the report.

– Knowing the uses of the need analysis can help focus on the problems and solutions that can be entertained.

Page 8: RCS 6740 7/11/05

Steps In Needs Analysis Cont.

Step 2: Describe the target population and the service environment

– The description may include, for example, geographic dispersion, transportation, demographic characteristics of the target population, eligibility restrictions, and service capacity. Client analysis refers to the comparison of those who use services with those who are eligible to use services. Resource inventories detail services available.
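Client analysis reduces to a simple comparison of users against eligibles; a minimal sketch (with invented counts) might look like:

```python
# Hypothetical client-analysis sketch: compare those who use services
# with those who are eligible to use them. Counts are invented.

eligible = 1200   # people eligible for the service
users = 300       # people actually using it

penetration_rate = users / eligible
unmet = eligible - users
print(f"Penetration rate: {penetration_rate:.0%} ({unmet} eligible non-users)")
```

A low penetration rate like this would flag a possible gap between eligibility and actual service use.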

Page 9: RCS 6740 7/11/05

Steps In Needs Analysis Cont.

Step 3: Identify needs

• Describe problems:
• According to McKillip (1987), three types of problems are identified by need analysis:
– Discrepancies: "Problems are revealed by comparison of expectations with outcomes. Discrepancies are problems" (p. 11).
– Poor outcome: Problems involve those at risk of developing poor outcomes.
– Maintenance need: "A group with maintenance needs can develop poor outcomes if services presently offered are withdrawn or altered" (p. 11).

Page 10: RCS 6740 7/11/05

Steps In Needs Analysis Cont.

• Bradshaw identified four types of outcome expectations that support judgments of needs (McKillip, 1987):
– Normative need: Expectations based on expert identification of adequate levels of performance or service. (This type of expectation may miss real needs of the target population.)
– Felt need: Expectations that members of a group have for their own outcomes (e.g., parents' expectations about the appropriate amount of elementary-level mathematics instruction).
– Expressed need: Expectations based on the behavior of a target population. Expectations are indicated by use of services (e.g., waiting lists, enrollment pressure, or high bed-occupancy rates).
– Comparative need: Expectations based on the performance of a group other than the target population. (Comparative expectations mainly depend on the similarity of the comparison group and the target population. In addition, such expectations can neglect unique characteristics that invalidate generalizations.)

Page 11: RCS 6740 7/11/05

Steps In Needs Analysis Cont.

• Describe solutions:
• According to McKillip (1987), there are three criteria (dimensions) for evaluating solutions:
– Cost analysis
– Impact
– Feasibility

Page 12: RCS 6740 7/11/05

Steps In Needs Analysis Cont.

Step 4: Assess the importance of the needs

• Once problems and their solutions have been identified, needs are evaluated.

Page 13: RCS 6740 7/11/05

Steps In Needs Analysis Cont.

Step 5: Communicate results

• Finally, the results of the need identification must be communicated to decision makers, users, and other relevant audiences.

Page 14: RCS 6740 7/11/05

Techniques for Needs Analysis

• Resource Inventory
– Information is gathered from service providers, either by survey or interview
– Provides a systematic mapping of services available, points to gaps and to widely available services, and may identify services that planners were not aware of

Page 15: RCS 6740 7/11/05

Techniques for Needs Analysis Cont.

• Secondary Data Analysis
– Target population description
– Synthetic estimation
– Client analysis
– Direct and at-risk indicators

Page 16: RCS 6740 7/11/05

Techniques for Needs Analysis Cont.

• Surveys
– Key informants
– Client satisfaction
– Training surveys

Page 17: RCS 6740 7/11/05

Techniques for Needs Analysis Cont.

• Group Procedures
– Focus groups
– Nominal groups
– Public hearings and community forums

Page 18: RCS 6740 7/11/05

Additional Information on Needs Analysis

• In practice, needs analysis is an iterative and satisficing activity, in which "the cycle of decision, data gathering, and data analysis repeats until further cycles are judged unnecessary" (McKillip, 1987, pp. 9-10).

• Because different stakeholders have different perspectives on needs and solutions, needs analysis usually involves gathering information from more than one referent group (e.g., clients, families, service providers).

Page 19: RCS 6740 7/11/05

Additional Information on Needs Analysis Cont.

• Further, in planning a needs analysis/assessment (or any other research), it is important to consider multiple measures (e.g., different types of measures for the same construct) and different methods of assessment (e.g., client surveys, questionnaires, key informant interviews; McKillip, 1987).

• Based on the characteristics of needs analysis, the major concepts of participatory action research (PAR), which involves consumers in the planning and conduct of research, can be considered in needs analysis.

Page 20: RCS 6740 7/11/05

Needs Analysis

Needs Analysis Activity

Page 21: RCS 6740 7/11/05

Program Evaluation

History of Program Evaluation
• Primary purpose traditionally has been to provide decision makers with information about the effectiveness of some program, product, or procedure
• Has been viewed as a process in which data are obtained, analyzed, and synthesized into relevant information for decision making
• Has developed in response to the pressing demands of the rapidly changing social system that started in the 1930s. The attempts by evaluators to meet these demands have resulted in the development of decision-oriented models.

Page 22: RCS 6740 7/11/05

Program Evaluation Cont.

History of Program Evaluation Cont.

• In education, the major impetus to the development of decision-oriented evaluation was the curriculum reform movement in the 1960s.

• Human service evaluation began primarily as an extension of educational evaluation. This field has also applied methods from economics and operations research to develop cost-effectiveness and cost-benefit analyses.

Page 23: RCS 6740 7/11/05

Program Evaluation Cont.

Originating Orientations
• Education:
– Educational Psychology Models: purpose is to determine discrepancies between objectives and outcomes and between the intended and actual program implemented
– Educational Decision Models: purpose is to make better, more defensible decisions
– Educational Science Models: purpose is to determine causal relationships between inputs, program activities, and outcomes
– Limitations of all three models: (a) focus on interim outcomes, (b) emphasis on measurement, (c) emphasis on student achievement, (d) terminal availability of data, and (e) limited judgment criteria

Page 24: RCS 6740 7/11/05

Program Evaluation Cont.

Originating Orientations Cont.
• Human Services:
– Guided by the same decision-oriented philosophy as found in education, but added the cost-effectiveness and cost-benefit analysis models
– Three roles of an evaluator: (a) evaluator as statistician, (b) evaluator as researcher, and (c) evaluator as technician
– Similar limitations as the educational models, especially when the evaluators are not allowed to participate in or even have access to decision making and planning

Page 25: RCS 6740 7/11/05

Basics of Program Evaluation

• A particularly important goal of research in natural settings is program evaluation.

• "Program evaluation is done to provide feedback to administrators of human service organizations to help them decide what services to provide to whom and how to provide them most effectively and efficiently" (Shaughnessy & Zechmeister, 1990, p. 340).

• Program evaluation represents a hybrid discipline that draws on political science, sociology, economics, education, and psychology. Thus, persons from many disciplines (e.g., psychologists, educators, political scientists, and sociologists) are often involved in this process (Shaughnessy & Zechmeister, 1990).

Page 26: RCS 6740 7/11/05

Basics of Program Evaluation Cont.

• "Evaluation research is meant for immediate and direct use in improving the quality of social programming" (Weiss, as cited in Patton, 1978, p. 19).

• "Evaluation research is the systematic collection of information about the activities and outcomes of actual programs in order for interested persons to make judgments about specific aspects of what the program is doing and affecting" (Patton, 1978, p. 26).

Page 27: RCS 6740 7/11/05

Basics of Program Evaluation Cont.

• "Evaluation research refers to the use of scientific research methods to plan intervention programs, to monitor the implementation of new programs and the operation of existing ones, and to determine how effectively programs or clinical practices achieve their goals."

• "Evaluation research is a means of supplying valid and reliable evidence regarding the operation of social programs or clinical practices--how they are planned, how well they operate, and how effectively they achieve their goals" (Monette, Sullivan, & DeJong, 1990, p. 337).

Page 28: RCS 6740 7/11/05

Qualitative Program Evaluation

• Attributes of qualitative evaluation include:
– Qualitative data (observations, interviews)
– Qualitative design (flexible)
– One group under observation or study
– Inductive hypothesis testing
– Researcher as participant
– Qualitative data analysis (e.g., coding)

Page 29: RCS 6740 7/11/05

Qualitative Program Evaluation Cont.

• Naturalistic Inquiry
– Defined as slice-of-life episodes documented through natural language, representing as closely as possible how people feel, what they know, how they know it, and what their concerns, beliefs, perceptions, and understandings are.
– Consists of a series of observations that are directed alternately at discovery and verification.
– Came about as an outgrowth of ecological psychology.
– Has been used for many purposes and applied in different orientations, including education and psychology.
– The perspective and philosophy make this method ideally suited to systematic observation and recording of normative values.

Page 30: RCS 6740 7/11/05

Qualitative Program Evaluation Cont.

• Systems Approaches
– The writings of systems theorists provide evidence that systems and the study of systems are necessary in order to understand people's increasingly complex interaction with others and with the environment
– The general systems paradigm suggests that it is impossible to understand complex events by reducing them to their individual elements
– An example in education is the use of instructional systems development

Page 31: RCS 6740 7/11/05

Qualitative Program Evaluation Cont.

• Participatory Action Research (PAR)
– Some of the people in the organization under study participate actively with the professional researcher throughout the research process, from the initial design to the final presentation of results and discussion of their action implications.
– In rehabilitation and education, this paradigm would potentially involve all of the stakeholders: consumers, parents, teachers, counselors, community organizations, and employers.
– Note: Remember that not all PAR evaluation is qualitative.

Page 32: RCS 6740 7/11/05

Evaluation Research vs. Basic Research

• In evaluation research, the researcher takes immediate action on the basis of the results. He/she must determine clearly whether a program is successful and valuable enough to be continued.

• In basic research, the researcher can afford to be tentative and conduct more research before drawing strong conclusions about the results (Cozby, 1993).

• Program evaluation is one type of applied social research (Dooley, 1990; Shaughnessy & Zechmeister, 1990).

Page 33: RCS 6740 7/11/05

Evaluation Research vs. Basic Research Cont.

• According to Shaughnessy and Zechmeister (1990), the purpose of program evaluation is practical, not theoretical.

• The distinction between basic and applied research cannot be determined by methodology, location, or motivation of the work (Dooley, 1990).

• Basic and applied research can be differentiated in terms of (a) goals and products and (b) constraints placed on the problem-solving aspects of these kinds of research.

Page 34: RCS 6740 7/11/05

Evaluation Research vs. Basic Research Cont.

• Basic research seeks a better understanding of human nature through the development of conceptual tools.

• Applied research looks for an improvement of human life through the scientific discovery of practical solutions. However, a case can be made for a reciprocal relationship between basic and applied research (Shaughnessy & Zechmeister, 1990).

Page 35: RCS 6740 7/11/05

The Need for Program Evaluation

• When new ideas or programs are implemented, evaluation should be planned to assess each program to determine whether it is having its intended effect. If it is not, alternative programs should be tried.

Page 36: RCS 6740 7/11/05

The Need for Program Evaluation Cont.

• According to Monette, Sullivan, and DeJong (1990), evaluation research is conducted for three major reasons:
– It can be conducted for administrative purposes, such as to fulfill an evaluation requirement demanded by a funding source, to improve service to clients, or to increase the efficiency of program delivery.
– A program is assessed to see what effects, if any, it is producing (i.e., impact assessment).
– It can be conducted to test hypotheses or evaluate practice approaches.

Page 37: RCS 6740 7/11/05

Standards of Program Evaluation

• According to Ralph and Dwyer (1988), a good program evaluation design should:
– Demonstrate that a clear and attributable connection exists between the evidence of an educational effect and the program treatment, and
– Account for rival hypotheses that might explain effects.

Page 38: RCS 6740 7/11/05

Standards of Program Evaluation Cont.

• Whatever designs are chosen and whatever facets of the program are evaluated, current standards demand that program evaluations possess the following characteristics:
– (a) utility, (b) feasibility, (c) propriety, and (d) accuracy.
• That is, they must be useful for the program, they should be feasible (politically, practically, and economically), they must be conducted fairly and ethically, and they must be conducted with technical accuracy (Patton, 1978).

Page 39: RCS 6740 7/11/05


Elements of Design in Program Evaluation

• A design is a plan that dictates when and from whom measurements will be gathered during the course of the program evaluation.
• Two types of evaluators:
– Summative evaluator: responsible for a summary statement about the effectiveness of the program
– Formative evaluator: helper and advisor to the program planners and developers
• The critical characteristic of any evaluation study is that it provides the best possible information that could have been collected under the circumstances, and that this information meets the credibility requirements of its evaluation audience.

Page 40: RCS 6740 7/11/05


Elements of Design in Program Evaluation Cont.

• Related terms:
– True control group
– Non-equivalent control group
– Pre-tests
– Post-tests
– Mid-tests
– Retention tests
– Time series tests
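As an illustration of how several of these terms fit together, here is a minimal Python sketch of a pre-test/post-test comparison between a program group and a non-equivalent control group; all scores and variable names are hypothetical, invented for demonstration.

```python
# Hypothetical pre-test/post-test scores for a program (treatment)
# group and a non-equivalent control group. All numbers are invented.

def mean(xs):
    return sum(xs) / len(xs)

def mean_gain(pre, post):
    """Average post-test minus pre-test gain across participants."""
    return mean([after - before for before, after in zip(pre, post)])

program_pre  = [52, 60, 48, 55]
program_post = [68, 75, 59, 70]

control_pre  = [50, 58, 49, 54]
control_post = [53, 61, 50, 58]

program_gain = mean_gain(program_pre, program_post)  # 14.25
control_gain = mean_gain(control_pre, control_post)  # 2.75

# The difference in mean gains is one crude estimate of program effect.
effect_estimate = program_gain - control_gain
print(effect_estimate)  # 11.5
```

As the Ralph and Dwyer standards above emphasize, such a figure only becomes credible once rival hypotheses for the difference have been ruled out.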

Page 41: RCS 6740 7/11/05


Steps in Designing a Program Evaluation

• Consider your goals

• Identify needed data related to each goal

• Identify the comparison group

• Identify a schedule for data collection

Page 42: RCS 6740 7/11/05


Program Evaluation

Program Evaluation Activity

Page 43: RCS 6740 7/11/05


Program Implementation

• Program implementation is how a program looks in operation

• We need to look not only at whether a program worked, but at how it worked

Page 44: RCS 6740 7/11/05


Questions as you plan an implementation evaluation

• What purpose will your implementation study serve?

• What overall evaluation orientation is most appropriate?

Page 45: RCS 6740 7/11/05


Program Implementation Steps

• Initial planning: Deciding what to measure
– What are the program’s critical characteristics?
– How much supporting data do you need?
• Steps for planning data collection
– Choosing data collection methods
– Determining whether appropriate measures already exist
– Creating a sampling strategy
– Thinking about validity and reliability
– Planning for data analysis

Page 46: RCS 6740 7/11/05


Methods for Assessing Program Implementation

• Program records

• Questionnaires and interviews

• Observations

Page 47: RCS 6740 7/11/05


Program Implementation

Program Implementation Activity

Page 48: RCS 6740 7/11/05


Information on Survey Construction

• DESIGNING SURVEYS
• A good question is one that produces answers that are reliable and valid measures of something that we want to describe
• Two types of question evaluations:
– Those aimed at evaluating reliability
– Those aimed at assessing the validity of answers

Page 49: RCS 6740 7/11/05


Survey Construction Cont.

• Five process standards of questions and answers
– Questions need to be consistently understood.
– Questions need to be consistently administered or communicated to respondents.
– What constitutes an adequate answer should be consistently communicated.
– Unless measuring knowledge is the goal of the question, all respondents should have access to the information needed to answer the question accurately.
– Respondents must be willing to provide the answers called for in the question.

Page 50: RCS 6740 7/11/05


Survey Construction Cont.

• Possible steps to assess the extent to which questions meet process standards
– Focus group discussions
– Intensive or cognitive interviews
– Field pre-tests under realistic conditions

Page 51: RCS 6740 7/11/05


Survey Construction Cont.

• Some general rules for designing good survey instruments
– The strength of survey research is asking people about their firsthand experiences: what they have done, their current situations, their feelings and perceptions.
– Questions should be asked one at a time.
– A survey question should be worded so that all respondents are answering the same question.
– If a survey is to be interviewer administered, the wording of the questions must constitute a complete and adequate script, such that when the interviewer reads the question as worded, the respondent will be fully prepared to answer it.
– All respondents should understand the kind of answer that constitutes an adequate answer to a question.
– Survey instruments should be designed so that the tasks of reading questions, following instructions, and recording answers are as easy as possible for interviewers and respondents.

Page 52: RCS 6740 7/11/05


Survey Construction Cont.

• Ways of addressing validity in surveys
– Deriving questions from relevant literature
– Expert panel

Page 53: RCS 6740 7/11/05


Sampling and Surveys

• Sampling relates to the degree to which those surveyed are representative of a specific population

• The sampling frame is the set of people who have the chance to respond to the survey

Page 54: RCS 6740 7/11/05


Sampling and Surveys Cont.

Probability Sampling

• Random sampling

• Stratified random sampling

• Systematic sampling

• Cluster sampling
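To make the four methods concrete, here is a sketch using only Python's standard library; the population of 100 client records, the two strata, and the five clusters are hypothetical placeholders.

```python
# Sketch of the four probability sampling methods, on a hypothetical
# population of 100 client records (IDs 0-99).
import random

random.seed(1)
population = list(range(100))

# Simple random sampling: every member has an equal chance.
simple = random.sample(population, 10)

# Stratified random sampling: draw at random within predefined strata
# (here, two invented strata split by ID).
strata = {"low": [p for p in population if p < 50],
          "high": [p for p in population if p >= 50]}
stratified = [m for group in strata.values()
              for m in random.sample(group, 5)]

# Systematic sampling: every k-th member after a random start.
k = 10
start = random.randrange(k)
systematic = population[start::k]

# Cluster sampling: randomly choose whole clusters (e.g., sites),
# then take every member of the chosen clusters.
clusters = [population[i:i + 20] for i in range(0, 100, 20)]
chosen = random.sample(clusters, 2)
cluster_sample = [m for c in chosen for m in c]

print(len(simple), len(stratified), len(systematic), len(cluster_sample))
```

Each method yields a sample whose selection probabilities are known, which is what distinguishes probability from non-probability sampling.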

Page 55: RCS 6740 7/11/05


Sampling and Surveys

Non-probability Sampling

• Quota sampling

• Snowball sampling

• Convenience sampling
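A companion sketch of the three non-probability methods; the respondent names, quotas, and referral links below are all invented for illustration.

```python
# Hypothetical pool of volunteers as (name, group) pairs.
volunteers = [("Ana", "F"), ("Ben", "M"), ("Cy", "M"),
              ("Dee", "F"), ("Ed", "M"), ("Fay", "F")]

# Convenience sampling: take whoever is easiest to reach
# (here, simply the first three in the pool).
convenience = volunteers[:3]

# Quota sampling: fill fixed quotas per subgroup in arrival order,
# rather than selecting at random.
quotas = {"F": 2, "M": 1}
quota_sample = []
for person, group in volunteers:
    if quotas.get(group, 0) > 0:
        quota_sample.append(person)
        quotas[group] -= 1

# Snowball sampling: start from seed respondents and follow referrals.
referrals = {"Ana": ["Dee"], "Dee": ["Fay"], "Fay": []}
snowball, frontier = [], ["Ana"]
while frontier:
    person = frontier.pop()
    if person not in snowball:
        snowball.append(person)
        frontier.extend(referrals.get(person, []))

print(convenience, quota_sample, snowball)
```

Because none of these gives every member of the population a known chance of selection, they trade representativeness for practicality.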

Page 56: RCS 6740 7/11/05


Survey Construction

Survey Construction Activity

Page 57: RCS 6740 7/11/05


CIPP Model

• From a broad conceptual perspective, the CIPP (i.e., Context, Input, Process, and Product) evaluation model, developed by the Phi Delta Kappa Commission on Evaluation, divides evaluation into four strategies (Borich & Jemelka, 1982).

Page 58: RCS 6740 7/11/05


CIPP: Context Evaluation

• The subsystems of the context in which the program operates

• Examples:
– public rehabilitation agency, public school, private rehabilitation agency, private/parochial school, high school

Page 59: RCS 6740 7/11/05


CIPP: Input Evaluation

• The available human and material resources and strategies

• Examples:
– school personnel, special educators, rehabilitation counselors, supervisors, forms, paper

Page 60: RCS 6740 7/11/05


CIPP: Process Evaluation

• The program that is being implemented and the barriers that are encountered

• Examples:
– direct instruction, vocational evaluation program, independent living program

Page 61: RCS 6740 7/11/05


CIPP: Product Evaluation

• The outcomes measured and the standards by which they are being evaluated

• Examples:
– reading scores as measured by a standardized test, employment as measured by a job in which the individual is working 30 hours per week in an integrated setting

Page 62: RCS 6740 7/11/05


CIPP

CIPP ACTIVITY

Page 63: RCS 6740 7/11/05


Questions and Comments

?