
February 2009

Program Evaluation: A Pseudo-Case Study

Juliana M. Blome, Ph.D., MPH
Office of Program Analysis and Evaluation

National Institute of General Medical Sciences

MORE Program Directors Meeting
Colorado Springs, Colorado

June 12, 2009


We’re looking for…


But we often get…


An evaluation plan should include…

• Program description
• Purpose & rationale for the evaluation
• Evaluation design
• Data collection & analyses
• Products of the evaluation & their use
• Project management
• Budget estimate


Program Description

Include program goals and baseline data

Program goals are the intended effects of a program

Activities should be organized to achieve specific goals
Types of goals: process, intermediate, long-term

Long-term Goal Example

• Weak: To train a diverse biomedical workforce

• Strong: To significantly increase the # of URMs graduating with baccalaureate STEM degrees and persisting through to graduate study


Purpose & Rationale of Evaluation

Type of Evaluation

Needs assessment, feasibility study, process evaluation, or outcome evaluation?

Timing

Why is right now the time to conduct an evaluation?

Program Maturity

Is it reasonable to expect certain levels of output or measurable changes at this stage?


Evaluation Design

An evaluation design should include…

Study questions
Target population
Key variables
Conceptual framework, if applicable


Study Questions

What are the key questions the evaluation is designed to answer?

Key questions link to stated purpose of evaluation and program activities

Include any hypotheses that will be tested

Examples

How is the training program being implemented? (process)

What factors have inhibited the achievement of goals? (process)


Examples

What has been the impact of the training program on the participants? (outcome)

What is the quality and character of the mentorship that is being provided in the program? (outcome)

How and to what extent does the program increase student skills and knowledge about laboratory research? (outcome)


Institutional Impact Questions
How has the training program affected your institution?

Institutions have structures which are defined by formal rules (laws, regulations, policies) and informal rules (culture, tradition, trust, implied codes of conduct) that shape people’s behavior

Where might we see institutional change?

• Curriculum development
• Policies and practices
• Services and support offered to students and faculty
• Increased faculty awareness of and responsibility for diversity
• Impact on students not supported by the training program


Target Population

What group or groups do you need information about in order to answer the study questions?

People or institutions? Size, general characteristics, any subgroups, etc.?

Examples
• Trainees and students

• Project managers

• Academic coordinators

• Faculty

• High-ranking administrators


Key Variables
What specific information is needed to answer the study questions?

Examples

• Program resources – funding, staffing, infrastructure

• Population characteristics – demographics

• Program activities – operations, processes, other activities

• Program goals, performance measures, comparison measures

Program goal: Provide training opportunity for participants

Performance measure: Minimum # workshops held per year

Comparison measure: At least 4 workshops held per year (recognized standard of performance)

• External factors – factors beyond control of the program that may influence program success


Conceptual Framework
Consider developing a conceptual framework (logic model) to illustrate how the program is supposed to achieve its goals

What is a logic model?

A logic model is a systematic and visual way to present and share your understanding of the relationships among the resources you have to operate your program, the activities you plan, and the changes or results you hope to achieve.

– W.K. Kellogg Foundation, Logic Model Development Guide (www.wkkf.org/Pubs/Tools/Evalaution/Pub3669.pdf)


Model of a Training Program

Resources (Inputs) – What is invested?
• Research base
• Faculty & staff
• Money
• Equipment & technology

Activities (Outputs) – What is done?
• Workshops & seminars
• Mentoring by a faculty member
• Training in scientific methods

Impact (Outcomes) – What are the changes or benefits?
• Short term: knowledge, skills, attitudes
• Intermediate: behaviors, practices
• Long term: enter a Ph.D. program
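One way to see how the pieces of such a model fit together is to write them down as plain data. The sketch below is purely illustrative: the structure and entries mirror the example model above, and nothing here is an official NIGMS template.

```python
# Illustrative sketch only: a training-program logic model captured as plain
# data, following the inputs -> outputs -> outcomes structure above. The
# entries come from the example model; this is not an official NIGMS model.
logic_model = {
    "inputs (what is invested?)": [
        "research base", "faculty & staff", "money", "equipment & technology",
    ],
    "outputs (what is done?)": [
        "workshops & seminars", "mentoring by a faculty member",
        "training in scientific methods",
    ],
    "outcomes (what changes?)": {
        "short term": ["knowledge", "skills", "attitudes"],
        "intermediate": ["behaviors", "practices"],
        "long term": ["enter a Ph.D. program"],
    },
}

# Print the model in invested -> done -> changed order.
for stage, items in logic_model.items():
    print(stage.upper())
    if isinstance(items, dict):
        for horizon, changes in items.items():
            print(f"  {horizon}: {', '.join(changes)}")
    else:
        for item in items:
            print(f"  - {item}")
```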


Conceptual Framework: Why should we use one?

Increases understanding of the program
Provides a common language & framework
Links activities to results
Helps identify variables to measure
Reflects group process and shared understanding
Strengthens the case for program investment

“The bane of evaluation is a poorly designed program.”
– Ricardo Miller, Director, Evaluation Unit, W.K. Kellogg Foundation


Data Collection and Analysis

Will you use new data or secondary data?
Will it be quantitative, qualitative, or mixed?
Are there appropriate comparison groups?
How will you collect the data?
Are there ethical or IRB considerations?
What are the limitations of the data?


Typical Data Collection Strategies

Bibliometric analysis
Pro: Quantitative; useful in aggregate as a tool to assess the quality of medical research
Con: Measures only quantity; can be artificially influenced

Case studies
Pro: Provide an understanding of the interaction of various influences on the research process
Con: Cases are not necessarily representative within or across programs

Database extractions, document reviews
Pro: Useful for analyzing archival data: databases, program records, literature reviews, etc.
Con: Records may be incomplete


Typical Data Collection Strategies (cont.)

Expert panel
Pro: Useful in research fields, especially when few quantifiable indicators exist
Con: Difficult to obtain a systematic, objective assessment

Focus groups
Pro: Provide an understanding of attitudes and thoughts on a subject; the group dynamic can help elicit honest responses
Con: Results cannot be statistically generalized to larger populations; not quantifiable

Interviews
Pro: Offer insight from the perspective of specific program roles and expertise
Con: Limited perspective; time-intensive

Surveys
Pro: Generate statistically reliable data – ratings of services, behavior, demographics, etc.
Con: Require a statistically representative sample & an adequate response rate


Project Management: Who Participates?

Program manager and staff
Contributions: Program knowledge
Challenges: Vested interest

Evaluator
Contributions: Evaluation expertise; independence
Challenges: Limited program knowledge

Evaluation Advisory Committee
Contributions: Program familiarity; evaluation expertise; organizational context
Challenges: Limited program knowledge

Senior Leader / Decision-maker
Contributions: Organizational context; resources
Challenges: Vested interest


Budget Estimate

“Rule of thumb” – 10% of project’s total budget
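As a purely hypothetical illustration of the rule of thumb (the dollar figures are invented for the example): a program with a total budget of $500,000 would plan on roughly 10% × $500,000 = $50,000 for evaluation.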

Common Pitfalls
Failure to consider evaluation in up-front planning
Lack of resources for analysis & interpretation
Lack of time – be realistic & consider time for each step
Qualitative evaluation – more costly to implement


Products of Evaluation

What reports or products are planned?
• Executive summary & final report

• Briefings – for students, faculty, & administrators

How will the results be used?


The Evaluation Design Matrix: A Tool for Discussion

Key Question(s) – What do you want to know?
• Clear and specific
• Measurable
• Doable
• Key terms defined
• Scope
• Timeframe
• Population

Information Required – What do you need to answer the question?
• Program goals
• Evidence
• Program criteria
• Participation rates
• Cost information
• Funding levels

Information Source(s) – Where are you going to get it?
• Program officials or participants
• External stakeholders
• Program documents
• Databases
• Journals

Data Collection Methods – How are you going to get it?
• Structured interviews
• Focus groups
• Structured surveys
• Case studies
• Data extractions
• Document retrieval

Data Analysis Methods – What will you do with it once you get it?
• Descriptive statistics
• Inferential statistics (t-test, regression)
• Cost/benefit analysis
• Qualitative analysis

Limitations – What can't you do (caveats)?
• Data quality or reliability
• Access to records
• Staffing/funding constraints
• Generalizability

Conclusions – What can you say?
• Unexpected findings
• Anecdotal information
• Precise statements about the sample
• Generalizations
• Impact of program changes
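For teams that like to keep the matrix in a machine-readable form, the sketch below shows one hypothetical row drafted as a simple record. The field names mirror the seven columns above, and the example entries are drawn from the lists on these slides rather than from any prescribed design.

```python
# Hypothetical example of one evaluation design matrix row. Field names
# mirror the seven columns above; the values are illustrative placeholders.
design_matrix_row = {
    "key_question": "How is the training program being implemented?",
    "information_required": ["program goals", "participation rates"],
    "information_sources": ["program documents", "program officials"],
    "data_collection_methods": ["document retrieval", "structured interviews"],
    "data_analysis_methods": ["descriptive statistics", "qualitative analysis"],
    "limitations": ["access to records"],
    "conclusions": ["precise statements about the sample"],
}

# Quick completeness check: every column should have at least one entry.
missing = [column for column, value in design_matrix_row.items() if not value]
print("Missing columns:", missing if missing else "none")
```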


Why do we ask for program evaluation? … Because we’re accountable.

What gets measured gets done
If you don’t measure success, you can’t reward it
If you can’t reward success, you’re probably rewarding failure
If you can’t see success, you can’t learn from it
If you can’t recognize failure, you can’t correct it
If you can demonstrate results, you can win public support

– Osborne & Gaebler (1992), Reinventing Government; as summarized by Ellen Taylor-Powell, University of Wisconsin-Extension