Transcript of Anita M. Baker, Ed.D., Building Evaluation Capacity: Presentation Slides for Participatory Evaluation Essentials: A Guide for Non-Profit Organizations and Their Evaluation Partners (Bruner Foundation, Rochester, New York)

Page 1:

Anita M. Baker, Ed.D.

Building Evaluation Capacity Presentation Slides for

Participatory Evaluation Essentials: A Guide for Non-Profit Organizations

And Their Evaluation Partners

Bruner Foundation Rochester, New York

Page 2:

How to Use the Bruner Foundation Guide & PowerPoint Slides

Evaluation Essentials: A Guide for Nonprofit Organizations and Their Evaluation Partners (the Guide) and these slides are organized to help an evaluation trainee walk through the process of designing an evaluation and collecting and analyzing evaluation data. The Guide also provides information about writing an evaluation report. The slides allow for easy presentation of the content, and each section of the Guide includes activities that provide practice opportunities. The Guide has a detailed table of contents for each section and includes an evaluation bibliography. Also included are comprehensive appendices that can be pulled out and used for easy reference; they contain brief presentations of special topics not covered in the main sections, sample logic models, completed interviews that can be used for training activities, and a sample observation protocol.

For the Bruner Foundation-sponsored REP project, we worked through all the information up front in a series of comprehensive training sessions. Each session included a short presentation of information, hands-on activities about the session topic, opportunities for discussion and questions, and homework for trainees to try on their own. By the end of the training sessions, trainees had developed their own evaluation designs, which they later implemented as part of REP. We then provided an additional 10 months of evaluation coaching and review while trainees conducted the evaluations they had designed, and we worked through several of the additional training topics presented in the appendix. At the end of their REP experience, trainees from non-profit organizations summarized and presented the findings from the evaluations they had designed and conducted.

The REP non-profit partners agreed that the up-front training helped prepare them to do solid evaluation work and provided opportunities for them to increase participation in evaluation within their organizations. The slides were first used in 2006-07 in a similar training project sponsored by the Hartford Foundation for Public Giving. We recommend the comprehensive approach for those who are interested in building evaluation capacity. Whether you are a trainee or a trainer, using the Guide to fully prepare for and conduct evaluation or just to look up specific information about evaluation-related topics, we hope that the materials provided here will support your efforts.


Page 3:

These materials are for the benefit of any 501(c)(3) organization. They MAY be used in whole or in part provided that credit is given to the Bruner Foundation.

They may NOT be sold or redistributed in whole or in part for a profit.

Copyright © by the Bruner Foundation 2007

* Please see the notes attached to the first slide for further information about how to use the available materials.

Page 4:

Anita M. Baker, Ed.D.

Building Evaluation Capacity Session 1

Important Definitions
Thinking About Evaluative Thinking


Page 5:

Important Definitions

Page 6:

Working Definition of Program Evaluation

The practice of evaluation involves thoughtful, systematic collection and analysis of information about the activities, characteristics, and outcomes of programs, for use by specific people, to reduce uncertainties, improve effectiveness, and make decisions.


Page 11:

Working Definition of Participatory Evaluation

Participatory evaluation involves trained evaluation personnel and practice-based decision-makers working in partnership.

It brings together seasoned evaluators with seasoned program staff to:
Address training needs
Design, conduct and use results of program evaluation

Page 12:

Evaluation Strategy Clarification

All Evaluations Are:
Partly social
Partly political
Partly technical

Both qualitative and quantitative data can be collected and used, and both are valuable.

There are multiple ways to address most evaluation needs.

Different evaluation needs call for different designs, types of data and data collection strategies.

Page 13:

Purposes of Evaluation

Evaluations are conducted to:

Render judgment Facilitate improvements Generate knowledge

Evaluation purpose must be specified at the earliest stages of evaluation planning and with input from multiple stakeholders.

Page 14:

What is an Evaluation Design?

An Evaluation Design communicates plans to evaluators, program officials and other stakeholders.

Evaluation Designs help evaluators think about and structure evaluations.

Page 15:

Good Evaluation Designs Include the Following

Summary Information about the program

The questions to be addressed by the evaluation

The data collection strategies that will be used

The individuals who will undertake the activities

When the activities will be conducted

The products of the evaluation (who will receive them and how they should be used)

Projected costs to do the evaluation

Page 16:

Evaluation Questions Get You Started

Focus and drive the evaluation.

Should be carefully specified and agreed upon in advance of other evaluation work.

Generally represent a critical subset of information that is desired.

Page 17:

What about Evaluation Stakeholders?

Evaluation stakeholders include anyone who makes decisions about a program, desires information about a program, and/or is involved directly with a program.

• Most programs have multiple stakeholders.

• Stakeholders have diverse, often competing interests.

Page 18:

Who are Evaluation Stakeholders?

Organization officials

Program staff

Program clients or their caregivers

Program Funders

Page 19:

What do you need to know about a program . . . before you design an evaluation?

1. What is/are the purpose(s) of the program?

2. What stage is the program in? (new, developing, mature, phasing out)

3. Who are the program clients?

4. Who are the key program staff (and, where applicable, in which department is the program)?

5. What specific strategies are used to deliver program services?

6. What outcomes are program participants expected to achieve?

7. Are there any other evaluation studies currently being conducted regarding this program?

8. Who are the funders of the program?

9. What is the total program budget?

10. Why has this program been selected for evaluation?

Page 20:

Thinking About Evaluative Thinking

Page 21:

What is Evaluative Thinking?

Evaluative Thinking is a type of reflective practice that incorporates use of systematically collected data to inform organizational decisions and other actions.

Page 22:

What Are Key Components of Evaluative Thinking?

1. Asking questions of substance

2. Determining data needed to address questions

3. Gathering appropriate data in systematic ways

4. Analyzing data and sharing results

5. Developing strategies to act on findings


Page 28:

How is Evaluative Thinking Related to Organizational Effectiveness?

Organizational capacity areas (i.e., core skills and capabilities such as leadership, management, finance and fundraising, programs, and evaluation) where evaluative thinking is less evident are usually also the capacity areas of the organization that need to be strengthened.

Assessing evaluative thinking provides insight for organizational capacity enhancement.

Page 29:

Why Assess Evaluative Thinking?

Assessment of Evaluative Thinking . . .

helps clarify what evaluative thinking is

helps to identify organizational capacity areas where evaluative thinking is more or less prominent (or even non-existent)

informs the setting of priorities regarding how to enhance or sustain evaluative thinking

Page 30:

What Organizational Capacity Areas Does the Bruner Foundation Evaluative Thinking Tool Address?

Mission
Strategic Planning
Governance
Finance
Leadership
Fund Development
Evaluation
Client Relationships
Program Development
Communication & Marketing
Technology Acquisition & Training
Staff Development
Human Resources
Alliances/Collaborations
Business Venture Development

Page 31:

How Can Evaluative Thinking be Assessed?

Develop or locate a tool.

Decide on an administrative approach and strategy:

– Individual vs. Team/Group

– Timing of administration

– Communicating about the assessment

Discuss how results could be used and plan for next steps.

Page 32:

The Bruner Foundation Evaluative Thinking Assessment Tool

ORGANIZATION MISSION (each item is rated for Assessment and Priority)

a. The mission statement is specific enough to provide a basis for developing goals and objectives
b. The mission is reviewed and revised on a scheduled basis (e.g., annually) with input from key stakeholders as appropriate
c. The organization regularly assesses compatibility between programs and mission
d. The organization acts on the findings of compatibility assessments (in other words, if a program is not compatible with the mission, it is changed or discontinued)

Comments:

Please proceed to the next worksheet.

Page 33:

The Bruner Foundation Evaluative Thinking Assessment Tool

GOVERNANCE (each item is rated for Assessment and Priority)

a. Board goals/workplan/structure are based on the mission and strategic planning
b. Board uses evaluation data in defining goals/workplan/structure and organizational strategic planning
c. Board regularly evaluates progress relative to its own goals/workplan/structure
d. There is a systematic process and timeline for identifying, recruiting, and electing new board members
e. Specific expertise needs are identified and used to guide board member recruitment
f. The board regularly (e.g., annually) evaluates the executive director's performance based on established goals/workplan
g. Board members assess and approve the personnel manual covering personnel policy
h. The board assesses the organization's progress relative to long-term financial plans
i. The board assesses the organization's progress relative to program evaluation results

Comments:

Page 34:

The Bruner Foundation Evaluative Thinking Assessment Tool

TECHNOLOGY ACQUISITION PLANNING AND TRAINING (each item is rated for Assessment and Priority)

a. An assessment process is in place to make decisions about technology maintenance, upgrades, and acquisition
b. Technology systems include software that can be used to manage and analyze evaluation data (e.g., Excel, SPSS)
c. Technology systems provide data to evaluate client outcomes
d. Technology systems provide data to evaluate organizational management
e. Technology systems are regularly assessed to see if they support evaluation
f. Staff technology needs are regularly assessed

Comments:

Page 35:

The Bruner Foundation Evaluative Thinking Assessment Tool

Bruner Foundation Evaluative Thinking Assessment

Organizational Capacity Area | Capacity Score* | Action Planning** (select from list)
1. Mission | 50 | Action suggested (see priorities)
2. Strategic Planning | 50 | Action suggested (see priorities)
3. Governance | 63 | No action required in this area
4. Leadership | 92 | No action required in this area
5. Finance | 71 | Action suggested (see priorities)
6. Fund Development/Fund Raising | 50 | Action suggested (see priorities)
7. Evaluation | 69 | Action required (see priorities)
8. Program Development | 80 | No action required in this area
9. Client Relationships | 80 | No action required in this area
10. Communication and Marketing | 80 | No action required in this area
11. Technology Acquisition and Planning | 67 | Action suggested (see priorities)
12. Staff Development | 67 | Action suggested (see priorities)
13. Human Resources | 33 | Action required (see priorities)
14. Business Venture Development | 50 | No action required in this area
15. Alliances and Collaboration | 40 | No action required in this area
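The slides do not show the formula behind these capacity scores. As a minimal sketch of one plausible approach, the snippet below assumes each worksheet item is rated on a 0-3 scale and that an area's score is the mean item rating rescaled to 0-100; both the rating scale and the rescaling are assumptions for illustration, not the tool's documented method.

```python
# Minimal sketch only: the slides do not specify the assessment tool's
# scoring formula. Assumes each worksheet item is rated 0-3 and that an
# area's capacity score is the mean item rating rescaled to 0-100.
def capacity_score(ratings, max_rating=3):
    """Return a 0-100 capacity score, or None if no items were rated."""
    rated = [r for r in ratings if r is not None]
    if not rated:
        return None  # nothing rated yet (a spreadsheet would show #DIV/0!)
    return round(100 * sum(rated) / (len(rated) * max_rating))

# Example: four Mission items rated 2, 1, 2, 1 yield a score of 50,
# matching the Mission row in the table above.
print(capacity_score([2, 1, 2, 1]))  # 50
```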

Page 36:

The Bruner Foundation Evaluative Thinking Assessment Tool

[Bar chart: "Evaluative Thinking Scores," showing each organizational capacity area's score from the preceding table on a 0 to 100 scale.]

Page 37:

How Should Evaluative Thinking Assessment Results be Used?

1. Review assessment results.

2. Distinguish communications vs. strategic issues (where possible).

3. Identify priorities and learn more about strategies to enhance Evaluative Thinking.

4. Develop an action plan based on priorities and what’s been learned about enhancing Evaluative Thinking.

5. Re-assess Evaluative Thinking and determine the effectiveness of the action plan.

Page 38:

Anita M. Baker, Ed.D.

Building Evaluation Capacity Session 2

Logic Models
Outcomes, Indicators and Targets


Page 39:

Logic Model Overview

Page 40:

So, what is a logic model anyway?

A Logic Model is a simple description of how a program is understood to work to achieve outcomes for participants.

It is a process that helps you to identify your vision, the rationale behind your program, and how your program will work.

Page 41:

And . . . . Logic models are useful tools for program planning, evaluation and fund development.

Developing or summarizing a logic model is a good way to bring together a variety of people involved in program planning to build consensus on the program’s design and operations.

Page 42:

Why use a Logic Model?

Developing a Logic Model will help you get clear about what you're doing, and how you hope it will make a difference.

You have the best knowledge of the context of your work and what’s important to you and your communities.

Developing a Logic Model draws from what you already know.

A Logic Model will leave you with a clear, thoughtful plan for what you are doing and what you hope to achieve. This plan can be an advocacy resource, bring clarity to your message and help you tell your story.

Page 43:

To Construct a Logic Model You Must Describe:

Inputs: resources, money, staff/time, facilities, etc.

Activities: how a program uses inputs to fulfill its mission – the specific strategies, service delivery.

Outcomes: changes to individuals or populations during or after participation.

Inputs → Activities → Outcomes

Page 44:

Here is an illustration that will help you create your own Logic Model.

Inputs: Resources dedicated to or consumed by the program.
E.g., money; staff and staff time; volunteers and volunteer time; facilities; equipment and supplies.

Activities: What the program does with the inputs to fulfill its mission.
E.g., provide x number of classes to x participants; provide weekly counseling sessions; educate the public about signs of child abuse by distributing educational materials to all agencies that serve families; identify 20 mentors to work with youth and opportunities for them to meet monthly for one year.

Outcomes: Benefits for participants during and after program activities.
E.g., new knowledge; increased skills; changed attitudes; modified behavior; improved condition; altered status.

Contextual Analysis: Identify the major conditions and reasons for why you are doing the work in your community.

Page 45:

Let's analyze an example logic model

Contextual Analysis
People in my community:
• Have few job skills and are likely to have bad jobs or no jobs, and limited job histories.
• Have few opportunities for job training, placement, or help to deal with issues that come up while on the job.

Activities
• Provide 6 weekly Soft Skills classes.
• Identify on-the-job training opportunities and assist participants with placement.
• Conduct 6 months of on-the-job supervised training and lunchtime mentoring sessions.

Short-term Outcomes
• Participants learn specific marketable skills and strategies to help them get and keep jobs.
• Participants establish trusting relationships with mentors who can answer questions and support them while they are involved in on-the-job training.

Longer-term Outcomes
• Participants maintain their employment and establish records that increase the likelihood for continuous work and better jobs.

Assumptions
• Jobs exist, we just have to help people find them. The absence of a job history perpetuates unemployment.
• Education can help people improve their skills. Being able to ask a mentor for advice is useful.
• Job seekers need help with soft skills and technical training.
• Personal, one-on-one attention and classes can inspire and support people in keeping jobs and establishing job histories.
• Getting solid hard and soft skills are the first steps to keeping a job.
• If people feel supported, they will keep working.

Ask yourself . . .
. . . do the outcomes seem reasonable given the program activities?
. . . do the assumptions resonate with me and my experiences?
. . . are there gaps in the strategy?
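For teams that keep program documentation in electronic form, a logic model like the example above can also be recorded as a simple structured object. The sketch below is purely illustrative; the class and field names are assumptions, not a format prescribed by the Guide.

```python
# Illustrative sketch: one way to record a logic model in structured form.
# The field names are assumptions for this example, not the Guide's format.
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    context: list[str] = field(default_factory=list)  # contextual analysis
    assumptions: list[str] = field(default_factory=list)
    inputs: list[str] = field(default_factory=list)
    activities: list[str] = field(default_factory=list)
    short_term_outcomes: list[str] = field(default_factory=list)
    longer_term_outcomes: list[str] = field(default_factory=list)

jobs_program = LogicModel(
    context=["Community members have few job skills and limited job histories."],
    activities=[
        "Provide 6 weekly Soft Skills classes.",
        "Identify on-the-job training opportunities and assist with placement.",
        "Conduct 6 months of supervised training and lunchtime mentoring.",
    ],
    short_term_outcomes=["Participants learn specific marketable skills."],
    longer_term_outcomes=["Participants maintain employment and build records."],
)
print(len(jobs_program.activities))  # 3
```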

Page 46:

Summarizing a Logic Model Helps to:

Create a snapshot of program operations that shows what is needed, how services are delivered and what is expected for participants.

Describe programs currently or optimally.

Identify key components to track.

Think through the steps of participant progress and develop a realistic picture of what can be accomplished.

Page 47:

Important Things to Remember

Not all programs lend themselves easily to summarization in this format.

Logic models are best used in conjunction with other descriptive information or as part of a conversation.

It is advisable to have one or two key project officials summarize the logic model but then to have multiple stakeholders review it and agree upon what is included and how.

Page 48:

Important Things to Remember

When used for program planning, it is advisable to start with outcomes and then determine what activities will be appropriate and what inputs are needed.

There are several different approaches and formats for logic models. This one is one-dimensional and limited to three program features (inputs, activities, outcomes).

The relationships between inputs, activities and outcomes are not one-to-one. The model is supposed to illustrate how the set of inputs could support the set of activities that contribute to the set of outcomes identified. (Levels of service delivery or “outputs” are shown in the activities.)

Page 49:

Use Logic Models for Planning, Evaluation and Fund Development

Contextual Analysis
Identify the major conditions and reasons for why you are doing or could do this work.
What is needed to address the context that exists?
What would be interesting to try?
What do we need to respond to this RFP?

Inputs: What resources do we need, can we dedicate, or do we currently use for this project?

Activities: What can or do we do with these inputs to fulfill the program mission?

Short-term Outcomes: What benefits for participants during and after the program can we or do we expect? New knowledge? Increased skills? Changed attitudes? Modified behavior? Improved condition? Altered status?

Longer-term Outcomes: What do we think happens ultimately? How does or can this contribute to organizational and community value? When do we think outcomes will happen, and will what happens initially affect or cause other longer-term outcomes? How does this fit into our outcome desires overall?

Assumptions

Ask yourself . . .
. . . do the outcomes seem reasonable given the program activities?
. . . do the assumptions resonate with me and my experiences?
. . . are there gaps in the strategy?

Page 50:

Outcomes, Indicators and Targets

Page 51:

What is the difference between outcomes, indicators, and targets?

Outcomes are changes in behavior, skills, knowledge, attitudes, condition or status.

Outcomes are related to the core business of the program, are realistic and attainable, within the program’s sphere of influence, and appropriate.

Outcomes are what a program is held accountable for.

Page 52:

What is the difference between outcomes, indicators, and targets?

Indicators are specific characteristics or changes that represent achievement of an outcome.

Indicators are directly related to the outcome and help define it.

Indicators are measurable, observable, can be seen, heard or read, and make sense in relation to the outcome whose achievement they signal.

Page 53:

What is the difference between outcomes, indicators, and targets?

Targets specify the amount or level of outcome attainment that is expected, hoped for or required.

Page 54:

Why measure outcomes?

To see if your program is really making a difference in the lives of your clients

To confirm that your program is on the right track

To be able to communicate to others what you’re doing and how it’s making a difference

To get information that will help you improve your program

Page 55:

Use Caution When Identifying Outcomes

There is no right number of outcomes. Be sure to think about when to expect outcomes.

1) Initial Outcomes: First benefits/changes participants experience
2) Intermediate Outcomes: Link initial outcomes to longer-term outcomes
3) Longer-term Outcomes: Ultimate outcomes desired for program participants

Page 56:

Use Caution When Identifying Outcomes

Outcomes should not go beyond the program’s purpose.

Outcomes should not go beyond the scope of the target population.

Avoid holding a program accountable for outcomes that are tracked and influenced largely by another system.

Do not assume that all subpopulations will have similar outcomes.

Consider carefully unintended and possibly negative outcomes.

Page 57:

Identifying Outcomes: Consider This . . .

• Is it reasonable to believe the program can influence the outcome in a non-trivial way?

• Would measurement of the outcome help identify program successes and help pinpoint and address problems or shortcomings?

• Will the program's various "publics" accept this as a valid outcome of the program?

• Do program activities and outcomes relate to each other logically?

GET FEEDBACK

Page 58:

How do you identify indicators?

Indicators are specific characteristics or changes that represent achievement of an outcome.

Indicators are directly related to the outcome and help define it.

Indicators are measurable, observable, can be seen, heard or read, and make sense in relation to the outcome whose achievement they signal.

Ask the questions shown on the following slide.

Page 59:

Questions to Ask When Identifying Indicators

1. What does this outcome look like when it occurs?

2. What would tell us it has happened?

3. What could we count, measure or weigh?

4. Can you observe it?

5. Does it tell you whether the outcome has been achieved?

Page 60:

Use the "I'll know it when I see it" rule

The BIG question is: what evidence do we need to see to be convinced that things are changing or improving?

The "I'll know it (outcome) when I see it (indicator)" rule in action -- some examples. Let's "break it down":

I'll know that retention has increased among home health aides involved in a career ladder program
when I see a reduction in the employee turnover rate among aides involved in the program
and when I see survey results that indicate that aides are experiencing increased job satisfaction.

Page 61:

“I’ll know it when I see it”

I'll know that economic stability has increased among the clients I place in permanent employment
when I see an increase in the length of time that clients keep their jobs
and when I see an increase in the number of clients who qualify for jobs with benefits.

I'll know my clients are managing their nutrition and care more effectively
when I see my clients consistently show up for scheduled medical appointments
and when I see decreases in my clients' body mass indexes (BMI).

Page 62:

Remember! When Identifying Indicators . . .

Indicators must be observable and measurable.

Indicators may not capture all aspects of an outcome.

Many outcomes have more than one indicator. Identify the set that you believe (or have agreed) adequately and accurately signals achievement of an outcome.

Page 63:

Examples of Indicators

Outcome (Initial): Teens are knowledgeable of prenatal nutrition and health guidelines.
Indicator: Program participants are able to identify food items that are good sources of major dietary requirements.

Outcome (Intermediate): Teens follow proper nutrition and health guidelines.
Indicators: Participants are within proper ranges for prenatal weight gain; participants abstain from smoking; participants take prenatal vitamins.

Outcome (Intermediate): Teens deliver healthy babies.
Indicator: Newborns weigh at least 5.5 pounds and score 7 or above on the APGAR scale.

Page 64:

What are Targets?

Targets specify the amount or level of outcome attainment that is expected, hoped for or required.

Page 65:

How do you Identify Targets?

Targets or levels of outcome attainment can be determined relative to:

External standards (when they are available)

Internal agreement:
• best professional hunches
• past performance
• performance of similar programs

Page 66:

Example of a Target

Outcome: Parents will read to their preschoolers more often.

Indicator: Parent reports of increased reading time after coming to the program.

Target: 75% of participating parents will report a 50 percent increase in how often they read to their preschoolers.

Page 67:

Example of a Target

Outcome: Parents will read to their preschoolers more often.

Indicator: Parent reports of increased reading time after coming to the program.

Target: 75% of participating parents will report reading to their preschoolers for at least 15 minutes, 4 or more nights per week.
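To make checking a target like this concrete, here is a small sketch that computes attainment from parent survey reports; the data and the record format are hypothetical illustrations of the target above.

```python
# Hypothetical sketch: checking the reading target against parent survey data.
# Each record is (minutes per reading session, nights per week), as reported.
reports = [(20, 5), (15, 4), (10, 6), (30, 3), (15, 5), (25, 4), (15, 2), (20, 4)]

# Indicator: reads at least 15 minutes on 4 or more nights per week.
meets = [minutes >= 15 and nights >= 4 for minutes, nights in reports]
attainment = 100 * sum(meets) / len(reports)

TARGET = 75  # 75% of participating parents should meet the indicator
print(f"{attainment:.1f}% met the indicator; "
      f"target {'met' if attainment >= TARGET else 'not met'}.")
```

With these hypothetical reports, 5 of 8 parents (62.5%) meet the indicator, so the 75% target is not met.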

Page 68:

Targets: Some Words of Caution

Performance targets should be specified in advance (i.e., when deciding to measure outcomes).

Be sure there is buy-in regarding what constitutes a positive outcome – when the program has achieved the target and when it has missed the mark.

Lacking data on past performance, it may be advisable to wait before setting targets.

Be especially cautious when wording numerical targets so they are not over- or under-ambitious.

Be sure target statements are in sync with meaningful program time frames.

Page 69:

Anita M. Baker, Ed.D.

Building Evaluation Capacity Session 3

Evaluation Questions and Designs
Documenting Service Delivery

Enhancing Service Delivery


Page 70:

To Construct a Logic Model You Must Describe:

Inputs: resources, money, staff/time, facilities, etc.

Activities: how a program uses inputs to fulfill its mission – the specific strategies, service delivery.

Outcomes: changes to individuals or populations during or after participation. (It is easiest to embed targets here.)

Indicators: Indicators are specific characteristics or changes that represent achievement of an outcome

Targets: specify the amount or level of outcome attainment that is expected, hoped for or required.

Inputs → Activities → Outcomes → Indicators w/ Targets → Data Sources

Page 71:

To Construct a Logic Model You Must Describe:

Inputs: resources, money, staff/time, facilities, etc.

Activities: how a program uses inputs to fulfill its mission – the specific strategies, service delivery.

Outcomes: changes to individuals or populations during or after participation. (It is easiest to embed targets here.)

Indicators: Indicators are specific characteristics or changes that represent achievement of an outcome

Targets: specify the amount or level of outcome attainment that is expected, hoped for or required.

Inputs → Activities → Outcomes → Indicators → Data Sources

Reports – staff, clients, sig. others

Existing records – staff, clients

Observation – staff, clients

Test Results – staff, clients

Page 72:

Evaluation Strategy Clarification

All Evaluations Are: Partly social Partly political Partly technical

Both qualitative and quantitative data can be collected and used and both are valuable

There are multiple ways to address most evaluation needs.

Page 73:

What is an Evaluation Design?

An Evaluation Design communicates plans to evaluators, program officials and other stakeholders.

Evaluation Designs help evaluators think about and structure evaluations.

Page 74:

Good Evaluation Designs Include the Following (see Appendix 6)

Summary Information about the program

The questions to be addressed by the evaluation

The data collection strategies that will be used

The individuals who will undertake the activities

When the activities will be conducted

The products of the evaluation (who will receive them and how they should be used)

Projected costs to do the evaluation

Page 75:

Evaluation Questions . . .

Focus and drive the evaluation.

Should be carefully specified and agreed upon in advance of other evaluation work.

Generally represent a critical subset of information that is desired.

Page 76:

Evaluation Questions: Criteria

• It is possible to obtain data to address the questions.
• There is more than one possible "answer" to the question.
• The information to address the questions is wanted and needed.
• It is known how resulting information will be used internally (and externally).
• The questions are aimed at changeable aspects of programmatic activity.

Page 77:

Evaluation Questions: Advice

Limit the number of evaluation questions

Between two and five is optimal

Keep it Manageable


Page 79:

Evaluation Questions: Examples

Evaluation Question: How was staff training delivered, how did participants respond and how have they used what they learned?
Data Collection/Protocol Questions:
1. How would you rate the staff training you received?
2. Did the staff training you received this year meet your needs?
3. Has the training you received changed your practice?
4. Has the training you received led to changes in . . .

Evaluation Question: How and to what extent has the program met its implementation goals?
Data Collection/Protocol Questions:
1. What does the X program do best? What is your greatest concern?
2. Do staff communicate with caretakers as often as required?
3. Did you receive all the services promised in the program brochure?
4. How knowledgeable are staff about the issues you face?

Evaluation Question: What impact has the program had on participants?
Data Collection/Protocol Questions:
1. Have you changed the way you proceed with planning requirements?
2. Do you know more about guardianship now than before the program?
3. How would you rate this program overall?

Page 80:

Switching Gears

How are Evaluative Thinking and Service Delivery (Activities) related?

Page 81:

When Organizations use Evaluative Thinking . . .

• Client interaction includes collection and use of information.

• Service delivery and program development include collection and use of information.

Page 82:

Examples of Evaluative Thinking: Client Interaction

• Client needs assessments are conducted regularly.

• Program services reflect client needs.

• Client satisfaction and program outcomes are regularly assessed.

• Results of client outcome assessments and client satisfaction are used.

Page 83:

Evaluative Thinking: Client Data

Avoid:
1) Designing programs based on what funders want or only on what is "thought" to be best.
2) Assuming client happiness = program effectiveness.
3) Collecting but not analyzing client data.
4) Limiting data collection from clients to satisfaction only.

Page 84:

Examples of Evaluative Thinking: Program Development

• Identifying gaps in community services before planning new programs.

• Assessing the needs of the target population as part of program planning process.

• Using data from needs assessments and/or gaps analyses to inform planning.

Page 85:

Organizations that Regularly Use Evaluative Thinking Will Also . . .

• Think carefully about developing and assessing programs.

• Incorporate program evaluation findings into the program planning.

• Involve significant others in planning/revising.

• Develop written program plans and logic models.

• Follow program plans.

• Have strategies in place to modify plans.

Page 86:

What Can Organizational Leaders do to Enhance Evaluative Thinking?

• Educate staff about Evaluative Thinking.
• Be clear about what it means to take an evaluative approach.

• Set the stage for others by using Evaluative Thinking in your own practice.

** Ask important questions before decisions are made,
** Systematically collect and analyze data to inform decisions,
** Share results of findings and base responses on the results of analyses (as appropriate).

Page 87:

Remember Logic Models???

Inputs: resources, money, staff/time, facilities, etc.

Activities: how a program uses inputs to fulfill its mission – the specific strategies, service delivery.

Outcomes: changes to individuals or populations during or after participation. (It is easiest to embed targets here.)

Indicators: Indicators are specific characteristics or changes that represent achievement of an outcome

Inputs → Activities → Outcomes → Indicators

Page 88:

What strategies are used to collect data about indicators?

Surveys

Interviews

Observations

Record/Document Reviews

Page 89:

Evaluation Data Collection

Surveys

Interviews

Observations

Record/Document Reviews

Page 90:

Surveys: Have a series of questions (items) with pre-determined response choices.

Can include all independent items or groups of items (scales) that can be summarized.
Can also include some open-ended items for write-in or clarification.
Can be completed by respondents or survey administrators.
Can be conducted via mail, with a captive audience, on the phone or using the internet, and through a variety of alternative strategies.

Instruments are called surveys, questionnaires, assessment forms.
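Where a survey groups items into a scale, the group is usually summarized as a single score; a common approach is to average the item ratings. A minimal sketch, assuming 1-5 Likert-type items with hypothetical item names:

```python
# Minimal sketch: summarizing a survey scale (a group of related items) by
# averaging item ratings. Assumes a 1-5 Likert-type scale; item names are
# hypothetical.
def scale_score(response, items):
    """Average a respondent's ratings on the items that make up a scale."""
    ratings = [response[item] for item in items if item in response]
    return sum(ratings) / len(ratings) if ratings else None

satisfaction_items = ["sat_q1", "sat_q2", "sat_q3"]
response = {"sat_q1": 4, "sat_q2": 5, "sat_q3": 3}
print(scale_score(response, satisfaction_items))  # 4.0
```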

Page 91:

Use Surveys:

To study attitudes and perceptions.

To collect self-reported assessment of changes in response to program.

To collect program assessments.

To collect some behavioral reports.

To test knowledge.

To determine changes over time.

Best with big or distant groups, for sensitive information.

Page 92:

Evaluation Data Collection

Surveys

Interviews

Observations

Record/Document Reviews

Page 93:

Interviews

An interview is a one-sided conversation between an interviewer and a respondent.

Questions are (mostly) pre-determined, but open-ended. Can be structured or semi-structured.

Respondents are expected to answer using their own terms.

Interviews can be conducted in person, via phone, one-on-one or in groups. Focus groups are specialized group interviews.

Instruments are called protocols, interview schedules, or guides.

Page 94:

Use Interviews:

To study attitudes and perceptions using respondent’s own language.

To collect self-reported assessment of changes in response to program.

To collect program assessments.

To document program implementation.

To determine changes over time.

Page 95:

Evaluation Data Collection

Surveys

Interviews

Observations

Record/Document Reviews

Page 96:

Observations

Observations are conducted to view and hear actual program activities so that they can be described thoroughly and carefully.

Observations can be focused on programs overall or participants in programs.

Users of observation reports will know what has occurred and how it has occurred.

Observation data are collected in the field, where the action is, as it happens.

Instruments are called protocols, guides, or sometimes checklists.

Page 97:

Use Observations:

To document program implementation.

To witness levels of skill/ability, program practices, behaviors.

To determine changes over time.

Page 98:

Evaluation Data Collection

Surveys

Interviews

Observations

Record/Document Reviews

Page 99:

Record Review

Review of program records involves accessing existing internal information or information that was collected for other purposes. Data are obtained from:

a program’s own records (e.g., intake forms, program attendance)

records used by other agencies (e.g., report cards; drug screening results; hospital birth data).

adding questions to standard record-keeping strategies (e.g., a question for parents about program value can be added to an enrollment form).

Instruments are called protocols. Use requires identification of and access to available information.

Page 100:

Anita M. Baker, Ed.D.

Building Evaluation Capacity Session 4

Evaluation Data Collection & Analysis: Surveys and Interviews

Bruner Foundation Rochester, New York

Page 101:

Evaluative Thinking

• Ask important questions before decisions are made,

• Systematically collect and analyze data to inform decisions,

• Share results of findings and

• Base responses and actions on the results of analyses (as appropriate).

Page 102:

Evaluation Data Collection

Surveys

Interviews

Observations

Record/Document Reviews

Page 103:

Surveys: Have a series of questions (items) with pre-determined response choices.

Can include all independent items or groups of items (scales) that can be summarized.

Can also include some open-ended items for write-in or clarification.

Can be completed by respondents or survey administrators.

Can be conducted via mail, with a captive audience, on the phone, or using the internet, and through a variety of alternative strategies.

Instruments are called surveys, questionnaires, or assessment forms.

Page 104:

Surveys Are Most Productive When They Are:

Well targeted, with a narrow set of questions.

Used to obtain data that are otherwise hard to get.

Used in conjunction with other strategies.

Surveys are best used: with large numbers, for sensitive information, for groups that are hard to collect data from.

Most survey data are qualitative, but simple quantitative analyses are often used to summarize responses.

Page 105:

Surveys can be administered and analyzed quickly when . . .

pre-validated instruments are used

sampling is simple or not required

the topic is narrowly focused

the number of questions (and respondents) is relatively small

the need for disaggregation is limited

Page 106:

Use Surveys To . . . study attitudes and perceptions.

collect self-reported assessment of changes in response to program.

collect program assessments.

collect some behavioral reports.

test knowledge.

determine changes over time.

Page 107:

Benefits of Surveys

Surveys can be used for a variety of reasons, such as exploring ideas or getting sensitive information.

Surveys can provide information about a large number and wide variety of participants.

Survey analysis can be simple. Computers are not required.

Results are compelling, have broad appeal and are easy to present.

Page 108:

Drawbacks of Surveys:

Designing surveys is complicated and time consuming.

The intervention effect can lead to false responses, or it can be overlooked.

Broad questions and open-ended responses are difficult to use.

Analyses and presentations can require a great deal of work. You MUST be selective!

Page 109:

Developing/Assessing Survey Instruments

1) Identify key issues.

2) Review available literature.

3) Convert key issues into questions.

4) Determine what other data are needed.

5) Determine how questions will be ordered and formatted.

6) Have survey instrument reviewed.

Page 110:

For Survey Items, Remember:

1) State questions in specific terms, use appropriate language.

2) Use multiple questions to sufficiently cover a topic.

3) Avoid “double-negatives.”

4) Avoid asking multiple questions in one item.

5) Be sure response categories match the question, are exhaustive and don’t overlap.

6) Be sure to include directions and check numbering, format, etc.

Page 111:

Types of Surveys:

Mail Surveys (must have correct addresses and return instructions, must conduct tracking and follow-up). Response is typically low.

Electronic Surveys (must be sure respondents have access to internet, must have a host site that is recognizable or used by respondents; must have current email addresses). Response is often better.

Web + (combining mail and e-surveys). Data input required, analysis is harder.

Page 112:

Types of Surveys:

Phone Surveys (labor intensive and require trained survey administrators, access to phone numbers, usually CATI software). Response is generally better than mail, but must establish refusal rules.

Staged Surveys (trained survey administrators required, caution must be used when collecting sensitive info). Can be administered orally, multiple response options possible, response rates very high.

Intercept Surveys (require trained administrators). Refusal is high.

Page 113:

Sampling

Surveys are not always administered to every member of a group (population). Often, some members, a sample, are selected to respond. (Additional strategies in manual.)

Convenience Samples. Provide useful information to estimate outcomes (e.g., 85% of respondents indicated the program had definitely helped them). Must be used cautiously; generalization limited.

Random Samples. Everyone must have equal opportunity. Careful administration and aggressive follow-up needed. Generalization/prediction possible. (See the sketch below.)
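To make the random-sample idea concrete, here is a minimal sketch in Python using only the standard library; the roster, seed, and sample size are hypothetical, not from the Guide.

```python
import random

# Hypothetical roster of all program participants (the population).
roster = [f"participant_{i:03d}" for i in range(1, 251)]

# Simple random sample: every member has an equal chance of selection.
random.seed(42)  # fixed seed so the draw can be reproduced and documented
sample = random.sample(roster, k=70)

print(f"Population: {len(roster)}, sampled: {len(sample)}")
```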

Page 114:

How Many Surveys Do You Need to Administer?

Identify the population size, desired confidence and sampling error thresholds.

95% confidence with 5% error is common. With the right sample size you can be 95% confident that the answer given by respondents is within 5 percentage points of the answer if all members of the population had responded.

Use this formula: n = 385 / (1 + (385 / N)), where N is the number of all possible respondents. OR consult a probability table (see manual).

Page 115:

How Many Surveys Do You Need to Administer?

The sample should be as large as probabilistically required. (Probability – not Percentage.)

If a population is smaller than 100, include them all.

When a sample is already comparatively large, adding cases does little to increase precision.

When the population size is small, relatively large proportions are required, and vice versa.

You must always draw a larger sample than needed to accommodate refusal: adjusted sample = desired sample size ÷ (1 − refusal proportion). (See the sketch below.)
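A minimal sketch pulling together the two rules above: the slide's n = 385 / (1 + 385/N) formula (95% confidence, ±5% error) and the refusal adjustment. The population size and refusal rate here are hypothetical.

```python
import math

def required_sample(population: int) -> int:
    """Sample size for 95% confidence / ±5% error, per the slide's formula."""
    if population < 100:          # small populations: include everyone
        return population
    return math.ceil(385 / (1 + 385 / population))

def oversample_for_refusal(desired: int, refusal_rate: float) -> int:
    """Draw a larger sample to accommodate expected refusals."""
    return math.ceil(desired / (1 - refusal_rate))

n = required_sample(600)                 # hypothetical population of 600
print(n)                                 # -> 235
print(oversample_for_refusal(n, 0.20))   # 20% expected refusal -> 294
```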

Page 116:

How Can I Increase Response?

Write a good survey and tailor administration to respondents.

Advertise survey purpose and administration details in advance.

Carefully document who receives and completes surveys. Follow up aggressively. Send reminders.

Consider using incentives. Make response easy.

Remember: Non-response bias can severely limit your ability to interpret and use survey data.

Page 117:

Calculating Response Rates

Response rate is calculated by dividing the number of returned surveys by the total number of “viable” surveys administered.

Desirable response rates should be determined in advance of analysis and efforts should be made to maximize response.
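As a minimal illustration of the calculation described above (the counts are hypothetical; undeliverable surveys are treated as non-viable):

```python
def response_rate(returned: int, viable_administered: int) -> float:
    """Returned surveys divided by the number of 'viable' surveys administered."""
    return returned / viable_administered

# Hypothetical fielding: 294 administered, 6 undeliverable (not viable), 201 returned.
print(f"{response_rate(201, 294 - 6):.0%}")  # -> 70%
```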

Page 118:

Administration Plans (see Appendix 12)

Before you administer a survey be sure you can answer the following questions!

Who and where are your target groups? Do they require assistance to answer?

Which type of survey will be best to use with your target group? How often?

Will the survey be anonymous or confidential?

How much time will be required to respond?

How will you analyze the data you expect to collect?

Page 119:

Administration Plans (Con’t.)What specific fielding strategy will be used? Will there be incentives?

How will you track the surveys?

How will you provide ample opportunities for all members of the sample to respond? What response rate is desired?

Whose consent is required/desired? Will you use active or passive consent?

How will you store and maintain the confidentiality of the information?

Page 120:

Preparing for Analysis: Developing Codebooks

Unless they're embedded, assign numbers for all response categories.

Write the codes onto a copy of the survey and use for reference.

It is bad practice to re-code data as you go. Prepare for entry as is.

List or describe how data are to be recoded. (See the sketch below.)
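A minimal sketch of what a codebook might look like in Python; the items, labels, and codes are hypothetical. Responses are entered as-is and recoded against the codebook afterward, per the slide's advice.

```python
# Hypothetical codebook: numeric codes for each response category, per item.
codebook = {
    "q1_program_rating": {1: "Very Good", 2: "Good", 3: "Fair", 4: "Poor"},
    "q2_smokes":         {1: "Yes", 2: "No"},
}

# Reverse lookup used at entry time: label -> code.
encode = {item: {label: code for code, label in cats.items()}
          for item, cats in codebook.items()}

raw_response = {"q1_program_rating": "Good", "q2_smokes": "No"}
entered = {item: encode[item][label] for item, label in raw_response.items()}
print(entered)  # {'q1_program_rating': 2, 'q2_smokes': 2}
```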

Page 121:

What Should Your Survey Analysis Plan Include?

How survey items are related to the evaluation overall

What analytical procedures, including disaggregation, will be conducted with each kind of data you collect

How you will present results

How you will decide whether data show that targets have been exceeded, met or missed (as appropriate)

How you will handle missing data

Page 122:

Example: Student Survey Analysis Plan

1. The percentages of all students who smoke and those who have recently started will be calculated.

2. The percentage of boys who smoke will be compared to the percentage of girls who smoke.

3. The average age of first alcohol use will be calculated from students' responses.

Page 123:

Example: Student Survey Analysis Plan (continued)

4. The percentage of students who provide positive (i.e., Good or Very Good) ratings for the smoking prevention program will be calculated. Answers for self-reported non-smokers will be compared to self-reported smokers.

5. The distribution of scores on the likelihood of addiction scale will be determined.

Only valid percents will be used; items missed by more than 10% of respondents will not be used. Meeting the target means ±5 percentage points. Far exceeding or missing = ±15 percentage points. (A sketch of steps 1 and 2 follows below.)
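A minimal sketch of steps 1 and 2 of the plan above, assuming hypothetical survey records; only valid (non-missing) responses enter each percentage, mirroring the missing-data rule.

```python
# Hypothetical survey records; None marks a missing response.
records = [
    {"gender": "girl", "smokes": True},
    {"gender": "boy",  "smokes": False},
    {"gender": "girl", "smokes": False},
    {"gender": "boy",  "smokes": True},
    {"gender": "boy",  "smokes": None},
]

def pct_smokers(rows):
    # Valid percent: missing responses are excluded from the denominator.
    valid = [r["smokes"] for r in rows if r["smokes"] is not None]
    return 100 * sum(valid) / len(valid)

print(f"All students: {pct_smokers(records):.0f}% smoke")
for g in ("boy", "girl"):  # step 2: disaggregate by gender
    subgroup = [r for r in records if r["gender"] == g]
    print(f"{g}s: {pct_smokers(subgroup):.0f}% smoke")
```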

Page 124:

Common Factors That Can Influence Responses

Participant Characteristics
• Age group, gender, race/ethnicity
• Educational level or type
• Household income group, household composition
• Status (e.g., disabled/not, smoker/non)
• Degree of difficulty of the participant's situation

Location of Program
• Political or geographic boundaries
• Program sites
• Characteristics of location (e.g., distressed/not)

Program Experience (type or amount or history)

Where appropriate, disaggregate by one or more of these!

Page 125:

Survey Result Example: Disaggregated Data

% of 2005 Freshmen who . . .

                                                    Peer Study Group
                                                    Yes (n=232)   No (n=247)   Total (N=479)
Reported struggling to maintain grades                  36%           58%           47%
Are planning to enroll for the sophomore
year at this school                                     89%           72%           80%

Page 126:

Survey Result Example

[Bar chart: Comparison of Site Outcomes, MS/JHS Only; CAP Site 1, CAP Site 2, CAP Site 3; y-axis 0%–100%]

Page 127:

Survey Result Example

Table 3: Relationships between ATOD use and other factors among 9th graders

                                                Ever Used ATOD   Never Used ATOD
Academically and socially attached to school          35%              49%
Have post-secondary aspirations                       68%              73%
Are passing most classes                              86%              94%
Were sent to the office during last 2 months          23%              10%
Describe their health as excellent                    30%              42%
Felt unhappy, sad or depressed recently               32%              12%

Page 128:

Evaluation Data Collection

Surveys

Interviews

Observations

Record/Document Reviews

Page 129:

Interviews

An interview is a one-sided conversation between an interviewer and a respondent.

Questions are (mostly) pre-determined, but open-ended. Can be structured or semi-structured.

Respondents are expected to answer using their own terms.

Interviews can be conducted in person, via phone, one-on-one or in groups. Focus groups are specialized group interviews.

Instruments are called protocols, interview schedules, or guides.

Page 130:

Use Interviews:

To study attitudes and perceptions using respondent’s own language.

To collect self-reported assessment of changes in response to program.

To collect program assessments.

To document program implementation.

To determine changes over time.

Page 131:

Interviews: Methodological Decisions

What type of interview should you conduct? (see pg. 28) Unstructured, semi-structured, structured, or intercept.

What should you ask? How will you word and sequence the questions?

What time frame will you use (past, present, future, mixed)?

Page 132:

Interviews: More About Methodological Decisions

How much detail and how long to conduct?

Who are respondents? (Is translation necessary?)

How many interviews, on what schedule?

Will the interviews be conducted in-person, by phone, on- or off-site?

Are group interviews possible/useful?

Page 133:

Conducting and Recording Interviews: Before

Clarify purpose for the interview.

Specify answers to the methodological decisions.

Select potential respondents – sampling.

Collect background information about respondents.

Develop a specific protocol to guide your interview.

Page 134:

Conducting and Recording Interviews: During

Use the protocol (device) to record responses.

Use probes and follow-up questions as necessary for depth and detail.

Ask singular questions.

Ask clear and truly open-ended questions.

Page 135:

Conducting and Recording Interviews: After

Review interview responses, clarify notes, decide about transcription.

Record observations about the interview.

Evaluate how it went and determine follow-up needs.

Identify and summarize some key findings.

Page 136:

Tips for Effective Interviewing

Communicate clearly about what information is desired, why it’s important, what will happen to it.

Remember to ask single questions and use clear and appropriate language. Avoid leading questions.

Check (or summarize) occasionally. Let the respondent know how the interview is going, how much longer, etc.

Understand the difference between a depth interview and an interrogation. Observe while interviewing.

Practice Interviewing – Develop Your Skills!

Page 137:

More Tips

Recognize when the respondent is not clearly answering and press for a full response.

Maintain control of the interview and neutrality toward the content of response.

Treat the respondent with respect. (Don’t share your opinions or knowledge. Don’t interrupt unless the interview is out of hand).

Practice Interviewing – Develop Your Skills!

Page 138:

Analyzing Interview Data

1) Read/review completed sets of interviews.

2) Record general summaries.

3) Where appropriate, encode responses.

4) Summarize coded data.

5) Pull quotes to illustrate findings. (see pg 30 for examples)

Page 139:

What Happens After Data are Collected?

1. Data are analyzed, results are summarized.

2. Findings must be converted into a format that can be shared with others.

3. Action steps should be developed from findings.

Step 3 moves evaluation from perfunctory compliance into the realm of usefulness.

“Now that we know _____ we will do _____.”

Page 140:

Increasing Rigor in Program Evaluation

Mixed methodologies

Multiple sources of data

Multiple points in time

Page 141:

Building Evaluation Capacity Session 5

Evaluation Data Collection & Analysis: Observation and Record Review

Anita M. Baker, Ed.D.

Bruner Foundation Rochester, New York

Page 142:

What strategies are used to collect data about indicators?

Surveys

Interviews

Observations

Record/Document Reviews

Page 143:

Evaluation Data Collection

Surveys

Interviews

Observations

Record/Document Reviews

Page 144:

Observations

Observations are conducted to view and hear actual program activities so that they can be described thoroughly and carefully.

Observations can be focused on programs overall or participants in programs.

Users of observation reports will know what has occurred and how it has occurred.

Observation data are collected in the field, where the action is, as it happens.

Instruments are called protocols, guides, or sometimes checklists.

Page 145:

Use Observations:

To document program implementation.

To witness levels of skill/ability, program practices, behaviors.

To determine changes over time.

Page 146:

Trained Observers Can:

see things that may escape awareness of others

learn about things that others may be unwilling or unable to talk about

move beyond the selective perceptions of others

present multiple perspectives

Page 147:

Other Advantages

the observer’s knowledge and direct experience can be used as resources to aid in assessment

feelings of the observer become part of the observation data

OBSERVER’S REACTIONS are data, but they MUST BE KEPT SEPARATE

Page 148:

Observations: Methodological Decisions

What should be observed and how will you structure your protocol? (individual, event, setting, practice)

How will you choose what to see?

Will you ask for a “performance” or just attend a regular session, or both? Strive for “typical-ness.”

Page 149:

Observations: Methodological Decisions

Will your presence be known, or unannounced? Who should know?

How much will you disclose about the purpose of your observation?

How much detail will you seek? (checklist vs. comprehensive)

How long and how often will the observations be?

Page 150:

Conducting and Recording Observations: Before

Clarify the purpose for conducting the observation

Specify the methodological decisions you have made

Collect background information about the subject (if possible/necessary)

Develop a specific protocol to guide your observation

Page 151:

Conducting and Recording Observations: During

Use the protocol to guide your observation and record observation data

BE DESCRIPTIVE (keep observer impressions separate from descriptions of actual events)

Inquire about the “typical-ness” of the session/event.

Page 152:

Conducting and Recording Observations: After

Review observation notes and make clarifications where necessary: clarify abbreviations, elaborate on details, transcribe if feasible or appropriate.

Evaluate results of the observation. Record whether: the session went well, the focus was covered, there were any barriers to observation, there is a need for follow-up.

Page 153:

Observation Protocols

Comprehensive: setting; beginning, ending and chronology of events; interactions; decisions; nonverbal behaviors; program activities and participant behaviors; response of participants.

Checklist: "best" or expected practices.

Page 154:

Analyzing Observation Data

Make summary statements about trends in your observations

Every time we visited the program, the majority of the children were involved in a literacy development activity such as reading, illustrating a story they had read or written, practicing reading aloud.

Include “snippets” or excerpts from field notes to illustrate summary points (see manual pp 38-39)

Page 155:

Analyzed Observation Data

Many different types of arts activities were undertaken, and personal development was either delivered directly or integrated with arts activities. Of the 57 different combinations of programming at the 10 sites, only 3 included activities that were not wholly successful with their target groups, 2 of those because of mismatch between instructor and the participant group. At all sites, ongoing projects were underway and examples of participant work were readily visible. Teaching artists were demonstrating skills, giving youth opportunities to try the skills, and providing one-on-one assistance as needed.

Page 156:

Evaluation Data Collection

Surveys

Interviews

Observations

Record/Document Reviews

Page 157:

Record Review

Review of program records involves accessing existing internal information or information that was collected for other purposes. Data are obtained from:

a program’s own records (e.g., intake forms, program attendance)

records used by other agencies (e.g., report cards; drug screening results; hospital birth data).

adding questions to standard record-keeping strategies (e.g., a question for parents about program value can be added to an enrollment form).

Instruments are called protocols. Use requires identification of and access to available information.

Page 158:

Use Record Reviews:

To collect some behavioral reports.

To test knowledge.

To verify self-reported data.

To determine changes over time.

Page 159:

Analyzing/Using Record Review Data

Findings from record review data are usually determined through secondary analysis.

Example: Attendance data are regularly collected for a program to inform routine program operations. Attendance records are summarized quarterly or annually to inform other stakeholders such as funders about program use.

Page 160:

Analyzing/Using Record Review Data

Results of record reviews are typically arrayed in tables or summarized in profiles or "bullet lists," as frequencies, proportions, or averages (see pg. 16, appendix 10 in the Participatory Evaluation Essentials Guide).

Like observation data, record review data can be descriptive and/or evaluative (see pg. 16).

Page 161:

Analyzing/Using Record Review Data

Record review data are commonly combined

for multi-variate analyses

with other evaluation data to determine relationships

Page 162:

Collecting Record Review Data

Review existing data collection forms (suggest modifications or use of new forms if possible).

Develop a code book or at least a data element list keyed to data collection forms.

Develop a “database” for record review data.

Develop an analysis plan with mock tables for record review data.

Page 163:

Record Review Data: Example

ASAP Participant Outcomes

                            New York           Boston
                         Number     %      Number     %
Enrollment Goal            188               112
Enrollment Actual          152     81%        94     84%
Trn. Completion Goal        97                48
Trn. Completion Actual      87     89%        39     81%
Placement Actual (30+)      41     48%        26     59%
Placement Actual (180+)     83     97%        37     84%
In-field placement          77     93%        36     97%

Page 164:

Record Review Data: Example

Outcome: Delivering Healthy Babies

                      In Program        On Waiting List
                     Number    %        Number    %
Babies Born            18                 22
Born Healthy*          13     72%         14     64%
Not Born Healthy*       5     28%          8     36%

*The indicator of a healthy baby is birthweight above 5.5 pounds AND Apgar score 7 or above.
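The starred indicator translates directly into a classification rule; a minimal sketch with hypothetical field names:

```python
def born_healthy(birthweight_lbs: float, apgar: int) -> bool:
    """Indicator from the table: birthweight above 5.5 lbs AND Apgar 7 or above."""
    return birthweight_lbs > 5.5 and apgar >= 7

print(born_healthy(6.2, 8))  # True
print(born_healthy(5.1, 9))  # False (low birthweight)
```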

Page 165:

Record Review Data: Example

Average Pre and Post Test Scores for Youth Enrolled in Summer Learning Camps

            Pre Test       Post Test      Difference
Reading     22.7 (64%)     25.2 (72%)     +2.5
Math        29.9 (85%)     29.7 (85%)     -0.2

Page 166:

What Happens After Data are Collected?

1. Data are analyzed, results are summarized.

2. Findings must be converted into a format that can be shared with others.

3. Action steps should be developed from findings.

Step 3 moves evaluation from perfunctory compliance into the realm of usefulness.

“Now that we know _____ we will do _____.”

Page 167:

Increasing Rigor in Program Evaluation

Mixed methodologies

Multiple sources of data

Multiple points in time

Page 168:

Building Evaluation Capacity Session 6

Designing Evaluations: Putting It All Together

Anita M. Baker, Ed.D.

Bruner Foundation Rochester, New York

Page 169:

Good Evaluation Designs Include the Following

Summary Information about the program

The questions to be addressed by the evaluation

The data collection strategies that will be used

The individuals who will undertake the activities

When the activities will be conducted

The products of the evaluation (who will receive them and how they should be used)

Projected costs to do the evaluation

Page 170:

Increasing Rigor in Program Evaluation

Mixed methodologies

Multiple sources of data

Multiple points in time

Page 171:

What else must you think about? Data Collection Management

Identify data sources

Select data collection methods

Develop and/or test instruments and procedures

Develop plans for entering and managing the data

Train data collectors

Plan for analysis

Plan to monitor the data collection system

Page 172:

Thinking about . . . . Data Collection Instruments

• Who will you collect data about? Clients, caregivers, other service providers working with clients, staff, some other group? Who are considered participants of your program? Be sure to clearly specify your evaluation target population.

• What instruments do you need? Surveys, interview guides, observation checklists and/or protocols, record extraction protocols?

• Are there any pre-tested instruments (e.g., scales for measuring human conditions and attitudes)?

– If not, how will you confirm validity?

Page 173:

Thinking about . . . . Data Collection Instruments

Keeping in mind things like cultural sensitivity, language and expression:

• Are the instruments you plan to use appropriate for the group you are planning to use them with?

• Will responses be anonymous or confidential?

• How will you analyze data from instruments you choose?

Page 174:

Thinking about . . . . Data Collection Procedures

• What are your timelines for data collection?
– When will you administer surveys, conduct interviews, etc.?
– Are pre/post strategies needed? Doable?

• When do you need data?
– Is this the same time that data collectors and subjects are available?
– What outcomes are expected by the time data collection is planned? i.e., is this the proper timeframe?

• What is required for data collection approval?
– Institutional review?
– Active consent?
– Passive consent?
– Informed consent?

Page 175:

Thinking about . . . . Data Entry and Management

• How will you store and maintain the information you collect?
– How much data is expected and in what form?
– What procedures are necessary to ensure confidentiality?
– Where will the data reside?

• How will you handle data entry?
– Do you have specialty software or can you use readily available programs like Excel to help support your data entry?
– Who will actually enter the data and where will it be entered? Are there training needs?

Page 176:

Thinking about . . . . Data Collector Training

• Who will collect the data? Staff within a program, staff from another program, other agency staff, clients from another program (e.g., youth), volunteers?

• What training do data collectors need?
– Can they administer surveys?
– Do they know how to conduct interviews?
– Have they been trained as observers for this data collection?
– Do they have access to and knowledge about records?

Page 177:

Thinking about . . . . Data Analysis

• How will you analyze the data you collect?
– How will you handle quantitative data, e.g., frequencies, averages, ranges, distributions? Do you need tables and graphs? Do you know how to make them?
– How will you handle qualitative data, e.g., quotes, "snippets," numerical summaries?

– What will you do about missing data?

– What influencing factors should you consider? What disaggregation is needed?

• Who (staff, volunteers, consultants) will conduct the analysis and how long will it take? Will they need some additional training?

• Are there any additional costs associated with data analysis?

Page 178:

What are the Components of a Strong Evaluation Report?

* Subject program description.

* Clear statement about the evaluation questions and the purpose of the evaluation.

* Description of actual data collection methods used.

* Summary of key findings (including tables, graphs, vignettes, quotes, etc.).

* Discussion or explanation of the meaning and importance of key findings

* Suggested Action Steps

* Next Steps (for the program and the evaluation).

* Issues for Further Consideration (loose ends)

Page 179:

Additional Reporting Tips

Findings can be communicated in many forms: brief memos, PowerPoint presentations, oral reports; a formal evaluation report is most common.

Think about internal and external reporting.

Plan for multiple reports.

Before you start writing, be sure to develop an outline and pass it by some stakeholders.

If you’re commissioning an evaluation report, ask to see a report outline in advance.

If you are reviewing others’ evaluation reports, don’t assume they are valuable just because they are in a final form. Review carefully for the important components and meaningfulness.

Page 180:

Projecting Level of Effort

LOE projections are often summarized in a table or spreadsheet. To estimate labor and time:

• List all evaluation tasks

• Determine who will conduct each task

• Estimate time required to complete each task (including pre-training), in day or half-day increments (see page 42 in Participatory Evaluation Essentials)
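A minimal sketch of an LOE table held as data and totaled per person; the tasks, roles, and day estimates are hypothetical.

```python
from collections import defaultdict

# Hypothetical LOE: (task, who, days), in day or half-day increments.
loe = [
    ("Design survey instrument", "evaluator",     2.0),
    ("Train data collectors",    "evaluator",     0.5),
    ("Administer surveys",       "program staff", 3.0),
    ("Enter and clean data",     "program staff", 2.5),
    ("Analyze and report",       "evaluator",     4.0),
]

days_by_person = defaultdict(float)
for task, who, days in loe:
    days_by_person[who] += days

for who, days in days_by_person.items():
    print(f"{who}: {days} days")
print(f"Total: {sum(d for _, _, d in loe)} days")
```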

Page 181:

Projecting Timelines

Timelines can be constructed separately or embedded in an LOE chart (see example pp. 44–45, Participatory Evaluation Essentials). To project timelines:

• Assign dates to your level of effort, working backward from overall timeline requirements.

• Be sure the number of days required for a task and when it must be completed are in sync and feasible.

• Check to make sure evaluation calendar is in alignment with program calendar.

Don’t plan to do a lot of data collecting around program holidays

Don’t expect to collect data only between 9 and 5

Page 182:

Budgeting and Paying for Evaluation

• Usually the cost to do good evaluation is equivalent to about 10 – 15% of the costs to operate the program effectively.

• Most of the funds for evaluation pay for the professional time of those who develop designs and tools, collect data, analyze data, summarize and present findings.

• Other expenses include overhead and direct costs associated with the evaluation (e.g., supplies, computer maintenance, communication, software).

Page 183:

Projecting Budgets

• Determine rates for all "staff" on the project.

• Calculate total labor costs by multiplying LOE totals by “staff” rates.

• Estimate other direct costs (ODC) such as copying, mail/delivery, telephone use and facilities.

• Estimate any travel costs.

• Calculate the subtotal of direct costs including labor (fringe where appropriate), ODC and travel.

• Estimate additional indirect (overhead) costs, where appropriate, as a percentage applied to the direct costs.

• Apply any other fees where appropriate

• Sum all project costs to determine total cost of project.

• Establish a payment schedule, billing system and deliverables.
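A minimal sketch of the budget arithmetic in the steps above; the rates, LOE totals, and percentages are hypothetical.

```python
# Hypothetical daily rates and LOE (days) per role.
rates = {"evaluator": 600.0, "program staff": 250.0}
loe_days = {"evaluator": 6.5, "program staff": 5.5}

labor = sum(rates[role] * loe_days[role] for role in rates)  # total labor cost
odc = 450.0      # other direct costs: copying, mail/delivery, phone, facilities
travel = 300.0
direct = labor + odc + travel
indirect = 0.10 * direct   # hypothetical 10% overhead applied to direct costs
total = direct + indirect

print(f"Labor ${labor:,.0f}, direct ${direct:,.0f}, total ${total:,.0f}")
```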

Page 184:

Things to Avoid when Budgeting and Paying for Evaluation

• It’s bad practice to assume there is a standard, fixed evaluation cost regardless of program size or complexity.

• It is dangerous to fund an evaluation project that does not clarify how evaluation funds will be used.

Page 185:

Budgeting and Paying for Evaluation

There are two ways to project evaluation costs:

Identify a reasonable total amount of funds dedicated for evaluation and then develop the best evaluation design given those resource requirements.

Develop the best evaluation design for the subject program, and then estimate the costs associated with implementing the design. NEGOTIATE design changes if costs exceed available funds.

Page 186:

Bruner Foundation Rochester, New York Anita M. Baker, Ed.D.

These materials are for the benefit of any 501(c)(3) organization. They MAY be used in whole or in part provided that credit is given to the Bruner Foundation.

They may NOT be sold or redistributed in whole or part for a profit.

Copyright © by the Bruner Foundation 2007

* Please see the notes attached to the first slide for further information about how to use the available materials.