
PROJECT TITLE

INDEPENDENT IN-DEPTH EVALUATION

PLEASE ADD FULL PROGRAMME TITLE HERE

Programme number

Month year


INSTRUCTIONS

YELLOW TEXT BOXES PROVIDE INSTRUCTIONS. PLEASE DELETE THEM BEFORE SUBMITTING THE REPORT. (JUST CLICK ON THE BORDER OF THE TEXT BOX AND DELETE IT.)

TEXT AREAS HIGHLIGHTED IN GREEN ARE PLACEHOLDERS. YOU WILL NEED TO REPLACE THEM WITH ACTUAL DATA AND REMOVE THE HIGHLIGHTING.

THIS TEMPLATE IS FORMATTED USING THE REQUIRED STYLES FOR HEADINGS, BODY TEXT, ETC. PLEASE FOLLOW THESE FORMATS AND STYLES BY WRITING ON TOP OF THE HEADINGS AND BODY TEXT. IF PASTING DATA FROM ANOTHER DOCUMENT, PLEASE MAKE SURE TO MATCH DESTINATION FORMATTING.

FOR THE FRONT COVER: PLEASE SELECT A SUITABLE IMAGE TO REPLACE THE STANDARD ONE IN THIS TEMPLATE.

Please visit IES’s website for additional information:

http://www.unodc.org/unodc/en/evaluation/normative-tools.html

Please consult the IES Handbook:

https://www.unodc.org/unodc/en/evaluation/evaluation-handbook.html

or contact IES directly for more guidance.

This independent evaluation report was prepared by an evaluation team consisting of (names and titles of external evaluators). The Independent Evaluation Section (IES) of the United Nations Office on Drugs and Crime (UNODC) provides normative tools, guidelines and templates to be used in the evaluation process of projects.

Please find the respective tools on the IES web site: http://www.unodc.org/unodc/en/evaluation/evaluation.html

The Independent Evaluation Section of the United Nations Office on Drugs and Crime can be contacted at:

United Nations Office on Drugs and Crime
Vienna International Centre
P.O. Box 500
1400 Vienna, Austria
Telephone: (+43-1) 26060-0
Email: [email protected]
Website: https://www.unodc.org/unodc/de/evaluation/index.html

Disclaimer

Independent In-Depth Evaluations are managed by the Independent Evaluation Section (IES) and conducted by external independent evaluators. It is the responsibility of IES to respond to the commitment of the United Nations Evaluation Group (UNEG) in professionalizing the evaluation function and promoting a culture of evaluation within UNODC for the purposes of accountability and continuous learning and improvement.

The views expressed in this independent evaluation report are those of the evaluation team. They do not represent those of UNODC or of any of the institutions or Member States referred to in the report. All errors and omissions remain the responsibility of the evaluation team. 

© United Nations, Month Year. All rights reserved worldwide.

The designations employed and the presentation of material in this publication do not imply the expression of any opinion whatsoever on the part of the Secretariat of the United Nations concerning the legal status of any country, territory, city or area, or of its authorities, or concerning the delimitation of its frontiers or boundaries.

This publication has not been formally edited.

CONTENTS

CONTENTS

ABBREVIATIONS AND ACRONYMS

EXECUTIVE SUMMARY

MANAGEMENT RESPONSE NARRATIVE

SUMMARY MATRIX OF FINDINGS, EVIDENCE, RECOMMENDATIONS AND MANAGEMENT RESPONSE

I. INTRODUCTION

Background and context

Evaluation methodology

Limitations to the evaluation

II. EVALUATION FINDINGS

Relevance

Efficiency

Coherence

Effectiveness

Impact

Sustainability

Human Rights, Gender Equality and leaving no one behind

III. CONCLUSIONS

IV. RECOMMENDATIONS

V. LESSONS LEARNED AND BEST PRACTICES

Lessons Learned

Best Practices

ANNEX I: TERMS OF REFERENCE

ANNEX II: EVALUATION TOOLS: QUESTIONNAIRES AND INTERVIEW GUIDES

ANNEX III: DESK REVIEW LIST

ANNEX IV: STAKEHOLDERS CONTACTED DURING THE EVALUATION

ANNEX V: OPTIONAL


ABBREVIATIONS AND ACRONYMS

Abbreviation or Acronym | Full name | Abbreviation or Acronym | Full name

Text | Text | Text | Text


EXECUTIVE SUMMARY

INSTRUCTIONS

The Executive Summary should not be a repetition of the text of the main body but should be drafted in a crisp, short and clear manner, with the objective of conveying the key and most important information about the evaluation to a wide array of readers.

This section should consist of no more than four pages (equalling up to 14,000 characters including spaces) that includes:

· Short introduction with a brief description of the project including its objectives, the purpose and scope of the evaluation, the evaluation methodology and the composition of the external independent evaluation team (maximum 2 short sentences on background, expertise, number and gender);

· The main findings of the evaluation for each of the evaluation criteria;

· The main conclusions;

· The main recommendations (ranked by importance, each with a topic and implementing recipient(s), and referring to the total number in the matrix and the main body). There should be a clear illustration of how the recommendations build upon the conclusions and, in turn, upon the findings. See more guidance in relation to the Summary Matrix and the chapter on recommendations in the main body;

· The main lessons learned and main best practice, referring to the total number in the main body of the report.

INTRODUCTION

Text

PROJECT DESCRIPTION and objectives

Text

PURPOSE, SCOPE and methodology OF EVALUATION

Text

Main findings

Text

Main conclusions

Text

Main recommendations

Recommendation 1 – TOPIC

Text

Recommendation 2 – TOPIC

Text

Recommendation 3 – TOPIC

Text

MAIN Lessons learned and best practice


Text


MANAGEMENT RESPONSE NARRATIVE

The Programme/Project Manager provides a Management Response narrative text in a separate document, from which IES copies and pastes below.

Text


SUMMARY MATRIX OF FINDINGS, EVIDENCE, RECOMMENDATIONS AND Management response

INSTRUCTIONS

FINDINGS, EVIDENCE AND RECOMMENDATIONS

The part below should include the most significant findings with related evidence and all recommendations (each with a topic).

The recommendations should be brief and clear, as well as appropriately directed to the specific target group of implementing recipient(s) at UNODC to take action (e.g. Project Management at the specific Office/Section/Unit). The recommendations could also be directed to Project Management and refer to cooperation with specific other responsible UNODC Offices/Sections/Units. Utility is promoted when the recommendations are SMART (Specific, Measurable, Actionable, Results-oriented, Time-bound).

The maximum number of recommendations is 10. They should be ranked by importance, beginning with the most important recommendations and finishing with the least important ones.

Recommendations need to be clearly derived from the findings and conclusions.

The evidence column should illustrate the general sources of the findings that lead to the recommendations (e.g. desk review of project documents and progress reports; interviews with internal stakeholders; interviews with recipients; etc.).

PLEASE NOTE: The findings and recommendations should be brief and succinct to match the format of a matrix and not be directly copied from the findings or recommendations chapter in the main body.

MANAGEMENT RESPONSE

The Programme/Project Manager completes an Evaluation Follow-up Plan with a Management Response in a separate document, from which IES copies information relating to the Management Response and pastes below in the last column.

Findings | Evidence [1] | Recommendations [2] | Management Response [3]

1. Text | Text | 1. Text |
2. | | 2. |
3. | | 3. |
4. | | 4. |
5. | | 5. |
6. | | 6. |
7. | | 7. |
8. | | 8. |
9. | | 9. |
10. | | 10. |

[1] General sources that substantiate the findings.
[2] Should include the specific target group of implementing recipient(s) at UNODC.
[3] Accepted/partially accepted or rejected for each recommendation. For any recommendation that is partially accepted or rejected, a short justification is to be added.

Note: A finding uses triangulated evidence from the data collection to allow for a factual statement.

Note: Recommendations are proposals aimed at enhancing the effectiveness, quality, or efficiency; at redesigning the objectives; and/or at the reallocation of resources. For accuracy and credibility, recommendations should be the logical implications of the findings and conclusions and need to clearly identify the responsible implementing recipient(s).


INSTRUCTIONS

The main body of the report should not exceed 25-30 pages, excluding the four annexes (up to 100,000 characters including spaces). The report should further include relevant maps, graphs, pictures, etc. of project implementation. Please note that a title and source are required for each.

I. INTRODUCTION

Background and context

INSTRUCTIONS

This sub-section should include:

· The overall concept and design of the project, including an assessment of its strategy; duration; financial and human resources; as well as clarity, logic and coherence of the project document;

· The purpose (objective) and scope (coverage) of the evaluation;

· The composition of the external independent evaluation team (including short background; expertise; number and gender of evaluators);

· A map of the countries in which the project is active (please refer to the UN Geospatial information website: http://www.un.org/Depts/Cartographic/english/htmain.htm)

OVERALL CONCEPT AND DESIGN

Text

Purpose and scope

Text

The composition of the evaluation team

Text

MAP OF PROJECT COUNTRIES

INSERT MAP HERE

MAP CAPTION

Evaluation methodology

INSTRUCTIONS

This section should describe the approach and methods used to obtain, collect and analyze the data. This part is very important as it provides the basis for the credibility of the evaluation results. Reference should be made to the annex encompassing the evaluation tools.

The evaluation methodology should support the purpose of the evaluation. It should further answer each of the evaluation questions posed in the TOR and further refined in the Inception Report.

The evaluation report should follow a logical sequence: evidence to assessment; assessment to findings; and findings/conclusions to recommendations.

Reference must be made to the desk review, identification and inclusion of stakeholders in the evaluation process, sampling strategy and triangulation.

Please include at least the following graphics:

· Number and type of stakeholders interviewed (e.g. pie chart), including sex of interviewees, and number and type of stakeholders surveyed (if applicable). A title and source are required for each graphic.

· Countries included in the data collection.

Text

Limitations to the evaluation

INSTRUCTIONS

The report should highlight major constraints that had an impact on the evaluation process, e.g. limited budget, limited time, or unavailability of some major stakeholders for interviews. This section should further include how these limitations were overcome.

Text


II. EVALUATION FINDINGS

INSTRUCTIONS

This section is the most important one since it covers the analysis of information and articulates the findings of the evaluation.

It is the longest and most detailed section of the report and it should be based on facts, proven by reference to source/methodology. The other sections of the report should draw on and make references to this part.

A finding uses evidence from several sources to allow for a factual statement.

All questions for each criterion from the cleared Inception Report need to be included in the box at the top. These should be followed by the key findings relating to the questions.

When writing this section, please keep in mind that findings are presented for both documentation purposes and for substantiating the conclusions, recommendations and lessons learned of the evaluation.

Please also clearly relate the evaluation to the role of UNODC in supporting the implementation of the SDGs, UNODC’s role in the country, UNODC’s cooperation with other UN agencies, etc. – in line with the Evaluation ToR.

A box at the end should include the main findings/summary for each criterion. The text should be no longer than three sentences.

PLEASE NOTE: Relevant maps, graphics, statistics, pictures, etc. should be included where possible. Please further note that a title and source are required for each.

Relevance

Evaluation questions:

Insert questions from the cleared Inception Report.

Text

Summary - RELEVANCE

Insert maximum three synthesized sentences of the key findings.

Efficiency

Evaluation questions:

Insert questions from the cleared Inception Report.

Text

Summary – Efficiency

Insert maximum three synthesized sentences of the key findings.

Coherence

Evaluation questions:

Insert questions from the cleared Inception Report.

Text

Summary – Coherence

Insert maximum three synthesized sentences of the key findings.

Effectiveness

Evaluation questions:

Insert questions from the cleared Inception Report.

Text

Summary – EFFECTIVENESS

Insert maximum three synthesized sentences of the key findings.

Impact

Evaluation questions:

Insert questions from the cleared Inception Report.

Text

Summary – IMPACT

Insert maximum three synthesized sentences of the key findings.

Sustainability

Evaluation questions:

Insert questions from the cleared Inception Report.

Text

Summary – SUSTAINABILITY

Insert maximum three synthesized sentences of the key findings.

Human Rights, Gender Equality and leaving no one behind

Evaluation questions:

Insert questions from the cleared Inception Report.

Human Rights

Text

Gender Equality

Text

Leaving no one behind

Text

Summary – Human Rights, Gender Equality and leaving no one behind

Insert maximum three synthesized sentences of the key findings.


III. CONCLUSIONS

INSTRUCTIONS

The report must draw overall conclusions based on the evaluation findings. Conclusions should add value to the findings and draw on data collection and analyses undertaken through a transparent chain of arguments.

Conclusions point out the factors of success and failure of the evaluated project, with special attention paid to the intended and unintended results and impacts, and more generally to any other strength or weakness.

PLEASE NOTE: There must be a clear link between the findings, the conclusions and the recommendations.

Text


IV. RECOMMENDATIONS

INSTRUCTIONS

This part of the report should provide more detailed information on the recommendations than in the Summary Matrix.

The recommendations should be clear, useful, time-bound and actionable, aimed at enhancing the project performance and improving the sustainability of results.

The recommendations should be aimed, for example, at improving project design, delivery and overall management. They may also address the topic under evaluation, as well as changes to policy.

The recommendations should clearly build upon the conclusions, which in turn build upon the findings.

Each recommendation should clearly indicate the action to be undertaken or the decision to be made, as well as the specific group of implementing recipient(s) (same as in the Summary Matrix).

PLEASE NOTE: Recommendations are not directed at Governments as the objective of the evaluation is the respective project.

The maximum number of recommendations is 10. They should be ranked by importance, beginning with the most important recommendations and finishing with the least important ones. The topic of each recommendation should be clearly indicated with a sub-heading. Please also provide the related finding/conclusion for each recommendation.

Utility is promoted when the recommendations are SMART (Specific, Measurable, Actionable, Results-oriented, Time-bound), prioritized and appropriately pitched to the target group responsible for taking action. To further support utilization of evaluation results, the recommendations should easily allow the implementing recipient(s) to address them in the Management Response and the Evaluation Follow-up Plan.

RECOMMENDATION 1 - TOPIC

Recommendation

Text

RECOMMENDATION 2 - TOPIC

Recommendation

Text

RECOMMENDATION 3 - TOPIC

Recommendation

Text

RECOMMENDATION 4 - TOPIC

Recommendation

Text

RECOMMENDATION 5 - TOPIC

Recommendation

Text

RECOMMENDATION 6 - TOPIC

Recommendation

Text

RECOMMENDATION 7 - TOPIC

Recommendation

Text

RECOMMENDATION 8 - TOPIC

Recommendation

Text

RECOMMENDATION 9 - TOPIC

Recommendation

Text

RECOMMENDATION 10 - TOPIC

Recommendation

Text


V. LESSONS LEARNED AND BEST PRACTICES

INSTRUCTIONS

Lessons learned are “Generalizations based on evaluation experiences with projects, programs, or policies that abstract from the specific circumstances to broader situations. Frequently, lessons highlight strengths or weaknesses in preparation, design, and implementation that affect performance, outcome, and impact.”

Lessons learned are a key component of any knowledge management system and they are important for continuously improving the performance of organizations like UNODC. Sometimes these lessons will be derived from success and sometimes they will be derived from areas where there is room for improvement.

The purpose of a lesson learned is to see what works and what does not. Lessons can be success stories that should be repeated, or they can be areas in which change towards improvement is to take place. They can offer advice on how to improve processes (how things were done) or products (outputs).

The evaluation report should focus on the most important lessons, especially those with wider applicability and those that have the following characteristics:

· The lessons learned from a specific project should highlight the strengths and weaknesses in preparation, design and implementation that affect performance, outcomes and impact. They are also applicable to other projects and programs, as well as policies, and have the potential to improve future actions.

· Lessons learned should be based on findings and evidence presented in the report. Lessons learned should neither be written as a recommendation, nor as an observation or description.

Lessons Learned

Text

Best Practices

Text


INSTRUCTIONS

Apart from the four mandatory annexes (Terms of reference of the evaluation; Evaluation tools: questionnaires and interview guides; Desk review list; and Stakeholders contacted during the evaluation), additional annexes may be included if they add value to the evaluation report (e.g. case studies; list of outputs and outcomes achieved/not achieved; etc.).

PLEASE ENSURE THAT THE SAME CORRECT FORMAT AND STYLES OF THE MAIN BODY OF THE REPORT ARE USED WHEN FORMATTING THE ANNEXES.

ANNEX I: TERMS OF REFERENCE

Insert text from the cleared Terms of Reference. Do not include annexes from the ToR.


ANNEX II: EVALUATION TOOLS: QUESTIONNAIRES AND INTERVIEW GUIDES


ANNEX III: DESK REVIEW LIST

UNODC documents

Number | Title

Text

Total number of UNODC documents reviewed:

External documents

Number | Title

Text

Total number of external documents reviewed:


ANNEX IV: STAKEHOLDERS CONTACTED DURING THE EVALUATION

Number of interviewees | Organisation | Type of stakeholder (see note below) | Sex disaggregated data | Country

Text | Text | | Male: / Female: | Text
| | | Male: / Female: |
| | | Male: / Female: |
| | | Male: / Female: |
| | | Male: / Female: |
| | | Male: / Female: |
| | | Male: / Female: |
| | | Male: / Female: |
Total: | | | Male: / Female: |

Note: A stakeholder could be a Civil Society Organisation; Project/Programme implementer; Government recipient; Donor; Academia/Research institute; etc.


ANNEX V: OPTIONAL
