TERMS OF REFERENCE

External Evaluation of the SMART Initiative

Program Name: Global SMART Initiative
Sector: Health and Nutrition
Implementing Partners (if applicable): Action Against Hunger - Canada (SMART Host), Action Contre la Faim - France, Action Against Hunger - USA
Location (country/region(s)): Global and Yemen
Project Language: English and Arabic
Program period to be covered: October 2018 to November 2021
Funding Agencies: USAID, SIDA
Responsible ACF HQ: Action Against Hunger - Canada
Evaluation Type: Remote - Independent Project Evaluation
Evaluation Dates: Data collection is to be completed by October 2021; final report submitted by November 30, 2021.

BACKGROUND

The Standardized Monitoring and Assessment of Relief and Transitions (SMART) is an inter-agency initiative launched in 2002 by a network of organizations and humanitarian practitioners. SMART advocates a multi-partner, systematized approach to provide critical, reliable information for decision-making, and to establish shared systems and resources for host government partners and humanitarian organizations.

The SMART Methodology is an improved survey method that balances simplicity (for rapid assessment of acute emergencies) and technical soundness. It draws from the core elements of several methodologies, with continuous upgrading informed by research and current best practices.

The SMART Methodology is based on the two most vital and basic public health indicators for the assessment of the magnitude and severity of a humanitarian crisis:

1. Nutritional status of children under five.
2. Mortality rate of the population.


These indicators are useful for prioritizing resources as well as for monitoring the extent to which the relief system is meeting the needs of the population, and therefore the overall impact of the relief response.

SMART ensures that consistent and reliable survey data is collected and analyzed using a single standardized methodology. It provides technical capacity for decision-making and reporting, and comprehensive support for strategic and sustained capacity building.

The widely used combination of SMART and ENA software¹ has improved data quality review and assurance in larger surveys (e.g. Multiple Indicator Cluster Surveys (MICS) and Demographic and Health Surveys (DHS)) and has also been incorporated into many national nutrition protocols. Additionally, SMART survey results are now used in early warning systems.

The SMART Methodology seeks to reform and harmonize assessments of and responses to emergencies, and can also serve surveillance if used at equal time intervals. It ensures that policy and programming decisions are based on reliable, standardized data and that humanitarian aid is provided to those most in need.

As the Global Convener for the SMART Initiative, Action Against Hunger has been involved for the past decade in providing global technical expertise and guidance for the collection of nutrition and mortality data using the SMART Methodology. Specifically, the Initiative works closely with national nutrition clusters, nutrition information working groups, Ministries of Health (MoH) and humanitarian and development organizations (i.e. UN agencies, INGOs, local NGOs) to enhance capacity to plan, collect, analyse, and utilize nutrition data collected at sub-national and national levels using the SMART Methodology. In addition, Action Against Hunger has significant experience in developing and strengthening nutrition information and quality assurance mechanisms at country and global level. The SMART Initiative has been present in East Africa since 2013, where it has been providing technical support and building the capacity of partners on nutrition assessments and information management. In 2017, the SMART Initiative established a regional presence in the Middle East and North Africa (MENA) region through USAID BHA funding, supporting the set-up of the SMART Middle East project to address the most vulnerable populations in Yemen, Syria, Sudan, and Libya. Through its global mandate, the Initiative has experience supporting key humanitarian emergencies in Asia and is currently providing dedicated technical support for South-East Asian countries (e.g. Afghanistan, Bangladesh, India, Nepal, Myanmar and Pakistan) through a SMART advisor based in the region since May 2019. By the end of 2020, the SMART Project had trained a total of 838 people in the East and Southern Africa region, 311 people in Asia, and 227 people in the MENA region (112 in Yemen).

¹ https://smartmethodology.org/survey-planning-tools/smart-emergency-nutrition-assessment/


Specific objectives of the evaluation

1. Carry out an evaluation of the project performance using the DAC criteria (see table in Annex III), or something similar used in the M&E sector.
2. Assess the appropriateness of the design and strategies used, their effectiveness and efficiency, and the adaptation of the project management structures and monitoring mechanisms in relation to the results and impact achieved, considering the different HQs and country missions involved.
3. Assess key factors (internal and external) that have contributed to or hindered the achievement of results.
4. Provide findings, key lessons learned, best practices, conclusions, and recommendations for follow-up actions and future interventions, including conclusions that different stakeholder groups will develop based on their own recommendations and insights (including intended and unintended results).

Each of the outlined objectives should be analyzed in a participatory, collaborative and results-based approach using appropriate key informants and criteria. Cross-cutting issues need to be incorporated in each evaluation criterion and question.

Audience of the evaluation

The users of the evaluation are:

- Direct users: the SMART Initiative.
- Indirect users: Ministries of Health, the Secretary of Food and Nutrition Security, local authorities, health staff, community leaders and families, USAID, other humanitarian and nutrition organizations, Action Against Hunger country teams in Yemen, ACF HQ and management teams in Canada, the United States, and France, and project partners.

SCOPE OF WORK

The evaluation will cover all areas where the SMART Initiative has provided support at the global level, with a specific focus on East Africa (South Sudan, Somalia, Ethiopia), the Middle East and North Africa (Yemen², Syria, Sudan), and Asia (Bangladesh, Afghanistan, Pakistan), looking at the different levels of the intervention and the links between global and country level support. The consultant(s) will carry out all major tasks in consultation with the SMART Initiative. The consultant(s) shall lead the process of tool and evaluation design as per project requirements, analysis of information, data collection, and analysis of partner relationships, as well as of the management structure and processes established. Finally, the evaluation should generate new knowledge for, increase the capacity of, and mobilize all stakeholders, from the project management team to the partners and target groups, towards pursuing similar initiatives. The evaluation report should provide key recommendations towards sustainability, which should focus on both the outcomes of the project and the overall process.

² Yemen should be prioritized for this evaluation.

The key elements of the evaluation are outlined below.

EVALUATION DESIGN AND METHODOLOGY

Outlined below is the suggested mixed-method methodological approach for the evaluator(s) to collect quantitative and qualitative data, and the chronological steps of the evaluation process. The evaluator(s) will develop data-gathering instruments and methods that allow the collection of sex-disaggregated information. This evaluation will be largely qualitative, as opposed to direct quantitative outcome data collection, which will be completed by the teams in the endline reporting of the project.

Evaluation Briefing

Prior to the evaluation taking place, the evaluator(s) is expected to attend a technical briefing with the Evaluation Committee (EC). Briefings by Zoom/Skype must be agreed in advance.

SMART Initiative Briefing

As part of the evaluation, the evaluator(s) will interview the SMART Team and HQ stakeholders to get preliminary information about the project being evaluated. Briefings by Zoom/Skype must be agreed in advance.

Desk review

The evaluator or team will undertake a desk review of project materials, including the project documents and proposals, baseline study, progress reports, means of verification and outputs of the project (such as publications, communication materials, videos, recordings, etc.), as well as the results of any internal planning process and relevant materials from secondary sources.

Inception Report

At the end of the desk review period and before the data collection, the evaluator(s) will prepare an inception report written in English that will include the following sections (see Annex I for an example outline):

● Key elements of the Terms of Reference (TORs), to demonstrate that the evaluator(s) understands and will adhere to the TORs;
● The methodological approach to the evaluation. This shall include an evaluation matrix as an annex, specify how the evaluator(s) will collect data to answer the evaluation questions and which tools will be used, and examine the limitations of the methodology, if any;
● A detailed evaluation work plan, including schedule and sampling (key stakeholders);
● A statement of adherence to the Action Against Hunger Evaluation Policy and an outline of the evaluation report format.

The inception report will be discussed and approved by the EC and shared with stakeholders.

Primary data collection techniques

As part of the evaluation, the evaluator(s) will collect data from key project stakeholders (local authorities, humanitarian agencies, UN agencies, or donor representatives) as per a list recommended by the EC, with suggestions from the evaluator based on the desk review, guaranteeing a representative and participatory approach for the sample size. The evaluator(s) will use the most suitable format for data collection, as detailed in the inception report.

Field visits

No field visits are planned under this evaluation. All data collection should be conducted remotely.

Secondary data collection techniques: Desk review

The evaluator(s) will further review complementary documents and collect project monitoring data and any other relevant information.

Debriefing and stakeholders’ workshop

The evaluator(s) shall facilitate an online debriefing workshop to present preliminary findings of the evaluation to the SMART Initiative staff, to gather feedback on the findings, build consensus on recommendations, and develop action-oriented statements on lessons learned and proposed improvements for the future.

Evaluation Report

*Final format to be decided between the consultant and the EC.

● Cover Page
● Summary Table: to follow the template provided
● Table of Contents
● List of acronyms
● Executive Summary: must be a standalone summary describing the intervention, the main findings of the evaluation, and the conclusions and recommendations. It will be no more than 2 pages in length
● Background: basic project data, maps, etc.
● Methodology: describes the methodology used, provides evidence of triangulation of data, and presents limitations to the methodology
● Findings: includes an overall assessment of the project against the evaluation criteria and responds to the evaluation questions; all findings are backed up by evidence, cross-cutting issues are mainstreamed, and unintended and unexpected outcomes are also discussed
● Conclusions: formulated by synthesizing the main findings into statements of merit and worth; judgements are fair, impartial, and consistent with the findings
● Lessons Learnt and Good Practices: present lessons that can be applied elsewhere to improve project performance, outcome, or impact, and identify good practices, i.e. successful practices from those lessons that are worthy of replication
● Recommendations: should be as realistic, operational and pragmatic as possible; that is, they should take careful account of the circumstances currently prevailing in the context of the action and of the resources available locally to implement them. They should follow logically from the conclusions, lessons learned and good practices. The report must specify who needs to take what action and when. Recommendations need to be presented in order of priority
● Annexes: should be listed and numbered and must include the following: Evaluation Criteria Rating Table, list of documents for the desk review, list of persons interviewed, data collection instruments, evaluation TORs, etc.

The whole report shall not be longer than 30 pages (50 pages including relevant annexes). The draft report should be submitted no later than 10 calendar days after data collection is completed. The final report will be submitted no later than the end date of the consultancy contract. Annexes to the report will be accepted in the working language of the country and project subject to the evaluation.

Learning workshop

The evaluator shall facilitate a learning workshop to present key findings of the evaluation to the project stakeholders and to develop action-oriented statements on lessons learnt, good practices, and proposed improvements for the future. The evaluator shall prepare a PowerPoint presentation for the learning workshop and present a report after the workshop.

EVALUATION CRITERIA AND QUESTIONS

Unless otherwise agreed, the evaluation is expected to use the DAC criteria in data analysis and reporting. The evaluator(s) should refer to the DAC criteria rating table (refer to Annex III) and include it, or something similar, in the final report annex. Specifically, Action Against Hunger uses the following criteria: Design, Relevance, Coherence, Effectiveness, Efficiency, Sustainability and Likelihood of Impact³.

Evaluation questions have been developed to help the evaluator(s) assess the project against these criteria (refer to Annex II). The evaluator(s) may adapt the evaluation criteria and questions, but any fundamental changes should be agreed between the SMART Initiative and the evaluator(s) and reflected in the inception report.

Cross-cutting issues need to be included through specific evaluation questions for each criterion to be assessed.

Deliverables

The following outputs are expected:

- Inception report, including an evaluation work plan on a weekly schedule
- Draft evaluation report for internal review
- Final evaluation report
- A summary presentation or debrief on the final report and key findings for the EC
- A PowerPoint presentation for the Learning Workshop
- A Learning Workshop Report

Outputs and deadlines:

- Inception report: September 15, 2021
- Draft evaluation report: October 30, 2021
- Final evaluation report: November 15, 2021
- Synthesis Report (infographic for wider dissemination): November 15, 2021
- Summary/Debrief: November 15, 2021
- A PowerPoint presentation for the Learning Workshop: November 30, 2021
- A Learning Workshop Report: November 30, 2021

All outputs must be submitted in English and in Word document format.

³ The criterion has been rephrased to “Likelihood of Impact” as a thorough impact assessment is linked to the estimation of attribution, which can only be measured through experimental or quasi-experimental evaluation designs. The evaluation design for carrying out a performance evaluation would not be suitable to determine the effects attributable to the project.


Management Arrangements and Work Plan

These evaluation TORs have been developed in a participatory manner by the SMART Initiative Team and Action Against Hunger Headquarters in Canada.

This evaluation will be under the supervision of the Evaluation Committee (EC). Specifically, the following arrangements and responsibilities apply:

● The evaluator(s) will report directly to the Associate Director for Nutrition at Action Against Hunger Canada.
● The evaluator will submit all the evaluation outputs directly and only to the EC.
● The EC will do a quality check to ensure all required elements of the evaluation are present and decide whether the report is ready for sharing. The Associate Director for Nutrition will forward a copy to key stakeholders for comments on factual issues and for clarifications.
● All feedback and comments will be consolidated and shared with the consultant for action, with clear timelines.
● Final products will be submitted once all comments have been considered and incorporated.

Timeline

This work will take place between September 1st, 2021 and November 30th, 2021.

ESSENTIAL QUALIFICATIONS AND EXPERIENCE OF EVALUATOR(S)

● At least 7 years of experience conducting and coordinating final project evaluations in Africa, the Middle East and/or Asia;
● Demonstrated experience using innovative and dynamic evaluation methods;
● Written and oral skills in English are required; knowledge of Arabic is considered an asset;
● Knowledge of the Health and Nutrition sector, with particular experience in nutrition assessment (preferably SMART), nutrition information systems and nutrition situational analysis;
● Significant field experience in the evaluation of humanitarian/development projects;
● Relevant degree or equivalent experience related to the evaluation to be undertaken;
● Significant experience in the coordination, design, implementation, monitoring and evaluation of multi-country programs;
● Good communication skills and experience of workshop facilitation;
● Ability to write clear and useful reports;
● Understanding of donor requirements (USAID and SIDA);
● Ability to manage the available time and resources and to work to tight deadlines;
● Independence from the parties involved;
● Familiarity with the context of the humanitarian situation in the countries to be included in this evaluation is considered an asset.

LOCATION OF WORK

The consultancy is home-based.

PAYMENT CONDITIONS

Payment will be made by bank transfer in installments. The consultant will receive their payment in four installments:

1. 20% upon approval of the inception report.
2. 20% upon approval of the draft evaluation report.
3. 40% upon submission and approval of the final evaluation report.
4. 20% upon completion of the Learning Workshop and approval of the Learning Workshop Report.

LEGAL AND ETHICAL MATTERS

The ownership of the draft and final documentation belongs exclusively to the SMART Initiative and the funding donor. The document, or any publication related to it, will not be shared with anybody except the SMART Initiative before the delivery by the SMART Initiative of the final document to the donor.

The SMART Initiative is the main addressee of the evaluation, and its results might impact both operational and technical strategies. That said, the SMART Initiative is likely to share the results of the evaluation with the following groups:

● Donor(s)
● Action Against Hunger network
● Governmental partners
● Various coordination bodies

For independent evaluations, it is important that the consultant does not have any links to project management, or any other conflict of interest that would interfere with the independence of the evaluation.

Intellectual property

All writings, books, articles, artwork, computer programs, databases, source and object code, and other material of any nature whatsoever produced in whole or in part by the consultant in the course of this assignment and of his/her service to the SMART Initiative shall be considered a work made for hire, or otherwise, and therefore the SMART Initiative’s property, unless otherwise discussed and decided between the consultant and the SMART Initiative.

APPLICATION PACKAGES AND PROCEDURES

Qualified and interested parties are asked to submit the following:

1. Letter of interest in submission of a proposal (in the form of an email), to [email protected] by August 29th, 2021 at 23:59 EST.
2. Detailed proposal clearly demonstrating a thorough understanding of these Terms of Reference and including the following:
   a. Description of the methodology, tools and sample size plan based on these Terms of Reference
   b. Proposed approaches to data analysis, in response to the overall objectives
   c. A proposed timeframe detailing activities and schedule/work plan (including a Gantt chart for all stages of the evaluation process and the information/support required from the SMART Initiative)
   d. Demonstrated previous experience in mixed-methods evaluation and the other qualifications outlined in this ToR
   e. Team composition, the level of effort of each proposed team member, and an indication of the language skills of team members (if more than one)
3. A financial proposal with a detailed breakdown of costs for the study:
   a. Itemized consultancy fees/costs
   b. Currency of the offer
   c. Validity period of the offer
4. Curriculum Vitae(s) of all proposed team members outlining relevant experience
5. Names and contact information of three references who can be contacted regarding relevant experience
6. A consulting firm profile (if applicable)

The proposal will be scored on both technical (methodology) and financial (budget) aspects.

Complete applications should be submitted electronically to [email protected] with the subject line ‘Final Evaluation Consultancy – SMART Initiative’ by August 29th, 2021 at 23:59 EST.


ANNEXES

Annex I: Outline Inception Report

Table of Contents
List of Acronyms
List of Tables (*)
List of Figures

1. Rationale, Purpose and Specific Objectives
Should include: the rationale, purpose and specific objectives of the evaluation.

2. Development Context
Should include: a description of the key contextual elements, specific to the call for proposals and specific to each project.

3. Project Description
Should include: a brief project description (e.g. time period; budget; geographical area; programming; stakeholder mapping; organizational set-up; implementation arrangements).

4. Project Intervention Logic
Should include: an analysis of the project logic model to identify the causal pathways (from activities to results) to inform the evaluation questions.

5. Evaluation Approach and Methodology
Should include: (i) a description and explanation of the evaluation approaches, the evaluation methodology and its application, including details of, and justification for, the methodological choices; (ii) a description of the methods of data collection for the desk and field-based case studies, including the data collection plan, preparation of interview and issue guides for interviews and focus groups, harmonization of approaches across country case studies, preparation process and logistics, and recruitment of field teams; (iii) the approach and design proposal for surveys; (iv) a description of samples, sampling choices/methods and limitations regarding the representativeness of samples for interpreting evaluation results; (v) the data analysis plan (i.e. how the information collected will be organized, classified, tabulated, inter-related, compared and displayed relative to the evaluation questions, including how multiple qualitative and quantitative sources will be integrated); (vi) limitations.

6. Proposed Evaluation Questions
Should include: a set of evaluation questions with the explanatory comments associated with each question; the overall approach for answering the evaluation questions; detailed proposed evaluation questions (including: rationale; method/chain of reasoning; assumptions to be assessed and corresponding qualitative and/or quantitative indicators).

7. Evaluation Management
Should include: team composition and distribution of tasks, roles and responsibilities; the contractor’s approach to ensuring quality assurance of all evaluation deliverables.

8. Implementation Plan
Should include: a detailed plan for the next phases/stages of the evaluation, including detailed plans for countries selected for field visits, the list of interventions for in-depth analysis in the field (with an explanation of the value added by the visits), the preparation process and logistics, and the recruitment of field teams.

9. Detailed Evaluation Budget

10. Annexes
Should include:
● Terms of reference
● Logic model for the project
● Portfolio of project interventions
● Stakeholder mapping
● Evaluation evidence matrix as per Annex
● Sampling details for each sample
● Draft list of proposed interviewees, focus groups, etc.
● Draft data collection tools (e.g. templates for questionnaire-based interviews and surveys; focus group and other participatory methods protocols)
● Gantt chart and calendar of activities, including details on the anticipated level of effort
● List of persons met for the inception report
● List of documents consulted for the inception report

(*) Tables, graphs and diagrams should be numbered and have a title.


Annex II: Evaluation Criteria and Example Questions

This is a list of potential evaluation questions; the evaluator is requested to provide their own list with the proposal submission.

In the case of this project, the primary target populations are government agencies, UN agencies, international NGOs and national NGOs.

Design: A measure of whether the design is logical, allows for Results-Based Management and includes a sustainability strategy involving partners, the stakeholders involved and target groups

● Are target group needs well identified, and in which way? What was the level of participation of each target group in the project design?
● Are cross-cutting issues properly taken into account in the project design?
● Are processes and structures established for management and coordination (internally and externally)?
● Are the project objectives and indicators SMART? Are sources of verification realistic?
● Is the design of the exit strategy realistic?
● Is there a good design of the M&E system in place?
● To what extent were assumptions and risks correctly identified at the beginning and updated over the course of project implementation?

Relevance/Appropriateness: A measure of whether interventions are in line with local needs and priorities (as well as donor policies, thus increasing ownership, accountability, and cost-effectiveness)

● Were the actions undertaken relevant and appropriate given the local context and the needs of the target population?
● Was the assistance relevant and appropriate in relation to the practices/culture of the target population?
● To what extent were the changing and/or emerging needs of beneficiaries and stakeholders taken into account in project implementation?
● Were complementarities and/or synergies with other initiatives carried out by Action Against Hunger, partners or other organizations taken into account in the design?
● To what extent did the strategies and methodologies prioritized by the project prove to be relevant and appropriate to the specific contexts and needs identified?

Coherence: A measure of whether interventions are consistent with existing interventions and with global and national policies and strategies, to ensure consistency, maximize synergies and minimize duplication

● Are other stakeholders informed or aware of Action Against Hunger’s activities/approach/strategy for the project?
● How have the activities of this project been integrated into broader nutrition and health programming and/or with other sectors/programs in the operational area?
● Do project team members feel they are working towards a common goal with respect to other sectors that are not part of the project?

Coverage: A measure of whether interventions meet the need to reach major population groups facing life-threatening suffering wherever they are

● Were the most affected groups covered within the limitations of the resources available?
● Was the geographical coverage of the project appropriate?
● Were target groups correctly and fairly identified and targeted?
● How was the targeting understood or perceived by local communities?
● Was gender within the target community considered in Action Against Hunger’s assessment/identification of the beneficiaries and in the implementation of the project?

Efficiency: A measure of how economically resources/inputs (funds, expertise, time, etc.) are converted to results

● Were the resources properly allocated to reach the objectives?
● How efficiently are the project implementers utilizing the project’s inputs to conduct activities and achieve the project’s intended results?
● How efficient is the overall management set-up of the project; in other words, how suitable are the management arrangements in place?
● Is the project being implemented in the most efficient way compared to possible alternatives?
● Are the project activities being implemented as planned and scheduled?
● How efficiently have the project’s performance and its output and objective indicators been monitored?

Effectiveness: A measure of the extent to which the interventions’ objectives were achieved, or are expected to be achieved, taking into account their relative importance and illustrating the effectiveness of the SMART Initiative approach

● What is the quality of the project outputs, and to what extent have project outcomes (intended and unintended) been achieved?
● What are the major internal and external factors influencing the achievement or non-achievement of the intended outputs and objectives?
● How effective were the coordination and communication processes and mechanisms established internally and externally?
● How adequate are the control mechanisms to limit fraud and corruption? How have the feedback mechanisms in place worked, and were they accessible to all involved stakeholders and target groups? What could be improved?
● How was the project team able to adapt to the constraints and changing context faced by the project? Were the mitigation measures effective?
● What steps were taken by the SMART Initiative to ensure that its responses were coordinated with other organizations and local authorities?
● What were the capacity gaps in the course of the implementation of the project?
● To what extent does the SMART Initiative take part in technical coordination mechanisms at all levels of project implementation?

Sustainability: A measure of whether the benefits of an activity are likely to continue after donor funding has been withdrawn and project activities officially cease

● How and when does the project intend to withdraw its resources?
● What plans are in place to ensure that the achievements of the project are not jeopardized by the time of project phase-out? Assess and evaluate Action Against Hunger’s exit strategy.
● Was the project assistance provided in a way that took account of the long-term context?
● How suitable are these plans, and are they being implemented?
● Did the partnerships or local community-based organizations established at the local level contribute to the sustainability of the work?
● To what extent are the project results likely to be sustained in the long term?

Likelihood of impact: Early signs of positive and negative, primary and secondary, short-, mid- and long-term effects produced by an intervention, directly or indirectly, intended or unintended

● To what extent is the project contributing to increased high-quality nutrition data, improved nutrition information systems and situational analysis?
● What are some of the significant changes the target groups can point to as a result of the project and their participation?
● To what extent have behavior change and institutional strengthening been achieved, and based on what evidence?


Annex III: Evaluation Criteria Table

The evaluator will be expected to use the following table, or something similar, to rank the performance of the overall intervention using the DAC criteria. The table should be included as an annex of the evaluation report.

Criteria | Rating (1 low, 5 high) | Rationale
Design | 1 2 3 4 5 |
Relevance | 1 2 3 4 5 |
Coherence | 1 2 3 4 5 |
Coverage | 1 2 3 4 5 |
Efficiency | 1 2 3 4 5 |
Effectiveness | 1 2 3 4 5 |
Sustainability | 1 2 3 4 5 |
Likelihood of Impact | 1 2 3 4 5 |

Guidance for rating the evaluation criteria:

1. Unsatisfactory: Performance was consistently below expectations in most areas of enquiry related to the evaluation criteria. Overall performance in relation to the evaluation criteria is not satisfactory due to serious gaps in some of the areas. Significant improvement is needed. Recommendations to improve performance are outlined in the evaluation report and Action Against Hunger will monitor progress in these areas.

2. Improvement needed: Performance did not consistently meet expectations in some areas of enquiry; performance failed to meet expectations in one or more essential areas of enquiry. Some improvements are needed in one or more of these. Recommendations to improve performance are outlined in the evaluation report and Action Against Hunger will monitor progress in these key areas.

3. On average meets expectations: On average, performance met expectations in all essential areas of enquiry and the overall quality of work was acceptable. Any recommendations on potential areas for improvement are outlined in the evaluation report.

4. Meets expectations: Performance consistently met expectations in all essential areas of enquiry, and the overall quality of work was fairly good. The most critical expectations were met.

5. Exceptional: Performance consistently met expectations due to the high quality of work performed in all essential areas of enquiry, resulting in an overall quality of work that was remarkable.