
ODA Evaluation Guidelines

8th Edition

(Tentative Translation)

ODA Evaluation Division
Minister's Secretariat
Ministry of Foreign Affairs of Japan

May 2013


Introduction

The Ministry of Foreign Affairs of Japan (MOFA) conducts policy and program level evaluations (third-party evaluations) annually in order to enhance its transparency and accountability to the public as a government agency, as well as to improve management of Japan’s Official Development Assistance (ODA).

The first edition of the ODA Evaluation Guidelines was compiled in 2003, drawing on national and international evaluation theory, as a guide for MOFA's ODA evaluation. This eighth edition is published after seven revisions, incorporating the Ministry's experience in ODA evaluation and the latest practices in the field.

This edition was revised to present good examples from recent ODA evaluations and to promote the rating system (classification or grading) that MOFA has introduced on a trial basis into evaluations from the development viewpoint. It is also intended as a reference for the various actors involved in evaluation, showing examples of the targets and priorities of recommendations and presenting reference evaluation frameworks and systematized objectives.

In these guidelines, Chapter 1 “Basic Concepts of Evaluation” presents the concept of evaluations as a whole, and Chapter 2 “ODA Evaluation by MOFA” outlines the evaluation of MOFA as well as its institutional mechanism.

Chapter 3 “Implementation of ODA Evaluations” illustrates the actual method of implementing evaluations. Readers who use these guidelines as a manual for the actual operation of ODA evaluations may wish to start from Chapter 3.

Furthermore, Chapter 4 “Feedback and Public Relations of Evaluation Results” explains how ODA evaluation results are reflected in policies and communicated to the public. The main readers of these guidelines are expected to be ODA evaluators, those in charge of various ODA activities, ODA researchers and the like; however, the guidelines are also intended to be useful to members of the public who are interested in ODA evaluation. For these reasons, care has been taken to make them both comprehensible and useful in practice. MOFA will continue its ODA evaluation activities in order to improve Japan's ODA, and we sincerely hope that this publication will contribute to that end.

May 2013

ODA Evaluation Division Minister’s Secretariat, MOFA


◆Table of Contents◆

Chapter 1  Basic Concepts of Evaluation
  1.1 Definition and Types of Evaluation
  1.2 Goals of Evaluation
  1.3 Evaluation Standards
  1.4 Methods of Collection and Analysis of Information

Chapter 2  ODA Evaluation by MOFA
  2.1 Background
  2.2 Implementation System
  2.3 Objectives and Functions
  2.4 Evaluation Targets

Chapter 3  Implementation of ODA Evaluations
  3.1 Implementation Processes
  3.2 Evaluation Methods
  3.3 Aiming for Higher Quality Evaluations

Chapter 4  Feedback and Public Relations of Evaluation Results
  4.1 Feedback of ODA Evaluation Results
  4.2 Public Relations of ODA Evaluation Results


Chapter 1  Basic Concepts of Evaluation

This chapter presents evaluation in the field of development assistance and explains its basic concepts, such as goals, standards and methods.

1.1 Definition and Types of Evaluation

(1) Definition of Evaluation

Opinions vary regarding the definition of evaluation, but OECD-DAC (Organization for Economic Co-operation and Development, Development Assistance Committee) defines evaluation as "the systematic and objective assessment of an on-going or completed project, programme or policy, its design, implementation and results." As such, the term "evaluation" here means a systematic and objective assessment, usually conducted by setting the particular items to be clarified (evaluation questions) before the evaluation is carried out.

(2) Types of Evaluation

As the definition above suggests, there are various types of evaluation, categorized according to their characteristics: mainly by timing, evaluator, subject, function, etc. (see BOX 1).

BOX 1  Types of Evaluation

Timing
● Ex-ante evaluation

Evaluations conducted before the implementation of development assistance

● Mid-term evaluation Evaluations conducted in the intermediary stages of development assistance

● Ex-post evaluation Evaluations conducted after the completion of development assistance

Evaluator
● Internal evaluation
Evaluations conducted by donors, development assistance experts, or divisions /

individuals who report to an operation unit of the aid agencies

● External evaluation Evaluations conducted by organizations / individuals who are not the donors / aid agencies

● Independent evaluation Evaluations conducted by organizations / individuals not constrained in any way by those in

charge of planning and implementation of the development assistance projects subject to

evaluations


● Self-evaluation Evaluations conducted by those working in the planning and implementation of the projects

● Joint evaluation Collaborative evaluations with participation of a number of organizations and / or those

working on the projects

● Participatory evaluation Evaluations in which representatives of stakeholders (including beneficiaries) and aid

organizations jointly plan, implement and interpret the results of evaluations

Subjects
● Thematic evaluation
Evaluations of a specific thematic priority that cuts across countries, regions and sectors

● Country program evaluation / country assistance evaluation Evaluations of a number of assistance programs by donors or agencies in a recipient

country as well as the assistance strategy behind the programs

● Sector program evaluation Evaluations of a collection of projects in a specific sector across a number of countries or

within one country that contribute to specific development goals

● Cluster evaluation Evaluations of a series of activities, projects, and / or programs

● Program evaluation Evaluations of a series of assistance activities organized to achieve development goals on

global, regional, national and sector bases

● Project evaluation Evaluations of individual projects designed to achieve specific goals with limited resources

and implementation period, often within a wider program

Function
● Formative evaluation
Evaluations conducted to improve performance. Most of these evaluations are conducted

while projects or programs are being implemented.

● Process evaluation

Evaluations of the internal dynamics of implementing agencies, their policy instruments,

systems for providing service, management practices and the linkages between these

processes

● Summative evaluation Evaluations conducted at the end of an assistance project or at a particular stage of

assistance to determine how much of the anticipated outcomes have been achieved.

These evaluations aim to provide information regarding the value of the program.


● Meta-evaluation Evaluations conducted to aggregate findings from a series of evaluations. The term may

also refer to an evaluation that assesses the performance of the evaluator or the quality of

the evaluation itself.

(3) Concepts of Evaluation

Evaluation is often confused with similar concepts such as audit, review, or monitoring. However,

evaluation is a process of systematic and objective verification, and can be differentiated from monitoring, which continuously measures set indicators, and from reviews, which examine performance. Evaluation involves detailed analysis using data accumulated through monitoring, so the two activities complement each other.

Auditors can be differentiated from evaluators in that auditors determine the accuracy and adequacy of their subjects; however, it may sometimes be difficult to distinguish the two, as auditors also analyze the processes and/or outcomes of specific projects in much the same manner as an evaluation.

As such, there are several notions similar to "evaluation"; these guidelines, however, define "evaluation" as a systematic and objective assessment of ODA.

BOX 2  Concepts Similar to Evaluation

● Monitoring

A continuous function which deals with systematic collection of data on specific indicators in order to provide

information on the progress of implementation, degree of achievement as well as the management of funds

to stakeholders and relevant staff at an operation unit of implementing agencies

● Review

A periodic or need-basis assessment of performance of development assistance

● Audit
An independent and objective procedure that is designed to improve certain operations of organizations or

to add further value (to the organizations). It helps organizations accomplish their objectives by utilizing a

systematic, disciplined approach to assess and improve the effectiveness of risk management, control and

governance processes.

*There are other similar concepts such as assessment, etc.

1.2 Goals of Evaluation

In general, evaluation is considered to have two goals: "management," which refers to feeding lessons learned back into the further improvement of development assistance, and "accountability."


When evaluation was first introduced in the United States, its goal was to serve Congress in ensuring the accountability of the government's administrative systems; however, as governmental activities came into question due to budget deficits and other factors, the management of government activities was added to the objectives of evaluation.

Furthermore, in DAC discussions on development assistance, management and accountability are regarded as the two main goals of evaluation; accordingly, the following purposes were stated in the "Principles for Evaluation of Development Assistance," published in 1991.

✦ Improve future assistance policies, programmes and projects through feedback of lessons learned

✦ Provide a basis for accountability, including the provision of information to the public

In addition, DAC recognizes the provision of necessary and accurate information to policy makers and implementing agencies, as well as the promotion of communication among practitioners of the development process, as important roles of evaluation. Please note that the term "accountability" in this document refers to the responsibility of a government to report the results and effectiveness of development assistance to the public, rather than financial or legal accountability.

1.3 Evaluation Standards

Evaluation is systematically and objectively implemented based on an assessment of planning, implementation and results; therefore, clear standards are required for the assessment. OECD-DAC released a set of principles for evaluation in 1991, of which five criteria (Relevance, Effectiveness, Efficiency, Impact and Sustainability) have since been widely used. These criteria can be used selectively, not necessarily as a set, depending on the goal or target of each evaluation. Other standards may also be used for evaluations.

BOX 3  DAC's Five Evaluation Criteria

● Relevance

The extent to which the objectives of a development intervention are consistent with beneficiaries’

requirements, global priorities and policies of partners and donors

● Effectiveness

The extent to which the development intervention’s objectives have been achieved or are expected to be

achieved, taking into account their relative importance

● Efficiency
A measure of how economically resources/inputs (funds, expertise, time, etc.) are converted to results

● Impact
Positive and negative, primary and secondary long-term effects produced by a development intervention,

directly or indirectly, intended or unintended


● Sustainability

The continuation of benefits from a development intervention after major development assistance has been

completed

1.4 Methods of Collection and Analysis of Information

Evaluations are conducted by collecting and analyzing information, and various methods are available, as outlined below. Each method has its own strengths and weaknesses; it is therefore important to collect and analyze information as accurately as possible by combining several methods.

(1) Methods of Collecting Information

A) Interview

Depending on the purpose, interviews may be conducted with individuals, groups, key informants, etc. The method applied depends on how the questions are structured:

● Structured interview

Conducted by following a specific set of questions, as in a questionnaire

● Semi-structured interview
Conducted by following a general framework of items, with extra or more detailed questions asked when needed

● Unstructured interview
Conducted in an open-ended style, with questions asked on the basis of a clear understanding of the objectives

B) Focus Group Discussion
Extraction of comments and ideas through a group discussion on specific themes with approximately ten relevant persons

C) Direct Observation
Direct observation of persons, objects, actions, phenomena, etc. This method can be implemented rather easily; on the other hand, the results may vary, depending heavily on the interpretation of each observer.

D) Literature Search
Collection of information from existing statistics and reports. Although this is a low-cost and efficient way of researching, the credibility of the documents used may be an issue.

E) Base-line Study
An analysis of the current situation, conducted prior to the implementation of development assistance, against which subsequent progress reports and comparisons are assessed

F) Questionnaire Survey
Collection of comments from a wide range of direct beneficiaries and/or policy makers


(2) Methods of Analysis

A) Risk Analysis
Analysis or assessment of the factors that may influence the achievement of assistance goals (called assumptions in a log frame). It is a process that 1) assesses in detail the possibility of negative outcomes for people's lives, health, property or the environment, 2) provides information relating to such negative outcomes and 3) quantifies the likelihood and possible impact of the suspected risks.

B) Case Study
A method of deriving general principles about the characteristics of a specific group by clarifying those characteristics through interviews and observations and by accumulating similar cases

C) Cost-benefit Analysis
A method of determining whether the implementation of a project, etc. is appropriate by comparing the estimated monetary values of the social costs and the benefits expected to be brought about by the project

D) Cost-effectiveness Analysis
A method of comparing the social costs and benefits of an implemented project where the benefits are not necessarily expressed in monetary terms

E) Input-output Analysis
An analytical method created by Leontief as an empirical application of general equilibrium theory, which also traces effects on other sectors. The economic system is divided into industrial sectors, each producing a single (non-joint) output of goods or services, and final-demand sectors such as household consumption, and the analysis derives the equilibrium output of each industrial sector required to meet a given final demand.

F) Analysis Using an Econometric Model
A method of analyzing the economic effects of ODA by using an econometric model, that is, a system of equations constructed by applying actual statistical data to an economic model of the mutual interdependence of economic phenomena

G) Impact Evaluation
An evaluation approach that measures in detail the changes brought to the target society by the implementation of projects and policies; it compares the actual situation after implementation with the counterfactual state in which the projects had not been implemented.
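For illustration only, the following minimal sketch (in Python) shows, under simple assumptions, how the calculations behind methods C), E) and G) above might look in their most basic form. All figures, the discount rate, the sectors and the coefficient matrix are hypothetical and are not drawn from any actual ODA project or from these guidelines.

import numpy as np

# C) Cost-benefit analysis: net present value of hypothetical costs and benefits
def npv(cash_flows, discount_rate):
    """Discount yearly cash flows (year 0 first) to present value."""
    return sum(cf / (1 + discount_rate) ** year for year, cf in enumerate(cash_flows))

costs = [-500.0, -200.0, 0.0, 0.0, 0.0, 0.0]       # hypothetical outlays (million yen)
benefits = [0.0, 0.0, 150.0, 180.0, 200.0, 220.0]  # hypothetical benefits (million yen)
rate = 0.05                                        # assumed social discount rate
net_present_value = npv([c + b for c, b in zip(costs, benefits)], rate)
benefit_cost_ratio = npv(benefits, rate) / -npv(costs, rate)
print(f"NPV: {net_present_value:.1f} million yen, B/C ratio: {benefit_cost_ratio:.2f}")

# E) Input-output (Leontief) analysis: sector outputs x required to meet final
# demand f, given a hypothetical 2-sector technical coefficient matrix A,
# using x = (I - A)^-1 f
A = np.array([[0.2, 0.3],
              [0.1, 0.4]])
f = np.array([100.0, 50.0])
x = np.linalg.solve(np.eye(2) - A, f)
print("Required sector outputs:", x)

# G) Impact evaluation: outcome observed with the project minus the estimated
# counterfactual outcome (e.g. taken from a comparison group); figures hypothetical
outcome_with_project = 72.0    # e.g. enrolment rate in the target area (%)
outcome_counterfactual = 65.5  # estimated rate had the project not taken place
print(f"Estimated impact: {outcome_with_project - outcome_counterfactual:.1f} percentage points")

In practice such calculations would be embedded in the fuller data collection and analysis process described above, and the choice of discount rate, sector breakdown and counterfactual would itself have to be justified by the evaluation team.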


Chapter 2  ODA Evaluation by MOFA

This chapter gives an overview of MOFA's past efforts in ODA evaluation and explains the implementation system, objectives and targets of ODA evaluation.

2.1 Background

(1) Introduction of ODA Evaluation

Japan implemented its first ODA evaluation when the former Overseas Economic Cooperation Fund conducted an ex-post evaluation in 1975. Later, the ODA evaluation system was established by MOFA and JICA, following the implementation of an ex-post evaluation by MOFA in 1981 and by JICA in the following year.

In 1981, MOFA established the Economic Cooperation Evaluation Committee, chaired by the Director-General of the Economic Cooperation Bureau, and started conducting ex-post evaluations of individual projects. Initially, evaluations were implemented to "improve Japan's economic cooperation more effectively"; hence, the main objective of evaluation was to assist the management of individual projects. This focus on project management reflected the influence of OECD-DAC opinions as well as the practice of the United States, which had first introduced the evaluation system.

Evaluations were mainly conducted 1) by survey teams dispatched from MOFA with MOFA and JICA staff members, 2) through Japan's diplomatic missions, 3) by JICA, or 4) by entrusting evaluations to private-sector organizations. Information was collected for these assessments mainly on how far the outcomes had spread among the local populations, the degree of appreciation by the recipient countries and the degree of achievement of the goals.

(2) Expansion of ODA Evaluation

A) Expansion of Objectives

In the 1980s, as ODA expanded under a series of ODA medium-term targets, it gradually attracted public attention. ODA evaluation accordingly started to receive more attention as a means of ensuring the Government of Japan's accountability for ODA. Since the 1990s, therefore, MOFA has listed fulfilment of accountability as one of the major objectives of evaluation, alongside improvement of ODA management.

B) Enhancement of Functions

In the 1990s, reflecting the expansion of evaluation objectives, the functions of evaluation also expanded. While the focus had conventionally been on feeding evaluation results back into the formulation and implementation of ODA plans, evaluations increasingly needed to extend their functions to explaining the effects of ODA to the public for the


purpose of accountability. Evaluation has therefore become a bridge between the public and government agencies, serving as a means of providing the public with information on government activities. MOFA has published evaluation results on its website every year since 1997, and all evaluation reports and annual reports are currently available there.

C) Diversification of Time Periods for the Implementation of Evaluation

As greater importance came to be attached to the role of evaluation, reflecting the expansion of its objectives and functions, there were growing demands for ex-ante and mid-term evaluations of development assistance plans and their implementation progress. This is believed to reflect the view that effective ODA can be achieved by managing each step of development assistance, from planning through implementation to the realization of outcomes, in a consistent manner.

In 2000 and 2001, two reports, "Report on Reform of Japan's ODA Evaluation System" and "ODA Evaluation Study Group Report," were submitted to the Foreign Minister by the ODA Evaluation Reviewing Panel, which had been established as a private advisory body for the then Director-General of the Economic Cooperation Bureau. Their recommendations included that "a system should be established for consistent evaluation in each step of ex-ante, mid-term, and ex-post." The ODA Charter, which was revised in August 2003, also states that the government "will carry out (…) coherent evaluations at all stages, i.e. ex-ante, mid-term, and ex-post."

D) Expansion of Evaluation Targets

From the late 1980s to the early 1990s, the main donors suffered from "aid fatigue" and reduced their ODA owing to their own fiscal difficulties. The significance and results of past ODA were debated, and it was suggested that ODA implementation practices be reviewed in order to make ODA more efficient and effective.

Under these circumstances, new forms of development assistance were introduced to promote ODA further, including comprehensive approaches and strategies going beyond the framework of individual projects, such as the sector-wide approach, the Comprehensive Development Framework (CDF) and Poverty Reduction Strategies (PRS). These frameworks incorporated ideas for managing government activities, such as New Public Management, by adopting management mechanisms including the planning, monitoring and evaluation of a comprehensive assistance strategy.

Japan has strengthened activities on country- and sector-based assistance and has also been developing Country Assistance Policies. Accordingly, the scope of evaluation targets expanded from individual projects to assistance plans at sector, country and thematic levels. Hence, MOFA and JICA as an implementing agency are collaborating with each other to conduct evaluations at the policy, program and project levels.


BOX 4  Levels of Evaluation

● Policy-level evaluation

Evaluations of a collection of activities (programs and/or projects) pursuing the realization of Japan’s basic policies (such as ODA Charter, the Medium-Term Policy on ODA, Country Assistance Policies, etc.)

● Program-level evaluation Evaluations of a collection of a number of various activities such as projects, which share common objectives

● Project-level evaluation Evaluations of individual economic assistance projects

E) Diversification of Evaluators

Conventionally, ODA evaluations were conducted internally by MOFA and JICA. However, after 2000, as the reform of ODA became more full-fledged, external third-party evaluations gained more attention as a way to ensure the transparency and efficiency of ODA. The report of the Second Council on ODA Reform, submitted in 2002, and the report of the Advisory Board on the Reform of MOFA, prepared for the "Conference for Change" aimed at reforming MOFA, both included recommendations on expanding evaluation. In particular, evaluations implemented by third parties and by recipient countries/agencies, as well as joint evaluations conducted with other donors, gradually gained importance.

Currently, MOFA implements mainly third-party evaluations, as well as evaluations conducted by recipient countries/agencies, joint evaluations, etc. A third-party evaluation is conducted by an external third party. An evaluation by a recipient country/organization is entrusted to government officials, universities or research institutes in the recipient country in order to incorporate the recipient country's perspective. Joint evaluations are conducted by MOFA together with external bodies such as other donors, international organizations and NGOs. In this way, MOFA implements ODA evaluations through various types of evaluators.

2.2 Implementation System

(1) Roles of MOFA and JICA

Japan’s ODA evaluations are primarily implemented by MOFA and JICA. In order to perform evaluations efficiently, MOFA and JICA distinguish their objectives of evaluations and play different roles.


MOFA implements evaluations at policy/program levels as it is responsible for planning/developing ODA policies. On the other hand, JICA, which implements each ODA project, focuses on project-level evaluations, while it also conducts some thematic evaluations.

Policy-level evaluations by MOFA are conducted on basic ODA policies such as the Medium-Term Policy on ODA, Country Assistance Policies and ODA policy on priority issues. Program-level evaluations are conducted by comprehensively analyzing a collection of several projects under the same objectives, following the MOFA ODA evaluation items. By contrast, JICA conducts its evaluations on each project in a more specific and precise manner.

In addition, since the "Basic Law on Reforming Government Ministries of 1998" gave MOFA responsibility for coordinating the government's overall ODA activities, the ODA Evaluation Division of the Minister's Secretariat has held an inter-ministerial Liaison Meeting on ODA Evaluation every year in order to share information on evaluation methodology, international trends, etc. Furthermore, the Division publishes the "Annual Report on Japan's ODA Evaluation" every year, which compiles the results of ODA evaluations conducted by each ministry.

Figure 1: Evaluation Scope and System of Implementation
[Diagram not reproduced. It maps ODA evaluation to ODA implementation at three levels: the policy level (ODA Charter, Medium-Term Policy on ODA, Country Assistance Policies, etc.) and the program level (sectoral assistance plans, aid modalities, etc.) are covered by ODA evaluation by MOFA, while the project level (individual projects, etc.) is covered by ODA evaluation by JICA.]


(2) MOFA's ODA Evaluation and Policy Evaluation

There are two types of MOFA evaluation of economic assistance: "ODA Evaluation", conducted under the Order for the Organization of MOFA, and "Policy Evaluation", conducted under the Government Policy Evaluations Act. Each has its own legal basis and different implementation systems, procedures, etc.

A) ODA Evaluation Based on the Order for the Organization of MOFA

Conventionally, the International Cooperation Bureau was in charge of ODA evaluations based on the Order for the Organization of MOFA; however, following MOFA's institutional reform, the ODA Evaluation Division was relocated to the Minister's Secretariat (structurally positioned above the International Cooperation Bureau) from FY2011 in order to maintain the independence of evaluation. ODA evaluations can be divided into the following three categories according to the evaluating party:

(i) Third-Party Evaluation

Most evaluations implemented by MOFA are third-party evaluations (evaluations conducted by a party independent of both the donors and the recipients of assistance). From October 2003 to March 2010, these were entrusted to external advisors or consultants through the "External Advisory Meeting on ODA Evaluation." From FY2011, however, MOFA introduced an open tendering system, in which consulting companies participating in the bidding submit proposals and form an evaluation team including experts (a chief evaluator and an advisor). The winning bidder is entrusted with the evaluation and must conduct it with the evaluation team named in its proposal.

(Reference) Recent evaluation studies
Country assistance evaluation:

Japan’s ODA policy to Nepal (FY2012), Cuba (FY2012), the Republic of Malawi (FY2012), etc Priority issue evaluation:

Triangular Cooperation (FY2012), Japan's Assistance for Policies and Institutions that Promote Gender Equality (FY2011), etc

Aid Modality Evaluation: Japan Disaster Relief Team (FY2012), Grant Aid for Fisheries (FY2010), etc

Sector evaluation: Health Sector in Cambodia (FY2012), etc.

(ii) Partner Country-led Evaluations

Having evaluations conducted by the recipient side of assistance, i.e. by partner country governments and organizations, not only secures the transparency and fairness of Japan's ODA evaluations but also promotes understanding of Japan's ODA among the public of recipient countries,


in addition to enhancing their evaluation capacities. The evaluators are government officials and agencies of the recipient countries, or consultants, think tanks, academic institutions or NGOs with professional competence in the field.

(Reference) Recent evaluation studies:

Japan's ODA to the education sector in Mozambique (FY2011)
Japan's ODA in the water sector in Senegal (FY2010)

(iii) Joint Evaluations

MOFA also conducts joint evaluations with external organizations, such as other donor countries/international organizations, recipient countries, NGOs, etc. when considered appropriate. In some cases, MOFA officials or officials from other external organizations, in addition to experts, may also participate as evaluators. Generally, evaluators hire a consultant specialized in gathering and analyzing information as a member of the evaluation team. However, in some cases, consultants conduct an evaluation directly by themselves.

In addition to improving ODA management and maintaining accountability, joint evaluations with recipient countries are significant in supporting the ownership of the recipient countries as well as strengthening partnerships with them. Furthermore, increasing attention has been paid in recent years to the importance of joint evaluations with several donors in order to better understand the range of development assistance implemented by various donors; Japan has accumulated experience with such evaluations since FY2002. In 1997, a joint evaluation by MOFA and NGOs was launched as "Mutual Learning and Joint Evaluation by NGOs and MOFA." Initially the goal was mutual understanding through conducting a joint evaluation; building on that experience, the main purpose now is to strengthen coordination between the two parties.

(Reference) Past evaluation studies:
Joint Evaluations with Partner Countries:

Transport Infrastructure Sector in Vietnam (FY2005)
Joint Evaluations with Other Donors or International Agencies:

The US-Japan Partnership for Global Health in Zambia (FY2007)
Joint Evaluations with NGOs:

Education Sector in Laos (FY2008)

B) Policy Evaluation Based on the Government Policy Evaluations Act

The Government Policy Evaluations Act (GPEA) was adopted in June 2001 and came into effect in April 2002. The objectives were to "promote the objective and strict implementation of policy evaluation by setting fundamental principles, etc. regarding policy evaluation conducted by governmental organizations; reflect the outcome to the policy making in an appropriate manner; publish the information regarding evaluation of policies to the public; promote efficient and effective


government functions; and fulfil the accountability to public regarding actions taken by the government." The GPEA thus sets management and accountability as the primary objectives of evaluation, requiring the responsible governmental organizations to conduct evaluations from the viewpoints of efficiency and effectiveness so as to understand the outcomes of their policies, and to reflect the results appropriately in the policy-making process. The targets of these evaluations are policies, programs and projects; among them, MOFA conducts policy evaluations of diplomatic policies such as the overall economic assistance policy, ex-ante evaluations of general grant aid projects whose exchange-of-notes (E/N) amount exceeds 1 billion yen and of yen loan projects whose E/N amount exceeds 15 billion yen, and ex-post evaluations of ODA projects that have not been completed after five and ten years.

Figure 2: Policy Evaluation and ODA Evaluation by MOFA

Policy Evaluation (since 2003)
• Division in charge: Examination/Policy Evaluation Division, Minister's Secretariat
• Grounds for implementation: The Government Policy Evaluations Act (2001)
• Subject of evaluation: Policies and projects implemented by MOFA, including the following economic assistance areas: overall policy on economic assistance; ex-ante evaluation of general grant aid projects with an E/N credit limit exceeding 1 billion yen and of ODA loan projects with an E/N credit limit exceeding 15 billion yen; ex-post evaluation of ODA projects that have not been completed after five and ten years
• Standards of evaluation: Needs, effectiveness and efficiency of the policy, as well as other necessary aspects of the implementation policy
• Evaluator: Self-evaluation

ODA Evaluation (since 1981)
• Division in charge: ODA Evaluation Division, Minister's Secretariat
• Grounds for implementation: Order for Organization of MOFA, ODA Charter, etc.
• Subject of evaluation: Evaluation of Japan's ODA implementation at the policy and program levels: country assistance evaluation, priority theme evaluation, aid modality evaluation, and others (sector evaluation, joint evaluation, evaluation through international organizations, etc.) (*JICA implements evaluations at the individual project level)
• Standards of evaluation: "Relevance of Policies", "Effectiveness of Results" and "Appropriateness of Processes", based on the five OECD-DAC evaluation criteria
• Evaluator: Third party (external)


[Diagram not reproduced: relationship between the three evaluation schemes, from the policy level down to the project level.]

Policy Evaluation (MOFA)
<Grounds for Implementation> The Government Policy Evaluation Act (2001)
<Form> Self-evaluation
<Objectives> Overall policies and programs of MOFA; policy evaluation of administrative projects (effectiveness of measures for achieving policy/program goals); ODA evaluation based on the Government Policy Evaluation Act (Arts. 7 and 9), covering foreign policy and international cooperation:
- Grant aid exceeding 1 billion yen (ex-ante evaluation)
- Yen loan projects exceeding 15 billion yen (ex-ante evaluation)
- Ex-post evaluations for ODA projects that have not been completed after five and ten years

ODA Evaluation (MOFA)
<Grounds for Implementation> Order for Organization of MOFA, ODA Charter, etc.
<Form> Third-Party Evaluation
<Objectives> Policy-level evaluation focusing on Country Assistance Policies, priority areas, etc.

Operations Evaluation (JICA)
<Grounds for Implementation> JICA Law, JICA Internal Regulations
<Form> External and Self Evaluation
<Objectives> Evaluation mainly of loan, grant and technical cooperation projects (project level)


2.3 Objectives and Functions

(1) Objectives

MOFA has set the following two objectives for ODA evaluation:

• Improvement of the management of ODA (feedback to ODA policies)
Work towards improving the quality of ODA and supporting its management by examining ODA activities and feeding lessons learned back into ODA policy-making and implementation processes

• Fulfilment of accountability to the public
Fulfil accountability to the public, and promote public participation in and understanding of ODA, by improving transparency through the publication of evaluation results

(2) Functions

MOFA's ODA evaluation is conducted in accordance with MOFA's implementation system in order to achieve the two goals listed above. For improving the management of ODA, a feedback function is essential for providing useful information for future ODA policy making and implementation. In addition, from the viewpoint of accountability to the public, accurate information needs to be provided to the public in a clear way. To meet these demands, MOFA publishes evaluations on its website in addition to providing feedback on evaluation results to the relevant MOFA divisions responsible for the policies/projects, to the implementing agency and to Japan's diplomatic missions.

Figure 3: Functions of ODA Evaluation (PDCA cycle)
[Diagram: Plan (policy planning) → Do (implementation) → Check (evaluation) → Act (feedback), and back to Plan]

Evaluations may be categorized in the following chronological order: ex-ante, mid-term and ex-post evaluations. This categorization applies conceptually not only to project-level evaluations but also to policy/program-level evaluations. However, since in reality ODA policies either continue or are revised (except in cases where partner countries no longer receive ODA), it is difficult to



classify ODA evaluations at the policy level in chronological order. For instance, rather than classifying such an assessment as ex-ante, mid-term or ex-post, what matters is to conduct an evaluation before an ongoing plan or policy framework reaches the end of its time frame, so that it can provide useful input for developing the next Country Assistance Policy.

The following explains the classification of evaluations in chronological order, even though it does not fully apply to policy-level ODA evaluations.

A) Ex-ante Stage

Evaluations conducted at the ex-ante stage (during policy planning) are called ex-ante evaluations. As mentioned above, ex-ante evaluations have two aspects: providing information for policy makers to formulate policies, and providing the material needed to set clear goals and indicators for achieving them.

B) Mid-term Stage

Mid-term evaluations, when conducted during the implementation of ODA, provide information for improving ODA by assessing its appropriateness, efficiency and the degree of achievement of its goals at the mid-term stage. Since the outcomes of ODA may only materialize years after a project is implemented, the focus of a mid-term evaluation is on understanding the state of implementation rather than the long-term effects of ODA. In addition, monitoring may be conducted periodically during implementation in order to measure the indicators set at the planning stage.

C) Ex-post Stage

Ex-post evaluations are conducted after the ODA has been completed. They provide recommendations and useful information for future ODA by examining, from an overall viewpoint, the objectives, processes, degree of goal achievement and effectiveness of the ODA concerned.

D) Feedback

The results of evaluations are reported to ODA policy makers and implementing agencies both in Japan and in the recipient countries; the agencies concerned are then required to respond to the recommendations.

At MOFA, recommendations and lessons learned from evaluations of basic ODA policies are reflected in the revision and planning of the ODA Charter, the Medium-Term Policy on ODA, Country Assistance Policies, etc., and are also utilized in ODA policy making and implementation by the International Cooperation Bureau and Japan's diplomatic missions. In addition, since JICA mainly conducts project-level evaluations, lessons learned from similar projects implemented in the past are fed back into the project planning stage.


E) Publication

The results of ODA evaluations are made public in fulfilment of the government's accountability. The published contents include the objectives, implementation processes, effects, etc., which the public can use to gain a better understanding of ODA. MOFA publishes evaluation reports and overviews as well as an annual report (the Annual Report on Japan's ODA Evaluation) summarizing them; all are available on the MOFA website.

2.4 Evaluation Targets

Depending on the evaluation targets, MOFA classifies ODA evaluations into two: (1) policy-level evaluation and (2) program-level evaluation.

Policy-level evaluations (country assistance evaluations and priority issue evaluations) assess collections of projects and programs implemented with the common purpose of realizing Japan's basic policies on economic cooperation.

Program-level evaluations (aid modality evaluations and sector evaluations) assess a collection of various projects, etc. which share common objectives.

Please note that JICA conducts project-level evaluations of individual projects.

(1) Policy-Level Evaluations

A) Country Assistance Evaluations

Country Assistance Evaluations assess the overall assistance policies for individual countries. Conventionally, these evaluations were conducted in major recipient countries for which the government had country assistance programs, aiming to contribute to planning and revising those programs. However, since Country Assistance Policies began to be formulated for all ODA recipient countries in FY2011, MOFA has been selecting the subjects of evaluation through a comprehensive consideration of the need for a country assistance evaluation, reflecting factors such as the country's increasing importance in Japan's diplomatic policies.

Assessments focus mainly on the progress of implementation of the ongoing assistance policies in order to draw lessons for improving the planning of future assistance policies. Country assistance evaluations are conducted by external third parties.

B) Priority Issue Evaluations

Priority Issue Evaluations assess initiatives presented by Japan as priority issues in the ODA Charter or at international conferences. For example, poverty reduction, disaster risk reduction and peacebuilding can be subjects of these evaluations.

Priority issue evaluations aim either to clarify the outcomes of past activities (Comprehensive Evaluations) or to draw lessons to be reflected in the future planning and


implementation of Japan's assistance policies (Formative Evaluations). Priority issue evaluations are conducted by third-party evaluators; however, they may also be conducted jointly with other donors, NGOs, etc.

(2) Program-Level Evaluations

A) Aid Modality Evaluations

In principle, Aid Modality Evaluations focus on a particular aid modality (e.g. Grant Aid for Grassroots and Human Security, Grant Aid for Japan's NGO Projects) in order to draw lessons for the further improvement or revision of that modality. Aid modality evaluations are conducted by third-party evaluators.

B) Sector Evaluations

Sector Evaluations assess a collection of ODA activities in a given sector in a specific country. If there is a sector-based development plan for an area such as medical care, health or infrastructure, that plan will be the subject of evaluation; if there is no such plan, overall ODA activities in the sector will be assessed. These evaluations are conducted by third-party evaluators; however, joint evaluations with other donors, NGOs, etc. are also possible.


Chapter 3  Implementation of ODA Evaluations

This chapter illustrates the processes and methods used to implement MOFA's ODA evaluation.

3.1 Implementation Processes

MOFA's ODA evaluation generally follows the process below.

Planning:

MOFA creates an annual ODA evaluation plan by selecting the target countries, priority issues, aid modalities and sectors, as well as the types of evaluation (third-party evaluation, partner country-led evaluation, joint evaluation, etc.). The plan is then reported to the Development Project Accountability Committee held by the International Cooperation Bureau of MOFA.

Implementation:

The way an evaluation is implemented varies with its type; the details are therefore explained later under the following categories: (1) third-party evaluations, (2) partner country-led evaluations and (3) joint evaluations.

Feedback:

Evaluation results are reported to the ODA policy makers and implementation agencies both in Japan and recipient countries. Specifically, the ODA Evaluation Division, which belongs to the Minister’s Secretariat, invites the International Cooperation Bureau as well as JICA and hosts an Internal Follow-up Meeting on ODA Evaluation in order to report the evaluation results and develop response measures of the International Cooperation Bureau.

The results of the follow-up efforts are listed in annual reports available on the MOFA website (http://www.mofa.go.jp/policy/oda/evaluation/index.html). In addition, evaluation reports are distributed to the relevant officials of the recipient countries through Japan's diplomatic missions.

Publication:

Evaluation results are reported to the Development Project Accountability Committee and are also made available on the MOFA website. Moreover, annual reports containing overviews of the evaluations implemented each fiscal year are distributed to Diet members, experts, NGOs, universities, libraries, etc.


Figure 4: Implementation Processes of ODA Evaluations
[Diagram: Planning stage (plan the overall evaluation; plan individual evaluations) → Implementation stage (set an evaluation framework; implement the evaluation; analyze the assessment; create reports) → Publication stage (publication of reports) → Feedback stage (Internal Follow-up Meeting on ODA Evaluation; follow-up on evaluations: response measures, summary of results)]

◆ (1) Third-Party Evaluations

Implementation of third-party evaluations is entrusted to external third parties selected through the general competitive bidding system. The entrusted evaluation team carries out all the work, from planning the ODA evaluation survey to producing the report, in consultation with the ODA Evaluation Division. The schedule and work procedures are outlined below.

A) Development of an Evaluation Implementation Plan (April)

In order to announce the general competitive bidding, the ODA Evaluation Division creates a tentative evaluation implementation plan in consultation with the relevant divisions and departments within MOFA. The tentative plan sets out basic elements such as the purposes and subjects of the evaluation, the work process, the implementation schedule, etc.

B) Selection of Evaluation Teams (May - June)

To participate in the general competitive bidding process, each consulting firm must present a project proposal and form an evaluation team consisting of a chief evaluator (an expert in development evaluation with experience in the field) and an advisor (an expert on the region, the field or the bilateral relationship between Japan and the selected country). The chief evaluator supervises the evaluation report as a whole, while the advisor advises the evaluation team as an expert on the evaluation subject.

Prior to the implementation of the evaluations, the ODA Evaluation Division holds a joint meeting with representatives of each team (primarily the chief consultants) and explains important general matters in order to ensure consistent quality across evaluations. The Division also monitors the progress of the evaluation reports in a timely manner. Representatives of the evaluation teams are requested to maintain good communication with the other team members and to communicate with the ODA Evaluation Division, compiling the opinions of the entire team.



C) Planning Evaluation Designs (July to August)

The selected evaluation teams (a chief evaluator, an advisor and a few assisting consultants) develop evaluation designs, which include the objectives, scope, methods, work schedule, etc. Upon completion, the first consultation meeting is held with the relevant MOFA and JICA officials. The meeting is an opportunity to confirm the details of the evaluation designs with the relevant parties and to modify them, if necessary, in the light of a fuller understanding of the evaluation needs.

D) Implementation of an Evaluation Study (July to December)

Based on the evaluation designs, the evaluation teams conduct research both in Japan and through field surveys abroad (a MOFA official accompanies the field survey as an observer). The period of the field survey in the recipient country or other relevant countries is decided in consultation with the ODA Evaluation Division. A second consultation meeting is held prior to the field survey in order to share the schedule, objectives, etc. of the evaluation among the relevant parties.

Please note that requests for information and appointments for interviews with MOFA, JICA or the government agencies of the selected country shall be arranged through the ODA Evaluation Division.

E) Field Survey Briefing Session (October to December)

After the implementation of the field survey abroad, the evaluation team holds a third consultation meeting with the participation of MOFA, JICA and other relevant agencies in order to share the results of the survey and the plans of the report.

F) Drafting the Report (December to January of the Following Year)

The evaluation team drafts the report by analyzing collected information and summarizing the results of their work. G) The Final Consultation Meeting/Completion of the Report (January of the Following

Year to February) When completing the final drafts of the report, the evaluation team holds a final consultation

meeting with the participation of MOFA, JICA and other relevant agencies in order to receive comments on the drafted reports. Considering these comments, the evaluation team finalizes the report. The contents of the report (the evaluation results) reflect the views of the evaluation teams, whereas the copyright of the report belongs to MOFA.


(2) Partner Country-Led Evaluations

In principle, Japan's diplomatic missions decide the evaluators, evaluation targets, etc., considering the achievements of ODA and the evaluation capacities of recipient countries. The diplomatic missions are responsible for concluding contracts with evaluators, following up on the progress of evaluations, etc. When the final reports are completed, the diplomatic missions invite relevant officials from the ODA Task Force and the governments of recipient countries to a local meeting to report the evaluation results as a follow-up.

(3) Joint Evaluations

A) Joint Evaluations with the Recipient Countries

This type of joint evaluation is conducted upon agreement between the recipient country and Japan. In principle, the evaluators on each side consist of officials from MOFA of Japan, relevant officials from the recipient country and consultants from both countries; however, there are some cases in which the evaluations are conducted solely by the consultants selected by both countries.

B) Joint Evaluations with Other Donors / International Organizations

This type of joint evaluation is conducted upon agreement with other donors or international organizations. The evaluation teams are composed in the same way as above.

C) Joint Evaluations with NGOs

This type of joint evaluation is conducted upon agreement with NGOs. The third-party joint evaluation conducted by NGOs and experts in FY2008 followed the procedures outlined in (1) and involved the representatives of NGOs in the evaluation team.

3.2 Evaluation Methods

This section introduces the main evaluation criteria and basic evaluation methods used in MOFA's third-party evaluations mentioned in 3.1 (1) above.

(1) Main Evaluation Criteria

A) Evaluation from Development Viewpoints (Three Evaluation Criteria)

Currently there is no set method for conducting policy and program level evaluations at the international or national level; nor are there any established standards.

In order to implement third-party evaluations, MOFA has set out its own evaluation standards based on the five evaluation criteria announced by the OECD-DAC in 1991. Among others, "Relevance of Policies", "Effectiveness of Results", and "Appropriateness of Processes" are to be examined.


The following (2) and thereafter explain how to use these three criteria according to each evaluation subject. Please note that evaluations conducted by MOFA based on the Policy Evaluation Act are implemented in accordance with the "Basic Plans on Government Policy Evaluation," including the aspects of needs, effectiveness, efficiency, etc.

BOX 5 Three Evaluation Criteria from Development Viewpoints adopted by MOFA

●Relevance of Policies
Evaluations assess the relevance of the policies/programs under evaluation to Japan's higher-level ODA policies and to the needs of the partner country.

●Effectiveness of Results
Evaluations assess whether or not the original goals have been achieved. Since Japan's ODA policies or programs rarely set measurable goals, this is often conducted as a compilation of the details of outputs or outcomes.

●Appropriateness of Processes
Evaluations assess whether or not appropriate processes have been taken in order to ensure the relevance or effectiveness of the policy/program.

B) Evaluation from Diplomatic Viewpoints

Since Japan has been experiencing economic and financial difficulties, and ODA is funded by taxes collected from the public, it has become particularly important to measure not only the contribution of ODA to recipient countries from the development viewpoint, but also how much ODA has contributed to Japan's national interests from the diplomatic viewpoint. The "ODA Review Final Report," presented by MOFA in June 2010, clearly states that ODA is one of Japan's diplomatic measures, pursuing the common interests of the international community. Against this background, the sixth edition of the Guidelines introduced a new criterion: evaluation from diplomatic viewpoints.

However, considering the difficulty for third parties of objectively measuring effects at the level of national interests, this evaluation shall be a qualitative analysis, mainly comprising interview surveys with relevant parties in Japan and abroad, rather than a quantitative analysis.


BOX 6 Criteria of Evaluation from Diplomatic Viewpoints (Example)
(From "Research on ODA evaluation (measures/systems) in policy level" in FY2010)

Diplomatic Importance
• Importance of diplomatic relations with the recipient country
• Importance of the recipient country in light of the backbone of Japan's foreign affairs (international peace and prosperity)
• Importance of the assistance in the bilateral relations analyzed above

Diplomatic Impact (Contribution to Japan's Benefit)
• Effects on bilateral diplomatic relations
• Ripple effects on deepening the bilateral economic relations
• Ripple effects on promotion of friendly relations and improvement of favourable impressions of Japan
• Effects on sharing of opinions and support for Japan's position at international meetings, etc.
• Ripple effects on regional stability and sustainable development, etc.

BOX 7 Steps for Evaluating Diplomatic Goals (Example)
(From "Evaluation on Grant Aid for Fisheries" in FY2011)

Step 1: Does Japan share the diplomatic goals down to the field level?
- Indicator: Degree of understanding of / action towards Japan's diplomatic goals by JICA and the Embassy
- Evaluation method: Interview surveys with JICA and the Embassy of Japan in the recipient country
- Position of the evaluation: Appropriateness of the process towards the diplomatic goals

Step 2: Is the recipient country aware of Japan's policy and the diplomatic goals?
- Indicator: Degree of understanding of Japan's diplomatic goals by the recipient country
- Evaluation method: Interview surveys with the government agencies of the recipient country
- Position of the evaluation: Effectiveness of schemes towards the diplomatic goals

Step 3: Is the recipient country showing a positive reaction?
- Indicator: Degree of cooperative action with Japan's diplomatic goals by the recipient country
- Evaluation method: Data evaluation of the voting behaviour, etc. of the recipient country in the international arena
- Position of the evaluation: Effectiveness of schemes towards the diplomatic goals

(2) Country Assistance Evaluations

A) Evaluation Designs

First, an evaluation team will develop an evaluation design which includes (i) objectives, (ii) scope, (iii) methods, etc. In this process, the team may hold necessary consultations with the ODA Evaluation Division. It is also required to acquire information and obtain comments from MOFA, JICA and other related parties by holding consultation meetings, etc.

(i) Objectives

MOFA's main objectives of evaluations are to improve the management of ODA and to fulfil accountability. In country assistance evaluations in particular, the objectives are to promote


understanding of the public regarding Japan's ODA and to reflect the results in Japan's foreign and development policies. In addition, when evaluations are conducted in alignment with review/revision cycles of the Country Assistance Policies, they will be used as reference.

(ii) Scope

Policy objectives will be sorted systematically in order to set boundaries for the scope of evaluation. When conducting evaluations on Country Assistance Policies, systematic charts, which simply outline the scope, will be created. It is ideal to describe individual projects as much as possible by referencing the Rolling Plan, an annex of the Country Assistance Policies, etc.

In cases where the target country is not in the scope of the Country Assistance Policies, the systematic chart of the country will be created based on its basic objectives and priority areas, by referencing the Rolling Plan of the selected country.

In addition, while all ODA activities can be the subject of country assistance evaluations, the scope shall be narrowed by setting an evaluation time period. Generally, when evaluating Country Assistance Policies, the time period is set from the time when the plan or policy was developed to immediately before the evaluation.

Also, assistance provided through international agencies may be a subject of evaluation when considered appropriate (however, this is limited to assistance for which the usage of funds can be identified, such as the Japan Fund, excluding funds allocated to the regular budgets of international organizations and contributions).

(iii) Method

Determinations on the methods of evaluation shall be made considering evaluation frameworks, which summarize the viewpoints, standards, sources of information, etc.

Currently there is no internationally or domestically established method; MOFA conducts evaluations comprehensively in the aspects of planning, processes and outcomes. In this method, the evaluation is implemented based on the analysis of each evaluation aspect, which is set for each evaluation criterion. In principle, the standards of evaluation are set on the relevance of policies, effectiveness of results and appropriateness of processes. However, the standards are not limited to these, and additional appropriate standards can be set depending on the circumstances.
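As an illustration only (all entries below are hypothetical and not a prescribed MOFA format), the following sketch shows one way such an evaluation framework could be laid out, pairing each criterion with evaluation items, indicators and sources of information.

```python
# Illustrative sketch only: hypothetical entries showing how an evaluation
# framework might pair each criterion with items, indicators and data sources.
evaluation_framework = [
    {
        "criterion": "Relevance of Policies",
        "evaluation_item": "Consistency with the partner country's needs",
        "indicator": "Alignment with the national development strategy",
        "information_sources": ["Country Assistance Policy", "partner government interviews"],
    },
    {
        "criterion": "Effectiveness of Results",
        "evaluation_item": "Degree of achievement of the original goals",
        "indicator": "Outcome indicators for each priority area",
        "information_sources": ["implementing agency reports", "local statistics"],
    },
    {
        "criterion": "Appropriateness of Processes",
        "evaluation_item": "Monitoring of policy implementation",
        "indicator": "Frequency of policy and sector meetings",
        "information_sources": ["ODA Task Force records"],
    },
]

# Print a compact overview of the framework.
for row in evaluation_framework:
    print(f"{row['criterion']}: {row['evaluation_item']} -> {row['indicator']}")
```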

Furthermore, it is important not just to strictly follow the rules (policies) of evaluation based on what is written in the Country Assistance Policies, but to verify in detail what has been achieved (or has not been achieved) in the general ODA policies that are reflected in those principles, and what led to such results.

✦Relevance of Policies

Relevance of Policies assesses Japan's ODA on 1) consistency with the partner country's needs, 2) consistency with priorities in Japan's policies (whether the policy remained relevant from planning through to the time of evaluation), 3) consistency with international priority issues, 4) the balance with other donors, and


5) Japan's competitive advantage. Regarding 1) and 2), evaluations will be implemented on the targets listed in a systematic chart by assessing whether the policy in question was set considering Japan's priority policies and international development challenges.

✦Effectiveness of Results

Effectiveness of Results assesses the degree to which the original goals have been achieved by assessing the outcomes in detail, looking at the flow from input to output. Specifically, effectiveness of results assesses 1) the proportion that Japan's assistance represents in the development budget of the recipient country (input), 2) the degree of achievement of the originally set goals, and 3) the balance between the inputs and the degree of achievement of the originally set goals, etc. As such, the evaluation of effectiveness requires indicators to assess performance at each level: input, output and outcome. However, since such indicators are not set in Japan's assistance policies, there is a need to set them at the design stage by incorporating the relevant information. Furthermore, inputs from other donors, international organizations, recipient countries, NGOs and other relevant agencies need to be taken into account when assessing the effectiveness of Japan's assistance, since development outcomes are the consequence of various factors.

✦Appropriateness of Processes

Appropriateness of Processes assesses whether a certain process has been adopted in order to ensure the relevance of policies and effectiveness of results. Specifically, appropriateness of processes assesses 1) whether efforts or approaches have been made to specific items listed in the Country Assistance Policies, 2) whether efforts have been made (policy meetings, sector meetings, etc.) to continuously understand the needs of the recipient country, 3) whether an implementation system for the local ODA Task Force and that of MOFA have been organized, 4) whether there has been a process to periodically monitor the conditions of implementation of policy, and 5) whether efforts have been made to coordinate with other donors, international agencies, etc.

B) Implementation of an Evaluation Survey

The evaluation team will conduct surveys both domestically and in the selected country to collect the information necessary for the evaluation design, especially for making evaluative judgements. Considering the limits on time and budget, as much information as possible shall be gathered before the field research, and the information which can be gathered only in the field shall be specified (e.g., interviews with relevant parties of the government in the selected country, collection of local statistics, etc.).

While there are various methods to collect and analyze data such as literature reviews, research of examples, baseline studies, etc., it is effective to use the triangulation method, which is a combination of various methods to obtain information with high credibility.


Regarding field studies, it is important to create questionnaires listing the questions necessary for the evaluation and to send them, through Japan's diplomatic missions, to the relevant parties in the selected countries prior to the field studies for effective implementation.

C) Analysis of Information

Evaluations are implemented based on the analysis of information collected in domestic and overseas field studies, following the evaluation framework. In cases where quantitative objectives are set and there is abundant data, it is possible to conduct a quantitative analysis, such as a cost-effectiveness analysis, econometric analysis, etc.; however, since Japan's Country Assistance Policies set qualitative goals in most cases, a qualitative analysis is conducted accordingly.

Based on the results of the above-mentioned analysis, good practices for effective and efficient implementation shall be identified in order to promote such examples. Furthermore, if there are points to improve or reform, specific recommendations shall be listed along with the necessary supporting information.

D) Making the Report

The evaluation team makes a report containing the details gained through the above-mentioned procedure. In addition to a Japanese version, a foreign-language version (summary) will be made in order to feed the information back to the selected countries and to share information with other donors and international organizations. Furthermore, summaries for publication on the MOFA website are also created. The ODA Evaluation Division publishes the full reports, summaries and foreign-language versions on the MOFA website.

The evaluation team drafts a report and requests comments from the relevant agencies in order to avoid misjudgement of the facts, etc. In general, a consultation is held with the participation of relevant MOFA and JICA members to discuss the contents of the draft report. In principle, the ODA Evaluation Division requests written comments from Japan's diplomatic missions in the countries under evaluation and compiles those comments as the coordinator.

In cases where agreement cannot be reached between the evaluation team and other relevant parties because of differences in opinion, it is reasonable that the evaluation team, which is responsible for the report, makes the final decision, considering the benefit of third-party evaluations. However, in order to provide practical feedback on policies and their implementation, it is important to make an objective and well-balanced evaluation report that reflects, to some extent, the opinions of MOFA and JICA, which are the recipients of the report.

The report consists of a summary and a full-text version, which must contain the implementation policy, a summary of the scope, the evaluation results and recommendations. Summaries are important from the viewpoint of accountability, since an evaluation report tends to be large in volume, including reference materials, data, etc. For details of the report format, such as fonts, cover and contents, the evaluation team is requested to carefully refer to the "Handbook on Creating ODA Evaluation Reports" published by MOFA.


Figure 5: Framework of Objectives in Japan's ODA for Malawi (From "Country Assistance Evaluation of the Republic of Malawi" in FY2012)
(Summary of the chart's contents:)
• Government of Malawi: Vision 2020; MGDS (Malawi Growth and Development Strategy)
• Policies of other donors: World Bank, African Development Bank, United Kingdom, United States, Norway, etc.
• Japan: ODA Charter, Medium-term ODA policy, foreign policy priorities (TICAD process, etc.)
• Significance of ODA to Malawi: (1) promoting poverty reduction and human security, which are priority issues in the ODA Charter; (2) promoting aid to Africa based on the TICAD process, etc.
• Basic aims: (1) to provide assistance based on the Malawi Growth and Development Strategy, which places the highest priority on poverty reduction; (2) to give consideration to the TICAD process and collaboration with other donors
• Priority areas: boosting sustainable economic growth; social development; infrastructure development; other
• Priority issues: agriculture and rural development (promoting sustainable land management, irrigation and agriculture; improving rural livelihoods; One Village One Product (OVOP)); improving basic education (improving primary and secondary education); water resource development (safe and stable water supply); improving health and medical services (improving delivery of essential health package (EHP) services; improving HIV/AIDS services in northern Malawi); building transportation infrastructure; electric power development; governance, participatory collaboration, etc.

(3) Priority Issue Evaluations

The guidelines on planning evaluation designs for priority issue evaluations are as follows (for the implementation of evaluation surveys, information analyses and drafting the report, please follow the manner of country assistance evaluations described above).

(i) Objectives

The evaluation aims to understand the implementation status of priority issues in Japan's diplomatic and development policies, to serve as a reference for policy making and the review of sector program initiatives, and to promote public understanding of Japan's activities and efforts regarding the specific priority issues.

(ii) Scope

Policy objectives will be sorted systematically in order to set boundaries for the scope of the evaluation subjects. When the ODA Charter or priority issues in the Medium-Term Policy on ODA are in the scope of the evaluation, a figure shall be created to illustrate a systematic chart of objectives by referring to the ODA Charter, the Medium-Term Policy on ODA, or the details of sector program initiatives as appropriate.

Furthermore, the scopes may be set on multilateral assistance through international organizations when considered necessary (however, this is limited to assistance of which the


usage of funds can be identified, such as the Japan Fund, excluding funds allocated to the regular budgets of international organizations and contributions).

(iii) Method

Evaluation standards shall be set for each evaluation viewpoint so that evaluations can be made by analyzing the details against them; furthermore, in order to conduct a more specific assessment, it is effective to analyze a specific country as a case study.

✦Relevance of Policies

Relevance of Policies assesses 1) the importance in Japan's foreign policies and 2) the relevance to efforts and trends in the international community and its assistance policies. However, in contrast to country assistance evaluations, the priority issues that are the subject of the evaluation tend themselves to be listed among Japan's high-level policies/objectives.

✦Effectiveness of Results

Effectiveness of Results assesses Japan's performance in the development of the selected country in terms of the share and degree of effectiveness of such assistance in 1) the international community, 2) a specific country and 3) a specific region.

✦Appropriateness of Processes

Appropriateness of Processes assesses 1) whether efforts and approaches have been taken on the specific themes noted in the Medium-Term Policy on ODA, 2) whether coordination has been made with other donors and international organizations, 3) whether consultations have been held with the recipient country, 4) whether an implementation system has been organized, and 5) whether a process has been taken to periodically report the conditions of implementation of policies. Regarding the results of the evaluations of Relevance of Policies and Effectiveness of Results, the focus should be placed on the issues that need to be improved and on good practices that can be referenced in other programs.

(4) Aid Modality Evaluations

The guidelines on planning evaluation designs for aid modality evaluations are as follows (for the implementation of the evaluation, information analyses and drafting the report, please refer to the procedure listed for Country Assistance Evaluations).

(i) Objectives

The main objectives are to serve as a reference for improving the aid modality as a whole, as well as to promote public understanding of the activities and efforts made by the Government of Japan. It is effective to set evaluation standards for each evaluation viewpoint, against which the evaluation is implemented by analyzing the degree of achievement.


(ii) Scopes

A systematic chart of the evaluation objectives is organized in order to set the scope of the evaluation targets. A figure will be made as a simplified chart of the objectives, reflecting the implementation procedure for the aid modality. Even though the assistance under the modality as a whole will be the subject of evaluation, the scope will be narrowed by setting a time period for the evaluation. In principle, the performance of the past three to five years tends to be selected.

(iii) Method

In principle, a comprehensive method of evaluation is applied to the three basic aspects: policy, process and results. Since the aspects of concern may differ depending on the aid modality, it is possible to assess each of the three aspects in an open manner. At the same time, it is important to clarify the issues of the aid modality by considering external conditions such as possibilities and risks, in addition to understanding the strengths and weaknesses of the aid modality itself.

In addition, it is effective to analyze a specific country as a case study for more detailed analysis.

✦Relevance of Policies

Relevance of Policies assesses 1) the consistency with development needs of the recipient country, 2) consistency with Japan's priority policies such as ODA Charter and Medium-Term Policy on ODA, and 3) consistency with international development issues.

✦Effectiveness of Results

Effectiveness of Results assesses the degree to which the initial objectives have been achieved by measuring the degree of achievement of the targets of the aid modality set out in the implementation procedure. Since it is difficult to set the objectives of aid modalities in a quantitative manner, evaluations are, in principle, implemented in a qualitative way.

✦Appropriateness of Processes

Appropriateness of Processes assesses whether a procedure has been taken to ensure the relevance of the objectives of the aid modality and the effectiveness of the results. Specifically, it assesses 1) the conditions of the guidelines for operating the aid modality, 2) the implementation system, and 3) the progress of monitoring and follow-up.


Figure 6: System of Objectives for Aid Modality Evaluations (Example) (From "Evaluation of Japan's Cultural Grant Aid" in FY2003)
(Summary of the chart's contents:)
• Final objective: Development of education and culture in developing countries
• Mid-term objective 1: Deepening understanding of cultural assets/heritage — Activity 1: Exhibitions of cultural assets/heritage; Activity 2: Conservation of cultural assets and heritage
• Mid-term objective 2: Infiltrating understanding of culture — Activity 1: Cultural performances; Activity 2: Cultural exhibitions
• Mid-term objective 3: Promoting education and research — Activity 1: Implementation of education; Activity 2: Implementation of research

(5) Sector Evaluations

The guidelines on planning evaluation designs for sector evaluations are as follows (for the implementation of the evaluation, information analyses, and drafting the report, please refer to Country Assistance Evaluations).

(i) Objectives

The main objectives of ODA evaluation at MOFA are to improve the management of ODA and to fulfil accountability. In sector evaluations in particular, the main objectives are to serve as a reference for policy/plan making and revisions in the target sector in the selected country, as well as to promote public understanding of the activities and efforts made by Japan for the target policy/plan.

(ii) Scope

The objectives of plans and policies are sorted systematically in order to set boundaries for the evaluation targets. When there is a sector program plan/policy, a systematic chart shall be created based on its contents in order to illustrate the objectives in a simplified manner. In cases where there is no such plan or policy for the sector, a chart should be made referring to the Country Assistance Policies or to the Rolling Plan. Even though it is difficult to insert every piece of


information when there is a large volume of projects, it is ideal to list individual projects as much as possible. Furthermore, multilateral assistance and the setting of time periods for the evaluation subjects should be handled in the same manner as in the other types of evaluations.

(iii) Method

Evaluation standards shall be set for each evaluation viewpoint in order to make evaluations by analyzing the degree of achievement.

✦Relevance of Policies

Relevance of Policies assesses Japan’s assistance in 1) consistency with the policies, plans and needs of the recipient country, 2) consistency with Japan’s high-level policies, 3) the balance with other donors, etc.

✦Effectiveness of Results

Effectiveness of Results assesses 1) the degree of Japan's development assistance in comparison with other donors in the sector, 2) the degree of achievement of the initial goals, and 3) the degree of effectiveness in comparison to the input, etc.

For example, an evaluation can be based on the amount of input to the education sector in the selected country (input), the capacity of the schools built with Japan's assistance, i.e. the percentage of children accommodated in the country or a specific region (output), and the degree of improvement in the school enrolment rate in the country or the region (outcome), etc.
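As an illustration only, the following sketch uses entirely hypothetical figures to show how such input, output and outcome indicators might be related in an education-sector evaluation.

```python
# Illustrative sketch only: hypothetical figures showing how input, output and
# outcome indicators for an education-sector evaluation might be related.

japan_input_million_usd = 25.0          # hypothetical: Japan's input to the sector
sector_budget_million_usd = 400.0       # hypothetical: recipient country's sector budget

capacity_japan_built_schools = 18_000   # hypothetical: pupils accommodated in Japan-built schools
total_school_capacity = 450_000         # hypothetical: total capacity in the target region

enrolment_rate_baseline = 0.71          # hypothetical: before the evaluation period
enrolment_rate_latest = 0.83            # hypothetical: at the time of evaluation

input_share = japan_input_million_usd / sector_budget_million_usd      # input level
output_share = capacity_japan_built_schools / total_school_capacity    # output level
outcome_change = enrolment_rate_latest - enrolment_rate_baseline       # outcome level

print(f"Input share of sector budget : {input_share:.1%}")
print(f"Output share of school places: {output_share:.1%}")
print(f"Change in enrolment rate     : {outcome_change:+.1%}")
```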

✦Appropriateness of Processes

Appropriateness of Processes assesses 1) whether enough consultation has been made with the recipient country in order to understand the development issues of the specific sector, 2) whether beneficial coordination has been made between sectors, 3) whether coordination and collaboration with other donors and international organizations have been made, 4) whether the implementation systems for assistance have been sufficient, and 5) whether there has been a process to periodically grasp the conditions of implementation of the policy.


Figure 7: Systemized Objectives of Sector Program Evaluations (Example)
(Summary of the chart's contents:)
• Final goal: Improvement of primary education in Country A
• Mid-term goal 1: Improvement of enrolment rate
• Mid-term goal 2: Improvement of students' achievement
• Mid-term goal 3: Achievement of gender equality
• Supporting activities include: assistance for literacy education for women; promotion of education in rural areas; expansion of health and nutritional services; training for teachers; preparing equipment and facilities for education; preparing textbooks and teaching materials; preparing the environment for education; and assistance for remote education

3.3 Aiming for Higher Quality Evaluations

Improvements in ODA evaluation methods have been made consistently, while various challenges have been pointed out, such as conceptual confusion, technical challenges, feedback of evaluation results, independence of evaluators, ethical issues, etc.

Evaluators and other relevant parties, including requesters of evaluations, policy makers, those who are evaluated, etc., need to consider the following points in order to further improve the quality of ODA evaluations.

(1) Improving Quality of Evaluations

A) Macro Level Impacts of Development Assistance

In order to improve the quality of evaluations, it is important to properly understand the evaluation scope so as to collect appropriate information on the set evaluation items, questions and indicators. In other words, there is a need to create an appropriate "framework for evaluation," which is referred to as an evaluation design. It is important to use the most appropriate method for evaluating the three evaluation items. In particular, for "effectiveness of results," which measures the impact of development assistance, it is possible to get an overview of the causal relations at the project and micro levels by utilizing a hierarchical tree and a logical tree. In addition, when it is difficult to capture the specific causal relations because the evaluation targets are spread widely across the world, it is effective to take a quantitative approach to explain in regression style the


relation between changes in conditions and the assistance provided by using quantitative data, as well as to use the method of explaining in detail by analyzing a country as a case study.
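As an illustration only (the data below are invented for demonstration), the following sketch shows a minimal regression of outcome changes on assistance volumes across hypothetical countries, along the lines of the quantitative approach described above.

```python
# Illustrative sketch only: a minimal regression relating invented outcome
# changes to invented assistance volumes across hypothetical countries.
import numpy as np

# Hypothetical cross-country data
assistance_million_usd = np.array([5.0, 12.0, 20.0, 35.0, 50.0, 80.0])
outcome_change_pct = np.array([0.8, 1.5, 2.1, 3.9, 4.6, 7.2])  # change in an indicator

# Ordinary least squares fit: outcome_change ≈ slope * assistance + intercept
slope, intercept = np.polyfit(assistance_million_usd, outcome_change_pct, deg=1)

predicted = slope * assistance_million_usd + intercept
residuals = outcome_change_pct - predicted
r_squared = 1 - residuals.var() / outcome_change_pct.var()

print(f"Estimated effect: {slope:.3f} percentage points per million USD")
print(f"R-squared       : {r_squared:.2f}")
```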

B) Trial Introduction of "Rating" for Evaluation from the Development Viewpoint

In the "ODA Review Final Report" (June 2010), which called for greater "visibility" of ODA, MOFA recommended introducing a rating system (classification or grading) on a trial basis into evaluation from the development viewpoint. Yet, unlike project-level evaluations, a satisfactory rating method has not yet been developed for policy-level evaluations because of the wide range of evaluation targets. In addition, there are only a few cases in which other donors (especially bilateral donors) have introduced a rating system for their policy-level evaluations. Therefore, MOFA continuously promotes the trial introduction of rating by introducing good examples of past ratings.

Furthermore, while rating has the merit of making evaluations more "visible," there is a risk of producing oversimplified results that lack consideration of the individual issues or background of each evaluation item. Therefore, such risks need to be minimized, for example by consistently providing supplementary explanations in the documents. An example of rating standards is given in Figure 8 for reference. Since it is only an example, its model is not mandatory, and it is also possible to use other rating standards or no rating standards at all.

C) Knowledge and Experience of Evaluators

The selection and procurement of the evaluation teams shall be conducted through a transparent and open procedure. The evaluation team shall include experts in evaluation methods and experts in the specific areas, as well as experts on the target country/region, with careful consideration of gender balance.


Figure 8: Evaluation Items and Rating Scale (Example) (From “Country Assistance Evaluation of Nepal” in FY2012)

Relevance of Policies
(a) Very high: Obtained an evaluation of "extremely high" for all items, and strategically conducted activities with originality and ingenuity.
(b) High: Obtained an evaluation of "high" for most items.
(c) Moderate: Obtained an evaluation of "high" for many items.
(d) Marginal: Did not obtain an evaluation of "high" for many items.

Effectiveness of Results
(a) Very high: A very large effect was confirmed in all main sectors of assistance.
(b) High: A significant effect was observed in most main assistance sectors.
(c) Moderate: Some effect was confirmed in a number of main assistance sectors.
(d) Marginal: No noticeable effect was obtained in a number of main assistance sectors.

Appropriateness of Processes
(a) Very high: Obtained an extremely high evaluation for all items in the implementation process, and good practices were confirmed in the process of policy formation or implementation which could serve as a reference for other countries.
(b) High: Obtained a high evaluation for most items in the implementation process.
(c) Moderate: Obtained a high evaluation for many items in the implementation process.
(d) Marginal: Did not obtain a high evaluation for many items in the implementation process.

Overall Evaluation
(a) Extremely satisfactory: Obtained a rating of (b) or higher for relevance, and the highest rating for the remaining two items.
(b) Highly satisfactory: Obtained a rating of (b) or higher for relevance, and a rating of (c) or higher for the remaining two items.
(c) Moderately satisfactory: Obtained a rating of (c) or higher for all three items.
(d) Marginally satisfactory: Obtained a rating of (d) for any of the items.
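As an illustration only, the following sketch expresses the example aggregation rules for the overall evaluation in Figure 8 literally; actual evaluations may apply additional judgement, and the scale itself is not mandatory.

```python
# Illustrative sketch only: the example overall-evaluation rules from Figure 8,
# taken literally. Ratings use the letters (a)-(d), with (a) the highest.
RANK = {"a": 4, "b": 3, "c": 2, "d": 1}

def overall_rating(relevance: str, effectiveness: str, processes: str) -> str:
    """Return the overall evaluation letter derived from the three item ratings."""
    others = (effectiveness, processes)
    if RANK[relevance] >= RANK["b"] and all(r == "a" for r in others):
        return "a"  # Extremely satisfactory
    if RANK[relevance] >= RANK["b"] and all(RANK[r] >= RANK["c"] for r in others):
        return "b"  # Highly satisfactory
    if all(RANK[r] >= RANK["c"] for r in (relevance, *others)):
        return "c"  # Moderately satisfactory
    return "d"      # Marginally satisfactory

# Hypothetical example: relevance (a), effectiveness (b), processes (c)
print(overall_rating("a", "b", "c"))  # -> "b" (Highly satisfactory)
```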

(2) Ensuring Neutrality of Evaluation

A) Evaluator's Ethics and the Ideal Form of Evaluation

From the viewpoint of their responsibility to benefit the public, those involved in ODA evaluation must conduct fair and objective evaluations and implement them in an effective and efficient manner with sincerity and practicality. In addition, along with respect for individuals, evaluators have to maintain an independent point of view.

B) Independence of the Department in Charge of Evaluations

It is important to improve the objectivity of evaluations by strengthening the system and independence of ODA evaluation. The relocation of the ODA Evaluation Division from the ODA Policy Division was stated in the "ODA Review Final Report." Accordingly, the ODA Evaluation Division was transferred to the Minister's Secretariat from the International Cooperation Bureau in April 2011. Since 2011, the director of the ODA Evaluation Division has been an external expert selected by open recruitment to enhance the expertise of the division.

C) Transparency of ODA Evaluation Results

In relation to the objective of fulfilling accountability, it is important to try to explain evaluation results in the evaluation reports in an understandable way to persons without special knowledge of


ODA evaluations. Since the main readers of the evaluation reports are assumed to be taxpayers, the use of technical terms that are only understandable to experts shall be minimized. It is recommended to make the report simple by inserting figures, graphs, photographs, etc.

(3) Evaluations and Stakeholders

A) Protection of Stakeholders Who Have Cooperated with Evaluation Activities

Upon the implementation of evaluations, the cooperation of stakeholders (including donors and beneficiaries) is essential in order to gain the necessary information. When listening to opinions from the stakeholders, there shall be no direct disadvantage to them (the OECD DAC evaluation quality standards, 3.3 "Consultation with, and protection of stakeholders").

B) Hearing Opinions from Stakeholders

When draft reports of third-party evaluations are submitted at consultation meetings, MOFA, JICA and other stakeholders are able to comment on the drafts. This is also an opportunity for the evaluators to avoid misunderstandings and gaps in knowledge and, at the same time, to deepen their understanding of the evaluation items. When the evaluators agree with the comments, these shall be integrated into the draft report.

In cases where a conflict of opinion occurs between the evaluators and stakeholders over the final report, both opinions shall be listed in the final report if both are considered significant (reference 4, 3.15 "Hearing of opinions from stakeholders").

Also, "intervention" by stakeholders in the judgement of evaluations shall be avoided, considering the independence of the evaluators, while it is also important for the evaluators to conduct evaluations in a neutral, not self-righteous, manner.

(4) Conducting Evaluations that Meet Policy Needs

A) Understanding Policy Needs

One of the objectives of ODA evaluations is to promote the improvement of ODA; evaluations are not conducted solely for their own sake. Therefore, when deciding the scope and priorities of an evaluation, it is necessary to understand the needs for reflecting the results in policies by conducting interviews, etc. with officials of the related divisions prior to the evaluation.

B) Highly Feasible Recommendations which Improve ODA, including Feedback of ODA Evaluation Results

In relation to the purpose of evaluations to contribute to the improvement of ODA, there is a constant need to improve the quality of evaluations and recommendations and to present highly feasible and concrete recommendations, in order to enable the evaluation results to be reflected in policies. The conditions for high-quality recommendations are as follows:


✦Clarify the "Targets" and "Objectives" of the Recommendations

There is a need to clarify the targets of the recommendations, i.e. whether they are addressed to the entire organization, such as "the domestic headquarters," or to the "field level" as a response to individual cases. Furthermore, the "objectives" of the recommendations need to be clarified as being for the "policy/strategy" or for the "method/procedure of assistance."

In fact, there are cases in which developing response measures to a recommendation becomes difficult because of the poor quality of evaluation reports that simply list recommendations without targets or objectives. Therefore, the "targets" and "objectives" of recommendations must be made clear in order to avoid such ambiguity.

✦Make Recommendations by Listing the Three Stages of "Evaluation Results," "Direction of Efforts" and "Details on Efforts and Actions"

It is essential for recommendations to be written in a simple manner, for example in three steps: the evaluation results that are the source of the recommendation, the direction of efforts to improve or promote certain conditions reflecting those results, and the details of the response action plan. In order to make a credible recommendation, its relation to the evaluation results needs to be clear.

✦Note the "Degree of Priority" of Each Recommendation as Much as Possible

Arranging recommendations in order of priority, together with an estimated implementation period, makes it easier to respond to them. To be specific, the priority and implementation period should be noted as much as possible, for example by 1) listing recommendations that should be implemented in the short term (1-2 years) and 2) recommendations that should be implemented in the mid-to-long term (3-5 years), etc.
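As an illustration only (this is not a prescribed MOFA format), the following sketch shows one way a single recommendation could be recorded so that its target, objective, supporting evaluation result and priority remain explicit.

```python
# Illustrative sketch only: a hypothetical record structure for one recommendation,
# reflecting the targets, objectives, three stages and priority described above.
from dataclasses import dataclass

@dataclass
class Recommendation:
    target: str               # e.g. "domestic headquarters" or "field level"
    objective: str            # e.g. "policy/strategy" or "method/procedure of assistance"
    evaluation_result: str    # the finding on which the recommendation is based
    direction_of_efforts: str # the direction of improvement reflecting that finding
    details_of_actions: str   # the concrete response action plan
    priority: str             # e.g. "short term (1-2 years)" or "mid-to-long term (3-5 years)"

# Hypothetical example entry
rec = Recommendation(
    target="field level (embassy and JICA office)",
    objective="method/procedure of assistance",
    evaluation_result="Public relations activities were found to be under-resourced.",
    direction_of_efforts="Strengthen public relations toward local stakeholders.",
    details_of_actions="Allocate a dedicated budget line and assign staff with PR expertise.",
    priority="short term (1-2 years)",
)
print(rec.target, "-", rec.priority)
```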

C) Difference between Recommendations and Lessons

Recommendations are defined as suggestions, based on the results of the evaluation, addressed to the related organizations or persons regarding the case in question. It is highly recommended to provide as much detailed verification as possible in order to clarify the targets, objectives, details of response measures, priorities, etc.

On the other hand, lessons refer to considerations based on the process of evaluations and the results of evaluations, which have broader applications. Even though lessons are not direct and detailed suggestions like recommendations, lessons present the items which may be useful in the future when planning or implementing ODA policies for other countries and issues.


Chapter 4 Feedback and Public Relations of Evaluation Results

This chapter explains how ODA evaluation results are fed back to policymaking processes and distributed to the public.

4.1 Feedback of ODA Evaluation Results

MOFA, as mentioned above, implements third-party evaluations primarily at the policy and program levels. The recommendations and lessons obtained as outcomes of the evaluations are fed back to the relevant divisions of MOFA and JICA. Specifically, after the evaluation results are disseminated to the related divisions of MOFA and JICA, response measures to the evaluation recommendations are developed through consultations at the "Internal Follow-up Meeting on ODA Evaluation," and the results are followed up. Furthermore, as mentioned in the "ODA Review Final Report," MOFA deeply understands the importance of feedback, particularly of ODA evaluations, which form part of the PDCA cycle. Moreover, since 2011, MOFA has published the entire process of evaluations, including recommendations, response measures, the actual implementation of such measures, etc., as part of the "visibility" process.

Figure 9: Flow of Feedback of ODA Evaluations by MOFA

• March: Completion of evaluations
• April: Publication of the evaluation reports (on the MOFA website) and distribution to relevant organizations
• June: Development of response measures regarding recommendations
• July onwards: Follow-up
• End of year: Publication of the annual evaluation report
• July of the following year: Summarization of the previous year's follow-up efforts
• Within the following year: Publication of recommendations, response measures and the conditions of follow-up in the annual evaluation report

4.2 Public Relations of ODA Evaluation Results

(1) The Importance of Public Relations

Making evaluation results available to the public helps fulfil accountability to the public, as well as promote understanding of ODA and public participation in ODA activities by improving transparency.


(2) Measures of Public Relations

Public relations for the evaluation reports proceed in the following manner:

A) Distribution of Evaluation Reports (Hard Copies)

Each evaluation report is distributed to the relevant divisions of MOFA and JICA, Japan's diplomatic missions, and experts and NGOs that show interest in the topic. In addition, translated versions of the summaries are prepared in English or other foreign languages and distributed to Japan's diplomatic missions, the governments and governmental organizations of the recipient countries, relevant diplomatic missions in Japan, etc.

Furthermore, MOFA publishes the Annual Report on Japan's ODA Evaluation every year, which includes summaries of the evaluations implemented by MOFA in the previous year, summaries of evaluations conducted by JICA, and summaries of the results of evaluations on ODA projects conducted by other ministries (mainly policy-level evaluations), in addition to summaries of the implementation of response measures and follow-ups on the recommendations. MOFA makes efforts to enhance widespread public relations by distributing the annual reports to Diet members, experts, NGOs, universities and libraries.

B) Publication on the MOFA Website (ODA Page)

The full texts and summaries of the evaluation reports and the annual evaluation reports are published on the ODA page of the MOFA website (http://www.mofa.go.jp/policy/oda/evaluation/index.html).

C) Others

Information on evaluations is covered in the ODA e-mail newsletter.

(3) Comprehensible Evaluation Results for the Public

For effective publicity of ODA evaluation results, comprehensibility is important when making the information available to the public. The "ODA Review Final Report" also states that "Reports on evaluation results will be made more comprehensible for the public by rendering the contents succinct, avoiding technical terms and using graphical materials."


[Reference] Summary of “Country Assistance Evaluation of Nepal” in FY2012 (Example)

Country Assistance Evaluation of Nepal

Evaluation Team:
・Chief Evaluator: Hon. Prof. Ryokichi Hirono, Professor Emeritus, Seikei University
・Advisor: Prof. Hisanobu Shishido, Professor, Tokyo Woman's Christian University
・Consultant: International Development Center of Japan Inc.
Evaluation Period: July 2012 - February 2013
Country survey conducted in Nepal

Background and Objectives

Nepal is the least-developed country (LDC) in South Asia, with the lowest per-capita income. While the country is working on social and economic development, it is in need of international assistance due to such reasons as harsh geographical conditions. The peace and democratization process following a decade of armed conflict is still ongoing. This evaluation targets Japan's assistance policy for Nepal from FY2006 with the following objectives:
1. To make an overall assessment of Japan's aid policy for Nepal thus far, from the point of view of the development of Nepal as well as of assistance as a diplomatic measure, in order to obtain lessons and recommendations for the planning and implementation of future aid policy, which would be reflected in subsequent Japanese aid policy toward Nepal.
2. To fulfil accountability to the Japanese public through public disclosure of the evaluation results, as well as to assist ODA public relations by giving feedback on the results to the Government of Nepal and other donors.
3. To draw policy lessons on the appropriate way of providing assistance to South Asia, to least developed countries such as Nepal, and to traditional friends when Japan pursues "selection and concentration" of assistance, and to harness that information for aid policies in similar countries and regions.

Evaluation Results

Main Results
From the development perspective, the relevance of Japan's assistance policy was rated as "High." The effectiveness of results was rated as "Moderate." The appropriateness of the process was rated as "Moderate." The overall evaluation of the assistance policy was rated as "Moderately satisfactory." In addition, from the perspective of diplomacy, Japan's assistance contributes greatly to the diplomatic relations between the two countries.

Relevance of Policies
The basic policy for Japan's assistance to Nepal is assistance to support sustainable and balanced economic growth, with the goal of allowing the country to graduate from the status of LDC. Its three priority areas include: poverty alleviation in rural areas, peacebuilding and steady transition to democracy, and building social infrastructure and institutions for balanced and sustainable economic growth. It was determined that Japan's policy relevance was "High," since almost all of the results for the evaluation criteria scored high. The assistance policy is consistent with the government policies and development needs of Nepal, international priorities, and the high-level policy of Japan's ODA. Nepal's development needs are enormous, and there is no significant overlap of support from other donors. Moreover, Japan's assistance was based on its comparative advantages, including its neutral position, technical expertise, visible assistance and human resource development.

Effectiveness of Results
The evaluation team judged that the effectiveness of results of Japan's assistance policy to Nepal was "Moderate." Although the assistance has contributed consistently, especially to subsectors to which Japan has made large-scale inputs, it was impossible to verify the effect on the entire sector from community-focused assistance, and there were subsectors that had very small-scale inputs. It is noteworthy that many years of assistance had produced a great accumulation of outcomes before the evaluation period in the areas of agriculture, education, health, roads, power, water supply, communications, and disaster prevention.

Appropriateness of Processes
The formulation and implementation process of the assistance policy was evaluated on the clarity of the process, the level of information sharing and stakeholder collaboration, ownership of the Nepalese side, coordination and cooperation with other donors, collaboration between schemes, and public relations. While many items received a high evaluation, judging from the fact that some items were still in need of improvement, the appropriateness of the formulation and implementation process was evaluated as "Moderate."

Key Recommendations

(1) Recommendations to Enhance the Diplomatic Relations between the Two Countries
It is important to proactively use ODA as a diplomatic measure. Therefore, steady assistance should continue to be directed to Nepal by maintaining a certain amount of aid and combining the various schemes effectively, thereby highlighting the presence of Japan.

(2) Recommendations for Policy Formulation
Increasing concreteness and strategic focus through improvements in the format of country assistance policies: When developing similar policies for other countries in the future, it is desirable to reconsider the volume and contents of the document—including clarifying quantitative assistance targets and the development of related baseline data—from the perspective of enhancing the strategy and concreteness of the assistance policy.

Activation of policy dialogue and strengthening contributions to policy: Japan should hold regular high-level discussions with the Nepalese side, discuss the future direction of assistance and priority areas, and provide in-depth advice on policy issues. It would also be effective to dispatch policy advisors to key ministries and to areas with a proven track record of support, and to involve them in effective policy formation, implementation, and monitoring.

(3) Recommendations to Increase the Effectiveness of Results
To increase the effectiveness of the assistance, it is necessary to utilize the comparative advantages of Japan to achieve a more strategic focus of the assistance. For example, this could include: emphasizing social infrastructure and improvement of the investment climate, utilizing the advantages of Japan's neutral support, achieving synergy effects of assistance, the use of more visible assistance, and the improvement of agricultural productivity.

(4) Recommendations concerning the Implementation Process of the Aid
Further strengthening of public relations: It is important to strengthen public relations activities targeting stakeholders both in Nepal and Japan. It is necessary to allocate more budget for public relations. In addition, it is important to be aware of and proactively use public relations in a wide variety of situations. To accomplish this, deploying designated staff with expertise in public relations is indispensable.

Strengthening efforts related to spreading the assistance model: Projects which formulate an assistance model should consider creating a model case that can be used elsewhere, or continue assistance to the point where it is possible to disseminate the model, by carefully constructing the model in the first half of the process and then simplifying it in the second half to make it easier to deploy.

Appropriate response to aid coordination: It is important to orchestrate coordinated support and deploy human resources with experience and competence in the field of aid coordination. They should have experience of holding responsible positions in international organizations, etc., and possess the relevant language skills, so that they can coordinate among donors and subsequently publicize the outcomes.

(Photos: Sindhuli Road, which fully utilizes Japanese expertise; interview with the Ministry of Finance; sign board of Japanese assistance for the Kathmandu-Bhaktapur Road)

Source of the map: http://geography.howstuffworks.com/asia/geography-of-nepal.htm
Note: Please note that the contents of this summary belong to the Evaluation Team and do not reflect the views or positions of the Government of Japan.