Page 1

Valuing evaluation beyond programme boundaries:

Communicating evaluations to enhance development effectiveness globally

Anna Downie (a.downie@ids.ac.uk)
Monitoring and Evaluation Coordinator
Strategic Learning Initiative
Institute of Development Studies, UK

Page 2

Evaluation as a public good

• Communicating evaluations for accountability and communicating to share learning

• Access to lessons learnt for practitioners and policy makers

• Benchmarking and learning between organisations

Page 3

Evaluations, context and the politics of knowledge

• Evaluations are politically sensitive
• Results are context specific and often complex
– Especially from participatory monitoring and evaluation processes
– Lessons learnt are either too general or too specific to be useful

• Thematic/sector/country-wide evaluations try to make lessons learnt more relevant

• Need to find ways to communicate different types of evaluation, at different stages during the process

Page 4

Challenges

• Few incentives to communicate beyond the programme
• Experience of the IDS Knowledge Services:
– Large and bulky reports, hard to summarise
– Written for a specific audience (often the donor)
– Rigour and quality often unclear
– Reports are often hidden away in remote places on the web (if they make it onto the internet)

• Need to learn from pilots in rapidly changing areas such as climate change, but these lessons are rarely shared

Page 5

Examples of good practice

• DAC Evaluation Resource Centre
• World Bank Evaluation Department
• Danida
• ALNAP
• 3ie
• IFAD

• But…
– Focused on large-scale impact evaluations
– How much do smaller evaluations, or those using different methodologies, get shared?

– How do policy-makers or practitioners access the information?

– How much synthesis is being done and is that shared beyond organisational boundaries?

Page 6

Understanding how evaluation influences

• Can learn from research influence and uptake
• What works (experiences from IDS):
– ‘Sticky messages’ / rallying ideas
– ‘Knit working’: building coalitions of connectors and champions
– Strategic opportunism: identifying windows of opportunity for impact/influence
• Challenge of evaluating the influence of evaluations on policy and practice

Page 7

Increasing the use of evaluations in policy and practice

• Availability on websites is important, but doesn’t necessarily mean an evaluation will be used

• Understand how target groups search for, access and use information

• Information literacy
• Incentives to look for and use evaluations; incentives for organisational learning

• Need multiple communication strategies

Page 8

What can we learn from research communications?

• Timeliness and relevance

• Editing, summarising

• Brevity and clear messages

• Credibility and quality

• Synthesis important

• Marketing
• Networking and multi-way communication

• Being systematic and opportunistic

• Requires a variety of skills

Page 9

Target groups

• Identify different target groups and tailor communication strategies

• Involve networks and communities of practice throughout the evaluation process

Page 10

Multiple communication approaches

• Different tools
– Print
– Seminars
– Toolkits
– Email updates
– Online discussions
– Visual
– Blogs
– Podcasts
– CD-ROMs/USB sticks
– Policy briefings

• Different channels
– Traditional academic: e.g. journals, conferences, research networks

– Direct stakeholder involvement

– Practitioner and advocacy networks

– Information and knowledge intermediaries

Page 11

Conclusions

• Building in incentives to communicate evaluations
• Learn from the experience of research communications
– Tailored approaches for different audiences
– Build communications in from the start

• Role of information and knowledge intermediaries

• Horizontal learning and accountability:
– Involving and sharing learning with a wider range of stakeholders, including networks and communities of practice, throughout the evaluation process

Page 12

Questions for the future

• How can context-specific and potentially sensitive evaluations be shared, adapted and applied beyond the programme context?
• How do we assess the influence of evaluations on policy/practice?
• How can ‘decision-makers’ be encouraged and supported to use evaluations from other contexts/programmes for evaluation-informed decision-making?

• What strategies/channels/methods are effective in communicating evaluations beyond the specific programme context?

• What kinds of networks and communities could both benefit from, and add insight to, the final conclusions of an evaluation itself?

• Share your views: www.alineplanning.org