Valuing evaluation beyond programme boundaries: Communicating evaluations to enhance development effectiveness globally
Anna Downie (a.downie@ids.ac.uk)
Monitoring and Evaluation Coordinator, Strategic Learning Initiative
Institute of Development Studies, UK
Evaluation as a public good
Communicating evaluations for accountability and to share learning
Access to lessons learnt for practitioners and policy makers
Benchmarking and learning between organisations
Evaluations, context and the politics of knowledge
Evaluations are politically sensitive
Results are context specific and often complex
– Especially from participatory monitoring and evaluation processes
– Lessons learnt are either too general or too specific to be useful
Thematic, sector or country-wide evaluations try to make lessons learnt more relevant
Need to find ways to communicate different types of evaluation, at different stages during the process
Challenges
Few incentives to communicate beyond the programme
Experience of the IDS Knowledge Services:
– Large and bulky reports, hard to summarise
– Written for a specific audience (often the donor)
– Rigour and quality often unclear
– Reports are often hidden away in remote corners of the web (if they make it onto the internet at all)
Need to learn from pilots in rapidly changing areas such as climate change – but these lessons are rarely shared
Examples of good practice
DAC Evaluation Resource Centre
World Bank Evaluation Department
Danida
ALNAP
3ie
IFAD
But…
– Focused on large-scale impact evaluations
– How much do smaller evaluations, or those using different methodologies, get shared?
– How do policy-makers or practitioners access the information?
– How much synthesis is being done, and is it shared beyond organisational boundaries?
Understanding how evaluation influences
Can learn from research influence and uptake
What works (experiences from IDS):
– ‘Sticky messages’ / rallying ideas
– ‘Knit working’ – building coalitions of connectors and champions
– Strategic opportunism – identifying windows of opportunity for impact/influence
Challenge of evaluating the influence of evaluations on policy and practice
Increasing the use of evaluations in policy and practice
Availability on websites is important, but doesn’t necessarily mean an evaluation will be used
Understand how target groups search for, access and use information
Information literacy
Incentives to look for and use evaluations; incentives for organisational learning
Need multiple communication strategies
What can we learn from research communications?
Timeliness and relevance
Editing and summarising
Brevity and clear messages
Credibility and quality
Synthesis is important
Marketing
Networking and multi-way communication
Being both systematic and opportunistic
Requires a variety of skills
Target groups
Identify different target groups and tailor communication strategies
Involve networks and communities of practice throughout the evaluation process
Multiple communication approaches
Different tools:
– Print
– Seminars
– Toolkits
– Email updates
– Online discussions
– Visual media
– Blogs
– Podcasts
– CD-ROMs/USB sticks
– Policy briefings
Different channels:
– Traditional academic: e.g. journals, conferences, research networks
– Direct stakeholder involvement
– Practitioner and advocacy networks
– Information and knowledge intermediaries
Conclusions
Build in incentives to communicate evaluations
Learn from the experience of research communications:
– Tailored approaches for different audiences
– Build communications in from the start
Role of information and knowledge intermediaries
Horizontal learning and accountability:
– Involving and sharing learning with a wider range of stakeholders, including networks and communities of practice, throughout the evaluation process
Questions for the future
How can context-specific and potentially sensitive evaluations be shared, adapted and applied beyond the programme context?
How do we assess the influence of evaluations on policy and practice?
How can ‘decision-makers’ be encouraged and supported to use evaluations from other contexts/programmes for evaluation-informed decision-making?
What strategies, channels and methods are effective in communicating evaluations beyond the specific programme context?
What kinds of networks and communities could both benefit from, and add insight to, the final conclusions of an evaluation itself?
Share your views: www.alineplanning.org