Valuing evaluation beyond programme boundaries: Communicating evaluations to enhance development effectiveness globally
Anna Downie
Monitoring and Evaluation Coordinator, Strategic Learning Initiative
Institute of Development Studies, UK

Evaluation as a public good
– Communicating evaluations for accountability and communicating to share learning
– Access to lessons learnt for practitioners and policy makers
– Benchmarking and learning between organisations

Evaluations, context and the politics of knowledge
– Evaluations are politically sensitive
– Results are context specific and often complex
  – Especially results from participatory monitoring and evaluation processes
  – Lessons learnt are often either too general or too specific to be useful
– Thematic, sector-wide and country-wide evaluations try to make lessons learnt more relevant
– We need to find ways to communicate different types of evaluation, at different stages of the process

Challenges
– Few incentives to communicate beyond the programme
– Experience of the IDS Knowledge Services:
  – Reports are large and bulky, and hard to summarise
  – Reports are written for a specific audience (often the donor)
  – Rigour and quality are often unclear
  – Reports are often hidden away in remote corners of the web (if they make it onto the internet at all)
– There is a need to learn from pilots in rapidly changing areas such as climate change, but these lessons are rarely shared

Examples of good practice
– DAC Evaluation Resource Centre
– World Bank Evaluation Department
– Danida
– ALNAP
– 3ie
– IFAD
But…
– These are focused on large-scale impact evaluations
– How much do smaller evaluations, or those using different methodologies, get shared?
– How do policy-makers or practitioners access the information?
– How much synthesis is being done, and is it shared beyond organisational boundaries?

Understanding how evaluation influences
– We can learn from work on research influence and uptake
– What works (experiences from IDS):
  – 'Sticky messages' / rallying ideas
  – 'Knit working': building coalitions of connectors and champions
  – Strategic opportunism: identifying windows of opportunity for impact and influence
– Challenge: evaluating the influence of evaluations on policy and practice

Increasing the use of evaluations in policy and practice
– Availability on websites is important, but does not in itself mean an evaluation will be used
– Understand how target groups search for, access and use information
– Information literacy
– Incentives to look for and use evaluations; incentives for organisational learning
– Multiple communication strategies are needed

What can we learn from research communications?
– Timeliness and relevance
– Editing and summarising
– Brevity and clear messages
– Credibility and quality
– Synthesis is important
– Marketing
– Networking and multi-way communication
– Being systematic and opportunistic
– Requires a variety of different skills

Target groups
– Identify different target groups and tailor communication strategies to each
– Involve networks and communities of practice throughout the evaluation process

Multiple communication approaches
Different tools:
– Print
– Seminars
– Toolkits
– Updates
– Online discussions
– Visual media
– Blogs
– Podcasts
– CD-ROMs/USB sticks
– Policy briefings
Different channels:
– Traditional academic: e.g. journals, conferences, research networks
– Direct stakeholder involvement
– Practitioner and advocacy networks
– Information and knowledge intermediaries

Conclusions
– Build in incentives to communicate evaluations
– Learn from the experience of research communications:
  – Tailor approaches for different audiences
  – Build communications in from the start
– Recognise the role of information and knowledge intermediaries
– Horizontal learning and accountability: involve and share learning with a wider range of stakeholders, including networks and communities of practice, throughout the evaluation process

Questions for the future
– How can context-specific and potentially sensitive evaluations be shared, adapted and applied beyond the programme context?
– How do we assess the influence of evaluations on policy and practice?
– How can decision-makers be encouraged and supported to use evaluations from other contexts and programmes for evaluation-informed decision-making?
– What strategies, channels and methods are effective in communicating evaluations beyond the specific programme context?
– What kinds of networks and communities could both benefit from, and add insight to, the final conclusions of an evaluation itself?
Share your views: