Experiences, Trends, and Challenges in Management of Development Assistance Operators for Effective Monitoring & Evaluation of Results
Linda Morra Imas


Experiences, Trends, and Challenges in Management of Development Assistance Operators for Effective Monitoring & Evaluation of Results
Linda Morra Imas, Independent Evaluation Group, World Bank Group

2 WHY EVALUATE?
- It's a burden
- Ancient history department
- No one wants to read those reports
- We know the problems and are already fixing them

3 We Evaluate Because…
- Officials are accountable for their use of public funds
- We are learning organizations
"Many development schemes and dreams have failed. This is not a reason to quit trying. It is cause to focus continually and rigorously on results and on the assessment of effectiveness."
Robert B. Zoellick, President of the World Bank Group

4 The Power of Measuring Results
- If you do not measure results, you cannot tell success from failure.
- If you cannot see success, you cannot reward it.
- If you cannot reward success, you are probably rewarding failure.
- If you cannot see success, you cannot learn from it.
- If you cannot recognize failure, you cannot correct it.
From Osborne and Gaebler (1992), Reinventing Government

5 How do you tell the difference between success and failure?
- We need a monitoring system to track our key indicators, so that we know whether we are getting the change we anticipated (a minimal sketch follows below).
- We need an evaluation system to tell us:
  - Are we doing the right things?
  - Are we doing things right?
  - Are there better ways of doing it?
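A minimal sketch, assuming a simple indicator-tracking design, of what such a monitoring system records: each key indicator carries a baseline, a target, and the latest observed value, and the system flags whether the anticipated change is on track. The class, field names, and figures below are illustrative assumptions, not part of the presentation.

```python
# Illustrative sketch only: a toy indicator tracker, not an actual IEG or World Bank system.
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str        # what is being measured
    baseline: float  # value before the intervention
    target: float    # anticipated value at completion
    actual: float    # latest observed value

    def progress(self) -> float:
        """Share of the planned change achieved so far (can exceed 1.0)."""
        planned_change = self.target - self.baseline
        if planned_change == 0:
            return 1.0
        return (self.actual - self.baseline) / planned_change

    def on_track(self, threshold: float = 0.8) -> bool:
        """Flag whether progress meets the chosen threshold."""
        return self.progress() >= threshold

# Hypothetical example values, purely for illustration.
enrolment = Indicator("Primary enrolment rate (%)", baseline=72.0, target=90.0, actual=81.0)
print(f"{enrolment.name}: {enrolment.progress():.0%} of planned change; on track: {enrolment.on_track()}")
```

Monitoring of this kind answers "are we getting the anticipated change?"; the evaluation questions that follow (doing the right things, doing things right, better ways) require judgment and analysis beyond indicator tracking.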

6 For Credible Evaluation…
- Adequate resources for evaluation
- People with the right skills
- Consultation with partners, beneficiaries, and other key stakeholders
- Understanding of the theory of change
- Sound evaluation design
- Valid methods
- Relevant and accurate data collection, including baseline data
- Communication
- Lessons identification
- Recommendations and a management action tracking system

7 "The golden rule of evaluation is that it should be protected from capture by program managers and donors." Robert Picciotto
- There are trade-offs between internal and external evaluation
- Using consultants is no guarantee of independence
- Independence: organizational, behavioral, absence of conflict of interest, freedom from external influences
- Independent evaluation and self-evaluation are complementary and synergistic

8 Trends: The New Impact Evaluation
Ingredients:
- A skeptical public not convinced that aid makes a difference
- Media questioning the ability to provide evidence of attribution, e.g. NYT: "…only 2% of WB projects properly evaluated."
- Demand for rigorous evaluations = RCTs, aka "the Gold Standard," and, where these are not possible, quasi-experimental designs (a toy illustration follows below)
- Campbell Collaboration (2000), Global Development Center (2001), Mexico legislation (2001), Poverty Action Lab (2003)
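To make concrete why randomization underpins claims of attribution, here is a hedged, toy illustration: with random assignment, the average treatment effect can be estimated as a simple difference in mean outcomes between treated and control groups. The data and variable names are invented for the example and are not from the presentation.

```python
# Illustrative sketch: difference-in-means estimate of an average treatment effect (ATE)
# from a randomized experiment. The data are invented and not from the presentation.
import math
import statistics

treatment_outcomes = [12.1, 14.3, 13.8, 15.0, 12.9, 14.6]  # e.g. scores in randomly treated villages
control_outcomes   = [11.0, 12.2, 11.8, 12.5, 11.4, 12.0]  # scores in comparison villages

ate = statistics.mean(treatment_outcomes) - statistics.mean(control_outcomes)

# Standard error of the difference in means (unequal-variance formula).
se = math.sqrt(
    statistics.variance(treatment_outcomes) / len(treatment_outcomes)
    + statistics.variance(control_outcomes) / len(control_outcomes)
)

print(f"Estimated ATE: {ate:.2f} (SE approx. {se:.2f})")
# Randomization makes the control group a valid counterfactual, which is what allows
# attributing the difference to the intervention. Quasi-experimental designs
# (difference-in-differences, matching, etc.) try to approximate this counterfactual
# when randomization is not possible.
```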

9 Trends: Continued
- Initiatives: Development Impact Evaluation Initiative (DIME); IFC & J-PAL conducting 32 TA impact evaluations; the Spanish WB Trust Fund of €10.4 million; the Africa Impact Evaluation Initiative; 3ie; etc.
- But these are useful for discrete interventions rather than broad, multi-component, multi-participant country programs

10 Evolution of Development Evaluation
At the same time:
- globalization / global public goods / MDGs = global partnership
- demand for country evaluations, not project evaluations
- a shift from attribution to relative contribution and additionality at the country level
- growth of national evaluation associations and the beginning of country reviews of donor assistance

11 Challenges for Management of Development Evaluation
- Promoting a mixed-methods approach to project evaluation, supporting RCTs where appropriate (e.g. EES)
- Determining the role in impact evaluations and the required skills mix
- Shifting resources toward more joint evaluations
- Measuring additionality
- Mapping of country efforts
- Methodology for aggregation in joint country, sector, or thematic evaluations

12 "Which road shall I take?" Alice asked the Cheshire Cat. "Where do you want to get to?" the Cat asked helpfully. "I don't know," admitted Alice. "Then," advised the Cat, "any road will take you there." Lewis Carroll, Alice in Wonderland.