Experiences, Trends, and Challenges in Management of Development Assistance Operations for Effective Monitoring & Evaluation of Results
Linda Morra Imas
Independent Evaluation Group, World Bank Group
2
2 WHY EVALUATE? It’s a burden Ancient history department No one wants to read those reports We know the problems and are already fixing them
We Evaluate Because…
- Officials are accountable for their use of public funds
- We are learning organizations

"Many development schemes and dreams have failed. This is not a reason to quit trying. It is cause to focus continually and rigorously on results and on the assessment of effectiveness."
Robert B. Zoellick, President of the World Bank Group
The Power of Measuring Results
- If you do not measure results, you cannot tell success from failure.
- If you cannot see success, you cannot reward it.
- If you cannot reward success, you are probably rewarding failure.
- If you cannot see success, you cannot learn from it.
- If you cannot recognize failure, you cannot correct it.
From Osborne and Gaebler (1992), Reinventing Government
How do you tell the difference between success and failure?
We need a monitoring system to track our key indicators, so we know whether we are getting the change we anticipated.
We need an evaluation system to tell us:
- Are we doing the right things?
- Are we doing things right?
- Are there better ways of doing it?
For Credible Evaluation…
- Adequate resources for evaluation
- People with the right skills
- Consultation with partners, beneficiaries, and other key stakeholders
- Understanding of the theory of change
- Sound evaluation design
- Valid methods
- Relevant and accurate data collection, including baseline data
- Communication
- Identification of lessons
- Recommendations and a management action tracking system
"The golden rule of evaluation is that it should be protected from capture by program managers and donors."
Robert Picciotto, 2008
- There are trade-offs between internal and external evaluation
- Using consultants is no guarantee of independence
- Independence means: organizational independence, behavioral independence, absence of conflict of interest, and freedom from external influences
- Independent evaluation and self-evaluation are complementary and synergistic
Trends: The New Impact Evaluation
Ingredients:
- A skeptical public not convinced that aid makes a difference
- Media questioning the ability to provide evidence of attribution, e.g. the New York Times: "…only 2% of WB projects properly evaluated."
- Demand for rigorous evaluations, meaning RCTs (aka "the Gold Standard") and, where those are not possible, quasi-experimental designs
- Campbell Collaboration (2000), Center for Global Development (2001), Mexico legislation (2001), Poverty Action Lab (2003)
Trends, Continued
Initiatives: Development Impact Evaluation Initiative (DIME); IFC & J-PAL conduct 32 TA impact evaluations; Spanish WB Trust Fund of €10.4 million; Africa Impact Evaluation Initiative; 3IE; etc.
But these are useful for discrete interventions rather than broad multi-component, multi-participant country programs.
Evolution of Development Evaluation
At the same time:
- globalization, global public goods, and the MDGs = global partnership
- demand for country evaluations, not project evaluations
- a shift from attribution to relative contribution and additionality at the country level
- growth of national evaluation associations and the beginning of country reviews of donor assistance
Challenges for Management of Development Evaluation
- Promoting a mixed-methods approach to project evaluation, supporting RCTs where appropriate (e.g. EES)
- Determining the role in impact evaluations and the skills mix required
- Shifting resources toward more joint evaluations
- Measuring additionality
- Mapping of country efforts
- Methodology for aggregation in joint country and sector or thematic evaluations
"Which road shall I take?" Alice asked the Cheshire cat.
"Where do you want to get to?" the cat asked helpfully.
"I don't know," admitted Alice.
"Then," advised the cat, "any road will take you there."
Lewis Carroll, Alice in Wonderland