Co-ordination of multi-site evaluations: design, support for execution, QA and synthesis in the Paris Declaration Evaluation
Bernard Wood and Julia Betts

Presentation transcript:

Co-ordination of multi-site evaluations: design, support for execution, QA and synthesis in the Paris Declaration Evaluation
Bernard Wood and Julia Betts, Core Evaluation Team, PDE, February 2012

Design (1)

Core Team roles: to move from the approach paper and other preparatory stages to an operational design, framework and matrix for the overall evaluation and country studies.

What worked well?
a. International governance arrangements, culture, participation and support reflected vast experience and global best practice in joint evaluation.
b. Phase 1 lessons could be applied to the Phase 2 framework and country studies.
c. Identifying the self-defined intended outcomes of the Paris Declaration, the implicit programme theory and the centrality of context.
d. Introducing the three-question sequence and contribution analysis as the best available way to handle the difficult link to development results.
e. Support of the International Management Group and IRG for these design steps.
f. Wide participation in fleshing out the evaluation framework and matrix strengthened ownership and trust by stakeholders.
g. Designing the synthesis approach from the start and gearing all tools and analytical processes to the same framework (a minimal illustration of such a shared matrix follows below).
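A minimal sketch, for illustration only, of how a shared evaluation matrix of this kind can be expressed as a simple data structure: each entry ties a sub-question to suggested indicators and sources under one of the core questions, so every country study and the synthesis work from the same entries. The question wording, field names and example rows below are assumptions, not the actual PDE matrix.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class MatrixEntry:
        """One row of the shared evaluation matrix (illustrative)."""
        core_question: str   # one of the three core evaluation questions
        sub_question: str
        indicators: List[str] = field(default_factory=list)
        sources: List[str] = field(default_factory=list)

    framework = [
        MatrixEntry(
            core_question="What are the important factors shaping implementation (context)?",
            sub_question="Which forces beyond aid have affected implementation?",
            indicators=["aid dependency trends", "changes in the aid architecture"],
            sources=["country context chapter", "monitoring survey data"],
        ),
        MatrixEntry(
            core_question="Has implementation improved the efficiency of aid delivery?",
            sub_question="Is country ownership of development strategies stronger?",
            indicators=["operational national development strategy in place"],
            sources=["country study evidence", "document review"],
        ),
    ]

    # Because every country and donor study answers the same sub-questions,
    # the synthesis team can collate evidence entry by entry rather than
    # reconciling structurally different reports.
    for entry in framework:
        print(f"{entry.core_question} -> {entry.sub_question}")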

Design (2)

Challenges (what did not work well?) and responses
a. Much progress was needed from the approach paper and earlier theoretical exploration. All (especially the IMG) tacitly accepted the need for a fresh start.
b. Too many evaluation questions: the 11 intended outcomes had compelling legitimacy but were still very wide, and participation added yet more questions. There was no full solution: using some big questions for the conclusions helped, and the ability to add country-specific questions contained the problem without fully overcoming it. The result was some spotty coverage and fewer hard, quantifiable findings, but the parallel monitoring survey fed the appetite for hard indicators while also exposing their limits.
c. Most teams did not focus enough on context chapters to get full value, especially in the Busan era (e.g. on forces beyond aid and non-traditional providers). The synthesis pushed contextual discussion to the limits of the evidence; there were no resources to supplement it fully.
d. Too much information and candour were expected from country studies on the performance of traditional and non-traditional donors. The synthesis pushed to the limits of the evidence, including from other solid sources.

Support to execution by country and donor teams

Core Team roles: to support teams in applying the framework, solving issues, and helping safeguard professional independence.

What worked well?
a. Regional workshops (good but expensive) and individual video and other support (mainly request-driven). The intranet tool for managing guidance and information was vital, but not used by all.
b. A standard framework of questions, sub-questions and suggested indicators and sources.
c. Flexible, multilingual core team resources.
d. Tracking progress at milestones and following up.

Challenges (what did not work well?) and responses
a. Staggered starting points (especially in contracting teams) made support more expensive and less effective. Extra support resources, sessions and follow-up were added, which helped somewhat.
b. Donor studies were contracted before the framework was set, and support provisions were unclear. This was not overcome; only limited support to donor studies was possible, and more was needed.
c. Different understandings of independence and QA roles. International scrutiny and some interventions and clarifications of independence (with the Secretariat) helped.

Quality assurance

Core Team roles: as part of the overall QA strategy, to assess the quality of draft country and donor reports and suggest strengthening, then validate and gauge the reliability of evidence from each final report for the synthesis.

What worked well?
a. A systematic check by at least two CET members of each main finding for strength of evidence, and of conclusions and recommendations for clarity of argument (a simple illustration of this double check follows after this slide).
b. The Emerging Findings workshop, as a forum for a transparent focus on quality, examples of good practice, constructive peer pressure and support opportunities for lagging cases.

Challenges (what did not work well?) and responses
a. The number of late and incomplete drafts limited the scope for a solid emerging findings report and for rigorous overall checks at the Bali workshop. The team used the solid evidence on hand, but intensive extra work was needed after Bali to extract evidence from final drafts and reports.
b. Some workshop participants focused on their own opinions and experiences rather than on evaluation findings. These were listened to and reported back faithfully, but filtered to keep solid evaluation evidence as the base for the synthesis.
c. Double checks of both drafts and final reports imposed heavy demands in a very short time. The response was to work harder and not compromise on rigour.
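A hedged sketch of one way the double check described above could be recorded in practice: each main finding carries independent strength-of-evidence ratings from at least two core team reviewers, and only adequately supported findings feed the synthesis. The rating scale, threshold and status labels are illustrative assumptions, not the core team's actual QA instrument.

    from statistics import mean
    from typing import List

    def validate_finding(finding: str, evidence_ratings: List[int],
                         min_reviewers: int = 2, threshold: float = 2.0) -> str:
        """Return a validation status for one draft finding.

        evidence_ratings: strength-of-evidence scores (e.g. 1 = weak,
        2 = adequate, 3 = strong) given independently by reviewers.
        """
        if len(evidence_ratings) < min_reviewers:
            return "needs a second independent review"
        if mean(evidence_ratings) >= threshold:
            return "usable for the synthesis"
        return "query with the country or donor team"

    # Hypothetical findings and ratings, for illustration only.
    print(validate_finding("Country ownership of strategies has strengthened", [3, 2]))
    print(validate_finding("Transaction costs of aid fell sharply", [1, 2]))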

Synthesis process

Core Team roles: to systematically assemble and reflect key findings and conclusions from the body of evaluations and studies, and to distil policy-relevant overall findings, conclusions and recommendations, calibrated to the strength of the synthesized evidence (a simple calibration sketch follows after this slide).

What worked well?
a. Assembling validated evidence and following the evaluation framework.
b. Finding the balance: enough detail to reflect key evidence, while focusing on strategic findings and conclusions.
c. Making the leap to policy-relevant findings, conclusions and recommendations, which requires policy grasp as well as evaluation rigour. The level and language were geared to dissemination and use.
d. The validation process (and the rules applied) for the first draft synthesis, steered by the Management Group.

Challenges (what did not work well?) and responses
a. Differing expectations for the synthesis product: accessibility and policy relevance versus detail and methodology. The team opted for accessibility with essential details and a well-signposted technical annex.
b. Uneven engagement by the IRG, so fuller discussion was needed at the final validation workshop, while protecting the agreed process.
c. Time pressures, as throughout: work harder.
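A minimal sketch, again purely illustrative, of the idea of calibrating how firmly a synthesis statement is worded to the spread of evidence strength across the contributing reports. The labels, cut-offs and example ratings are assumptions, not the rules the core team actually applied.

    from collections import Counter
    from typing import Dict

    def calibrate(report_ratings: Dict[str, str]) -> str:
        """Map per-report evidence ratings ('strong' / 'moderate' / 'weak')
        to the hedging level used when wording a synthesis conclusion."""
        counts = Counter(report_ratings.values())
        total = len(report_ratings)
        if counts["strong"] >= 0.7 * total:
            return "state firmly"
        if counts["strong"] + counts["moderate"] >= 0.5 * total:
            return "state with qualifications"
        return "flag as tentative, limited evidence"

    # Hypothetical ratings for illustration only.
    ratings = {"Country A": "strong", "Country B": "moderate",
               "Country C": "strong", "Country D": "weak"}
    print(calibrate(ratings))  # -> "state with qualifications"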

Some key lessons

1. Campaign for evaluable frameworks of intended outcomes in the up-front design of programmes and policies, but remain wary of crude and over-simplified indicators. Accept and embrace the need for rigorous qualitative evaluations of complex realities.
2. Aim for governance arrangements, culture, participation and support that reflect experience and best practice in joint evaluation.
3. Keep the working language as clear and non-technocratic as possible, minimizing jargon, especially, but not only, in multi-lingual and multi-cultural evaluation processes. Carry this through to reports to maximize ultimate dissemination and use.
4. Recognize genuinely participatory design and validation as not just desirable but integral to the necessary ownership of the process and the ultimate quality and utility of the evaluation. Build in the careful planning, structure, execution and facilitation implied (MQP).
5. Recruit highly competent teams early to play a major role in design, together with evaluation managers and stakeholders.
6. (Perhaps) be prepared to impose selectivity, even among vital questions, in order to have a manageable challenge across the body of cases.

Some key lessons (2)

7. Prepare for complex and uneven processes in multi-site evaluations, but set and keep the deadlines necessary to maintain momentum and deliver timely results.
8. While working to strengthen them, expect uneven capacities and delivery among varied teams. Be ready to reinforce, but if necessary abandon or sideline results where they are found weak against transparent standards. Ensure in advance that an adequate base will remain after dropouts for reasonable overall validity.
9. Recognize that written component and synthesis reports are only part of the contribution of the evaluation, alongside benefits from the process and the building of a community of shared understanding and trust.
10. Set and consistently apply rules to protect teams' independence within agreed evaluation frameworks and arrangements for quality assurance and validation.
11. Be realistic about the candour to be expected in assessments of other actors' performance as well as in self-assessments.
12. Calibrate the strength of particular synthesis findings, conclusions and recommendations according to the relative strength of evidence in the body of cases.