More Timely, Credible and Cost-Effective Performance Information on Multilateral Partners
Presented by: Goberdhan Singh, Director of the Evaluation Division, Canadian International Development Agency (CIDA)
June 2008, Paris, France
Background: Why is this needed?
– A substantial share of donors' resources is channelled through multilateral partners (40-50% for CIDA, or $7-8 B from 2001 to 2007)
– In the early 90s, joint evaluations of UNICEF, WFP and UNFPA proved useful to both the institutions and donors
– 10 years later, more joint evaluations were conducted (IFAD, WFP, FAO), which attests to the continuing need for this information
– Joint external evaluations have been effective but are costly, lengthy, infrequent and unsustainable, and unable to cover enough agencies in a reasonable time frame
– A new approach to assessing performance, based on peer reviews of evaluation functions, was introduced and tried (UNDP, UNICEF, others)
– But this does not adequately address the need
Background (continued)
– MOPAN common approach: more of a monitoring effort by our multilateral departments
– Peer reviews and MOPAN improve coverage but lack direct evidence of effectiveness at country level
– Other efforts to improve effectiveness measurement and self-reporting under way at agency and system level (UNEG, ECG) are not yet sufficient
– The need for performance information continues, but the demand is not being met by current systems
– Current agency self-reporting is inconsistent in quality, coverage and scope, and therefore lacks credibility
– Performance information is not adequate when judged by independence, credibility, usefulness, cost-effectiveness and timeliness
CIDA's 2009 Review of Multilateral Effectiveness
– The Review focused on 23 institutions that have received more than 80% of CIDA's multilateral funding in recent years
– Two main components: 1) a meta-evaluation of 117 evaluations of global, regional or large-scale country programs; 2) an assessment of the institutions' capacity to manage for development effectiveness, through reviews of systems and processes (evaluation, RBM, monitoring and reporting on effectiveness)
– The Review was intended to assess the effectiveness of the channel as a whole, not of individual institutions
– It found that 69% of the multilateral organizations' development and humanitarian programs evaluated were satisfactory or better
– It also found that 75% of the organizations reviewed had put effectiveness-reporting, RBM and evaluation systems in place
– It recommended that CIDA explore with other bilateral donors the feasibility of a similar approach for assessing the effectiveness of individual multilateral organizations
Proposed Approach: Features
– Common methodology and assessment criteria
– Joint implementation by bilateral development organizations working in cooperation with multilateral partners and governments
– Burden sharing, delegated programming and division of labour
– Builds on proven methods: meta-evaluation and structured review of Managing for Development Results (MfDR) systems
– Ensures consistent coverage of core effectiveness issues with credible information, and allows a common body of knowledge to be built over time
Methodology Components
1. Meta-evaluation
   – Institutional, program and project evaluations carried out by the agency under review and by partner donor agencies
   – Rely where possible on a peer review process to certify the reliability of evaluations
2. Direct review of systems and procedures for Managing for Development Results (MfDR)
   – Systems and processes for: evaluation, results-based management, effectiveness monitoring and reporting, and knowledge management
3. Brief field evaluation
   – Selected agency programs in 2 countries
   – Demonstrates how MfDR systems operate at field level
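As a purely illustrative aside (not part of the proposal itself), the three components above could be sketched as a simple data structure for planning one review. The class names and the example below are assumptions introduced here; the descriptions only restate this slide.

```python
# Illustrative sketch only; the class and field names are assumptions,
# not part of the proposed common approach.
from dataclasses import dataclass, field


@dataclass
class ReviewComponent:
    name: str
    description: str


@dataclass
class AgencyReview:
    agency: str                                  # multilateral partner under review
    components: list[ReviewComponent] = field(default_factory=list)


# One review bundles the three methodology components named on this slide.
example_review = AgencyReview(
    agency="(agency under review)",
    components=[
        ReviewComponent(
            "Meta-evaluation",
            "Evaluations by the agency and partner donors, with peer review "
            "used where possible to certify their reliability.",
        ),
        ReviewComponent(
            "Direct review of MfDR systems",
            "Evaluation, results-based management, effectiveness monitoring "
            "and reporting, and knowledge management systems and processes.",
        ),
        ReviewComponent(
            "Brief field evaluation",
            "Selected agency programs in two countries, showing how MfDR "
            "systems operate at field level.",
        ),
    ],
)
```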
Common Assessment Criteria
1. Relevance of Agency Programs to International and National Development Goals
2. Achievement of Objectives
3. Cost Effectiveness
4. Sustainability
5. Comparative Advantage
6. Managing for Development Effectiveness
Plus 13 sub-criteria and detailed guidelines for scoring
A Structured System for Scoring Consistency
– Scoring guide for meta-evaluations
– Evaluations are assessed using a 5-point scale (Not Demonstrated, Highly Unsatisfactory, Unsatisfactory, Satisfactory, Highly Satisfactory)
– For each criterion, each level of the 5-point scale is clearly defined, with indicators embedded in the definition, so that reviewers can score consistently
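Purely as an illustration of this kind of structured scoring (not CIDA's actual scoring tool), the common scale and criteria could be held as shared data with a simple validity check. The function names below are assumptions, and the per-level definitions and 13 sub-criteria are omitted because they are not listed in this presentation.

```python
# Illustrative sketch of a shared scoring scheme; not the actual scoring guide.
SCALE = [
    "Not Demonstrated",
    "Highly Unsatisfactory",
    "Unsatisfactory",
    "Satisfactory",
    "Highly Satisfactory",
]

CRITERIA = [
    "Relevance to international and national development goals",
    "Achievement of objectives",
    "Cost effectiveness",
    "Sustainability",
    "Comparative advantage",
    "Managing for development effectiveness",
]


def record_score(scores: dict[str, str], criterion: str, rating: str) -> None:
    """Record one rating, rejecting criteria or ratings outside the common scheme."""
    if criterion not in CRITERIA:
        raise ValueError(f"Unknown criterion: {criterion}")
    if rating not in SCALE:
        raise ValueError(f"Rating must be one of the 5 defined levels, got: {rating}")
    scores[criterion] = rating


def share_satisfactory_or_better(all_scores: list[dict[str, str]], criterion: str) -> float:
    """Share of reviewed evaluations rated Satisfactory or better on one criterion."""
    rated = [s[criterion] for s in all_scores if criterion in s]
    good = [r for r in rated if SCALE.index(r) >= SCALE.index("Satisfactory")]
    return len(good) / len(rated) if rated else 0.0


# Example usage for a single evaluation under review.
scores: dict[str, str] = {}
record_score(scores, "Sustainability", "Satisfactory")
```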
Reasonable Resource Requirements
1. Estimated cost for the review of a major agency: USD 250K to 300K
2. Estimated duration: one calendar year
3. Elapsed time kept brief by:
   – Standard assessment criteria and generic ToRs
   – Common evaluation team structure
   – Contracting by a single agency
   – Common reporting format
Elements of a System
– Coverage of 4-5 multilateral partners per year, reaching the 20 most significant on a five-year cycle
– Strict adherence to the common approach
– Each assessment carried out by a lead agency using its own procurement system, with financial and managerial support from 2-3 others working in a small management group
– Multilateral partners identified for review on an annual basis, depending on bilateral organization interest and the commitment to cover major multilateral organizations
– Coordination through a Task Team under the OECD/DAC Evaluation Network
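Putting the coverage figures on this slide together with the cost estimate on the previous one gives a rough, purely indicative cost-and-coverage envelope; the arithmetic below is a back-of-the-envelope sketch, not a budget.

```python
# Back-of-the-envelope only, using the estimates quoted on this slide and the
# previous one: USD 250-300K per review, 4-5 reviews per year, five-year cycle.
cost_per_review_usd = (250_000, 300_000)
reviews_per_year = (4, 5)
cycle_years = 5

annual_cost_usd = (cost_per_review_usd[0] * reviews_per_year[0],
                   cost_per_review_usd[1] * reviews_per_year[1])
partners_covered = (reviews_per_year[0] * cycle_years,
                    reviews_per_year[1] * cycle_years)

print(f"Annual cost envelope: {annual_cost_usd[0]:,} to {annual_cost_usd[1]:,} USD")
print(f"Partners covered over {cycle_years} years: {partners_covered[0]} to {partners_covered[1]}")
# Annual cost envelope: 1,000,000 to 1,500,000 USD
# Partners covered over 5 years: 20 to 25
```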
Governance and Reporting
– Interim solution: a Task Team under the DAC Evaluation Network to further develop and implement the approach
– The Task Team to solicit nominations for organizations to be reviewed, and volunteer bilateral organizations to act as lead agencies and members of management committees
– Draft reports submitted to the management group for each review; final reports to be logged with the Evaluation Network
– A rotating secretariat could be the longer-term solution
Next Steps
– Establish a Task Team to work on the proposed approach to assessing multilateral agency performance
– The Task Team's mandate will be to:
   – Refine the approach and methodology
   – Identify potential review candidates as pilots
   – Identify potential lead and supporting bilateral organizations
   – Conduct 2-3 pilot reviews using the proposed method
– Canada is willing to host the first Task Team meeting in Ottawa in the fall