“Evaluation Follow-Up: Challenges and Lessons” AEA 2011 Conference Think-Tank Session. Organizer: Scott Chaplowe. Presenters: Osvaldo Feinstein, Bidjan Nashat, Mike Hendricks.

Presentation transcript:

“Evaluation Follow-Up: Challenges and Lessons” AEA 2011 Conference Think-Tank Session. Organizer: Scott Chaplowe. Presenters: Osvaldo Feinstein, Bidjan Nashat, Mike Hendricks

Evaluation Follow-up. Osvaldo Néstor Feinstein, AEA 2011 Conference

Themes of this presentation on evaluation follow-up:
- Interest in going beyond reports
- Comparative work that was carried out
- Good practice guidelines
- Issues to think about

Motivation:
- The benefits of evaluations depend on their use
- Perception/evidence of too limited use
- Risk of cost without benefit
- Potential benefits of learning from evaluations (the “anxiety of influence”)
- Goal: increase the benefit/cost ratio of evaluations

Comparative work on evaluation follow-up:
- Carried out on behalf of the United Nations Evaluation Group (UNEG); updated for the WBG in 2010
- Literature review and interviews, mainly with UN agencies but also with some other types of organizations
- Focus on procedures, systems, and mechanisms for following up on evaluation recommendations and management responses

Main findings:
- Emphasis on formal procedures, similar to those used by auditors
- High transaction costs and low efficiency
- Concerns about the quality of evaluation recommendations
- Confusion of “adoption of recommendations” with “consistency or alignment between recommendations and actions taken by Management”
- Confirmation that there was scope for improvement of evaluation follow-up systems

Good practice guidelines endorsed at the UNEG Annual Meeting 2010:
- Good practices in management response to evaluation
- Development of systems for tracking and reporting on the implementation of evaluations’ recommendations
- Mechanisms for facilitating learning and knowledge development from evaluations

Some issues for thinking/discussion:
- Could the engagement of evaluators with the managers who are expected to implement evaluation recommendations jeopardize the evaluators’ independence?
- If evaluation recommendations were of really high quality, wouldn’t they be applied even without evaluation follow-up?

Balancing Accountability and Learning in Evaluation Follow-Up: Lessons Learned from Reforms at the World Bank Group. Bidjan Nashat, Strategy Officer, World Bank Independent Evaluation Group. AEA Think Tank “Evaluation Follow-Up: Challenges and Lessons”, November 4, 2011

IEG’s mandate on follow-up:
- IEG recommendations intend to “help improve the development effectiveness of the World Bank Group’s programs and activities, and their responsiveness to member countries’ needs and concerns.”
- IEG is also mandated to report “periodically to the Board on actions taken by WBG management in response to evaluation findings.”
Source: IEG Mandate

The three stages of reforming IEG’s follow-up system:
1. Upstream process: How do we come up with recommendations?
2. Follow-up process: How do we track whether they are being implemented?
3. Analysis and utilization: What does it mean for IEG, World Bank Management, and the Board?

1. Reforming the upstream process
- Quality control: What is a good recommendation?
- Context: What is the link between recommendations and findings?
- Engagement: What would it take to fix the problem?
- Responsibility: Who is doing what, when, and how?
- Governance: Who decides in the case of disagreements?

IEG’s follow-up process reforms:
- Recommendation standards: IEG clearly indicates the link between its findings and how they lead to draft recommendations
- Engagement on findings and recommendations: the IEG team meets with management counterparts at the working level to discuss findings and get input and suggestions on draft recommendations
- Actions and timelines: within 90 days after the CODE meeting, management lays out more specific actions and timelines for each accepted recommendation
- Reporting to CODE: IEG reports quarterly to CODE on submitted action plans and timelines, and monitors and reports on adoption annually
Source: IEG Results and Performance Report 2011, ieg.worldbank.org/content/dam/ieg/rap2011/rap2011_vol1.pdf
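Purely as an illustration of the tracking logic described above (this is not IEG’s actual system; every name and field below is hypothetical), the 90-day action-plan window and the quarterly summary to CODE could be modeled roughly like this:

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional


@dataclass
class Recommendation:
    """One accepted recommendation and its management response (illustrative only)."""
    evaluation: str                        # which evaluation report it came from
    text: str                              # the recommendation itself
    code_meeting: date                     # date the report was discussed at CODE
    action_plan_received: Optional[date] = None
    adopted: bool = False

    def action_plan_overdue(self, today: date) -> bool:
        # Management has 90 days after the CODE meeting to lay out actions and timelines
        if self.action_plan_received is not None:
            return False
        return today > self.code_meeting + timedelta(days=90)


def quarterly_report(recs: list[Recommendation], today: date) -> dict:
    """Summary counts of the kind reported quarterly to CODE.
    Adoption itself is monitored and reported annually."""
    return {
        "accepted_recommendations": len(recs),
        "action_plans_submitted": sum(r.action_plan_received is not None for r in recs),
        "action_plans_overdue": sum(r.action_plan_overdue(today) for r in recs),
        "adopted_so_far": sum(r.adopted for r in recs),
    }


# Example: a recommendation discussed at CODE in January with no action plan by May
rec = Recommendation("Hypothetical Country Program Evaluation",
                     "Strengthen results frameworks",
                     code_meeting=date(2011, 1, 15))
print(quarterly_report([rec], today=date(2011, 5, 1)))
# -> {'accepted_recommendations': 1, 'action_plans_submitted': 0,
#     'action_plans_overdue': 1, 'adopted_so_far': 0}
```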

2. Reforming the follow-up process
- Learning from others: How are other evaluation units following up?
- Simplify and unify ratings and procedures: How can we streamline annual tracking across the WBG?
- Move from manual process to automation: How can we automate the tracking process?

3. Reforming analysis and utilization
- Ask the right questions: What is the right question to ask before analyzing the data?
- Provide context: What does follow-up data tell us about our theory of change?
- Make it public: How should we disclose follow-up data?
- Utilization: How can we link follow-up data to our organizational impact?
- Close the loop: How do we make follow-up data useful for new evaluations and our work program?

Thank you! Contact:

Broadening Our Thinking About Evaluation Follow-Up. Michael Hendricks. Presented at the American Evaluation Association, November 4, 2011

What Do We Mean By “Follow-Up”?
- We talk about evaluation use and evaluation follow-up
- What exactly are we concerned about? Generally, our findings and recommendations
- Especially: “Have our recommendations been accepted and implemented?”
- But are we thinking broadly enough about this?

Four (or More) Different “Models” for Offering Recommendations

Persons involved with the recommendations    Type of model for offering recommendations
Electrical inspector to homeowners           Advisor to “uninformed”
Physician to patient                         Advisor to “personally aware”
Management consultant to client              Advisor to “differently expert”
Assistant coach to head coach                Advisor to “equally expert”
Others??

Is It Reasonable to Expect the Same Type of Use from Each Model?

Persons involved with the recommendations    Type of model                      Type of “use” realistic to expect
Electrical inspector to homeowners           Advisor to “uninformed”            ?
Physician to patient                         Advisor to “personally aware”      ?
Management consultant to client              Advisor to “differently expert”    ?
Assistant coach to head coach                Advisor to “equally expert”        ?
Others???

Aren’t There Many Different Ways to “Use” Recommendations?
- Ignore them/pay no attention
- Consider them carefully when offered
- Discuss them later in depth (“learning events”)
- React to them formally
- Implement them in part or in whole
- Monitor their implementation
- Evaluate their effects

Are We Asking These Important Questions?
- Which models best describe an evaluator offering recommendations to a program manager or to a governing body?
- How often is it advisor to the uninformed? Advisor to the equally expert? Something in between?
- Won’t different models lead us to expect different uses?
- And won’t that give us insights into how to measure, and improve, each type of use?
- Would it be useful to explore these ideas further?