Africa Program for Impact Evaluation on HIV/AIDS (AIM-AIDS) Cape Town, March 2009 Workshop of the Africa Program for Impact Evaluation of HIV/AIDS.

Presentation transcript:

Workshop of the Africa Program for Impact Evaluation of HIV/AIDS
Africa Program for Impact Evaluation on HIV/AIDS (AIM-AIDS), Cape Town, March 2009

Institutionalizing the Use of Impact Evaluation
Arianna Legovini, Africa Impact Evaluation Initiative (AIM) and Development Impact Evaluation (DIME), World Bank
Africa Program for Impact Evaluation on HIV/AIDS (AIM-AIDS), Cape Town, March 2009

- Improve the quality of programs
- Use evaluation to test alternatives and inform program design in real time
- Increase program effectiveness over time and secure success
- Build government institutions for evidence-based policy making
- Change the way we work:
  - contingent planning
  - find customized solutions
  - adopt better ways of making decisions

- Making the right decisions plays a key role in development effectiveness
- Rigorous testing of policy alternatives can transform the way decisions are made
- Institutional change for evidence-based policy making is a long and arduous process:
  - much overlooked
  - requires longer-term commitment

- Policy relevant: address the important issues policy makers face
- Operationally driven: reflect the learning priorities of programs and policies
- Institutionalized: there is a process to produce and feed results back into the policy process

- Programs: What works best? Does the program deliver results?
- Ministries/Departments: choosing between alternative policies and programs
- Ministry of Finance: allocating resources across agencies and programs
- Presidency: managing constituencies

- We start from programs with innovative teams:
  - developing new ways of doing things
  - demonstrating that it is possible and valuable
  - learning how to do it well
  - communicating in an ongoing manner
  - improving results

- Learning collaboratively
- Harmonizing measurement to facilitate comparisons across programs that inform the Ministry's decisions
- Building the data systems to reduce the costs of iterative evaluations across several programs
- Communicating externally

- Institutionalizing reporting mechanisms
- Using impact evaluation results to inform the budget process
- Identifying knowledge gaps
- Harmonizing measurement across programs to facilitate comparisons
- Improving budget allocations
- Communicating with constituencies

Agencies & programs:
- Develop capacity for impact evaluation
- Institute yearly cycle of evaluation
- Evaluate programs
- Improve programs
- Report results
- Request budget

National impact evaluation council:
- Establish common metric and evaluation guidelines
- Collect and catalogue evaluation results
- Analyze results and improvements
- Prepare matrix of results across government programs

Planning & Finance:
- Feed matrix of results into budget process and MTEF
- Consider trade-offs
- Improve budget allocations to meet national objectives
- Identify knowledge gaps

- Every program has its own specific objectives against which it is evaluated
- To evaluate programs against a national framework of analysis, a vector of common indicators is needed
- Agree on common national indicators against which individual programs can be benchmarked
- Evaluate each program against its own objectives and the relevant common indicators
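A minimal sketch, in Python, of what such a vector of common indicators could look like in practice. All program names, indicator names, and values below are hypothetical placeholders; the point is only that each program reports the shared national indicators alongside its own objectives, and benchmarking compares programs on the indicators they have in common.

```python
# Hypothetical benchmark: programs report a shared national indicator vector
# plus their own program-specific objectives; comparisons use only the
# indicators each program actually reports.

NATIONAL_INDICATORS = ["hiv_incidence_change", "testing_uptake", "condom_use"]

programs = {
    "school_prevention": {
        "common": {"hiv_incidence_change": -0.02, "testing_uptake": 0.15, "condom_use": 0.08},
        "own": {"teacher_training_coverage": 0.60},  # program-specific objective
    },
    "incentive_scheme": {
        "common": {"hiv_incidence_change": -0.03, "testing_uptake": 0.10},
        "own": {"school_retention": 0.12},
    },
}

def benchmark(programs, indicators):
    """For each common indicator, rank the programs that report it."""
    table = {}
    for ind in indicators:
        scores = {name: p["common"][ind] for name, p in programs.items() if ind in p["common"]}
        table[ind] = sorted(scores.items(), key=lambda kv: kv[1])
    return table

for indicator, ranking in benchmark(programs, NATIONAL_INDICATORS).items():
    print(indicator, ranking)
```

A program that does not report a given common indicator simply drops out of that comparison; it is still evaluated against its own objectives.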

- Capacity development: empower teams
- Technical quality & external validation: maintain reputation
- Participation: relevance
- Multiple products and ongoing communication: management support and sustainability
- Reward systems for learning & change: work for success

- Be informed users of evaluation
- Set policy learning agendas
- Test policy & implementation alternatives
- Measure results rigorously
- Incorporate evaluation guidance into program design
- Improve program effectiveness
- Start over

- You are doing it
- It is the teams that design and implement programs that need to acquire the skills
- The appreciation of senior management must be secured

A few big decisions are taken during design, but many more are taken during roll-out and implementation.

School-based prevention:
- Information
  - NGO-delivered student training: abstinence and condoms vs. same cohort sex
  - Teacher HIV training: training only vs. training plus instructional material
- Incentives to yr olds to stay negative
  - Monetary: fixed vs. increasing with age
  - Scholarship: girls only vs. all students
- Peer student groups
  - Rapid results approach (100 days): students only vs. students and teachers
  - All-year-round meetings and activities: students only vs. students and teachers

- Scientifically test critical nodes of the decision tree over time
- Start with what you need to know most
- Move along the decision tree as more results come in and options are sorted out
- You cannot learn everything at once
- Select carefully what you want to test by involving all relevant partners
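As a rough illustration of testing the critical nodes first, the sketch below (Python) represents a decision tree of program options, loosely mirroring the school-based prevention example above, and returns the unresolved nodes in priority order. The priorities, questions, and helper names are assumptions made for the example, not part of the AIM-AIDS material.

```python
# Hypothetical decision tree: test the highest-priority unresolved node first,
# then move down the tree as results come in.
from dataclasses import dataclass, field

@dataclass
class Node:
    question: str                 # the design decision to resolve
    options: list                 # alternatives to test against each other
    priority: int                 # 1 = most urgent learning need (assumed)
    resolved: bool = False        # set to True once evaluation results are in
    children: list = field(default_factory=list)

tree = Node(
    "Which prevention channel?",
    ["Information", "Incentives", "Peer student groups"],
    priority=1,
    children=[
        Node("Information: who delivers the training?",
             ["NGO-delivered student training", "Teacher HIV training"], priority=2),
        Node("Incentives: which form?",
             ["Monetary", "Scholarship"], priority=3),
    ],
)

def next_to_test(node):
    """Collect unresolved nodes, most urgent learning need first."""
    found = [] if node.resolved else [node]
    for child in node.children:
        found += next_to_test(child)
    return sorted(found, key=lambda n: n.priority)

print([n.question for n in next_to_test(tree)])
```

As each node is resolved, marking it `resolved = True` moves the next open question to the front, which matches the idea of moving along the tree as results come in.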


- When we compare and adopt better alternatives, we cannot fail; we can only get closer to success
- Reward systems need to recognize trial and error as part of the learning process
- Managerial incentives are a good place to start

Doing impact evaluation is not a one-off but a series of exercises to inform the decision process:
- Baseline and targeting analyses to inform intervention design
- Feedback into the quality of administrative data systems
- Testing operational alternatives against short-term outcomes (take-up rates and adoption) to improve programs
- Testing programs against higher-level outcomes (welfare, health status or income) to choose better programs
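For the short-term outcomes step, comparing take-up rates across two operational arms can be as simple as a difference in proportions with a normal-approximation confidence interval. The sketch below (Python) is illustrative only; the arm labels and counts are hypothetical placeholders, not data from any program.

```python
# Hypothetical comparison of take-up rates between two operational alternatives.
from math import sqrt

def takeup_difference(n_a, takers_a, n_b, takers_b):
    """Difference in take-up rates (arm B minus arm A) with an approximate 95% CI."""
    p_a, p_b = takers_a / n_a, takers_b / n_b
    diff = p_b - p_a
    se = sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    return diff, (diff - 1.96 * se, diff + 1.96 * se)

# Placeholder counts: arm A = training only, arm B = training plus materials.
diff, (low, high) = takeup_difference(n_a=500, takers_a=210, n_b=500, takers_b=260)
print(f"difference in take-up: {diff:.2f}, 95% CI [{low:.2f}, {high:.2f}]")
```

The same comparison, run later against higher-level outcomes, is what distinguishes the "improve programs" step from the "choose better programs" step.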

- Keep key actors involved in the work through active participation in the evaluation
- Submit early drafts, not final designs, to internal discussion and validation
- Get management buy-in
- Let other partners know what you are doing
- Communicate results as they come out and decisions as they are taken, both internally and externally

Thank you