Introduction to Comprehensive Evaluation


Introduction to Comprehensive Evaluation
Justice Nonvignon, PhD, School of Public Health, University of Ghana, Legon
International Workshop on Impact Evaluation of Population, Health and Nutrition Programs, July 2016

Acknowledgments Sangeeta Tikyani Singh and Juan Pablo Gutierrez (INSP Mexico) contributed to this presentation, which adapts materials from MEASURE Evaluation and INSP Mexico.

Comprehensive: One Word, Multiple Potential Meanings Regarding Evaluation
Comprehensive in terms of scope (e.g., evaluating a sector, a broad policy, or a government)
Comprehensive in terms of approach (e.g., multiple perspectives)
Comprehensive in terms of methods
Comprehensive in terms of the evaluatee (programme, policy, intervention, institution)

What kind of comprehensiveness is desirable? What is relevant for the evaluatee? That depends on where the evaluatee is in its cycle of operation. How can the relevant questions best be addressed? With the perspective, tools, and methods that are most suitable for each particular question.

What are policies meant for? Ideally, decision making to improve welfare: identify a condition that requires attention, then design and implement interventions, programs, or policies. Planning defines what to do and how to do it; implementation translates the plan into real life. It is important to contextualize this view of evaluation. It was developed with public programs and policies in mind, under the general view that public decision making is about improving welfare. Once a condition is identified as requiring attention, a design process follows and is then followed by implementation, which is intended to translate what was planned into real life.

Before effectiveness: design and operation. Design: identified causes and the best available evidence. Operation: implementation according to design; standardization. Results: accomplishment of goals; effectiveness. In our context, most of the focus is placed on final results, i.e., whether or not the program achieves what was intended. We want to highlight that, before those results can be reached (or not), it is very important to examine design and operation.

Why evaluate?
Decide allocation: provide evidence of the destination and results of public funds; accountability.
Learning: identify best practices and areas of opportunity; test alternatives within existing programs; assess transferability to other contexts.
Improve performance: modifications to existing programs; assessment of continuation.
Evidence-based policy: improving efficiency; improving effectiveness.
You may have your own way of presenting this, but it has been useful for us in thinking about the different questions an evaluator may answer. We evaluate, of course, because we want to learn, as we are an academic institution. But the demand for evaluation is driven by programs' need for useful information: policies and programs are evaluated because evidence is needed to inform decision making. We promote evaluation because we are also promoting evidence-based policy and because we are convinced of its relevance to the accountability process.

Comprehensive evaluation

Basics of comprehensive evaluation: a general framework to connect different types of evaluation, with evaluation phases mirroring the program life-cycle and linking program phases from an evaluation perspective. It proposes evaluation as a stepwise building: the design of a program is its foundation, identifying what it is intended to achieve and defining what indicators should be used to measure success. Our approach is to look at the different evaluation needs associated with the different phases of a program and to link those phases from an evaluation perspective, that is, to see evaluation as a continuum.

A framework built on existing research: comprehensive evaluation as a process. Each type of evaluation is an input to the next phase of the evaluation (and of the program); that is, evidence from one evaluation phase is needed for the next.

A definition of comprehensive evaluation: a dynamic, interactive, and progressive process that analyzes program performance using a synergistic set of analytical tools to identify areas for improvement in the formulation, planning, and operation of programs, and that provides recommendations oriented toward improving them. It includes diagnosis, design, processes, implementation, results, effectiveness, and efficiency.

Comprehensive evaluation framework, built up layer by layer:
Analysis of diagnosis: validation of the identified problem and its causal chain.
Design evaluation: consistency between the design and the causal chain, and definition of the target population.
Evaluation of targeting and coverage: verification of resource allocation.
Evaluation of implementation: analysis of resources and information flows for program initiation.
Evaluation of processes: validation of operation according to planning.
Evaluation of results: analysis of goals and performance.
Evaluation of effectiveness: measurement of the attributable effect.
Evaluation of efficiency: measurement of efficiency (allocation, technical, administrative).
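
To make the last two layers concrete, the short Python sketch below illustrates, with entirely hypothetical numbers and variable names, one common way to quantify an attributable effect (a simple difference-in-differences comparison of program and comparison areas) and one basic efficiency measure (cost per additional person covered). The presentation does not prescribe these particular estimators; this is only an illustration of the kind of calculation each layer implies.

# Illustrative sketch only: hypothetical data and simplified methods.
# Mean outcome (e.g., % of households covered by a service) before and after
# the program, in program (treated) and comparison (control) areas.
treated_before, treated_after = 40.0, 55.0
control_before, control_after = 42.0, 47.0

# Attributable effect: the change in program areas net of the change in
# comparison areas (difference-in-differences), in percentage points.
attributable_effect = (treated_after - treated_before) - (control_after - control_before)

# Efficiency: cost per additional person covered, using a hypothetical total
# program cost and target-population size.
program_cost = 1_200_000.0   # total program cost (currency units)
target_population = 30_000   # people living in the program areas
additional_people_covered = target_population * attributable_effect / 100
cost_per_additional_person = program_cost / additional_people_covered

print(f"Attributable effect: {attributable_effect:.1f} percentage points")
print(f"Cost per additional person covered: {cost_per_additional_person:.2f}")

In practice, the effectiveness estimate would come from a full impact evaluation design (experimental or quasi-experimental) and the efficiency analysis would account for all relevant costs; the sketch only shows the logic of the two calculations.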

Measurement and accountability, evidence-based design, implementation according to planning, and measuring results: this is the purpose of comprehensive evaluation.

How to state evaluation questions? Understand the evaluatee: What is it for? What is the theory behind it? What is it trying to accomplish? Understand the evaluatee's needs: What phase is the evaluatee in? What evidence will help the evaluatee improve its performance?

This presentation was produced with the support of the United States Agency for International Development (USAID) under the terms of MEASURE Evaluation cooperative agreement AID-OAA-L-14-00004. MEASURE Evaluation is implemented by the Carolina Population Center, University of North Carolina at Chapel Hill in partnership with ICF International; John Snow, Inc.; Management Sciences for Health; Palladium; and Tulane University. Views expressed are not necessarily those of USAID or the United States government. www.measureevaluation.org