
Monitoring Progress & Evaluating Impact
Presented by: Cindy Banyai, Ph.D., Executive Director, Refocus Institute
IREX Community Solutions Program, September 1, 2011 – September 8, 2011

Learning Objective
Identify and employ key methods and tools for monitoring progress and evaluating impact.

Project Management Cycle (PMC)
[Diagram, based on Miyoshi 2008]

Monitoring Progress
Monitoring - mid-term and terminal evaluations
– Examine the implementation process - assess inputs through outputs
– Measure performance - assess outputs to intermediate outcomes

Evaluating Impact
Impact evaluation - measures the relationship between intermediate outcomes and end outcomes
Sustainability evaluation - measures long-term impacts and peripheral effects of the program

USAID on Impact Evaluation
Measures the change in a development outcome that is attributable to a defined intervention
Based on models of cause and effect
Requires a credible and rigorously defined counterfactual to control for factors other than the intervention that might account for the observed change
Comparisons between beneficiaries randomly assigned to either a treatment or a control group provide the strongest evidence of a relationship between the intervention under study and the outcome measured (USAID 2011)
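To make the randomized comparison concrete, here is a minimal sketch in Python using only the standard library; the groups, scores, and outcome are hypothetical, invented for illustration.

    import statistics

    # Hypothetical endline outcome scores for randomly assigned groups.
    treatment = [72, 68, 75, 81, 70, 77, 74, 69]
    control = [65, 63, 70, 66, 68, 64, 71, 62]

    # With random assignment, the control group's mean stands in for the
    # counterfactual: what treated beneficiaries would have scored anyway.
    effect = statistics.mean(treatment) - statistics.mean(control)

    # Standard error of the difference in means (independent samples).
    se = (statistics.variance(treatment) / len(treatment)
          + statistics.variance(control) / len(control)) ** 0.5

    print(f"Estimated impact: {effect:.1f} points (SE {se:.1f})")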

DAC 5 Evaluation Criteria
Efficiency - A measure of how economically resources/inputs (funds, expertise, time, etc.) are converted to results.
Effectiveness - The extent to which the development intervention's objectives were achieved, or are expected to be achieved, taking into account their relative importance.

DAC 5 Evaluation Criteria (cont'd)
Impact - Positive and negative, primary and secondary long-term effects produced by a development intervention, directly or indirectly, intended or unintended.
Sustainability - The continuation of benefits from a development intervention after major development assistance has been completed; the probability of continued long-term benefits; the resilience to risk of the net benefit flows over time.

DAC 5 Evaluation Criteria (cont'd)
Relevance - The extent to which the objectives of a development intervention are consistent with beneficiaries' requirements, country needs, global priorities, and partners' and donors' policies. (DAC 2002)

Relationship between Evaluation Criteria & Logic Model
[Diagram, Miyoshi 2008] The logic model runs from Inputs through Activities and Outputs to Intermediate Outcomes and End Outcomes. Examining the implementation process covers inputs through outputs; measuring performance covers outputs to intermediate outcomes. The five DAC criteria map onto the chain: efficiency (inputs to outputs), effectiveness (outputs to intermediate outcomes), impact (intermediate outcomes to end outcomes), with relevance and sustainability assessed against the model as a whole.
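One way to see this mapping: a minimal sketch in Python that encodes the chain and the segment each criterion assesses. The spans given for relevance and sustainability are assumptions based on the reconstruction above, not taken from the slide.

    # Logic model stages in causal order (after Miyoshi 2008).
    STAGES = ["inputs", "activities", "outputs",
              "intermediate outcomes", "end outcomes"]

    # Segment of the chain each DAC criterion assesses; the relevance and
    # sustainability entries are illustrative assumptions.
    CRITERIA = {
        "efficiency": ("inputs", "outputs"),
        "effectiveness": ("outputs", "intermediate outcomes"),
        "impact": ("intermediate outcomes", "end outcomes"),
        "relevance": ("inputs", "end outcomes"),
        "sustainability": ("end outcomes", "end outcomes"),
    }

    for criterion, (start, end) in CRITERIA.items():
        span = STAGES[STAGES.index(start):STAGES.index(end) + 1]
        print(f"{criterion}: {' -> '.join(span)}")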

TOOLS FOR MONITORING AND EVALUATION

Traditional types of data gathering
Surveys
Tests
Interviews
Observation
Documents

Participatory types of data gathering
Focus groups - cooperative inquiry
Training
Action research projects
– Photographs
– Video
– Plays/storytelling
– Metaphor drawing
– Sculpture

Quantitative data
Need a counterfactual - the situation a participating subject would have experienced had he or she not been exposed to the program
Baseline or pre-program data are used to assess impact
– Predict outcomes that might result from the program (as in ex ante evaluations)
– Make before-and-after comparisons (also called reflexive comparisons) (WB 2010)
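A reflexive (before-and-after) comparison is the simplest of these; a minimal sketch, again with invented numbers:

    import statistics

    # Hypothetical baseline and endline scores for the same participants.
    baseline = [54, 61, 58, 49, 63, 57]
    endline = [60, 66, 61, 55, 70, 62]

    # Reflexive comparison: the pre-program measurement serves as the
    # counterfactual, so the estimated impact is the mean change.
    changes = [post - pre for pre, post in zip(baseline, endline)]
    print(f"Mean before-and-after change: {statistics.mean(changes):.1f}")

    # Caveat (WB 2010): this attributes all observed change to the program;
    # anything else that happened between baseline and endline is folded in.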

Quantitative impact evaluation
Ex-ante impact evaluation - measures the intended impacts of future policies/programs/projects, predicting program impacts
– Looks at a potentially targeted area's current situation
– May involve simulations based on assumptions about how the economy works
– Based on structural models of the economic environment facing potential participants

Quantitative impact evaluation
Ex-post impact evaluations - measure the actual impacts accrued by beneficiaries that are attributable to the policy/program/project
– Examine immediate benefits and reflect reality
– May miss the mechanisms underlying the program's impact on the population, which structural models aim to capture (WB 2010)

Quantitative methods for ex-post impact evaluation
Randomized evaluations
Matching methods, specifically propensity score matching (PSM)
Double-difference (DD) methods
Instrumental variable (IV) methods
Regression discontinuity (RD) design and pipeline methods
Distributional impacts
Structural and other modeling approaches (WB 2010)
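Among these, the double-difference (DD) estimator is compact enough to show directly. A minimal sketch with hypothetical group means; the parallel-trends caveat in the comments is the standard assumption behind DD:

    # Hypothetical group means, before and after the intervention.
    treat_pre, treat_post = 50.0, 62.0
    control_pre, control_post = 51.0, 56.0

    # Double difference: subtracting the comparison group's change removes
    # common time trends, under the parallel-trends assumption.
    treat_change = treat_post - treat_pre        # 12.0
    control_change = control_post - control_pre  #  5.0
    dd_estimate = treat_change - control_change  #  7.0

    print(f"Double-difference impact estimate: {dd_estimate:.1f}")

In practice DD is usually estimated in regression form (outcome on treatment, period, and their interaction), which yields the same estimate while allowing controls and standard errors.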

Qualitative data
3 main sources of qualitative data - open-ended interviews, observations, documents (Patton 2002)
Emerging sources - photographs, video, artistic expressions (Hesse-Biber and Leavy 2008)
Can be coded and transformed into statistics
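To illustrate how coded qualitative data can be transformed into statistics, a minimal sketch that tallies analyst-assigned theme codes across hypothetical interview excerpts; the codes and IDs are invented for the example:

    from collections import Counter

    # Hypothetical interview excerpts, each tagged with analyst-assigned codes.
    coded_excerpts = [
        {"id": "int01", "codes": ["trust", "access"]},
        {"id": "int02", "codes": ["access"]},
        {"id": "int03", "codes": ["trust", "cost", "access"]},
        {"id": "int04", "codes": ["cost"]},
    ]

    # Condense the coded data into counts - qualitative data as statistics.
    counts = Counter(code for ex in coded_excerpts for code in ex["codes"])
    total = len(coded_excerpts)
    for code, n in counts.most_common():
        print(f"{code}: mentioned in {n}/{total} interviews ({n / total:.0%})")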

Qualitative research
Looks at a phenomenon or situation in depth and in detail
The researcher is the instrument - validity relies on her skill, competence, and rigor
– Internal validity - when qualitative data are corroborated from multiple sources

Quantitative vs. Qualitative
Both are needed to provide depth and understanding of a situation
– Numbers balance narrative accounts
– Stories and visuals put a human touch on numbers

Determine approach and tools
Use these questions to guide your selection:
– Who is the information for, and who will use the findings?
– What kinds of information are needed?
– How is the information to be used? For what purposes is the evaluation being done?
– When is the information needed?
– What resources are available to conduct the evaluation?
– Given the answers to the preceding questions, which methods are appropriate? (Patton 2002)

Parting thoughts on M&E
Collect the data you need. Use the data you collect. Doing so improves your efficiency and the clarity of your evaluations, in addition to freeing up time for other program activities.

Thanks for your attention! Please send questions & comments.

References
Banyai, Cindy. (2010). Community capacity and governance - New approaches to development and evaluation. PhD diss., Ritsumeikan Asia Pacific University.
Bureau for Policy, Planning, and Learning - United States Agency for International Development [USAID]. (2011). Evaluation Policy. Washington, D.C.: USAID.
Development Assistance Committee [DAC] of the Organisation for Economic Co-operation and Development [OECD]. (2002). Glossary of Key Terms in Evaluation and Results Based Management. Paris: OECD.
Hesse-Biber, Sharlene Nagy and Leavy, Patricia. (2008). Handbook of Emergent Methods. New York: Guilford Press.
Japan International Cooperation Agency - Office of Evaluation and Post Project Monitoring, Planning and Evaluation Department. (2004). JICA Evaluation Handbook: Practical Methods for Evaluation. Tokyo: Japan International Cooperation Agency.
Khandker, Shahidur R., Koolwal, Gayatri B., and Samad, Hussain A. [WB]. (2010). Handbook on Impact Evaluation: Quantitative Methods and Practices. Washington, D.C.: World Bank.
Miyoshi, Koichi. (2008). What is Evaluation? In Miyoshi, Koichi (Ed.), Hyoka-ron wo Manabu Hito no tameni (For People Learning Evaluation Theory) (pp. 1-16). Tokyo: Sekaishisosha.
Patton, Michael Quinn. (2002). Qualitative Research & Evaluation Methods. Thousand Oaks: Sage Publications, Inc.