Ten Questions to Consider in Designing an Impact Evaluation

What is impact? The difference in outcomes with the program versus without the program – for the same unit of analysis (e.g. an individual). But does this not sound very straightforward? This is one reasonably straightforward view (Ravallion, World Bank).
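
One way to make the definition on this slide precise is standard potential-outcomes notation (the notation below is a common formalization, not taken from the slides):

Impact_i = Y_i(1) - Y_i(0)
ATE = E[ Y_i(1) - Y_i(0) ]

where Y_i(1) is unit i's outcome with the program and Y_i(0) its outcome without it. Only one of the two is ever observed for any given unit, so an evaluation can only estimate averages such as the average treatment effect (ATE); this is exactly why the counterfactual, discussed in the next slides, matters.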

Naïve comparison 1: before vs. after. We observe an outcome indicator over time [chart: outcome indicator, with the intervention marked].

...and its value rises after the program [chart: outcome indicator rising after the intervention]. Recall: impact is the difference in outcomes with the program versus without the program – for the same unit of analysis (e.g. an individual).

However, we need to identify the counterfactual... [chart: outcome indicator with the intervention marked]

...since only then can we determine the impact of the intervention.
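
To make the before-vs-after problem concrete, here is a minimal simulation sketch (Python, with made-up numbers; the variable names and the assumption of random assignment are illustrative, not from the slides). It shows how a naïve before-vs-after comparison picks up change that would have happened anyway, while a counterfactual-based comparison recovers something close to the true effect.

import numpy as np

rng = np.random.default_rng(0)
n = 1000
true_effect = 3.0      # the program's actual impact
trend = 5.0            # change everyone would have experienced anyway

baseline = rng.normal(50.0, 10.0, n)        # outcome before the intervention
treated = rng.random(n) < 0.5               # assume random assignment
follow_up = baseline + trend + true_effect * treated + rng.normal(0.0, 5.0, n)

# Naive comparison: before vs. after, among the treated only.
before_after = follow_up[treated].mean() - baseline[treated].mean()

# Counterfactual-based comparison: treated vs. untreated at follow-up
# (the untreated group stands in for the counterfactual here because
# assignment was random in this simulation).
counterfactual_based = follow_up[treated].mean() - follow_up[~treated].mean()

print(f"Before-vs-after estimate:      {before_after:.1f}")         # ~ trend + effect
print(f"Counterfactual-based estimate: {counterfactual_based:.1f}")  # ~ true effect

With these made-up numbers the before-vs-after estimate should come out near 8 (the secular trend plus the true effect), while the counterfactual-based estimate should come out near the true effect of 3.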

Do we also need to know what it is about the program that caused the effect? But is this empirical view of impact evaluation enough? Is it enough of a basis on which to generalize the results of the evaluation?

Different models of causality
Successionist: "...what is needed to infer causation is the 'constant conjunction' of events: when the cause X is switched on (experiment) effect Y follows, and when the cause is absent (control) no effect is observed."
Generative: "The generative model calls for a more complex and systemic understanding of connectivity. It says that to infer a causal outcome (O) between two events (X and Y) one needs to understand the underlying generative mechanism (M) that connects them and the context (C) in which the relationship occurs." (Pawson et al., 2004, p. 2)

A concrete example: further wisdom from Pawson. "To use a physical science example, researchers would not claim that repeated observations of the application of a spark (X) to gunpowder and the subsequent explosions (Y) was a sufficient base on which to understand the causal relationship. Rather the connection (O) is established by what they know about the chemical composition of gunpowder and its instability when heat is applied (M). They also know that this mechanism is not always fired and that the explosion depends on other contextual features (C) such as the presence of oxygen and the absence of dampness." (Pawson et al., 2004, p. 2) Notice that the question is shifting: it is no longer simply "does the program work?" but "what is it about the program that brought about the change?"

Developmental evaluation: Formative – Developmental – Summative

The Questions (1)
1. What is the purpose of the evaluation?
2. Is your program or policy stable over time?
3. Is there clarity on the program theory? Does the program theory help identify the key outcomes?
4. Is there clarity on the anticipated trajectory of impacts (if the program is successful)? Is there clarity on the anticipated earliest timeline of impact?

The Questions (2)
5. How will the program's context be measured? Process? Key outcomes? Are the measures being collected over time?
6. What evaluation design is being implemented to decide if the program was successful? Was the design informed by the theory of change? Did the design pay attention to key linkages in the program theory? Was the design informed by the anticipated impact trajectory?

What is the program? One example

The Questions (3)
7. What analytical methods will be implemented to decide if the program was successful? Can qualitative and quantitative methods both be used to decide if the program was successful?
8. What methods are used to learn about the heterogeneous impacts of the program? Did the program have differential impacts on different populations? (See the sketch after this list.)
9. What are some of the unintended consequences of the program? What methods/approaches are implemented to learn about the unintended consequences?
10. What has been learned about how the program impacts intended beneficiaries? How should the program be modified to impact intended beneficiaries?
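
One simple way to approach question 8 in practice is to estimate the treated-versus-comparison difference separately within subgroups. A minimal sketch follows (Python, simulated data; the subgroup marker, effect sizes, and random assignment are made up for illustration, not taken from the slides).

import numpy as np

rng = np.random.default_rng(1)
n = 2000

subgroup_a = rng.random(n) < 0.5               # e.g. one population of interest
treated = rng.random(n) < 0.5                  # assume random assignment
true_effect = np.where(subgroup_a, 4.0, 1.0)   # the program helps subgroup A more
outcome = rng.normal(50.0, 10.0, n) + true_effect * treated

# Estimate the impact separately within each subgroup.
for label, in_group in [("subgroup A", subgroup_a), ("subgroup B", ~subgroup_a)]:
    impact = outcome[in_group & treated].mean() - outcome[in_group & ~treated].mean()
    print(f"Estimated impact for {label}: {impact:.1f}")

In real evaluations the subgroups (and any claim about differential impact) should be specified in advance from the program theory rather than discovered by searching the data.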