Bringing Diversity into Impact Evaluation: Towards a Broadened View of Design and Methods for Impact Evaluation Sanjeev Sridharan.

Presentation transcript:

Bringing Diversity into Impact Evaluation: Towards a Broadened View of Design and Methods for Impact Evaluation Sanjeev Sridharan

Impact Evaluation: Four stories; thinking of complexity; looking forward; incorporating diversity.

One definition of impact evaluation: the search for ‘hard evidence’. Unlike general evaluations, which can answer many types of questions, impact evaluations are structured around one particular type of question: What is the impact (or causal effect) of a program on an outcome of interest? This basic question incorporates an important causal dimension: we are interested only in the impact of the program, that is, the effect on outcomes that the program directly causes. An impact evaluation looks for the changes in outcome that are directly attributable to the program. (Gertler et al., 2007, World Bank publication)
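To make that one question concrete, here is a minimal sketch of answering it in the simplest randomized case. This is not from the slides: the data are simulated and the effect size is invented for illustration.

```python
# Hypothetical sketch: estimating "the impact of a program on an
# outcome of interest" as a treated-vs-control difference in means,
# assuming random assignment. All data are simulated for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000
treated = rng.integers(0, 2, size=n)             # randomized assignment
baseline = rng.normal(50, 10, size=n)            # outcome absent the program
outcome = baseline + 3.0 * treated + rng.normal(0, 5, size=n)  # true impact = 3

impact = outcome[treated == 1].mean() - outcome[treated == 0].mean()
se = np.sqrt(outcome[treated == 1].var(ddof=1) / (treated == 1).sum()
             + outcome[treated == 0].var(ddof=1) / (treated == 0).sum())
print(f"Estimated impact: {impact:.2f} "
      f"(95% CI {impact - 1.96*se:.2f} to {impact + 1.96*se:.2f})")
```

Randomization is what licenses reading the difference in means causally; the stories that follow are about settings where that clean setup does not hold.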

Why ‘one’ and ‘only’ may not cut it: “The ‘hard evidence’ from the randomized evaluation has to be supplemented with lots of soft evidence before it becomes usable” (Rodrik, 2008).

SO WHAT IS THE PROBLEM HERE? The Lagos-London problem

Some Stories on Impacts

1. LESSONS FROM AN EVALUATION OF A MEDIA CAMPAIGN: On love and propensity scoring models

LESSONS: It is never just about a single method; methods are fallible; theory matters; the need to move beyond being an accidental tourist.
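The slide's reference to propensity scoring can be made concrete. Below is a hedged, self-contained sketch of propensity score matching on simulated data; the covariates, exposure model, and effect size are assumptions for illustration, not taken from the actual media-campaign evaluation.

```python
# Hedged sketch of propensity score matching (the method the slide
# alludes to). The covariates, exposure model, and effect size are
# invented for illustration, not taken from the actual evaluation.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2_000
age = rng.normal(40, 12, n)
interest = rng.normal(0, 1, n)        # confounder: prior interest in the topic

# Exposure to the campaign is NOT random: it depends on the confounders.
p_exposed = 1 / (1 + np.exp(-(0.03 * (age - 40) + interest)))
exposed = (rng.random(n) < p_exposed).astype(int)
outcome = 2.0 * exposed + 1.5 * interest + rng.normal(0, 1, n)  # true effect = 2

# Step 1: model each unit's propensity to be exposed, given covariates.
X = np.column_stack([age, interest])
pscore = LogisticRegression().fit(X, exposed).predict_proba(X)[:, 1]

# Step 2: match each exposed unit to the unexposed unit with the
# nearest propensity score, then average the matched differences.
t_idx = np.where(exposed == 1)[0]
c_idx = np.where(exposed == 0)[0]
nearest = np.abs(pscore[c_idx][None, :] - pscore[t_idx][:, None]).argmin(axis=1)
att = (outcome[t_idx] - outcome[c_idx[nearest]]).mean()

# A naive comparison of means is biased upward by the confounder.
naive = outcome[exposed == 1].mean() - outcome[exposed == 0].mean()
print(f"naive difference: {naive:.2f}, matched estimate: {att:.2f}")
```

Matching only adjusts for observed confounders, which is one reason the slide insists that methods are fallible and theory matters: the exposure model itself is a theory about who saw the campaign.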

2. LESSONS FROM HAVE A HEART PAISLEY

AN EXAMPLE: PRIMARY PREVENTION HAVE A HEART PAISLEY

LESSONS: Adaptation matters; going beyond protocols and experiments; dynamics of interventions; incompleteness of knowledge at the outset.

3. LESSONS FROM DANCING WITH PARKINSON’S

LESSONS: Support structures matter; mechanisms; heterogeneity is not noise; scaling up.
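One way to see "heterogeneity is not noise": in the simulated example below (groups and effect sizes are invented assumptions), a pooled impact estimate conceals large, opposite effects in two subgroups.

```python
# Sketch of "heterogeneity is not noise": a pooled impact estimate can
# hide opposite effects in subgroups. Groups and effect sizes here are
# invented for illustration.
import numpy as np

rng = np.random.default_rng(2)
n = 2_000
group = rng.integers(0, 2, n)               # two participant subgroups
treated = rng.integers(0, 2, n)
effect = np.where(group == 0, 4.0, -3.0)    # effects point in opposite directions
outcome = effect * treated + rng.normal(0, 2, n)

def impact(mask):
    """Treated-minus-control difference in means within a subset of units."""
    return (outcome[mask & (treated == 1)].mean()
            - outcome[mask & (treated == 0)].mean())

print(f"pooled impact:  {impact(np.ones(n, dtype=bool)):+.2f}  # masks the story")
print(f"group 0 impact: {impact(group == 0):+.2f}")
print(f"group 1 impact: {impact(group == 1):+.2f}")
```

Averaging over the two groups produces a small overall number that describes neither group, which is why subgroup variation is information rather than nuisance.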

4. EXPERIENCES WITH THE WHO

LESSONS: The importance of understanding the nature of connections; issues of power; the attribution/contribution problem; the inequity problem.

What makes an intervention complex? What are the implications for impact evaluation? A hypothesis: idealized views of ‘best’ or even ‘appropriate’ evaluation approaches might depend on the complexity of the intervention.

(Figure: Complexity of Intervention vs. Evaluation Approach.)

(Figure: Approach to Evaluation as a function of Complexity of Intervention, spanning Clinical, Community, and System-level Interventions.)

Working definition of complex interventions. COMPLEX INTERVENTIONS ARE: DYNAMIC: changing over time, both in response to changing context and to learning over time. HETEROGENEOUS, CONTEXT-SHAPED: constrained and shaped by context; the same intervention will look very different in different contexts. MULTIPLE INTERACTING COMPONENTS: the interaction of multiple components has the potential to change the overall intervention over time.

QUESTIONS TO DESCRIBE COMPLEX INTERVENTIONS: How hard is it to describe? How hard is it to create? What is its degree of organization?

THE LOGIC OF AN EVOLUTIONARY STRATEGY. Box et al. (1978, p. 303): “... the best time to design an experiment is after it is finished; the converse is that the worst time is the beginning, when least is known. If the entire experiment was designed at the outset, the following would have to be assumed known: (1) which variables were the most important, (2) over what ranges the variables should be studied... The experimenter is least able to answer such questions at the outset of an investigation but gradually becomes more able to do so as a program evolves.”
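Box's logic can be rendered as a toy sequential experiment: rather than fixing the full design up front, each small round informs the next. The response curve, dose variable, and step rule below are stand-in assumptions, not a real study design.

```python
# Toy rendering of Box's evolutionary logic: run small experiments in
# sequence and let each round inform the next, instead of fixing the
# whole design at the outset. The response curve is a stand-in assumption.
import numpy as np

rng = np.random.default_rng(3)

def run_trial(dose, n=30):
    """Noisy average outcome at one intervention intensity (simulated)."""
    true_response = -(dose - 7.0) ** 2 + 50.0   # unknown to the evaluator
    return true_response + rng.normal(0, 3, n).mean()

dose, step = 2.0, 1.0
for round_num in range(8):
    here, up = run_trial(dose), run_trial(dose + step)
    if up > here:      # the higher intensity looked better: move toward it
        dose += step
    else:              # no improvement: hold position and refine the step
        step /= 2
    print(f"round {round_num + 1}: dose={dose:.2f}, step={step:.2f}")
```

The "best" design only becomes visible as the program evolves, which is exactly Box's point about designing at the outset with the least knowledge.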

FEATURES OF COMPLEX INTERVENTIONS (PAWSON ET AL., 2004): The intervention is a theory or theories. The intervention involves the actions of people. The intervention consists of a chain of steps. These chains of steps or processes are often not linear, and involve negotiation and feedback at each stage. Interventions are embedded in social systems, and how they work is shaped by this context. Interventions are prone to modification as they are implemented. Interventions are open systems and change through learning as stakeholders come to understand them.

SYSTEM DYNAMICS APPROACHES (STERMAN, 2006): Constantly changing; governed by feedback; non-linear; history-dependent; adaptive and evolving; characterized by trade-offs; policy resistance: “The result is policy resistance, the tendency for interventions to be defeated by the system’s response to the intervention itself.”
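A minimal simulation makes Sterman's "policy resistance" tangible: a single stock with a balancing feedback loop that partly offsets the intervention. All parameter values below are invented for illustration.

```python
# Minimal system-dynamics sketch of "policy resistance": a single stock
# with a balancing feedback loop that partly offsets the intervention.
# All parameter values are invented for illustration.
import numpy as np

steps = 100
level = np.empty(steps)
level[0] = 100.0            # e.g., a population risk-factor level
push = -2.0                 # intervention pressure, aiming to lower the level
for t in range(1, steps):
    # Balancing feedback: the system compensates, drifting back toward
    # its old equilibrium and defeating part of the intervention.
    compensation = 0.05 * (100.0 - level[t - 1])
    level[t] = level[t - 1] + push + compensation
    push *= 0.98            # intervention effort fades over time

print(f"start: {level[0]:.1f}, end: {level[-1]:.1f}, "
      f"net change: {level[-1] - level[0]:.1f}")
```

The net change ends up much smaller than the cumulative policy effort, which is the signature of a system pushing back against the intervention.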

WHY DOES DIVERSITY MATTER IN CONDUCTING IMPACT EVALUATION? LEARNING: very different learning needs of different stakeholders. VIEWS OF SUCCESS: differing views of what constitutes success. DIVERSITY OF QUESTIONS: heterogeneity of relevant evaluation questions. COMPLEXITY OF THE INTERVENTION: it might be hard to focus on all aspects of the intervention, and what to focus on might depend on what stakeholders value. MIXTURES OF DESIGNS: a variety of designs needs to be integrated to answer the relevant evaluation questions. DIVERSITY OF MEASUREMENT: diversity in views of what constitutes the most important measures.

LOOKING FORWARD: What is a stable-enough intervention? Incompleteness of knowledge; heterogeneity is not noise; ecology of evidence; the need for an evolutionary strategy; culture; context; dynamics (timelines, trajectories); causal structure of connections.

BUILDING EVALUATION AS A FIELD

Models of causation (successionist vs. generative models of causation); ecology of evidence; integrating knowledge translation with evaluation; capacity building; developmental evaluation in complex dynamic settings; portfolio of designs and approaches; program theory and incompleteness; time horizons and functional forms; spread, scaling up and generalization.

Questions to improve reporting of impact evaluations

Describe the intervention: What was the setting of the intervention? What was the context? Was there a discussion of the evidence informing the program? Was the evidence from multiple disciplines? Is the program a pilot? What were the challenges of adapting it to the specific setting? What was the duration of the intervention? Was there a discussion of timelines and trajectories of impacts? Were there changes in the intervention over time? How did the evaluation explore this? Was the theory of change described? Did it change over time?

How were impacts studied; what designs were implemented? Was there a formal process of ruling out threats to internal and external validity? Was the program a success? Were there unintended outcomes? Were there differential impacts for different groups? Did the evaluation help with decisions about sustainability? Was the organizational structure of the intervention described? What was the intervention planners' view of success? Were there formal structures to learn and modify the program over time? Was there a discussion of what can be spread as part of learnings from the evaluation?