Making Causal Claims in Non-Experimental Settings


Making Causal Claims in Non-Experimental Settings
IPDET 2016, Ottawa
John Mayne, PhD, Advisor on Public Sector Performance, john.mayne@rogers.com
I hope to leave you with a few ideas of how to make cause-effect claims in non-experimental settings.

The Context
Interventions are aimed at bringing about development impacts.
Interventions are not acting on their own. There may be:
- other events and conditions at play
- other interventions at work
- other relevant contextual factors
Development impacts result from a mix of such actions and context.

The Challenge
We still need to understand and make credible claims about the contribution that interventions (programs and projects) are making to development outcomes and impacts.
Experimental designs for these interventions are often not feasible or practical.

The Issue
In this context, what can we say about the relationship between the intervention and observed development results?
- The intervention 'caused' the results?
- The intervention made a difference?
- The intervention contributed to the results?
- A specific net result can be attributed to the intervention?
What makes sense?

Conceptualizing Causality
What kind of causal relation exists, then, between a development intervention (X) and an impact (Y)?
- Can we say X causes Y? No
- Is X necessary for Y? No
- Is X sufficient for Y? No
But we clearly want to make some causal link between the intervention and the impact.

Conceptualizing Causality As might be imagined, these conundrums have been well addressed in the vast literature on causality. There is an answer! An intervention works as part of a broader causal package of other events and conditions. And if it works, then this causal package is indeed sufficient to bring about the impact. Further, if the intervention is ‘working’—doing its job—then it is an essential part of this causal package.

INUS Causality
This is the very useful concept of INUS causality and INUS conditions: an Insufficient but Necessary part of a condition that is itself Unnecessary but Sufficient for the occurrence of the effect (Mackie 1974).
Mackie argued that much of the time, when we are talking about causality we are in fact talking about INUS causality.
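The INUS structure can be sketched in a few lines of code. The factors here are hypothetical stand-ins: X is the intervention, A is a supporting factor that completes its causal package, and C is an unrelated sufficient cause of the same effect.

```python
# A minimal sketch of Mackie's INUS structure (factors are hypothetical).
# The impact Y occurs when either causal package fires:
#   package 1: the intervention X together with supporting factor A
#   package 2: an unrelated factor C
def impact(x: bool, a: bool, c: bool) -> bool:
    return (x and a) or c

# X alone is not sufficient: with A absent, Y does not occur.
assert impact(x=True, a=False, c=False) is False
# X is not necessary: Y can occur via C without X.
assert impact(x=False, a=False, c=True) is True
# But within its package (A present, C absent), X is necessary...
assert impact(x=False, a=True, c=False) is False
# ...and the package (X and A together) is sufficient.
assert impact(x=True, a=True, c=False) is True
```

The four assertions are exactly the INUS reading: X is an insufficient but necessary part of the package (X and A), which is itself an unnecessary but sufficient condition for the impact.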

Intervention Causality
Thus, an intervention "made a difference" when:
- the intervention causal package was sufficient to bring about the impact, and
- the intervention was a necessary component of the causal package.
The intervention in this case is a contributory cause. On its own it is neither necessary nor sufficient. Smoking as a cause of lung cancer is a classic example.

The Intervention as Trigger
An intervention then is one among several 'causes'. But is that all? We probably expect more: that the intervention
- acts as a trigger to start the causal chain (the spark that lights the fire), and
- may act as sustaining support for change along the way (gasoline to keep the fire going).
Such an intervention is a principal contributory cause.

Meaningful Causal Questions
- Has the intervention made a difference? Is the intervention a contributory cause?
- What contribution has the intervention made?
- Why has the impact occurred? How did the causal factors bring about the result? What was the context and the mechanisms?
- What role did the intervention play?
The question 'What role did the intervention play?' asks whether the intervention played, for example, a trigger role or a more modest role, with something else as the trigger.

Demonstrating Contributory Causes
How then to show that the intervention made a difference?
- Connecting to theory-based evaluation approaches
- Sufficiency through generative (process) causality approaches
One approach is contribution analysis. There are others: other theory-based approaches, and QCA (Qualitative Comparative Analysis).

Theories of Change
- Results chains / logic models
- Impact pathways: the sequence of steps to impact
- Theories of change: pathways with causal link assumptions
- Causal link assumption: the events and/or conditions needed to make the causal link work
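The pathway-plus-assumptions idea can be sketched as ordered causal links, each carrying the assumptions needed for that link to work. The link labels below follow the generic behaviour-change model in this talk; the assumption texts are illustrative, not taken from any real intervention.

```python
# A theory of change as an impact pathway: ordered causal links, each with
# the assumptions (events/conditions) needed for the link to work.
# Assumption texts are illustrative placeholders.
pathway = [
    ("activities & outputs", "reach & reaction",
     ["intended beneficiaries are reached and react positively"]),
    ("reach & reaction", "capacity changes",
     ["participants acquire the knowledge, attitudes and skills"]),
    ("capacity changes", "behaviour changes",
     ["opportunities exist to apply the new capacities"]),
    ("behaviour changes", "direct benefits",
     ["external influences do not undermine the changed behaviour"]),
    ("direct benefits", "wellbeing",
     ["benefits are sustained over time"]),
]

# Print the impact pathway with its causal link assumptions.
for cause, effect, assumptions in pathway:
    print(f"{cause} -> {effect} (assuming: {'; '.join(assumptions)})")
```

Representing a ToC this way makes each causal link assumption an explicit, checkable item, which is what contribution analysis later gathers evidence against.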

Traditional Theory of Change (results chain diagram): activities → outputs → immediate outcomes → intermediate outcomes → impact.

A Generic Useful Theory of Change (diagram): a timeline pathway from activities and outputs with respect to beneficiaries, through reach & reaction, capacity changes (in knowledge, attitudes, skills, and opportunities), behaviour changes, and direct benefits, up to wellbeing. Each causal link carries its own assumptions (reach assumptions, capacity change assumptions, behaviour change assumptions, direct benefits assumptions, wellbeing assumptions), and external influences act on the pathway. Possible unintended effects can also be included in the ToC (not shown). Key to behaviour change models is that all of the capacity change elements are needed to bring about behaviour change.

Theories of Change and Causal Packages
Theories of change are causal packages, and more:
- a ToC identifies the supporting factors (assumptions)
- a ToC sets out the relationship between the supporting factors and the intervention
- a ToC is a model of the intervention as a contributory cause

A Generic Useful Theory of Change (diagram, repeated): the same pathway as above, now highlighting the intervention's activities and outputs together with the supporting factors (assumptions) as the causal package.

Four Approaches to Causal Attribution
- Regularity frameworks depend on the frequency of association between cause and effect: the basis for statistical approaches.
- Counterfactual frameworks depend on the difference between two otherwise identical cases: the basis for experimental and quasi-experimental approaches.
- Comparative frameworks depend on combinations of causes that lead to an effect: the basis for 'configurational' approaches, such as QCA.
- Generative frameworks depend on identifying the causal links and 'mechanisms' that explain effects: the basis for 'theory-based' and 'realist' approaches.
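A toy illustration of the comparative (configurational) idea, with entirely hypothetical cases: find the combinations of conditions that appear only among cases where the outcome occurred. This is only the flavour of QCA, not the full method.

```python
# Toy configurational comparison in the spirit of QCA (cases are hypothetical).
cases = [
    {"training": 1, "funding": 1, "outcome": 1},
    {"training": 1, "funding": 1, "outcome": 1},
    {"training": 1, "funding": 0, "outcome": 0},
    {"training": 0, "funding": 1, "outcome": 0},
    {"training": 0, "funding": 0, "outcome": 0},
]

conditions = ("training", "funding")
# Configurations observed among cases with and without the outcome.
positive = {tuple(c[k] for k in conditions) for c in cases if c["outcome"]}
negative = {tuple(c[k] for k in conditions) for c in cases if not c["outcome"]}

# Configurations consistent with sufficiency: seen only among positive cases.
sufficient = positive - negative
print(sufficient)  # prints {(1, 1)}: only training AND funding together
```

In this made-up data, neither training nor funding alone produces the outcome; only their combination does, which is the kind of conjunctural causation comparative frameworks are designed to surface.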

Generative Causation
- Process or mechanistic causality: tracing the links in between causal events.
- Everyday causality: the auto mechanic, air-crash investigation, forensic work, doctors.
- The basis for theory-based evaluation approaches, and the causal analysis approach behind contribution analysis.
Concluding on cause and effect by tracing the smaller intermediate steps between the cause and the effect.

Contribution Analysis: The Practice
The six steps:
1. Set out the causal question
2. Critically develop the expected (robust) theory of change
3. Gather the existing evidence verifying (or not) the theory of change
4. Assess the resulting contribution story
5. Seek out additional evidence
6. Revise and strengthen the contribution story

Step 1: What is the attribution question? Not all questions are useful to pursue.
- Traditional causality questions (these often don't make sense in multi-cause situations): Has the program caused the outcome? To what extent, quantitatively, has the program caused the outcome?
- Contribution questions: Has the program made a difference, that is, has it made an important contribution to the observed result? Has the program influenced the observed result? Why has the result occurred? What role did the intervention play?
- Management questions: Is it reasonable to conclude that the program has made a difference? What does the preponderance of evidence say about how well the program is making a difference? What conditions are needed to make this type of program succeed?

Step 2: Develop a robust ToC based on documents, stakeholder views, and prior research.
Step 3: Gather existing evidence on the links and assumptions in the ToC (M&E data, prior studies).
Step 4: Assess the contribution story: where are the weaknesses? Do a theory of change analysis; identify weak links; look for prior research; which links are contested?
Step 5: Seek additional evidence: a refined ToC, more measurement.
Step 6: Revise and strengthen: design survey and case study tools based on the ToC; triangulate.

Contribution Analysis
CA shows that an intervention is a contributory cause when:
- the ToC is robust and hence credible,
- the expected result occurred,
- the causal package is sufficient (the supporting factors, i.e. the assumptions, occurred), and
- the intervention is necessary for the package to be sufficient.
In short, CA verifies:
- that the robust ToC holds and the expected result occurred;
- that the causal package is sufficient: the supporting factors, the assumptions for each link in the theory of change, have occurred and together provide a reasonable explanation for the occurrence of the results; any other identified supporting factor that occurred has been included in the causal package, thereby revising the theory of change; and any plausible rival explanations, that is, external causal factors, have been accounted for; and
- that the intervention is a necessary part of the causal package: without the activities and outputs of the intervention, the supporting factors alone are not sufficient to bring about the result, knowing that with the intervention the causal package was sufficient.
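The conditions CA verifies amount to a conjunction of judgements. As a minimal sketch (all names hypothetical), a contribution claim holds only when every check passes:

```python
# Hypothetical yes/no judgements reached during a contribution analysis.
def contribution_claim_holds(
    toc_robust: bool,             # the theory of change is robust and credible
    result_occurred: bool,        # the expected result was observed
    assumptions_held: bool,       # supporting factors (assumptions) occurred
    rivals_accounted_for: bool,   # plausible rival explanations accounted for
    intervention_necessary: bool, # package not sufficient without the intervention
) -> bool:
    package_sufficient = assumptions_held and rivals_accounted_for
    return (toc_robust and result_occurred
            and package_sufficient and intervention_necessary)

# A single failed condition defeats the claim:
print(contribution_claim_holds(True, True, True, True, True))   # True
print(contribution_claim_holds(True, True, True, False, True))  # False
```

The point of the sketch is the conjunction: evidence for some conditions cannot compensate for the absence of others, which is why CA revisits weak links rather than averaging over them.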

Contribution Analysis in Complex Settings
- CA can be data- and analysis-demanding.
- In complex settings, ToCs are much more complex and challenging.
- Applying CA in those settings is not straightforward. Stay tuned!

Main Messages
- We expect most interventions to be principal contributory causes: the intervention causal package is sufficient, and the intervention is essential to the package.
- We also want to know why the impact occurred, so as to be able to explain it.
- ToCs are models of the intervention as a contributory cause.
- Contribution analysis and other theory-based approaches can be used to show cause-effect relationships.

Some References
DFID (2012). Broadening the Range of Designs and Methods for Impact Evaluation. London. Available at http://www.dfid.gov.uk/R4D/Output/189575/Default.aspx
Mackie, J. L. (1974). The Cement of the Universe: A Study of Causation. Oxford: Oxford University Press.
Mayne, J. (ed.) (2012). Contribution Analysis. Evaluation, Special Issue, 18(3).
Mayne, J. (2008). Contribution Analysis: An Approach to Exploring Cause and Effect. ILAC Brief 16. Available at http://www.cgiar-ilac.org/files/publications/briefs/ILAC_Brief16_Contribution_Analysis.pdf
Mayne, J. (2012). Making Causal Claims. ILAC Brief 26. The Institutional Learning and Change Initiative. Available at http://www.cgiar-ilac.org/files/publications/mayne_making_causal_claims_ilac_brief_26.pdf
Mayne, J. (2015). Useful Theory of Change Models. Canadian Journal of Program Evaluation, 30(2), 119-142.