ASSESSING THE IMPACT OF SOCIAL POLICIES ON HEALTH AND HEALTH INEQUALITIES: THEORY, METHODS, AND EVIDENCE. Technical Meeting on Measuring & Monitoring Action on the Social Determinants of Health


ASSESSING THE IMPACT OF SOCIAL POLICIES ON HEALTH AND HEALTH INEQUALITIES: THEORY, METHODS, AND EVIDENCE
Technical Meeting on Measuring & Monitoring Action on the Social Determinants of Health
Arijit Nandi, 20 June 2016
Department of Epidemiology, Biostatistics and Occupational Health / Département d’épidémiologie, biostatistique et santé au travail

I. WHY EVALUATE?

Social factors and health: a conceptual model

What does the evidence tell us?
 Most studies concerning the social determinants of health are descriptive analyses that track the social patterning of health¹
 Association ≠ causation (an association does not imply that intervening to change a social factor would alter the outcome)
 Comparatively few studies focus on understanding the causal effects of social exposures, such as education, on health
 Moreover, not all causal evidence is necessarily policy relevant, motivating calls for evaluation studies to prioritize the “assessment of the potential contribution to population health of particular interventions implemented”²
¹ Nandi and Harper (2014); ² Galea (2013)

What can we do about it?

“... public health researchers should generate evidence that is closely aligned with what policy makers and program planners can use” (O’Campo, 2012)

Evaluation promotes evidence-based public health
 Evidence-based public health calls for “a solid knowledge base on disease frequency and distribution, on the determinants and consequences of disease, and on the safety, efficacy, and effectiveness of interventions and their costs”¹
 The results of a well-conducted impact evaluation provide causal evidence on the effect of a particular program or policy and can help assess whether the intervention was effective
 Undertaking “evaluation research to generate sound evidence [is] fundamental to achieving a more evidence-based approach to public health practice”²
¹ Victora (2004); ² Brownson (2009)

II. WHAT IS IMPACT EVALUATION?

It’s evaluation for cause-and-effect questions
 Impact evaluation studies are among a range of complementary techniques for supporting evidence-based policymaking
 An impact evaluation “assesses the changes in the well-being of individuals that can be attributed to a particular project, program, or policy”¹, which we generally call interventions
 “Impact evaluation asks about the difference between what happened with the program and what would have happened without it (referred to as the counterfactual)…This difference is the impact of the program.”²
¹ Gertler (2011); ² CGD (2006)

What impact evaluation is not
 Some erroneously equate monitoring, health impact assessment, and impact evaluation
 Monitoring refers to “the process of collecting data and analyzing it to verify whether programs were implemented according to plan, whether financial resources and inputs were applied as intended, whether the expected outputs were realized, whether intended beneficiaries were reached, and whether time schedules were met.”¹
 Health impact assessment: “A combination of procedures, methods and tools by which a policy, programme or project may be judged as to its potential effects on the health of the population, and the distribution of those effects”²
¹ CGD (2006); ² WHO

Potential outcomes framework
 What-if or counterfactual questions about the impact of an intervention are hypotheticals, so how can we answer them?
 The potential outcomes framework provides us with a guide for posing and answering counterfactual questions; it is the common language for impact evaluation in the social sciences
 The potential outcomes framework specifies well-defined causal states to which all members of the population of interest could be exposed, in order to identify what would have happened under an alternative, counterfactual scenario (formalized in the notation below)
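As a brief formal aside (added here for reference, not part of the original slides), the notation used in the figures that follow can be written out; $d_i$ denotes exposure status and $y_i^1$, $y_i^0$ are the two potential outcomes of unit $i$.

% Standard potential-outcomes notation, matching the symbols on the next slides.
\begin{align*}
  y_i^1 &: \text{outcome of unit } i \text{ if exposed to the intervention} \\
  y_i^0 &: \text{outcome of unit } i \text{ if unexposed} \\
  \delta_i &= y_i^1 - y_i^0 && \text{(individual causal effect; never observed for the same unit)} \\
  \mathrm{ATE} &= E(\delta_i) = E(y_i^1) - E(y_i^0) && \text{(average treatment effect)} \\
  y_i &= d_i\,y_i^1 + (1 - d_i)\,y_i^0 && \text{(observed outcome, with exposure indicator } d_i \in \{0,1\})
\end{align*}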

True (unobserved) impact of an intervention
[Figure: outcome $E(Y_i)$ plotted against years $t$, showing the intervention trajectory $E(y_i^1)$ and the counterfactual trajectory $E(y_i^0)$; the impact is $E(\delta_i) = E(y_i^1) - E(y_i^0)$]

Fundamental problem of causal inference
 As with individuals, it is not possible to observe the same target population simultaneously under two different conditions
 Unlike their individual-level analogues, we can use our observed data to estimate $E(y_i)$ and calculate a “naïve” estimate of the ATE

Fundamental problem of causal inference
 As with individuals, it is not possible to observe the same target population simultaneously under two different conditions
 Unlike their individual-level analogues, we can use our observed data to estimate $E(y_i)$ and calculate a “naïve” estimate of the ATE
 We could observe the same group at different time periods (pre-post), but other things may have changed since the intervention
[Figure: pre-post comparison of $E(y_i \mid d_i = 1, t+\Delta t)$ with $E(y_i \mid d_i = 0, t)$, where the pre-intervention observation serves as the counterfactual. Causal effect?]

Fundamental problem of causal inference
 As with individuals, it is not possible to observe the same target population simultaneously under two different conditions
 Unlike their individual-level analogues, we can use our observed data to estimate $E(y_i)$ and calculate a “naïve” estimate of the ATE
 We could observe the same group at different time periods (pre-post), but other things may have changed since the intervention
 Alternatively, we can observe a different group that was unexposed
[Figure: comparison of $E(y_i \mid d_i = 1, t+\Delta t)$ with $E(y_i \mid d_i = 0, t+\Delta t)$, where the unexposed group serves as the counterfactual. Causal effect?]

Fundamental problem of causal inference
 As with individuals, it is not possible to observe the same target population simultaneously under two different conditions
 Unlike their individual-level analogues, we can use our observed data to estimate $E(y_i)$ and calculate a “naïve” estimate of the ATE
 We could observe the same group at different time periods (pre-post), but other things may have changed since the intervention
 Alternatively, we can observe a different group that was unexposed
 Either way, the substitutes were not reasonable counterfactuals
 The pre-post comparison ignores “secular trends” (changes in other factors influencing the outcome since the intervention)
 The comparison of exposed and unexposed neglects that exposed individuals generally differ from unexposed individuals
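To make the last point explicit (this decomposition is added here and is not on the original slide, though it is the standard argument), the naïve exposed-vs-unexposed contrast mixes the causal effect with selection bias:

\begin{align*}
E(y_i \mid d_i = 1) - E(y_i \mid d_i = 0)
  &= \underbrace{E(y_i^1 - y_i^0 \mid d_i = 1)}_{\text{effect on the exposed}}
   + \underbrace{E(y_i^0 \mid d_i = 1) - E(y_i^0 \mid d_i = 0)}_{\text{selection bias}}
\end{align*}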

III. HOW TO EVALUATE?

[Diagram: research cycle]
IDENTIFY “POLICY EXPERIMENTS”. Prioritizing specific policy reforms with the potential to affect major sources of global morbidity and mortality prioritized by the UN Development Goals
EVALUATE.
 Quasi-experiments. Estimating policy impact, and inequalities by gender, SES, and urban-rural residence
 Mediation analysis. Examining mechanisms through which policies influence health targets
 Cost-effectiveness. Comparing the costs and benefits of policies that have a robust effect on health
SUPPORT EVIDENCE-BASED DECISION MAKING. Translating research findings for academic and non-academic audiences and identifying practical solutions for improving socioeconomic development and health
FEEDBACK. Refining research priorities based on knowledge created and changing priorities of network partners

Building data infrastructure

 We collect longitudinal data on social policies related to poverty, income and gender inequality for all LMICs
 Breastfeeding breaks at work (1995-current)

Were breastfeeding breaks guaranteed at work?

Building data infrastructure
 We collect longitudinal data on social policies related to poverty, income and gender inequality for all LMICs
 Breastfeeding breaks at work (1995-current)
 Minimum wage (1999-current)

How have minimum wages evolved over time?

Building data infrastructure
 We collect longitudinal data on social policies related to poverty, income and gender inequality for all LMICs
 Breastfeeding breaks at work (1995-current)
 Minimum wage (1999-current)
 Maternal and paternal leave policies (1995-current)
 Minimum age of marriage (1995-current)
 Family cash benefits (1999-current)
 Child labour (1995-current)

Building data infrastructure
 We collect longitudinal data on social policies related to poverty, income and gender inequality for all LMICs
 Breastfeeding breaks at work (1995-current)
 Minimum wage (1999-current)
 Maternal and paternal leave policies (1995-current)
 Minimum age of marriage (1995-current)
 Family cash benefits (1999-current)
 Child labour (1995-current)
 Expanding to other public policy areas…
 Join policy data to survey data from harmonized DHS/MICS or other sources to create multilevel datasets (a minimal merge sketch follows)
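For illustration only, a minimal sketch of how a country-year policy panel might be joined to harmonized survey records; the file and column names here are hypothetical and not from the presentation.

import pandas as pd

# Hypothetical inputs: a country-year policy panel and harmonized DHS/MICS birth records.
policies = pd.read_csv("policy_panel.csv")    # e.g., country, year, paid_leave_weeks, min_wage, ...
births = pd.read_csv("dhs_mics_births.csv")   # e.g., country, birth_year, infant_death, household covariates

# Attach to each birth the policy in force in its country and year of birth,
# yielding a multilevel dataset (individuals nested within country-years).
multilevel = births.merge(
    policies,
    left_on=["country", "birth_year"],
    right_on=["country", "year"],
    how="left",
)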

Applying empirical methods
 Difference-in-differences (DD) can be used when a policy is adopted or modified in some areas but not others: it compares the change over time in an outcome for a treatment group that experienced the reform vs. a control group that did not (a minimal estimation sketch follows)
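As a hedged illustration of the DD setup (variable and file names are hypothetical; this is not the presenters' actual specification), a two-group, two-period DD can be estimated with a treatment-by-period interaction:

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format panel: one row per area-period, with
#   treated = 1 for areas that adopted the reform,
#   post    = 1 for periods after adoption,
#   outcome = the health outcome of interest.
df = pd.read_csv("dd_panel.csv")

# The coefficient on treated:post is the DD estimate of the policy impact,
# valid under the parallel-trends assumption; errors are clustered by area.
model = smf.ols("outcome ~ treated + post + treated:post", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["area"]}
)
print(model.summary())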

[Diagram: conceptual model linking maternity leave, uptake of pre- & post-natal health services, prenatal maternity stress, health behaviors (i.e., breastfeeding, vaccination), and child health]

Increased leave lowered infant mortality
[Figure: change in mortality per 1,000 live births for each additional month of paid leave*]
*Results were robust to adjustment for individual, household, and country-level characteristics, including the wage replacement rate, GDP per capita, female labor force participation, government health expenditure per capita, and total health expenditure per capita

Applying empirical methods
 Difference-in-differences (DD) can be used when a policy is adopted or modified in some areas but not others: it compares the change over time in an outcome for a treatment group that experienced the reform vs. a control group that did not
 Regression discontinuity (RD) methods can be used when eligibility for a particular program is determined by whether individuals are below or above a threshold value on a continuously measured variable, such as age or income

RD measures the difference in post-intervention outcomes between units near the cutoff; e.g., those units that were just above the threshold and did not receive cash payments serve as the counterfactual comparison group (a minimal sketch follows)
Source: Gertler (2011)
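A minimal sketch of that logic (hypothetical variable names and cutoff, not the source's code or data): restrict to units within a bandwidth of the threshold and compare fitted outcomes on either side with a local linear regression.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: 'running' is the assignment variable (e.g., income),
# 'outcome' is the post-intervention outcome; eligibility requires running < cutoff.
df = pd.read_csv("rd_data.csv")
cutoff, bandwidth = 100.0, 10.0

# Keep observations near the cutoff and center the running variable at the threshold.
near = df[(df["running"] - cutoff).abs() <= bandwidth].copy()
near["centered"] = near["running"] - cutoff
near["eligible"] = (near["running"] < cutoff).astype(int)

# Local linear regression with separate slopes on each side of the cutoff;
# the coefficient on 'eligible' is the estimated effect at the threshold.
model = smf.ols("outcome ~ eligible * centered", data=near).fit()
print(model.params["eligible"])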

Future research directions
 Continuing to examine social policies that might influence health over the life-course, including early-life interventions
 Expanding to other public policy areas, including policies that can influence healthcare services, public health, social welfare, economic opportunity, gender equality, and the environment
 Assessing heterogeneous effects and impacts on social and gender inequalities in health
 Examining/monitoring implementation of policies and coverage of social determinants of health

Thanks!
Key collaborators: Efe Atabay, John Frank, Mohammad Hajizadeh, Sam Harper, Jody Heymann, Jay Kaufman, Lauren Maxwell, José Mendoza Rodriguez, Erin Strumpf, Ilona Vincent, and all of the research team
Partner institutions:
Funding:
For further information see