
ENDOGENEITY - SIMULTANEITY Development Workshop

What is endogeneity and why don't we like it? [REPETITION]
Three causes:
– X influences Y, but Y reinforces X too
– Z causes both X and Y roughly contemporaneously
– X causes Y, but we cannot observe X, and Z (which we do observe) is influenced by X but also by Y
Consequences:
– No matter how many observations we have, the estimators remain biased (this is called inconsistency)
– Ergo: whatever point estimates we find, we cannot even tell whether the effect is positive/negative/significant, because we do not know the size of the bias and have no way to estimate it
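To see why more data cannot fix this, here is a purely illustrative simulation (hypothetical numbers, plain numpy) of the first cause above: X influences Y while Y feeds back into X. OLS stays far from the true coefficient no matter how large the sample is.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000                       # many observations do not help
beta, gamma = 0.5, 0.8            # true effects: X -> Y and the feedback Y -> X

u = rng.normal(size=n)            # shock to Y
v = rng.normal(size=n)            # shock to X

# Reduced form of the simultaneous system  Y = beta*X + u,  X = gamma*Y + v
y = (beta * v + u) / (1 - beta * gamma)
x = (gamma * u + v) / (1 - beta * gamma)

# OLS of Y on X is inconsistent because X is correlated with the error u
design = np.column_stack([np.ones(n), x])
beta_hat = np.linalg.lstsq(design, y, rcond=None)[0][1]
print(f"true beta = {beta}, OLS estimate = {beta_hat:.3f}")   # ~0.79, not 0.5
```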

The magic of „ceteris paribus”
Every regression coefficient is in fact a ceteris paribus statement.
Problem: the data may be at odds with ceteris paribus.
Examples?

Problems with inferring causal effects from regressions
Regressions tell us about correlations, but 'correlation is not causation'.
Example: regression of whether a person currently has a health problem on whether they have been in hospital in the past year:
HEALTHPROB |   Coef.   Std. Err.    t
PATIENT    |
_cons      |
Do hospitals make you sick? Is this a causal effect?
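A hedged sketch of the hospital example with made-up data (numpy + statsmodels; the variable names follow the slide, everything else is invented): chronically ill people both visit hospital and report health problems more often, so the regression slope on PATIENT is positive even though hospitals cause no harm in this simulation.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 10_000
illness = rng.binomial(1, 0.2, n)                  # latent chronic illness
patient = rng.binomial(1, 0.1 + 0.6 * illness)     # the ill visit hospital more often
healthprob = rng.binomial(1, 0.1 + 0.7 * illness)  # ...and report problems more often

res = sm.OLS(healthprob, sm.add_constant(patient)).fit()
print(res.params)   # positive slope on PATIENT, yet hospitals cause nothing here
```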

The problem for causal inference in the case of simultaneity
[Diagram: Treatment → Outcome, with a confounding influence affecting both; observed and unobserved factors]

Any solutions?
[Diagram repeated: Treatment → Outcome with the confounding influence; observed and unobserved factors]

Instrumental Variables solution…
[Diagram: Instrumental Variable(s) shift Treatment; Treatment → Outcome; the confounding influence affects Treatment and Outcome but not the instrument; observed and unobserved factors]
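A minimal sketch of the idea behind the diagram, assuming a valid instrument z that shifts the treatment but affects the outcome only through it (hand-rolled two-stage least squares on simulated data):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50_000
z = rng.normal(size=n)                  # instrument: shifts X, excluded from Y
u = rng.normal(size=n)                  # unobserved confounder in the error
x = 0.8 * z + u + rng.normal(size=n)    # treatment X is endogenous
y = 0.5 * x + u + rng.normal(size=n)    # true effect of X on Y is 0.5

def ols(design, target):
    return np.linalg.lstsq(design, target, rcond=None)[0]

# Stage 1: project X on the instrument; Stage 2: regress Y on the fitted X
Z = np.column_stack([np.ones(n), z])
x_hat = Z @ ols(Z, x)
beta_2sls = ols(np.column_stack([np.ones(n), x_hat]), y)[1]
beta_ols = ols(np.column_stack([np.ones(n), x]), y)[1]
print(f"OLS: {beta_ols:.3f} (biased)   2SLS: {beta_2sls:.3f} (close to 0.5)")
```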

Fixed Effects solution… (DiD does pretty much the same)
[Diagram: Treatment → Outcome; fixed influences capture part of the confounding; observed and unobserved factors]
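A small illustration of the fixed-effects logic, assuming the confounder is constant within a unit over time: demeaning each unit's data (the within transformation) wipes it out, while pooled OLS remains biased. Simulated data, plain numpy/pandas.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
n_units, n_periods = 1_000, 5
alpha = np.repeat(rng.normal(size=n_units), n_periods)      # unobserved fixed influence
unit = np.repeat(np.arange(n_units), n_periods)
x = alpha + rng.normal(size=n_units * n_periods)            # X correlated with alpha
y = 0.5 * x + alpha + rng.normal(size=n_units * n_periods)  # true effect of X is 0.5

df = pd.DataFrame({"unit": unit, "x": x, "y": y})
# Within transformation: demeaning by unit removes the fixed influence alpha
dem = df[["x", "y"]] - df.groupby("unit")[["x", "y"]].transform("mean")
beta_fe = np.linalg.lstsq(dem[["x"]].to_numpy(), dem["y"].to_numpy(), rcond=None)[0][0]
beta_ols = np.linalg.lstsq(np.column_stack([np.ones(len(x)), x]), y, rcond=None)[0][1]
print(f"pooled OLS: {beta_ols:.3f}   within/FE: {beta_fe:.3f}   (true 0.5)")
```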

Short motivating story – ALMPs in Poland
Basic statement: 50% of the unemployed found employment because of ALMPs.
Facts:
– 50% of whom? Only of those who were treated (only they were monitored)
– only 90% of the treated completed the programmes
– of those who completed, indeed 50% work, but only 60% of those who work say it was because of the programme
So how many were actually employed because of the programme?

Short motivating story – ALMPs in Poland
[Flow chart: completed training (90%) → … found employment thanks to the programme (???); product of the shares; gross effectiveness vs. net effectiveness; net efficiency?]
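One back-of-the-envelope reading of the figures above (not an official net-effectiveness calculation): chaining the shares gives roughly 27%, far below the headline 50%.

```python
completed  = 0.90   # share of the treated who completed the programme
employed   = 0.50   # share of completers who found employment
attributed = 0.60   # share of the employed who credit the programme
print(f"{completed * employed * attributed:.0%} of the treated "
      "found employment thanks to the programme")   # -> 27%
```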

Basic problems in causal inference
Compare somebody „before” and „after”
– If they were already different before, the differential will be wrongly attributed to the „treatment”
→ can we measure/capture this inherent difference?
→ does it stay unchanged between „before” and „after”?
→ what if we only observe „after”?
If the difference stays the same => DiD estimator => an assumption that cannot be tested
What if the difference cannot be believed to stay the same?
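The "difference stays the same" case is the difference-in-differences estimator; a sketch of it in standard notation (with the untestable parallel-trends assumption that the before/after change of the untreated mirrors what the treated would have experienced without treatment):

```latex
\widehat{\mathrm{ATT}}_{\mathrm{DiD}}
 = \left(\bar{Y}^{\,\mathrm{treated}}_{\mathrm{after}} - \bar{Y}^{\,\mathrm{treated}}_{\mathrm{before}}\right)
 - \left(\bar{Y}^{\,\mathrm{control}}_{\mathrm{after}} - \bar{Y}^{\,\mathrm{control}}_{\mathrm{before}}\right)
```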

Fake counterfactual, or generating a parallel world
o MEDICINE: takes control groups – people who are equally sick and get a different treatment or a placebo => experimenting
o What if an experiment is impossible?

What if an experiment is impossible?
Only cross-sectional data:
– „Regression Discontinuity Design“
– „Propensity Score Matching“
– Instrumental variables
Panel data:
– Before-After Estimators
– Difference in Difference Estimators (DiD)
– „Propensity Score Matching“ + DiD

Propensity Score Matching
[Diagram: Treatment → Outcome with the confounding influence; observed and unobserved factors]

Propensity score matching
Group            | Y1                                 | Y0
Treated (D=1)    | observed                           | counterfactual – (does not exist)
Nontreated (D=0) | counterfactual – (does not exist)  | observed
Average treatment effect: E(Y1-Y0) = E(Y1) - E(Y0)
Average treatment effect for the untreated: E(Y1-Y0|D=0) = E(Y1|D=0) - E(Y0|D=0)
Average treatment effect for the treated (ATT): E(Y1-Y0|D=1) = E(Y1|D=1) - E(Y0|D=1)
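The counterfactual cells in the table are never observed; matching fills them in under the standard (untestable) conditional-independence assumption, sketched here in the notation of the slide:

```latex
(Y_0, Y_1) \;\perp\!\!\!\perp\; D \mid X
\quad\Longrightarrow\quad
E(Y_0 \mid D=1, X) = E(Y_0 \mid D=0, X)
```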

Propensity Score Matching
Idea
– Compares outcomes of similar units where the only difference is treatment; discards the rest
Example
– Low-ability students will have lower future achievement, and are also likely to be retained in class
– A naïve comparison of untreated/treated students creates bias, where the untreated do better in the post period
– Matching methods make the proper comparison
Problems
– If similar units do not exist, this estimator cannot be used

How to get the PSM estimator?
– First stage: regress the „treatment” indicator on observable characteristics
– Second stage: use that model to estimate each unit's probability of „treatment” (the propensity score)
– Third stage: compare outcomes of the „treated” with those of similar non-treated units („statistical twins”)
– The less similar two units are, the less they should be compared with one another
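A minimal sketch of these three stages in Python (statsmodels logit for the propensity score, 1:1 nearest-neighbour matching with replacement; the column names y, d, x1, x2 are hypothetical, and real work would use a dedicated matching package and check covariate balance):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def att_nearest_neighbour(df, outcome="y", treat="d", covars=("x1", "x2")):
    """1:1 nearest-neighbour PSM sketch (hypothetical column names)."""
    # Stages 1-2: model treatment on observables and predict the propensity score
    X = sm.add_constant(df[list(covars)])
    pscore = sm.Logit(df[treat], X).fit(disp=0).predict(X)
    df = df.assign(pscore=pscore)

    treated = df[df[treat] == 1]
    control = df[df[treat] == 0]

    # Stage 3: for each treated unit, take the control with the closest pscore
    dist = np.abs(treated["pscore"].to_numpy()[:, None]
                  - control["pscore"].to_numpy()[None, :])
    matched_y0 = control[outcome].to_numpy()[dist.argmin(axis=1)]
    return (treated[outcome].to_numpy() - matched_y0).mean()
```

Caliper matching (next slides) would simply discard treated units whose nearest distance exceeds a chosen caliper.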

The value of the obtained propensity score is irrelevant in itself (as long as it is consistent)
NEAREST NEIGHBOR (NN)
Pros => the so-called 1:1 matching
Cons => if a close 1:1 match does not exist, it makes no sense

The obtained propensity score is irrelevant (as long as consistent)
CALIPER/RADIUS MATCHING
Pros => more flexible than NN
Cons => who specifies the radius/caliper?

The obtained propensity score is irrelevant (as long as consistent)
STRATIFICATION AND INTERVAL MATCHING
Pros => eliminates discretion in the radius/caliper choice
Cons => within a stratum/interval, units do not have to be „similar” (some people say 10 strata are enough)

The obtained propensity score is irrelevant (as long as consistent)
KERNEL MATCHING (KM)
Pros => always uses all observations
Cons => need to remember about common support
[Figure: propensity-score distributions of the Treatment and Control groups]
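A sketch of the kernel idea under an assumed Epanechnikov kernel and bandwidth: every treated unit is compared with a weighted average of all controls, and treated units with no controls nearby (outside common support) are dropped.

```python
import numpy as np

def att_kernel(y, d, pscore, bandwidth=0.05):
    """Kernel-matching ATT sketch (hypothetical bandwidth, Epanechnikov kernel)."""
    y, d, pscore = map(np.asarray, (y, d, pscore))
    y_t, y_c = y[d == 1], y[d == 0]
    p_t, p_c = pscore[d == 1], pscore[d == 0]

    u = (p_t[:, None] - p_c[None, :]) / bandwidth             # scaled pscore distances
    k = np.where(np.abs(u) <= 1, 0.75 * (1 - u ** 2), 0.0)    # Epanechnikov weights
    inside = k.sum(axis=1) > 0                                 # treated with controls nearby
    w = k[inside] / k[inside].sum(axis=1, keepdims=True)       # row-normalised weights
    counterfactual = w @ y_c                                    # weighted control outcome
    return (y_t[inside] - counterfactual).mean()
```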

What is „common support”?
The distributions of the pscore may differ substantially across the treated and non-treated groups.
Restricting comparisons to the region where the distributions overlap is the only sensible solution!
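A simple min/max rule for imposing common support (one of several possibilities; density-based trimming rules are also common):

```python
import numpy as np

def common_support_mask(pscore, d):
    """Keep units whose propensity score lies where both groups have observations."""
    p, treated = np.asarray(pscore), np.asarray(d).astype(bool)
    lo = max(p[treated].min(), p[~treated].min())   # highest of the two group minima
    hi = min(p[treated].max(), p[~treated].max())   # lowest of the two group maxima
    return (p >= lo) & (p <= hi)
```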

Real world examples

Next week – practical exercise
Read the papers posted on the web.
I will post the one that we will replicate soon…