Presentation transcript:

What is this? How and why are they using it? Advantages and weaknesses in this evaluation.

Meta-analysis → looks at past studies on the same topic and summarizes the expected effect across many studies. +: can compare results (how similar are they?); omits less rigorous evals; less labor, cheap; provides a theoretical foundation for having a drug court; compensates for the limitations of the individual methods; more confidence that there is an effect. -: observed differences may be due to differences in implementation.

Multivariate regression with statistical controls → gave a general idea of the relationships (an overview of what affected recidivism rates) and identified which variables influence recidivism (used for the risk score matching). +: easy. -: unobservable characteristics remain uncontrolled.

Propensity score analysis → match those who opted in to those who did not, based on their propensity to participate; among eligible offenders, everyone was given a rating from a logit model. +: deals with selection; if the matches are close, the remaining difference is just the program effect. -: some unobservables are still not accounted for.

Risk score matching → matching on the factors associated with recidivism, again via a logit model. +: deals with selection to the extent that selection is related to recidivism. -: started with 14 variables and reduced the covariates so that more matches could be made.

Reflexive control → a pre-post measure: select the same type of people from before the program existed, to control for changes in recidivism over time due to other causes. +: accounts for other changes in recidivism. -: maturation; does not control for unmeasured innate differences.

Cost-benefit analysis → to see whether the program's cost produces more in savings; here, an estimated $1.74 in savings per dollar spent. +: quantifies the benefits, giving a single number to gauge whether the program is worth it; counts purely court costs, so the benefits are understated (conservative, and therefore more trustworthy? it depends on the goal of the evaluation). -: omits long-term impacts such as employment and health care.

They didn't use these. Why not? Instrumental variables → they couldn't find one.
Regression discontinuity → not possible here: participants elect in, and there was no quantifiable score that admitted people into treatment (and assigning by a score would raise ethical concerns). The appeal of such designs is an apples-to-apples comparison that controls for selection, yielding a better (more precise) estimate of program impacts.
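For contrast, this is what the discontinuity idea looks like when an assignment score does exist: compare outcomes for units just below and just above the cutoff. A minimal sketch only; the cutoff, bandwidth, scores, and outcomes are all hypothetical, since no such score existed in this evaluation.

```python
# Simple local regression-discontinuity estimate: difference in mean
# outcomes on either side of the cutoff, within a bandwidth. Hypothetical data.

def rd_estimate(units, cutoff, bandwidth):
    """Mean outcome for treated (score >= cutoff) minus untreated
    (score < cutoff), using only units within `bandwidth` of the cutoff."""
    near = [u for u in units if abs(u["score"] - cutoff) <= bandwidth]
    above = [u["outcome"] for u in near if u["score"] >= cutoff]
    below = [u["outcome"] for u in near if u["score"] < cutoff]
    return sum(above) / len(above) - sum(below) / len(below)

units = [
    {"score": 48, "outcome": 0.9},
    {"score": 49, "outcome": 0.8},
    {"score": 50, "outcome": 0.4},  # cutoff at 50: score >= 50 gets treatment
    {"score": 51, "outcome": 0.5},
    {"score": 40, "outcome": 1.0},  # far from the cutoff, ignored
]

print(rd_estimate(units, cutoff=50, bandwidth=2))  # roughly -0.4
```

The comparison is apples-to-apples because units just above and just below the cutoff should be nearly identical except for treatment, which is exactly the selection control this evaluation could not obtain.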