Econometrics with Observational Data: Research Design
Todd Wagner

Research Design
Goal: evaluate behaviors and identify causation
– Policy X caused effect Y
– Medication A resulted in B hospitalizations
The unit of analysis can be individual or organizational

Research Methods
Random assignment? Yes → Intent to treat?
– Yes → basic RCT analysis
– No → on-treatment analysis

On-Treatment Analysis
RCT comparing drug A to drug B
Adherence is 70% for drug A and 40% for drug B
What does an on-treatment comparison of A versus B tell us? Because adherence differs so much across arms, the comparison mixes the drug effect with whatever drives adherence, and it no longer enjoys the protection of randomization.

Research Methods
Is there random assignment? Yes → randomized trial
No → Is there a control group?
– Yes → quasi-experimental design
– No → descriptive study

Quasi-Experimental Designs
– Difference-in-differences (most common in health)
– Regression discontinuity
– Switching replications
– Non-equivalent dependent variables

Difference-in-Differences
AKA: DD, D-in-D, or diff-in-diff
Differences across time and arms
– Usually two arms: treatment and control
– In theory can be used with 3+ arms

Methods for Identifying Controls
– Inherent matching: find similar individuals not getting the treatment to serve as controls (e.g., twins)
– Statistical: use statistical techniques to identify the best comparison groups
– Location: use other geographic sites, states, or regions as controls

Unit of Analysis
D-in-D works for different units of analysis
– person: people followed over time
– site: sites followed over time
– state: states followed over time
May need to make some analytical changes depending on the unit of analysis

Diff-in-Diff Example
Gruber, Adams, and Newhouse (1997)
Tennessee increased Medicaid fees for primary care services (goal: encourage office-based care and decrease hospital-based ambulatory care)
What is the effect of this policy change?

Research Designs
– Difference-in-differences
– Regression discontinuity
– Switching replications
– Non-equivalent dependent variables

Regression Discontinuity
Participants are assigned to the program or comparison group solely on the basis of an observed measure (e.g., an education test or a means test)
Appropriate when we wish to target a program or treatment to those who most need or deserve it

Regression Discontinuity
Partial coverage (not everyone gets the treatment)
Requires the selection mechanism to be fully known
The selection mechanism must be consistently applied to all persons

RD Design Graphically
Figure (source: Urban Institute): the threshold MUST be known and consistently applied; test the jump at the threshold for significance.
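A minimal sketch of this test in Stata, assuming a hypothetical dataset with outcome y, assignment score score, and a known cutoff of 50; the variable names, cutoff, and the ±10-point bandwidth are illustrative, not from any study cited here:

```stata
* Local linear regression discontinuity around a known, consistently
* applied threshold (hypothetical variables: y, score; cutoff = 50)
gen score_c = score - 50               // center the score at the cutoff
gen treat = (score >= 50)              // the fully known assignment rule
* fit separate slopes on each side within a +/-10 point bandwidth
regress y i.treat##c.score_c if abs(score_c) <= 10, vce(robust)
* the coefficient on 1.treat estimates the jump at the threshold
```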

Research Designs
– Difference-in-differences
– Regression discontinuity
– Switching replications
– Non-equivalent dependent variables

Switching Replications
Has two groups and three waves of measurement
AKA a waitlist control group
This design is sometimes used in randomized trials

Example from a Pap Smear Study
Figure: cumulative % followed up more than 12 months since the initial Pap, comparing the intervention arm (immediate treatment) with the control arm (delayed treatment).

Research Designs
– Difference-in-differences
– Regression discontinuity
– Switching replications
– Non-equivalent dependent variables

Non-Equivalent DVs
Analyze a dependent variable that should not be affected by the intervention
Example: an intervention is designed to affect the quality of diabetes care, but one could also test whether it affected the quality of asthma care
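A minimal sketch of this falsification check as a pair of DD regressions in Stata; the variable names (g, t, diabetes_q, asthma_q) are hypothetical:

```stata
* Non-equivalent DV check: the intervention should move diabetes quality
* but leave asthma quality alone (hypothetical variables throughout)
regress diabetes_q i.g##i.t, vce(robust)   // expect a nonzero interaction
regress asthma_q   i.g##i.t, vce(robust)   // expect an interaction near zero
```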

Notes on the Analysis of DD data

Analytical Methods
Plot or graph the unadjusted data
Graduate to more complex models
Address, if possible, model limitations
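A minimal sketch of the first step in Stata, assuming the site-level panel shown on the dataset slide below (avgcost, exp, year); purely illustrative:

```stata
* Plot unadjusted arm means over time before fitting any model
preserve
collapse (mean) avgcost, by(exp year)      // mean cost per arm per year
twoway (connected avgcost year if exp==0) ///
       (connected avgcost year if exp==1), ///
       legend(order(1 "Control" 2 "Treatment")) ytitle("Average cost")
restore
```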

DD Raw Data†
(standard deviations in parentheses)

                             Baseline                1-Year Follow-Up
                             Exp.      Control       Exp.      Control       DD+
Utilization entry (% yes)    84.5      86.1          88.9      86.8          3.7
                             (36.2)    (34.6)        (31.4)    (33.9)
No. of visits (0-16)         –         –             –         –             –
                             (4.28)    (4.36)        (4.00)    (4.07)

+ DD = (Exp_followup − Exp_baseline) − (Control_followup − Control_baseline)
† Unadjusted estimates
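Plugging the utilization row into the DD formula reproduces the reported estimate:

$\text{DD} = (88.9 - 84.5) - (86.8 - 86.1) = 4.4 - 0.7 = 3.7$ percentage points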

Diff-in-Diff Model
$Y = \beta_0 + \beta_1 G + \beta_2 T + \beta_3 GT + \beta X + \varepsilon$
Y = outcome
G = group (0 = control, 1 = treatment)
T = time (0 = baseline, 1 = follow-up)
X = characteristics of the person, place, etc.
$\varepsilon$ = error term

Program Effect
$\text{Outcome} = \beta_0 + \beta_1 G + \beta_2 T + \beta_3 GT + \beta X + \varepsilon$
If $\beta_3 = 0$, the program has no effect
Limited statistical power: testing interactions increases the risk of a type II error
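A minimal sketch of this test in Stata, using the hypothetical panel variables from the dataset slide that follows (avgcost, exp, yr_d, clustered on sta3n):

```stata
* DD regression: the exp x yr_d interaction coefficient is beta_3
regress avgcost i.exp##i.yr_d, vce(cluster sta3n)
* test H0: beta_3 = 0 (no program effect)
test 1.exp#1.yr_d
```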

Organizing the Dataset
Example panel data listed in Stata: one row per medical center (sta3n) per year, with variables avgcost (average cost), exp (1 = treatment group), yr_d (1 = post period), and year.
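A minimal sketch of setting up such a panel in Stata; the 1996 adoption year is an assumption chosen purely for illustration:

```stata
* Declare the panel: medical centers (sta3n) observed over years
xtset sta3n year
* yr_d flags post-adoption years (1996 is a hypothetical adoption year)
gen yr_d = (year >= 1996)
list sta3n year exp yr_d avgcost in 1/10, clean
```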

Identification
$\text{Outcome} = \beta_0 + \beta_1 G + \beta_2 T + \beta_3 GT + \beta X + \varepsilon$
How do you obtain an unbiased estimate of $\beta_3$?
For an unbiased estimate of the GT effect, G must not be correlated with $\varepsilon$; that is, G must be exogenous

Identification
$\text{Outcome} = \beta_0 + \beta_1 G + \beta_2 T + \beta_3 GT + \beta X + \varepsilon$
G may be endogenous
Selection bias is a type of endogeneity
– caused by non-random assignment
– the outcome and G (group) affect each other: causality runs both ways
– impact: $\beta_3$ is biased

Example: VA Residential Treatment
Wagner, T. H., & Chen, S. (2005). An economic evaluation of inpatient residential treatment programs in the Department of Veterans Affairs. Med Care Res Rev, 62(2).

Residential Treatment Programs
RTPs provide mental health and substance use treatment
RTPs were designed to
– treat eligible veterans in a less-intensive and more self-reliant setting
– provide cost-effective care that "promotes independence and fosters responsibility"

Objectives
1. Did the RTPs save money?
2. Were savings a "one-time" event, or do they continue to accrue?

Design Choice
The selection mechanism is not observed, so regression discontinuity can't be used
We know who adopted RTPs and when, so DD is feasible

Methods
Built a longitudinal dataset for all VA medical centers
Tracked approved RTP programs (N=43)
Merged data from the PTF and CDR to track
– total MH inpatient days (PTF) and dollars (CDR)
– total SA inpatient days (PTF) and dollars (CDR)

Outcomes
Department-level costs
– average cost per MH day
– average cost per SA day
– total MH/SA department costs
Sensitivity analyses
– outpatient MH/SA costs
– FTE

Multivariate Models
Fixed-effects models¹
– DV: department-level costs
– controlled for medical center size
– inflation adjusted to 1999 dollars using the CPI
– year dummies
– wage index
¹ Random-effects estimates were similar, and Hausman tests were not significant; fixed effects were the more conservative choice.
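A minimal sketch of the fixed- vs. random-effects comparison described in the footnote, with hypothetical variable names (cost_mh for the department cost, size and wage_idx for the controls):

```stata
* Fixed vs. random effects with a Hausman specification test
xtset sta3n year
xtreg cost_mh i.exp##i.yr_d size wage_idx i.year, fe
estimates store fe
xtreg cost_mh i.exp##i.yr_d size wage_idx i.year, re
estimates store re
hausman fe re   // not significant => RE consistent; FE kept as more conservative
```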

Results: Mental Health
Average cost savings of $81 per day (p<0.01)
Savings do not appear to be increasing over time

Mental Health Costs

Results: Substance Abuse
Average cost savings of $112 per day (p<0.01)
Savings do not appear to be increasing over time

Substance Abuse Costs

Sensitivity Analysis
RTPs were associated with a slight decrease in the costs of outpatient psychiatry
RTPs were associated with a decrease in FTE

Limitations
Not clear whether the RTPs could do better: are they treating the right patients?
Endogeneity of RTPs
– 1- and 2-year lags (medical centers with RTPs in 1994 and 1995) are not associated with costs
– there does not appear to be self-selection into RTPs

Any Questions?

Design References
Trochim, W. Research Methods Knowledge Base.
Rossi, P. H., and H. E. Freeman. Evaluation: A Systematic Approach. 5th ed. New York: Sage, 1993.

Regression References
W. Greene. Econometric Analysis.
J. Wooldridge. Econometric Analysis of Cross Section and Panel Data.

You've Almost Made It
June 11th: Mark Smith, Endogeneity
TBA: Todd Wagner, Using Stata