
1 Schneider Institute for Health Policy, The Heller School for Social Policy and Management, Brandeis University
The Relationship between Performance on Medicare’s Process Quality Measures and Mortality: Evidence of Correlation, not Causation
Andrew Ryan, Doctoral Candidate

2 Acknowledgements
Support: Agency for Healthcare Research and Quality; Jewish Healthcare Foundation
Dissertation Committee: Stan Wallack, Chris Tompkins, Deborah Garnick, Kit Baum

3 Outline
- Measurement of quality in health care
- Literature on the relationship between process and outcome measures in health care
- Methods
- Results
- Policy implications

4 Value Based Purchasing
- The 2005 Deficit Reduction Act called for hospital Value Based Purchasing (VBP)
- To be implemented by the Centers for Medicare and Medicaid Services (CMS)
- VBP will integrate pay for performance (P4P) and public quality reporting into Medicare hospital care
- The ultimate design of VBP (performance measures, payout rules, competition pool) is unclear

5 Measurement of quality in health care
- Quality measurement is central to public quality reporting and pay for performance
- Process measures assess whether “what is known to be ‘good’ medical care has been applied” (Donabedian, 1966)
- Process performance measurement has dominated quality measurement
  - Hospital Compare
  - CMS/Premier Hospital Quality Incentive Demonstration
- Why?
  - Providers are deemed to have control over performance
  - Greater statistical power
  - Process measures provide “actionable” information for improvement (Birkmeyer et al. 2006; Mant 2001)
  - Fear of inappropriately labeling hospitals as “bad” due to random variation in outcomes
  - Fear that use of outcome measures will lead to avoidance of higher-risk patients

6 Potential problems with process measures
- A process measure might:
  - not appropriately represent a process that is positively associated with patient health
  - represent a process that is positively associated with patient health but has been rendered obsolete by advances in clinical practice (Porter and Teisberg 2006)
- The complicated nature of inpatient care may make process measures an inadequate proxy for quality due to limited scope
- Implementation of process measurement may alter the relationship between the observed process measures and patient outcomes:
  - measurement error
  - enhanced record keeping, or gaming
  - multitasking

7 CMS/Joint Commission Starter Set Process Measures
- Hospital Compare: voluntary, internet-based public quality reporting of hospital care, implemented by CMS in 2003
- After low rates of voluntary reporting in 2003, CMS made hospitals’ 2004 payment update conditional on reporting for 10 of the 17 indicators; reporting increased dramatically
- CMS/Joint Commission Starter Set measures
  - AMI: aspirin at arrival/discharge; β blocker at arrival/discharge; use of ACE inhibitor
  - Heart failure: use of ACE inhibitor; assessment of left ventricular function
  - Pneumonia: oxygenation assessment; timing of initial antibiotics; pneumococcal vaccination
- Exception reporting: hospitals have discretion to exclude patients from the calculation of process performance

8 Research Question: Relationship of Process Measures to Outcome
For AMI, heart failure, and pneumonia:
1) Are the CMS/Joint Commission process measures correlated with hospital mortality?
2) Are the CMS/Joint Commission process measures causally related to hospital mortality?

9 Effects of exception reporting
- Greater exception reporting may improve process performance
- The association between process performance and mortality may be weaker for hospitals that report more exceptions
- Supplemental analysis to examine:
  - the relationship between exception reporting and process performance
  - the effect of exception reporting on the relationship between process performance and mortality

10 Previous research finds association between processes and outcomes
- Association between process measures and mortality for AMI, heart failure, and pneumonia: Bradley et al. 2006; Eagle et al. 2005; Fonarow et al. 2007; Granger et al. 2005; Luthi et al. 2004; Luthi et al. 2003; Peterson et al. 2006; Werner and Bradlow 2006; Jha et al. 2007
- Limitations of these studies:
  - employed cross-sectional designs
  - results may be confounded by unobserved factors

11 Data and Methods

12 Data
- Medicare fee-for-service inpatient claims and denominator files (2004-2006)
  - Primary diagnoses for which beneficiaries were admitted
  - Secondary diagnoses, demographics, and type of admission for risk adjustment
  - Discharge status to exclude transfer patients
  - Outcome measure: 30-day mortality
- 2006 Medicare hospital characteristics
- 2004-2006 Hospital Compare data
  - Hospital process performance

13 Composite process quality measure
- Defined as the z-score of the weighted sum of z-scores for the process measures corresponding to each condition
- Each process measure is transformed to a z-score to avoid bias
  - Bias could result from a positive correlation between the likelihood of reporting on a measure and performance on that measure
- The sum of the weighted z-scores is itself transformed to a z-score
  - Facilitates interpretation
- Computed for all hospitals reporting a denominator of at least 10 patients for at least one measure
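A minimal sketch of this composite construction, written in Python/pandas purely for illustration (the original analysis was presumably done in a statistical package; the denominator-share weighting and all column names below are assumptions rather than details from the presentation):

```python
import numpy as np
import pandas as pd

def composite_quality(df, measure_cols, denom_cols, min_denom=10):
    """Composite = z-score of the weighted sum of per-measure z-scores.

    df has one row per hospital; measure_cols hold performance rates and
    denom_cols the corresponding patient denominators (hypothetical names).
    """
    # Keep hospitals reporting a denominator of at least 10 on at least one measure
    df = df[(df[denom_cols] >= min_denom).any(axis=1)].copy()

    z = {}
    for m, d in zip(measure_cols, denom_cols):
        # Count a measure only where its denominator is >= 10, then z-score it
        # to limit bias from selective reporting of high-performing measures
        rate = df[m].where(df[d] >= min_denom)
        z[m] = (rate - rate.mean()) / rate.std()
    z = pd.DataFrame(z)

    # Weight each measure's z-score by its share of the hospital's reported
    # denominators (an assumed weighting scheme; the slide does not state the weights)
    w = df[denom_cols].where(df[denom_cols] >= min_denom)
    w = w.div(w.sum(axis=1), axis=0)
    weighted_sum = np.nansum(z.values * w.values, axis=1)

    # Final z-transform of the weighted sum (mean 0, s.d. 1) for interpretability
    composite = (weighted_sum - weighted_sum.mean()) / weighted_sum.std()
    return pd.Series(composite, index=df.index, name="process_composite")
```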

14 Risk adjustment of 30-day mortality
- AMI, heart failure, pneumonia
- Hospital-level observed / expected mortality
- Expected mortality estimated from patient-level logit models in which mortality was regressed on:
  - age, gender, race
  - Elixhauser comorbidities (Elixhauser et al. 1998)
  - type of admission (emergency, urgent, elective)
  - season of admission
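A hedged sketch of the observed/expected adjustment, again in Python for illustration only; the column names and the use of statsmodels are assumptions, and the model would be fit separately for each condition:

```python
import pandas as pd
import statsmodels.formula.api as smf

def risk_adjusted_ratio(patients: pd.DataFrame) -> pd.Series:
    """Hospital-level observed/expected 30-day mortality for one condition.

    `patients` has one row per admission with hypothetical columns:
    died_30d (0/1), age, female, race, admission_type, season,
    elix_* comorbidity indicators, and hospital_id.
    """
    elix = [c for c in patients.columns if c.startswith("elix_")]
    formula = ("died_30d ~ age + C(female) + C(race) + C(admission_type) "
               "+ C(season) + " + " + ".join(elix))

    # Patient-level logit of 30-day mortality on the risk adjusters
    logit = smf.logit(formula, data=patients).fit(disp=False)
    patients = patients.assign(expected=logit.predict(patients))

    # Hospital-level observed / expected mortality
    by_hospital = patients.groupby("hospital_id")
    return (by_hospital["died_30d"].mean()
            / by_hospital["expected"].mean()).rename("ra_mortality")
```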

15 Two Specifications for Relationship of Process Measures to Mortality
- #1: Pooled cross section
- #2: Add hospital fixed effects
- Then consider the impact of hospitals’ exception reporting for both specifications

16 Analysis #1: Pooled Cross Section

$\ln(\text{RA Mortality}_{jkt}) = b_0 + b_1 Z_j + b_2 \text{year}_t + b_3 (Z_j \times \text{year}_t) + \delta_1 \text{Process}_{jkt} + \delta_2 \text{Process}_{jkt}^2 + e_{jkt}$

where:
- j indexes hospitals, k indexes condition (AMI, heart failure, or pneumonia), t indexes year (2004-2006)
- Z is a vector of hospital characteristics (bed size, ratio of residents / average daily census, urbanicity, ownership, % of Medicare admissions)
- Process is the composite quality measure
- year is a vector of dummy variables for 2005 and 2006
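Specification #1 might be estimated along these lines (an illustrative statsmodels sketch, not the presenter's actual code; `hospital_years` and all variable names are hypothetical, with one row per hospital-year for a given condition, since the models appear to be estimated separately by condition):

```python
import numpy as np
import statsmodels.formula.api as smf

# Hospital characteristics Z, year dummies, Z-by-year interactions,
# and a quadratic in the process composite; dependent variable is ln(RA mortality)
pooled_formula = (
    "np.log(ra_mortality) ~ beds + resident_ratio + C(urban) + C(ownership) "
    "+ pct_medicare + C(year) "
    "+ (beds + resident_ratio + C(urban) + C(ownership) + pct_medicare):C(year) "
    "+ process + I(process ** 2)"
)
pooled = smf.ols(pooled_formula, data=hospital_years).fit()
print(pooled.params[["process", "I(process ** 2)"]])
```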

17 Analysis #2: Hospital Fixed Effects

$\ln(\text{RA Mortality}_{jkt}) = b_0 + b_2 \text{year}_t + b_3 (Z_j \times \text{year}_t) + \delta_1 \text{Process}_{jkt} + \delta_2 \text{Process}_{jkt}^2 + h_j + e_{jkt}$

where h is a vector of hospital-specific fixed effects.
The inclusion of hospital-specific effects controls for unobserved, time-invariant factors at the hospital level (e.g., physician skill/experience, technology) that may confound the relationship between processes of care and outcomes.
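Continuing the same illustrative sketch, specification #2 can be approximated by absorbing the hospital fixed effects as dummies via C(hospital_id); the time-invariant hospital characteristics drop out because they are collinear with the fixed effects:

```python
fe_formula = (
    "np.log(ra_mortality) ~ C(year) "
    "+ (beds + resident_ratio + C(urban) + C(ownership) + pct_medicare):C(year) "
    "+ process + I(process ** 2) + C(hospital_id)"
)
# With thousands of hospitals, a within (demeaning) estimator would be faster,
# but dummy absorption yields the same coefficient estimates for this sketch
fe = smf.ols(fe_formula, data=hospital_years).fit()
```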

18 Standard error specification
- Multiple observations from the same hospitals over time give rise to group-level heteroskedasticity
  - Cluster-robust standard errors are estimated (Williams 2000)
- Hospital-level RA mortality rates vary in their precision as a result of the number of patients in the denominator of the calculation
  - Analytical weights (Gould 1994), based on the average number of patients in the denominator of hospitals' RA mortality calculation over the study years, are employed
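In the same hypothetical framework, cluster-robust standard errors and analytic weights (the original analysis likely used Stata-style aweights; WLS weights are a close analogue) could be added to the pooled formula from the earlier sketch like this:

```python
# Analytic weight: average mortality-denominator size per hospital over the study period
hospital_years["aweight"] = (hospital_years
                             .groupby("hospital_id")["mortality_denominator"]
                             .transform("mean"))

# Re-estimate the pooled model with analytic weights and
# standard errors clustered at the hospital level
pooled_wls = smf.wls(pooled_formula, data=hospital_years,
                     weights=hospital_years["aweight"]).fit(
    cov_type="cluster", cov_kwds={"groups": hospital_years["hospital_id"]})
```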

19 Results

20 Descriptive statistics

Hospital characteristics | 2004 | 2005 | 2006 | Between-hospital s.d. | Within-hospital s.d.
AMI
  n | 2,986 | 2,925 | 2,886 | -- | --
  Risk-adjusted 30-day mortality (mean) | 16.6% | 16.3% | 16.5% | 7.3% | 7.1%
  Composite (1) (mean) | 0.83 | 1.07 | 1.13 | 0.99 | 0.39
Heart failure
  n | 3,276 | 3,225 | 3,219 | -- | --
  Risk-adjusted 30-day mortality (mean) | 11.7% | 11.1% | 11.1% | 3.4% | 2.8%
  Composite (1) (mean) | 0.81 | 1.11 | 1.18 | 0.93 | 0.34
Pneumonia
  n | 3,261 | 3,233 | 3,230 | -- | --
  Risk-adjusted 30-day mortality (mean) | 10.1% | 10.1% | 9.9% | 3.4% | 3.0%
  Composite (1) (mean) | 0.64 | 1.13 | 1.26 | 0.93 | 0.45

Note: table includes data from hospitals that are included in at least one of the regression models
Note: data from individual process measures are included if hospitals report at least 10 patients in the measure denominator
Note: (1) composite measures are scaled so that the mean of each measure across all hospitals across all years is 1

21 Marginal effects of process performance on risk-adjusted mortality

∂y/∂x (robust s.e.), n, and R² by condition:

Specification | AMI | Heart failure | Pneumonia
#1: Pooled cross section | -9.5%*** (1.0), n=8,696, R²=0.09 | -2.1%** (0.9), n=9,630, R²=0.04 | -2.1%*** (0.7), n=9,673, R²=0.03
#2: Hospital fixed effects | 0.2% (1.4), n=8,549, R²=0.02 | 0.7% (1.3), n=9,546, R²=0.03 | 0.3% (1.0), n=9,596, R²=0.04

*** p<0.01, ** p<0.05, * p<0.1
Note: hospital controls include ownership, bed size, teaching status, urbanicity, and ratio of residents to average daily census
Note: marginal effects are evaluated at the median of the process composite
Note: marginal effects are multiplied by 100 to facilitate interpretation
Note: robust standard errors in parentheses
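For reference, the marginal effects in the table follow from the quadratic specification: ∂ln(RA mortality)/∂Process = δ1 + 2·δ2·Process, evaluated at the median composite and multiplied by 100. Continuing the illustrative sketch from the methods slides, this would be:

```python
delta1 = pooled_wls.params["process"]
delta2 = pooled_wls.params["I(process ** 2)"]
median_process = hospital_years["process"].median()

# Marginal effect of the composite at its median, scaled by 100 as in the table
marginal_effect = 100 * (delta1 + 2 * delta2 * median_process)
```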

22 Effects of exception reporting
- Not associated with process performance for composite measures for any diagnosis
- Higher exception reporting was associated with a substantial and significant increase in one process performance measure: ACE inhibitor for AMI
- Did not moderate the effect of process performance in the expected direction
- Did not explain the null results from the fixed effects model

23 Summary of findings
- Pooled cross section models showed that CMS/Joint Commission process performance measures are correlated with patient 30-day RA mortality for AMI, pneumonia, and heart failure
  - Consistent with recent evidence
- However, hospital fixed effects models showed no evidence of an association between process performance and mortality for any diagnosis
  - Suggests that the CMS/Joint Commission measures are not causally related to mortality
  - Conflicts with recent evidence
  - Unobserved improvements in record keeping may be responsible for the lack of a causal relationship
- A higher rate of exceptions does not appear to increase measured process quality, nor does it affect the relationship between process performance and mortality

24 Implications for payment system
- The correlation between process performance and mortality supports the utility of process measures for public reporting
  - Steer patients toward higher quality providers
- The absence of a causal relationship casts serious doubt on the utility of current process performance measures as a metric for hospital quality improvement

