Day 2: Session II
Presenter: Jeffrey Geppert, Battelle
AHRQ QI User Meeting, September 26-27, 2005
Methods for Creating Aggregate Performance Indices
AHRQ QI User Meeting, September 27, 2005
Jeffrey Geppert, Battelle Health and Life Sciences
Overview
– Project objectives
– Why composite measures?
– Who might use composite measures?
– Alternative approaches
– Desirable features of a composite
– Proposed approach for the AHRQ QI
– Questions & Answers
Project Objectives
– Composite measures for the AHRQ QI included in the National Healthcare Quality Report and Disparities Report
– Separate composites for overall quality and/or quality within certain domains (e.g., cardiac care, surgery, avoidable hospitalizations, diabetes, adverse events)
– A methodology that can be used at the national, state, and provider/area level
Project Objectives: Feedback
– Does the proposed approach meet user needs for a composite?
– What analytic uses should the composite address?
– What are the important policy issues?
– How should the composite be incorporated into the AHRQ QI software?
Goals of National Healthcare Reports
National Level
– Provide assessment of quality and disparities
– Provide baselines to track progress
– Identify information gaps
– Emphasize interdependence of quality and disparities
– Promote awareness and change
State / Local / Provider Level
– Provide tools for self-assessment
– Provide national benchmarks
– Promote awareness and change
Unique Challenges to Quality Reporting by States
– States release comparative quality information in a political environment
  – Must either adopt a defensible scientific methodology or make conservative assumptions
– Examples of reporting decisions:
  – Small-numbers issues
  – Interpretive issues (better/worse, higher/lower)
– Purchasers are demanding outcomes and cost information from states
Why Composites?
– Summarize quality across multiple measures
– Improve ability to detect quality differences
– Identify important domains and drivers of quality
– Prioritize action
– Make current decisions about future (unknown) healthcare needs
– Avoid cognitive "short-cuts"
Why Not Composites?
– Mask important differences and relationships among components (e.g., mortality and re-admissions)
– Not "actionable"
– Difficult to identify which parts of the healthcare system contribute most to quality
– Detract from the impact and credibility of reports
– Independence of components
– Interpretation of components
Who Might Use Them?
– Consumers – to select a hospital either before or after a health event
– Providers – to identify the domains and drivers of quality
– Purchasers – to select hospitals in order to improve the health of employees
– Policymakers – to set policy in order to improve the health of a population
Examples
– "America's Best Hospitals" (U.S. News & World Report)
– Leapfrog Safe Practices Score (27 procedures to reduce preventable medical mistakes)
– NCQA, "America's Best Health Plans"
– QA Tools (RAND)
– Veterans Health Administration (Chronic Disease Care Index, Prevention Index, Palliative Care Index)
– Joint Commission (heart attack, heart failure, pneumonia, pregnancy)
– National Health Service (UK) Performance Ratings
– CMS Hospital Quality Incentive Demonstration Project
Examples (diagram: Measures, Components, Composite, Goals, Utility)
Alternative Approaches (Approach – Goal – Utility)
– Opportunity – Appropriate care – Volume of opportunities
– Burden – Minimize excess deaths/costs – Measures with most excess
– Expected quality – Better than reference – Lowest ratio
– Variation – Better than reference – Outliers
– Latent quality – Reduce variation – Measures with greatest variation
Desirable Features
– Valid – based on valid measures
– Reliable – improves ability to detect differences
– Minimum bias – based on unbiased measures
– Actionable – interpretable metric
– Benchmarks or standards
– Transparent
– Predictive – should guide the decision-maker on likely future quality based on current information
– Representative – should reflect expected outcomes for the population
Proposed Approach
– A modeling-based approach
– Latent quality – observed correlation among individual measures is induced by variability in latent quality
– Individual measures with the highest degree of variation make a larger contribution to the composite (illustrated in the sketch below)
– Theoretical interpretation
– Consistent with the goal of reducing overall variation in quality
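One way to make the latent-quality idea concrete is the small Python simulation below. It is only an illustration under assumed data, not the project's estimation code (which would fit a hierarchical latent-variable model): each indicator's weight is proportional to its estimated systematic, between-provider variance, so measures dominated by noise contribute less to the composite. All variable names and numbers are hypothetical.

```python
import numpy as np

# Hypothetical illustration only: rows are providers, columns are individual
# quality indicators (e.g., risk-adjusted adverse-event rates).
rng = np.random.default_rng(0)
n_providers, n_measures = 200, 4
latent = rng.normal(size=n_providers)                  # unobserved provider quality
loadings = np.array([0.8, 0.6, 0.3, 0.1])              # how strongly each measure reflects it
noise_sd = 0.5                                         # sampling noise, assumed known here
rates = latent[:, None] * loadings + rng.normal(scale=noise_sd, size=(n_providers, n_measures))

# Weight each measure by its estimated systematic (between-provider) variation.
signal_var = np.clip(rates.var(axis=0, ddof=1) - noise_sd**2, 0, None)
weights = signal_var / signal_var.sum()

composite = rates @ weights                            # one latent-quality score per provider
print(np.round(weights, 3))                            # noisier measures get smaller weights
```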
Proposed Approach (diagram: Latent Quality, Components)
Advantages
– Avoids contradictory results with individual measures or the creation of composites that may mislead
– Construction of the composite increases the power of quality appraisals
– Allows for both measure-specific estimates and composites
– Allows for validation with out-of-sample prediction
Advantages (Continued)
– Hierarchical – for small numbers, the best estimate is the pooled average rate at similar hospitals
– Allows for incorporation of provider characteristics to explain between-provider variability (e.g., volume, technology, teaching status)
– Gives policymakers information on the important drivers of quality
Overview of AHRQ QIs
– Prevention Quality Indicators: ambulatory care sensitive conditions
– Inpatient Quality Indicators: mortality following procedures, mortality for medical conditions, utilization of procedures, volume of procedures
– Patient Safety Indicators: post-operative complications, iatrogenic conditions
Examples (figure-only slides)
Hierarchical Models
– Also referred to as smoothed rates or reliability-adjusted rates
– Endorsed by NQF for outcome measures
– Methods to separate within-provider and between-provider variation (random vs. systematic); a small worked sketch follows below
– Total variation = Within provider + Between provider (Between = Total – Within)
– Reliability (w) = Between / Total
  – Signal ratio = signal / (signal + noise)
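A crude method-of-moments version of that decomposition is sketched below. It is a hedged illustration with simulated data, not the AHRQ QI software's hierarchical estimator; "noise" here is the sampling variance of each provider's observed rate.

```python
import numpy as np

# Simulated patient-level outcomes (0/1 adverse event), one array per provider.
rng = np.random.default_rng(1)
caseloads = rng.integers(30, 500, size=50)
true_rates = np.clip(rng.normal(0.05, 0.02, size=50), 0.001, 0.5)
outcomes = [rng.binomial(1, p, size=n) for p, n in zip(true_rates, caseloads)]

observed = np.array([o.mean() for o in outcomes])      # each provider's observed rate
n = np.array([len(o) for o in outcomes])

within = np.mean([o.var(ddof=1) for o in outcomes])    # average within-provider variance
noise = within / n                                     # sampling variance of each observed rate
total = observed.var(ddof=1)                           # variance across observed provider rates
between = max(total - noise.mean(), 0.0)               # "Between = Total - Within" (signal)

w = between / (between + noise)                        # reliability = signal / (signal + noise)
print(np.round(w[:5], 2))                              # low-volume providers get smaller w
```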
Hierarchical Models
– The smoothed rate is the (theoretical) best predictor of future quality
– Provides a framework for validation and forecasting
– Smoothed rate (single provider, single indicator) = Hospital-type rate × (1 – w) + Hospital-specific rate × w (worked example below)
– Multivariate versions:
  – Other years (auto-regression, forecasting)
  – Other measures (composites)
  – Non-persistent innovations (contemporaneous, nonsystematic shocks)
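For example, with hypothetical numbers: if a hospital's own observed rate is 8 per 100, the average rate for hospitals of its type is 5 per 100, and its reliability weight is w = 0.3, the smoothed rate is 5 × (1 − 0.3) + 8 × 0.3 = 5.9 per 100. With w = 0.9 (a high-volume, reliably measured hospital) it would be 5 × 0.1 + 8 × 0.9 = 7.7 per 100, so reliably measured providers keep most of their own observed signal.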
Outcomes and Process
Source: Landrum et al., "Analytic Methods for Constructing Cross-Sectional Profiles of Health Care Providers" (2000)
Hierarchical Models (diagram: Provider Type, Provider, Patient, Policy, Quality)
Policy and Prediction
– The best predictor of future performance is often historical performance + structure (see the sketch below)
– The greater the reliability of the measure for a particular provider, the more weight on historical performance
– The less the reliability of the measure for a particular provider, the more weight on structure
– Volume often improves the ability to predict performance for low-volume providers
– Other provider characteristics (e.g., availability of technology) do as well
– Area characteristics (e.g., SES) do as well
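The sketch below illustrates that weighting with made-up data; it is not the project's actual model. A "structural" prediction is obtained by regressing historical rates on log volume, and each provider's forecast blends its own history with that structural prediction according to a reliability proxy that rises with caseload.

```python
import numpy as np

# Hypothetical illustration: forecast = w * historical + (1 - w) * structural prediction.
rng = np.random.default_rng(2)
volume = rng.integers(25, 2000, size=100)
historical = np.clip(0.08 - 0.01 * np.log10(volume) + rng.normal(0, 0.01, size=100), 0.0, None)

# Structural prediction: a simple regression of historical rates on log volume.
X = np.column_stack([np.ones(volume.shape), np.log(volume)])
beta, *_ = np.linalg.lstsq(X, historical, rcond=None)
structural = X @ beta

w = volume / (volume + 200.0)          # reliability proxy; the constant 200 is arbitrary
forecast = w * historical + (1 - w) * structural
print(np.round(forecast[:5], 3))       # low-volume providers are pulled toward the structural fit
```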
Socio-Economic Status
– The Public Health Disparities Geocoding Project – Harvard School of Public Health (PI: Nancy Krieger)
– Evaluated alternative indices of SES (e.g., Townsend and Carstairs)
– Occupational class, income, poverty, wealth, education level, crowding
– Gradations in mortality, disease incidence, low birth weight (LBW), injuries, TB, STDs
– Percent of persons living below the U.S. poverty line:
  – Most attuned to capturing economic deprivation
  – Meaningful across regions and over time
  – Easily understood and readily interpretable
Socio-Economic Status (figure)
Limitations
– Measures and methods are difficult
– Restrictive assumptions on correlation
– Correlations may vary by provider type
– Requires a large, centralized data source
Expansions
– Flexibility in weighting the components
– Empirical – domains driven entirely by empirical relationships in the data
– A priori – domains determined by clinical or other considerations
– Combination – empirical when the relationships are strong and the measures precise, otherwise a priori (sketched below)
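As a hedged sketch of the "combination" option (function name, weights, and threshold below are purely illustrative, not part of the AHRQ QI software): empirical weights are used for measures that are estimated precisely, and a priori clinical weights otherwise.

```python
import numpy as np

def combine_weights(empirical, a_priori, reliability, threshold=0.7):
    """Use empirical weights where a measure is reliable, else a priori weights."""
    empirical = np.asarray(empirical, dtype=float)
    a_priori = np.asarray(a_priori, dtype=float)
    use_empirical = np.asarray(reliability, dtype=float) >= threshold
    w = np.where(use_empirical, empirical, a_priori)
    return w / w.sum()                      # renormalize so the weights sum to 1

# Three measures: the middle one is imprecise, so it falls back to its a priori weight.
print(combine_weights([0.5, 0.3, 0.2], [1/3, 1/3, 1/3], [0.9, 0.4, 0.8]))
```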
Welfare-Driven Composites (diagram: Measures → Composites → Meta-composites)
Welfare-Driven Composites
– Making current decisions about future needs – maximize expected outcomes, minimize expected costs
– A policymaker focus – for a population
– A provider focus – for their patients
– An employer focus – for their employees
– A consumer focus – based on individual characteristics
Acknowledgments
– Funded by AHRQ Support for Quality Indicators II (Contract No. 290-04-0020)
– Mamatha Pancholi, AHRQ Project Officer
– Marybeth Farquhar, AHRQ QI Senior Advisor
– Mark Gritz and Jeffrey Geppert, Project Directors, Battelle Health and Life Sciences
– Data used for analyses:
  – Nationwide Inpatient Sample (NIS), 1995-2000, Healthcare Cost and Utilization Project (HCUP), Agency for Healthcare Research and Quality
  – State Inpatient Databases (SID), 1997-2002 (36 states), Healthcare Cost and Utilization Project (HCUP), Agency for Healthcare Research and Quality
Acknowledgments
We gratefully acknowledge the data organizations in participating states that contributed data to HCUP and that we used in this study: the Arizona Department of Health Services; California Office of Statewide Health Planning & Development; Colorado Health & Hospital Association; Connecticut - Chime, Inc.; Florida Agency for Health Care Administration; Georgia: An Association of Hospitals & Health Systems; Hawaii Health Information Corporation; Illinois Health Care Cost Containment Council; Iowa Hospital Association; Kansas Hospital Association; Kentucky Department for Public Health; Maine Health Data Organization; Maryland Health Services Cost Review; Massachusetts Division of Health Care Finance and Policy; Michigan Health & Hospital Association; Minnesota Hospital Association; Missouri Hospital Industry Data Institute; Nebraska Hospital Association; Nevada Department of Human Resources; New Jersey Department of Health & Senior Services; New York State Department of Health; North Carolina Department of Health and Human Services; Ohio Hospital Association; Oregon Association of Hospitals & Health Systems; Pennsylvania Health Care Cost Containment Council; Rhode Island Department of Health; South Carolina State Budget & Control Board; South Dakota Association of Healthcare Organizations; Tennessee Hospital Association; Texas Health Care Information Council; Utah Department of Health; Vermont Association of Hospitals and Health Systems; Virginia Health Information; Washington State Department of Health; West Virginia Health Care Authority; Wisconsin Department of Health & Family Services.
Questions & Answers