Understanding Hospital Mortality Indicators Paul Aylin Clinical Reader in Epidemiology and Public Health Dr Foster Unit at Imperial College London Dec 2013


1 Understanding Hospital Mortality Indicators Paul Aylin Clinical Reader in Epidemiology and Public Health Dr Foster Unit at Imperial College London Dec 2013 p.aylin@imperial.ac.uk

2 Contents Why monitor mortality? HSMR: effect of changing its construction What does the HSMR mean? The SHMI HSMR and SHMI compared Individual diagnosis/procedure based SMRs Interpretation of hospital mortality figures

3 What to measure Process vs. Outcome Debate simmering!!

4 The case FOR measuring outcomes  It really matters to the patient  Common endpoint in RCTs etc for assessing treatments  A ‘hard’ i.e. objective measure  Often readily available in routinely collected data

5 And the case AGAINST Casemix adjustment is needed for fair comparison and is difficult Can be affected by artefact, e.g. deaths post discharge, inter-hospital transfers All-cause mortality is not the same thing as preventable mortality Attribution: which hospital?

6 Hospital Standardised Mortality Ratio Originally developed by Brian Jarman Jarman et al. “Explaining Differences in English Hospital Death Rates Using Routinely Collected Data,” BMJ 1999;318:1515-1520 Covers diagnoses leading to 80% of all in-hospital deaths (56 dx groups) We use a set of 56 casemix adjustment models

7 Florence Nightingale Uniform hospital statistics would: “Enable us to ascertain the relative mortality of different hospitals as well as of different diseases and injuries at the same and at different ages, the relative frequency of different diseases and injuries among the classes which enter hospitals in different countries, and in different districts of the same country” Nightingale 1863

8 Purpose and rationale of (H)SMRs Many studies show a link between quality of care and mortality Around 5% of in-hospital deaths are judged preventable on case note review – but this will vary by hospital Can’t spot these in routine data, so monitor all deaths Hogan H, Healey F, Neale G, Thomson R, Vincent C, Black N. Preventable deaths due to problems in care in English acute hospitals: a retrospective case record review study. BMJ Qual Saf. 2012 Sep;21(9):737-45. doi: 10.1136/bmjqs-2011-001159.

9 Process vs. Outcome Review of the relation between quality of care and death rates (36 studies): 26/51 processes: good care -> low rates 16/51 processes: no relation found 9/51 processes: good care -> high rates… Pitches D, Mohammed MA, Lilford R. What is the empirical evidence that hospitals with higher-risk adjusted mortality rates provide poorer quality of care? A systematic review of the literature. BMC Health Serv Res 2007;7:91

10 HSMR construction Build one regression model per dx group Attach a predicted risk of death to each admission based on their age, sex etc Sum these risks within each dx group to get the expected number of deaths E Compare this with the observed deaths O: SMR for a diagnosis (CCS) group = O/E x 100 HSMR = (sum of 56 sets of O) / (sum of 56 sets of E) x 100
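The arithmetic on this slide can be sketched in a few lines of Python. The risks and outcomes below are synthetic, not real casemix-model output, and only two of the 56 diagnosis groups are shown; with real data the loop simply runs over all 56.

```python
import random

random.seed(0)

# Illustrative data: admissions in two diagnosis (CCS) groups at one hospital.
# Each patient's 'risk' is the predicted probability of death from that
# group's national casemix model; deaths are simulated from those risks.
groups = {
    "pneumonia": [random.uniform(0.05, 0.30) for _ in range(200)],
    "stroke":    [random.uniform(0.10, 0.40) for _ in range(150)],
}

total_O = total_E = 0.0
for name, risks in groups.items():
    died = [1 if random.random() < p else 0 for p in risks]  # simulated outcomes
    O = sum(died)        # observed deaths in the group
    E = sum(risks)       # expected deaths = sum of predicted risks
    print(f"{name}: SMR = {100 * O / E:.1f}")
    total_O += O
    total_E += E

# The HSMR pools observed and expected deaths across all diagnosis groups
print(f"HSMR = {100 * total_O / total_E:.1f}")
```

Because the simulated deaths are drawn from the same risks used to compute E, this toy hospital's HSMR comes out close to 100, illustrating the "national average" baseline.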

11 Interpretation 100 is national average HSMRs>100 mean more deaths than predicted O minus E is NOT number of preventable deaths. Reasons? N, O, P, Q, R Noise in the data Organisation factors – transfers, step-down Patient factors – unaccounted-for casemix Quality of care – only after excluding others Random variation

12 Current casemix adjustment model for each dx group HSMRs are adjusted for: Age (<1, 1-4, 5-9, …, 85-89, 90+) Sex Elective status Areal socio-economic deprivation (Carstairs) Diagnosis group and subgroups Co-morbidity – Charlson index (switching to Elixhauser + dementia) Number of emergency admissions in previous 12 months Palliative care (secondary dx code or specialty) Year of discharge Month of admission Source of admission, e.g. pt’s own home, other hospital, care home

13

14 HSMRs with and without adjusting for Charlson

15 Palliative care – with and without adjustment

16 56 diagnosis groups HSMR vs. all 259 diagnosis groups

17 Excluding zero-day emergency survivors

18 An HSMR by month (random hospital)

19 …and the same by quarter: less fluctuation

20 The SHMI: Summary Hospital-level Mortality Indicator Owned by the NHS Information Centre Developed from DH Mortality Technical Working Group Informed by limited further modelling commissioned by NHSIC Ongoing review Intended to be supported by the private sector who help Trusts understand their data

21 SHMI v HSMR
Admissions and deaths included. SHMI: all in-hospital deaths for inpatients and day cases, plus deaths within 30 days of discharge; for all diagnosis supergroups except births and stillbirths. HSMR: all in-hospital deaths for inpatients and day cases; for the 56 diagnosis (CCS) groups accounting for 80% of deaths.
Risk adjustment. SHMI: age group, comorbidities, admission method, gender, discharge year (1-3); models based on just three years of data and rerun each quarter; diagnosis groups much broader. HSMR: age group, comorbidities, admission method, gender, palliative care coding, diagnosis/procedure subgroup, deprivation, number of previous emergency admissions, discharge year, month of admission, source of admission; models based on 11 years of data, including interaction terms between age and co-morbidity.
Assigning deaths. SHMI: to the last acute provider in the superspell. HSMR: to every provider in the superspell prior to death.

22 Casemix HSMR casemix adjustment seems to account for more variation: around 30% more trusts lie above the 99.8% control limit with the SHMI

23 Do we need both? Neither is “right” Neither is “wrong” As they look at the situation from a different perspective, they are both useful, if you understand the perspective: “multiple lenses” (Berwick) Other quality measures also needed

24 Reasons for high or low values Noise in the data Organisation factors – transfers, step-down, on-site hospice Patient factors – unaccounted-for casemix Quality of care – only after excluding others Random variation

25 What can HSMRs and SHMIs tell us? High (or rising) values suggest a potential quality of care issue (Mid Staffs) Some of this could relate to other parts of the care pathway, e.g. primary care …or to failure to submit all the data (e.g. no secondary dx coding) Falling values suggest potential care improvement or various artefacts

26 What can HSMRs and SHMIs not tell us? If there is definitely a problem in the hospital If hospital is good/safe in all clinical areas The number of preventable deaths What to do next other than “investigate”

27 Some suggestions for investigating your SHMI Split it by hospital site Break it down into dx-specific SMRs Check electronic then paper data Look at place of death Pathways – transfers, LCP Weekday v weekend, elec v emerg Audit etc as usual

28 Keogh Review 14 hospital trusts covered by the review were selected using national mortality measures as a "warning sign" or "smoke-alarm" for potential quality problems 11 of the 14 trusts were placed into special measures by Monitor and the NHS Trust Development Authority.

29 In summary HSMR Summary figure, developed by Imperial and produced by DFI Observed/Expected for 56 diagnosis groups accounting for 80% of all in-hospital deaths Screening tool, so various possible reasons for high/low figure Not a measure of avoidable deaths Detailed breakdowns and analyses available SHMI Summary figure produced by Information Centre Also a screening tool and not a measure of avoidable deaths More limited casemix adjustment for dx, age, sex, emerg/elec, Charlson comorbidity Some minor differences cf HSMR (100% vs 80% admissions) Some larger differences cf HSMR (out of hospital deaths, attribution of death to final acute centre)

30

31 Detecting outliers “Even if all surgeons are equally good, about half will have below average results, one will have the worst results, and the worst results will be a long way below average” Poloniecki J. BMJ 1998;316:1734-1736

32 Adjusted (EuroSCORE) mortality rates for primary isolated CABGs by centre (3 years data up to March 2005) using SCTS data with 95% and 99.8% control limits based on mean national mortality rates

33 Funnel plots No ranking Visual relationship with volume Takes account of the increased variability of smaller centres … but not as useful for continuous surveillance and less sensitive to sudden increases in mortality
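One common way to draw the 95% and 99.8% limits on such a funnel plot is a normal approximation to the binomial around the benchmark rate, so that the limits narrow as centre volume grows. This is a generic sketch, not necessarily the SCTS calculation; the national rate p0 below is illustrative.

```python
import math

p0 = 0.02               # assumed national mortality rate (illustrative)
z95, z998 = 1.96, 3.09  # z-values for two-sided 95% and 99.8% limits

def limits(n, z):
    """Approximate control limits for a centre operating on n patients."""
    se = math.sqrt(p0 * (1 - p0) / n)   # standard error of a proportion
    return max(0.0, p0 - z * se), p0 + z * se

# Limits funnel inwards towards p0 as volume increases
for n in (50, 200, 1000):
    lo95, hi95 = limits(n, z95)
    lo998, hi998 = limits(n, z998)
    print(f"n={n:4d}: 95% ({lo95:.3f}, {hi95:.3f})  99.8% ({lo998:.3f}, {hi998:.3f})")
```

Plotting each centre's adjusted rate against its volume, with these two curves overlaid, reproduces the funnel shape on the CABG slide: small centres can sit far from p0 by chance alone without crossing a limit.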

34 http://www.erpho.org.uk/statistical_tools.aspx

35 Risk-adjusted Log-likelihood CUSUM charts STEP 1: estimate pre-op/admission risk for each patient, given their age, sex etc. This may be national average or other benchmark STEP 2: Order patients chronologically by date of operation STEP 3: Choose chart threshold(s) of acceptable “sensitivity” and “specificity” (via simulation) STEP 4: Plot function of patient’s actual outcome v pre-op risk for every patient, and see if – and why – threshold(s) is crossed

36 More details Based on log-likelihood CUSUM to detect a predetermined increase in risk of interest Taken from Steiner et al (2000); pre-op risks derived from logistic regression of national data The CUSUM statistic is the log-likelihood test statistic for binomial data based on the predicted risk of outcome and the actual outcome Model uses administrative data and adjusts for age, sex, emergency status, socio-economic deprivation etc. Bottle A, Aylin P. Intelligent Information: a national system for monitoring clinical performance. Health Services Research (in press).
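The CUSUM statistic described in the last two slides can be sketched minimally, following Steiner et al. (2000), for a doubling of the odds of death (R = 2). The patient risks, outcomes, and threshold h below are illustrative only; in practice the risks come from the national logistic regression models and h is chosen by simulation.

```python
import math

R = 2.0   # odds ratio under the alternative hypothesis (doubled odds of death)
h = 3.0   # signal threshold; illustrative, chosen by simulation in practice

def llr_weight(p, died):
    """Log-likelihood ratio for one patient with predicted risk p."""
    denom = 1 - p + R * p
    return math.log(R / denom) if died else math.log(1.0 / denom)

def cusum(patients):
    """patients: (risk, died) pairs ordered chronologically (STEP 2)."""
    c, signalled = 0.0, False
    for p, died in patients:
        c = max(0.0, c + llr_weight(p, died))  # chart resets at zero
        if c >= h:
            signalled = True                   # threshold crossed (STEP 4)
    return c, signalled

# A run of deaths among moderate-risk patients pushes the chart over h
patients = [(0.1, 0), (0.2, 1), (0.15, 1), (0.1, 1),
            (0.3, 1), (0.2, 1), (0.25, 1), (0.2, 1)]
value, alarm = cusum(patients)
print(f"CUSUM = {value:.2f}, signal crossed: {alarm}")
```

Each death adds a positive weight (larger when the predicted risk was low) and each survival subtracts a small one, so the chart only climbs when outcomes are worse than the casemix-adjusted expectation.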

37

38

39

40 How do you investigate a signal?

41

42

43 Imperial College Mortality alerts Look at alerts generated at a 0.1% statistical false alarm rate (the default in Real Time Monitoring is 1%) Write to trusts with a doubling of the odds of death over the previous 12 months

44

45

46

47 Francis report 2013 Recognised the role that our work on HSMRs and our surveillance system of mortality alerts had to play in identifying Mid Staffs as an outlier Report of the Mid Staffordshire NHS Foundation Trust Public Inquiry 2013. Volume 1. Pages 458-466. http://www.midstaffspublicinquiry.com/report “All healthcare provider organisations should develop and maintain systems which give effective real-time information on the performance of each of their services, specialist teams and consultants in relation to mortality, patient safety and minimum quality standards.” Report of the Mid Staffordshire NHS Foundation Trust Public Inquiry 2013. Executive Summary. Recommendation 262. “Summary hospital-level mortality indicators should be recognised as official statistics.” Report of the Mid Staffordshire NHS Foundation Trust Public Inquiry 2013. Executive Summary. Recommendation 271.

48 Further reading Bottle A, Jarman B, Aylin P. Hospital Standardised Mortality Ratios: sensitivity analyses on the impact of coding. Health Serv Res 2011; 46(6): 1741-1761 Bottle A, Jarman B, Aylin P. Hospital Standardised Mortality Ratios: Strengths and Weaknesses. BMJ 2011; 342: c7116 Campbell MJ, Jacques RM, Fotheringham J, Maheswaran R, Nicholl J. Developing a summary hospital mortality index: retrospective analysis in English hospitals over five years. BMJ 2012; 344: e1001

