The ‘Centre Effect’ and Statistical Process Control
Alex Hodsman
[Figure: centre ranking with 95% CIs – Liv RI rank 31, Chester rank 35]
[Table by centre (Tyrone, London West, Reading, Chelmsford, Derby, Bangor, Swansea, Chester, Stevenage, Dorset, Shrewsbury, Truro, Preston, Carlisle): N, % <1.8 mmol/L with 95% CI, median rank with 95% CI, probability of being in the ‘top 10 centres’ (%), probability of being in the ‘last 10 centres’ (%)]
What are the aims of comparing centre outcomes?
1. Identify ‘meaningful’ differences between centres
2. Identify improvement/deterioration
3. Handle multiple simultaneous comparisons
4. Make fair comparisons
5. (Identify modifiable clinical processes)
Why use SPC?
Inter-centre variability in outcome measures
–Chance
–Data quality
–Definitions
–Case mix
–Quality of care: organisational structure, processes of care
Intra-centre variability in outcome measures
SPC
A method of monitoring, controlling and improving a process through statistical analysis
Key principles
–Variability exists in all systems
–Differentiate ‘special cause’ from ‘normal random’ variation
–Identify and improve processes to reduce special cause variation
Examples of SPC
Cross-sectional
–Funnel plots
Longitudinal
–Control charts
–CUSUM, EWMA, SPRT etc.
Hybrid
–Funnel plots
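As a concrete illustration of the longitudinal methods listed above, the sketch below implements a basic EWMA chart; the smoothing constant and limit width are illustrative assumptions, not values taken from the presentation.

```python
import numpy as np

def ewma_chart(x, target, sigma, lam=0.2, L=3.0):
    """Exponentially weighted moving average chart.

    x      : sequence of observations (e.g. monthly % achieving a standard)
    target : in-control process mean
    sigma  : in-control process standard deviation
    lam    : smoothing constant (0 < lam <= 1), illustrative default
    L      : width of the control limits in standard deviations
    """
    z = np.empty(len(x))
    prev = target
    signals = []
    for i, xi in enumerate(x):
        prev = lam * xi + (1 - lam) * prev   # EWMA recursion
        z[i] = prev
        # time-varying standard error of the EWMA statistic
        se = sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * (i + 1))))
        if abs(prev - target) > L * se:      # point outside the control limits
            signals.append(i)
    return z, signals
```

CUSUM and SPRT charts follow the same idea of accumulating evidence across successive points rather than judging each point in isolation.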
Principles of SPC
Cross-sectional plots
Specificity: false positive rate / Type 1 error per comparison
–3SD limits = 0.27%
–2SD limits = 5%
Longitudinal plots
Type 1 error across 25 data points
–3SD limits = 6.5%
–2SD limits = 27.7%
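These figures follow from the normal tail areas and, for a run of plotted points, from compounding the per-point error. A minimal check, assuming independent, normally distributed points and two-sided limits (the longitudinal 2SD figure depends on the exact rule set applied, so only the 3SD case is reproduced here):

```python
from scipy.stats import norm

# Per-point false positive rate for two-sided control limits
p_3sd = 2 * norm.sf(3)            # ~0.0027 -> 0.27%
p_2sd = 2 * norm.sf(2)            # ~0.046  -> ~5%

# Chance of at least one false alarm across 25 independent points (3SD limits)
fw_3sd = 1 - (1 - p_3sd) ** 25    # ~0.065  -> ~6.5%
print(round(p_3sd, 4), round(p_2sd, 4), round(fw_3sd, 3))
```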
Longitudinal plots – interpretation
Shewhart’s original rule
–> 3SDs from the process average
Numerous additional rules
–Patterns/trends in the data
–E.g. 7 points in the same direction
–Enhance sensitivity
–Probability calculations
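A minimal sketch of two of these rules on a Shewhart-style chart: the original 3SD rule and a seven-points-in-the-same-direction trend rule. The function names, and treating the process average and SD as known inputs, are illustrative assumptions.

```python
import numpy as np

def shewhart_signals(x, mean, sd):
    """Indices of points more than 3 standard deviations from the process average."""
    x = np.asarray(x, dtype=float)
    return np.flatnonzero(np.abs(x - mean) > 3 * sd)

def trend_signals(x, run_length=7):
    """Indices in x that complete `run_length` consecutive points all rising or all falling."""
    x = np.asarray(x, dtype=float)
    d = np.sign(np.diff(x))                   # +1 rise, -1 fall, 0 tie
    hits = []
    for j in range(run_length - 1, len(x)):
        window = d[j - (run_length - 1):j]    # the run_length-1 steps ending at point j
        if np.all(window > 0) or np.all(window < 0):
            hits.append(j)
    return hits
```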
SPC and the UKRR
[Figure – 2004 Report: funnel plot of age-adjusted 1 year after 90 days survival, cohort]
[Figure – 2006 Report: funnel plot of % with serum phosphate <1.8 mmol/L: HD]
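For a proportion-based measure such as “% with serum phosphate <1.8 mmol/L”, funnel plot limits can be sketched with a normal approximation to the binomial around the overall average, as below. The numbers in the usage example are hypothetical, and registry reports may use exact binomial or over-dispersed limits instead.

```python
import numpy as np

def funnel_limits(p_bar, n, z=1.96):
    """Lower and upper funnel plot limits for a proportion.

    p_bar : overall (e.g. national) proportion achieving the standard
    n     : array of centre sizes (patients per centre)
    z     : 1.96 for ~95% limits, 3.09 for ~99.8% limits
    """
    n = np.asarray(n, dtype=float)
    se = np.sqrt(p_bar * (1 - p_bar) / n)
    return np.clip(p_bar - z * se, 0, 1), np.clip(p_bar + z * se, 0, 1)

# Hypothetical example: centres of increasing size around a 60% overall average
sizes = np.arange(20, 500, 20)
lo95, hi95 = funnel_limits(0.60, sizes)            # ~95% limits
lo998, hi998 = funnel_limits(0.60, sizes, z=3.09)  # ~99.8% limits
```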
Phosphate distributions
Cross-sectional vs. longitudinal
Cross-sectional
–Inter-centre variability
–Good for looking at stable unit characteristics: data, case mix, organisational structure
Longitudinal
–Intra-centre variability
–Good for looking at less stable unit characteristics: data, processes of care
Data collection
1. Define the specification of the audit measure
2. Funnel plot to compare all centres
3. Identify and analyse outliers
4. Check data against local audit data
–Data incorrect → refer to the control chart to identify the time of the UKRR fault
–Data correct → investigate causes: case mix; quality (organisational structure)
5. Individual control chart for each centre, updated quarterly
–P chart: % achieving the audit measure
–XMR chart for the mean
–XMR chart for the SD
–? Also include a measure of process capability
–Investigate causes: quality (processes of care)
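A minimal sketch of the quarterly P chart step in the flow above, assuming the conventional 3-sigma binomial limits for a proportion; the XMR charts for the mean and SD follow the same pattern with limits based on moving ranges. Variable names are illustrative.

```python
import numpy as np

def p_chart(successes, n):
    """Per-period proportion achieving the audit measure, with 3-sigma limits.

    successes : patients meeting the standard in each period (e.g. quarter)
    n         : patients audited in each period
    """
    successes = np.asarray(successes, dtype=float)
    n = np.asarray(n, dtype=float)
    p = successes / n                       # plotted points
    p_bar = successes.sum() / n.sum()       # centre line
    se = np.sqrt(p_bar * (1 - p_bar) / n)   # limits vary with each period's n
    ucl = np.clip(p_bar + 3 * se, 0, 1)
    lcl = np.clip(p_bar - 3 * se, 0, 1)
    out = np.flatnonzero((p > ucl) | (p < lcl))   # points signalling special cause
    return p, p_bar, lcl, ucl, out
```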
Conclusions
–A methodical, diagnostic approach to performance
–Takes chance out of the equation
–Helps focus resources
–The statistics are complex but the output is user friendly
–Limited ability to compare centres longitudinally, i.e. rate of change