Intelligent targets Learning Session 3 Workshop 30 September 2009
Performance Improvement for Patients
A View from the Top?
The Minister's view is clear about the need to improve quality of care
Consensus that more focus and pace are required
Wide range of national performance improvement tools
Where are we now?
The Dark Side: the Annual Operating Framework, top-down improvement, nationally set targets
Perceptions and Myths
Improvement targets not owned by all
Not sensible or clinically credible
Single points of complex pathways, so not representative
Perverse incentives
Limited change
Changes not sustainable
…yet it has achieved improvement!
Somewhere over the rainbow ?
Self-governing organisations
Quality-focused improvement targets
Clinically driven and owned
Across care pathways
Patient care at the centre
To:
Improve patient experience
Involve carers and professionals
See some change in a complex system
View from the bottom (up)
Intelligent targets
Objectives:
Complete care pathway
Critical success / 'wow' factors
Outcome measures (effectiveness, safety, experience)
Approach:
4 pilots (cardiac, stroke, unscheduled care, mental health)
National Steering Group supported by 4 core groups
Driven by, and made up of, healthcare professionals and policy leads
Facilitated by NLIAH
Wales stroke audits over 5 years
What is worse?
Brain scan in 24 hours: 60% to 38%
OT assessment: 62% to 50%
Home visit before discharge: 80% to 53%
What is better?
Aspirin started: 72% to 76%
MDT goals agreed: 58% to 70%
What are we actually trying to do?
Improve the reliability of care in Wales
Raise the standards of care in Wales
A quote: "Although clinicians setting targets is the way forward, how do we re-educate them to move away from end-line inspection to on-line inspection? They have grown up in organisations which review complaints, undertake audit, reflect on research… which all have their place, but amazingly, with such a captive audience, the patient, they fail miserably to monitor, measure and improve quality at the bedside. Can you imagine a world in which all staff were involved in quality questioning? I would predict a step change in complaints!"
The Intelligent Targets Approach
Focus on the process of change
Use expert groups for subject knowledge
Use a model for change as a standard
Greenhalgh criteria
Greenhalgh criteria
It must have clear relative advantage
It must have compatibility with the user's values and ways of working
Complexity must be minimised
Users will adopt more readily if innovations allow trialability
There must be observability, that is, it must be seen to deliver benefit
Reinvention is the propensity for local adaptation
An evidence based model for producing clinical change
The Model for Improvement
Agreed process changes (care pathways and driver diagrams)
Outcome and process measures
Appropriate performance management
Support for improvement (will / ideas / execution)
Tools: data handling, driver diagrams, collaborative learning
Model for Improvement
An example from another setting
Acute MI Care in the US
Aspirin at arrival
Aspirin at discharge
ACEI for LVSD
Beta-blocker at arrival
Beta-blocker at discharge
Door to lytic
Door to PCI
Smoking cessation advice
Composite and all-or-none scores
Survival rate/index
In 2002 the Mayo Clinic in the US decided to look at how good it was at delivering an acute MI bundle.
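The distinction between composite and all-or-none scores mentioned above can be made concrete with a short sketch. This is illustrative only: the element names and patient records are hypothetical, not taken from the Mayo Clinic data, and the bundle is trimmed to five elements for brevity.

```python
# Composite vs all-or-none scoring for a care bundle (illustrative sketch;
# element names and records are hypothetical, not from the Mayo Clinic data).
BUNDLE = ["aspirin_arrival", "aspirin_discharge", "beta_blocker_arrival",
          "beta_blocker_discharge", "smoking_cessation_advice"]

def composite_score(patients):
    """Fraction of all bundle elements delivered, pooled across patients."""
    delivered = sum(p[item] for p in patients for item in BUNDLE)
    return delivered / (len(patients) * len(BUNDLE))

def all_or_none_score(patients):
    """Fraction of patients who received *every* bundle element."""
    return sum(all(p[item] for item in BUNDLE) for p in patients) / len(patients)

patients = [
    {item: True for item in BUNDLE},  # received complete bundle
    {**{item: True for item in BUNDLE}, "smoking_cessation_advice": False},
]
print(composite_score(patients))    # 0.9 (9 of 10 elements delivered)
print(all_or_none_score(patients))  # 0.5 (1 of 2 patients got everything)
```

The gap between the two numbers is the point: a composite score can look respectable while the all-or-none score, which is what the patient actually experiences, is much lower.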
At the beginning, an average of 56% of patients got what they should have had; that's 44% who didn't. (The scale on the original slide is mislabelled and should read from 0 to 100%.) Improving reliability in the process from 56% to 80% required introducing Level 1 and Level 2 changes, which had a cumulative effect.
Strategies – Level 1: "Intent, vigilance, hard work"
Standardised protocols
Feedback
Training
Checklists
The evidence is that threatening people does not improve reliability!
Strategies – Level 2: "Redesign the system – don't rely on checking"
Decision aids and reminders built into the system
Automation
Evidence as the default
Scheduling
Connection to habits
Mechanising the system means converting humans into robots. Evidence-based practice should be the default, not the aspiration: if the process doesn't get something right, investigate and do something about it. It's important to attach the process to the way we work.
The improvements in the reliability of care were reflected in improvements in mortality rates.
Stroke An example from our setting
Acute Phase Driver Diagram
All Wales Data
Examples from One Trust
All Wales - Compliance with First Hours Bundle
Within 3 hours:
Screening tool
Confirmation of diagnosis by an experienced clinician
Stat aspirin
All Wales - Compliance with First Day Bundle
Within 24 hours:
CT scan
Admission to a stroke bed
Swallow screen
Regular aspirin
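Bundle compliance of this kind is an all-or-none, time-windowed check: a patient counts only if every element was delivered within the window. A minimal sketch, assuming hypothetical per-patient records of admission time and element timestamps (the field names are illustrative, not from the All Wales audit):

```python
from datetime import datetime, timedelta

# First-day bundle elements; field names are hypothetical, for illustration.
FIRST_DAY_BUNDLE = ["ct_scan", "stroke_bed", "swallow_screen", "regular_aspirin"]

def first_day_compliant(record, window=timedelta(hours=24)):
    """All-or-none check: every element done within 24 h of admission."""
    admitted = record["admitted"]
    for item in FIRST_DAY_BUNDLE:
        done = record.get(item)          # timestamp, or None if not done
        if done is None or done - admitted > window:
            return False
    return True

admitted = datetime(2009, 9, 30, 8, 0)
record = {
    "admitted": admitted,
    "ct_scan": admitted + timedelta(hours=3),
    "stroke_bed": admitted + timedelta(hours=6),
    "swallow_screen": admitted + timedelta(hours=2),
    "regular_aspirin": admitted + timedelta(hours=20),
}
print(first_day_compliant(record))  # True
```

A single missing or late element fails the whole bundle, which is exactly why all-or-none compliance is a harder, and more patient-centred, target than element-by-element percentages.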
Findings
The method makes sense
Measurement and reliability are new concepts
Teamwork is encouraged across the pathway
Connections with management need work
We are seeing change, and so are patients!
Respecting measurement
Domain: Uptake (organisational conditions)
Examples: Identified management lead; identified clinical champion; intranet sign-up; data submitted; teams trained; local communication strategy in place

Domain: Process change (Intelligent Targets)
Examples: Bundle compliance; uptake of new practice (specific to driver diagram)

Domain: Outcome change (consequence of process)
Examples: Reduced morbidity; reduced mortality; reduced dependency; reduced hospital stay
Taking this forward
Stroke as a starter
Four clinical areas
Agree driver diagrams
Design and prove the spreadsheet
Incorporate in the Annual Operating Framework 2010/11
Support learning and implementation
5-year rolling programme