Falkirk Children’s Services
Jude Breslin, Children’s Services Coordinator & PACE Lead
Model for Improvement
Scotland’s first multi-agency quality improvement programme
Early Years Collaborative – January 2012
Raising Attainment for All – November 2014
Children and Young People Improvement Collaborative (CYPIC) – November 2016
Across Falkirk, practitioners are using improvement science to improve outcomes for children and young people, to change culture, systems and practice, and to move towards earlier intervention.
National Context – Lots of Drivers
There are lots of drivers for us: GIRFEC, the NIF, PEF, CfE – and that’s just education! There is also the Children and Young People (Scotland) Act, which is huge.
Council Planning Corporate Plan
The Falkirk golden thread:
Council planning: Corporate Plan, Service Plans, Division Plans
Community planning: SOLD, ICSP, school improvement plans
Supporting implementation of priorities
Improvement is all of our business!
It doesn’t matter where you work across the 0–26 age range: the model can support us to deliver on our aims regardless. It can support your improvement plans and give you the evidence for your RACIs.
Why would you use the model for improvement?
We know the evidence-based practice that we want to make happen, but history tells us it takes 17 years for just 14% of the evidence base to make it into practice.
‘Not all change is improvement, but all improvement requires change’
Don Berwick. The Model for Improvement (MFI) gives you a set of tools to help you understand whether a change is an improvement or not.
We want to change Scotland’s place in the world, and to do this we must find new and better ways to achieve the outcomes we want. This 3-Step Improvement Framework has been developed to help unlock and channel the collective knowledge and energy of our people towards a common goal of real and lasting improvement across our public services. The Framework is designed to prompt self-assessment and debate. It is about getting started and ‘doing’: creating the conditions for, and implementing, the improvements. What we need:
A vision – capable of stirring the heart of the community and able to serve as a constant reference and anchor point as the change moves forward.
A story – to enable people to recognise where they have been and where they are going.
A set of actions – to take us to the next steps towards realising the vision.
A clear framework for improvement.
A strategy to engage and empower the workforce – to provide the stimulation, development and opportunity our staff need to fully release their deep commitment to public service.
An understanding of how change will work locally (everywhere) – recognising that communities are different and that creativity should be nurtured and released at a local level.
The Improvement Framework
Provides a clear thinking approach to change and improvement. Can be applied to business processes and systems as well as policy making and delivery. Recognises that all improvement happens where the work is, and the conditions needed to support this. It’s about getting started, and doing:
Aim – is there an agreed aim that is understood by everyone in the system?
Correct changes – are we using our full knowledge to identify the right changes, and prioritising those likely to have the biggest impact on our aim?
Clear change method – does everyone know and understand the method(s) we will use to improve?
Measurement – can we measure and report progress on our improvement aim?
Capacity and capability – are people and other resources deployed in the best way to enable improvement?
Spread plan – have we set out our plans for innovating, testing, implementing and sharing new learning to spread the improvement everywhere it is needed?
The Improvement Framework Approach
The traditional route keeps everything in the conference room: design, then approve if necessary, then implement. The improvement framework approach takes things out into the real world as soon as possible, tests and modifies on a small scale, and builds up as it goes along before starting to implement.
Why would you use the model for improvement?
It’s not about working harder and faster – we need to change what we do.
Change bingo. This is the real world, but our ambition for our children and families is that Scotland is the best place to grow up. Our values fuel the effort, and we can effect change within these parameters by using the model. The MFI is incremental and gives us confidence in the change.
Fundamental Principles of Improvement
Knowing why you need to improve
Having a feedback mechanism so you know if a change is an improvement
Developing effective changes
Testing changes before attempting to implement them
Knowing when and how to make a change permanent
Model for Improvement – act on the plan (testing)
No model is perfect – some are useful. The thinking part relates back to the “what better would look like” conversation:
Aim – the specifics: the what (service output), the who (children and families) and the where (your establishment). Have you identified when you want (need) it to happen?
Measures – what it would look like: more/less/all, a number, a percentage.
Changes – did you start to discuss ideas of what you could do differently, stop doing, or introduce?
The doing part – PDSA.
AIM
Setting the Aim
Ambitious – unachievable by hard work alone
Measurable – without measurement you will never know if a change is an improvement
Time specific – ensures focus
Specific – define the group you are focusing improvement efforts on
Setting aims: Characteristics
What?
Measurable (how good?)
Time specific (by when?)
Define target group (who?)
Vision and aims matter: get everyone behind the improvement and understand what success looks like. The Model for Improvement asks: what are we trying to accomplish? Be specific in your aim (health warning – check for strategic fit) and build a common purpose through the “what better would look like” conversation. The specifics: the what (service output), the who (children and families) and the where (your establishment). Have you identified when you want (need) it to happen?
Aims create systems & let us know what part of the system to look at.
An improvement aim…
Provides a clear sense of what we are trying to accomplish
Is measurable – how much, by when?
Is specific – who, where?
Is unachievable by hard work alone
You want to inspire transformational change: change to the system, not within the system. Once you’ve got a clear aim that everyone is bought into, stick with it!
Measures
How do we know that a change is an improvement?
Improvement is not just about measurement. However, without measurement you will never be able to answer the question.
Why are you collecting data? For research, for judgement, or for improvement? Data for improvement doesn’t have to be perfect as it would for research, and it’s not for performance management or judgement – it should be useful to you, to show what and how you have improved.
Let’s talk about measures!
Don’t ask “what will we measure?” Ask “what do we need to know?” then figure out if you can measure it.
Understanding the different types of measures
Outcome Measures: directly relate to the overall aim. Is this a better result for learners and their families? How is the system performing?
Process Measures: are the processes that we believe will contribute to the aim performing as planned? Are we doing what we planned to do (the changes) reliably?
Balancing Measures: are there unacceptable consequences for related processes, outcomes or culture?
Outcome, Process and Balancing Measures – looking at the system from different dimensions.
Outcome Measures: are we making things better? Are we on track to achieve our aim? Is the young person getting the right outcome? Outcome measures are about the impact – the “so what”.
Process Measures: are we doing the right things at the right time, every time? Is the process reliable? Is the system working as planned? Process measures are about the how.
Balancing Measures: what about the bigger picture? Does improving one thing cause problems elsewhere? Balancing measures capture the unintended consequences, which can be negative or positive.
Dave Williams, IHI
Types of measures. There might be lots of different things you could measure in relation to your improvement project, for example:
% receiving a story
% of stories read at bedtime
% of parents reporting an improved bedtime routine
% enjoying the bedtime story
% reporting an increase in bedtime story reading
However, you don’t have to measure all of them – measure what is useful and gives you the best information.
Imagine you’re flying a plane
Imagine you’re flying a plane. There’s loads of measurement going on here….
The right measures are the ones derived from purpose…
You can see that it’s pretty useful stuff – how fast the plane is flying, what direction it’s heading, how the oil pressure is doing, and so on. All linked to purpose – in this case to fly passengers safely to their destination. The method by which the information is relayed to the pilot is useful too – the dials are capable of indicating change in real time, enabling our pilot to respond accordingly and make adjustments if necessary. Simon Guilfoyle, 2013
What if we use the wrong measures?
Well here’s the same set of dials, this time configured to measure the wrong things. Using the wrong measures means it’s impossible to establish if you’re achieving purpose. There’s no point counting the wrong things, just because they’re easy to count. Furthermore, if you’re using the wrong measures, don’t expect them to encourage the right behaviours. Simon Guilfoyle, 2013
Or measure the right things in the wrong way…
Here’s the ‘binary comparison’ version of the cockpit dials: all the same measures as in the original configuration, but presented in a way that can’t tell you anything useful. We often find people want to do this because we’re used to this kind of pass/fail target. So, did people wait in A&E for more or less than 4 hours? For improvement it would be more useful to look at the average time taken. Simon Guilfoyle, 2013
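To see what the pass/fail view throws away, here is a small sketch with hypothetical waiting times (illustrative numbers only, not real A&E data): the binary measure collapses very different waits into the same answer, while the average keeps the underlying information.

```python
# Hypothetical A&E waiting times in hours (illustrative numbers only)
waits = [1.5, 3.9, 4.1, 6.0, 2.0]

# Binary pass/fail against a 4-hour target: a 3.9h and a 1.5h wait count
# the same, and a 4.1h and a 6.0h wait count the same
breaches = sum(1 for w in waits if w > 4)
pct_breached = 100 * breaches / len(waits)   # 40.0

# The average retains the magnitude of every wait
avg_wait = sum(waits) / len(waits)           # 3.5
print(pct_breached, avg_wait)
```

Both numbers come from the same data, but only the second one changes when every wait shortens by a little – which is exactly the signal an improvement team needs.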
And if we didn’t measure anything?
If you don’t measure anything, you’re flying blind. How can you make strategic decisions like this? Simon Guilfoyle, 2013
Operational Definitions
Operational Definition & Improvement
An operational definition, when applied to data collection, is a clear, concise definition of a measure. This is fundamental for collecting all types of data, and vital when deciding if something is correct or incorrect! In other words, we need to be clear about what it is we are measuring. If I said to you that we will be measuring how well a baby is latched on, we would need to define what that means: is the baby’s nose to the nipple? Is the bottom lip curled right back? Is the baby rhythmically feeding? Likewise, if we were measuring how many children had a healthy diet, we would need to define what we meant by healthy: does the child eat fresh fruit every day, and how many portions of wholemeal foods does the child have?
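One way to keep an operational definition unambiguous is to write it down as an explicit, testable rule. The sketch below encodes a hypothetical “healthy diet” definition in Python; the thresholds (5 portions, daily fresh fruit) are illustrative choices, not definitions from the presentation.

```python
# Hypothetical operational definition of a "healthy diet", written as an
# explicit rule so everyone collects the data the same way.
# The thresholds here are illustrative only.
def healthy_diet(fruit_veg_portions: int, fresh_fruit_daily: bool) -> bool:
    return fruit_veg_portions >= 5 and fresh_fruit_daily

print(healthy_diet(5, True))   # True
print(healthy_diet(6, False))  # False
```

Once the rule is written down, two different data collectors looking at the same child must reach the same answer – which is the whole point of an operational definition.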
How would you define these concepts? Take a few moments in pairs to discuss one or two of these:
A “fair” tax
A tax “loophole”
A “good” holiday
A “great” movie
The “rich” or the “poor”
An Operational Definition….
It gives communicable meaning to a concept
Is clear and unambiguous
Specifies the measurement method and equipment
Identifies criteria
…is a description, in quantifiable terms, of what to measure and the steps to follow to measure it consistently.
So how good do you think you are at explaining what we mean by something? We are now going to have a go, I hope in a fun way; if you have done this before, we will expect to see excellent results. At your tables, get into groups of four and spend a few minutes deciding on a name for your group. I would like the names to reflect bananas in some way – for example, “the bendy bananas” – be inventive! Then we will take a few minutes to record the names of all the groups in the room, so please shout out your group’s name and make a note of it so you don’t forget who you are!
Aims & measurement are crucial
Are we all working to the same aim? Have you defined the things in your aim so that it is clear and we can all understand it? How do you know you’re on the right track? By measuring the right things. Measurement must be linked to an aim.
Measures
Aggregated data tells us very little about the improvement journey.
Week beginning | Number of children
21-Jan | 5
25-Jan | 8
04-Feb | 7
11-Feb |
18-Feb | 9
25-Feb |
04-Mar | 10
11-Mar | 6
18-Mar |
25-Mar | 12
01-Apr | 4
08-Apr |
15-Apr |
22-Apr | 11
29-Apr | 18
06-May | 17
13-May | 19
20-May | 20
27-May |
Even though the data is shown over time, a table like this gives very little information.
Plot your data on a chart and use annotations to tell the story of what was tested and explain reasons for data activity - engages everyone in the improvement journey.
Data + Story = Learning
Run Charts
Display data to make process performance visible
The centre line is the median
The value being measured is plotted for each time point
Time is usually represented along the bottom
Can be started as soon as you have data – to facilitate learning as soon as possible
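As a sketch of the mechanics, the snippet below builds the two ingredients of a run chart from a series of counts: the median centre line, and each point’s position relative to it. Pure standard-library Python; the weekly counts are illustrative.

```python
from statistics import median

def run_chart_points(values):
    """Return the median centre line and each point's position relative
    to it ("above", "below" or "on") -- the raw material for plotting
    and interpreting a run chart."""
    centre = median(values)
    positions = ["above" if v > centre else "below" if v < centre else "on"
                 for v in values]
    return centre, positions

# Illustrative weekly counts of children
weekly = [5, 8, 7, 9, 10, 6, 12, 4, 11, 18, 17, 19, 20]
centre, positions = run_chart_points(weekly)
print(centre)  # 10
```

Plotting the values in time order with a horizontal line at the median, and annotating when each change was tested, turns this into the run chart described above.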
Data collection planning
Who will be responsible for collecting the data, and what are your contingency plans for holidays, weekends, afternoons, etc.?
What are they going to collect – operational definitions, numbers, words, pictures? The whole population or a sample?
Where are they going to collect it – in the setting, sessions, corridors, the playground, the canteen? Some data by its very nature will dictate the answer, but if there is room for interpretation, make it explicit – you don’t want signals in your data that don’t need to be there.
When are you going to collect it – every day, once a week, at each session, on a specific day? Again, your project may dictate this: the number of people attending the Friday morning meeting already tells you when you are collecting the data. Be explicit – it saves time in the long run.
Tips for effective measures
Plot data over time and annotate
Seek usefulness, not perfection
Use sampling
Integrate measurement into the daily routine
Use qualitative and quantitative data
Improvement requires change, and change is, by definition, a temporal phenomenon. Much information about a system and how to improve it can be obtained by plotting data over time – such as data on length of stay, volume or satisfaction – and then observing trends and other patterns. Tracking a few key measures over time is the single most powerful tool a team can use. Remember, measurement is not the goal; improvement is the goal. To move forward, a team needs just enough data to know whether changes are leading to improvement. Sampling is a simple, efficient way to help a team understand how a system is performing, and can save time and resources while accurately tracking performance. For example, instead of monitoring the time for a referral to be made continuously, measure a random sample of referrals per month. In addition to collecting quantitative data, be sure to collect qualitative data, which is often easier to access and highly informative: if you are focusing your efforts on improving family satisfaction, ask families about their experience.
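The sampling tip can be sketched in a few lines of standard-library Python. Everything here is illustrative: the referral turnaround times are made up, and 5 is an arbitrary sample size.

```python
import random

# Hypothetical turnaround times (days) for one month's referrals
referrals = [3, 7, 2, 14, 5, 9, 4, 11, 6, 8, 10, 3, 5, 12, 7]

# Instead of measuring every referral, draw a small random sample
random.seed(0)                       # fixed seed so the sketch is repeatable
sample = random.sample(referrals, k=5)

avg_sampled = sum(sample) / len(sample)
print(sample, avg_sampled)
```

A monthly sample like this tracks the system well enough to see whether changes are working, without the burden of measuring every single case.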
We are drowning in data right now.
How do we choose the right things to measure?
Change ideas/ Tests of change - PDSA
CHECKPOINT: answer the three questions associated with the thinking part of the model before you start testing. A Harvard University project found that 70% of projects fail because they don’t answer these questions at the outset.
Selecting Changes
Use the evidence base: use the literature and evidence to inform your thinking and practice
Use your own experience as a practitioner: follow your hunches, intuition and theories
Be strategic: set priorities based on the aim, known problems and feasibility – if you know there is a problem with a part of your service, focus your improvement on that
Steal shamelessly: learn from others and from what has worked elsewhere – what are the concepts that work, and can they be adapted?
Plan-Do-STUDY-Act: what is your data telling you? What is it not telling you?
Plan-Do-Study-Act
Northwest Improvement Initiative: “What will happen if we try something different?” (Plan) – “Let’s try it!” (Do) – “Did it work?” (Study) – “What’s next?” (Act).
The doing part of the Model for Improvement is PDSA. Think big, start small; test often, learn rapidly. The four parts of the cycle:
Plan: decide what change you will make, who will do it, and when it will be done. Formulate a hypothesis about what you think will happen when you try the change – what do you expect? Identify data (quantitative or qualitative) that will allow you to evaluate the result of the test. Be clear, predict and prepare.
Do: carry out the plan. Record any problems and spot new ideas. Collect data.
Study: leave time for reflection about your test. Use the data and the experience of those carrying out the test to discuss what happened. Did you get the results you expected? If not, why not? Did anything unexpected happen during the test?
Act: given what you learned during the test, what will your next test be? Adapt the change (keep going, learning and building confidence), adopt it (keep the change and try it on a larger scale), or abandon it?
“In the spirit of science, there really is no such thing as a ‘failed experiment.’ Any test that yields valid data is a valid test.” Adam Savage
© R. Scoville & I.H.I.
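As a sketch of how one test of change might be recorded so the Study step compares what happened against the prediction, here is a minimal Python structure. The field names and the example cycle are illustrative, not part of the programme’s materials.

```python
from dataclasses import dataclass

@dataclass
class PDSACycle:
    plan: str        # the change: what, who, when
    prediction: str  # what we expect to happen
    do: str          # what actually happened; data collected
    study: str       # result compared against the prediction
    act: str         # "adopt", "adapt" or "abandon"

# Illustrative cycle (hypothetical, not a real Falkirk test)
cycle = PDSACycle(
    plan="10 minutes of reading support each morning for one child this week",
    prediction="comprehension check score rises by end of week",
    do="ran on 4 of 5 days; data collected daily",
    study="score rose slightly, but the Friday session was missed",
    act="adapt",
)
print(cycle.act)
```

Writing the prediction down before the Do step is what turns a “try it and see” into a genuine test: the Study step has something concrete to compare against.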
CHANGE IDEA: 10 minutes of support with reading each day
Aim: improve John’s reading comprehension by week 12, using 10 minutes of support each day with X tool/strategy.
Test the change idea in a range of conditions – AM/PM, peer reading, etc. This gives us immediate data to tell us whether it is working or not – we are not waiting a term or a year. A test is different from a pilot.
Testing ramp: simulated test (practice on an existing case, 11/3/14), then 1 child (Child 1 – 3 cycles: SW, health, legal, 2/4/14).
Source: Bromford P (2015), “What’s the difference between a test and a pilot?”
“If you think you are too small to make a difference, try sleeping with a mosquito” – His Holiness the 14th Dalai Lama
Thank you and goodbye. We hope you had fun and got an insight into how the model could help you. CYPIC is a community – engage and you will get support! Next part of the day: instructions…
Recap of the 3 questions:
1. What are we trying to accomplish?
2. How will we know that a change is an improvement?
3. What changes can we make that will result in improvement?
Was the change an improvement?
Let’s see what you think – a wee practical exercise.
YOUR TURN: what are your aims & change ideas?
The thinking part: the Model for Improvement provides an overarching approach to testing improvement at the local level, though we recognise there are other approaches that also work.
The doing part – the benefits of testing:
To increase the belief that this particular change works and will result in improvement for your clients
To learn how to adapt the change to conditions in your setting
To evaluate the costs and “side-effects” of changes
To minimise resistance when spreading the change throughout the organisation