1
Outcome Based Accountability Adapted for Luton Children’s Services by Carole Brooks, Performance Review Manager, Children & Learning Department, Luton Borough Council, September 2008
2
Mark Friedman, Fiscal Policy Studies Institute, Santa Fe, New Mexico www.resultsbasedaccountabilty.com www.raguide.com www.trafford.com
3
Purpose Improve outcomes Offer a partnership way of working that helps move from: outcomes to needs analysis to inter-agency service planning and development to joint commissioning.
4
Why is it so good? Adopted by government agencies (including the IDeA) Widely used in the US Used within many local authorities already Simple, flexible Common language, common sense, common ground
5
The Language Trap: too many terms, too few definitions, too little discipline. Terms: Benchmark, Target, Indicator, Goal, Result, Objective, Outcome, Measure. Modifiers: Measurable, Core, Urgent, Qualitative, Priority, Programmatic, Targeted, Performance, Incremental, Strategic, Systemic.
6
Outcome Accountability is made up of two parts. Population Accountability: about the well-being of WHOLE POPULATIONS, for neighbourhoods, districts, regions and countries. Performance Accountability: about the well-being of CLIENT POPULATIONS, for projects, agencies and service providers.
7
Definitions Outcomes Indicators Performance Measures
8
Definition: OUTCOMES. “A condition of well-being for children, adults, families or communities.” Examples: children born healthy; children succeeding in school; safe communities; a clean environment; a prosperous economy. Stated in plain language that people can understand, not in government jargon: “a condition of well-being for people in a place...”, e.g. “All babies in Luton are born healthy”.
9
Outcomes for Communities: Healthy Births Healthy Children and Adults Children Ready for School Children Succeeding in School Young People Staying Out of Trouble Stable Families Families with Adequate Income Safe and Supportive Communities
10
Every Child Matters Outcomes Being Healthy: enjoying good physical and mental health and living a healthy lifestyle. Staying Safe: being protected from harm and neglect and growing up able to look after themselves. Enjoying and Achieving: getting the most out of life and developing broad skills for adulthood. Making a Positive Contribution: to the community and to society and not engaging in anti-social or offending behaviour. Economic Well-being: overcoming socio-economic disadvantages to achieve their full potential in life.
11
Definition: INDICATORS A measure which helps quantify the achievement of an outcome. Rate of low birth weight babies Key stage test scores Burglary rate Air quality index Life expectancy rates How would we recognise these outcomes in measurable terms if we tripped over them? E.g. Low burglary rate helps to quantify a safe community
12
Definition: PERFORMANCE MEASURES A measure to evaluate how well a programme, agency or service system is working. Performance measures tell us how well service providers are working as opposed to the impact on whole populations (i.e. outcomes)
13
From ends to means... ENDS: OUTCOMES (“a condition of well-being for children, adults, families or communities”) and the INDICATORS that quantify them (“a measure which helps quantify the achievement of an outcome”) belong to Population Accountability. MEANS: PERFORMANCE MEASURES (“a measure to evaluate how well a programme, agency or service system is working”) belong to Performance Accountability.
14
Outcome, Indicator or Performance Measure? 1. Safe community: Outcome. 2. Crime rate: Indicator. 3. Average police response time: Performance Measure. 4. A graffiti-free community: Outcome. 5. % of buildings surveyed without graffiti: Indicator. 6. Children staying safe: Outcome. 7. Incidences of child abuse: Indicator. 8. Number of abuse investigations completed: Performance Measure.
15
MEANS (to improving outcomes), not ENDS (in themselves): Collaboration (e.g. Local Strategic Partnership); System reform (e.g. Progressing Integration Project); Service integration (e.g. Children’s Trusts); Funding pools (e.g. pooled budgets); Strategies and plans (e.g. Sustainable Community Strategy, Children and Young People Plan, Local Area Agreement).
16
The Leaking Roof (chart: inches of water against time). The elements of the framework: experience, measure, forecast, the story behind the baseline (causes), partners, what works, and an action plan (strategy).
17
OUTCOME: “Children Being Healthy”. INDICATORS (measures of the outcome): 1. infant mortality rate; 2. use of Class A drugs; 3. % teenage smokers. BASELINES: where we’ve been, where we’re going, where we want to be. STORY behind the baselines: the causes, the forces at work; what’s driving the baselines? PARTNERS with a role to play: public, private and voluntary sector, community groups, residents. WHAT WORKS: what would it take to turn the curve? Best practice, best hunches. ACTION PLAN: what we propose to do, how and by when. Data Development Agenda (Pt 1 and Pt 2).
18
The 7 Population Accountability Questions 1. What are the quality of life conditions we want for the children, adults and families who live in our community? 2. What would these conditions look like if we could see them? 3. How can we measure these conditions? 4. How are we doing on the most important of these measures? 5. Who are the partners that have a role to play in doing better? 6. What works to do better, including no cost/low cost ideas? 7. What do we propose to do?
19
Using The Framework: the outcome required; who your population is; the indicators measuring this; the story behind the baseline; the data development agenda; key partners; ideas to improve.
20
Turning the Curve – report card workshop 10 mins: Starting Points Consider your theme and vulnerable group Decide what “hats” you are going to wear to ensure all key partners have a say 10 mins: Establish Your Baseline Identify your indicators. Speculate on the forecast and draw the curve to turn 30 mins: The Story Behind The Baseline What are the causes/forces at work? What factors are driving the baseline? (Data Development Agenda) 30 mins: What Would It Take To Turn The Curve? What could work to do better? Each partner’s contribution No cost/low cost ideas 10 mins: Review your report and create an action plan Baseline, story, 3 best ideas (including one no cost/low cost idea) Start your action plan 10 mins: Plan to continue the work after today
21
Choosing Indicators Communication Power Does the indicator communicate to a broad range of audiences? Proxy Power Does the indicator say something of central importance about the outcome? Does the indicator bring along the data HERD? Data Power Is quality data available on a timely basis?
22
Criteria for Choosing Indicators. Outcome: Staying Safe. Each candidate indicator (Measures 1 to 8) is rated High, Medium or Low for Communication Power, Proxy Power and Data Power; candidates that score well but lack quality data go to the Data Development Agenda (Pt 1).
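The selection grid above can be sketched in a few lines of code. This is a minimal illustration only: the candidate indicators and H/M/L ratings below are invented, not Luton data.

```python
# Hypothetical indicator-selection grid: rate candidates H/M/L on the three
# powers, send low-data candidates to the Data Development Agenda, and rank
# the rest by combined score.
SCORE = {"H": 3, "M": 2, "L": 1}

candidates = [
    # (indicator, communication power, proxy power, data power)
    ("Rate of child abuse and neglect", "H", "H", "H"),
    ("Burglary rate",                   "H", "M", "H"),
    ("% feeling safe (survey)",         "M", "H", "L"),
]

headline, data_development = [], []
for name, comm, proxy, data in candidates:
    if data == "L":
        # Strong indicator, weak data: park it on the Data Development Agenda.
        data_development.append(name)
    else:
        headline.append((SCORE[comm] + SCORE[proxy] + SCORE[data], name))

headline.sort(reverse=True)  # highest combined power first
print("Headline:", [name for _, name in headline])
print("Data Development Agenda:", data_development)
```

The ranked headline list would feed Part 1 (primary indicators) on the next slide; everything parked for data reasons becomes the Data Development Agenda.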
23
Criteria for Choosing Indicators. Part 1: Primary indicators. Does the indicator communicate to a broad range of audiences? Part 2: Secondary indicators. Everything else that’s any good (nothing is wasted); used to help inform intelligence information. Part 3: Data Development Agenda. New data, and data in need of repair (quality, timeliness etc.).
24
The Matter of Baselines. Baselines have two parts: history and forecast. (Chart: indicator level from High to Low over time, history then forecast, contrasting point-to-point improvement with turning the curve. Is the forecast OK?)
25
Speculating on a baseline (chart: number against time). Choose an indicator (e.g. the under-18 conception rate). Start with NOW. “Backcasting”: where have we been? Forecasting: where are we going? Is the forecast OK? If not, that is the curve to turn.
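A simple way to speculate on the forecast half of a baseline is to extend the average year-on-year change in the history. The sketch below does exactly that; the under-18 conception rates are made-up numbers for illustration only.

```python
# Backcast from (invented) history, then forecast by extending the average
# year-on-year change -- the "where are we going?" curve the partnership
# would be trying to turn.
history = {2003: 60.1, 2004: 61.5, 2005: 62.8, 2006: 64.0}  # rate per 1,000

years = sorted(history)
changes = [history[b] - history[a] for a, b in zip(years, years[1:])]
avg_change = sum(changes) / len(changes)

forecast = {}
rate = history[years[-1]]
for year in range(years[-1] + 1, years[-1] + 4):
    rate += avg_change
    forecast[year] = round(rate, 1)

print(forecast)
```

A rising forecast like this one is the prompt for the rest of the framework: the story behind the baseline, partners, and what it would take to turn the curve.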
26
Story behind the baseline What are the policies and strategies? How does the process work? What factors are driving the baseline? What is research telling us? What is the history? What are the causes/forces at work? Data Development Agenda: any further data needs, or information not available
27
Key Partners Who are the stakeholders that can have an effect on the outcome or that the outcome affects? Are they engaged in this process? How will you get them engaged?
28
Ideas to improve What would it take to “turn the curve”? What is best practice? What actions can be put in place to improve the outcome? Include at least one low cost/no cost idea and at least one “off the wall” idea; try to include one “quick win”. Have ideas from all partners (including customers) been included? Think SHARP
29
Sharp Edges. SPECIFIC: is it specific enough? LEVERAGE: how much difference will it make? VALUES: is it consistent with our personal and community values? REACH: is it feasible and affordable?
30
NOW: ACT AND IMPROVE! Finalise the action plan: What do we propose to do? How much will it cost? Are funds available? Ensure every action has a responsible person, a timescale and a cost. How will you measure progress? Implement: ensure all stakeholders are aware of the actions required and take responsibility for their actions. Review: How much did we do? How well did we do it? Is anyone better off?
31
Performance Accountability For programmes, agencies and service systems
32
“All Performance Measures that have ever existed for any program in the history of the universe involve answering two sets of interlocking questions….”
33
Programme Performance Measures. QUANTITY: how much did we do? (number). QUALITY: how well did we do it? (percent).
34
Programme Performance Measures. EFFORT: how hard did we try? EFFECT: is anyone any better off?
35
Programme Performance Measures. The quadrants cross QUANTITY and QUALITY with EFFORT and EFFECT (input/output, cause and effect): How much service did we deliver? How well did we deliver it? How much change/effect did we produce? What quality of change/effect did we produce?
36
Programme Performance Measures. EFFORT: how much did we do? (quantity) and how well did we do it? (quality). EFFECT: is anyone better off? (as a number and as a percentage).
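The four quadrants can be computed directly from service records. The sketch below uses a hypothetical training programme with invented client records, purely to show where each quadrant's measure comes from.

```python
# Hypothetical client records for an imaginary training programme.
clients = [
    {"name": "A", "sessions": 10, "completed": True,  "got_job": True},
    {"name": "B", "sessions": 8,  "completed": True,  "got_job": False},
    {"name": "C", "sessions": 3,  "completed": False, "got_job": False},
    {"name": "D", "sessions": 12, "completed": True,  "got_job": True},
]

# How much did we do? (effort, quantity)
clients_served = len(clients)
sessions_delivered = sum(c["sessions"] for c in clients)

# How well did we do it? (effort, quality)
pct_completed = 100 * sum(c["completed"] for c in clients) / clients_served

# Is anyone better off? (effect, as a number and as a percentage)
num_better_off = sum(c["got_job"] for c in clients)
pct_better_off = 100 * num_better_off / clients_served

print(clients_served, sessions_delivered, pct_completed, pct_better_off)
```

Note how the lower quadrants depend on a definition of "better off" (here, got a job): that definition, not the counting, is the hard part of performance accountability.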
37
Education. How much did we do? Number of students. How well did we do it? Pupil/student ratio. Is anyone better off? Number of 16 year olds with 5 A to C GCSEs and number with good school attendance; percentage of 16 year olds with 5 A to C GCSEs and percentage with good school attendance.
38
Health Clinic. How much did we do? Number of patients treated. How well did we do it? Percentage of patients treated in less than one hour. Is anyone better off? Recovery number and recovery rate (for patients of the clinic).
39
Fire and Rescue Service. How much did we do? Number of responses. How well did we do it? Response time. Is anyone better off? Number and rate of fire-related deaths (in the catchment area).
40
General Motors. How much did we do? Number of production hours; number of tons of steel. How well did we do it? Employees per vehicle produced. Is anyone better off? Number of cars sold, amount of profit, residual value after 2 years; % market share, profit per share, % residual value after 2 years.
41
Customer Satisfaction. Quality of effort (how well did we do it?): Did we treat you well? Quality of effect (what quality of change did we produce?): Did we help you with your problems? (The world’s simplest and yet complete customer satisfaction survey.)
42
Not All Performance Measures Are Created Equal... How much did we do? LEAST important. How well did we do it? Also very important. Is anyone better off? MOST important.
43
The Matter of Control. How much did we do? MOST control. How well did we do it? LESS control. Is anyone better off? LEAST control: we have the least control over the most important matters.
44
The Matter of Use. The first purpose is to improve results and services. Avoid the “performance measurement equals punishment” trap; acknowledge the experience as real. Work to create a healthy service improvement culture that sees performance measurement as a means to an end, not the end. Don’t create perverse measures that can be ‘fixed’ or that don’t reflect what you are trying to measure. If possible, use a balanced set of measures.
45
Comparing Performance. 1. To ourselves first: can we do any better than our own history? (Using a baseline, a chart on the wall.) 2. To others: when it’s a FAIR apples-to-apples comparison. 3. To standards: when we know what good performance is. Reward? Punish?
46
The Matter of Standards. 1. Quality-of-effort standards are sometimes WELL ESTABLISHED: childcare staffing ratios, application processing times, disability access facilities, child abuse report response times. BUT 2. Quality-of-effect standards are almost always EXPERIMENTAL: hospital recovery rates, employment placement and retention rates, recidivism rates. AND 3. Both require a LEVEL PLAYING FIELD and an ESTABLISHED RECORD of what good performance is.
47
Advanced Baselines (chart: indicator (units) against timeline (years), showing your baseline, a comparison baseline, and a target or standard line). Avoid publicly declaring targets by year if possible; instead, count anything better than the baseline as progress. Create targets only where they are FAIR and USEFUL.
48
Summary of Performance Measures. How much did we do? Number of customers served (by customer characteristic); number of activities (by type of activity). How well did we do it? Common measures (workload ratio, staff turnover rate, staff morale, percentage of staff fully trained, worker safety, unit cost; customer satisfaction: did we treat you well?) and activity-specific measures (percentage of actions timely and correct, percentage of clients completing the activity, percentage of actions meeting standards etc.). Is anyone better off? Measured as both a quantity and a percentage, in terms of skills/knowledge, attitude/opinion (including customer satisfaction: did we help you with your problems?), behaviour and circumstances.
49
Performance Accountability: Getting from Talk to Action. Customers; performance measures (how much did we do? how well did we do it? is anyone better off?); baselines and trend, with the curve to turn (Data Development Agenda); the story behind the baselines (information and research agenda about causes); partners; what works (information and research agenda about solutions); criteria; strategy and action plan.
50
The 7 Performance Accountability Questions 1. Who are our customers? 2. How can we measure if our customers are better off? (Lower Right Quadrant Measures) 3. How can we measure if we are delivering services well? (Upper Right Quadrant measures) 4. How are we doing on the most important of these measures? 5. Who are our partners that have a role to play in doing better? 6. What works to do better, including no-cost and low-cost ideas? 7. What do we propose to do?
51
Identifying Performance Measures: The Five Step Method. Step 1: how much did we do? Number of customers served (by customer characteristic). Step 2: number of activities (by type of activity). Step 3: how well did we do it? % common measures and % activity measures; and is anyone better off? (quantity and percentage).
52
Identifying Performance Measures: The Five Step Method (continued). The grid lists candidate measures under each quadrant: how much? (number of customers served by customer characteristic; number of activities by type of activity), how well? (quality), better off (number) and better off (%). Step 4: identify the measures with good data and prioritise them into headline and secondary measures. Step 5: identify and prioritise data development measures (DD1, DD2, DD3) for the Data Development Agenda.
53
Fitting it all Together Population and Performance Accountability
54
The Linkage between Population and Customer Outcomes. POPULATION ACCOUNTABILITY (population outcomes): Being Healthy (rate of low birth-weight babies); Staying Safe (rate of child abuse and neglect); Enjoying and Achieving (% with five GCSEs grade A to C). PERFORMANCE ACCOUNTABILITY (customer outcomes, e.g. a child protection service): number of investigations completed; % initiated within 24 hours of report; number of repeat abuse/neglect cases; % of repeat abuse/neglect cases. The service’s contribution relationship to the population outcome rests on alignment of measures and appropriate responsibility.
55
Relationship between Performance Measures and Indicators. Indicators are about the TOTAL POPULATION; performance measures are about CLIENT populations (programme clients, agency clients, service system clients, of increasing size).
56
Every time you make a presentation, use a two-part approach. Population Accountability: the outcome to which you contribute most directly; indicators; story; partners; what would it take?; your role, as part of a larger strategy. Performance Accountability: your service; performance measures; story; partners; action plan to get better; your role.
57
How this fits with Narrowing the Gap Or how to turn the curve AND narrow the gap at the same time…..
58
Turning the curve AND Narrowing the gap Improving outcomes for vulnerable populations Use the “turning the curve” templates to narrow the gap.
59
Narrowing The Gap (chart: indicator result against time, from PAST through NOW to the FUTURE; good performance is low). “Backcasting”: where have we been? Forecasting: where are we going? Is that OK? Two curves are plotted: all children and young people, and the vulnerable group. The curve to turn must be bigger for vulnerable groups in order to narrow the gap.
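The two-curve chart can be checked numerically: track the same indicator for the whole population and the vulnerable group, and watch the gap between them. The figures below are invented for illustration (an indicator where good performance is low).

```python
# Invented baselines for the same indicator: all children and young people
# versus a vulnerable group. Good performance is low.
all_cyp    = {2006: 12.0, 2007: 11.5, 2008: 11.0}
vulnerable = {2006: 24.0, 2007: 23.8, 2008: 23.7}

gap = {year: vulnerable[year] - all_cyp[year] for year in all_cyp}
print(gap)

# Both curves are falling, yet the gap can still widen: the vulnerable
# group's curve must turn faster than the whole population's.
widening = gap[2008] > gap[2006]
```

This is the point of the slide: turning the curve for everyone is not enough; the vulnerable group's curve has to turn by more, or the gap grows even as overall performance improves.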
60
Three Wise Men... “Statistics: the only science that enables different experts using the same figures to draw different conclusions.” (Evan Esar) “Not everything that can be counted counts, and not everything that counts can be counted.” (Albert Einstein) “It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts.” (Sir Arthur Conan Doyle)
61
Toolkit Luton: http://www.cypp.luton.gov.uk/go/narrowingthegap/ Mark Friedman: www.resultsbasedaccountabilty.com www.raguide.com