Results-Based Accountability for community organizations and networks
Session 2: Performance Accountability for Programs, Agencies and Service Systems
Facilitators: Lorna McCue and Pam Kinzie, HC Link
Guest Presenter: Andrew Taylor, Taylor Newbury Consulting
Web/Teleconference Tips
- Synchronize your voice with the web connection.
- Mute/Un-mute: *1 (please do not put the call ON HOLD).
- Private and group chat are available.
- Screen size: if the whole page is not displayed, an image of the page will appear for navigation.
- Disconnected? See instructions.
- Turn off your screen saver or power saver.
Contact Us Toll-free: Fax:
Introductions: Who's Online?
Facilitators: Pam Kinzie, HC Link; Lorna McCue, HC Link; Andrew Taylor, Taylor Newbury Consulting
Please indicate your name and organization.
Session 2 Agenda
1. Welcome and Introductions
2. Learning Objectives & Agenda Review
3. Recap from Session 1
4. Performance Measures: The 4 Quadrants
   - Choosing Headline Measures
   - Comparing Performance
   - Turn the Curve Report: Performance
   - Performance Accountability Questions
5. Performance Measures: Real-Life Examples
6. "Homework" Assignment
7. Q&A
8. Wrap-Up
Learning Objectives
After participating in this webinar you will be able to:
- Define performance accountability
- Describe performance measures in each of the 4 quadrants
- Identify how performance accountability may be useful to your organization
- Take the next steps to find out more about RBA
Results Accountability is made up of two parts:
Performance Accountability is about the well-being of CLIENT POPULATIONS, for programs, agencies and service systems.
Population Accountability is about the well-being of WHOLE POPULATIONS, for communities, cities, counties, states and nations.
“All performance measures that have ever existed for any program in the history of the universe involve answering two sets of interlocking questions.”
Performance Measures
Quantity: How much did we do? (#) vs. Quality: How well did we do it? (%)
Effort: How hard did we try? vs. Effect: Is anyone better off?
Crossing Effort/Effect with Quantity/Quality (How Much/How Well) gives the four quadrants.
Education Example: The Four Quadrant Questions
Quantity of Effort (How much did we do?): How much service did we deliver?
Quality of Effort (How well did we do it?): How well did we deliver it?
Quantity of Effect (Is anyone better off?): How much change / effect did we produce?
Quality of Effect (Is anyone better off?): What quality of change / effect did we produce?
Education Example
Quantity of Effort (How much did we do?): Number of students
Quality of Effort (How well did we do it?): Student-teacher ratio
Quantity of Effect (Is anyone better off?): Number of high school graduates
Quality of Effect (Is anyone better off?): Percent of high school graduates
Education Example (continued)
Quantity of Effort (How much did we do?): Number of students
Quality of Effort (How well did we do it?): Student-teacher ratio
Quantity of Effect (Is anyone better off?): Number of 9th graders who graduate on time and enter college or employment after graduation
Quality of Effect (Is anyone better off?): Percent of 9th graders who graduate on time and enter college or employment after graduation
All Data Have Two Incarnations
Lay Definition: HS Graduation Rate
Technical Definition (one of several possibilities):
- % enrolled June 1 who graduate June 15
- % enrolled Sept 30 who graduate June 15
- % enrolled in 9th grade who graduate in 12th grade
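To make the point concrete, here is a minimal sketch (not from the original slides) using hypothetical enrollment counts, showing how each technical definition yields a different "graduation rate" from the same graduating class:

# Hypothetical numbers for illustration only; the point is that the same count
# of graduates produces very different rates depending on the denominator.
graduates_june_15 = 180   # students who graduate on June 15
enrolled_june_1 = 185     # still enrolled two weeks before graduation
enrolled_sept_30 = 210    # enrolled at the start of the school year
entered_9th_grade = 250   # original 9th-grade cohort

definitions = {
    "% enrolled June 1 who graduate June 15": graduates_june_15 / enrolled_june_1,
    "% enrolled Sept 30 who graduate June 15": graduates_june_15 / enrolled_sept_30,
    "% enrolled 9th grade who graduate in 12th grade": graduates_june_15 / entered_9th_grade,
}

for label, rate in definitions.items():
    print(f"{label}: {rate:.0%}")   # prints roughly 97%, 86%, 72%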
RBA Categories Account for All Performance Measures (in the history of the universe)
Quality of Effort (How well did we do it?): efficiency, admin overhead, unit cost, staffing ratios, staff turnover, staff morale, access, waiting time, waiting lists, worker safety, customer satisfaction (quality of service delivery)
Effect (Is anyone better off?): client results or client outcomes, customer satisfaction (customer benefit), cost / benefit ratio, return on investment, effectiveness, value added, productivity, benefit value
Other familiar terms that fall into these quadrants: process, input, output, product, impact, cost, TQM, efficiency, effectiveness

The world's simplest complete customer satisfaction survey:
1. Did we treat you well? (quality of service delivery)
2. Did we help you with your problems? (customer benefit)
Not All Performance Measures Are Created Equal
How much did we do? (Quantity of Effort): least important
How well did we do it? (Quality of Effort): also very important
Is anyone better off? (Effect): most important
The Matter of Control
Effort (How much did we do? How well did we do it?): most control
Effect (Is anyone better off?): least control; this is where PARTNERSHIPS come in
The Matter of Use
1. The first purpose of performance measurement is to improve performance.
2. Avoid the "performance measurement equals punishment" trap: create a healthy organizational environment, start small, and build bottom-up and top-down simultaneously.
Comparing Performance
1. To Ourselves First: Can we do better than our own history? Use a baseline (the chart on the wall).
2. To Others: when it is a fair apples-to-apples comparison. (Reward? Punish?)
3. To Standards: when we know what good performance is.
The Matter of Standards
1. Quality of Effort standards are sometimes WELL ESTABLISHED: child care staffing ratios, application processing time, handicap accessibility, child abuse response time.
2. BUT Quality of Effect standards are almost always EXPERIMENTAL: hospital recovery rates, employment placement and retention rates, recidivism rates.
3. AND both require a LEVEL PLAYING FIELD and an ESTABLISHED RECORD of what good performance is.
Advanced Baseline Display
Show your baseline, a comparison baseline, the goal (line), and any target or standard.
Create targets only when they are FAIR and USEFUL.
Instead: count anything better than baseline as progress, and avoid publicly declaring targets by year if possible.
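A minimal sketch (hypothetical numbers, not from the original slides) of what "count anything better than baseline as progress" can look like in practice, comparing observed values to the baseline forecast rather than to a fixed yearly target:

# Hypothetical data: a measure where higher is better, with a baseline forecast
# of where the numbers were headed if nothing changed.
baseline_forecast = {2021: 62.0, 2022: 61.0, 2023: 60.0}   # projected %
actual = {2021: 62.0, 2022: 63.5, 2023: 65.0}              # observed %

for year, forecast in baseline_forecast.items():
    progress = actual[year] > forecast
    status = "better than baseline: progress" if progress else "at or below baseline"
    print(f"{year}: actual {actual[year]}% vs. forecast {forecast}% ({status})")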
Separating the Wheat from the Chaff: Types of Measures Found in Each Quadrant
How much did we do? (#): clients/customers served; activities (by type of activity)
How well did we do it? (%): common measures, e.g. client-staff ratio, workload ratio, staff turnover rate, staff morale, % staff fully trained, % clients seen in their own language, worker safety, unit cost; activity-specific measures, e.g. % timely, % clients completing activity, % correct and complete, % meeting standard
Is anyone better off? (# and %): skills / knowledge (e.g. parenting skills); attitude / opinion (e.g. toward drugs); behavior (e.g. school attendance); circumstance (e.g. working, in stable housing)
Point in time vs. point-to-point improvement
Choosing Headline Measures and the Data Development Agenda
List the candidate # and % measures in each quadrant (How much did we do? How well did we do it? Is anyone better off?). Then select the most important measures with good data as headline measures (#1, #2, #3 Headline), and place measures you want but do not yet have good data for on the Data Development Agenda (#1, #2, #3 DDA).
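A minimal sketch of that sorting step (the candidate measures and field names below are hypothetical, not from the deck):

# Hypothetical candidate measures: (measure, is_important, have_good_data)
candidates = [
    ("% clients in stable housing 6 months after exit", True, True),
    ("% clients reporting improved parenting skills", True, True),
    ("% clients employed 12 months after program completion", True, False),
    ("# brochures distributed", False, True),
]

headline = [m for m, important, has_data in candidates if important and has_data]
data_development_agenda = [m for m, important, has_data in candidates if important and not has_data]

print("Headline measures:", headline)
print("Data Development Agenda:", data_development_agenda)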
One Page Turn the Curve Report: Performance
Program: _______________
Performance measure(s) with baseline
Story behind the baseline
Partners
Three best ideas about what works, including a no-cost / low-cost idea and an "off the wall" idea
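For teams that track these reports electronically, here is one possible way (field names are my own, not part of the template) to hold the same headings in a structured record:

# Hypothetical structure mirroring the one-page report headings above.
turn_the_curve_report = {
    "program": "",
    "performance_measures": [],          # headline measure(s) with baseline data
    "story_behind_the_baseline": "",
    "partners": [],
    "three_best_ideas_what_works": [],   # include a no-cost / low-cost idea
    "off_the_wall_idea": "",
}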
Performance Accountability for programs, agencies and service systems
1. Who are our customers?
2. How can we measure if we are delivering services well?
3. How can we measure if our customers are better off?
4. How are we doing on the most important of these measures?
5. Who are the partners that have a role to play in doing better?
6. What works, what could work, to do better?
7. What do we propose to do?