Presentation transcript:

Accelerating Progress for Neighborhood Level Outcomes for Youth and Families Carla M. Taylor, PhD Senior Associate Ideas Into Action

Session Results
By the end of this session you will…
• Have a conceptual understanding of the RBA framework
• Understand how RBA can strengthen cross-system work to improve outcomes for youth and families
• Identify an opportunity to apply RBA in your own efforts

Introductions

What is RBA?
A commonsense framework to help us:
• Improve the quality of life for communities, individuals and families (Population Accountability); and
• Improve the quality and effectiveness of organizations, agencies and programs (Performance Accountability).
Developed by Mark Friedman and outlined in his book "Trying Hard Is Not Good Enough."

Key Ideas of RBA
• Common sense: Start at the end (results) to determine what you ultimately want to achieve
• Common ground: Stakeholders rally around and contribute to a shared purpose
• Common language: Use "language discipline" to agree on definitions and meaning for specific words

Key Ideas of RBA
• A data-driven, transparent way of making decisions and communicating
• Two levels of accountability (that are often confused):
  - Population or whole community = Population Accountability
  - Service system, agency or program = Performance Accountability

DEFINITIONS (Language Discipline)

POPULATION ACCOUNTABILITY
RESULT: A condition of well-being for children, adults, families or communities. Examples: children succeed in school; safe neighborhood; children are safe; children are healthy.
INDICATOR: A measure which helps quantify the achievement of a result. Examples: rate of children enrolled in early childcare; rate of children born healthy; rate of subsequent pregnancies.

PERFORMANCE ACCOUNTABILITY
PERFORMANCE MEASURE: A measure of how well a program, agency or service system is working. Three types: 1. How much did we do? 2. How well did we do it? 3. Is anyone better off? (= customer results)

So what we did a few years ago is develop a set of definitions that would allow us to have a disciplined conversation about this very complex work we're trying to do. The purpose of these definitions is not to impose words on people. Words like "result" or "outcome" are just labels for ideas. If you think about it for a minute, that's what words are: labels for ideas. And the same idea can have many different labels. What's important here are not the labels; you can pick whatever labels you like. What's important are the ideas, and that we manage to keep three ideas separate at the beginning of this work. [Read the ideas and the examples for Results and Indicators.] Now this last category, performance measures, are measures of how well a program, agency or service system is working. There are many different ways to categorize performance measures, but I believe that all performance measures can be categorized into one of these three categories: How much did we do? How well did we do it? Is anyone better off? This last category we sometimes call "customer results" or "customer outcomes." And if you do nothing else in terms of your language convention, I would strongly encourage you: whenever you want to use a word like "outcome" or "result" and you're talking about a program or agency, put a modifier in front of it. Call it "program results" or "client outcomes," something to distinguish it from the use of the words results and outcome to mean the whole population. This is the single biggest source of language confusion in the U.S. today.

From www.raguide.org: The Language of Accountability

The most common problem in this work is the problem of language. People come to the table from many different disciplines and many different walks of life. And the way in which we talk about programs, services and populations varies, literally, all over the map. This means that the usual state of affairs in planning for children, families, adults, elders and communities is a Tower of Babel, where no one really knows what the other person is saying, but everyone politely pretends that they do. As a consequence, the work is slow, frustrating and often ineffective. It is possible to exercise language discipline in this work. And the way to do this is to agree on a set of definitions that start with ideas and not words. Words are just labels for ideas, and the same idea can have many different labels. The following four ideas are the basis for definitions used at the beginning of this work; alternative labels are offered.

Results (or outcomes or goals) are conditions of well-being for children, adults, families or communities, stated in plain English (or plain Spanish, or plain Korean...). They are things that voters and taxpayers can understand. They are not about programs or agencies or government jargon. Results include: "healthy children, children ready for school, children succeeding in school, children staying out of trouble, strong families, elders living with dignity in settings they prefer, safe communities, a healthy clean environment, a prosperous economy." (An interesting alternative definition of a result is provided by Con Hogan: "A condition of well-being for people in a place - stated as a complete sentence." This suggests a type of construction for a result statement as "All ______ in ______ are _____." e.g. "All babies in Vermont are born healthy.")

Indicators (or benchmarks) are measures which help quantify the achievement of a result. They answer the question "How would we recognize these results in measurable terms if we fell over them?" So, for example, the rate of low-birthweight babies helps quantify whether we're getting healthy births or not. Third grade reading scores help quantify whether children are succeeding in school today, and whether they were ready for school three years ago. The crime rate helps quantify whether we are living in safe communities, and so on.

Performance Measures are measures of how well public and private programs and agencies are working. The most important performance measures tell us whether the clients or customers of the service are better off. We sometimes refer to these measures as client or customer results (to distinguish them from cross-community population results for all children, adults or families). It is sometimes useful to distinguish "program performance measures" from "agency performance measures" from "service system performance measures."

Strategies are coherent collections of actions which have a reasoned chance of improving results. Strategies are made up of our best thinking about what works, and include the contributions of many partners. No single action by any one agency can create the improved results we want and need.

The principal distinction here is between ends and means. Results and indicators are about the ends we want for children and families. Strategies and performance measures are about the means to get there. Processes that fail to make these crucial distinctions often mix up ends and means, and such processes tend to get mired in the all-talk-no-action circles that have disillusioned countless participants in past efforts. You actually have choices about which labels to use in your work. And clarity about language at the start will help you take your work from talk to action.

What about Mission and Vision, Values, Goals, Objectives, Problems, Issues, Inputs and Outputs? First, remember that words are just labels for ideas. These terms have no natural standard definition that bridges across all the different ways they are used. They are terms of art which can be, and are, used to label many different ideas. This is why we pay so much attention to getting language discipline straight at the very beginning. It's the ideas that are important, not the words. So you can choose to label the ideas in this guide with any words you like, provided you are consistent. Many of us have grown up with these traditional words in strategic planning and budgeting. Where do they fit?

The word "mission" is usually used in relation to an organization, agency, program, initiative or effort. It is therefore mostly used in connection with agency or program performance accountability. Mission statements are usually concise statements of the purpose of an organization, sometimes also telling why and how the organization does what it does. Mission statements can be useful tools in communicating with internal and external stakeholders. It is possible to construct a mission statement from the performance measurement ideas in the upper right ("How well did we deliver service?") and lower right ("Is anyone better off?") quadrants of the performance measurement framework. For example: "Our mission is to help our clients become self-sufficient ('Is anyone better off?' lower right) by providing timely, family-friendly, culturally competent job training services ('How well did we deliver service?' upper right)." One mistake that is often made is that organizations spend months and sometimes years trying to craft the perfect mission statement before any other work can proceed. In the FPSI framework, mission statements are set aside, allowing the work of identifying and using performance measures to proceed quickly. Then, on a parallel track, a small group can, if it is useful, use the work of the performance measurement groups to craft a workable mission statement.

The word "vision" is often used to convey a picture of a desired future, often one that is hard but possible to attain. This is a powerful idea, and in fact one can think of the set of desired results for children and families as one way of articulating such a vision: "We want our community to be one which is safe and supportive, where all children are healthy and ready for school, where all children succeed in school, and grow up to be productive and contributing adults." This is an example of a vision statement made up of desired results or ends. It is possible to craft such a statement before or after the development of results.

The word "values" in some ways defies definition. It is about what we hold most dear, how we view right and wrong, how we believe we should act, and how those beliefs are, in fact, reflected in our actions. Our values underlie all of the work we do, and that is nowhere more true than in the work on the well-being of children, families and communities. Our values will guide our choice of results for children and families and the decisions we make about how we and our partners take action to improve those results.

The word "goal" is often used interchangeably with "result" and "outcome" to label the idea of a condition of well-being for children, adults, families or communities (as in the case of Georgia, Missouri and Oregon, for example). The word goal has many other common usages as well. It often serves as an all-purpose term to describe a desired accomplishment: "My goal for this month is to fix the roof." "Our goal is to increase citizen participation in the planning process." "The primary goal of the child welfare system is to keep children safe." And so forth. The word goal (or target) is sometimes used to describe the desired future level of achievement for an indicator or performance measure: "Our goal is 95% high school graduation in 5 years." "Our goal is to improve police response time to under 3 minutes." These are widely different usages. Still another use of the word "goal" is in relation to an implementation plan. Given a strategy and action plan to improve a particular result (children ready for school, for example), it is possible to structure the action plan as a series of planned accomplishments (goals) with timetables and assigned implementation responsibility. For example, a goal in a "children ready for school" plan might be to "increase funding for child care by 25% this year and 50% next year." This is a specific action which will contribute to achieving the result. There is nothing wrong with any of these usages, provided they are clearly distinguished, used consistently and do not confuse the underlying concepts labeled results, indicators, strategies and performance measures discussed above.

The word "objective" is often paired with the word goal to specify what amounts to a series of "subgoals" required to achieve the "higher" goal. The set of terms "mission, goal and objective" has a long history in the military to describe the strategic and tactical components of a large or small action or engagement, and some of their usage in the business sector and the public and private service sector derives from this history. In this framework, the terms goal and objective are most often used to structure the action plan and specify who will do what, how, and by when.

The words "problem" and "issue" are used in more ways than just about any planning term. They can be used to describe almost anything: "The problem with this computer is that the keyboard is too small." "The problem with our community is that there is not a safe place for children to play." "We must solve the issue of affordability if we are to provide child care for all who need it." These are three different uses of the words, and there are countless others. Again, there is nothing wrong with any of these usages, provided that they do not interfere with the language discipline discussed above about ends and means.

Change Agent vs. Industrial Models: Much of the tradition of performance measurement comes from the private sector, and in particular the industrial part of the private sector. Work measurement, dating back to the time and motion studies of the late 19th and early 20th centuries, looked at how to improve production. Industrial processes turn raw materials into finished products. The raw materials are the inputs; the finished products are the outputs. The words "input" and "output" are commonly used categories for performance measures. There is no standard usage. The word "input" is most often used to describe the staff and financial resources which serve to generate "outputs." "Outputs" are most often units of service. This model does not translate very well to public or private sector enterprises which provide services. It does not make much sense to think of clients, workers and office equipment as inputs to the service sausage machine, churning out satisfied, cured or fixed clients. Instead we need to begin thinking about services in terms of the change agent model. In this model, the agency or program provides services which act upon the environment to produce demonstrable changes in the well-being of clients, families, or communities. If the input/output language is maintained, then providing service is the input, and change in customers' lives is the output.

One common situation illustrates the problems which arise when industrial model thinking is applied to services. It is the belief that the number of clients served is an output. ("We have assembled all these workers in all this office space; and we are in the business of processing unserved clients into served clients.") This misapplication of industrial performance concepts to services captures much of what is wrong with the way we measure human service performance today. "Number of clients served" is not an output. It is an input, an action which should lead to a change in client or social conditions, the real output we're looking for. ("We served 100 clients - input - and 50 of them got jobs - output - and 40 of them still had jobs a year later - an even more important output.") This is a whole different frame of mind and a whole different approach to performance measurement.

A closely related industrial model problem involves treating dollars spent as inputs, and clients served as outputs. In this distorted view, dollars are raw materials, and whatever the program happens to do with those dollars are outputs. It's easy to see why this over-simplification fails to meet the public's need for accountability. In this construct, the mere fact that the government spent all the money it received is a type of performance measurement. This is surely a form of intellectual, and perhaps literal, bankruptcy. In this perverse scheme, almost all the agency's data is purportedly about outputs. This gives the agency the appearance of being output-oriented and very progressive. It just doesn't happen to mean anything.

Much of the confusion about performance measurement derives from the attempt to impose industrial model concepts on change agent services. The best model would be one which could span industrial and change agent applications. Some government services still involve industrial-type production (although these are often the best candidates for privatization and a diminishing breed). In other cases, the service itself, or components of the service, have product-like characteristics and industrial model concepts apply well. But most government and private sector human services fall into the change agent category. The approach to performance measurement described here can be used for either industrial or change agent applications. (Excerpt from "A Guide to Developing and Using Performance Measures," The Finance Project, 1997)

Definitions Pop Quiz: result, indicator, or performance measure?
• Safe community
• Crime rate
• Average police department response time
• A community without graffiti
• Percent of surveyed buildings without graffiti
• Installation of new street lights to make people feel safer

Population Accountability: Getting from Talk to Action
• Results
• Experience
• Indicators
• Baselines: trend and "turned curve" (Data Development Agenda)
• Story behind the baselines (Information & Research Agenda about causes)
• Partners
• What works (Information & Research Agenda about solutions)
• Criteria
• Strategy and Action Plan
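The baseline and "turned curve" steps above come down to a simple comparison: project the historical trend of an indicator forward and ask whether actual values are beating that projection. The sketch below is a minimal illustration and is not from the presentation; the indicator and the numbers are hypothetical.

```python
# Minimal "turn the curve" sketch (hypothetical data, illustrative only).
# Fit a straight-line baseline to the history of an indicator, project it
# forward, and compare actual values against the projection.

def linear_baseline(years, values):
    """Ordinary least-squares line through (year, value) points: (slope, intercept)."""
    n = len(years)
    mean_x, mean_y = sum(years) / n, sum(values) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(years, values))
             / sum((x - mean_x) ** 2 for x in years))
    return slope, mean_y - slope * mean_x

# Hypothetical indicator where lower is better: percent of children NOT ready
# at kindergarten entry. Baseline years, then years after the strategy started.
history = {2009: 42.0, 2010: 43.5, 2011: 44.0, 2012: 45.5}
actuals = {2013: 44.0, 2014: 40.5}

slope, intercept = linear_baseline(list(history), list(history.values()))

for year, actual in actuals.items():
    projected = slope * year + intercept
    turned = actual < projected   # below the projected trend is an improvement here
    print(f"{year}: projected {projected:.1f}%, actual {actual:.1f}%,",
          "curve turned" if turned else "still on the old trend")
```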

The 7 Population Accountability Questions
1. What are the quality-of-life conditions we want for the children, adults and families who live in our community? (Results)
2. What would these conditions look like if we could see them?
3. How can we measure these conditions? (Indicators)
4. How are we doing on the most important of these measures? (Baseline & story)
5. Who are the partners that have a role to play in doing better?
6. What works to do better, including no-cost and low-cost ideas?
7. What do we propose to do?

Performance Accountability
• Who are your customers/clients?
• What is your contribution/role?
• How will you know how well you are doing your work?
• How will you know what impact your work has?

Performance Measures
The quadrant crosses effort vs. effect with quantity (#) vs. quality (%):
• Quantity of effort: How much did we do?
• Quality of effort: How well did we do it?
• Quantity (#) and quality (%) of effect: Is anyone better off?
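As an illustration only (not from the presentation), the short sketch below tags a few hypothetical job-training measures with the quadrant question they answer.

```python
# Minimal sketch of the performance-measure quadrant (illustrative only):
# effort vs. effect crossed with quantity (#) vs. quality (%).
QUADRANT = {
    ("effort", "#"): "How much did we do?",
    ("effort", "%"): "How well did we do it?",
    ("effect", "#"): "Is anyone better off? (#)",
    ("effect", "%"): "Is anyone better off? (%)",
}

# Hypothetical measures for a job-training program.
measures = [
    ("Clients served", "effort", "#"),
    ("Percent of clients seen within two weeks of referral", "effort", "%"),
    ("Number of clients who got jobs", "effect", "#"),
    ("Percent of clients still employed a year later", "effect", "%"),
]

for name, axis, unit in measures:
    print(f"{QUADRANT[(axis, unit)]:30} {name}")
```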

Karin Scott Research Director, Delta Health Alliance

1. Indianola Promise Community (IPC) Overview
2. IPC Program Accountability
3. IPC Population-Level Accountability
4. IPC Staff Accountability

1. IPC Overview

From initiative-wide goals to a system of care…
• Community-wide goals, not isolated projects
• Thinking beyond performance measures

Building a pipeline of care to make sure that
• young children are healthy,
• kindergarteners are prepared, and
• struggling students get extra help quickly.
When we do this, young people can be ready to
• graduate from high school,
• thrive in college, and
• find jobs with good pay.

Not just school-based programs with academic goals: health, family and community supports act as insulation and protective piping. The IPC offers a collective approach to giving children the opportunity to excel, from birth to college graduation; the pipeline is constructed through programs and services that complement and build on each other.

Results-Based Accountability
Tracking outcomes: each program is carefully evaluated against its goals and objectives to ensure its success.
• How do we measure the effectiveness of IPC programs?
• How do we connect program outcomes to initiative-wide goals?

2. IPC Program-Level Accountability

Identifying a baseline and setting targets for each program.

1st Phase:
• Identify evidence-based strategies that speak to the results
• Develop program-specific goals and performance measures
• Agree on common language
2nd Phase:
• MOAs with external partners
• Enrollment and CONSENT!
• Quality, frequent data collection
• Shared data system
3rd Phase:
• Development of a program-level scorecard
• Set baselines and meaningful targets
• Formal program-level accountability system
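The third phase above calls for a program-level scorecard with baselines and meaningful targets. The sketch below shows one minimal way such a scorecard row could be represented and checked; it is illustrative only and does not describe IPC's actual scorecard or data system, and the programs and numbers are hypothetical.

```python
# Minimal program-level scorecard sketch: baseline, target, and current value
# per performance measure. Hypothetical programs and numbers only.
from dataclasses import dataclass

@dataclass
class ScorecardRow:
    program: str
    measure: str          # ideally an "Is anyone better off?" measure
    baseline: float       # value when tracking began
    target: float         # agreed, meaningful target
    current: float        # most recent value

    def progress(self) -> float:
        """Share of the baseline-to-target distance covered so far."""
        span = self.target - self.baseline
        return (self.current - self.baseline) / span if span else 0.0

rows = [
    ScorecardRow("Early literacy tutoring",
                 "Percent of K students meeting readiness benchmark", 45.0, 70.0, 64.0),
    ScorecardRow("Home visiting",
                 "Percent of enrolled children fully immunized by age 2", 80.0, 95.0, 83.0),
]

for row in rows:
    print(f"{row.program}: {row.measure}: {row.progress():.0%} of the way to target")
```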

3. IPC Population-level Accountability

4. IPC Staff Accountability

IPC STAFF ACCOUNTABILITY MEETINGS (SAMs)
• Monthly staff Performance STAT
• 2-3 programs selected at random
• Full staff meeting, including: CEO, Project Director, program and data teams, finance department
• How much? How well? Is anyone better off?
• Focus is on what is NOT working
• Not a "presentation" but a working meeting where the leadership team gives direct feedback to program staff

And now we are beginning to see results at the population level…

More children in Indianola were considered "ready" at kindergarten entry in 2014 (+19%).
Sources: NWEA 2013-2014 Goal Score Translation Chart; STAR Early Literacy assessment, Fall 2014.

Proficiency rates on state tests, in both math and English, improved in Indianola schools (+3%).

Table Talk
• What's one idea or concept that stood out for you?
• What's most compelling about the RBA framework or its application in IPC?
• What's one simple action you could take (in your role) to strengthen the results orientation of your own work?

Reflections & Wrap Up

Carla Taylor  202-454-4142  carla.taylor@cssp.org
Karin Scott  662-686-3871  kscott@deltahealthalliance.org

https://twitter.com/CtrSocialPolicy https://www.facebook.com/pages/Center-for-the-Study-of-Social-Policy www.cssp.org