Using Data for Program Improvement

Presentation transcript:

Using Data for Program Improvement
Christina Kasprzak, NECTAC/ECO
Ann Bailey, NCRRC
July 2010
Early Childhood Outcomes Center

Systems Thinking
- Systems are made up of interrelated, interconnected components.
- Systems change involves changing the capacity, interrelationships, and interdependencies among parts, levels, and stakeholders.
- Desired changes in one part or level of the system must be accompanied by changes in other parts or levels.

The SPP as a Long-Term Plan for Systems Change
- EI/ECSE are systems of complex, interrelated components with the goal of achieving outcomes for children and families.
- Changes to EI/ECSE systems require a combination of interconnected improvement activities that support changes to infrastructure and work together to achieve the desired results.

Developing Improvement Activities
- Effective policies, procedures, and MOUs
- Data systems and monitoring
- TA and professional development
- Fiscal management
- Evaluation

Evaluating SPP/APR Improvement Activities

The Need for Evaluation
- Provides an organized way of assessing work in progress and results obtained.
- Assesses the impact of an activity on the area targeted for improvement (i.e., specific indicators).
- Identifies strengths and weaknesses in the implementation of the improvement activity.

Keep in mind that developing a comprehensive plan for improvement is important, but it is equally important to assess the effectiveness of implementing that plan and the impact of the improvement efforts through your improvement activities. The purposes for conducting evaluation are numerous and can vary depending on the needs of the agency. Thinking critically about evaluation is the key to ensuring that the agency will obtain information that accurately reflects the progress and impact of the improvement activity. Evaluation can help the agency team judge whether the identified improvement activity has been successful in strengthening the designated area for improvement.

Types of Evaluation
There are two types of evaluation that can and should be used to evaluate improvement activities:
- Process evaluation: evaluates the improvement process itself.
- Impact evaluation: evaluates the results produced by the process.

Process Evaluation Questions
Questions that can be answered through a process evaluation include:
- To what extent is the improvement activity being implemented as intended?
- To what extent is the improvement activity reaching the target audience (i.e., children, staff, parents)?
- Is everyone doing what they said they would do?
- Are resources still available to adequately support this improvement activity?

Impact Evaluation Questions
Questions to consider when conducting an impact evaluation of an improvement activity:
- Did the improvement activity accomplish what it was supposed to?
- Which parts of the improvement activity worked well?
- Which parts of the improvement activity did not work well?
- Should the agency continue the improvement activity?
- What has changed as a result of implementing the improvement activity?

Reviewing Improvement Activities
Before thinking about the entire evaluation plan, you may want to assess what you already know about your improvement activities:
- Improvement activities are aligned to the indicators, including whether they are reflected across related indicators.
- Improvement activities reflect state priorities.
- Improvement activities are actionable.
- Improvement activities are realistic.

Reviewing Improvement Activities (continued)
- Improvement activities include measures of performance.
- Improvement activities include timelines.
- Improvement activities identify responsibility for implementation.
- Improvement activities include technical assistance needs.

Categorizing Improvement Activities
- Training and professional development
- Improve data collection
- Improve systems administration and monitoring
- Improve collaboration and coordination
- Program development
- Clarify/examine/develop policies and procedures
- Provide technical assistance
- Increase/adjust FTE
- Evaluation

Another way of thinking about how to assess the effectiveness of your improvement activities is to categorize them prior to evaluating them; this may help in looking at your improvement activities overall. In most instances, improvement activities fall into the categories above. Because it may be too daunting a task to evaluate every single improvement activity included in your SPP/APR, you may want to focus your evaluation on one or two of the categories that seem most prevalent. For example, you might start by looking system-wide at training and professional development and at program development. Try to determine which category contains the most improvement activities and begin with that.
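Finding the most prevalent category is a simple tally. Below is a minimal Python sketch, assuming your improvement activities are kept as a list of (activity, category) pairs; the activity names are hypothetical.

    from collections import Counter

    # Hypothetical improvement activities, each tagged with one of the categories above
    activities = [
        ("Statewide COS training for providers", "Training and Professional Development"),
        ("Automated checks on entry/exit records", "Improve Data Collection"),
        ("Quarterly regional data reviews", "Improve Data Collection"),
        ("Update interagency agreement language", "Clarify/Examine/Develop Policies & Procedures"),
        ("On-site TA visits to low-performing programs", "Provide Technical Assistance"),
    ]

    # Tally activities per category; start the evaluation with the largest group
    counts = Counter(category for _, category in activities)
    for category, n in counts.most_common():
        print(f"{n:2d}  {category}")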

Developing a Plan for Evaluation
There are several steps that need to be completed in order to develop an evaluation plan for improvement efforts:
1. Identify the goal of the evaluation.
2. Frame the evaluation questions to be answered.
3. Identify evaluation methods and measurement options.
4. Identify data sources.
5. Determine data analysis techniques.
6. Establish timelines.
I will discuss each step in more depth on the following slides.
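As an illustration only, the six steps could be captured in a simple plan record like the Python sketch below; every field value is hypothetical.

    # Hypothetical evaluation plan covering the six steps above
    evaluation_plan = {
        "goal": "Determine whether COS training improved data quality",
        "questions": [
            "Did agreement between raters on COS ratings improve?",
            "Did the share of incomplete child records decline?",
        ],
        "methods": ["record review", "provider survey"],
        "data_sources": ["state child outcomes database", "training rosters"],
        "analysis": ["pre/post comparison of completeness and agreement rates"],
        "timelines": {"baseline": "2010-09", "follow_up": "2011-03"},
    }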

What’s Next?
- Moving beyond data quality
- Understanding and manipulating the data you have
- Using data to make program improvements

Looking at Data
To better understand issues and areas of concern and to focus improvement activities:
- What does the data analysis say?
- What are the ‘root causes’ of issues or challenges?

Using Data for Improvement
- Evidence
- Inference
- Action

Evidence
- Evidence refers to the numbers, such as “89% of families reported ...”
- The numbers are not debatable.
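A minimal sketch of where such a number comes from, assuming a family-survey extract with one row per respondent; the column name and data are hypothetical.

    import pandas as pd

    # Hypothetical survey extract: 1 = family reported the outcome, 0 = did not
    survey = pd.DataFrame({"reported_outcome": [1, 1, 0, 1, 1, 1, 0, 1, 1]})

    pct = survey["reported_outcome"].mean() * 100
    print(f"{pct:.0f}% of families reported the outcome")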

Inference
- How do you interpret the numbers?
- What can you conclude from the numbers?
- Does the evidence mean good news? Bad news? News we can’t interpret?
- To reach an inference, we sometimes analyze the data in other ways (ask for more evidence).

Action
- Given the inference from the numbers, what should be done?
- Recommendations or action steps
- Action can be debatable, and often is.
- Another role for stakeholders

Program Improvement
- At the state level: TA, policy
- At the regional or local level: supervision, guidance
- At the service/classroom level: implementing high-quality, individualized, family-centered services

There are different program improvement levers at different levels. We are going to focus primarily on state-level use of the information; some state applications translate directly to smaller units. How interventionists or teachers use outcome data for program improvement is a completely different topic; it is very important, but we are not going to cover it here.

Key Points
- Evidence refers to the numbers, and the numbers by themselves are meaningless.
- Inference is attached by those who read (interpret) the numbers.
- You have the opportunity and the obligation to attach meaning.
- You cannot prevent the misuse of data, but you can set up conditions to make it less likely.

Continuous Program Improvement
A continuous cycle:
- Plan (vision): program characteristics, child and family outcomes
- Implement
- Check: collect and analyze data
- Reflect: Are we where we want to be?

Tweaking the System
The same cycle, with questions asked at each step:
- Plan (vision): What should be done?
- Implement: Is it being done?
- Check (collect and analyze data): Is it working?
- Reflect: Are we where we want to be? Is there a problem? Why is it happening?

Outcome Questions for Program Improvement
For example, do outcomes vary by:
- Region of the state?
- Level of functioning at entry?
- Services received?
- Age at entry to service?
- Type of services received?
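A minimal pandas sketch of this kind of subgroup breakdown, assuming a child-level extract; all column names and values are hypothetical, with outcome_score standing in for whatever summary rating your system uses.

    import pandas as pd

    # Hypothetical child-level extract: one row per child exiting the program
    df = pd.DataFrame({
        "region":        ["North", "North", "South", "South", "East", "East"],
        "entry_age_mo":  [4, 18, 9, 30, 12, 6],
        "outcome_score": [6, 5, 7, 4, 6, 5],   # e.g., an exit rating on a 1-7 scale
    })

    # Do outcomes vary by region of the state?
    print(df.groupby("region")["outcome_score"].agg(["mean", "count"]))

    # Do outcomes vary by age at entry to service?
    df["entry_band"] = pd.cut(df["entry_age_mo"], bins=[0, 12, 24, 36],
                              labels=["0-12 mo", "13-24 mo", "25-36 mo"])
    print(df.groupby("entry_band", observed=True)["outcome_score"].agg(["mean", "count"]))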

Looking at Family Outcomes by Subgroups
- System characteristics
- Family characteristics
- Child characteristics
- Service characteristics

Are there differences in outcomes across family characteristics?
- Race/ethnicity
- Family income
- Primary language
- Family structure
- Etc.

Are there differences in outcomes across child characteristics?
- Race/ethnicity
- Type of disability
- Length of time in services
- Etc.
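The same breakdown works for any of the family or child characteristics above. A minimal sketch, again with hypothetical column names and data:

    import pandas as pd

    # Hypothetical exit records with one family and one child characteristic
    df = pd.DataFrame({
        "primary_language": ["English", "Spanish", "English", "Spanish", "English"],
        "disability_type":  ["Speech", "Motor", "Speech", "Speech", "Motor"],
        "met_expectations": [1, 0, 1, 1, 0],   # 1 = exited at age expectations
    })

    # Percent meeting age expectations within each subgroup
    for col in ["primary_language", "disability_type"]:
        print((df.groupby(col)["met_expectations"].mean() * 100).round(1))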

Examples of Process Questions
- Are ALL services high quality?
- Are ALL children and families receiving ALL the services they should in a timely manner?
- Are ALL families being supported in being involved in their child’s program?
- What are the barriers to high-quality services?

Working Assumptions
- There are some high-quality services and programs being provided across the state.
- There are some families who are not getting the highest quality services.
- If we can find ways to improve those services/programs, these families will experience better outcomes.

Action
Given the inference from the numbers, what should be done? Develop improvement activities that are:
- Targeted, based on the data analysis
- Based on evidence-based practices
- Interconnected, working together to accomplish the desired result

Small Group Scenarios

Small Group Scenarios
- Each table is assigned a scenario. Choose a table/scenario.
- As a group, walk through the scenario.
- Discuss and answer the questions.
- Jot down your ideas and be prepared to share back some highlights.