Early Childhood Outcomes Center Data Workshop
Getting the Tools You Need for Data-Informed Program Improvement
Presented at the OSEP National Early Childhood Conference, Arlington, VA, December 2009
Cornelia Taylor and Kathy Hebbeler, SRI International

Our time together
- Give you a sense of how to approach your data to be able to use it for program improvement
  - Multiple-step process
  - Each step involves skills
  - Not enough time to build skills at every step
- Give you a sense of the steps, a taste of what is involved, and what else you need to learn

Finding the Killer Questions

Examining and Tweaking Your Service System
Plan (vision): program characteristics, child and family outcomes
Implement
Check: collect and analyze data
Reflect: Are we where we want to be? Is there a problem? Why is it happening? What should be done? Is it being done? Is it working?

Starting with a question (or two...)
- All analyses are driven by questions
- There are several ways to word the same question
- Some ways are more "precise" than others
- Questions come from different sources
- Different versions of the same question are necessary and appropriate for different audiences

Question sources
- Internal: state administrators, staff
- External:
  - The governor, the legislature
  - Advocates
  - Families of children with disabilities
  - General public
  - OSEP
External sources may not have a clear sense of what they want to know.

A possible question
Are programs serving young children with disabilities effective?

Areas for Program Improvement
WHO, SERVICES, COST, QUALITY, OUTCOMES

Sample basic questions
- Who is being served?
- What services are provided?
- How much service is provided?
- Which professionals provide services?
- What is the quality of the services provided?
- What outcomes do children achieve?

Sample questions that cut across components
- How do outcomes relate to services?
- Who receives which services?
- Who receives the most services?
- Which services are high quality?
- Which children receive high-cost services?

Making comparisons
- How do outcomes for 2008 compare to outcomes for 2009?
- In which districts are children experiencing the best outcomes?
- Which children have the best outcomes?
- How do children who receive speech therapy compare to those who do not?

Making comparisons
- Disability groups
- Region/school district
- Program type
- Household income
- Age
- Length of time in program
Comparing Group 1 to Group 2 to Group 3, etc.

Question Clarification
- External sources may not have a clear sense of what they want to know; you need to clarify with them or for them
- You have more liberty to pursue alternative questions when the questions are completely internal

Activity 1
Imagine you are the state coordinator for EI or ECSE. A major foundation in your state has announced that it will be giving your department a very large grant to improve services across the state. What are the top 5 questions you want answered to be able to plan this new program improvement effort?

Talking to Your Analyst

Question precision
A research question is completely precise when the data elements and the analyses have been specified.
Question 1: Are programs serving young children with disabilities effective?

Question precision
Question 2: Of the children who exited the program between July 1, 2008 and June 30, 2009, had been in the program at least 6 months, and were not typically developing on outcome 1 at entry, what percentage gained at least one score point between entry and exit on outcome 1?

Finding the right level of precision
- Who is the audience?
- What is the purpose?
Different levels of precision for different purposes, BUT THEY CAN BE VERSIONS OF THE SAME QUESTION.

Elements
Who is to be included in the analysis?
- Exited between July 1, 2008 and June 30, 2009
- In the program at least 6 months (exit date minus entry date)
- Not typically developing at entry (hmm....)
What about them?
- Entry score on outcome 1
- Exit score on outcome 1
Do we need to manipulate the data?
- Gain = exit score minus entry score
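
To show how these elements become an actual analysis, here is a minimal sketch in Python/pandas (the workshop itself demonstrates in MS Excel, so this is an illustration, not the workshop's method). The file and column names are hypothetical, and treating entry ratings of 6 or 7 as typically developing is an assumption about a 7-point COS-style scale.

```python
import pandas as pd

# Hypothetical export of exit records; column names are placeholders.
df = pd.read_csv("exit_records.csv", parse_dates=["entry_date", "exit_date"])

# Who is included: exited July 1, 2008 - June 30, 2009, in the program
# at least 6 months (~183 days), not typically developing at entry.
cohort = df[
    (df["exit_date"] >= "2008-07-01")
    & (df["exit_date"] <= "2009-06-30")
    & ((df["exit_date"] - df["entry_date"]).dt.days >= 183)
    & (df["entry_score_outcome1"] < 6)  # assumption: ratings 6-7 = typical
]

# Manipulated element: gain = exit score minus entry score.
gain = cohort["exit_score_outcome1"] - cohort["entry_score_outcome1"]

# Question 2: what percentage gained at least one score point?
pct = (gain >= 1).mean() * 100
print(f"{pct:.1f}% gained at least one point on outcome 1")
```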

Variables/Data Elements
- ID
- Year of birth
- Date of entry
- Score on Outcome 2 at entry
- Gender

Many options...
How do exit scores compare to entry scores?
- Compare average score at entry and exit
- Compare two frequency distributions of scores
- Compare % who were rated typical
You need to decide what you want, and you may need to be able to communicate it to someone else.
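
Each of the three options is a one- or two-line computation. A sketch under the same hypothetical column names as above, again assuming ratings of 6 or 7 count as typical:

```python
import pandas as pd

# Hypothetical columns: entry_score_outcome1, exit_score_outcome1 (1-7 ratings).
df = pd.read_csv("exit_records.csv")

# Option 1: compare average score at entry and exit.
print(df[["entry_score_outcome1", "exit_score_outcome1"]].mean())

# Option 2: compare the two frequency distributions of scores.
print(df["entry_score_outcome1"].value_counts().sort_index())
print(df["exit_score_outcome1"].value_counts().sort_index())

# Option 3: compare the percentage rated typical at entry vs. at exit.
print((df["entry_score_outcome1"] >= 6).mean() * 100)
print((df["exit_score_outcome1"] >= 6).mean() * 100)
```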

Variables/Data Elements
- What data elements do you need to answer your questions?
- Do you need to compute variables to answer your question?
  - Time in program?
  - Age at entry?
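
Computed variables are usually simple arithmetic on dates or scores already in the data system. A sketch with hypothetical date columns:

```python
import pandas as pd

# Hypothetical columns: birth_date, entry_date, exit_date.
df = pd.read_csv("exit_records.csv",
                 parse_dates=["birth_date", "entry_date", "exit_date"])

# Time in program: exit date minus entry date, in months
# (30.44 = average days per month, an approximation).
df["months_in_program"] = (df["exit_date"] - df["entry_date"]).dt.days / 30.44

# Age at entry, in months.
df["age_at_entry_months"] = (df["entry_date"] - df["birth_date"]).dt.days / 30.44
```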

Activity 2: Do You Have What It Takes? (AKA Specifying the Data)
Look over your questions and select two to answer. List each question and then specify the data elements you need to answer it. You might want to rewrite the question with additional precision to clarify what is needed.
Bonus: Answer questions 1 and 2 when you finish.

Working with Table Shells

Terminology
- Frequency (count, percentage)
  - 16 boys, 62%
  - 10 girls, 38%
- Cross-tabulation (data element by data element)
  - 12 boys with communication delays, 4 other
  - 5 girls with communication delays, 5 other
- Average or mean
  - Average age at entry = 17 months
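
These three terms map onto one-line operations in most analysis tools. For illustration, here they are in pandas, rebuilding the counts on this slide as toy data:

```python
import pandas as pd

# Toy records matching the slide: 16 boys (12 communication delays, 4 other)
# and 10 girls (5 communication delays, 5 other), all entering at 17 months.
df = pd.DataFrame({
    "gender": ["boy"] * 16 + ["girl"] * 10,
    "delay": ["Communication"] * 12 + ["Other"] * 4
             + ["Communication"] * 5 + ["Other"] * 5,
    "age_at_entry": [17] * 26,
})

# Frequency: counts and percentages (16 boys / 62%, 10 girls / 38%).
print(df["gender"].value_counts())
print(df["gender"].value_counts(normalize=True) * 100)

# Cross-tabulation: one data element by another.
print(pd.crosstab(df["gender"], df["delay"]))

# Average or mean.
print(df["age_at_entry"].mean())  # 17.0
```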

Next decisions
Tables and graphs: How do you want your data displayed? What is the display that will answer your question?

Descriptive Analysis: Filling the Shell

Describing characteristics that are measured as categories

ID   Program             Progress category
1    Children’s Corner   b
2    Elite Care          c
3    Ms Mary’s           a
4    New Horizons        d
5    Oglethorpe          d
6    Elite Care          e
7    Ms Mary’s           c
8    New Horizons        e
9    Oglethorpe          e

Table shell: progress categories on OSEP outcome 1, by program

Program             a    b    c    d    e    Row total
Children’s Corner
Elite Care
Ms Mary’s
New Horizons
Oglethorpe
Column total

Analyzing categorical data
Row percentages are computed with the row total as the denominator:

  (children in Elite Care with a score of “b”) / (total children in Elite Care)

What does this tell us?

Analyzing categorical data
Column percentages are computed with the column total as the denominator:

  (children in Elite Care with a score of “b”) / (total children with a score of “b”)

What does this tell us?
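
Row and column percentages are both one normalize option away from the raw cross-tabulation. A sketch using the nine sample records from the earlier slide:

```python
import pandas as pd

# The nine records from the "Describing characteristics" slide.
df = pd.DataFrame({
    "program": ["Children's Corner", "Elite Care", "Ms Mary's", "New Horizons",
                "Oglethorpe", "Elite Care", "Ms Mary's", "New Horizons",
                "Oglethorpe"],
    "category": ["b", "c", "a", "d", "d", "e", "c", "e", "e"],
})

# Raw counts with row and column totals.
print(pd.crosstab(df["program"], df["category"], margins=True))

# Row percentages: denominator is the row (program) total.
print(pd.crosstab(df["program"], df["category"], normalize="index") * 100)

# Column percentages: denominator is the column (category) total.
print(pd.crosstab(df["program"], df["category"], normalize="columns") * 100)
```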

Quiz
Three years ago, Ms Mary implemented a state-of-the-art social skills intervention for all the classrooms in her program. She wants to see if this intervention was effective. As a preliminary analysis, she wants to compare the percent of children in category d for OSEP outcome 1 between her program and other similar programs. Given the data structure described previously, should she use row or column percents?

Column percentages: each program’s share of the children in each progress category (OSEP outcome 1)

Program             a     b     c     d     e     All categories
Children’s Corner   33%   8%    20%   6%    20%   16%
Elite Care          33%   46%   13%   11%   15%   19%
Ms Mary’s           33%   23%   20%   61%   33%   35%
New Horizons        0%    8%    27%   11%   8%    11%
Oglethorpe          0%    15%   20%   11%   25%   19%
Column total        100%  100%  100%  100%  100%  100%

Row percentages: the distribution of progress categories within each program (OSEP outcome 1)

Program             a    b    c    d    e    Row total
Children’s Corner   7%   7%   21%  7%   57%  100%
Elite Care          6%   35%  12%  12%  35%  100%
Ms Mary’s           3%   10%  10%  35%  42%  100%
New Horizons        0%   10%  40%  20%  30%  100%
Oglethorpe          0%   12%  18%  12%  59%  100%
All programs        3%   15%  17%  20%  45%  100%

Final results
Using the row percents, we know that 35% of children in Ms Mary’s program closed the gap on Outcome 1. As a reference, we can compare this to the 20% of children across all programs who closed the gap on Outcome 1. Is this an important difference?

Activity 3: Building a table shell
Take your question and develop a “table shell” that you would give to an analyst or programmer to show how you want the data analyzed and displayed.

Descriptive data analysis in MS Excel: Demonstration
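
For anyone who wants to reproduce the Excel demonstration in code afterward, the pivot-table step looks roughly like this in pandas; the file and column names are placeholders for whatever your state or dummy data uses:

```python
import pandas as pd

# Hypothetical export of child outcome records.
df = pd.read_csv("state_or_dummy_data.csv")

# Pivot-table-style shell: programs down the side, progress categories
# across the top, counts of children in the cells, with totals.
shell = pd.pivot_table(df, index="program", columns="progress_category",
                       values="child_id", aggfunc="count", fill_value=0,
                       margins=True, margins_name="Total")
print(shell)
```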

Activity 4: Descriptive analysis in MS Excel: Filling the table shell
Draw your table shell. Using MS Excel and your state data or the dummy data, do a descriptive analysis and use the results to fill in your table shell.

Wrap Up: Getting on the Road to Data-Informed Program Improvement

Steps to Program Improvement
1. Have a vision for the service system
2. Implement the services

Examining and Tweaking Your Service System
Plan (vision): program characteristics, child and family outcomes
Implement
Check: collect and analyze data
Reflect: Are we where we want to be? Is there a problem? Why is it happening? What should be done? Is it being done? Is it working?

Steps to Program Improvement
3. Formulate questions about the service system
4. Identify the variables needed to answer the questions (including those that need to be constructed)
5. Determine the analysis that will answer the questions (maybe sketch your table shell or graph)
6. Run the analyses

And more steps
7. Figure out what the numbers are telling you
8. Take what you learned and act on the message

...And meanwhile, keep improving the quantity and quality of what is in your data system.

Information Infrastructure: Data Needed for Program Improvement
WHO, SERVICES, COST, QUALITY, OUTCOMES