Data, Now What? Skills for Analyzing and Interpreting Data


Data, Now What? Skills for Analyzing and Interpreting Data
Abby Winer, Christina Kasprzak, Kathleen Hebbeler
Division for Early Childhood Annual Conference, October 2014

Desired Outcomes
- Opportunity to practice forming good data analysis questions
- Opportunity to examine and discuss different ways of analyzing aggregate data for program improvement
- Opportunity to discuss and interpret data to drive program improvement
  - Program characteristics
  - Child characteristics

Child Outcomes
States are required to report on the percent of infants and toddlers with Individualized Family Service Plans (IFSPs) or preschool children with Individualized Education Programs (IEPs) who demonstrate improved:
- Positive social-emotional skills (including social relationships);
- Acquisition and use of knowledge and skills (including early language/communication [and early literacy]); and
- Use of appropriate behaviors to meet their needs.

Progress Categories
Percentage of children who:
(a) did not improve functioning;
(b) improved functioning, but not sufficiently to move nearer to functioning comparable to same-aged peers;
(c) improved functioning to a level nearer to same-aged peers, but did not reach it;
(d) improved functioning to reach a level comparable to same-aged peers; or
(e) maintained functioning at a level comparable to same-aged peers.

Developmental science has provided information about the skills children master at different ages. Knowing what is expected at each age allows us to identify children who are developing too slowly; children who are substantially behind their peers are described as having a developmental delay. The solid line on the graph in this slide (line e) illustrates typical development, and all the other lines represent some kind of delay in the early years. For example, if Angela is 12 months old with the skills of a 6-month-old, then without intervention she is likely to continue growing at the same rate and have the skills of a 9-month-old at 18 months of age. We provide intervention services because Angela is acquiring skills at about half the expected rate and will continue to fall farther behind; this pattern of growth is illustrated by line b on the graph. The purpose of intervening is to improve the child's rate of skill acquisition so that, with intervention, a child's line moves closer to the typical development line, if not onto it. Lines c and d illustrate children whose growth was greater than expected because their growth rate with intervention was greater than their growth rate before intervention. The percentages of children showing greater-than-expected growth and exiting within age expectations are computed from the five progress-category percentages.

Summary Statements
The progress categories are used to compute two summary statements that states report annually to OSEP for each of the three child outcomes, summarizing the results for children who exited EI or ECSE services in that fiscal year:
- Summary Statement 1: Of those children who entered the program below age expectations in each outcome, the percent who substantially increased their rate of growth by the time they exited the program: (c+d)/(a+b+c+d).
- Summary Statement 2: The percent of children who were functioning within age expectations in each outcome by the time they exited the program: (d+e)/(a+b+c+d+e).
A worked example of this arithmetic appears below.
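The following minimal Python sketch shows how the two summary statements fall out of the five progress categories. The category counts are hypothetical, invented purely for illustration:

```python
# Minimal sketch: computing the two OSEP summary statements from
# progress-category counts a-e. The counts below are hypothetical.
counts = {"a": 12, "b": 45, "c": 80, "d": 95, "e": 68}
a, b, c, d, e = (counts[k] for k in "abcde")

# Summary Statement 1: of children who entered below age expectations
# (categories a-d), the percent who substantially increased their
# rate of growth (categories c and d).
ss1 = 100 * (c + d) / (a + b + c + d)

# Summary Statement 2: the percent of all children exiting within
# age expectations (categories d and e).
ss2 = 100 * (d + e) / (a + b + c + d + e)

print(f"Summary Statement 1: {ss1:.1f}%")  # 75.4%
print(f"Summary Statement 2: {ss2:.1f}%")  # 54.3%
```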

Value of Child Outcomes Data
Although the federal government is the driving force behind the movement to collect child outcomes data, it is not the only reason to collect and use them. Data on outcomes are important for state and local purposes as well, including:
- To examine program effectiveness
- To improve programs by identifying strengths and weaknesses and deciding where to allocate support resources such as TA and training
- Ultimately, to better serve children and families and fulfill the vision of early intervention programs

Evidence → Inference → Action

Evidence
Evidence refers to the numbers, such as "45% of children in category b." The numbers themselves are not debatable.

Inference
How do you interpret the numbers? What can you conclude from them? Does the evidence mean good news? Bad news? News we can't interpret? To reach an inference, we sometimes analyze the data in other ways (i.e., ask for more evidence).

Inference
Inference is debatable; even reasonable people can reach different conclusions. Stakeholders can help put meaning on the numbers. Early on, the inference may be more a question of the quality of the data.

Action
Given the inference from the numbers, what should be done? Recommendations or action steps follow. Action can be debatable, and often is; this is another role for stakeholders. Again, early on the action might have to do with improving the quality of the data.

Topic 1: Forming Good Questions

Starting with a question (or two)
- All analyses are driven by questions.
- Questions come from different sources.
- Different versions of the same question are necessary and appropriate for different audiences.
- What are your crucial policy and programmatic questions?

Defining Data Analysis Questions
What are your crucial policy and programmatic questions? Example:
1. Does our program/district serve some children more effectively than others? Do children with different racial/ethnic backgrounds have similar outcomes?

Question sources
- Internal: state administrators, staff
- External: the governor, the legislature, advocates, families of children with disabilities, the general public, OSEP
External sources may not have a clear sense of what they want to know.

Sample basic questions
- Who is being served?
- What services are provided?
- How much service is provided?
- Which professionals provide services?
- What is the quality of the services provided?
- What outcomes do children achieve?

Sample questions that cut across components
- How do outcomes relate to services?
- Who receives which services?
- Who receives the most services?
- Which services are high quality?
- Which children receive high-cost services?

Making comparisons
- How do outcomes for 2013 compare to outcomes for 2014?
- In which districts are children experiencing the best outcomes?
- Which children have the best outcomes?
- How do children who receive speech therapy compare to those who do not?

Making comparisons
Common grouping variables include:
- Disability groups
- Region/school district
- Program type
- Family income
- Age
- Length of time in program
Comparing Group 1 to Group 2 to Group 3, etc.

Question precision
A research question is completely precise when the data elements and the analyses have been specified. Compare two versions of the same question:
Question 1: Are programs serving young children with disabilities effective?

Question 2: Of the children who exited the program between July 1, 2012 and June 30, 2013, had been in the program at least 6 months, and were not typically developing in Outcome 1, what percentage gained at least one score point between entry and exit on Outcome 1?
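Once a question is this precise, it maps almost clause-for-clause onto an analysis. The sketch below, in Python with pandas, assumes a hypothetical child-level table whose column names and values are invented for illustration; each clause of Question 2 becomes one filter:

```python
import pandas as pd

# Hypothetical child-level records; column names are invented for illustration.
df = pd.DataFrame({
    "entry_date": pd.to_datetime(["2011-10-01", "2012-01-15", "2012-09-01"]),
    "exit_date":  pd.to_datetime(["2012-08-15", "2013-03-01", "2013-02-01"]),
    "typical_at_entry_o1": [False, False, True],
    "entry_score_o1": [3, 5, 7],
    "exit_score_o1":  [5, 5, 7],
})

# Each clause of Question 2 becomes one filter.
exited_in_window = df["exit_date"].between("2012-07-01", "2013-06-30")
at_least_6_months = (df["exit_date"] - df["entry_date"]).dt.days >= 182  # approx. 6 months
below_age_expectations = ~df["typical_at_entry_o1"]

cohort = df[exited_in_window & at_least_6_months & below_age_expectations]

# Percentage of the cohort gaining at least one score point on Outcome 1.
gained = (cohort["exit_score_o1"] - cohort["entry_score_o1"]) >= 1
print(f"{100 * gained.mean():.1f}% gained at least one point")  # 50.0%
```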

Finding the right level of precision
- Who is the audience?
- What is the purpose?
Different levels of precision suit different purposes, but they can be versions of the same question.

Activity 1, Parts I & II: Starting with a Question
Part I: Forming Good Data Analysis Questions
Part II: Generating Questions

Topic 2: Looking at Data

Different “Levels” of Looking at Data
- Individual data: individual child, individual classroom, individual program/district
- Aggregate data: combining data across individual children, classrooms, or districts; summary statistics/values

Why do we need to look at aggregate data?
- The volume of individual data and information available makes it hard to draw conclusions directly.
- Aggregating helps us make comparisons: what characteristics of children or programs are actually linked with better outcomes?
- How do we group information about children or programs in order to make comparisons? (See the sketch below.)
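As a small illustration of moving from individual to aggregate data, the following pandas sketch (districts, column names, and values are all hypothetical) rolls child-level exit records up into one summary row per district, which can then be compared:

```python
import pandas as pd

# Hypothetical child-level records; values are invented for illustration.
children = pd.DataFrame({
    "district": ["District 1", "District 1", "District 2", "District 2", "District 2"],
    "progress_category": ["b", "d", "c", "d", "e"],
    "exit_within_age_expectations": [False, True, False, True, True],
})

# Aggregate: one summary row per district instead of one row per child.
by_district = children.groupby("district").agg(
    n_children=("progress_category", "size"),
    pct_exit_within_age=("exit_within_age_expectations", "mean"),
)
by_district["pct_exit_within_age"] *= 100

# Aggregate values like these can then be compared across districts,
# program types, or any other grouping variable.
print(by_district)
```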

Individual Child Outcomes Data for District 2

Activity 2: Looking at Data

Comparing District Characteristics

Which Districts Have Better Outcomes?
What do we mean by “which”? Subgroups! What subgroups should we consider? What factors differ across districts? How are districts different from one another?

Linking Different Pieces of Information
- What information do you have available about district characteristics?
- Is it already captured in a data system or report?
- Is it collected systematically?
- What about qualitative information?

Activity 3: Comparing District Characteristics

Comparing Child/Family Characteristics

Planning for Follow-up Analyses
Analysis planning includes:
- Asking a question: what else do you want to know?
- Generating hypotheses
- Identifying data sources, including comparisons (what groups to compare, how to put groups together)

Which Children Have Better Outcomes than Others?
What do we mean by “which”? Subgroups! What subgroups should we consider? What factors differ for children within or across classrooms/districts/regions? How are children/families different from one another?

Linking Different Pieces of Information
- What information do you have available about child/family characteristics?
- Is it already captured in a data system or report?
- Is it collected systematically?
- What about qualitative information?

Activity 4: Comparing Child/Family Characteristics

Sharing Your Results
Communicate your analysis in a way that is appropriate for your audience:
- Who are you communicating with?
- What is the key information they need to know?
- When do they need the information?
- What other types of information do they need to help them understand the data?
Think about the different ways you want to visualize and present the data; one simple option is sketched below.
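For a general audience, a labeled bar chart is often clearer than a table of percentages. A minimal matplotlib sketch, using hypothetical district values invented for illustration, might look like this:

```python
import matplotlib.pyplot as plt

# Hypothetical Summary Statement 2 values by district (illustration only).
districts = ["District 1", "District 2", "District 3"]
pct_within_age = [54.3, 61.0, 48.5]

fig, ax = plt.subplots(figsize=(6, 4))
ax.bar(districts, pct_within_age)
ax.set_ylabel("% exiting within age expectations")
ax.set_ylim(0, 100)
ax.set_title("Outcome 1, Summary Statement 2, by district")

# Label each bar so the audience does not have to read values off the axis.
for i, pct in enumerate(pct_within_age):
    ax.text(i, pct + 2, f"{pct:.0f}%", ha="center")

plt.tight_layout()
plt.show()
```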

Thank You!