Data, Now What? Skills for Analyzing and Interpreting Data

Summary Statements

The problem... Progress data included:
– 5 progress categories
– for each of 3 outcomes
– a total of 15 numbers reported each year
Too many interrelated targets to make sense of. OSEP asked for a recommendation.

Thinking through the summary statements
– ECO presented options to states and ECO work groups via conference calls
– Two sessions at the December 2008 EC Conference
– Posted on the ECO web site for comments
– ECO made a recommendation to OSEP

Final Deliberation
– OSEP put the summary statements out for public comment
– Comments came in that were thoughtful, but not necessarily consistent with one another
– There were advantages and disadvantages to all options

A paper documenting the process, Setting Targets for Child Outcomes, is available on the ECO website.

The Summary Statements
– Of those children who entered the program below age expectations in each Outcome, the percent who substantially increased their rate of growth by the time they exited the program.
– The percent of children who were functioning within age expectations in each Outcome by the time they exited the program.

Example of State Progress Data for Positive social-emotional skills (including social relationships):
a. Infants and toddlers who did not improve functioning: 40 children (4%)
b. Infants and toddlers who improved functioning but not sufficient to move nearer to functioning comparable to same-aged peers: 150 children (15%)
c. Infants and toddlers who improved functioning to a level nearer to same-aged peers but did not reach it: 270 children (27%)
d. Infants and toddlers who improved functioning to reach a level comparable to same-aged peers: 300 children (30%)
e. Infants and toddlers who maintained functioning at a level comparable to same-aged peers: 240 children (24%)
Total: N = 1,000 (100%)

Summary Statement Data
Required Summary Statement 1: Of those children who entered the program below age expectations in each Outcome, the percent who substantially increased their rate of growth by the time they exited the program = 75%
Required Summary Statement 2: The percent of children who were functioning within age expectations in each Outcome by the time they exited the program = 54%

Where do the #s come from? Measurement for Summary Statement 1:
Percent = [(# of infants and toddlers reported in progress category (c)) + (# reported in category (d))] divided by [(# reported in category (a)) + (# reported in category (b)) + (# reported in category (c)) + (# reported in category (d))], times 100.
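To make the arithmetic concrete, here is a minimal Python sketch of Summary Statement 1, assuming the category counts are available as a simple dictionary; the function and variable names are illustrative, not part of the OSEP measurement language.

```python
def summary_statement_1(counts):
    """Of the children who entered below age expectations (categories a-d),
    the percent who substantially increased their rate of growth by exit
    (categories c and d)."""
    entered_below = counts["a"] + counts["b"] + counts["c"] + counts["d"]
    greater_than_expected = counts["c"] + counts["d"]
    return greater_than_expected / entered_below * 100

# Category counts from the state example above
counts = {"a": 40, "b": 150, "c": 270, "d": 300, "e": 240}
print(round(summary_statement_1(counts)))  # 75
```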

Where do the #s come from?
Prog cat: # (%)
a: 40 (4%)
b: 150 (15%)
c: 270 (27%)
d: 300 (30%)
e: 240 (24%)
760 children (a, b, c, and d), or 76% of the children, entered the program functioning below age expectations. 240 children (e), or 24%, entered and exited functioning at age expectations.

Where do the #s come from? 570 children (c and d) of the 760 who entered below age expectations (a, b, c, and d) changed their growth trajectories (made greater than expected progress): 570 / 760 = 75%.

Where do the #s come from? See the Summary Statements Calculator (April 14, 2009).

Where do the #s come from? Measurement for Summary Statement 2:
Percent = [(# of infants and toddlers reported in progress category (d)) + (# reported in category (e))] divided by the total # of infants and toddlers reported in progress categories (a) + (b) + (c) + (d) + (e), times 100.
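A matching sketch for Summary Statement 2, under the same assumptions; it reproduces the 54% from the example data.

```python
def summary_statement_2(counts):
    """Percent of all children (categories a-e) functioning within age
    expectations by the time they exited (categories d and e)."""
    total = counts["a"] + counts["b"] + counts["c"] + counts["d"] + counts["e"]
    at_age_expectations_at_exit = counts["d"] + counts["e"]
    return at_age_expectations_at_exit / total * 100

counts = {"a": 40, "b": 150, "c": 270, "d": 300, "e": 240}
print(round(summary_statement_2(counts)))  # 54
```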

Where do the #s come from? 540 children (d and e) out of the 1,000 total: 540 / 1,000 = 54%. In other words, 30% of the children reached age expectations by exit, and 24% of the children entered and exited at age expectations.

So remind me again what this means... What can we say about the children's progress?

What can we say? Part C Outcome 1: successful social relationships with peers and adults, following rules for social interactions.
– 96% of children participating in Part C made progress in their social relationships while they were enrolled.
– The 4% of children who did not make progress included children with the most severe disabilities and/or degenerative conditions. Can you describe them?

– 24% of the children participating in Part C were functioning at age expectations at entry and at exit in this outcome area. Can you describe them?
– 54% of the children were functioning at age expectations in this outcome area when they exited the program (Summary Statement 2):
  – 30% started out behind and caught up
  – 24% entered and exited at age expectations

– 75% of the children who entered the program below age expectations made greater than expected gains, that is, they substantially increased their rates of growth and changed their growth trajectories (Summary Statement 1).

What other data might you want to share? The public likes to see scores going up!
– Increase in mean scores from entry to exit (e.g., 4.3 to 5.6 on the COSF)
– Increase in raw scores
– Increase in scale scores
What else?
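If a state also wants to report score gains, a sketch like the following shows the entry-to-exit change in mean ratings; the child records here are invented for illustration, not actual COSF data.

```python
# Hypothetical (entry rating, exit rating) pairs for five children
records = [(3, 5), (4, 6), (5, 6), (4, 5), (6, 6)]

mean_entry = sum(entry for entry, _ in records) / len(records)
mean_exit = sum(exit_rating for _, exit_rating in records) / len(records)
print(f"Mean rating: {mean_entry:.1f} at entry, {mean_exit:.1f} at exit")
```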

Setting Targets

What we'll cover today
– Two strategies for examining data: data quality, and potential for program improvement
– Parameters and guidance for target setting from OSEP

Can you trust the data? Begin by identifying outliers. Example: look at the percentages reported for certain categories across local programs.

Percentages reported in category “a” across 30 local programs

Remove the outliers
– State percentage for “a” with all data = 3.9%
– Revised percentage for “a” with outliers removed = 2.4%
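The slides do not say how an outlier should be defined, so the sketch below uses a simple interquartile-range rule as one illustrative choice and compares the category “a” percentage with and without the flagged programs; the program-level percentages are invented, and a real analysis would weight each program by the number of children it serves rather than taking an unweighted mean.

```python
import statistics

# Hypothetical percent of children reported in category "a", by local program
pct_a = [2, 3, 1, 4, 2, 0, 3, 2, 1, 25, 2, 3, 1, 2, 18, 3, 2, 1, 2, 3]

q1, _, q3 = statistics.quantiles(pct_a, n=4)
upper_fence = q3 + 1.5 * (q3 - q1)            # flag unusually high values
clean = [p for p in pct_a if p <= upper_fence]

print(f"All programs:     {statistics.mean(pct_a):.1f}%")
print(f"Outliers removed: {statistics.mean(clean):.1f}%")
```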

Percentages reported in category “e” across 30 local programs

Remove the outliers
– State percentage for “e” with all data = 32.1%
– Revised percentage for “e” with outliers removed = 27.7%

Example of data with outliers removed
Progress category: Original % / Clean %
a: 4 / 2
b: 15 / 17
c: 27 / 30
d: 30 / 31
e: 24 / 20
Summary Statement 1: 75 / 76
Summary Statement 2: 54 / 51
Clean data (without the outliers) may give a more accurate picture of where you are starting.
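Because the summary statements are ratios, the same arithmetic can be run directly on the original and clean percentages in the table above to see how much outlier removal shifts the statements; this is a sketch of that comparison, not an official calculation.

```python
original = {"a": 4, "b": 15, "c": 27, "d": 30, "e": 24}
clean = {"a": 2, "b": 17, "c": 30, "d": 31, "e": 20}

for label, pct in (("original", original), ("clean", clean)):
    ss1 = (pct["c"] + pct["d"]) / (pct["a"] + pct["b"] + pct["c"] + pct["d"]) * 100
    ss2 = pct["d"] + pct["e"]  # the five percentages sum to 100
    print(f"{label}: Summary Statement 1 = {ss1:.0f}%, Summary Statement 2 = {ss2}%")
```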

Suggested strategy: Analyze your data with your local LEA/program outliers included and excluded so you can gauge the impact they are having on your state-level data.

Note: Consider the clean data when deciding on reasonable targets, BUT turn in the original data to OSEP in the SPP report! You can discuss the clean data in the rationale for your targets.

Which local programs can be targeted for program improvement? Compare the summary statement data by local program to identify which programs have the most potential for improvement.

Summary Statement Percentages by Local Program

Considerations: What do you know about the programs/LEAs with the least and the most progress in the summary statements? That is, the programs with:
– the lowest and highest percentages of children at age expectations at exit
– the lowest and highest percentages of children making greater than expected gains

Examples of Key Questions
– Are the children similar at entry?
– Are the higher-performing programs/LEAs participating in special projects (e.g., a state initiative, TACSEI, or CELL)?
– Are there systems issues in lower-performing programs/LEAs that would explain differences in outcomes (e.g., personnel shortages)?

Bottom-line Question: Could either system- or practice-focused improvement activities targeted toward the lowest-performing programs/LEAs improve the child outcomes?

The Math of Target Setting
– How much would the data change if the lowest local programs moved toward the mean?
– Improvements in the lowest programs will result in improvement in your statewide data.
– Experiment with your data to determine what targets are reasonable in your state.
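As one way to run that experiment, the sketch below takes hypothetical Summary Statement 1 percentages for ten local programs, pulls every program that is below the statewide mean halfway toward it, and recomputes the statewide figure; the numbers are invented, and a real calculation would weight programs by the number of children served.

```python
import statistics

# Hypothetical Summary Statement 1 percentages, by local program
ss1_by_program = [55, 60, 82, 71, 48, 90, 66, 74, 58, 80]

state_mean = statistics.mean(ss1_by_program)
improved = [p + (state_mean - p) / 2 if p < state_mean else p
            for p in ss1_by_program]

print(f"Current statewide SS1: {state_mean:.1f}%")
print(f"If the lowest programs move halfway to the mean: {statistics.mean(improved):.1f}%")
```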