
Highs and Lows on the Road to High Quality Data
American Evaluation Association, Anaheim, CA, November 2011
Kathy Hebbeler and Lynne Kahn, ECO at SRI International and ECO at UNC

What we will cover
– Review of the timeline for national reporting
– Share the national data
– Describe how the national data were computed
– Discuss the quality of the national data
– Discuss the meaning of the numbers

Timeline
January 2004 – January 2005: Stakeholder input gathered on the 3 child outcomes
July 2005 (revised September 2006): OSEP releases reporting requirements for state programs
February 2008: States submit first data on the 5 progress categories (children who exited between July 1, 2006 and June 30, 2007)
February 2010: States establish baselines and set targets on the Summary Statements for the first time
February 2011: States submit data on the 5 categories for the 4th time

Request from the U.S. Department of Education
– Analyze the data for possible inclusion as a GPRA indicator
– Also for use in the President's budget justification for Part C and Preschool 619 funding
– Initial request received in 2010 and repeated since

The Dilemma
Variations in quality of state data:
– Some states started earlier
– Some states had devoted more attention to improving quality
What would be the impact of state variation in data quality on the national number?

Our Response
Compute the analyses several ways:
1. Identify the states with the highest quality data and use only their data. Stratify by number of children served and weight the data to produce a national estimate (see the weighting sketch below).
2. Use data from all states. Weight the data to represent the nation. Weighting is necessary because a few states are sampling and many states are not reporting data on all children.
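To make the weighting step concrete, here is a minimal sketch of how a weighted national percentage could be computed from state-level results. The state names, values, and the assumption that each state contributes a percentage plus a count of children served are hypothetical; this is not the ECO Center's actual analysis code.

```python
# Hypothetical state-level results: percent of children in a reporting category
# and the number of children the state serves (used as the weight).
state_results = {
    "State A": {"percent": 62.0, "children_served": 12000},
    "State B": {"percent": 55.0, "children_served": 4500},
    "State C": {"percent": 70.0, "children_served": 20000},
}

def weighted_national_estimate(results):
    """Weight each state's percentage by the number of children it serves,
    so large states count proportionally more in the national figure."""
    total_children = sum(s["children_served"] for s in results.values())
    return sum(
        s["percent"] * s["children_served"] / total_children
        for s in results.values()
    )

print(f"Weighted national estimate: {weighted_national_estimate(state_results):.1f}%")
```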

[Chart] Note: Based on 29 states with the highest quality data

OSEP Reporting Categories
Percentage of children who:
a. Did not improve functioning
b. Improved functioning, but not sufficient to move nearer to functioning comparable to same-aged peers
c. Improved functioning to a level nearer to same-aged peers but did not reach it
d. Improved functioning to reach a level comparable to same-aged peers
e. Maintained functioning at a level comparable to same-aged peers

3 outcomes × 5 "measures" = 15 numbers

[Chart] Note: Based on 29 states with the highest quality data

The Summary Statements
1. Of those children who entered the program below age expectations in each outcome, the percent who substantially increased their rate of growth by the time they turned 3 [6] years of age or exited the program.
2. The percent of children who were functioning within age expectations in each outcome by the time they turned 3 [6] years of age or exited the program.
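A minimal sketch of how the five progress categories for one outcome collapse into the two summary statements. The category counts are hypothetical, and the formulas are an assumption based on the standard OSEP definitions (SS1 = (c + d) / (a + b + c + d), SS2 = (d + e) / (a + b + c + d + e)); the slide itself does not spell them out.

```python
# Collapse the five OSEP progress categories (a-e) for one outcome into the
# two summary statements, assuming the standard OSEP formulas:
#   SS1 = (c + d) / (a + b + c + d)      # of children who entered below age
#                                        # expectations, percent who substantially
#                                        # increased their rate of growth
#   SS2 = (d + e) / (a + b + c + d + e)  # percent exiting within age expectations
def summary_statements(a, b, c, d, e):
    ss1 = 100 * (c + d) / (a + b + c + d)
    ss2 = 100 * (d + e) / (a + b + c + d + e)
    return ss1, ss2

# Hypothetical counts of children in each category for one outcome.
ss1, ss2 = summary_statements(a=50, b=300, c=400, d=600, e=650)
print(f"Summary Statement 1: {ss1:.1f}%")  # ~74.1%
print(f"Summary Statement 2: {ss2:.1f}%")  # 62.5%
```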

[Chart] Note: Based on 33 states with the highest quality data

[Chart] Note: Based on 33 states with the highest quality data

Criteria for States with Quality Data
1. Low percentage of missing data
2. No odd patterns in "a" or "e" categories
3. Did not use questionable data collection methods

Calculating Missing Data for Part C
Proxy for missing data = number of children with data for Indicator C3 / number of exiters in the 618 exiting data
We do not expect this number to be 100%, but we don't expect it to be 10% either.

Part C: Percent of Exiters Included in Outcomes Data
[Table: number of states in each band of percent of exiters included in outcomes data, from <10% to >80%]
*3 states are sampling for Part C; the cutoff was >27%.

Calculating Missing Data for 619
Proxy for missing data = number of children with data for Indicator B7 / 619 child count
We do not expect this number to be 100%, but we don't expect it to be 10% either.
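A minimal sketch of how these coverage proxies could be computed from counts a state already reports. The counts and the helper name are hypothetical illustrations, not the ECO Center's actual procedure.

```python
# For Part C, the denominator is the number of exiters in the Section 618
# exiting data; for 619/ECSE, it is the child count. The ratio is a proxy
# for how complete the state's outcomes data are.
def coverage_proxy(children_with_outcome_data, denominator):
    """Return the percent of the reference population with outcomes data."""
    return 100 * children_with_outcome_data / denominator

part_c_coverage = coverage_proxy(children_with_outcome_data=1800, denominator=2400)
ecse_coverage = coverage_proxy(children_with_outcome_data=900, denominator=5200)

print(f"Part C: {part_c_coverage:.0f}% of exiters have C3 data")        # 75%
print(f"619/ECSE: {ecse_coverage:.0f}% of the child count has B7 data")  # ~17%
```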

Percent of Child Count Included in Outcomes Data for ECSE
[Table: number of states in each band of percent of the child count included in outcomes data, from <10% to >50%]
*4 states are sampling for 619; the cutoff was >11%.

Odd Patterns in a or e
a = % of children who showed no new skills
– Expect this to be very small.
e = % of children who maintained functioning comparable to age expectations
– Don't expect this to be large.
Quality defined as <10% in category a and <65% in category e.
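A minimal sketch of how this "odd pattern" screen could be applied to a state's reported category percentages. The thresholds come from the slide above; the function and example values are hypothetical.

```python
# Flag a state's data when category "a" (no new skills) or category "e"
# (maintained age-expected functioning) is implausibly large.
# Thresholds follow the slide: a < 10% and e < 65%.
def passes_pattern_check(percent_a, percent_e, max_a=10.0, max_e=65.0):
    return percent_a < max_a and percent_e < max_e

print(passes_pattern_check(percent_a=3.2, percent_e=48.0))   # True: looks reasonable
print(passes_pattern_check(percent_a=22.0, percent_e=40.0))  # False: "a" is suspiciously high
```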


Can we trust these data?

Pattern checking for validity
Checking across years
– How do the data compare to data from earlier reporting years?
Checking across methods
– How do the data for all states compare to the data for states with the highest quality data?

Part C, Outcome A: Social Relationships

Part C, Outcome B: Knowledge and Skills

Part C, Outcome C: Meets Needs

Part B Preschool: Social Relationships

Part B Preschool: Knowledge and Skills

Part B Preschool: Meets Needs

Possible interpretation of the data
– Nationally, a high proportion of children who receive Part C and ECSE services are showing greater than expected progress.
– Nationally, many (over half) are exiting the program functioning like same-age peers in at least one of the outcomes.

Would you agree?

Should each state's data look like the national data?
Probably not. It is more important that each state continue to focus on the quality of its own data:
– Getting outcomes data on all children who exit
– Working with programs whose data look unusual to address possible data quality issues

Additional information
For information on state activities related to improving data quality and using data for program improvement, see the Early Childhood Outcomes Center.