Child Outcomes Data Collection: Results of the ITCA National Survey and Discussion of Policies and Considerations

Presentation transcript:

Child Outcomes Data Collection: Results of the ITCA National Survey and Discussion of Policies and Considerations
Christina Kasprzak, ECTA/DaSy; Cornelia Taylor, DaSy/ECTA; Maureen Greer, ITCA/DaSy; Robin Nelson, DaSy
2016 Improving Data, Improving Outcomes Conference, New Orleans, LA, August 2016

Intended Outcomes
Increased participant knowledge of the variability in policies and procedures for the collection of child outcomes data across states.
Increased participant understanding of the importance of reviewing policies and procedures and their impact on the completeness and accuracy of the data.

Agenda
Introduction and Background
Analysis and Results
Discussion and Recommendations

How Did We Get Here?
Determinations: ITCA commented on the denominator, stating that the "completeness" of the data should be determined based on a denominator supplied by the state, with accompanying rationale and documentation.
Question posted to the listserv in January 2016 by Haidee, a TA provider asking on behalf of a state: a considerable percentage of children are being removed from the ECO data because the time from the entrance ECO rating to the exit ECO rating is less than 6 months, even though the child was in the program for more than 6 months based on the initial IFSP date and the exit date. What are other states doing in this situation?
Other states responded and were interested in a compilation of the responses.
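To illustrate the issue raised on the listserv, here is a minimal sketch (in Python, with hypothetical dates and field names that are not from the survey) showing how the same child can fall below the 6-month threshold when entry and exit rating dates define time in service, yet exceed it when the initial IFSP date and exit date are used:

```python
from datetime import date

def whole_months(start, end):
    """Approximate number of whole months between two dates."""
    return (end.year - start.year) * 12 + (end.month - start.month) - (end.day < start.day)

# Hypothetical record for one child; field names are illustrative only.
child = {
    "initial_ifsp_date": date(2015, 1, 10),  # services begin
    "entry_rating_date": date(2015, 3, 2),   # entry ECO rating completed later
    "exit_rating_date": date(2015, 8, 20),   # exit ECO rating
    "exit_date": date(2015, 9, 1),           # child exits the program
}

# Rule A: entry rating date to exit rating date (child appears to have < 6 months)
rating_span = whole_months(child["entry_rating_date"], child["exit_rating_date"])

# Rule B: initial IFSP date to exit date (child actually received >= 6 months of service)
service_span = whole_months(child["initial_ifsp_date"], child["exit_date"])

print(f"Rating to rating: {rating_span} months -> included: {rating_span >= 6}")
print(f"IFSP to exit:     {service_span} months -> included: {service_span >= 6}")
```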

Survey on Child Outcomes Data Collection
The ITCA Data Committee, in collaboration with DaSy and ECTA, developed a survey, which was distributed to Part C programs in April 2016.
The survey consisted of 17 questions:
Criteria for start and end dates for "6 months of service"
Other criteria used, including gaps in service
Approach and frequency of ratings
Policies on family participation
Procedures for exit ratings for unanticipated exits
Minimum age for entry ratings
4 with state contact information
Primary approach (i.e., COS, BDI-2, AEPS) and whether they have changed or intend to change their approach
2 additional questions: integration of outcome ratings in the IFSP; ability of the state to determine the number of children who should have ratings

Analysis and Results
48 surveys were received; of those, 2 had no responses other than state and contact name. Therefore, results are based on 46 respondents.
Unless otherwise noted, frequency refers to an unduplicated count of state respondents.

Primary Approach for Outcomes (based on ITCA survey results)
COS: 36 states (78%)
BDI-2: 5 states (11%)
AEPS: 3 states (7%)
Other: 2 states (4%)

2014-2015 data: 43 use the COS (77%); 13 use another approach (23%).

Date Used for Start Date (duplicated count)
Initial IFSP date: 30
Date rating completed: 8
Eligibility determination date: 6
Date of first delivered services: 4
Evaluation date: 2

Date Used for End Date (duplicated count); much more variability
Exit date: 30
Date of exit rating: 15
Third birthday: 9
IFSP end date: 8
Date of exit assessment if lost to contact: 6

Pairings of Start and End Date
Initial IFSP date / exit date or 3rd birthday: 22 (48%)
Date of entry rating / date of exit rating: 8 (17%)
Eligibility determination date / exit date: 6 (13%)
One state indicated the initial IFSP date and date of entry rating for the start, and the exit date and date of exit rating for the end.

Frequency of Ratings/Scores
Entry and exit: 29 (63%)
Entry, annual, and exit: 11 (24%)
Entry and exit, optional at annual: 2 (4%)
Entry, then annual or exit: 1 (2%)
Semi-annually (at 6-month IFSP review)
Quarterly

Require Family Participation in Ratings
Entry only: 4 (9%)
Entry and exit: 27 (59%)
No: 14 (30%)
Piloting: 1 (2%)

Policies and Procedures for Exit Ratings/Scores for Children Who Exit Unexpectedly
No policies or procedures: 2
Yes, we use the last periodic child outcomes rating/score that was completed before the child exited, as long as it meets our state-established timeline for being current: 16
Yes, the team meets and determines the rating/score after the child exits, based on ongoing assessment information: 24
Other: 5
(Note: include other; reconcile; responses total 47.)

Accounting for Gaps in Services to Determine 6 Months in Service (duplicated count)
Of the states that accounted for gaps in services, the highest number was for children who exited and re-entered the program, sometimes multiple times. States that reported one of the other reasons usually reported more than one.

Minimum Age for Entry Rating/Score
No minimum age: 38 (83%)
Minimum age 2 months: 1 (2%)
Minimum age 4 months: 0 (0%)
Minimum age 6 months: 6 (13%)
Other minimum age

Change in Approach (including overall and other changes)
Changed approach in the last 3 years: 15 yes, 31 no
In the process of changing approach: 14 yes, 32 no
A change in approach does not usually represent a major change to the approach itself; rather, it involves changes to the process, e.g., more training, changes in policies/procedures, or increased family participation.

Other Findings
45 Part C early intervention programs reported being able to determine the number of children exiting who were in the program at least six months, and therefore should have child outcomes ratings/scores.
20 Part C early intervention programs reported that they integrate the child outcomes ratings/scores into the IFSP.

History of Indicator 3 Requirements and Expectations
The 8/19/2005 instructions for the Part C State Performance Plan (due December 2, 2005) stated that the first year (2/1/07) would report status upon entry, and following years (starting with 2/1/08) would report progress from entry to exit, or another naturally occurring point near exit (such as an IFSP/IEP review), for children who received Part C services for 6 months or more.
"6 months" is NOT mentioned in the 10/19/06 measurement table or any measurement table thereafter. However, OSEP is working on adding back into the measurement table the language that used to be there about including children who had at least 6 months of service.

Related FAQs
How soon after entry and how close to exit do children need to be assessed? This is a state decision. In order to capture the most progress, states should consider assessing children as close to entry and exit as possible. OSEP recommends that exit data be collected within 90 days of a child's exit from Part C or 619.
For children to be included in this data collection they must have received services for 6 months or more; do children need to be in the same program for 6 months? No, 6 months refers to time in service.
Source: September 2006, Frequently Asked Questions regarding the SPP/APR: Early Childhood Outcomes (Part C Indicator #3 and Part B Indicator #7)

Related FAQs, continued
Does a child's time in service need to be consecutive? 6 months of service is generally 6 months of consecutive service. However, if a child is in a program for 2 months, leaves and takes a month to move with his family, and shows up in another program across the state where he receives services for another 4 months, this would be considered equivalent to 6 months of consecutive service.
How do states report on children who exit unexpectedly (e.g., they move or parents withdraw from the program) and there are no exit child outcome data? States are encouraged to assess the extent to which this is a problem in their state. If a state reports high numbers/percentages of missing child outcome data due to children exiting unexpectedly, OSEP may contact the state for additional information.
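As a rough illustration of the "equivalent to 6 months of consecutive service" guidance, the sketch below (Python; the episode dates and month arithmetic are assumptions, not part of the FAQ) sums a child's separate service episodes while ignoring the gap between them:

```python
from datetime import date

def whole_months(start, end):
    """Approximate number of whole months between two dates."""
    return (end.year - start.year) * 12 + (end.month - start.month) - (end.day < start.day)

# Hypothetical service episodes for a child who moved across the state
# (mirrors the FAQ example: about 2 months, a one-month gap, then about 4 months).
episodes = [
    (date(2015, 1, 5), date(2015, 3, 5)),  # first program, about 2 months
    (date(2015, 4, 6), date(2015, 8, 6)),  # second program, about 4 months
]

# Sum time in service across episodes; the gap between episodes is not counted.
cumulative = sum(whole_months(start, end) for start, end in episodes)

print(f"Cumulative time in service: {cumulative} months")
print("Treated as equivalent to 6 months of consecutive service:", cumulative >= 6)
```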

Discussion and Recommendations What are the impacts of the dates selected to define 6 months in service? Is your first measurement truly a baseline of performance for the child? Are you able to include all of the children who should have made progress in your program?
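One way to explore these questions is to recompute the denominator, i.e., the number of exiting children expected to have outcome ratings, under each candidate date pairing. A minimal sketch under assumed data (hypothetical exit records and field names; a real analysis would pull records from the state data system):

```python
from datetime import date

def whole_months(start, end):
    """Approximate number of whole months between two dates."""
    return (end.year - start.year) * 12 + (end.month - start.month) - (end.day < start.day)

# Hypothetical exit records; field names are illustrative only.
exits = [
    {"initial_ifsp": date(2015, 1, 10), "entry_rating": date(2015, 3, 2),
     "exit_rating": date(2015, 8, 20), "exit": date(2015, 9, 1)},
    {"initial_ifsp": date(2014, 11, 3), "entry_rating": date(2014, 11, 20),
     "exit_rating": date(2015, 7, 1), "exit": date(2015, 7, 15)},
    {"initial_ifsp": date(2015, 4, 1), "entry_rating": date(2015, 4, 15),
     "exit_rating": date(2015, 8, 1), "exit": date(2015, 8, 15)},
]

# Candidate definitions of "time in service": (start field, end field).
pairings = {
    "Initial IFSP date / exit date": ("initial_ifsp", "exit"),
    "Entry rating date / exit rating date": ("entry_rating", "exit_rating"),
}

for label, (start_field, end_field) in pairings.items():
    denominator = sum(whole_months(rec[start_field], rec[end_field]) >= 6 for rec in exits)
    print(f"{label}: {denominator} of {len(exits)} exiting children meet the 6-month criterion")
```

In this illustrative data, fewer children meet the criterion under the rating-date pairing than under the IFSP-to-exit pairing, which is the pattern the listserv question described.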

Discussion and Recommendations What defines a gap in service? Would you expect that these gaps impact performance? How many children are not included in the outcomes data because of gaps in data?

Discussion and Recommendations When using a minimum age at first measurement, consider the impact of gaps between the entry date (e.g., initial IFSP date) and the first assessment date.

Discussion and Recommendations How do you engage families in the outcomes measurement process?

Stay in Touch with DaSy Visit the DaSy website at: http://dasycenter.org/ Like us on Facebook: https://www.facebook.com/dasycenter Follow us on Twitter: @DaSyCenter

Stay in Touch with ECTA Visit the ECTA website at: http://ectacenter.org/ Like us on Facebook: https://www.facebook.com/pages/ECTA-Center/304774389667984 Follow us on Twitter: https://twitter.com/ECTACenter  

The contents of this tool and guidance were developed under grants from the U.S. Department of Education, #H326P120002 and #H373Z120002. However, those contents do not necessarily represent the policy of the U.S. Department of Education, and you should not assume endorsement by the Federal Government. Project Officers: Meredith Miceli, Richelle Davis, and Julia Martin Eile.