Strategies for Increasing Data Quality

Strategies for Increasing Data Quality
Kate Rogers, Early Learning Team Manager
Katie McCarthy, VT 619 Coordinator

Vermont Early Childhood Systems
- State policy and grant initiatives: Universal PreK, ECSE services, ECO, state-level interagency work
- 77% of children with disabilities served in inclusive EC settings
- RTT-ELC
- Early Childhood Outcomes (COS) process embedded into the IEP (2013)

Where We Are Coming From: State Context
- Universal PreK: full implementation fall 2016, for all 3-, 4-, and 5-year-olds not enrolled in kindergarten; prequalified public and private EC programs (including Head Start)
- 77% of children with disabilities served in inclusive EC settings
- QRIS (STARS)
- State PreK Inclusion Coordinator
- RTT-ELC: Early MTSS, PreK assessment, QRIS, Vermont Early Learning Standards (birth to 8), PreK Monitoring System, SLDS, PreK-Grade 3
- Early Childhood Outcomes (COS) process embedded into the IEP (2013)

Why did we prioritize child outcomes?
- 2013: Embedded ECO into the IEP process
- 2013-14: Compiled anecdotal comments and questions about ECO from the field
- 2013-14: Clear that teachers and staff did not see the purpose, use, and value
- 2014-15: Consequently, we received data of poor quality; for example, red flags popped up for impossible combinations in the COS calculator (see the sketch below)
- 2014-15: Universal PreK preparation for full implementation for each and every child
- 2015: Joined ECTA/DaSy cohort for TA

Speaker notes: We heard from providers that they were not invested in collecting the data; they did not see the value in the process or the data. We frequently saw impossible combinations when the COS data were entered into the COS calculator. We were also expanding the teams serving children with disabilities, so we needed to be able to communicate responsibilities to PreK providers.
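The "impossible combinations" the calculator flags come from the relationship between the 1-7 entry/exit COS ratings and the yes/no "any new skills" question. A minimal sketch of that kind of validation check follows; the rule set here is an illustrative assumption, not the calculator's official logic:

```python
# Illustrative check for impossible COS record combinations. The rules
# below paraphrase common COS data-quality checks and are assumptions,
# not the calculator's official logic.

def flag_impossible(entry_rating, exit_rating, new_skills):
    """Return red flags for one child's COS record on a single outcome.

    entry_rating, exit_rating: COS ratings on the 1-7 scale.
    new_skills: answer to "Has the child shown any new skills or
    behaviors related to this outcome since the entry rating?"
    """
    flags = []
    for name, rating in (("entry", entry_rating), ("exit", exit_rating)):
        if not 1 <= rating <= 7:
            flags.append(f"{name} rating {rating} is outside the 1-7 scale")
    # A higher rating at exit implies growth, which contradicts a "no"
    # answer to the new-skills question.
    if exit_rating > entry_rating and not new_skills:
        flags.append("exit rating rose but 'any new skills' was answered no")
    return flags

print(flag_impossible(entry_rating=3, exit_rating=5, new_skills=False))
```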

How VT Identified Root Causes of Data Quality Issues
Disseminated the statewide VT 619 Implementation Survey (available on the conference website).
"The Survey Said…"
- Teachers and staff were not engaged in teaming practices
- The level of family engagement was minimal
- Need for technical assistance and professional learning

How are we improving teaming practices?
Personnel/Workforce
- Cross-agency (AOE and AHS) professional learning coordination
- Training modules and packets ready to launch fall 2017 (list available on the conference site)
Governance
- Developed an ECO Practice and Procedure Manual
- Universal PreK regulations include rules to ensure inclusion and access to the service delivery model
- PreK Monitoring System incorporates the ECSE Guiding Principles for Full Participation
Accountability
- Exploring the Local Child Outcomes Measurement Framework (LCOMS; available on the conference website) as part of the PreK monitoring system

Early Childhood Outcomes Practices and Procedures Training and TA
WHO are the learners?
- ECSE educators and related service personnel
- PreK educators (private and public)
- Administrators
WHAT is the content?
- P and P Manual
- Topics such as inclusion and ADA/504
HOW is the content delivered?
- State and local strategies
- Training and TA modules
- State early childhood website
- Face to face
76.37% of children in inclusive settings: 1,351 of 1,769 children total.

Evaluating the impact of the system improvements on teaming
- Pre-/post-implementation survey
- COS-TC Quality Practices Checklist and Descriptions (Younggren, Barton, Jackson, Swett, & Smyth, 2017)
- VT-LCOMS (installation phase)
- The proof is in the pudding: the ECO data!

Next steps
Quality Standards
- Revising the QRIS to reflect high-quality practices for children with disabilities, including teaming and assessment
Data System
- AOE is exploring "real-time" ECO reports for local and state use
- Planning ECO Data Days (DAZE!) for LEAs

WA Part C Early Intervention Services
Debi Donelan, Assistant Administrator of Training and Technical Assistance
Susan Franck, Part C Data Manager

Where we are coming from: State Context
- Housed within a department of early learning
- Currently have 25 local lead agencies (LLAs)
- System design work to address infrastructure needs, including priority areas: regionalization, resources, rules, and a robust data system
- SSIP focused on increasing social-emotional outcomes
- Online IFSP with real-time reporting available to multiple user types; COS embedded into the IFSP

Why did we prioritize child outcomes?
SSIP data analysis identified the following:
- Unexpected patterns related to entry ratings
- Inconsistent understanding of the Child Outcome Summary (COS) process
- Inconsistent involvement of families in the COS process
We also:
- Needed better measurement to understand the impact of SSIP improvement activities
- Had data reports available through our web-based system that locals were not using
- Did not have a structure in place to support local data use

Speaker notes: SSIP data analysis identified high entry ratings on the social-emotional outcome; in-depth data analysis (local team interviews) identified an inconsistent COS process, especially related to family involvement.

Identifying root causes for data quality issues
Completed extensive analysis of existing child outcomes data patterns (see the sketch below), including:
- Entry
- Exit
- Age at entry
- Length of time in service
- Race/ethnicity
- Gender
- Disability category
- Service area
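As a rough illustration of what that disaggregation might look like in practice, here is a short pandas sketch; the file and column names are hypothetical, not Washington's actual data system fields:

```python
# Hypothetical sketch of disaggregating COS entry ratings by the factors
# listed above. File and column names are invented for illustration.
import pandas as pd

cos = pd.read_csv("cos_records.csv")  # hypothetical extract of COS records

# Share of high (6-7) entry ratings on the social-emotional outcome,
# broken out by each factor examined in the root-cause analysis.
cos["high_entry"] = cos["entry_rating_outcome1"] >= 6
for factor in ["race_ethnicity", "gender", "disability_category", "service_area"]:
    print(cos.groupby(factor)["high_entry"].mean().round(2))
```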

How are we improving local data use in WA?
Data System
- Planning a revision to the data system to improve the quality of the data collected
Personnel/Workforce
- Developed a series of COS modules and required all providers to complete an assessment of their knowledge after completing the modules
- Completed "Fun with Data" activities at local lead agency meetings
- Focusing quarterly calls with local lead agency administrators on the use of child outcomes data
Governance
- System design plan aligns authority to ensure all early intervention providers complete required trainings
Accountability and Quality Improvement
- Requiring local lead agencies to complete the Local Child Outcomes Measurement Framework as part of their self-assessment

Developed an evaluation to see where each LLA was, across the levels of the system (SLA, LLA, providers, and families), related to:
- Finding reports
- Understanding reports
- Using reports to analyze data
- Using reports to monitor data
- Using reports to assess progress and make program adjustments

Fun with Data: activities at LLA meetings
November 2015: Fun with Data
- Supported LLAs to start looking at data, identifying patterns, and hypothesizing contributing factors
- Small groups compared de-identified local data to statewide data
November 2016: Fun with Data 2.0
- Supported LLAs to review their own data in comparison to statewide data
- Reviewed entry-by-exit data using the pattern checking tool (see the sketch below)

Speaker notes: At our outcomes cohort face-to-face meeting in November 2015, we examined local data patterns in child outcomes. On Outcome 1 (social-emotional), the focus of our SSIP work, we observed a high percentage of ratings of 6 or 7 at entry. We reviewed data disaggregated by local lead agency, identified programs where this was more of a concern, and developed an activity to complete with LLAs later that month. The purpose was to support LLAs to start looking at data and identifying patterns. We shared sample entry scores from two de-identified programs (labeled A and B) alongside statewide data and asked participants to identify a pattern in the local data that differed from the statewide data. (Program A had a very high percentage of COS entry scores of 3; Program B had a very high percentage of entry scores of 6.) Small groups hypothesized which contributing factors might have influenced the data, using the local contributing factors tool, which covers policies and procedures, infrastructure, data quality, training/technical assistance, supervision, and the Child Outcome Summary rating process. This activity was our first step in building data analysis into our LLA meetings.

This November we completed the second phase of the activity (Fun with Data 2.0) using each LLA's own data in comparison to statewide data, with the goal of continuing to build capacity at the local level. We provided each LLA with its own data and reviewed the Outcome 1 entry-by-exit report, which showed the numbers and percentages of exit scores in relation to entry scores. We shared the pattern checking tool and described the predicted pattern: functioning at entry in one outcome area will be related to functioning at exit in the same outcome area.
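A sketch of what the entry-by-exit report computes: for each entry rating, the distribution of exit ratings, where the predicted pattern is mass concentrated near the diagonal. The file and column names are hypothetical:

```python
# Hypothetical sketch of the Outcome 1 entry-by-exit report: row
# percentages of exit ratings for each entry rating.
import pandas as pd

cos = pd.read_csv("cos_records.csv")  # hypothetical COS extract

table = pd.crosstab(cos["entry_rating_outcome1"], cos["exit_rating_outcome1"])
row_pct = table.div(table.sum(axis=1), axis=0).mul(100).round(1)
print(row_pct)  # predicted pattern: values concentrated near the diagonal
```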

Fun with Data: activities at LLA meetings

Quarterly call topics
October 2016
- Reviewed quarterly call process
- Provided orientation to all COS reports
- Reviewed COS reports: progress codes and summary statements, the entry-by-exit report, and progress codes by ethnicity
January 2017
- Developed resource: https://del.wa.gov/sites/default/files/public/ESIT/COS_Review_Sheet.pdf
- LLAs demonstrated understanding of the COS process and reports
- Data activity: LLAs charted their progress category percentages for each outcome for the year and compared them to the statewide data (see the sketch below)
April 2017
- Reviewed COS resource: guiding questions and activity template
- Data activity: walked through the guiding questions with COS reports
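The progress category percentages charted in that activity come from the five OSEP progress categories (a-e), which are derived from the entry and exit COS ratings plus the new-skills question. A rough sketch of that derivation, treating the exact cutoffs and check order as assumptions rather than the official calculator rules:

```python
# Rough derivation of OSEP progress categories (a-e) from COS data.
# The cutoffs and ordering below paraphrase the standard logic and
# should be treated as assumptions, not the official calculator rules.
from collections import Counter

def progress_category(entry_rating, exit_rating, new_skills):
    if entry_rating >= 6 and exit_rating >= 6:
        return "e"  # maintained age-expected functioning
    if exit_rating >= 6:
        return "d"  # reached age-expected functioning
    if not new_skills:
        return "a"  # did not improve functioning
    if exit_rating > entry_rating:
        return "c"  # improved, moved nearer to same-aged peers
    return "b"      # improved, but no closer to same-aged peers

# Percentage of records in each category, as an LLA might chart them;
# the records here are made-up (entry, exit, new_skills) tuples.
records = [(2, 4, True), (6, 6, True), (3, 3, True), (1, 1, False)]
counts = Counter(progress_category(*r) for r in records)
print({cat: 100 * n / len(records) for cat, n in counts.items()})
```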

Evaluating the impact of the system improvements
- 93% of providers completed the COS modules and passed the quiz.
- Quarterly assessment of participants' comfort with data: the average "ability to access reports" score increased from 3.5 during call one in October to 4.2 during call two in January.
Challenges/lessons learned:
- Identifying a learning progression for data skills.
- There can be multiple people on a call with different levels of comfort with data, which makes it difficult to estimate a single score for the program.
- Progress has been made toward this outcome.

Next steps
- Support locals in replicating data activities with their providers.
- Revise the data management system to streamline reporting and better meet the needs of local administrators.

Discussion Questions
- What steps are you taking to improve child outcomes data quality?
- How are stakeholders included in child outcomes data quality efforts?
- Is this work part of your SSIP improvement strategies?
- Training and adult learning principles: are states interested in doing this, and what would their action steps be to work with their locals?
- Do you have resources available to build capacity at the local level? What is one step you could take?
- What are your action steps to improve child outcomes data quality as you leave this session?

Thank you
The contents of this tool and guidance were developed under grants from the U.S. Department of Education, #H326P120002 and #H373Z120002. However, those contents do not necessarily represent the policy of the U.S. Department of Education, and you should not assume endorsement by the Federal Government. Project Officers: Meredith Miceli, Richelle Davis, and Julia Martin Eile.