
ENHANCE Update: Research Underway on the Validity of the Child Outcomes Summary (COS) Process
ECO Center Advisory Board Meeting, March 8, 2012, Arlington, VA
ENHANCE is funded by grant R324A from the U.S. Department of Education, Institute of Education Sciences.

Topics
– Project design (review)
– Data collection progress
– What we are learning
  – Implementation
  – Child assessment study
  – State data study
– Next steps

Project Design

ENHANCE Project Objectives
1. Conduct a program of research to examine the validity of ratings generated by the COS process and identify conditions that lessen validity.
2. Revise the COS and supporting materials based on study findings.
3. Identify a series of validity analyses that can feasibly be conducted in states, allowing each state to examine the validity of its own COS data on an ongoing basis.

Validity – What Are We Trying to Demonstrate?
Validity is NOT a characteristic of an assessment or measurement device. Validity is a characteristic of the data produced by the tool and of how those data are used. Are the data valid for the purpose of…?
Implications:
– State A's COS data could be valid;
– State B's COS data might not be.
(Standards for Educational and Psychological Testing, 1999, by the American Educational Research Association, American Psychological Association, and National Council on Measurement in Education)

Validation Process
– Develop propositions (the validity argument): "If data were valid for this use, then we would see…"
– Collect evidence to examine each of those propositions.

Examples of Propositions in the COS Validity Argument
3. Children differ from one another with regard to level of functioning in the three outcome areas, as reflected in COS ratings.
7. Functioning (COS ratings) in an outcome area at time 1 is related to functioning in that area at a later point in time.
9. COS ratings will be related to the nature and severity of the child's disability.
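
To make the "if data were valid, then we would see…" logic concrete, here is a minimal sketch (in Python) of how a state might check proposition 3 by looking at how entry ratings spread across the 1–7 scale. The file and column names (e.g., outcome1_entry) are hypothetical assumptions, not the project's actual data files.

```python
import pandas as pd

# Hypothetical extract of COS entry ratings; one row per child.
# File name and column names are illustrative assumptions.
df = pd.read_csv("cos_entry_ratings.csv")

for outcome in ["outcome1_entry", "outcome2_entry", "outcome3_entry"]:
    counts = df[outcome].value_counts().sort_index()
    pct = (counts / counts.sum() * 100).round(1)
    # Proposition 3 predicts ratings spread across the 1-7 scale
    # rather than piling up on a single value.
    print(f"{outcome}: SD = {df[outcome].std():.2f}")
    print(pct.to_string())
```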

Design
– Conditions are not controlled (the COS process is studied as it is actually implemented).

Design: 37 Project Data Collection Sites
– 19 Programs (Part C): Illinois, Maine, Minnesota, New Mexico, North Carolina, Texas, Virginia
– 18 Districts (Part B Preschool): Illinois, Maine, Minnesota, New Mexico, South Carolina, Texas

Studies and Data Collection Progress

ENHANCE Studies
– Provider Survey
– Team Decision-Making
– Comparison with Child Assessments
– State Data Study

Provider Survey
Goals
– Learn about COS implementation and the processes in use
– Identify providers' knowledge and training experiences
– Describe perceptions about whether the COS produces an accurate rating and what influences that
– Understand the impact of the COS on practice
Process & Sample
– Online survey responses
– All providers in a program who participate in the COS
Status
– Survey underway, continues through April

Team Decision-Making Study
Goals
– Examine understanding and application of the outcomes and rating criteria
– Describe the team process
– Identify whether ratings are consistent with the evidence discussed
Process & Sample
– Video-record teams discussing COS ratings
– 210 children's teams
Status
– Data collection underway
– Code videos this summer and fall

What Are We Learning?

Implementation
Considerable variability across states, and even across programs within a state, in:
– Training
– Ongoing staff support and quality assurance
– Teaming (not just for the COS)
– Parent involvement
– Timing and process
Implication: Results will tell us about COS validity under real-world conditions.

Number of Providers in COS Ratings – Preliminary
[Chart: percentage of COS forms by number of providers contributing to the rating]

Comparison with Child Assessments Study
Goals
– Compare entry and exit COS ratings to BDI-2 and Vineland-II scores
– Compare conclusions from the COS and the assessments
Process & Sample
– Longitudinal; external assessor at program entry and program exit
– 216 children
Status
– Local, trained assessors in place
– Recruiting families since Aug.
– Sample shows expected variability, including in initial COS ratings and tool scores

Comparison with Child Assessments Study – Preliminary…

Validity argument claims
3. There is variability in children's functioning in the three outcome areas, and that variability is reflected in the COS ratings.

Distributions of Preliminary COS Ratings (1–7)
[Chart: rating distributions for EI (n=71) and ECSE (n=49)]

Validity argument claims
3. There is variability in children's functioning in the three outcome areas, and that variability is reflected in the COS ratings.
10. COS ratings in the corresponding outcome areas are moderately correlated with the social-emotional (Outcome 1), cognitive (Outcome 2), communication (Outcome 2), and adaptive (Outcome 3) domain scores of assessment tools.

Methods
[Methods table]

Preliminary Correlations Between COS Ratings and Assessment Tools
What do we expect to see?
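
Before the results, a minimal sketch of how such correlations could be computed, assuming a merged file in which each child has COS ratings and BDI-2/Vineland-II domain scores. All file and column names here (cos_outcome1, bdi2_cognitive, program_type, etc.) are hypothetical, not the project's actual variables; Spearman correlations are used because COS ratings are ordinal.

```python
import pandas as pd
from scipy.stats import spearmanr

df = pd.read_csv("cos_and_assessments.csv")  # hypothetical merged file

# Claim 10 pairs each outcome with the corresponding assessment domain(s).
pairs = [
    ("cos_outcome1", "bdi2_personal_social"),   # Outcome 1 ~ social-emotional
    ("cos_outcome2", "bdi2_cognitive"),         # Outcome 2 ~ cognitive
    ("cos_outcome2", "bdi2_communication"),     # Outcome 2 ~ communication
    ("cos_outcome3", "vineland_daily_living"),  # Outcome 3 ~ adaptive
]

for group, sub in df.groupby("program_type"):   # e.g., EI vs. ECSE
    for cos_col, assess_col in pairs:
        rho, p = spearmanr(sub[cos_col], sub[assess_col], nan_policy="omit")
        print(f"{group}: {cos_col} vs {assess_col}: rho = {rho:.2f} (p = {p:.3f})")
```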

Preliminary Correlations: COS Ratings & Assessment Scores
ECSE had larger COS-assessment correlations than EI.

Correlations: BDI-2 and Vineland-II Domains
ECSE had larger BDI-Vineland correlations than EI.

Methods
– COS Group 1: ratings of 1, 2, 3
– COS Group 2: ratings of 4, 5
– COS Group 3: ratings of 6, 7
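
A sketch of how this three-group comparison could be run, again with hypothetical column names (cos_outcome2 for the rating, bdi2_cognitive for the assessment score); the project's actual analysis may differ.

```python
import pandas as pd
from scipy.stats import f_oneway

df = pd.read_csv("cos_and_assessments.csv")  # hypothetical merged file

# Collapse the 7-point COS scale into the three groups defined on this slide.
bins = {1: 1, 2: 1, 3: 1, 4: 2, 5: 2, 6: 3, 7: 3}
df["cos_group"] = df["cos_outcome2"].map(bins)

# Compare assessment scores across the three COS groups.
print(df.groupby("cos_group")["bdi2_cognitive"].agg(["mean", "std", "count"]))
groups = [g["bdi2_cognitive"].dropna() for _, g in df.groupby("cos_group")]
print(f_oneway(*groups))  # expected: higher-rated groups score higher on average
```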

Outcome 1: Positive Social Relationships
[Chart: results for EI (n=71) and ECSE (n=49)]

Outcome 2: Acquiring and Using Knowledge and Skills
[Chart: results for EI (n=71) and ECSE (n=49)]

Outcome 3: Taking Appropriate Action to Meet Needs
[Chart: results for EI (n=71) and ECSE (n=49)]

Summary of Preliminary Findings
Patterns
– Group means generally follow the expected directions on the assessment tools
– Group comparisons showed expected differences
– Effect sizes were nearly all larger for ECSE than for EI
  – For COS-assessment tool comparisons
  – For comparisons between assessment tools administered by the same external assessor
– More data are needed for final conclusions

State Data Study
Goals
– Examine characteristics of COS data and their relationships to other variables
– Look for consistency in patterns across states to test claims
Sample
– All valid COS data within a state for a reporting year
– States conducting all analyses
– Additional states sharing select analyses
Status
– Piloted procedures with 3 Part C and 3 Part B Preschool states
– Now working with 4 Part C and 6 Part B Preschool states
– Recruiting more states and requesting data

Validity argument claims
7. Functioning, as reflected in the COS rating, in an outcome area at time 1 is related to functioning in that area at a later point in time.

Correlations: Entry and Exit Ratings – Part B 619 Preliminary State Data
[Chart: entry-exit correlations for Taking Appropriate Action to Meet Needs, by year, for States A, B, and C]
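
A sketch of the entry-exit correlation check behind this kind of chart, using a hypothetical state extract; the columns state, report_year, outcome3_entry, and outcome3_exit are assumptions for illustration.

```python
import pandas as pd
from scipy.stats import spearmanr

df = pd.read_csv("state_cos_entry_exit.csv")  # hypothetical state extract

# Proposition 7: entry and exit ratings in the same outcome should be
# positively related. Compute the correlation per state and reporting year.
for (state, year), sub in df.groupby(["state", "report_year"]):
    rho, _ = spearmanr(sub["outcome3_entry"], sub["outcome3_exit"],
                       nan_policy="omit")
    print(f"{state} {year}: entry-exit rho = {rho:.2f} (n = {len(sub)})")
```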

Validity argument claims
3. There is variability in children's functioning in the three outcome areas, and that variability is reflected in the COS ratings.
14. COS rating distributions at entry will be related to the disability-related characteristics of the population served by states.

Part C Entry Ratings Across States: Taking Appropriate Action to Meet Needs
[Chart: distribution of COS ratings (1–7) by state]
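
Claim 14 can be explored with a simple cross-tabulation of entry ratings by state. The sketch below assumes a pooled, hypothetical file with columns state and outcome3_entry; it is an illustration, not the project's analysis code.

```python
import pandas as pd

df = pd.read_csv("multistate_entry_ratings.csv")  # hypothetical pooled file

# Percentage of children at each Outcome 3 entry rating (1-7), by state.
# Differences in these distributions can then be compared against what is
# known about the disability-related characteristics of each state's
# served population (claim 14).
table = pd.crosstab(df["state"], df["outcome3_entry"], normalize="index") * 100
print(table.round(1))
```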

Next Steps and Reactions

Next Steps
– Gather more state data
– Complete data collection involving local programs/districts
– Analyze provider survey results
– Code videos

Questions? Reactions?
– Implications for the national data?
– Implications for ECO?

Find Out More
– ENHANCE Website –
– ECO Center Website –
– Contact ENHANCE staff –