Strategies for Maintaining Data Quality Using Commercial Assessment Systems
Nick Ortiz, Colorado Department of Education; Barb Jackson, University of Nebraska Medical Center-Munroe Meyer Institute

Presentation transcript:

Strategies for Maintaining Data Quality Using Commercial Assessment Systems
Nick Ortiz, Colorado Department of Education
Barb Jackson, University of Nebraska Medical Center-Munroe Meyer Institute
Jan Thelen, Nebraska Department of Education

Agenda
1. Review of online assessment systems in Colorado and Nebraska
2. Using automatically generated reports
3. Pattern checking with raw data
4. Helping local districts analyze data quality

How do we use online assessment systems?
– Uses: federal reporting, state reporting
– Assessment systems: CC.net and/or GOLD, COR
– Colorado: X X X X X; Nebraska: X X X
– Hierarchy: State → Program → Teacher → Child (no COSF rating)
Data output options:
1. Automated user reports
2. Raw data

General Reports: CreativeCurriculum.net

Raw Data & Mgmt Reports: CreativeCurriculum.net

General Reports: Online COR

Strategies Using Automated Reports
– Random checks by the state
– Targeted checks by the state
– State provides guidance to local school districts on running checks (administrator trainings, etc.)
– State prompts local districts to run their own checks

Pattern Checking with Automated Reports (A, H)
1) Have observations been entered?
– Observation Notes report, Classroom Status report
2) Are observations of good quality?
3) Is the corresponding score accurate?
– Check observation notes

Pattern Checking with Automated Reports (B, C, D)
Make sure all children are included in basic reports
– Snapshot or Gains report
– Verify that totals match the expected numbers of children in your system
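A minimal sketch of this totals check done outside the system on two exported files; the file names and columns (program, n_children, child_id) are assumptions for illustration:

```python
import pandas as pd

# Hypothetical exports: report totals per program, and the enrollment roster you expect.
report = pd.read_csv("snapshot_report.csv")    # columns: program, n_children
roster = pd.read_csv("enrollment_roster.csv")  # columns: program, child_id

# Count distinct children per program in the roster and compare to the report.
expected = roster.groupby("program")["child_id"].nunique().rename("expected")
comparison = report.set_index("program")["n_children"].rename("reported").to_frame().join(expected)
comparison["difference"] = comparison["reported"] - comparison["expected"]

# Programs whose report totals do not match expected enrollment.
print(comparison[comparison["difference"] != 0])
```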

Pattern Checking with Automated Reports (B, C, E)
Verify assessments are complete
– Snapshot, Gains, and Assessment Status reports
– Use of "not observed"/"missing data" is discouraged
– Frequent missing data could mean:
  – Lack of training
  – Insufficient time for observation
  – Invalid data
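A sketch of how frequent "not observed"/missing data could be quantified from an exported raw-data file; the file name and columns (program, teacher, child_id, item, score) are hypothetical:

```python
import pandas as pd

# Hypothetical raw export: one row per child, item, and checkpoint score.
scores = pd.read_csv("raw_item_scores.csv")  # columns: program, teacher, child_id, item, score

# Treat blank scores or a "not observed" code as missing.
scores["missing"] = scores["score"].isna() | scores["score"].astype(str).str.lower().eq("not observed")

# Missing-data rate by teacher; persistently high rates may signal training needs,
# insufficient observation time, or invalid data.
missing_by_teacher = (
    scores.groupby(["program", "teacher"])["missing"]
    .mean()
    .sort_values(ascending=False)
)
print(missing_by_teacher.head(10))
```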

Pattern Checking with Automated Reports (F)
– OSEP Entry Status report, Child List report
– Verify that children with IEPs/IFSPs have correct entry/exit dates, etc.
– Verify which children will be included in OSEP reporting
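A sketch of basic entry/exit date checks on a child-list export; column names are assumptions, and the six-month minimum should be confirmed against your state's OSEP inclusion rule:

```python
import pandas as pd

# Hypothetical child-list export for children with IEPs/IFSPs.
children = pd.read_csv("child_list.csv", parse_dates=["entry_date", "exit_date"])

# Records whose exit assessment date precedes the entry date.
bad_order = children[children["exit_date"] < children["entry_date"]]
print(f"{len(bad_order)} records with exit before entry")

# OSEP progress reporting generally includes children with at least six months
# between entry and exit assessments; flag who would be included.
months_enrolled = (children["exit_date"] - children["entry_date"]).dt.days / 30.44
children["include_in_osep"] = months_enrolled >= 6
print(children["include_in_osep"].value_counts())
```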

Pattern Checking with Automated Reports
Snapshot report
– Disaggregate by program, class, child, and/or domain
– Look for specific trouble areas that could skew data
– Analyze completion patterns at different levels

Strategies Using Raw Data

Pattern Checking with Raw Data
Correlation between Age and Raw Score by Outcome
– Established finding: developmental abilities increase with age
– This relationship should lead to high correlations between:
  – chronological age and raw score
  – chronological age and developmental age
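As a sketch, the correlation check could be run on exported child-level data; the file and column names (age_months, outcome1_raw, etc.) are assumptions:

```python
import pandas as pd

# Hypothetical export: one row per child, with age in months at assessment
# and a raw score for each of the three outcome areas.
df = pd.read_csv("entry_scores.csv")

# Pearson correlation between chronological age and raw score for each outcome;
# unexpectedly low or negative correlations can flag scoring or data-entry problems.
for col in ["outcome1_raw", "outcome2_raw", "outcome3_raw"]:
    r = df["age_months"].corr(df[col])
    print(f"{col}: r = {r:.2f}")
```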

Pattern Checking with Raw Data
Nebraska Summary: Correlations between chronological age and raw score for each Outcome, across assessments (including AEPS); the three Outcomes had n = 1704, 1678, and 1699. [Correlation coefficients were shown in the slide table.]

Pattern Checking with Raw Data
Examine the distribution across OSEP Summary Statements. What patterns do you see?
– Summary Statement 1: percent of children who substantially increased their rate of growth by exit
– Summary Statement 2: percent of children who were functioning within age expectations in each outcome by exit
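For reference, the two summary statements are computed from the counts of children in OSEP progress categories a–e; a small sketch with made-up counts:

```python
# Hypothetical counts of children in OSEP progress categories a-e for one outcome.
counts = {"a": 10, "b": 40, "c": 120, "d": 90, "e": 60}

# Summary Statement 1: of children who entered below age expectations (a-d),
# the percent who substantially increased their rate of growth (c and d).
ss1 = (counts["c"] + counts["d"]) / (counts["a"] + counts["b"] + counts["c"] + counts["d"])

# Summary Statement 2: percent of all children functioning within age
# expectations by exit (d and e).
ss2 = (counts["d"] + counts["e"]) / sum(counts.values())

print(f"Summary Statement 1: {ss1:.0%}")
print(f"Summary Statement 2: {ss2:.0%}")
```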

Pattern Checking with Raw Data
Degree to which outcomes are related to demographic characteristics:
– Gender
– Primary language in the home
– Ethnicity
– Disability
Run the online OSEP-mandated federal reports for Part C or Part B

Pattern Checking with Raw Data
– Are there differences you were not expecting by gender, primary language in the home, ethnicity, or disability?
– What are the programmatic implications?

Pattern Checking with Raw Data
Relationship with Disability
– Entry and exit scores and OSEP categories should be related to the nature of the child's disability
– For many, but not all, children with disabilities, progress in functioning in the three outcomes proceeds together

Pattern Checking with Raw Data
Comparison by Disability

Disability | SS1: Outcome A | SS2: Outcome A | SS1: Outcome B | SS2: Outcome B | SS1: Outcome C | SS2: Outcome C
Developmental Delay (n=99) | 47% | 67% | 72% | | 63% | 69%
Speech/Language Impairment (n=81) | 51% | 72% | 80% | 75% | 64% | 76%

Pattern Checking with Raw Data
Relationship with Gender, Ethnicity, and Primary Language
Hypothesis: entry and exit scores and OSEP categories will not be related to the child's gender, ethnicity, or primary language.

Pattern Checking with Raw Data
Comparison by Ethnicity

Ethnicity | SS1: Outcome A | SS2: Outcome A | SS1: Outcome B | SS2: Outcome B | SS1: Outcome C | SS2: Outcome C
White (n=183) | 46% | 68% | 74% | 72% | 50% | 68%
All other ethnicities (n=39) | 50% | 52% | 61% | 65% | 50% | 52%

Pattern Checking with Raw Data
Comparison by Primary Language

Primary Language | SS1: Outcome A | SS2: Outcome A | SS1: Outcome B | SS2: Outcome B | SS1: Outcome C | SS2: Outcome C
ELL (n=39) | 56% | 49% | 68% | 50% | 66% | 58%
English (n=252) | 64% | 75% | 76% | 78% | 67% | 77%
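A sketch of how subgroup comparisons like the tables above could be reproduced from child-level data once progress categories are assigned; the file and column names are assumptions:

```python
import pandas as pd

# Hypothetical export: one row per child per outcome with the OSEP progress
# category (a-e) and demographic fields already attached.
df = pd.read_csv("osep_categories.csv")  # columns: child_id, outcome, category, primary_language

def summary_statements(group):
    n = group["category"].value_counts().reindex(list("abcde"), fill_value=0)
    ss1 = (n["c"] + n["d"]) / max(n["a"] + n["b"] + n["c"] + n["d"], 1)
    ss2 = (n["d"] + n["e"]) / max(n.sum(), 1)
    return pd.Series({"SS1": round(ss1, 2), "SS2": round(ss2, 2), "n": int(n.sum())})

# Summary statements for each language group and outcome.
print(df.groupby(["primary_language", "outcome"]).apply(summary_statements))
```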

Work Sampling System: High frequency in OSEP Progress Category A
OSEP progress categories (tallied separately for non-IEP and IEP children, for each of Outcomes 1–3):
– A: Exit is 5 or lower, and is less than entry; no progress
– B: Exit is 5 or lower, and is less than or equal to entry; progress
– C: Exit is 5 or lower, and is higher than entry
– D: Exit is 6 or higher, and entry is 5 or lower
– E: Score of 6 or 7 at both entry and exit
[Counts and percentages by category were shown in the slide table.]

Pattern Checking with Raw Data
Looking closely at children who made no progress…
For each Outcome (1–3), the slide table reported: the total number of children who made no progress, how many of those children had identical scoring patterns in the fall and spring, and the percent of all no-progress children with identical scoring patterns across checkpoints.
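A sketch of the identical-pattern check on item-level export data; the checkpoint labels ("fall", "spring") and column names are assumptions:

```python
import pandas as pd

# Hypothetical item-level export: one row per child, checkpoint, and item.
scores = pd.read_csv("checkpoint_scores.csv")  # columns: child_id, checkpoint, item, score

# One row per child and checkpoint, one column per item.
wide = scores.pivot(index=["child_id", "checkpoint"], columns="item", values="score")

fall = wide.xs("fall", level="checkpoint")
spring = wide.xs("spring", level="checkpoint")
common = fall.index.intersection(spring.index)

# Children whose item-by-item pattern is identical in fall and spring,
# a possible sign of copied-forward or mis-administered assessments.
identical = (fall.loc[common] == spring.loc[common]).all(axis=1)
print(f"{identical.sum()} of {len(common)} children have identical fall/spring patterns")
```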

Pattern Checking – Children Who Make No Progress
– Children identified as making "no progress" may reflect mis-administered assessments
– Solution: professional development

Pattern Checking with Raw Data
Further Analyses
– Anomalies in funding source combinations
– Duplicate child records
– Erroneous birth dates
– Score profiles across the program year
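A sketch of checks for duplicate child records and erroneous birth dates; the file, column names, and the 0–6 year age range are assumptions to adjust for your program:

```python
import pandas as pd

# Hypothetical child roster export.
children = pd.read_csv("children.csv", parse_dates=["birth_date", "assessment_date"])

# Possible duplicates: the same name and birth date entered more than once.
dupes = children[children.duplicated(subset=["last_name", "first_name", "birth_date"], keep=False)]
print(f"{len(dupes)} possible duplicate records")

# Erroneous birth dates: ages outside the expected range for the program.
age_years = (children["assessment_date"] - children["birth_date"]).dt.days / 365.25
bad_birth_dates = children[(age_years < 0) | (age_years > 6)]
print(f"{len(bad_birth_dates)} records with implausible birth dates")
```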

Pattern Checking with Raw Data
Further Analyses
– Export the raw data and run statistical analyses to test the significance of patterns
– Export the child-level data and build cross tabs in Excel, e.g. disability by ELL status (see the sketch below)
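A sketch of the disability-by-ELL cross tab done in Python instead of Excel; the file and column names are assumptions, and the chi-square test is an optional addition:

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical child-level export with demographic fields.
df = pd.read_csv("child_level.csv")  # columns include: disability, ell_status

# Cross-tabulation of disability category by ELL status, with row/column totals.
xtab = pd.crosstab(df["disability"], df["ell_status"], margins=True)
print(xtab)

# Chi-square test of independence on the table without the margins.
chi2, p, dof, _ = chi2_contingency(pd.crosstab(df["disability"], df["ell_status"]))
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")
```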

Specific Strategies for Enabling Local Programs

Strategies for Enabling Local Programs (G)
Colorado
– Data “Chats”
– Screen-capture video tutorials
– Memo to local Head Start programs about misinterpreting patterns

NE: Navigating Reports in Online Systems: Support for Local Administrators
Use of reports to monitor quality
– Webinars
– Lab workshops
– Alerts with instructions: individualized by assessment
Facilitating communication with partners
– Clarifying roles and responsibilities across partnering agencies

NE: Using Data as Part of a Continuous Improvement Process
Annual Results Matter Institute
– National speakers
– Local districts share how they use reports
Individual consultation
SPP/APR
– Meeting targets
– Supporting improvement activities

Questions?

Thank You!