Strategies for Maintaining Data Quality Using Commercial Assessment Systems
Nick Ortiz, Colorado Department of Education
Barb Jackson, University of Nebraska Medical Center, Munroe Meyer Institute
Jan Thelen, Nebraska Department of Education
Agenda
1. Review of online assessment systems in Colorado and Nebraska
2. Using automatically generated reports
3. Pattern checking with raw data
4. Helping local districts analyze data quality
How do we use online assessment systems?
[Table: uses (federal reporting, state reporting) and systems (CreativeCurriculum.net/GOLD, Online COR) marked for Colorado and Nebraska; cell markings not recovered from the source]
Hierarchy: State > Program > Teacher > Child
No COSF rating
Data output options:
1. Automated user reports
2. Raw data
General Reports: CreativeCurriculum.net
Raw Data & Mgmt Reports: CreativeCurriculum.net
General Reports: Online COR
Strategies Using Automated Reports
- Random checks by state
- Targeted checks by state
- State provides guidance to local school districts on running checks (administrator trainings, etc.)
- State prompts local districts to run their own checks
Pattern Checking with Automated Reports
1) Have observations been entered? (Observation Notes report, Classroom Status report)
2) Are observations of good quality?
3) Is the corresponding score accurate? (Check observation notes)
Pattern Checking with Automated Reports
- Make sure all children are included in basic reports (Snapshot or Gains report)
- Verify totals match the expected number of children in your system
Pattern Checking with Automated Reports
- Verify assessments are complete (Snapshot, Gains, and Assessment Status reports)
- Use of "not observed"/"missing data" is discouraged
- Frequent missing data could mean:
  – Lack of training
  – Insufficient time for observation
  – Invalid data
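When raw data are exported, the missing-data check above can be automated. The sketch below is a minimal illustration, assuming a simple record layout of (classroom, child, item score) with `None` marking "not observed"; the 10% flag threshold is also an assumption, not a vendor or state rule:

```python
# Flag classrooms with high rates of "not observed"/missing item scores,
# which may signal training gaps or insufficient observation time.
# Records and threshold are hypothetical, for illustration only.
records = [
    # (classroom, child_id, item_score); None = "not observed"/missing
    ("Room A", 1, 4), ("Room A", 1, None), ("Room A", 2, 5), ("Room A", 2, None),
    ("Room B", 3, 3), ("Room B", 3, 4), ("Room B", 4, 5), ("Room B", 4, 4),
]

def missing_rate_by_classroom(rows):
    """Return the fraction of missing item scores per classroom."""
    totals, missing = {}, {}
    for room, _child, score in rows:
        totals[room] = totals.get(room, 0) + 1
        if score is None:
            missing[room] = missing.get(room, 0) + 1
    return {room: missing.get(room, 0) / totals[room] for room in totals}

rates = missing_rate_by_classroom(records)
flagged = [room for room, rate in rates.items() if rate > 0.10]  # assumed cutoff
```

A flagged classroom is a prompt for follow-up (checking observation notes, offering training), not proof of invalid data.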
Pattern Checking with Automated Reports
- OSEP Entry Status report, Child List report
- Verify children with IEPs/IFSPs have correct entry/exit dates, etc.
- Verify which children will be included in OSEP reporting
Pattern Checking with Automated Reports
Snapshot report:
- Disaggregate by program, class, child, and/or domain
- Look for specific trouble areas that could skew data
- Analyze completion patterns at different levels
Strategies Using Raw Data
Pattern Checking with Raw Data
Correlation between Age and Raw Score by Outcome
- It is an established finding that developmental abilities increase with age
- This relationship should produce high correlations between:
  – chronological age and raw score
  – chronological age and developmental age
Pattern Checking with Raw Data
Nebraska Summary: Correlations between chronological age and raw score for each Outcome across assessments
[Table: correlations by assessment (AEPS, etc.) for Outcome 1 (n=1,704), Outcome 2 (n=1,678), and Outcome 3 (n=1,699); correlation values not recovered from the source]
Pattern Checking with Raw Data
Examine the distribution across OSEP Summary Statements. What patterns do you see?
- Summary Statement 1: percent of children who substantially increased their rate of growth by exit
- Summary Statement 2: percent of children who were functioning within age expectations in each outcome by exit
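For reference, both summary statements can be computed directly from counts of children in OSEP progress categories a through e. The sketch below uses the standard OSEP summary-statement formulas; the counts are made up for illustration:

```python
# Standard OSEP summary-statement formulas, computed from the number of
# children in each progress category (a-e). Counts below are illustrative.
def summary_statements(a, b, c, d, e):
    # SS1: of children who entered below age expectations (a-d),
    # the share who substantially increased their rate of growth (c, d).
    ss1 = (c + d) / (a + b + c + d)
    # SS2: of all children (a-e), the share functioning within
    # age expectations at exit (d, e).
    ss2 = (d + e) / (a + b + c + d + e)
    return ss1, ss2

ss1, ss2 = summary_statements(a=5, b=15, c=30, d=20, e=30)
```

Recomputing the statements this way from raw category counts is a quick cross-check on the vendor system's automated OSEP report.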
Pattern Checking with Raw Data
Degree to which outcomes are related to demographic characteristics:
- Gender
- Primary language in the home
- Ethnicity
- Disability
Run the online OSEP-mandated federal reports for Part C or Part B.
Pattern Checking with Raw Data
- Are there differences that you were not expecting by gender, primary language in the home, ethnicity, or disability?
- What are the programmatic implications?
Pattern Checking with Raw Data
Relationship with Disability
- Entry and exit scores and OSEP categories should be related to the nature of the child's disability
- For many, but not all, children with disabilities, progress in functioning in the three outcomes proceeds together
Pattern Checking with Raw Data
Comparison by Disability (SS1/SS2 = Summary Statements 1 and 2)
Disability                        | SS1-A | SS2-A | SS1-B | SS2-B | SS1-C | SS2-C
Developmental Delay (n=99)        | 47%   | 67%   | 72%   | —     | 63%   | 69%
Speech/Language Impairment (n=81) | 51%   | 72%   | 80%   | 75%   | 64%   | 76%
(One Developmental Delay value was not recoverable from the source.)
Pattern Checking with Raw Data
Relationship with Gender, Ethnicity, and Primary Language
Hypothesis: there will not be a relationship between entry and exit scores and OSEP categories by the child's gender, ethnicity, or primary language.
Pattern Checking with Raw Data
Comparison by Ethnicity (SS1/SS2 = Summary Statements 1 and 2)
Ethnicity                   | SS1-A | SS2-A | SS1-B | SS2-B | SS1-C | SS2-C
White (n=183)               | 46%   | 68%   | 74%   | 72%   | 50%   | 68%
All other ethnicities (n=39)| 50%   | 52%   | 61%   | 65%   | 50%   | 52%
Pattern Checking with Raw Data
Comparison by Primary Language (SS1/SS2 = Summary Statements 1 and 2)
Primary Language | SS1-A | SS2-A | SS1-B | SS2-B | SS1-C | SS2-C
ELL (n=39)       | 56%   | 49%   | 68%   | 50%   | 66%   | 58%
English (n=252)  | 64%   | 75%   | 76%   | 78%   | 67%   | 77%
Work Sampling System: high frequency in OSEP Progress Category A
Category definitions (applied to each of Outcomes 1–3):
- A: Exit is 5 or lower, and is less than entry; no progress
- B: Exit is 5 or lower, and is less than or equal to entry; progress
- C: Exit is 5 or lower, and is higher than entry
- D: Exit is 6 or higher, and entry is 5 or lower
- E: Score of 6 or 7 at both entry and exit
[Table: N and % of non-IEP and IEP children in each category, by outcome; values not recovered from the source]
Pattern Checking with Raw Data
Looking closely at children who made no progress:
[Table: for each outcome area, the total number of children who made no progress, the number of those with identical scoring patterns in fall and spring, and the percent of all no-progress children with identical patterns across checkpoints; values not recovered from the source]
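This "identical pattern" check is easy to script against exported item-level data. A minimal sketch, assuming a hypothetical layout mapping each child to fall and spring item-score lists (the values are made up):

```python
# Among children whose total score did not increase, find those whose
# item-level ratings are identical in fall and spring -- a pattern that
# often indicates the spring assessment was copied forward rather than
# re-observed. Data layout and values are hypothetical.
children = {
    # child_id: (fall_item_scores, spring_item_scores)
    101: ([3, 4, 2], [3, 4, 2]),   # identical pattern, no progress
    102: ([3, 4, 2], [2, 4, 3]),   # same total, but a different pattern
    103: ([2, 2, 2], [3, 3, 3]),   # progress
}

no_progress = {cid for cid, (fall, spring) in children.items()
               if sum(spring) <= sum(fall)}
identical = {cid for cid in no_progress
             if children[cid][0] == children[cid][1]}
pct_identical = len(identical) / len(no_progress)
```

Children flagged by `identical` are the ones whose records merit a closer look at observation notes before concluding they truly made no progress.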
Pattern Checking – Children Who Make No Progress
- A "no progress" finding may reflect a mis-administered assessment rather than a true lack of growth
- Solution: targeted professional development
Further Analyses
- Anomalies in funding source combinations
- Duplicate child records
- Erroneous birth dates
- Score profiles across the program year
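The duplicate-record and birth-date checks can be scripted against a child-level export. A minimal sketch under stated assumptions: records are (name, birth date) pairs, duplicates are defined as an exact name-plus-birth-date match, and the plausible birth-year window is invented for illustration:

```python
from datetime import date

# Flag duplicate child records (same name + birth date) and birth dates
# outside a plausible range for a preschool program. The matching rule,
# date window, and records are all illustrative assumptions.
rows = [
    ("Ana Diaz", date(2007, 5, 1)),
    ("Ana Diaz", date(2007, 5, 1)),   # duplicate entry
    ("Ben Ito", date(1907, 5, 1)),    # likely a typo in the birth year
]

seen, duplicates, bad_birthdates = set(), [], []
for name, dob in rows:
    key = (name.lower(), dob)
    if key in seen:
        duplicates.append(key)
    seen.add(key)
    if not date(2003, 1, 1) <= dob <= date(2010, 12, 31):  # assumed window
        bad_birthdates.append(name)
```

In practice a fuzzier match (nicknames, transposed dates) catches more duplicates, but even this exact-match pass surfaces obvious double entries.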
Further Analyses
- Export the raw data and run statistical analyses to test the significance of patterns
- Export the child-level data and build cross tabs in Excel, e.g. disability by ELL status
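The same cross tab can be built outside Excel. A sketch using pandas on a hypothetical child-level export; the column names and values are illustrative assumptions, not the vendor's export schema:

```python
import pandas as pd

# Disability-by-ELL cross tab from a hypothetical child-level export.
df = pd.DataFrame({
    "disability": ["DD", "SLI", "DD", "SLI", "DD"],
    "ell":        ["yes", "no", "no", "no", "yes"],
})

# Rows = disability category, columns = ELL status, cells = child counts.
xtab = pd.crosstab(df["disability"], df["ell"])
```

Unexpected cell counts (for example, a disability category made up almost entirely of ELL children) are the kind of pattern worth discussing with local programs before drawing conclusions from outcome comparisons.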
Specific Strategies for Enabling Local Programs
Strategies for Enabling Local Programs
Colorado:
- Data "chats"
- Screen-capture video tutorials
- Memo to local Head Start programs about misinterpreting patterns
NE: Navigating Reports in Online Systems: Support for Local Administrators
Use of reports to monitor quality:
- Webinars
- Lab workshops
- Alerts with instructions, individualized by assessment
Facilitating communication with partners:
- Clarifying roles and responsibilities across partnering agencies
NE: Using Data as Part of a Continuous Improvement Process
Annual Results Matter Institute:
- National speakers
- Local districts share how they use reports
Individual consultation
SPP/APR:
- Meeting targets
- Supporting improvement activities
Questions?
Thank You!