Quality Assurance: Looking for Quality Data

"I know it is in here somewhere"
Presented by The Early Childhood Outcomes Center
Revised January 2013

Activity
- What factors work to improve the quality of your data?
- What factors work to lessen the quality of your data?
- How can these factors be addressed?

Take-Home Message
If you conclude the data are not (yet) valid, they cannot be used for measuring program effectiveness, for program improvement, or for anything else.
What do you do if the data are not as good as they should be?
Answer: Continue to improve data collection through ongoing quality assurance.

Many steps to ensuring quality data
Before:
- Good data collection/training
- Good data system and data entry procedures
During:
- Ongoing supervision of implementation
- Feedback to implementers
- Refresher training
After:
- Review of COSF records
- Data analyses for validity checks
- Representativeness of the responses

Promoting quality data
- Training and support before and during data collection
- Analysis of the data after data collection
- Data system and verification after data collection

Promoting Quality Data
Through training and communication related to:
- Assessment
- Understanding the COS process
- Age expectations
- Data entry

Promoting Quality Data
Through training materials, such as:
- Video examples of teams and children
- Written child examples
- "Quizzes" to check learning
- Refresher trainings (beware of drift!)

Promoting Quality Data
Through data systems and verification, such as:
- Data system error checks
- Good data entry procedures
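To make the idea of data system error checks concrete, here is a minimal sketch of automated data-entry checks. The column names (child_id, entry_date, exit_date, outcome ratings) are hypothetical; this is an illustration, not any state's actual data system.

```python
import pandas as pd

# Minimal sketch of automated data-entry checks; column names are
# hypothetical and would differ in a real COS data system.
records = pd.DataFrame({
    "child_id": [101, 102, 103],
    "entry_date": pd.to_datetime(["2012-09-01", "2012-09-15", "2012-10-01"]),
    "exit_date": pd.to_datetime(["2013-05-30", "2012-09-10", None]),
    "outcome1_rating": [5, 8, 3],        # COS ratings must fall between 1 and 7
    "outcome2_rating": [4, 2, None],
})

def check_record(row):
    """Return a list of data-entry problems found in one record."""
    problems = []
    for col in ("outcome1_rating", "outcome2_rating"):
        value = row[col]
        if pd.isna(value):
            problems.append(f"{col} is missing")
        elif not 1 <= value <= 7:
            problems.append(f"{col}={value} is outside the 1-7 scale")
    if pd.notna(row["exit_date"]) and row["exit_date"] <= row["entry_date"]:
        problems.append("exit_date is not after entry_date")
    return problems

for _, row in records.iterrows():
    issues = check_record(row)
    if issues:
        print(f"child {row['child_id']}: " + "; ".join(issues))
```

Checks like these catch entry errors early, before ratings flow into state reporting.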

Many steps to ensuring quality data
Before:
- Good data collection/training
- Good data system and data entry procedures
During:
- Ongoing supervision of implementation
- Feedback to implementers
- Refresher training
After:
- Review of COSF records
- Data analyses for validity checks
- Representativeness of the responses

Ongoing supervision
Review of the process:
- Is the process high quality?
- Are teams reaching the correct rating?
Methods:
- Observation
- Videos

Quality Review of COS Team Discussion
- Do all team members participate in the discussion?
- Is parent input considered in the rating? Give examples.
- Is the team documenting the rating discussion? Give examples.
- Does the team discuss multiple assessment sources? What are they?

Quality Review of COS Team Discussion
- Does the team describe the child's functioning, rather than just test scores? Give examples.
- Does the discussion include the child's full range of functioning, including skills and behaviors that are age appropriate, immediate foundational, and leading to immediate foundational? Give examples.

Quality review through process checks
Provider surveys:
- Self-assessment of competence
- Knowledge checks
- Process descriptions (who participates?)
- Identification of barriers
Examples: Kansas' survey, Alaska's survey

Questions from Alaska's Survey

3. How would you rate your own level of proficiency with the COSF process? (please select only one)
- I am confident I know how to do it, and I do it well
- I know how to do it, but I need some more practice and assistance
- I understand it to a point, but I need more training
- I do not know how to do this yet

4. Some cases are different from others, but of the choices below, which process seems to be the most typical in your experience? (please select only one)
- I gather information and determine COSF ratings on my own
- I gather information and consult with another provider to determine COSF ratings
- I gather information and consult with the family to determine COSF ratings
- I gather information, discuss it with a team, and the team determines the COSF ratings

Ongoing Supervision
- Feedback to teams is critical
- Refresher training
- Beware of: auto pilot, drift

Quality Review of COS Team Discussion: Activity
- Observe team video
- Evaluate quality

Many steps to ensuring quality data
Before:
- Good data collection/training
- Good data system and data entry procedures
During:
- Ongoing supervision of implementation
- Feedback to implementers
- Refresher training
After:
- Review of COSF records
- Data analyses for validity checks
- Representativeness of the responses

Quality Review of Completed COS Forms
- Is the COSF complete?
- Is there adequate evidence for the basis for the rating?
- Does the evidence match the appropriate outcome area?
- Is the evidence based on functional behaviors?

Quality Review of Completed COS Forms
- Is there evidence that the child's functioning across settings and situations was considered?
- Are the ratings consistent with the evidence?
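Part of this review can be screened automatically before a human reviewer reads the form, for example flagging records with missing ratings or little documented evidence. The sketch below is a hypothetical completeness screen with made-up field names and an assumed evidence-length threshold; judging whether the evidence is functional and matches the outcome area still requires a human reviewer.

```python
# Hypothetical completeness screen for COS form records; field names and the
# evidence-length threshold are assumptions for illustration only.
MIN_EVIDENCE_WORDS = 5

def screen_cos_form(form: dict) -> list[str]:
    """Flag obviously incomplete parts of a COS form record."""
    flags = []
    for outcome in ("outcome1", "outcome2", "outcome3"):
        rating = form.get(f"{outcome}_rating")
        evidence = (form.get(f"{outcome}_evidence") or "").strip()
        if rating is None:
            flags.append(f"{outcome}: rating is missing")
        if len(evidence.split()) < MIN_EVIDENCE_WORDS:
            flags.append(f"{outcome}: little or no documented evidence")
    if not form.get("assessment_sources"):
        flags.append("no assessment sources listed")
    return flags

example = {
    "outcome1_rating": 5,
    "outcome1_evidence": "Plays alongside peers and follows two-step directions at snack time.",
    "outcome2_rating": None,
    "outcome2_evidence": "",
    "outcome3_rating": 6,
    "outcome3_evidence": "Uses three-word sentences to ask adults for help.",
    "assessment_sources": [],
}
print(screen_cos_form(example))
# -> ['outcome2: rating is missing', 'outcome2: little or no documented evidence',
#     'no assessment sources listed']
```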

Quality Review of COS Forms: Activity
- Review a completed COS form with errors

Promoting quality data through data analysis
- Examine the data for inconsistencies.
- If/when you find something strange, look for other data that might help explain it.
- Is the variation caused by something other than bad data?
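One way to look for inconsistencies is to screen for rating patterns that would be unusual if teams were rating carefully, such as a provider whose children all receive identical or uniformly high ratings. The sketch below is a minimal illustration with made-up data and a hypothetical provider column; a flag is a prompt to look closer, not proof of bad data.

```python
import pandas as pd

# Minimal sketch of screening for "strange" rating patterns by provider;
# data and column names are made up for illustration.
ratings = pd.DataFrame({
    "provider": ["A", "A", "A", "B", "B", "B", "B", "C", "C"],
    "outcome1_exit": [7, 7, 7, 3, 5, 6, 4, 2, 6],
})

for provider, group in ratings.groupby("provider"):
    values = group["outcome1_exit"]
    if values.nunique() == 1:
        print(f"provider {provider}: every child has the same rating ({values.iloc[0]})")
    elif (values >= 6).all():
        print(f"provider {provider}: all ratings are 6 or 7")
```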

The validity of your data is questionable if…
The overall pattern in the data looks "strange":
- Compared to what you expect
- Compared to other data
- Compared to similar states, regions, or school districts

COS Ratings - Fall
[Table: distribution of fall COS ratings (1-7) by classroom (Class 1 through Class 4); cell values were not recovered.]
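A table like the one above can be produced directly from child-level data. The sketch below builds a percentage distribution of fall ratings by classroom from made-up data (column names are hypothetical), so a class whose entering ratings look very different from its peers stands out.

```python
import pandas as pd

# Minimal sketch: percentage of children at each fall COS rating, by classroom.
# Data and column names are made up for illustration.
data = pd.DataFrame({
    "classroom": ["Class 1"] * 5 + ["Class 2"] * 5,
    "fall_rating": [2, 3, 3, 5, 6, 6, 6, 7, 7, 7],
})

counts = pd.crosstab(data["fall_rating"], data["classroom"])
percents = counts.div(counts.sum(axis=0), axis=1).mul(100).round(0)
print(percents)  # a class where nearly every child enters at 6-7 may merit review
```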

Outcome 3: Appropriate Action
[Table: crosstab of fall (entry) ratings by spring (exit) ratings, 1-7, with row and column totals (overall N = 691); some fall/spring combinations are marked "Review". Individual cell values were not reliably recovered.]
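The kind of fall-by-spring crosstab summarized above, including a flag for children whose spring rating is lower than their fall rating, can be generated with a few lines of code. The sketch below uses made-up ratings and hypothetical column names.

```python
import pandas as pd

# Minimal sketch of a fall-by-spring rating crosstab with a review flag for
# ratings that went down; data and column names are made up for illustration.
df = pd.DataFrame({
    "fall":   [2, 3, 5, 5, 6, 4, 7, 3],
    "spring": [4, 5, 6, 3, 7, 4, 6, 5],
})

table = pd.crosstab(df["fall"], df["spring"], margins=True, margins_name="Total")
print(table)

needs_review = df[df["spring"] < df["fall"]]
print(f"{len(needs_review)} record(s) where the spring rating is below the fall rating")
```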

OSEP Categories (%)

Category | Class 1 (%) | Class 2 (%) | Class 3 (%)
e. Maintained age-appropriate trajectory | 23 | 16 | 24
d. Changed trajectory - age appropriate | 15 | 13 |
c. Changed trajectory - closer to age appropriate | 32 | 34 | 37
b. Same trajectory - progress | 28 | 21 | 25
a. Flat trajectory - no progress | 2 | 6 | 1
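The five OSEP progress categories in the table above are typically derived from each child's entry and exit COS ratings plus the question about whether the child acquired any new skills. The sketch below follows the ECO crosswalk as commonly described, but it is an illustration; verify the exact rules against current ECO/ECTA guidance before using them for reporting.

```python
# Rough sketch of deriving the five OSEP progress categories (a-e) from entry
# and exit COS ratings plus the "acquired any new skills" question. Verify the
# exact rules against current ECO/ECTA guidance before relying on this.

def osep_category(entry: int, exit_rating: int, new_skills: bool) -> str:
    if entry >= 6 and exit_rating >= 6:
        return "e"  # maintained age-appropriate functioning
    if exit_rating >= 6:
        return "d"  # reached age-appropriate functioning by exit
    if exit_rating > entry:
        return "c"  # moved closer to age expectations, not yet there
    if new_skills:
        return "b"  # made progress without moving closer to age expectations
    return "a"      # did not make progress

print(osep_category(entry=3, exit_rating=5, new_skills=True))  # -> c
print(osep_category(entry=6, exit_rating=7, new_skills=True))  # -> e
print(osep_category(entry=4, exit_rating=4, new_skills=True))  # -> b
```

Tabulating the share of children in each category by classroom, as in the table above, is then a simple count-and-percentage step.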

Questions to ask
- Do the data make sense? Am I surprised?
- Do I believe the data? Some of the data? All of the data?
- If the data are reasonable (or when they become reasonable), what might they tell us?
You can't use data for program improvement until you believe them.

Validity
Validity refers to the use of the information:
- Do evidence and theory support the interpretation of the data for the proposed use?
- Or: Are you justified in reaching the inference you are reaching, based on the data?

Source: Standards for Educational and Psychological Testing (1999), American Educational Research Association, American Psychological Association, and National Council on Measurement in Education

The validity of your data is questionable if: ?

The validity of your data is questionable if:
- …not all providers are knowledgeable about the COS process
- …not all providers are careful with the COS process
- …the data look "strange"
- …etc.

Many steps to ensuring quality data
Before:
- Good data collection/training
- Good data system and data entry procedures
During:
- Ongoing supervision of implementation
- Feedback to implementers
- Refresher training
After:
- Review of COSF records
- Data analyses for validity checks
- Representativeness of the responses