Looking at Data Presented by The Early Childhood Outcomes Center


Looking at Data. Presented by The Early Childhood Outcomes Center. Revised January 2013.

Using data for program improvement = EIA: Evidence, Inference, Action.

Evidence: Evidence refers to the numbers, such as "45% of children in category b." The numbers themselves are not debatable.

Inference: How do you interpret the numbers? What can you conclude from them? Does the evidence mean good news? Bad news? News we can't interpret? To reach an inference, we sometimes analyze the data in other ways (that is, we ask for more evidence).

Inference: Inference is debatable; even reasonable people can reach different conclusions from the same set of numbers. Stakeholder involvement can be helpful in making sense of the evidence.

Action: Given the inference from the numbers, what should be done? Actions take the form of recommendations or action steps. Action can be debatable, and often is; this is another role for stakeholders.

What can we infer? Poll results A: Candidate I.M. Good 51%, Candidate R.U. Kidding 49% (margin of error ±3%). Poll results B: Candidate I.M. Good 56%, Candidate R.U. Kidding 44% (margin of error ±3%). (In Poll A the gap falls within the margin of error, so no winner can be inferred; in Poll B it does not.)

Program improvement: where and how. At the state level: TA and policy. At the regional or local level: supervision and guidance. At the classroom level: spending more time on certain aspects of the curriculum. At the child level: modifying the intervention. There are different program improvement levers at different levels. This presentation focuses primarily on state-level use of the information, although some state applications translate directly to smaller units. How interventionists or teachers use outcome data for program improvement is a different topic; it is very important, but it is not covered here.

Key points: Evidence refers to the numbers, and the numbers by themselves are meaningless. Inference is attached by those who read (interpret) the numbers. You have the opportunity and the obligation to attach meaning. You cannot prevent the misuse of data, but you can set up conditions to make it less likely.

E-I-A Jeopardy (activity): Participants identify whether each example on the game board is evidence, inference, or action. Examples:
- COS users unaware of the need to answer the yes/no progress question
- 90% of exit COSs in Program B missing a response to the yes/no progress question
- Revise COS procedures to emphasize completion of the yes/no progress question
- Conduct staff development on using the 7-point rating scale
- 75% of children in Program A received entry ratings of 2
- COS users misunderstand the definition of points on the 7-point scale
- Currently used tools are not accurately assessing children's social-emotional skills
- Invest resources in materials for assessing social-emotional skills
- 45% of children reported in category 'e' for statewide progress data, Outcome 1

Use of Data Activity: Evidence-Inference-Action.

Continuous Program Improvement (cycle diagram): Plan (vision: program characteristics, child and family outcomes) → Implement → Check (collect and analyze data) → Reflect (Are we where we want to be?) → back to Plan.

Tweaking the System (the same cycle, with questions at each stage): Plan (vision): What should be done? Implement: Is it being done? Check (collect and analyze data): Is it working? Is there a problem? Reflect: Are we where we want to be? Why is it happening?

Continuous means… the cycle never ends.

Outcome questions for program improvement, e.g., who has good outcomes? Do outcomes vary by region of the state? By level of functioning at entry? Services received? Age at entry to service? Type of services received? Family outcomes? Education level of the parent?

Examples of process questions: Are ALL services high quality? Are ALL children and families receiving ALL the services they should in a timely manner? Are ALL families being supported in being involved in their child's program? What are the barriers to high-quality services?

Working assumptions: There are some high-quality services and programs being provided across the state. There are some children who are not getting the highest-quality services. If we can find ways to improve those services and programs, these children will experience better outcomes.

Numbers as a tool. Heard on the street: "Why are we reducing children to a number?" So why do we need numbers?

The need to aggregate: data on children in a given classroom or caseload.

The need to aggregate: across children within the school or program.

The need to aggregate: across districts or programs.

The need to aggregate: across the country.

Examining COS data at one time point. For one group: frequency distributions, as tables or graphs. For comparing groups: averages.

Distribution of COS Ratings in Fall, Outcome 1 (fake data for illustration)

Rating    N     %
7         350   70
6         110   22
5         20    4
4         8     1.6
3         6     1.2
2         4     0.8
1         2     0.4
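
A frequency distribution like the one above can be produced directly from child-level ratings. A minimal sketch in Python with pandas, assuming a flat export with hypothetical column names (child_id, outcome1_entry_rating):

```python
# Minimal sketch: frequency distribution of COS entry ratings.
# Column names and the sample ratings are illustrative placeholders.
import pandas as pd

ratings = pd.DataFrame({
    "child_id": range(1, 11),
    "outcome1_entry_rating": [7, 7, 6, 7, 5, 6, 7, 4, 7, 6],
})

counts = ratings["outcome1_entry_rating"].value_counts().sort_index(ascending=False)
percents = (counts / counts.sum() * 100).round(1)

freq_table = pd.DataFrame({"N": counts, "%": percents})
freq_table.index.name = "Rating"
print(freq_table)
```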

[Chart: Frequency on Outcome 1, Fall]

[Chart: Comparison of two classes, Fall]

[Chart: Frequency on Outcome 1, Class 1]

Average Scores on Outcomes by Class, Fall 2008

Class         Social-Emotional   Knowledge and Skills   Action to Meet Needs
1             4.5                4.6                    4.7
2             5.3                5.2
3             4.9
4             6.4                5.9                    6.6
5             4.3
6             3.8                2.9                    3.9
All Classes   5.03               4.63                   4.95
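
A table like this is a simple group-by average. A minimal pandas sketch, assuming hypothetical column names for the class identifier and the three outcome ratings:

```python
# Minimal sketch: average COS ratings by class, plus an all-classes row.
# Column names and values are illustrative placeholders.
import pandas as pd

ratings = pd.DataFrame({
    "class_id":             ["1", "1", "2", "2", "3", "3"],
    "social_emotional":     [4, 5, 6, 5, 5, 4],
    "knowledge_skills":     [5, 4, 5, 5, 4, 5],
    "action_to_meet_needs": [4, 5, 5, 6, 5, 5],
})

by_class = ratings.groupby("class_id").mean().round(2)
by_class.loc["All Classes"] = ratings.drop(columns="class_id").mean().round(2)
print(by_class)
```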

Looking at change over time: the extent of change on the rating scale; the OSEP progress categories; developmental trajectories (maintaining or changing).

Extent of change on rating scale, Time 1 to Time 2, Outcome 1

Progress                                                  N     %
Maintained age-expected functioning                       350   70
Maintained same level of functioning, not age-expected    60    12
Gained 3 steps                                            10    2
Gained 2 steps                                            25    5
Gained 1 step                                             50    10
Dropped 1 step                                            4     0.8
Dropped 2 steps                                           1     0.2
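
A minimal sketch of how such change categories might be derived from entry and exit ratings. The column names are hypothetical, and the rule that ratings of 6 or 7 count as age-expected follows common COS guidance; confirm it against your state's definitions:

```python
# Minimal sketch: categorizing change in COS ratings from Time 1 to Time 2.
import pandas as pd

cos = pd.DataFrame({
    "child_id":    [1, 2, 3, 4, 5],
    "outcome1_t1": [7, 5, 3, 6, 2],
    "outcome1_t2": [7, 6, 6, 5, 2],
})

AGE_EXPECTED = {6, 7}  # assumption: ratings of 6-7 treated as age-expected

def change_category(row):
    t1, t2 = row["outcome1_t1"], row["outcome1_t2"]
    if t1 in AGE_EXPECTED and t2 in AGE_EXPECTED:
        return "Maintained age-expected functioning"
    diff = t2 - t1
    if diff == 0:
        return "Maintained same level, not age-expected"
    if diff > 0:
        return f"Gained {diff} step(s)"
    return f"Dropped {abs(diff)} step(s)"

cos["progress"] = cos.apply(change_category, axis=1)
summary = cos["progress"].value_counts()
print(pd.DataFrame({"N": summary, "%": (summary / len(cos) * 100).round(1)}))
```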

OSEP progress categories: looking at information across time and reducing it to fewer categories to allow easier comparisons.

OSEP Categories                                   2009 (%)   2010 (%)   2011 (%)
Maintained age-appropriate trajectory             23         22         24
Changed trajectory: to age-appropriate            15         17         13
Changed trajectory: closer to age-appropriate     32         34         37
Same trajectory: made progress                    28         25
Flat trajectory: no progress                      2          1

OSEP Categories                                   2009 (%)   2010 (%)   2011 (%)
Maintained age-appropriate trajectory             23         22         24
Changed trajectory: to age-appropriate            15         17         13
TOTAL: age-appropriate at exit                    38         39         37

OSEP Categories (%)                               Class 1    Class 2
Maintained age-appropriate trajectory             23         22         24
Changed trajectory: to age-appropriate            15         17         13
TOTAL: age-appropriate at exit                    38         39         37

OSEP Categories                                   2009 (%)   2010 (%)   2011 (%)
Changed trajectory: to age-appropriate            15         17         18
Changed trajectory: closer to age-appropriate     32         34         37
TOTAL: greater than expected progress             47         51         55
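
The roll-up totals in the tables above are simple sums of category percentages. A minimal sketch with illustrative (not real) numbers; the a-e letters in the comments follow the standard OSEP progress category order:

```python
# Minimal sketch: combining OSEP progress-category percentages into the
# two roll-up totals shown above. Values are illustrative only.
categories = {
    "maintained_age_appropriate": 23,   # category e
    "changed_to_age_appropriate": 15,   # category d
    "closer_to_age_appropriate":  32,   # category c
    "same_trajectory_progress":   28,   # category b
    "no_progress":                 2,   # category a
}

age_appropriate_at_exit = (
    categories["maintained_age_appropriate"]
    + categories["changed_to_age_appropriate"]
)
greater_than_expected_progress = (
    categories["changed_to_age_appropriate"]
    + categories["closer_to_age_appropriate"]
)

print(f"Age appropriate at exit: {age_appropriate_at_exit}%")                  # 38%
print(f"Greater than expected progress: {greater_than_expected_progress}%")   # 47%
```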

Working with data: Different levels of analysis are required for different levels of questions. Aggregation will work for you, but it loses detail about individual children. For example, 50 assessment items on 20 children in 5 classes in Fall and Spring is 50 x 20 x 5 x 2 = 10,000 pieces of information.

Using assessment data at the classroom level: looking at the data by child, both at a single point in time and over time, and looking at data for areas that cut across children.

Example: Item Results for 5 Imaginary Children. [Table: for each child (Carlos, Geeta, Eileen, Ming, Shaniqua), a rating on each of five items related to Outcome 1: 1) plays well with others; 2) cooperates with peers in simple games; 3) stops for transition cues; 4) takes directions well from adults; 5) has at least one close friend. A = Accomplished, E = Emerging, NY = Not Yet.]

Example: COS Outcome Ratings for Class 3c by Child. [Table: for each child (Carlos, Geeta, Eileen, Ming, Shaniqua), Time 1 and Time 2 ratings on Outcome 1, Outcome 2, and Outcome 3.]
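
Child-by-outcome layouts like the one described above can be produced by pivoting a long-format extract. A minimal pandas sketch with hypothetical column names and made-up ratings:

```python
# Minimal sketch: reshaping long-format COS ratings into a
# child x (outcome, time) roster.
import pandas as pd

long_form = pd.DataFrame({
    "child":   ["Carlos", "Carlos", "Carlos", "Carlos", "Geeta", "Geeta"],
    "outcome": ["Outcome 1", "Outcome 1", "Outcome 2", "Outcome 2", "Outcome 1", "Outcome 1"],
    "time":    ["Time 1", "Time 2", "Time 1", "Time 2", "Time 1", "Time 2"],
    "rating":  [5, 6, 3, 4, 1, 2],
})

roster = long_form.pivot_table(index="child",
                               columns=["outcome", "time"],
                               values="rating")
print(roster)
```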

Example of an Aggregated Report for a Program: Percentage of Children Scoring 5 or Higher on the COS, by Class

          Outcome 1          Outcome 2          Outcome 3
Class     Time 1   Time 2    Time 1   Time 2    Time 1   Time 2
1a        65       70        50       51        49       52
1b        55       53        62       61        87       88
2a        47       43        67       66
2b        76       84        78       85        83
3a        97       98        95       100

What do you see in these data?
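
A report like this is a threshold-and-aggregate calculation. A minimal pandas sketch, assuming hypothetical column names, that computes the percentage of children rated 5 or higher on each outcome, by class:

```python
# Minimal sketch: class-level percentage of children rated 5 or higher.
import pandas as pd

ratings = pd.DataFrame({
    "class_id": ["1a", "1a", "1a", "1b", "1b", "1b"],
    "outcome1": [6, 4, 7, 5, 3, 6],
    "outcome2": [5, 5, 2, 7, 6, 4],
    "outcome3": [4, 6, 6, 7, 7, 5],
})

pct_5_plus = (
    ratings.set_index("class_id")
    .ge(5)                      # True where the rating is 5 or higher
    .groupby("class_id")
    .mean()                     # proportion of True values per class
    .mul(100)
    .round(0)
)
print(pct_5_plus)
```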

Recall the outcome questions for program improvement: who has good outcomes? Do outcomes vary by region of the state? By level of functioning at entry? Services received? Age at entry to service? Type of services received? Family outcomes? Education level of the parent?

Looking at Data by Region: Percentage of Children Who Changed Developmental Trajectories After One Year of Service

Class 1   Class 2   Class 3
45        47        23

Note: this is not the percentage of children who moved ECO categories; it zeroes in on what you want to know. What do you think? What is a possible inference?

Looking at Data by Age at Entry: Percentage of Children Who Changed Developmental Trajectories After One Year of Service

36 to 40 months   41 to 44 months   45 to 49 months
34                42                46

This is where the meaning becomes important. What are the possible interpretations? What is a possible inference?
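
Disaggregating an outcome measure by a subgroup (region, age at entry, services received) follows the same pattern. A minimal pandas sketch, assuming hypothetical column names and a simplified rule that a higher exit rating than entry rating counts as a changed trajectory:

```python
# Minimal sketch: percent of children who changed trajectory, by age band.
import pandas as pd

children = pd.DataFrame({
    "entry_age_months": [37, 39, 42, 43, 46, 48],
    "outcome1_entry":   [3, 4, 2, 5, 3, 4],
    "outcome1_exit":    [4, 4, 4, 6, 3, 6],
})

bins = [36, 40, 44, 49]
labels = ["36-40 months", "41-44 months", "45-49 months"]
children["age_band"] = pd.cut(children["entry_age_months"], bins=bins,
                              labels=labels, include_lowest=True)

# Simplified stand-in for "changed developmental trajectory".
children["changed_trajectory"] = children["outcome1_exit"] > children["outcome1_entry"]

pct_changed = (children.groupby("age_band", observed=True)["changed_trajectory"]
               .mean().mul(100).round(0))
print(pct_changed)
```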

Take-home message: You will want to look at your data in lots of different ways. You will want to think about the possible inferences. You may need other information to decide among possible inferences. Act on what you have learned.

Tweaking the System (revisiting the cycle): Plan (vision): What should be done? Implement: Is it being done? Check (collect and analyze data): Is it working? Is there a problem? Reflect: Are we where we want to be? Why is it happening?

How will or might these data be used? At the federal level: overall funding decisions (accountability); resource allocation (e.g., what kind of TA to fund?); decisions about the effectiveness of the program in individual states. At the state level: program effectiveness? Program improvement? And at the local level?

The need for good data encompasses all three levels: federal, state, and local. Data quality depends on how well local programs are implementing the procedures.

Many steps to ensuring quality data. Before: good data collection training; a good data system and data entry procedures. During: ongoing supervision of implementation; feedback to implementers; refresher training. After: review of COSF records; data analyses for validity checks; checking the representativeness of the responses.
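
As a sketch of the "data analyses for validity checks" step, the following flags records for human review. The column names, the rating range check, and the minimum time-in-program threshold are assumptions to adapt to your own data system and state guidance:

```python
# Minimal sketch: automated validity checks on COS records.
# These checks flag records for review; they do not decide validity.
import pandas as pd

cos = pd.DataFrame({
    "child_id":        [1, 2, 3, 4],
    "entry_rating_o1": [8, 6, 3, 5],
    "exit_rating_o1":  [7, 6, 4, None],
    "entry_date":      pd.to_datetime(["2012-09-01", "2012-09-15", "2013-02-01", "2012-10-01"]),
    "exit_date":       pd.to_datetime(["2013-05-30", "2012-09-20", "2013-06-01", "2013-05-30"]),
})

problems = pd.DataFrame({
    "rating_out_of_range": ~cos["entry_rating_o1"].between(1, 7),
    "missing_exit_rating": cos["exit_rating_o1"].isna(),
    # Assumed threshold: flag fewer than ~6 months between entry and exit COS.
    "short_time_in_program": (cos["exit_date"] - cos["entry_date"]).dt.days < 180,
})

flagged = cos[problems.any(axis=1)]
print(flagged[["child_id"]].join(problems))
```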

Take-home message: If you conclude the data are not (yet) valid, they cannot be used for program effectiveness, program improvement, or anything else. Inference: the data are not yet valid. Action: continue to improve data collection and quality assurance.

Data exploration: Examine the data to look for inconsistencies. If and when you find something strange, look for other data you have that might help explain it. Is the variation caused by something other than bad data?

Obtaining good data: Focus on addressing the threats to good data, for example: local providers do not understand the procedures; local providers do not follow the procedures; and others. Identify and address the threats.

How far along is our state?

Keeping our eye on the prize: high-quality services for children and families that will lead to good outcomes.

For more information: www.the-eco-center.org (The Early Childhood Outcomes Center).