Data Analysis for Assuring the Quality of your COSF Data



What are these numbers?

OSEP reporting requirements: the outcomes

Percentage of children who demonstrated improved:
1. Positive social-emotional skills (including positive social relationships)
2. Acquisition and use of knowledge and skills (including early language/communication [and early literacy])
3. Use of appropriate behaviors to meet their needs

OSEP reporting categories

Percentage of children who:
a. Did not improve functioning
b. Improved functioning, but not sufficient to move nearer to functioning comparable to same-aged peers
c. Improved functioning to a level nearer to same-aged peers but did not reach it
d. Improved functioning to reach a level comparable to same-aged peers
e. Maintained functioning at a level comparable to same-aged peers

3 outcomes x 5 "measures" = 15 numbers

Getting to progress categories from the COSF ratings

[Illustrations: the 7-point functioning scale, with example entry and exit ratings]

Key Point: The OSEP categories describe the types of progress children can make between entry and exit. Two COSF ratings (entry and exit) are needed to calculate which OSEP category describes a child's progress.

How changes in ratings on the COSF correspond to reporting categories a–e

e. % of children who maintain functioning at a level comparable to same-aged peers:
– Rated 6 or 7 at entry; AND
– Rated 6 or 7 at exit

[Illustrations: example entry and exit ratings for category e]

How changes in ratings on the COSF correspond to reporting categories a–e

d. % of children who improve functioning to reach a level comparable to same-aged peers:
– Rated 5 or lower at entry; AND
– Rated 6 or 7 at exit

[Illustration: example entry and exit ratings for category d]

How changes in ratings on the COSF correspond to reporting categories a–e

c. % of children who improved functioning to a level nearer to same-aged peers, but did not reach it:
– Rated higher at exit than entry; AND
– Rated 5 or below at exit

[Illustrations: example entry and exit ratings for category c]

How changes in ratings on the COSF correspond to reporting categories a–e

b. % of children who improved functioning, but not sufficient to move nearer to same-aged peers:
– Rated 5 or lower at entry; AND
– Rated the same or lower at exit; AND
– "Yes" on the progress question (b)

[Illustrations: example entry and exit ratings for category b]

How changes in ratings on the COSF correspond to reporting categories a–e

a. % of children who did not improve functioning:
– Rated lower at exit than entry; OR
– Rated 1 at both entry and exit; AND
– Scored "No" on the progress question (b)

[Illustrations: example entry and exit ratings for category a]

The ECO Calculator can be used to translate COSF entry and exit ratings into the five progress categories for federal reporting.
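The translation the ECO Calculator performs can be sketched in code. The function below is a minimal sketch that applies the category rules from the preceding slides in order; it assumes ratings on the 7-point COSF scale and a yes/no progress question, and it is not the ECO Calculator itself, which may handle edge cases differently.

```python
def osep_category(entry_rating, exit_rating, made_progress):
    """Map entry/exit COSF ratings (1-7) and the yes/no progress
    question to an OSEP reporting category a-e.

    Sketch of the rules on the preceding slides, checked in order;
    not the actual ECO Calculator."""
    if not (1 <= entry_rating <= 7 and 1 <= exit_rating <= 7):
        raise ValueError("COSF ratings must be between 1 and 7")
    if entry_rating >= 6 and exit_rating >= 6:
        return "e"  # maintained functioning comparable to peers
    if exit_rating >= 6:
        return "d"  # reached functioning comparable to peers
    if exit_rating > entry_rating:
        return "c"  # moved nearer to same-aged peers
    if made_progress:
        return "b"  # new skills, but no movement toward peers
    return "a"      # did not improve functioning
```

For example, a child rated 3 at entry and 5 at exit improved but did not reach age expectations, so the function returns category c.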

Promoting quality data through data analysis

Promoting quality data through data analysis

Examine the data for inconsistencies. If/when you find something strange, what might help explain it? Is the variation due to real program differences, or to bad data? (At this point in the implementation process, data quality issues are likely!)

The validity of your data is questionable if the overall pattern in the data looks "strange":
– Compared to what you expect
– Compared to other data
– Compared to similar states/regions/agencies

COSF Ratings – Outcome 1, entry data (fake data)

Rating   Statewide
1        30 (15%)
2        42 (20%)
3        51 (25%)
4        60 (30%)
5        10 (5%)
6        0 (0%)
7        0 (0%)

Frequency on Outcome 1 – Statewide (fake data) [chart]

COSF Ratings – Outcome 1, entry data by group (fake data)

Rating   Group 1   Group 2   Group 3   Group 4
1        15%       5%        5%        10%
2        20%       5%        10%       10%
3        25%       10%       15%       15%
4        30%       15%       10%       20%
5        5%        20%       25%       20%
6        5%        25%       25%       20%
7        0%        20%       10%       5%
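Simple consistency checks on tables like this one can be automated. The sketch below uses a hypothetical data structure and an arbitrary 30% threshold (both assumptions, not part of any ECO tool): it verifies that each group's percentages sum to 100, and flags groups where an unusually large share of children enter already rated 6 or 7, a pattern worth investigating before trusting the data.

```python
def flag_unusual_entry(distributions, threshold=30):
    """Return the groups whose entry ratings look suspicious.

    distributions: {group_name: {rating (1-7): percent}}
    Flags a group when ratings 6 and 7 together exceed `threshold`
    percent at entry; the threshold is an arbitrary example value."""
    flagged = []
    for group, dist in distributions.items():
        total = sum(dist.values())
        if total != 100:
            raise ValueError(f"{group}: percentages sum to {total}, not 100")
        if dist.get(6, 0) + dist.get(7, 0) > threshold:
            flagged.append(group)
    return flagged
```

Applied to the fake data above, Groups 2 and 3 (45% and 35% of children rated 6 or 7 at entry) would be flagged for a closer look.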

Questions to ask when looking at data
– Do the data make sense? Am I surprised? Do I believe the data? Some of it? All of it?
– If the data are reasonable (or when they become reasonable), what might they tell us?
– When we believe the data, how can we use them for program improvement?

Using data for program improvement

Continuous Program Improvement

[Diagram: a cycle of Plan (vision) → Implement → Check (collect and analyze data) → Reflect (Are we where we want to be?), connecting program characteristics to child and family outcomes]

Using data for program improvement = EIA: Evidence, Inference, Action

Evidence

Evidence refers to the numbers, such as "45% of children in category b." The numbers are not debatable.

Inference

How do you interpret the numbers? What can you conclude from them? Does the evidence mean good news? Bad news? News we can't interpret? To reach an inference, we sometimes analyze the data in other ways (ask for more evidence).

Inference

Inference is debatable; even reasonable people can reach different conclusions. Stakeholders can help put meaning on the numbers. Early on, the inference may be more a question of the quality of the data.

Explaining variation: who has good outcomes? Do outcomes vary by:
– Region of the state?
– Amount of services received?
– Type of services received?
– Age at entry to service?
– Level of functioning at entry?
– Family outcomes?
– Education level of parent?
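Questions like these amount to cross-tabulating the progress categories by a grouping variable. The sketch below is illustrative only: the record field names (`region`, `category`) are hypothetical, and it reports one common summary, the percentage of children in categories d or e, i.e. those who exited with functioning comparable to same-aged peers.

```python
from collections import defaultdict

def percent_d_or_e_by(records, key):
    """For each value of `key`, compute the percent of children whose
    OSEP category is d or e (exited comparable to same-aged peers).

    records: list of dicts such as {"region": ..., "category": "a".."e"};
    the field names are illustrative, not a fixed schema."""
    hits = defaultdict(int)    # children in category d or e, per group
    totals = defaultdict(int)  # all children, per group
    for rec in records:
        totals[rec[key]] += 1
        if rec["category"] in ("d", "e"):
            hits[rec[key]] += 1
    return {group: round(100 * hits[group] / totals[group])
            for group in totals}
```

Large differences between subgroups in a summary like this are exactly the kind of variation to probe: real program differences, or a data quality problem?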

Action

Given the inference from the numbers, what should be done? Recommendations or action steps. Action can be debatable, and often is: another role for stakeholders. Again, early on the action might have to do with improving the quality of the data.

Working assumptions
– There are some high-quality services and programs being provided across the state.
– There are some children who are not getting the highest quality services.
– If we can find ways to improve those services/programs, these children will experience better outcomes.

Questions to ask of your data
– Are ALL services high quality?
– Are ALL children and families receiving ALL the services they should in a timely manner?
– Are ALL families being supported in being involved in their child's program?
– What are the barriers to high quality services?

Program improvement: where and how
– At the state level: TA, policy
– At the agency level: supervision, guidance
– At the child level: modify intervention

Key points
– Evidence refers to the numbers, and the numbers by themselves are meaningless.
– Inference is attached by those who read (interpret) the numbers.
– You have the opportunity and the obligation to attach meaning.

Tweaking the System

[Diagram: the same Plan/Implement/Check/Reflect cycle, annotated with the questions: Is there a problem? Why is it happening? What should be done? Is it being done? Is it working?]

Continuous means… the cycle never ends.