Christina Kasprzak ECTA/ECO/DaSy Lauren Barton ECO/DaSy Ruth Chvojicek WI Statewide Part B Indicator 7 Child Outcomes Coordinator September 16, 2013 Improving Data, Improving Outcomes Conference - Washington, DC Where the Rubber Hits the Road: Tools and Strategies for Using Child Outcomes Data for Program Improvement

Purposes

- To describe national resources for promoting data quality and supporting program improvement
- To share Wisconsin 619 experience and strategies to promote data quality and program improvement
- To discuss potential approaches for examining data quality and using data in your state

Quality Assurance: Looking for Quality Data
"I know it is in here somewhere"

Do Ratings Accurately Reflect Child Status? Pattern Checking

We have expectations about how child outcomes data should look:
- Compared to what we expect
- Compared to other data in the state
- Compared to similar states/regions/school districts

When the data are different than expected, ask follow-up questions.

Questions to Ask

- Do the data make sense? Am I surprised?
- Do I believe the data? Some of the data? All of the data?
- If the data are reasonable (or when they become reasonable), what might they tell us?

Pattern Checking for Data Quality

Strategies for using data analysis to improve the quality of state data by looking for patterns that indicate potential issues for further investigation.
king_table.pdf

Predicted Pattern 3b: Large changes in status relative to same-age peers between entry and exit from the program are possible, but rare.

Rationale

Most children served in EI and ECSE will maintain or improve their rate of growth in the three child outcome areas over time, given participation in intervention activities that promote skill development.

Analysis

1. Crosstabs between entry and exit ratings for each outcome (best for COS ratings).
2. Exit minus entry. For COS ratings we would expect most cases to increase by no more than 3 points.

Question: Is the distribution sensible?
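The two analyses above can be sketched with pandas. This is a minimal illustration, assuming a hypothetical table of per-child COS ratings; the column names `entry_rating` and `exit_rating` and the sample values are invented for the example, not taken from the presentation:

```python
import pandas as pd

# Hypothetical COS ratings (1-7 scale) at program entry and exit.
df = pd.DataFrame({
    "entry_rating": [3, 4, 2, 5, 3, 6, 1, 4],
    "exit_rating":  [5, 5, 4, 6, 3, 7, 5, 6],
})

# Analysis 1: crosstab of entry vs. exit ratings for one outcome.
crosstab = pd.crosstab(df["entry_rating"], df["exit_rating"], margins=True)
print(crosstab)

# Analysis 2: exit minus entry. Most cases are expected to increase
# by no more than 3 points; flag larger jumps for follow-up review.
df["change"] = df["exit_rating"] - df["entry_rating"]
flagged = df[df["change"] >= 4]
print(flagged)
```

A `change` distribution with many cases at +4 or above, or with many decreases, would prompt the follow-up questions described earlier.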

Outcome 3: Crosstabs Between Entry and Exit Ratings
[crosstab table of entry × exit rating counts, with row/column totals and cells flagged for review]

Outcome 1: Children who increased by 4 or more points from entry to exit

Analyzing Child Outcomes Data for Program Improvement

A quick-reference tool for considering key issues, questions, and approaches for analyzing and interpreting child outcomes data.
hildOutcomesData-GuidanceTable.pdf

Steps in Using Data for Program Improvement

Defining Analysis Questions
- Step 1. What are your crucial policy and programmatic questions?
- Step 2. What is already known about the question?

Clarifying Expectations
- Step 3. Describe expected relationships with child outcomes.
- Step 4. What analysis will provide information about the relationships? Do you have the necessary data for that?
- Step 5. Provide more detail about what you expect to see. With that analysis, how would data showing the expected relationships look?

Steps in Using Data for Program Improvement

Analyzing Data
- Step 6. Run the analysis and format the data for review.

Testing Inferences
- Step 7. Describe the results. Begin to interpret the results. Stakeholders offer inferences based on the data.
- Step 8. Conduct follow-up analysis. Format the data for review.
- Step 9. Describe and interpret the new results as in Step 7. Repeat the cycle as needed.

Data-Based Program Improvement Planning
- Step 10. Discuss/plan appropriate actions based on the inference(s).
- Step 11. Implement and evaluate the impact of the action plan. Revisit the crucial questions in Step 1.

Defining Analysis Questions

What are your crucial policy and programmatic questions? Example:
1. Does our program serve some children more effectively than others?
   a. Do children with different racial/ethnic backgrounds have similar outcomes?

Clarifying Expectations

What do you expect to see? Do you expect that children with different racial/ethnic backgrounds will have similar outcomes? Why or why not?

Analyzing Data

1. Compare outcomes for children in different subgroups:
   a. Different child ethnicities/races (e.g., for each outcome, examine whether there are higher summary statements, progress categories, or entry and/or exit ratings for children of different racial/ethnic groups).
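The subgroup comparison above can be sketched as follows. This is an illustrative example on hypothetical data, using a simplified stand-in for Summary Statement 2 (percent of children exiting within age expectations, approximated here as an exit COS rating of 6 or 7); the official summary statements are computed from the OSEP progress categories, not directly from ratings:

```python
import pandas as pd

# Hypothetical per-child records; column names and values are illustrative.
df = pd.DataFrame({
    "race_ethnicity": ["White", "Black", "Hispanic", "White", "Black", "Hispanic"],
    "entry_rating": [3, 3, 2, 4, 2, 3],
    "exit_rating":  [5, 4, 4, 6, 3, 6],
})

# Simplified Summary Statement 2: percent of children exiting within
# age expectations, approximated as an exit COS rating of 6 or 7.
ss2_by_group = (
    df.assign(age_expected=df["exit_rating"] >= 6)
      .groupby("race_ethnicity")["age_expected"]
      .mean() * 100
)
print(ss2_by_group)
```

Large gaps between groups would not by themselves show a cause, but they identify where the follow-up analyses in Steps 8-9 should look.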

Outcome 1: Summary Statements by Child’s Race/Ethnicity

Outcome 1: Progress Categories by Child’s Race/Ethnicity

Describing and Interpreting Results

Is the evidence what you expected? What is the inference or interpretation? What might be the action?

Guidance Table


Using Data for State & Local Improvement: Wisconsin’s Part B
Ruth Chvojicek – WI Statewide Part B Indicator 7 Child Outcomes Coordinator

Key Points About Wisconsin’s System

- Sampling strategy until July 1, 2011
- Part B Child Outcomes Coordinator position funded through preschool discretionary funds, with a focus on training and data
- Statewide T/TA system with district support through 12 Cooperative Educational Service Agencies’ Program Support Teachers

Germantown School District – Lessons Learned
Jenni Last, Speech-Language Pathologist; Lisa Bartolone, School Psychologist

Result of Germantown’s Work in Just 2 Years

Germantown – Outcome Two

Germantown – Outcome Three

State Progress in Two Years – Outcome One

State Progress – Outcome Two

State Progress – Outcome Three

But… Outcome One Exit Rating

Outcome Three

Wisconsin Part B Data Reviews

- Piloted the process individually with 20 districts
- Discovered differences in how districts were determining eligibility for S/L and SDD
- Two districts that used a criterion-referenced tool consistently AND provided PD on using the tool showed a more appropriate pattern than the other 18 districts
- Next steps identified by districts:
  - Mentoring and PD for new staff
  - More attention to the formative assessment process
  - Work on internal data tracking systems

Wisconsin Part B Data Review

Looked at 8 data patterns, including:
- Entry Rating Distribution
- Entry Rating Distribution by Disability*
- Comparison of Entry Ratings by Outcome
- Exit Rating Distribution
- Entry/Exit Comparison*
- Race/Ethnicity Comparison*
- State Progress Categories*
- Summary Statements*
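The first two patterns in the list (entry rating distribution, overall and by disability) can be sketched as a row-percentage crosstab. This assumes a hypothetical table with a primary-disability column; the codes and values below are illustrative, not Wisconsin data:

```python
import pandas as pd

# Hypothetical entry ratings; "S/L" = speech/language,
# "SDD" = significant developmental delay (codes are illustrative).
df = pd.DataFrame({
    "disability": ["S/L", "SDD", "S/L", "S/L", "SDD", "OHI"],
    "entry_rating": [6, 3, 5, 6, 2, 4],
})

# Entry Rating Distribution by Disability: row percentages make it
# easy to spot categories with atypically high or low entry ratings.
dist = pd.crosstab(df["disability"], df["entry_rating"], normalize="index") * 100
print(dist.round(1))
```

A district where, say, most S/L children enter with the lowest ratings would stand out against the statewide pattern and warrant the kind of follow-up the pilot districts identified.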

Looking at Race/Ethnicity

Trying Out the New Tool: Do Children with Specific Types of Disabilities Show Different Patterns of Growth?

Do Children with Specific Types of Disabilities Show Different Patterns of Growth?

Wisconsin Next Steps

- Looking at the data: does the type of setting impact the progress children make? (district-level analysis)
- As a state T&TA system, we’re operating as a PLC to guide the work and support the districts
- What will the districts want to focus on? E.g., settings, race/ethnicity, curriculum use

Local Contributing Factors Tool

Provides ideas for the types of questions a local team would consider in identifying factors impacting performance.
2/Uploads/ECO-C3-B7-LCFT_DRAFT docx

Relationship of Quality Practices to Child and Family Outcome Measurement Results

Designed to assist states in identifying ways to improve results for children and families through implementation of quality practices.
tcomes_ Final.doc

Next Steps?

- Try out these resources
- Send feedback to the ECO Center about the new analysis tool
- What are your ‘take-aways’ and next steps related to analyzing your data for data quality and/or program improvement? (notes for State Team time)

Find more resources at: the-eco-center-org