Using Data to Drive Improved Results for Children with Disabilities
Facilitated by Mary Corey, Coordinator, Special Education Data, Missouri Department of Elementary and Secondary Education
OSEP Disclaimer
DISCLAIMER: The contents of this presentation were developed by the presenters for the 2019 OSEP Leadership Conference. However, these contents do not necessarily represent the policy of the Department of Education, and you should not assume endorsement by the Federal Government. (Authority: 20 U.S.C. 1221e-3 and 3474)
Agenda
- Oklahoma (Part B): Trained LEAs to analyze data and conduct root cause analysis for local improvement
- Department of Defense, Army, Educational and Developmental Intervention Services (Part C): Conducted system-level analyses to improve the quality and quantity of early childhood child and family outcome data across all units
- Kansas (Part B): Facilitated the analysis of a specific LEA's discipline data to reduce disproportionate suspensions and expulsions
Guiding Perspective: Evidence → Inference → Action
- Evidence: What data describe our units, programs, and children? What do we know to be true from the data?
- Inference: What do the data mean? What patterns exist within unit/program/child characteristics and across them?
- Action: What will we do about it?
Oklahoma: Empowering LEAs to Use Data to Improve Outcomes Ginger Elliott-Teague, PhD Director of Data Analysis, Special Education Oklahoma State Department of Education
Goal: Build LEA capacity to use data to drive local improvements
Information → Knowledge → POWER to CHANGE
Training & Tools to See the Evidence
Training:
- User manuals
- Webinars and in-person training
- Biannual child count/end-of-year seminars
- Data retreat
Statistical tools (how to identify patterns within and across categories and variables):
- Center, shape, and spread
- Comparing means
- Cross-tabs
- Scatterplots
Identifying Outliers
Using center and spread together to understand shape and identify outliers.

Grades   Count of SLP Students
PK–K     8
1–2      7
3–4      4
5–6      3
7–8
9–10
11–12    2
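As a minimal sketch of the "center and spread" idea (not the presenters' actual tooling), the code below computes a z-score for each grade band's count from the table above. The two-standard-deviation cutoff is an illustrative assumption, not a rule from the presentation, and the two grade bands with missing counts are omitted.

```python
# Minimal sketch: flag grade bands whose counts sit far from the center.
# Counts mirror the table above; the 2-standard-deviation cutoff is an
# illustrative assumption.
from statistics import mean, stdev

counts = {"PK-K": 8, "1-2": 7, "3-4": 4, "5-6": 3, "11-12": 2}

center = mean(counts.values())   # the "center"
spread = stdev(counts.values())  # the "spread"

for grades, n in counts.items():
    z = (n - center) / spread
    flag = "  <-- potential outlier" if abs(z) > 2 else ""
    print(f"{grades:>6}: count={n}, z={z:+.2f}{flag}")
```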
Three Approaches to Comparing Data
The right approach depends on the types of variables/factors you have (categorical vs. numerical) and how many attributes exist in each factor:
- Both categorical and/or few attributes: use cross-tabulation
- Both numerical and/or many attributes: use scatterplots
- One numerical and one categorical: compare means across groups, or use regrouping to build cross-tabs
A sketch of all three approaches follows.
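A minimal pandas sketch of the three approaches, assuming a hypothetical student table; the column names (disability, placement, age, absences) and values are invented for illustration, not taken from any state data system.

```python
# Sketch of the three comparison approaches. The DataFrame and its
# column names are hypothetical examples.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.DataFrame({
    "disability": ["SLD", "SLD", "OHI", "SLD", "OHI", "AUT"],
    "placement":  ["regular", "resource", "regular", "resource", "regular", "separate"],
    "age":        [7, 9, 8, 12, 10, 6],
    "absences":   [3, 11, 5, 15, 4, 2],
})

# 1. Two categorical variables with few attributes: cross-tabulation
print(pd.crosstab(df["disability"], df["placement"]))

# 2. Two numerical variables with many attributes: scatterplot
df.plot.scatter(x="age", y="absences")
plt.show()

# 3. One numerical, one categorical: compare means across groups
print(df.groupby("disability")["absences"].mean())
```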
Inference: What does my evidence mean?
[Scatterplot: Daily Comparison of Hours of Sleep to Steps Counted]
Patterns of various kinds:
- Outliers
- Positive and negative lines
- Bimodal or gapped data
- Uniform versus random
Measuring pattern strength: statistical techniques
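One common statistical technique for measuring the strength of a scatterplot pattern is the Pearson correlation coefficient. The sketch below reuses the slide's sleep-versus-steps example with invented numbers; nothing here comes from actual program data.

```python
# Sketch: quantify the strength of a scatterplot pattern with Pearson's r.
# The sleep/steps values are invented for illustration only.
import numpy as np

hours_sleep = np.array([6.5, 7.0, 5.5, 8.0, 7.5, 6.0, 8.5])
steps       = np.array([9000, 8200, 10500, 6800, 7400, 9800, 6200])

r = np.corrcoef(hours_sleep, steps)[0, 1]
# r near +1 or -1: strong positive/negative line; near 0: weak or no linear pattern
print(f"Pearson r = {r:+.2f}")
```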
Training to Link Data to Improvement
Data retreat activities:
- Root-cause analysis
- Self-assessments
- Improvement plans
Sample self-assessment questions:
- What is the relationship between the attendance rates of students who do not graduate on time and/or have dropped out of school? Consider whether students with disabilities differ from those without, whether attendance is an issue, whether students were absent when younger, etc. What have you learned? What will you do about it?
- How does your district analyze suspension data for students with and without disabilities? How do you identify students at risk for not graduating on time and/or dropping out? What have you learned? What will you do about it?
Keep asking WHY? Why? WHY??? until you reach root causes.
Action: Impact on Student Outcomes
There is no concrete evidence yet linking participation in the data retreat, or high-quality completion of self-assessments, to student outcomes.
Anticipated outcomes:
- LEAs will learn through practice how to devise program improvements that are rooted in data.
- Program improvements will lead to better student outcomes.
DoD Army EDIS: Child & Family Outcomes Data: Inspiring Changes That Might Otherwise Not Have Occurred
Naomi Younggren, PhD, Part C/CSPD Coordinator, DoD Army EDIS
Child & Family Outcomes
Three child outcomes:
- Children have positive social-emotional skills
- Children acquire and use knowledge and skills
- Children use appropriate behaviors to meet their needs
Family outcomes (percentage of families reporting that EI helped their family):
- Know their rights
- Effectively communicate their child's needs
- Help their child develop and learn
Child & Family Outcomes Inspired Changes (Question → Evidence → Inference → Action)
- Embedding the child outcomes measurement process into the IFSP
- Advancing local-level data awareness, analysis, and application
- Partnering with families to measure child outcomes
QUESTION: Do we have complete data?
EVIDENCE: 70% return on Child Outcomes and 39% return on Family Outcomes
INFERENCE: Child outcome data collection not integrated into current IFSP processes; family outcome data collection not sufficiently emphasized
ACTION: Embed child outcomes in the IFSP and increase emphasis on family outcomes
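A minimal sketch of this kind of completeness check: the raw survey counts below are hypothetical (only the resulting 70% and 39% rates come from the slide), and the 80% follow-up target is an assumption, not an EDIS standard.

```python
# Sketch of a completeness check: compare surveys returned to surveys expected.
# Counts are hypothetical; the 80% target is an illustrative assumption.
expected = {"child_outcomes": 100, "family_outcomes": 100}
returned = {"child_outcomes": 70,  "family_outcomes": 39}

for measure, n_expected in expected.items():
    rate = returned[measure] / n_expected
    status = "OK" if rate >= 0.80 else "follow up"
    print(f"{measure}: {rate:.0%} returned -> {status}")
```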
Celebrate success AND keep tracking and working to further increase return rates!
Be mindful of slippage: examine program data in light of national data.
IDEA Part C and Part B Section 619 National Child Outcomes Results for 2016-17 (webinar, October 9, 2018; presenters: Christina Kasprzak and Cornelia Taylor)
QUESTION: Do we have continuous, complete data?
EVIDENCE: Some slippage
INFERENCE: Opportunities to advance local child outcome data monitoring exist
ACTION: Build reports to capture data for local-level analysis and action

Data patterns for COS ratings: what to expect and what to question:
Taylor, C., & Tunzi, D. (2018). Data patterns for COS ratings: What to expect and what to question. Menlo Park, CA: SRI International. http://ectacenter.org/~pdfs/eco/pattern_checking_for_cos_ratings.pdf
Pattern checks for COS ratings:
1. Are there missing data?
2. Are there obvious questions about data entry?
3. Do the ratings match the children (you know the children)?
4. Are ratings across outcomes related as expected?
5. Do the changes from entry to exit show an expected distribution?
6. Is the distribution of entry/exit ratings as expected?
7. Do the entry-to-exit means increase?
A sketch automating two of these checks follows.
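A minimal sketch of automating checks 1 and 7 on a hypothetical COS ratings table; the column names and values are assumptions (COS ratings run from 1 to 7), not a format prescribed by ECTA or EDIS.

```python
# Sketch automating two of the pattern checks above on a hypothetical
# COS table. Column names and data are illustrative assumptions.
import pandas as pd

cos = pd.DataFrame({
    "child_id": [101, 102, 103, 104],
    "outcome":  ["social", "social", "knowledge", "knowledge"],
    "entry":    [3, 4, None, 2],   # COS ratings on the 1-7 scale
    "exit":     [5, 6, 4, 5],
})

# Check 1: Are there missing data?
print("Missing ratings per column:")
print(cos[["entry", "exit"]].isna().sum())

# Check 7: Do the entry-to-exit means increase?
means = cos.groupby("outcome")[["entry", "exit"]].mean()
means["increases"] = means["exit"] > means["entry"]
print(means)
```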
QUESTION: Are results and practices similar across programs?
EVIDENCE: Variation in results and practices is evident
INFERENCE: Program practices are not uniform
ACTION: Establish consistent practices, including engaging families in the process
Younggren, N., Barton, L., Jackson, B., Swett, J., & Smyth, C. (2017). Child Outcomes Summary-Team Collaboration (COS-TC) Quality Practices: Checklist and descriptions. Retrieved from http://ectacenter.org/eco/pages/costeam.asp
Child and Family Outcome Inspired Positive Change https://www.youtube.com/watch?v=j5pdmyTs4co
QUESTION: Will family participation in child outcome ratings influence family outcomes?
EVIDENCE: Family outcomes are high; COS-TC not fully implemented, need to follow the data
INFERENCE: Pending
ACTION: Pending
Data do not have to be difficult to be useful; even simple analyses can help influence positive change.
Kansas: Equity, Inclusion, and Opportunity: Addressing Success Gaps in Our Districts Laura Jurgensen, Assistant Director Special Education and Title Services Kansas State Department of Education
The Success Gaps Process
1. Use data to identify groups of students who experience educational "success gaps" in areas such as attendance, graduation, test scores, discipline, and class placement.
2. Build a team of educators, parents, students as appropriate, and community members focused on the groups experiencing the gaps; school or district leaders capable of implementing change; and data experts.
3. Use local data to identify factors that promote (or, if absent, detract from) equity, inclusion, and opportunity for all students.
4. Create action plans to address identified negative factors.
5. Implement the action plans over time, with structures in place to maintain a focus on data and the groups affected by success gaps.
Two tools from the IDEA Data Center (IDC) help you identify success gaps and address them by finding the factors that contribute to each gap.
Success Gaps Toolkit includes:
- Guidelines (instructions) for using the Success Gaps materials
- Meeting agendas for a series of meetings, with presentation shells for each meeting
- Some materials for pre-reading
- Two videos: one to invite participants to be part of the success gaps work and one to introduce success gaps during the first meeting
- Sample action plan formats and meeting evaluation formats
- Written stories or examples of work in other states or districts
Phase One IDENTIFY SUCCESS GAPS
What is a success gap?
A gap in educational outcomes between different groups of students, for example in:
- Achievement
- Identification for special education
- Suspension rates
- Course-taking
- Graduation rates
- Attendance
Kansas Identifies a Success Gap
- Over 6% of black KCK students with disabilities were suspended/expelled for more than 10 days. (State Performance Plan/Annual Performance Report Indicator 4B)
- KCK students with disabilities are about 10x as likely to be suspended/expelled for more than 10 days as other Kansas students with disabilities. (State Performance Plan/Annual Performance Report Indicator 4A)
- KCK black students with disabilities were about twice as likely as non-black students with disabilities to be suspended/expelled in 4 categories; the current state bar is 4x. (Significant disproportionality data from the 2016-17 school year)
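Comparisons such as "about twice as likely" are typically expressed as risk ratios: the suspension risk for one group divided by the risk for a comparison group. A minimal sketch of that arithmetic follows; all counts are hypothetical, and only the 4.0 state bar comes from the slide.

```python
# Sketch of the risk-ratio arithmetic behind "about twice as likely".
# All enrollment and suspension counts are hypothetical.
def risk(suspended: int, enrolled: int) -> float:
    """Share of the group suspended/expelled for more than 10 days."""
    return suspended / enrolled

black_swd_risk = risk(suspended=60, enrolled=1000)  # black students with disabilities
other_swd_risk = risk(suspended=30, enrolled=1000)  # non-black students with disabilities

risk_ratio = black_swd_risk / other_swd_risk
STATE_BAR = 4.0  # current Kansas threshold cited on the slide

print(f"Risk ratio = {risk_ratio:.1f}x (state bar: {STATE_BAR:.1f}x)")
```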
Phase Two BUILD A TEAM
Kansas City Builds a Team
Considered the demographics of the student population as well as staff demographics.
Kansas City, KS students: 49.76% Hispanic; 27.37% African American; 12.53% White; 10.34% other.
Kansas City, KS staff: 28.5% Black (certified and classified); 30% of administrators Black.
25% of the Success Gaps team members from KCKPS were African American.
Phase Three IDENTIFY FACTORS CONTRIBUTING TO THE SUCCESS GAP
Equity, Inclusion, and Opportunity Can Lessen Success Gaps Between Groups of Students
Identifying the Factors Contributing to Kansas City's Success Gap
- Six meetings from February to April, each two hours long
- First meeting: set the purpose and agreed upon the focus of the work
- Three meetings: completed the self-assessment (rubric)
- Prioritized elements and formed groups
- Each of the two groups met to develop plans
- Last meeting: shared plans with the whole group
Identifying the Factors Contributing to Kansas City’s Success Gap Meeting One: Establishing Purpose
Meetings Two, Three, and Four: Self-Assessment
Meeting Five: Prioritizing Need
Each member was given 100% to split among the indicators however they chose (see the tally sketch below). The top two were:
- Indicator 1: Data-based decision-making
- Indicator 2a: Culturally responsive instructional interventions and teaching strategies are used throughout the district
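A minimal sketch of tallying this prioritization exercise; the individual member allocations are invented, and only the idea of splitting 100% per member and surfacing the top indicators comes from the slide.

```python
# Sketch of tallying the 100%-split prioritization exercise.
# Member allocations are invented for illustration.
from collections import Counter

allocations = [
    {"Indicator 1": 40, "Indicator 2a": 35, "Indicator 3": 25},
    {"Indicator 1": 60, "Indicator 2a": 20, "Indicator 3": 20},
    {"Indicator 1": 30, "Indicator 2a": 50, "Indicator 3": 20},
]

totals = Counter()
for member in allocations:
    assert sum(member.values()) == 100  # each member distributes exactly 100%
    totals.update(member)

for indicator, points in totals.most_common(2):
    print(f"{indicator}: {points} points")
```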
Phase Four CREATE ACTION PLAN
Data-based decision-making emerged as the most pressing need and became the focus of a corrective action plan. Some aspects of the cultural responsiveness plan are also being implemented.
Phase Five IMPLEMENT ACTION PLAN
Implementation in Kansas City, Kansas
Action plan implemented districtwide.
KSDE monitored implementation in 15 schools, selected based on discipline data from the 2016-17 school year for black students with disabilities:
- 3 of 4 comprehensive high schools
- 6 of 8 middle schools
- 6 of 30 elementary schools
Implementation in Kansas City, Kansas
- KSDE staff met with a building-level leadership team in each monitored building to observe discipline data analysis
- KSDE also observed professional development
- District leadership received guidance and feedback from KSDE multiple times per month in support of building-level implementation
- KSDE completed its oversight in December 2018
Reflecting on the Process
- KSDE would have preferred to see the district implement its action plans independently; however, it used one of the action plans as a corrective plan. Success Gaps is not intended to be a monitoring tool.
- KCK was not identified under Indicator 4B for the following school year.
- Significant improvements were made in the number of suspensions and expulsions of black students with disabilities.
For More Information Visit the IDC website http://ideadata.org/ Contact Laura at ljurgensen@ksde.org
QUESTIONS?