1
A3 – Improving State Level Supports and Stakeholder Engagement through Effective Evaluation Kim Gulbrandson, Justyn Poulos – Wisconsin RtI Center Key Words: Applied Evaluation, Assessment
2
Objectives
- Understand the connection between implementation and student outcomes.
- Identify considerations for deciding which implementation and outcome data to evaluate and report.
- Leave with examples of how states can make implementation and outcome connections and how to report to stakeholders.
- Gather examples of how to use the same evaluation data for state level continuous improvement.
Audience: Those leading state level work to support PBIS, RtI, MLSS/MTSS implementation.
3
What Are You Trying to Accomplish?
Wisconsin RtI Center: Our mission is to build the capacity of Wisconsin schools to develop and sustain an equitable multi-level system of supports to ensure the success for all students.
Turn and talk – What are you trying to accomplish? How does your evaluation relate to what you are trying to accomplish?
Annotated mission: Our mission is to build the capacity (skills and abilities) of Wisconsin schools to develop (consistent training) and sustain (consistent technical assistance) an equitable multi-level system of supports (framework document – consistent messaging) to ensure the success for all students (use of Center data).
4
The Fidelity–Outcome Relationship
"Higher fidelity is correlated with better outcomes across a wide range of programs and practices." (Fixsen & Blase) We look at outcome data in relation to fidelity data when measuring impact...
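As a rough illustration of this relationship (not the Center's actual analysis), a Pearson correlation between school fidelity scores and a student outcome measure might look like the Python sketch below; all school data and numbers are made up:

    # Hypothetical illustration of the fidelity-outcome relationship:
    # correlate each school's fidelity score with a student outcome measure.
    from math import sqrt

    def pearson_r(xs, ys):
        """Pearson correlation coefficient between two equal-length lists."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sqrt(sum((x - mx) ** 2 for x in xs))
        sy = sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    fidelity_scores = [55, 62, 70, 78, 85, 93]   # e.g., fidelity assessment %
    outcome_scores  = [61, 60, 68, 72, 79, 84]   # e.g., % proficient on a state test

    print(f"r = {pearson_r(fidelity_scores, outcome_scores):.2f}")  # positive r

A positive r across many schools is what "higher fidelity is correlated with better outcomes" looks like in the data; correlation alone does not establish causation, which is why length of implementation also matters (see the lesson learned later in this deck).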
5
Logic Model – What Data We Collect and Use and Why
When schools reflect on their implementation by measuring it, they can action plan based on needs and make improvements. Improvements lead to fidelity, and fidelity of implementation over time connects to improved student outcomes.
Short-term – objectives explicit throughout trainings; training evaluations, formative and summative; working on trainers gathering formative data
Medium-term – follow-up surveys (follow-through on action items)
Long-term – annual report, cabinet project
6
Align Data Collection and Evaluation Across the Project
Same data and logic used for both internal continuous improvement and external evaluation.
- Align data collection and evaluation to the logic model and goals in a way that ties it all together.
- Do this early on – don't wait.
- Helps with political support and alignment of practices such as continuous improvement.
- External-facing evaluation (annual report, cabinet) and internal continuous improvement involve the same logic and data.
Political support (SLT, Cabinet) – Annual report – Continuous improvement – Goals, sub-goals, etc.
7
Evaluation Plan – Calendar
Internal Continuous Improvement – Monthly
Evaluation Brief – April
Share Outcomes with DPI – May–June
Share Outcomes with SLT – August
Annual Report – October
Share Annual Report with SLT – November
Evaluation Brief – Dec–Jan
SLT = State Leadership Team; DPI = Department of Public Instruction
8
Evaluation Briefs Align to Logic
Audience: Schools and Districts
9
Annual Report Aligns to Logic
Professional Learning – Assessing – Fidelity – Sustaining – Student Outcomes
Audience:
10
Statewide Overview – Professional Learning
Since 2009…
- Of the 2214 public schools in Wisconsin, 1803 (81%) have participated in professional learning offered by the Center, and 1547 (70%) have completed a full training in behavior, reading, and/or mathematics.
- Of trained schools, 1427 (92%) have self-assessed to measure their implementation.
- Of assessing schools, 75% reached fidelity or full implementation at any one level.
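For clarity, here is how the funnel percentages on this slide follow from the raw counts; a minimal Python sketch using only the counts reported above (note that each percentage uses a different denominator):

    # Reproducing the funnel on this slide from the raw counts.
    total_public  = 2214   # public schools in Wisconsin
    participated  = 1803   # participated in Center professional learning
    fully_trained = 1547   # completed a full training (behavior/reading/math)
    self_assessed = 1427   # trained schools that self-assessed implementation

    print(f"participated:  {participated / total_public:.0%}")    # 81% of all public schools
    print(f"fully trained: {fully_trained / total_public:.0%}")   # 70% of all public schools
    print(f"self-assessed: {self_assessed / fully_trained:.0%}")  # 92% of fully trained schools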
11
PBIS – Trained to Fidelity
12
Math – Trained to Fidelity
13
Reading – Trained to Fidelity
15
Overall Student Outcomes from DPI Report – Follows Same Logic
16
Overall Student Outcomes
17
Student Outcomes – Students with IEPs
18
Student Outcomes – Students with IEPs
19
Student Outcomes – English Learners
20
Student Outcomes – Students of Color
21
Student Outcomes – Students of Color
22
Lesson Learned When Looking at Outcomes – Factor in Length of Implementation
When looking at outcomes, consider when schools reached fidelity: schools just reaching fidelity look different from schools that have sustained it for multiple years.
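One way to act on this lesson is to disaggregate outcomes by length of implementation rather than pooling all at-fidelity schools together. A hypothetical Python sketch (the record layout, threshold, and numbers are invented for illustration):

    # Disaggregate outcomes by years at fidelity instead of pooling.
    from collections import defaultdict
    from statistics import mean

    schools = [
        {"name": "A", "years_at_fidelity": 1, "outcome": 64},
        {"name": "B", "years_at_fidelity": 1, "outcome": 66},
        {"name": "C", "years_at_fidelity": 3, "outcome": 72},
        {"name": "D", "years_at_fidelity": 4, "outcome": 75},
    ]

    groups = defaultdict(list)
    for s in schools:
        label = "just reached (<2 yrs)" if s["years_at_fidelity"] < 2 else "sustaining (2+ yrs)"
        groups[label].append(s["outcome"])

    for label, outcomes in groups.items():
        print(f"{label}: mean outcome {mean(outcomes):.1f} (n={len(outcomes)})")

Reporting the two groups separately keeps a cohort of newly-at-fidelity schools from masking the gains of schools that have sustained fidelity for multiple years.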
23
Make the Data Relatable for Stakeholders
ConnectEd
24
How We Engage Stakeholders in our Logic and Evaluation
Intentional, annual time spent sharing our Annual Report with the State Leadership Team (SLT), including the Department of Public Instruction (DPI) and district leaders:
- Review together
- Discuss/clarify
- Share with their group(s)
- Bring feedback to the RtI Center, including how it was shared
Example activity for sharing with the SLT: highlight sections, have tables discuss, ask questions, and share what stands out for them. We ask stakeholders to share the annual report, then follow up a few months later to ask how they have used it. Other examples of sharing: with CESAs (principal and administrator meetings), and school stories shared with districts, the department, etc.
25
The Data Has a Dual Purpose
1. Informs our support to schools/districts (our continuous improvement) – INTERNAL
2. Provides our data for evaluation, which impacts funding and political support for the Center – EXTERNAL
26
Use Same Logic for Internal Continuous Improvement
Brief dashboard demo:
- The dashboard includes the same data (in aggregate) and is a snapshot of how we are doing toward our goals.
- It is being used to drive our internal goals.
- You can do this in Excel and have one person update it monthly – it does not need to be a fancy, programmed database.
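As a sketch of how simple such a dashboard can be (this is not the Center's actual tool), a short Python script could compute a monthly snapshot from a spreadsheet export; the file name, column names, and target value below are hypothetical:

    # Minimal "keep it simple" dashboard: one person exports the tracking
    # spreadsheet to CSV monthly, and this script prints goal progress.
    import csv

    TARGET_ASSESSING_RATE = 0.75  # hypothetical internal goal

    with open("trained_schools.csv", newline="") as f:
        rows = list(csv.DictReader(f))  # columns: school, trained, self_assessed

    trained  = [r for r in rows if r["trained"] == "yes"]
    assessed = [r for r in trained if r["self_assessed"] == "yes"]
    rate = len(assessed) / len(trained)

    print(f"Trained schools self-assessing: {rate:.0%} "
          f"(goal: {TARGET_ASSESSING_RATE:.0%})")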
27
Lessons Learned – Goals
Establish buy-in during goal development and start with realistic goals.
- Share the current state (data) often with internal staff.
- Establish buy-in by involving staff in goal development up front.
- Start small and grow.
- Start at a realistic place to meet people where they are – don't assume staff ability to problem solve.
- Be explicit with staff about connections between goals and long-term outcomes.
- We wanted assessing-fidelity-sustaining, but are starting with assessing.
28
Lessons Learned in Continuous Improvement - Assessing
Use smaller, more rapid indicators of progress for assessing, rather than waiting for an annual review.
29
Lesson Learned – Operationalize how school-level and state-level problem solving with this data differ.
Staff had a strong understanding of how to guide schools in using their data to problem solve. Generalizing that process to state/regional TA was not a given. We have had to define, and continue to define, things such as indicators of progress toward goals and methods for examining the data (in the absence of easily accessible, recent student outcome data).
30
Shared Practice PDSA Example – to Increase Assessing
31
Key Takeaways
- Link internal and external evaluation through your logic
- Have an evaluation plan and map it out (calendar)
- Start with doable and reasonable goals
- Keep it simple
(Link back to objectives with this slide.)