A3 – Improving State Level Supports and Stakeholder Engagement through Effective Evaluation
Kim Gulbrandson, Justyn Poulos – Wisconsin RtI Center
Key Words: Applied Evaluation, Assessment
Objectives
- Understand the connection between implementation and student outcomes.
- Identify considerations for deciding which implementation and outcome data to evaluate and report.
- Leave with examples of how states can connect implementation and outcomes, and how to report those connections to stakeholders.
- Gather examples of how to use the same evaluation data for state-level continuous improvement.
Audience: Those leading state-level work to support PBIS, RtI, and MLSS/MTSS implementation.
What Are You Trying to Accomplish?
Wisconsin RtI Center: Our mission is to build the capacity of Wisconsin schools to develop and sustain an equitable multi-level system of supports to ensure the success of all students.
Turn and talk – What are you trying to accomplish? How does your evaluation relate to what you are trying to accomplish?
Annotated mission: to build the capacity (skills and abilities) of Wisconsin schools to develop (consistent training) and sustain (consistent technical assistance) an equitable multi-level system of supports (framework document – consistent messaging) to ensure the success of all students (use of Center data).
The Fidelity–Outcome Relationship
"Higher fidelity is correlated with better outcomes across a wide range of programs and practices." (Fixsen & Blase)
https://nirn.fpg.unc.edu/sites/nirn.fpg.unc.edu/files/resources/NIRN-NASHMPD15thConference-02-2005.pdf
We look at outcome data in relation to fidelity data when measuring impact.
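The analysis behind this kind of claim can be as simple as a school-level correlation. Below is a minimal sketch, not the Center's actual analysis; the data and column names are invented for illustration:

```python
# Hypothetical illustration: correlating school-level fidelity scores with an
# outcome measure. Data and column names are invented for this sketch.
import pandas as pd

# Each row is one school: a fidelity score (e.g., percent of fidelity-tool
# items in place) and an outcome (e.g., percent of students meeting a benchmark).
schools = pd.DataFrame({
    "fidelity_pct": [55, 62, 70, 74, 81, 88, 93],
    "outcome_pct":  [48, 52, 55, 61, 63, 70, 72],
})

# A simple Pearson correlation; a positive value is consistent with the
# "higher fidelity, better outcomes" relationship quoted above.
print(schools["fidelity_pct"].corr(schools["outcome_pct"]))
```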
Logic Model – What Data We Collect and Use, and Why
Original slide: http://connect.wisconsinrticenter.org/CentralServices/Admin/State%20Leadership%20Team/11.30.17SLT%20PowerPoint.pptx
When schools reflect on their implementation by measuring it, they can action plan based on needs and make improvements. Improvements lead to fidelity, and fidelity of implementation over time connects to improved student outcomes (add reference or two).
- Short term – objectives explicit throughout trainings; training evaluations, formative and summative; working on trainers gathering formative data.
- Medium term – follow-up surveys (follow-through on action items).
- Long term – annual report, cabinet project.
Align Data Collection and Evaluation Across the Project
- The same data and logic are used for both internal continuous improvement and external-facing evaluation (annual report, cabinet).
- Align data collection and evaluation to your logic model and goals in a way that ties it all together.
- Do this early on – don't wait. It helps with political support and with aligning practices such as continuous improvement.
- Touchpoints: political support (SLT, Cabinet); annual report; continuous improvement; goals, sub-goals, etc.
Evaluation Plan – Calendar
- Internal continuous improvement: monthly
- Evaluation brief: April
- Share outcomes with DPI: May–June
- Share outcomes with SLT: August
- Annual report: October
- Share annual report with SLT: November
- Evaluation brief: December–January
(SLT = State Leadership Team; DPI = Department of Public Instruction)
Evaluation Briefs Align to Logic
Audience: Schools and Districts
Annual Report Aligns to Logic
Professional Learning – Assessing – Fidelity – Sustaining – Student Outcomes
Report: https://www.wisconsinrticenter.org/assets/files/resources/1513016537_2016-17%20Annual%20Report.pdf
Statewide Overview – Professional Learning
Since 2009:
- Of the 2,214 public schools in Wisconsin, 1,803 (81%) have participated in professional learning offered by the Center, and 1,547 (70%) have completed a full training in behavior, reading, and/or mathematics.
- Of trained schools, 1,427 (92%) have self-assessed to measure their implementation.
- Of assessing schools, 1,072 (75%) reached fidelity or full implementation at any one level.
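For readers who want to verify the funnel, the percentages on this slide can be recomputed directly from the counts. Note that each percentage uses the prior stage as its denominator; a quick Python check using the slide's own numbers:

```python
# Recomputing the participation funnel from the figures on this slide.
total_schools    = 2214  # Wisconsin public schools
participated     = 1803  # any Center professional learning
fully_trained    = 1547  # completed a full training
self_assessed    = 1427  # of trained schools, measured their implementation
reached_fidelity = 1072  # of assessing schools, at fidelity at any one level

funnel = [
    ("participated / total",    participated,     total_schools),
    ("fully trained / total",   fully_trained,    total_schools),
    ("self-assessed / trained", self_assessed,    fully_trained),
    ("at fidelity / assessing", reached_fidelity, self_assessed),
]
for label, num, den in funnel:
    print(f"{label}: {num}/{den} = {num/den:.0%}")  # 81%, 70%, 92%, 75%
```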
PBIS – Trained to Fidelity
Math – Trained to Fidelity
Reading – Trained to Fidelity
Overall Student Outcomes from DPI Report – Follows Same Logic
Overall Student Outcomes
Student Outcomes – Students with IEPs
Student Outcomes – English Learners
Student Outcomes – Students of Color
Lesson Learned When Looking at Outcomes – Factor in Length of Implementation
When examining outcomes, consider when schools reached fidelity: schools just reaching fidelity may look very different from schools that have sustained fidelity for multiple years.
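One way to factor in length of implementation is to bucket schools by years at fidelity before comparing outcomes. A hypothetical sketch follows; the cohort cut-points, data, and column names are invented for illustration:

```python
# Hypothetical sketch: grouping schools by how long they have been at
# fidelity before comparing outcomes. Data and column names are invented.
import pandas as pd

schools = pd.DataFrame({
    "school": ["A", "B", "C", "D", "E", "F"],
    "years_at_fidelity": [0, 1, 1, 3, 4, 5],
    "outcome_pct": [54, 57, 59, 66, 68, 71],
})

# Bucket schools into "just reached" vs "sustaining" cohorts so that
# newly-at-fidelity schools don't wash out the effect of sustained use.
schools["cohort"] = pd.cut(
    schools["years_at_fidelity"],
    bins=[-1, 1, 100],
    labels=["reached (0-1 yrs)", "sustaining (2+ yrs)"],
)
print(schools.groupby("cohort", observed=True)["outcome_pct"].mean())
```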
Make the Data Relatable for Stakeholders
ConnectEd video: https://youtu.be/UCYLGfqQ3jY
How We Engage Stakeholders in Our Logic and Evaluation
- Review together, discuss and clarify, share with their group(s), and bring feedback to the RtI Center.
- Example activity with the SLT: highlight sections of the report, have tables discuss, ask questions, and share what stands out for them.
- We ask stakeholders to share the annual report, then follow up a few months later to ask how they have used it.
- We spend intentional, annual time sharing our Annual Report with the State Leadership Team (SLT), including the Department of Public Instruction (DPI) and district leaders.
- Other examples: sharing with CESAs (principal and administrator meetings), and school stories shared with districts, the department, etc.
The Data Has a Dual Purpose
1. Informs our support to schools and districts (our continuous improvement) – INTERNAL
2. Provides our data for evaluation, which impacts funding and political support for the Center – EXTERNAL
Use the Same Logic for Internal Continuous Improvement
- Brief demo of the dashboard.
- The dashboard includes the same data (in aggregate) and gives a snapshot of how we are doing toward our goals.
- It is being used to drive our internal goals.
- This can be done in Excel, with one person updating it monthly – it does not need to be a fancy, programmed database (see the sketch below).
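Something like the following is all the "programming" a simple dashboard needs. This is a minimal sketch assuming a flat spreadsheet export of school records; the file name, columns, and status values are hypothetical:

```python
# Minimal dashboard-snapshot sketch: a stand-in for the spreadsheet one
# person updates monthly. File name and columns are hypothetical.
import pandas as pd

schools = pd.read_csv("school_status.csv")  # columns: school, region, status
# "status" is assumed to be one of: trained, assessing, fidelity, sustaining

# Aggregate counts by status, statewide and by region -- the same data the
# external evaluation uses, rolled up as a snapshot of progress toward goals.
statewide = schools["status"].value_counts()
by_region = schools.pivot_table(index="region", columns="status",
                                values="school", aggfunc="count", fill_value=0)
print(statewide)
print(by_region)
```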
Lessons Learned – Goals
- Establish buy-in by involving staff in goal development up front, and start with realistic goals.
- Share the current state (data) often with internal staff.
- Start small and grow; begin at a realistic place that meets people where they are – don't assume staff ability to problem solve.
- Be explicit with staff about the connections between each goal and long-term outcomes.
- We wanted assessing–fidelity–sustaining, but are starting with assessing.
Lessons Learned in Continuous Improvement – Assessing
Use smaller, more rapid indicators of progress for assessing, rather than relying on an annual review.
Lesson Learned – Operationalize How School-Level and State-Level Problem Solving Differ
- Staff had a strong understanding of how to guide schools in using their data to problem solve, but generalizing that process to state and regional technical assistance was not a given.
- We have had to define, and are continuing to define, things such as indicators of progress toward goals and methods for examining the data (in the absence of easily accessible, recent student outcome data).
Shared Practice PDSA (Plan-Do-Study-Act) Example – to Increase Assessing
Key Takeaways
- Link internal and external evaluation through your logic.
- Have an evaluation plan and map it out (calendar).
- Start with doable and reasonable goals.
- Keep it simple.
(Presenter note: link back to the session objectives with this slide.)