RtI Innovations: Evaluation
Module 2: Exploration
Anna Harms & Jose Castillo
Agenda & Objectives
Agenda: Exploration of, and development of consensus for, evaluating MTSS
Objectives: Participants will:
Understand a framework that can be used to explore MTSS evaluation.
Gain exposure to critical questions and tools that can facilitate exploration of MTSS evaluation.
Stages of Change/Implementation
Should we do it? Exploration/Adoption, Consensus
Work to do it right! Installation, Infrastructure, Initial Implementation
Work to do it better! Elaboration, Continuous Regeneration
“It” is the district evaluation of MTSS.
Exploration and Consensus for Adoption
After fully exploring the benefits and risks, stakeholders decide how to move forward at this time.
Examine need, evidence, fit, necessary resources, and capacity.
Determine who will play lead roles in developing consensus.
Build awareness and garner support among stakeholders.
Describe risk and help partners manage risk (What will be different for us?).
Build consensus and commitment.
Reflection: What are the risks of moving forward with evaluation of MTSS if there is not consensus for it?
Consensus and Readiness
In order to have consensus, we must gather information and strategically communicate that information.
“Willingness” and “readiness” are not one and the same.
Excellent resource: Scaling Up Brief on Readiness for Change (State Implementation and Scaling-up of Evidence-based Practices, 2013).
A Structure for Exploring Readiness and Gaining Consensus
Need What is the need for evaluating MTSS in our district?
What data/information suggest there is a need for evaluation of MTSS, and that it is a priority?
What is the scope of the need? Evaluation just for MTSS, or implementation of evaluation practices across the board?
From which stakeholder groups are our indicators of need coming?
Beliefs Survey Assesses educator beliefs related to RtI/MTSS
27 items, Likert-scale format (Strongly Agree to Strongly Disagree)
3 factors:
SWD Academic Abilities and Performance
Data-Based Decision Making
Functions of Core & Supplemental Instruction
Presenter notes: The Beliefs Survey was developed by Project staff to assess the beliefs of educators regarding Problem-Solving/Response to Intervention (PS/RtI). Factor One (items 9A, 9B, 10A, 10B, 11A, and 11B) relates to the ability of students with disabilities to achieve academic benchmarks. Factor Two (items 12, 13, 14, 15, 16, 17, 20, 21, 22, 23, 24, 25, and 27) relates to data-based decision making. Factor Three (items 7A, 7B, 8A, and 8B) relates to the functions of core and supplemental instruction. Items 6, 18, 19, and 26 were not accounted for by any of the three factors.
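For districts that score the survey themselves, the factor structure above can be turned into simple subscale means. The sketch below is a minimal, hypothetical example: the item keys and factor assignments follow the slide, but the numeric response coding (Strongly Disagree = 1 through Strongly Agree = 5) and all function names are assumptions, not part of the published instrument.

```python
# Hypothetical scoring sketch for the Beliefs Survey factors.
# Assumes Likert responses coded 1 (Strongly Disagree) to 5 (Strongly Agree).

FACTORS = {
    "swd_abilities": ["9A", "9B", "10A", "10B", "11A", "11B"],
    "data_based_decision_making": ["12", "13", "14", "15", "16", "17",
                                   "20", "21", "22", "23", "24", "25", "27"],
    "core_supplemental_instruction": ["7A", "7B", "8A", "8B"],
}

def factor_scores(responses):
    """Return the mean Likert rating for each factor.

    `responses` maps item keys (e.g., "9A") to numeric ratings 1-5.
    Items a respondent skipped are simply left out of that factor's mean.
    """
    scores = {}
    for factor, items in FACTORS.items():
        ratings = [responses[i] for i in items if i in responses]
        scores[factor] = sum(ratings) / len(ratings) if ratings else None
    return scores

# A respondent who rated every item 4 ("Agree") gets a mean of 4.0 per factor.
example = {item: 4 for items in FACTORS.values() for item in items}
scores = factor_scores(example)
```

Reporting subscale means (rather than item-by-item counts) is also what makes the over-time comparison graphs discussed later in this module possible.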
Perception of Practices Survey
Assesses educator perceptions of data-based problem-solving practices across tiers
18 items, Likert-scale format (Never Occurred to Always Occurred)
2 factors:
Academic Practices
Behavior Practices
A simple Perceptions of Practices graph to use as an example for the guiding questions. Again, this shows a different way to display data, specifically baseline levels of perceptions as compared to beliefs.
Consensus Development: Guiding Questions cont.
Perception of Practices: What practices occurring in our school do we think are most consistent with the MTSS (PS/RtI) model? Least consistent? Which ones do we think may be a threat to the successful implementation of the model?
Here is a simple Beliefs comparison graph to use as an example with the guiding questions. This also shows audience members a different way to display data (using means) to see changes over time.
Consensus Development: Guiding Questions cont.
Beliefs & Practices: How consistent are the overall beliefs of our school with our overall perceptions of the practices occurring? What does this level of consistency or inconsistency mean for implementing an MTSS framework in our school?
Fit How does evaluation of MTSS fit within our local context?
Alignment with district priorities? Alignment with our MTSS framework? Other ways of evaluating programs and practices? Alignment between school, district, regional, and state systems and requirements?
How do the following policy considerations impact districts in your state?
Student Performance & Growth
State-determined school-performance measures: ESSA allows states more flexibility; measures and outcome criteria vary from state to state.
Other state-mandated performance indicators: State Performance Plans (SPED), School and District Improvement Plans.
Local performance indicators: academic measures and performance criteria; behavioral measures and performance criteria.
What capacity and/or requirements exist to support evaluation efforts?
Statewide and Local: Data sources and measures Data management and reporting systems Professional development and evaluation plans Instruction and intervention plans & protocols Technical assistance personnel and supports
Evidence & Readiness for Replication
What evidence suggests that district evaluation of MTSS will help you to meet your desired goals?
What have other districts achieved through the evaluation of MTSS?
What is the status of empirical research? Number of studies? Quality of the research?
How well is the practice of evaluating MTSS defined? Has the process been replicated?
Are model sites and experts available to provide technical assistance (TA)?
What are the characteristics of other districts that have evaluated MTSS? How diverse? How similar to my district?
Evidence for Critical Components of MTSS
What We Know:
Assessments and data sources are available that predict student outcomes.
Research-supported instruction and intervention methods and strategies exist.
The problem-solving process relates to student outcomes.
What We Don’t Know:
Which assessments and data sources “should” be used.
How much drift in implementation fidelity can occur without reducing effectiveness.
Which steps of problem-solving relate most to student outcomes.
Evaluation of MTSS
What We Know:
Implementation is related to improved academic, behavior, and SPED outcomes.
Implementation has looked different across sites (e.g., number of tiers, assessments used, approaches to intervention).
Evaluations have occurred at school, district, intermediate-agency, and state (pilot) levels.
Methods and tools for evaluating implementation are available.
What We Don’t Know:
How to determine “how much” MTSS is contributing to student outcomes.
Which models and data sources for evaluating MTSS implementation work best.
What levels of implementation fidelity are needed for student outcomes to improve.
Which systemic issues will need to be addressed in a given district.
Necessary Resources
If we want to evaluate MTSS well within our district, what resources will be necessary?
Information about. . .
Tools and materials for. . .
Time for. . .
People to do. . .
Money for. . .
Capacity
Given the resources that will be necessary and your currently available resources, do you have the capacity to evaluate MTSS? What could you do to increase your capacity?
Information
Tools and Materials
Time
People
Money
Reflection: Consider your district’s readiness to implement and evaluate MTSS. What aspects of the hexagon need to be addressed to move forward?
Contact Us
Jose Castillo, University of South Florida, Florida Problem Solving/Response to Intervention Project
Anna Harms, Michigan’s Integrated Behavior and Learning Support Initiative