
1 Enter Conference Line ID: 165-5404#
Welcome! To join the conference call, please dial the local number for the city listed below. If no local number is available: (800)
Local dial-in numbers:
Boca Raton (561)
Fort Lauderdale (954)
Fort Myers (239)
Miami (786)
Tallahassee (850)
Tampa (813)
West Palm Beach (561)
Enter Conference Line ID: 165-5404#

2 Large groups and “multi-taskers,” please be sure to mute your line to prevent background noise.  Thanks!

3 Southern Region MTSS PLC
Welcome! Face-to-face Session – February 2017
Providing MTSS Fidelity Data
Kelly Justice, Florida PS/RtI Project
Robyn Vanover, Florida's PBIS Project
Wrap up

4 2017 Face to Face PLC Session
February date to be determined
Watch for the date of greatest common availability
RSVP
Location determined based on RSVPs

5 Sharing implementation fidelity data

6 PS/RtI Needs Assessment – May 2016
Southern Region Respondents: PS/RtI District Contacts
Rating Scale: Level of Need = 1 (low) – 5 (high); ratings shown are average item scores
Secondary MTSS implementation – 4.18
Providing MTSS fidelity data to staff – 3.36
Increasing student engagement with instruction – 3.36
Understanding how SWDs are served and fully included within an MTSS – 3.27
Tier 3 behavioral intervention practices – 3.27
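For teams that want to reproduce this kind of summary from raw survey responses, a minimal sketch follows. The item names and ratings in it are hypothetical stand-ins, not the Project's actual dataset.

```python
# Minimal sketch: average item scores from needs-assessment ratings
# (1 = low need, 5 = high need). Item names and responses below are
# hypothetical; the real survey has more items and respondents.
import pandas as pd

responses = pd.DataFrame({
    "secondary_mtss_implementation": [5, 4, 4, 5, 3],
    "providing_fidelity_data_to_staff": [4, 3, 3, 4, 3],
    "increasing_student_engagement": [3, 4, 3, 4, 3],
})

# Mean rating per item, sorted so the highest-need items surface first.
avg_item_scores = responses.mean().sort_values(ascending=False)
print(avg_item_scores.round(2))
```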

7 Why bother? Why is it important for districts and schools to gather MTSS implementation fidelity data?

8 Fidelity The degree to which a set of educational procedures and/or strategies is implemented in a manner consistent with the research that supports their validation and use.

9 Two Types of Fidelity
Instructional fidelity
Implementation (procedural) fidelity

10 Importance of Fidelity
Rapid and widespread deployment of MTSS has made urgent the need to attend to fidelity
Fidelity data are essential to making valid conclusions about outcomes
Issues related to integrity are central to the success of MTSS (Sanetti & Kratochwill, 2009)

11 Attending to Fidelity
Control "drift"
Target fidelity as a potential reason for unintended outcomes
Detect and prevent poor implementation fidelity

12 Questions Drive Data Collection
Data collection should be driven by the questions you want to answer:
Are students meeting expectations? Academically? Behaviorally? Social-emotionally?
Do we have the capacity to implement successfully?
Do staff buy into implementing MTSS?
Are we implementing MTSS with fidelity?

13 Poll Question
To what degree are schools in your district gathering data to determine levels of MTSS implementation across academic and/or behavioral content areas?

14 Potential Data Sources
Self-Assessment of MTSS (SAM)
Tiers I/II and Tier III Critical Components Checklists
Tier I/II Observation Checklist
Problem-Solving Team Meeting Checklist
All available on the Florida PS/RtI website under Resources > Program Evaluation

15 Venues for Dissemination
District Leadership Meetings
Professional Learning Events
School-Based Leadership Team Meetings
Faculty Meetings
PLCs

16 Strategies for Disseminating Data
Identify key stakeholders (e.g., District Leadership Teams, School Leadership Teams, instructional staff)
Graph data to easily identify trends (see the sketch after this list)
Compare different types of data (self-report, permanent product reviews, observations)
Share data quickly and frequently
Use Guiding Questions to guide discussions
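As a rough illustration of the "graph data" and "compare different types of data" strategies, here is a minimal sketch that plots three fidelity data sources side by side. The assessment windows and scores are invented for illustration, not real project data.

```python
# Minimal sketch: graphing fidelity data over time so trends (and
# discrepancies between data sources) are easy to spot.
# Windows and scores below are hypothetical.
import matplotlib.pyplot as plt

windows = ["Fall", "Winter", "Spring"]
scores = {
    "Self-report (SAM)": [48, 55, 63],
    "Permanent products (CCC)": [42, 50, 61],
    "Observations": [40, 47, 58],
}

# One line per data source makes trends visible at a glance.
for label, values in scores.items():
    plt.plot(windows, values, marker="o", label=label)

plt.ylabel("Percent of components in place")
plt.title("Tier 1 implementation fidelity by data source")
plt.legend()
plt.show()
```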

17 Using Guiding Questions
What are the patterns? What patterns are evident among the individual items on the checklist and across all data sources?
Which elements are occurring more frequently? Less frequently?
Do any current indicators show a zero or low level of implementation? Why? Have these been targeted in the past?
Do barriers exist with consensus or infrastructure? Other priorities? Meetings not happening, or not focusing on implementation?
The use of guiding questions facilitates discussion about data and about strategies to increase PS/RtI practices. These questions were designed to facilitate discussion of each school's data, including the current level of problem-solving implementation and the consistency between SAPSI data and other implementation integrity measures (other data sources are discussed elsewhere in this manual). However, stakeholders can generate additional guiding questions to better meet the needs of their school.

18 Using Guiding Questions (cont.)
How have you progressed in implementing the Problem-Solving Model with fidelity? Looking across all fidelity measures (SAM, CCC, and Observations):
What are the general levels of implementation? What are the general trends?
Do the data from the Critical Components Checklist and Observations support what is evident in the SAM?
Are there discrepancies among the different sources of data on use of the Problem-Solving Model? How might these discrepancies be interpreted?

19 Making Feedback Effective
Goal-referenced
Actionable
User-friendly
Timely
Ongoing/formative
Consistent
Depending on the research literature used to identify what makes feedback most effective, you will find slight variations in the list above. In general, though, feedback is most effective when it is provided in relation to an agreed-upon goal, is actionable, is user-friendly, is provided in a timely manner, has a formative rather than a summative purpose, and is stable and reliable over time. Each of these will be highlighted next in more detail.

20 Other tips…
Review the instrument in advance
Enroll the facilitator or team in planning
A conversation… not a gotcha!!
Use lower ratings or scores as an opportunity

21 What strategies have proven effective for you when providing others feedback?

22 PBIS Evaluation Goal Become familiar with different data sources for Tier 1 behavior data-based problem solving.

23 Mid-Year 1 – November 1
Mid-Year 2 – March 1
End of Year – June 15

24

25

26 Evaluating Tier 1 Implementation
Data sources fall into four categories – Diagnostic, Progress Monitoring, Outcome, and Baseline – and include: BoQ, PIC, surveys (staff, student, parent), PBIS Walkthrough, participation in reward activities, artifacts of lessons, focus groups, observations, ODRs, minors, OSS/ISS, attendance, SESIR, % of students with referrals, climate, bullying, substance abuse, and academics.
As teams plan their implementation, they will engage in ongoing data analysis to make the best decisions about their use of resources. The PS/RtI Project describes all data as either screening, diagnostic, progress monitoring, or outcome data; this slide summarizes that breakdown for the most common Tier 1 behavior data sources. Data sources shouldn't be used independently of each other – the nature of Tier 1 behavior is that teams must triangulate several kinds of data to get a complete picture of what's going on. Note that the different data sources, although used primarily for one purpose or another (diagnostic, monitoring, outcome), can be used across multiple categories. For example, ODRs and minors can be used to monitor the progress of implementation (how are our Tier 1 interventions working?) as well as serve as an overall outcome measure (did ODRs decrease compared to last year?). Trainers: you might want to clarify all the acronyms. How do you monitor Tier 1 implementation in the classroom?

27 End-Year Data Review and Discussion
District-Level End-Year Reports/Graphs:
1. Core Effectiveness
2. Average Referrals (ODRs)
3. Equity Profile
4. Days Lost to OSS, ISS; Attendance
5. BoQ district summary for all schools or by school level (i.e., elementary, middle, high, alternative) – a sketch follows below
Pull district-level data and review it with the coaches.
TFI – Tiered Fidelity Inventory for Tiers 2 and 3 – replaced the BAT (Benchmarks for Advanced Tiers).
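For the BoQ district summary described in item 5, a minimal sketch of the aggregation follows. The school names, levels, and scores are hypothetical, and actual reports come from the PBISES system rather than a hand-built script.

```python
# Minimal sketch: district BoQ summary by school level. All values are
# hypothetical; real reports are generated from PBISES data.
import pandas as pd

boq = pd.DataFrame({
    "school": ["A Elem", "B Elem", "C Middle", "D High", "E Alt"],
    "level": ["elementary", "elementary", "middle", "high", "alternative"],
    "boq_score": [82, 74, 68, 59, 71],  # hypothetical BoQ percentages
})

# Average score and school count per level for the end-of-year review.
summary = boq.groupby("level")["boq_score"].agg(["mean", "count"])
print(summary.round(1))
```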

28 End-Year Data Review and Discussion
School-Level End-Year Reports/Graphs:
Core Effectiveness
Average Referrals (ODRs)
Equity Profile
Days Lost to OSS, ISS
BoQ
Surveys (staff, student, families)
Walkthrough
Graphs from each school, or a sample of schools, to share/discuss.

29 Mid-Year I & II (November 1, March 1) – School Level
Mid-Year I & II PBIS Implementation Checklist (PIC)
Critical Element (Tier 1)
Implementation Level (Tiers 1, 2, 3)

30 Mid Year 1 Progress Monitoring
Trainers: Explain what the PIC is and how to use it. This is the fall PIC from the school whose BoQ was displayed a few slides ago, so we're looking at actual results. Time goes by, October rolls around, and coaches complete a PIC for their team. Overall, progress on implementing the critical elements is really good! Point out that in this example, the team focused their efforts on getting faculty buy-in and establishing formal school-wide expectations (DBDM, data-based decision making, came up as a result of the team using their data to guide their implementation). As a result, Buy-in and Expectations have come way up, but there's still work to be done. Point out that if the team uses this information to plan their next steps with implementation, by the end of the year they can expect their Benchmarks to reflect high scores on many of these elements.

31

32 Mid Year 2 Progress Monitoring
Spring. (Note to trainers: this is made-up data.) If the team continued to use their PIC results to plan their implementation steps, they would see progress on the spring PIC and on the Benchmarks at the end of the year.

33 BoQ Progress Monitoring
(Note to trainers: this is made-up data.) The Benchmarks of Quality is the year-end implementation measure. Teams can complete the Benchmarks as often as they'd like, but scores are only entered into PBISES for graphical output at the end of the year. In their first year, most teams see lots of growth over their Baseline BoQ. Over time, Benchmarks scores can vary due to changes within the school and district (new principal, new superintendent, change of data system, etc.), so teams should monitor their Benchmarks scores and use them to plan implementation activities every year.

34 Multiple Data Sources to Evaluate Implementation
Even when the Benchmarks are high, teams need to evaluate whether their efforts are actually impacting their students, and over time they'll need to evaluate whether they're impacting all groups of students equitably. Most schools will see their rate of ODRs decrease from their baseline year, but some may see an increase as they begin to track existing behaviors or disciplinary actions that they previously handled "off the books." Annual discipline outcomes should be used in conjunction with implementation data, surveys, and academic information. Other data to consider would be ISS/OSS data, which are also found in PBISES.
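To make "rate of ODRs" concrete, here is a minimal sketch comparing a current year to baseline. The enrollment, day counts, and referral totals are hypothetical, and the per-100-students-per-school-day normalization is one common convention, not the only option.

```python
# Minimal sketch: comparing a school's ODR rate to its baseline year.
# All numbers below are hypothetical.

def odr_rate(referrals: int, enrollment: int, school_days: int) -> float:
    """Office discipline referrals per 100 students per school day."""
    return referrals / enrollment / school_days * 100

baseline = odr_rate(referrals=1240, enrollment=850, school_days=180)
current = odr_rate(referrals=960, enrollment=870, school_days=180)

change = (current - baseline) / baseline * 100
print(f"Baseline: {baseline:.3f}, Current: {current:.3f}, Change: {change:+.1f}%")
```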

35 September – October (Data Review / PBIS Planning / Implementation / Family-Community)
September
Data Review: Previous year's ODRs, ISS, OSS, attendance, ethnicity, academics, classroom data; BoQ & TFI; survey results
PBIS Planning: Kick-off events; monthly team meeting (schedule developed in August); establish schedule for reporting data to staff; plan how to provide data to T2/T3 intervention teams
Implementation: Teach expectations & rules, rewards, and discipline to staff & students; schedule reward events for the year; update Action Plan; faculty/staff review of expectations/rules, reward system, discipline process, behavior forms, and lesson plans; PBIS student review of discipline procedures
Family/Community: Newsletter article to introduce PBIS to families and community; flyers and/or presentations provided at PTS meetings to introduce PBIS
October
Data Review: Previous month's ODRs, ISS, OSS, attendance, ethnicity, academics, classroom data; students with 1 or more referrals
PBIS Planning: Plan refresher based on data & reward event; coordinate with T2 team; review Mid-Year I PIC
Implementation: Hold student reward event and provide staff rewards; complete Mid-Year I PIC; share PBIS data with staff & T2 team for students with 1+ referrals; implement refresher
Family/Community: Newsletter article sharing school-wide data; 'Good News' stories to media
Add/revise based on district policies/processes.

36 Using Year-End Data Discussion
How do year-end data provide a 'picture' of what happened last year?
What patterns and/or trends can be determined from the data?
How will your team use the data for problem-solving and action planning for this year?
How will your team use the data to plan activities and events for this year?
What worked well? Celebrate!

37 Sharing with Stakeholders
Coaches (internal/external)
PBIS Team Members
Administrators
Faculty
District Administrators
School Board Members
Community/Parents/Students

38 Resources

39 Contact Information and Resources
FLPBS MTSS Project
Phone: (813)
Fax: (813)
Website:
Facebook: FLPBS
Twitter: @flpbs
OSEP TA Center on PBIS – Website:
Association on PBS – Website:

40 Connect with Us!
Florida's Problem-Solving/Response to Intervention Project – Facebook: flpsrti
Florida Positive Behavioral Interventions & Support Project – Facebook: flpbis

41 Our next online PLC Session:
Jan. 10, 2017, from 8:30 – 9:30 a.m. EST
Access session materials on our wiki:

42 MTSS Technical Assistance Contacts Southern Region
Kelly Justice, Regional Coordinator, Florida PS/RtI Project
Stephanie Martinez, Technical Assistance Specialist, Florida's PBIS Project
Devon Minch
Robyn Vanover
Brian Gaunt, Inter-project Coordinator

