1
PBIS Data Review: Presented by Susan Mack & Steven Vitto
2
[Diagram] One Common Voice – One Plan: the continuous school improvement cycle (with RtI). Gather: Getting Ready, Collect School Data, Build School Profile. Study: Analyze Data (Student Achievement), Set Goals, Set Measurable Objectives, Research Best Practice. Plan: Develop Action Plan. Do: Implement Plan, Monitor Plan, Evaluate Plan.
3
RtI Core Principles (Batsche et al., 2005): We can effectively teach all children. Intervene early. Use a multi-tier model of service delivery. Use a problem-solving method to make decisions within a multi-tier model. Use evidence-based, scientifically validated interventions/instruction to the extent available. Monitor student progress to inform instruction. Use data to make decisions. Use assessment for the purposes of screening, diagnostics, and progress monitoring. (pp. 19-20)
4
Gather
We want to gather information that tells us:
– How well we are implementing/doing something: Systems/Process/Fidelity Data, AND
– Whether what we're doing is working: Student Outcome Data
5
Purpose of Systems Review Leadership teams will: Review implementation fidelity data. Reflect on their role in leading the PBIS process. Develop a plan for continuing to implement and refine the critical features of a PBIS model.
6
Don’t forget, it’s all about the students!
7
Systems/Process Measures include the Team Implementation Checklist, the Self-Assessment Survey, and the BoQ. They answer the questions:
– Are we doing what we have learned?
– How well are we doing it, when, where, and by whom?
Having this information helps us to accurately interpret our student outcomes.
8
Team Implementation Checklist (TIC)
Completed quarterly as a team.
Takes approximately 10 minutes to complete.
Is used as a planning guide for successful School-Wide PBS implementation.
9
Evaluating PBIS Team Implementation Checklist Have we met criteria? Are we making progress? What areas can we celebrate? What areas need more work? What is the plan to achieve criteria on the TIC?
10
[Chart: Implementation by feature – percent fully implemented, percent partially implemented, and percent not implemented for each feature]
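To make the chart concrete, here is a minimal sketch of how a team might tally TIC item scores into those percentages. The feature names, item scores, and helper function are illustrative assumptions, not the official TIC scoring tool; TIC items are scored 2 (achieved), 1 (in progress), or 0 (not started).

```python
# A minimal sketch (not the official TIC scoring tool) of turning Team
# Implementation Checklist item scores into the implementation-by-feature
# percentages shown in the chart. Feature names and scores are invented
# examples; TIC items are scored 2 (achieved), 1 (in progress), 0 (not started).

tic_scores = {
    "Establish Commitment": [2, 2],
    "Establish & Maintain Team": [2, 1],
    "Self-Assessment": [1, 1, 0],
    "Define Expectations": [2, 0],
}

def implementation_by_feature(scores):
    """Return percent fully / partially / not implemented for each feature."""
    summary = {}
    for feature, items in scores.items():
        n = len(items)
        summary[feature] = {
            "fully": round(100 * items.count(2) / n),
            "partially": round(100 * items.count(1) / n),
            "not": round(100 * items.count(0) / n),
        }
    return summary

for feature, pct in implementation_by_feature(tic_scores).items():
    print(f"{feature}: {pct['fully']}% fully, {pct['partially']}% partially, "
          f"{pct['not']}% not implemented")
```

Charting these percentages by feature, quarter over quarter, gives the team the celebration and needs-more-work picture that the two examples below illustrate.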
11
Team Implementation Checklist: Implementation by Feature Chart, Example A
Have we met criteria? Not yet.
Are we making progress? Yes, in these areas.
What can we celebrate? These areas.
What areas need more work? These areas.
12
Team Implementation Checklist: Implementation by Feature Chart, Example B
Have we met criteria? Are we making progress? What can we celebrate? What areas need more work?
13
Your Turn (20 minutes)
Complete the TIC Worksheet.
What are your priorities for meeting criteria on the Team Implementation Checklist?
14
Why conduct the Self-Assessment Survey in addition to the Checklist?
The checklist is completed by the team; all or most staff complete the survey.
Look for areas of convergence/agreement across tools – this increases confidence in the data.
Look for areas of divergence across tools – this decreases confidence in the data. Possible reasons for disparity: lack of understanding of the questions, staff not fully aware of the work of the Building Leadership Team, or a support component not fully "In Place."
15
Differences between the Team Implementation Checklist and the Self-Assessment Survey
Purpose? TIC: Evaluate ongoing progress toward schoolwide PBS. SAS: Evaluate the extent to which all systems (schoolwide, nonclassroom, classroom, individual) are in place.
When administered? TIC: Quarterly. SAS: Annually.
Who completes? TIC: The PBIS school team, completed as a team. SAS: All school staff (or a representative sample), completed individually.
Time involved? TIC: 10-15 minutes. SAS: 30-45 minutes.
16
Using Data to Support Implementation of Positive Behavior Interventions and Supports (PBIS) The PBIS Self Assessment Survey (SAS) will help principals and coaches to: –Re-establish buy-in from staff. –Develop clarity around the core features of PBIS.
17
Analyzing Self-Assessment Survey Results
Established? More than 66% of respondents identify the system as "In Place."
Priority? More than 50% of respondents identify the system as "High Priority."
Which one of the 4 systems should be the focus? Why?
– If schoolwide is not established, it should be the focus.
– Otherwise, consider systems that are close to being established and are a staff priority.
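A minimal sketch of how those thresholds could be applied to raw survey responses follows; the data structure, field names, and the summarize helper are invented for illustration and are not part of the official SAS scoring tools.

```python
# A minimal sketch, assuming a simple list of per-respondent SAS ratings.
# The records and field names are illustrative only; official SAS summaries
# come from the survey tool itself.

responses = [
    {"system": "Schoolwide", "status": "In Place",           "priority": "Low"},
    {"system": "Schoolwide", "status": "In Place",           "priority": "Medium"},
    {"system": "Schoolwide", "status": "Partially In Place", "priority": "High"},
    {"system": "Classroom",  "status": "Partially In Place", "priority": "High"},
    {"system": "Classroom",  "status": "Not In Place",       "priority": "High"},
]

def summarize(responses, system):
    """Apply the >66% 'In Place' and >50% 'High Priority' rules for one system."""
    rows = [r for r in responses if r["system"] == system]
    n = len(rows)
    pct_in_place = sum(r["status"] == "In Place" for r in rows) / n
    pct_high_priority = sum(r["priority"] == "High" for r in rows) / n
    return {
        "established": pct_in_place > 0.66,
        "high_priority": pct_high_priority > 0.50,
        "pct_in_place": round(100 * pct_in_place),
        "pct_high_priority": round(100 * pct_high_priority),
    }

for system in ("Schoolwide", "Classroom"):
    print(system, summarize(responses, system))
```

Running a summary like this for each of the four systems makes the focus decision above straightforward: if schoolwide is not established it comes first; otherwise pick a near-established system that staff rate as a high priority.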
18
Have we met criteria? Not yet
19
What can we celebrate? This area. What areas need more work? Everything else.
20
WillowTree Elementary: What areas need more work?
21
Have we met criteria?
22
What can we celebrate? What areas need more work? [Chart: results from the 4/30/2008, 5/11/2009, and 5/2/2010 administrations]
23
Have we met criteria?
24
What can we celebrate? What areas need more work?
25
Your Turn
Using your SAS results, complete the SAS Worksheet.
What are your priorities for meeting criteria on the Self-Assessment Survey?
26
Big Ideas for Gathering Data in an Efficient, Effective Way
Stay organized (schedule, forms readily available for use).
Provide training to ensure consistent, accurate data collection.
Get the data back in front of the people who are doing the work.
Actively use the data for decision-making.
Keep the results in a central location.
27
Systems/Process/Fidelity Measures Tell us about our implementation fidelity: Are we doing what we said we would do, when and how we said we would do it? When engaged in systems change, we will likely be able to see changes in adult behavior before we are able to see changes in student behavior. Systems/process/fidelity data help us know if we are on the right track early on and can reinforce our initial work. Having this information helps us to accurately interpret our student outcomes.
28
Indicators that your systems/process/fidelity data might not be accurate:
– Scores of "fully implemented" across the board the first time the measure is completed.
– A mismatch between your process outcome measures and your student outcome measures.
– A mismatch between the leadership team's perception of implementation and the full staff's perception.
29
Sharing Data with Staff
Help bring staff along with the work the leadership team is spearheading.
Share graphs and implementation improvements, especially information that is relevant (scores related to the classroom, results from the SAS).
SAS and TIC data should be presented in conjunction with student outcome data (SWIS).
Leadership team perceptions should be complemented by staff perceptions of implementation.
30
Progress Monitoring
SWIS Data Entry: A. problems and concerns; B. fidelity issues
SWIS Data Reports: A. who is seeing the data; B. how the data are being utilized
Check-In Check-Out (CICO) Data
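For the Check-In Check-Out piece, progress monitoring usually comes down to the percent of daily points a student earns against a goal line. The sketch below assumes an 80% goal and invented point totals; the goal and data format at your school may differ.

```python
# A minimal sketch of CICO progress monitoring: percent of daily points earned
# versus a goal line. The 80% goal and the (earned, possible) totals below are
# assumptions for illustration, not values from any real point card.

daily_points = [(22, 30), (26, 30), (28, 30), (18, 30), (27, 30)]  # (earned, possible)
GOAL = 0.80

for day, (earned, possible) in enumerate(daily_points, start=1):
    pct = earned / possible
    status = "met goal" if pct >= GOAL else "below goal"
    print(f"Day {day}: {pct:.0%} of points ({status})")

days_met = sum(earned / possible >= GOAL for earned, possible in daily_points)
print(f"Met the {GOAL:.0%} goal on {days_met} of {len(daily_points)} days")
```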
31
Benchmarks of Quality
The BoQ is used by teams to identify areas of success and areas for improvement, and by the Positive Behavior Interventions and Supports (PBIS) project to identify model PBIS schools.
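One way to see those areas of success and improvement is to summarize BoQ results by element as percent of possible points, as in the sketch below. The element names, point values, and the 70% cut-off are illustrative assumptions, not the official scoring guide.

```python
# A minimal sketch of summarizing Benchmarks of Quality (BoQ) results by element.
# Element names, points earned, and points possible are invented examples, and
# the 70% cut-off is only an illustrative threshold, not the official criterion.

boq_elements = {
    # element: (points earned, points possible)
    "PBIS Team": (5, 6),
    "Faculty Commitment": (3, 6),
    "Discipline Procedures": (10, 14),
    "Expectations & Rules Developed": (8, 9),
}

CUTOFF = 0.70

# Sort lowest-scoring elements first so areas for improvement surface at the top.
for element, (earned, possible) in sorted(
        boq_elements.items(), key=lambda kv: kv[1][0] / kv[1][1]):
    pct = earned / possible
    status = "area for improvement" if pct < CUTOFF else "area of success"
    print(f"{element}: {earned}/{possible} = {pct:.0%} ({status})")

total_earned = sum(e for e, _ in boq_elements.values())
total_possible = sum(p for _, p in boq_elements.values())
print(f"Overall: {total_earned}/{total_possible} = {total_earned / total_possible:.0%}")
```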
32
Team Time (20 minutes)
Benchmarks of Quality Scoring Form and Scoring Guide
Read through each of the items on the Scoring Form.
Answer as a team.
Team discussion should be generated by referring to the BoQ Scoring Guide.
33
Progress Monitoring Decision Flowchart
34
Team Time (20 minutes)
1. Use the big five SWIS reports to complete the SWIS worksheet.
2. Develop a precision statement based on your analysis of the data.
3. Develop an action plan based on your analysis of the data.
4. Use the SWIS flow chart to determine any problems.
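The "big five" SWIS reports summarize office discipline referrals as average referrals per day per month, and referrals by problem behavior, by location, by time of day, and by student. A minimal sketch of those tallies is below; the referral records, field names, and school-day counts are invented, and in practice these reports come directly from SWIS.

```python
# A minimal sketch of the "Big 5" summaries from office discipline referral (ODR)
# records. All records, field names, and school-day counts are invented for
# illustration; in practice these reports come directly from SWIS.

from collections import Counter

referrals = [
    {"month": "Oct", "behavior": "Disruption", "location": "Cafeteria", "time": "12:05", "student": "A"},
    {"month": "Oct", "behavior": "Disrespect", "location": "Classroom", "time": "10:30", "student": "B"},
    {"month": "Nov", "behavior": "Disruption", "location": "Cafeteria", "time": "12:15", "student": "A"},
]

school_days = {"Oct": 21, "Nov": 18}  # instructional days per month (assumed)

per_day_per_month = {m: round(sum(r["month"] == m for r in referrals) / d, 2)
                     for m, d in school_days.items()}
by_behavior = Counter(r["behavior"] for r in referrals)
by_location = Counter(r["location"] for r in referrals)
by_time = Counter(r["time"].split(":")[0] + ":00" for r in referrals)  # bucket by hour
by_student = Counter(r["student"] for r in referrals)

print("Referrals per day per month:", per_day_per_month)
print("By problem behavior:", by_behavior.most_common())
print("By location:", by_location.most_common())
print("By time of day:", by_time.most_common())
print("By student:", by_student.most_common())
```

A precision statement then pulls the top categories together: what behavior is occurring, where, when, and which students are most involved.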
36
Critical Features of Team Conversations During the Study Phase
Have conversations that revolve around the data.
Have data available and organized in a central location (data binder, assessment booklet).
Ensure all team members are actively engaged in the data review process and conversation.
Discuss strengths and weaknesses of the data.
Note items for celebration to share with staff.
Document areas of need, action items, and plans, all in writing.
37
Critical pieces to include in the School Improvement Plan based on your work with PBIS:
References to data.
Strategies and activities related to Schoolwide Positive Behavior Interventions and Supports.
38
Example: Team Action Planning
Imagine the rest of the group is your school staff. Provide us with a brief update on our school's progress.
1. What celebrations do you have to share with us?
2. What will our next steps and priorities be?
3. What feedback do you want from us?
39
Action Planning Where do we go from here? Next Steps? Evaluation
40
The BEP (CICO) Program Agenda
Overview of the Program
The BEP Program Video
Review the Process for Accessing the BEP Program
Review BEP Tools
Review Progress Monitoring for Students Involved in the BEP Program
Conducting a BEP Team Meeting