1
Using Data to Evaluate PBIS Implementation and Student Outcomes
Anna Harms
2
Agenda
– Quick Review of PBIS
– Types of Data
– Using Data Across Multiple Tiers of Support within PBIS & Review of Tools
– How PBIS Can Fit within School Improvement
– Review of Resources
3
Setting Group Expectations
Be Responsible
– Attend to the “Come back together” signal
– Active participation and engagement
Be Respectful
– Please allow others to listen
– Please turn off cell phones and pagers
– Please limit sidebar conversations
Be Safe
– Take care of your own needs
4
Core Principles of PBIS
1. We can effectively teach appropriate behavior to all children.
2. Intervene early.
3. Use a multi-tier model of service delivery.
4. Use research-based, scientifically validated interventions to the extent available.
5. Monitor student progress to inform interventions.
6. Use data to make decisions.
7. Use assessment for three different purposes.
5
– Define expectations
– Teach expectations in settings
– Monitor students to see if expected behaviors are followed
– Acknowledge students for demonstrating expectations with specific feedback
– Correct misbehavior
– Use data to determine strengths and areas of need
6
Stages of Implementation
Should we do it?
– Exploration/Adoption: Develop commitment at the ISD and LEA level.
Work to do it right!
– Installation: Establish the ISD leadership team and implementation team, set up data systems, audit current resources and capacity, and plan and prepare for the work.
– Initial Implementation: Try out the practices, work out details, and learn from and provide significant support to implementers.
Work to do it better!
– Elaboration: Expand the program/practices to other locations, individuals, and times; adjust based on learning from initial implementation.
– Continuous Improvement/Regeneration: Make it easier and more efficient; embed within current practices.
7
Types of Data
We want to gather information that tells us:
– How well we are implementing/doing something: Systems/Process/Fidelity Data
AND
– Whether what we’re doing is working: Student Outcome Data
8
Your Turn
Complete the Audit of Measures and Data Systems used to Support PBIS.
9
A Few Examples of Tools
Tier 1
– Systems/Process/Fidelity: PBIS Self Assessment Survey (PBIS-SAS), PBIS Team Implementation Checklist (PBIS-TIC), Benchmarks of Quality (BOQ), Schoolwide Evaluation Tool (SET)
– Student Outcomes: Schoolwide Information System (SWIS)
Tier 2
– Systems/Process/Fidelity: Benchmarks for Advanced Tiers (BAT)
– Student Outcomes: Schoolwide Information System Check-in/Check-out (SWIS-CICO)
Tier 3
– Systems/Process/Fidelity: Benchmarks for Advanced Tiers (BAT)
– Student Outcomes: Individual Student Information System-Schoolwide Information System (ISIS-SWIS)
10
Systems/Process/Fidelity Measures
– Tell us about our implementation fidelity: Are we doing what we said we would do, when and how we said we would do it?
– Relate to the big ideas of a practice.
– When engaged in systems change, we will likely see changes in adult behavior before we see changes in student behavior. Systems/process/fidelity data help us know early on whether we are on the right track and can reinforce our initial work.
– Having this information helps us accurately interpret our student outcomes.
11
Strengths and Weaknesses of Self-Assessment
Strengths:
– Self-evaluation can prompt improved implementation on its own.
– Typically faster to complete and less complex.
Weaknesses:
– Self-perceptions might not always be accurate (lack of, or changing, background knowledge).
– Intentional or unintentional over- or under-inflation of scores.
12
www.pbssurveys.org
A free website where you can download copies of the measures, enter your school’s scores online, and view graphs of your results.
13
PBIS Self Assessment Survey (SAS)
– A systems/process measure that examines implementation fidelity of PBIS.
– Four sections: Schoolwide Systems, Classroom Systems, Nonclassroom Systems, Individual Student Systems.
– All staff complete it individually online at www.pbssurveys.org.
– For each item, staff rate the degree to which it is in place (scale of 0-1), then rate whether each item is a priority for improvement (scale of 0-1).
– Takes approximately 20-30 minutes to complete.
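As a rough illustration of how individual staff ratings like these can roll up into a building-level summary, here is a minimal sketch; the item names, data layout, and 0/1 scale handling are assumptions for illustration, not the actual pbssurveys.org export format.

```python
# Minimal sketch: summarize staff self-assessment ratings by item.
# Hypothetical data; not the actual pbssurveys.org format.
from collections import defaultdict

# Each response: (item, in_place, priority), both rated 0 or 1 as described above.
responses = [
    ("Expectations defined", 1, 0),
    ("Expectations defined", 1, 1),
    ("Expectations taught",  0, 1),
    ("Expectations taught",  1, 1),
]

totals = defaultdict(lambda: {"n": 0, "in_place": 0, "priority": 0})
for item, in_place, priority in responses:
    totals[item]["n"] += 1
    totals[item]["in_place"] += in_place
    totals[item]["priority"] += priority

for item, t in totals.items():
    pct_in_place = 100 * t["in_place"] / t["n"]
    pct_priority = 100 * t["priority"] / t["n"]
    print(f"{item}: {pct_in_place:.0f}% in place, {pct_priority:.0f}% rate it a priority")
```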
14
Individual Summaries
15
Team Checklist Version 3
16
Effective Behavior Support (EBS) Team Implementation Checklist (TIC)
– Completed quarterly as a team
– Takes approximately 10 minutes to complete
– Entered online from a group-completed paper copy
– Used as a planning guide for successful School-Wide PBIS implementation
17
Components of the Team Implementation Checklist
1. Establish commitment
2. Establish and maintain team
3. Self-assessment (fidelity and outcomes)
4. School-wide expectations
5. Classroom behavior support systems
6. Establish information system
7. Build capacity for function-based support
This checklist provides a scope and sequence for implementing schoolwide behavior supports. Schools shouldn’t expect to see change in student behavior until at least 80% of items are achieved on the checklist.
19
– # of items fully implemented (Achieved) / 22 total items
– # of items partially implemented (In Progress) / 22 total items
– Total Points = Total Points Earned / Total Possible Points
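To make the arithmetic above concrete, here is a minimal sketch of the score calculation; the point values of 2 for Achieved, 1 for In Progress, and 0 for Not Started are an assumption for illustration rather than quoted from the checklist itself.

```python
# Minimal sketch of the TIC total-score arithmetic described above.
# Assumes 2 points for Achieved, 1 for In Progress, 0 for Not Started.
def tic_summary(achieved: int, in_progress: int, total_items: int = 22):
    not_started = total_items - achieved - in_progress
    points_earned = 2 * achieved + 1 * in_progress + 0 * not_started
    points_possible = 2 * total_items
    return {
        "pct_achieved": 100 * achieved / total_items,
        "pct_in_progress": 100 * in_progress / total_items,
        "total_score_pct": 100 * points_earned / points_possible,
    }

# Example: 15 items achieved, 4 in progress, 3 not started.
print(tic_summary(achieved=15, in_progress=4))
# -> roughly 68% achieved, 18% in progress, total score about 77%
```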
20
Benchmarks of Quality (BoQ)
– Completed by the building leadership team, including the coach
– Completed annually, typically in the spring
– Requires ≈ 45 minutes to complete
21
Benchmarks of Quality (BoQ) Critical Elements:
– PBS Team
– Faculty Commitment
– Effective Procedures for Dealing with Discipline
– Data Entry & Analysis Plan
– Expectations & Rules Developed
– Rewards/Recognition Program Established
– Lesson Plans for Teaching Expectations/Rules
– Implementation Plan
– Classroom Systems
– Evaluation
22
Note: Criterion Score = 70%
24
Why the SET?
– Provides an evaluation of the critical features of PBIS (universal systems).
– Results are based on evidence/data collected from interviews with the principal, staff, and students, and from reviews of permanent products (listed on the SET form).
– External evaluation eliminates the bias often inherent in perception data.
25
What is the SET?
The School-wide Evaluation Tool (SET) is designed to assess and evaluate the critical features of school-wide effective behavior support across each academic school year. The SET results are used to:
– assess features that are in place,
– determine annual goals for school-wide effective behavior support,
– evaluate ongoing efforts toward school-wide behavior support,
– design and revise procedures as needed, and
– compare efforts toward school-wide effective behavior support from year to year.
26
SET: 28 items total. Each item lists a Critical Feature, a Question, a Data Source, and a Score.
27
SET Results
28
PBIS-SAS, TIC, BOQ, and SET
PBIS-SAS
– What it measures: Schoolwide, Classroom, Non-Classroom, and Individual Student systems
– Who completes it: All staff, individually, online
– Depth: Moderate
PBIS-TIC
– What it measures: Universal PBIS
– Who completes it: Leadership team
– Depth: Brief (22 items, designed for frequent use)
BOQ
– What it measures: Universal PBIS
– Who completes it: Leadership team
– Depth: Extensive (53 items, designed for annual use, scoring criteria)
SET
– What it measures: Universal PBIS
– Who completes it: External evaluator
– Depth: Extensive (review of permanent products, interviews, scoring rubric)
29
Benchmarks for Advanced Tiers (BAT)
The BAT is designed to answer three questions:
1. Are the foundational elements in place for implementing secondary and tertiary behavior support practices?
2. Is a Tier 2 support system in place?
3. Is a Tier 3 system in place?
30
BAT Sections (items, focus, and when you will complete them)
– Items 1-3: A. Implementation of Schoolwide PBIS (today)
– Items 4-6: B. Commitment (today)
– Items 7-10: C. Student Identification (today)
– Items 11-12: D. Monitoring and Evaluation (today)
– Items 13-17: E. Tier 2 Support Systems (today)
– Items 18-27: F. Main Tier 2 Strategy: Implementation (today)
– Items 28-31: G. Main Tier 2 Strategy: Monitoring and Evaluation (today)
– Items 32-43: H. Tier 3: Intensive Support Systems (February)
– Items 44-53: I. Tier 3: Assessment and Plan Development (February)
– Items 54-56: J. Tier 3: Monitoring and Evaluation (February)
31
BAT Results
32
Critical Features of Student Outcome Data for Behavior
Schoolwide questions:
– Is there a problem?
– What is the problem behavior?
– When is the problem happening?
– Where is the problem happening?
– Who is engaging in problem behavior?
33
www.swis.org
34
Features of SWIS
– Reports discipline data only: major office referrals, minor discipline offenses, suspensions and expulsions
– Highly efficient (about 30 seconds per referral)
– Local control (building level)
– Formatted for decision-making (pictures)
– Information is available continuously
– Confidential and secure
– Can be combined with the district database
36
New SWIS Report: Average Referrals Per Day Per Month (with national data lines)
37
SWIS Summary 2009-10 (Majors Only)
4,019 schools; 2,063,408 students; 1,622,229 ODRs
Grade Range | Number of Schools | Mean Enrollment per School | Median ODRs per 100 Students per School Day
K-6 | 2,565 | 452 | 0.22
6-9 | 713 | 648 | 0.50
9-12 | 266 | 897 | 0.68
K-(8-12) | 474 | 423 | 0.42
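For reference, the rate metric in the last column can be computed from a school's own referral counts. Here is a minimal sketch of that calculation; the example figures (referral count, enrollment, school days) are made up for illustration.

```python
# Minimal sketch of the "ODRs per 100 students per school day" metric
# shown in the table above. The example figures below are made up.
def odrs_per_100_per_day(total_odrs: int, enrollment: int, school_days: int) -> float:
    odrs_per_day = total_odrs / school_days          # average referrals per school day
    return odrs_per_day / enrollment * 100           # rate per 100 students

# Example: 180 major ODRs in a school of 450 students over 170 school days.
rate = odrs_per_100_per_day(total_odrs=180, enrollment=450, school_days=170)
print(f"{rate:.2f} ODRs per 100 students per school day")  # about 0.24
```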
38
SWIS Reports: Referrals by Problem Behavior
39
SWIS Reports: Referrals by Location
40
SWIS Reports: Referrals by Student
41
SWIS Reports: Referrals by Time
42
Critical Features of Student Outcome Data for Behavior
Tier 2 questions:
– How many students have additional needs for behavior support?
– How many students who need support are accessing interventions in a timely way?
– Are the interventions working for all students?
– Are the interventions working for individual students?
43
Schoolwide Information System Check-in/Check-out (SWIS-CICO)
47
[SWIS-CICO report annotations: maximum number of school days in the reporting period; number of days the student has data for each period of the reporting period; period number]
50
Tier 2/3 Tracking Tool: access to ANY Tier 2 intervention and access specific to CICO (example figures from the screenshot: 6 of 8 = 75%; 8 of 12 = 66%).
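As a minimal sketch of the kind of access percentage the tracking tool reports, the calculation below compares the students identified as needing Tier 2 support against those actually receiving it; the student lists and function name are hypothetical, not part of the tool.

```python
# Minimal sketch: what share of the students identified as needing Tier 2
# support are actually receiving an intervention. Hypothetical student IDs.
def percent_with_access(identified, receiving):
    if not identified:
        return 0.0
    return 100 * len(set(identified) & set(receiving)) / len(set(identified))

identified = {"s01", "s02", "s03", "s04", "s05", "s06", "s07", "s08"}
receiving_any_tier2 = {"s01", "s02", "s03", "s04", "s05", "s06"}

print(f"{percent_with_access(identified, receiving_any_tier2):.0f}% have access")  # 75%
```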
51
Critical Features of Student Outcome Data for Behavior
Tier 3 questions:
– How many students have intensive needs for behavior support?
– How many students who need support are accessing interventions in a timely way?
– Are the interventions working for all students?
– Are the interventions working for individual students?
52
Individual Student Information System (ISIS-SWIS)
Screenshots of what is coming this time next year.
60
Tier 2/3 Tracking Tool
– Access to Tier 3 support
– Impact of function-based support
61
Embedding Behavioral Work within the School Improvement Process
62
Critical to Know How Behavior Relates to Academic Outcomes
– Research on the relation between behavior and academics/reading
– What that means in our school
– Ensuring that we are running integrated behavior and academic systems, not parallel systems
63
Breaking Down Your Plan
– A goal breaks down into objectives, each objective into strategies, and each strategy into activities.
– The goal rests on a gap statement and a cause for the gap, drawn from the school data profile.
– Fidelity data and student outcome data each fit at specific points within this structure.
66
Resources
67
www.pbis.org
68
Resources from Other States
– Maryland
– Florida
– Colorado
– Missouri
– Illinois
69
www.cenmi.org/miblsi
70
Available Resources
– Measurement Pages
– Measurement Schedules
– Facilitator Guides for Studying Data
– Videos
– Links to Other Resources