Using Data to Evaluate PBIS Implementation and Student Outcomes Anna Harms.

Agenda
–Quick Review of PBIS
–Types of Data
–Using Data Across Multiple Tiers of Support within PBIS & Review of Tools
–How PBIS Can Fit within School Improvement
–Review of Resources

Setting Group Expectations
Be Responsible
–Attend to the "come back together" signal
–Active participation and engagement
Be Respectful
–Please allow others to listen
–Please turn off cell phones and pagers
–Please limit sidebar conversations
Be Safe
–Take care of your own needs

Core Principles of PBIS
1. We can effectively teach appropriate behavior to all children.
2. Intervene early.
3. Use a multi-tier model of service delivery.
4. Use research-based, scientifically validated interventions to the extent available.
5. Monitor student progress to inform interventions.
6. Use data to make decisions.
7. Use assessment for three different purposes.

–Define expectations
–Teach expectations in settings
–Monitor students to see if expected behaviors are followed
–Acknowledge students for demonstrating expectations with specific feedback
–Correct misbehavior
–Use data to determine strengths and areas of need

Stages of Implementation
Exploration/Adoption – Develop commitment at the ISD and LEA level.
Installation – Establish ISD leadership team and implementation team; set up data systems; audit current resources and capacity; plan and prepare for the work.
Initial Implementation – Try out the practices, work out details, and learn from and provide significant support to implementers.
Elaboration – Expand the program/practices to other locations, individuals, and times; adjust based on learning from initial implementation.
Continuous Improvement/Regeneration – Make it easier and more efficient; embed within current practices.
Should we do it? Work to do it right! Work to do it better!

Types of Data
We want to gather information that tells us:
–How well we are implementing/doing something: systems/process/fidelity data
AND
–Whether what we're doing is working: student outcome data

Your Turn
Complete the Audit of Measures and Data Systems Used to Support PBIS.

A Few Examples of Tools
Tier 1
–Systems/Process/Fidelity: PBIS Self-Assessment Survey (PBIS-SAS); PBIS Team Implementation Checklist (PBIS-TIC); Benchmarks of Quality (BOQ); Schoolwide Evaluation Tool (SET)
–Student Outcomes: Schoolwide Information System (SWIS)
Tier 2
–Systems/Process/Fidelity: Benchmarks for Advanced Tiers (BAT)
–Student Outcomes: Schoolwide Information System Check-in/Check-out (SWIS-CICO)
Tier 3
–Systems/Process/Fidelity: Benchmarks for Advanced Tiers (BAT)
–Student Outcomes: Individual Student Information System-Schoolwide Information System (ISIS-SWIS)

Systems/Process/Fidelity Measures
–Tell us about our implementation fidelity: are we doing what we said we would do, when and how we said we would do it?
–Relate to the big ideas of a practice.
–When engaged in systems change, we will likely see changes in adult behavior before we see changes in student behavior. Systems/process/fidelity data help us know early on whether we are on the right track, and can reinforce our initial work.
–Having this information helps us accurately interpret our student outcomes.

Strengths and Weaknesses of Self-Assessment
Strengths:
–Self-evaluation can prompt improved implementation on its own.
–Typically faster to complete and less complex.
Weaknesses:
–Self-perceptions might not always be accurate (lacking or changing background knowledge).
–Intentional or unintentional inflation or deflation of scores.

A free website where you can download copies of the measures, enter your school's scores online, and view graphs of your results.

PBIS Self-Assessment Survey (SAS)
–A systems/process measure that examines implementation fidelity of PBIS.
–Four sections: Schoolwide Systems; Classroom Systems; Nonclassroom Systems; Individual Student Systems.
–All staff complete it individually online.
–For each item, staff rate the degree to which it is in place (scale of 0-1), then rate whether the item is a priority for improvement (scale of 0-1).
–Takes minutes to complete.

Individual Summaries

Team Checklist Version 3

Effective Behavior Support (EBS) Team Implementation Checklist (TIC)
–Completed quarterly as a team
–Takes approximately 10 minutes to complete
–Entered online from the group-completed paper copy
–Used as a planning guide for successful schoolwide PBIS implementation

Components of the Team Implementation Checklist
1. Establish commitment
2. Establish and maintain the team
3. Self-assessment (fidelity and outcomes)
4. Schoolwide expectations
5. Classroom behavior support systems
6. Establish an information system
7. Build capacity for function-based support
This checklist provides a scope and sequence for implementing schoolwide behavior supports. Schools shouldn't expect to see change in student behavior until at least 80% of the items on the checklist are achieved.

–# of items fully implemented (achieved) / 22 total items
–# of items partially implemented (in progress) / 22 total items
–Total points = total points earned / total possible points
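The scoring arithmetic above can be sketched in a few lines of Python. This is an illustration, not the official scoring tool: the item statuses below are made up, and the 2/1/0 point values for achieved / in progress / not started follow the earned-points-over-possible-points formula on the slide.

```python
# Illustrative sketch of Team Implementation Checklist (TIC) scoring.
# Point values and the sample statuses are assumptions for this example.
POINTS = {"achieved": 2, "in progress": 1, "not started": 0}

def tic_summary(statuses):
    """Return (% achieved, % in progress, % of total points) for the items."""
    n = len(statuses)
    achieved = sum(s == "achieved" for s in statuses)
    in_progress = sum(s == "in progress" for s in statuses)
    earned = sum(POINTS[s] for s in statuses)
    return (100 * achieved / n,
            100 * in_progress / n,
            100 * earned / (2 * n))

# A hypothetical team's 22 item statuses:
statuses = ["achieved"] * 18 + ["in progress"] * 3 + ["not started"]
pct_achieved, pct_partial, pct_points = tic_summary(statuses)

# The slide's rule of thumb: expect change in student behavior only
# once at least 80% of checklist items are achieved.
criterion_met = pct_achieved >= 80
```

With 18 of 22 items achieved, the team is just over the 80% criterion even though total points tell a slightly rosier story, which is why the checklist reports both figures.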

Benchmarks of Quality (BoQ)
–Completed by the building leadership team, including the coach
–Completed annually, typically in the spring
–Requires ≈ 45 minutes to complete

Benchmarks of Quality (BoQ) Critical Elements:
–PBS Team
–Faculty Commitment
–Effective Procedures for Dealing with Discipline
–Data Entry & Analysis Plan
–Expectations & Rules Developed
–Rewards/Recognition Program Established
–Lesson Plans for Teaching Expectations/Rules
–Implementation Plan
–Classroom Systems
–Evaluation

Note: Criterion Score = 70%

Why the SET?
–Provides an evaluation of the critical features of PBIS (universal systems)
–Results are based on evidence/data collected from interviews with the principal, staff, and students, and from reviews of permanent products (listed on the SET form)
–External evaluation eliminates the bias often inherent in perception data

What is the SET?
The School-wide Evaluation Tool (SET) is designed to assess and evaluate the critical features of school-wide effective behavior support across each academic school year. The SET results are used to:
–assess features that are in place,
–determine annual goals for school-wide effective behavior support,
–evaluate on-going efforts toward school-wide behavior support,
–design and revise procedures as needed, and
–compare efforts toward school-wide effective behavior support from year to year.

SET
–28 items total
–Each item records: Critical Feature, Question, Data Source, Score

SET Results

PBIS-SAS, TIC, BOQ, and SET
PBIS-SAS
–What it measures: Schoolwide, Classroom, Non-Classroom, Individual
–Who completes it: All staff, individually online
–Depth: Moderate
PBIS-TIC
–What it measures: Universal PBIS
–Who completes it: Leadership team
–Depth: Brief (22 items, designed for frequent use)
BOQ
–What it measures: Universal PBIS
–Who completes it: Leadership team
–Depth: Extensive (53 items, designed for annual use, scoring criteria)
SET
–What it measures: Universal PBIS
–Who completes it: External evaluator
–Depth: Extensive (review of permanent products, interviews, scoring rubric)

Benchmarks for Advanced Tiers (BAT)
The BAT is designed to answer three questions:
1. Are the foundational elements in place for implementing secondary and tertiary behavior support practices?
2. Is a Tier 2 support system in place?
3. Is a Tier 3 system in place?

BAT Sections (Items – Section – When you will complete it)
1-3 – A. Implementation of Schoolwide PBIS – Today
4-6 – B. Commitment – Today
7-10 – C. Student Identification – Today
11-12 – D. Monitoring and Evaluation – Today
13-17 – E. Tier 2 Support Systems – Today
18-27 – F. Main Tier 2 Strategy: Implementation – Today
28-31 – G. Main Tier 2 Strategy: Monitoring and Evaluation – Today
32-43 – H. Tier 3: Intensive Support Systems – February
44-53 – I. Tier 3: Assessment and Plan Development – February
54-56 – J. Tier 3: Monitoring and Evaluation – February

BAT Results

Critical Features of Student Outcome Data for Behavior
Schoolwide Questions:
–Is there a problem?
–What is the problem behavior?
–When is the problem happening?
–Where is the problem happening?
–Who is engaging in problem behavior?
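Each of the schoolwide questions above amounts to a frequency count over one field of the referral log. A minimal sketch in Python, using hypothetical records whose field names are loosely modeled on what a discipline referral captures (student, behavior, location, time):

```python
from collections import Counter

# Hypothetical office-discipline-referral records; the field names and
# values are illustrative, not an actual SWIS export format.
referrals = [
    {"student": "A", "behavior": "disruption", "location": "classroom",  "time": "10:00"},
    {"student": "B", "behavior": "defiance",   "location": "playground", "time": "12:15"},
    {"student": "A", "behavior": "disruption", "location": "cafeteria",  "time": "12:20"},
]

# One counter per question:
by_behavior = Counter(r["behavior"] for r in referrals)   # What is the problem behavior?
by_location = Counter(r["location"] for r in referrals)   # Where is it happening?
by_student  = Counter(r["student"]  for r in referrals)   # Who is engaging in it?
by_hour     = Counter(r["time"][:2] for r in referrals)   # When is it happening?
```

Sorting any of these counters by count answers the corresponding question, which is essentially what the per-behavior, per-location, per-student, and per-time graphs in the following slides display.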

Features of SWIS
–Only reports discipline data: major office referrals, minor discipline offenses, suspensions, and expulsions
–Highly efficient (≈ 30 seconds per referral)
–Local control (building level)
–Formatted for decision making (pictures)
–Information is available continuously
–Confidential and secure
–Can be combined with the district database

New SWIS Report Average Referrals Per Day Per Month (National data lines)

SWIS summary (majors only): 4,019 schools; 2,063,408 students; 1,622,229 ODRs
Table columns: Grade Range | Number of Schools | Mean Enrollment per School | Median ODRs per 100 Students per School Day (row values by grade range, including K-(8-12), not recoverable).
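The two normalized rates that appear in these SWIS reports, average referrals per day per month and ODRs per 100 students per school day, are simple divisions. A sketch with made-up numbers (the counts, enrollments, and day totals below are illustrative, not taken from the national table):

```python
# Normalizing raw ODR counts so months of different lengths and schools
# of different sizes can be compared. All input values are illustrative.

def referrals_per_day_per_month(monthly_odrs, school_days):
    """Average referrals per school day, computed month by month."""
    return {m: monthly_odrs[m] / school_days[m] for m in monthly_odrs}

def odrs_per_100_per_day(total_odrs, enrollment, total_school_days):
    """ODRs per 100 students per school day, the metric in the table above."""
    return (total_odrs / total_school_days) / (enrollment / 100)

rates = referrals_per_day_per_month(
    monthly_odrs={"Sep": 40, "Oct": 66},
    school_days={"Sep": 20, "Oct": 22},
)
rate = odrs_per_100_per_day(total_odrs=450, enrollment=500, total_school_days=180)
```

Dividing by school days keeps a short month (or a month with breaks) from looking artificially calm, and dividing by enrollment lets a small elementary school compare itself against the national data lines.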

SWIS Reports: Referrals by Problem Behavior

SWIS Reports: Referrals by Location

SWIS Reports: Referrals by Student

SWIS Reports: Referrals by Time

Critical Features of Student Outcome Data for Behavior
Tier 2 Questions:
–How many students have additional needs for behavior support?
–How many students who need support are accessing interventions in a timely way?
–Are the interventions working for all students?
–Are the interventions working for individual students?

Schoolwide Information System Check-in/Check-out (SWIS-CICO)

Report legend:
–Maximum # of school days in the reporting period
–Number of days the student has data for each period of the reporting period
–Period #
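The core arithmetic behind a CICO summary like this is the percent of possible points a student earned, per day and across the reporting period. A minimal sketch; the daily point values and the 80% daily goal are assumptions for illustration (80% is a common CICO convention, but each school sets its own goal):

```python
# Illustrative CICO point summary. Daily (earned, possible) pairs and the
# 80% goal line are assumptions, not data from the slide.

def cico_percent(points_earned, points_possible):
    """Percent of possible CICO points earned."""
    return 100 * points_earned / points_possible

daily = [(22, 30), (27, 30), (30, 30)]  # one pair per day the student has data

period_pct = cico_percent(sum(e for e, _ in daily),
                          sum(p for _, p in daily))
days_meeting_goal = sum(cico_percent(e, p) >= 80 for e, p in daily)
```

Only days with data count toward the summary, which is why the report legend distinguishes the maximum number of school days in the period from the number of days the student actually has data.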

Tier 2/3 Tracking Tool: percent of students with access to ANY Tier 2 intervention, and percent specific to CICO

Critical Features of Student Outcome Data for Behavior
Tier 3 Questions:
–How many students have intensive needs for behavior support?
–How many students who need support are accessing interventions in a timely way?
–Are the interventions working for all students?
–Are the interventions working for individual students?

Individual Student Information System (ISIS-SWIS)
Screenshots of what is coming this time next year.

Tier 2/3 Tracking Tool: access to Tier 3 support, and impact of function-based support

Embedding Behavioral Work within the School Improvement Process

Critical to Know How Behavior Relates to Academic Outcomes
–Research on the relation between behavior and academics/reading
–What that means in our school
–Ensuring that we are running integrated behavior and academic systems, not parallel systems

Breaking Down Your Plan (diagram)
Goal
–Objective
––Strategy
–––Activity
––Strategy
–––Activity
–Objective
––Strategy
–––Activity
Annotations: Gap Statement and Cause for Gap; School Data Profile; Fidelity Data Fits; Student Outcome Data Fits
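The plan hierarchy above can be thought of as nested data, with fidelity measures attached where implementation is the question and outcome measures attached at the goal level. A minimal sketch; every name and measure in it is invented for illustration:

```python
# Hypothetical school-improvement plan as nested data. Goal, objectives,
# and measure names are invented; the shape mirrors the diagram above.
plan = {
    "goal": "Reduce office discipline referrals by 20%",
    "outcome_measure": "SWIS ODRs",          # student outcome data fits here
    "objectives": [
        {
            "objective": "Implement Tier 1 PBIS with fidelity",
            "fidelity_measure": "BoQ",       # fidelity data fits here
            "strategies": [
                {
                    "strategy": "Teach schoolwide expectations",
                    "activities": [
                        "Develop lesson plans for each expectation",
                        "Schedule booster sessions",
                    ],
                },
            ],
        },
    ],
}
```

Keeping the two kinds of measures attached to the levels they inform is one way to ensure behavior work is embedded in, rather than parallel to, the school improvement plan.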

Resources

Resources from Other States
–Maryland
–Florida
–Colorado
–Missouri
–Illinois

Available Resources
–Measurement Pages
–Measurement Schedules
–Facilitator Guides for Studying Data
–Videos
–Links to Other Resources