Module 1, Part 1: Using Data to Problem-Solve Implementation Issues. Session 2, Phase I Team Training. Presented by the MBI Consultants.



Acknowledgements
• Much of the content and many of the ideas in this workshop stem from the work of others.
• Special thanks to the work of Tim Lewis, George Sugai, Rob Horner, Lori Newcomer, the professors at the University of Oregon, the National Center on Positive Behavioral Interventions and Supports, and the Quaglia Institute for Student Aspirations.

OUR EXPECTATIONS FOR TRAINING
• BE RESPONSIBLE
- Make yourself comfortable and take care of your needs
- Address the question/activity in group time before discussing "other" topics
- Use your team time wisely
- Return promptly from breaks
• BE RESPECTFUL
- Turn cell phones to "off" or "vibrate"
- Listen attentively to others
• BE PREPARED
- Ask questions when something is unclear
- Be an active participant

ATTENTION SIGNAL
• Trainer will raise his/her hand
• Participants will raise their hands and wait quietly

PHASE I TRAINING MATRIX

Session 1 (Fall)
- Systems: MBI Philosophy/Aspirations; Team Process and Responsibilities; Faculty Commitment
- Practices: 3-5 Expectations; Universal Teaching Matrix; Reinforcement Systems Promoting Universals
- Data: Implementation Data (TIC, SAS, MBI Blueprint, My Voice Survey)

Session 2 (Winter)
- Systems: Review Components; Problem-Solving Implementation Issues; Team-Initiated Problem Solving (TIPS); Resources for Teams
- Practices: Teaching the Universals and Developing Lesson Plans; Active Supervision; Consequence Systems; Focus Groups
- Data: Using Implementation Data; System-Level Outcome Data (ODRs, SWIS Big 5 Generator, SSARB, My Voice Reports, Focus Groups, other data tools)

Session 3 (Fall)
- Systems: Review Components; Problem-Solving Implementation Issues (SET, SAS, TIC); Getting Everybody on Board
- Practices: Interpreting "My Voice"; Student Aspirations Team (MBI Youth Day)
- Data: Problem-Solving Outcome Data; Reviewing Your ODRs (Big 5 Generator, SWIS, SSARB)

FORMAT OF PRESENTATIONS
• Lecture with slides
• Workbook/work time & action plan development
• Workbook guides
Activity =   Please read =

Critical Components of MBI
• Commit to a common purpose and approach to discipline: creating a safe and welcoming culture that includes student voice and family/community involvement
• Establish and maintain team… with administrator support, participation, and leadership
• Establish a clear set of positive expectations and behaviors
• Establish procedures for teaching expected behavior
• Establish a continuum of procedures for encouraging expected behaviors
• Establish a continuum of procedures for discouraging inappropriate behaviors
• Establish a system for using data to make decisions, progress monitor, and problem-solve

MBI DATA
• Use data to assess current status
- Self-Assessment Survey (SAS)
• Use data to assess implementation fidelity
- Team Implementation Checklist (TIC)
- School-wide Evaluation Tool (SET)
- Benchmarks of Quality (BoQ)
• Use data to assess impact (outcomes) on students
- Office Discipline Referrals (ODRs)
(PBISApps.org)

Summary of Currently Available Online Tools
• External Evaluation/Research:
- SET for Tier 1
- ISSET for Tiers 2 and 3
• Internal Evaluation/Action Planning:
- BAT for Tiers 2 and 3
- TIC, mostly for Tier 1, with some items on Tiers 2 and 3
- BoQ for Tier 1
- Early Childhood Benchmarks of Quality (ECBoQ) for Tier 1
- SAS, mostly for Tier 1, with some items on Tiers 2 and 3
- Monitoring Advanced Tiers Tool (MATT)

Main Ideas
• Decisions are more likely to be effective and efficient when they are based on data.
• The quality of decision-making depends most on the first step: defining the problem to be solved.
• Define problems with precision and clarity.

Main Ideas
• Data help us ask the right questions… they do not provide the answers. Use data to:
- Identify problems
- Refine problems
- Define the questions that lead to solutions
• Data help place the "problem" in the context rather than in the students.

Main Ideas
• The process a team uses to "problem solve" is important:
• Roles: Facilitator, Recorder, Data Analyst, Active Member
• Organization
- Agenda: old business (did we do what we said we would do?), new business, action plan for decisions
• What happens BEFORE a meeting: agenda, data summary
• What happens DURING a meeting: updates, identify problem, problem-solve
• What happens AFTER a meeting: minutes posted, tasks completed

TEAM PROBLEM-SOLVING METHOD

TIPS II Meeting Minutes and Problem-Solving Action Plan Form

Today's Meeting: Winter MBI Training (date, time, location); Facilitator; Minute Taker; Data Analyst
Next Meeting: date, time, location; Facilitator; Minute Taker; Data Analyst
Team Members (bold are present today): ________________________________________________________________

Administrative/General Information and Issues
- Information for Team, or Issue for Team to Address | Discussion/Decision/Task (if applicable) | Who? | By When?

Problem-Solving Action Plan: Implementation and Evaluation
- Precise Problem Statement, based on review of data (What, When, Where, Who, Why)
- Solution Actions (e.g., Prevent, Teach, Prompt, Reward, Correction, Extinction, Adaptations, Safety) | Who? | By When?
- Goal, Timeline, Decision Rule, & Updates
- Fidelity of Implementation measure: Not Started / Partially Implemented / Implemented with Fidelity / Done
- Effectiveness of Solution/Plan: Goal Met / Better / Same / Worse

Agenda for Today:
1. Analyze TIC and SAS results to determine current level of implementation.
Previously Defined Problems/Solutions (Update): 1, 2
Agenda for NEXT Meeting
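For teams that keep these minutes electronically, the problem-solving section of the form maps naturally onto a simple record structure. The sketch below is a hypothetical Python illustration (the class and field names are mine, not part of the TIPS materials), assuming one record per precise problem statement with its solution actions and status fields.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch of one TIPS problem-solving entry.
# Field names follow the form above; this is illustrative, not official TIPS software.
@dataclass
class SolutionAction:
    description: str   # e.g., a Prevent, Teach, Prompt, Reward, Correction, Extinction, or Safety action
    who: str           # person responsible
    by_when: str       # target date

@dataclass
class ProblemSolvingEntry:
    what: str          # precise problem statement: What
    when: str          # When
    where: str         # Where
    who: str           # Who
    why: str           # Why
    solution_actions: List[SolutionAction] = field(default_factory=list)
    goal_timeline_decision_rule: str = ""
    fidelity: str = "Not started"   # Not started / Partially implemented / Implemented with fidelity / Done
    effectiveness: str = ""         # Goal met / Better / Same / Worse

# Example entry a team might record during a meeting
entry = ProblemSolvingEntry(
    what="Disrespect referrals",
    when="after lunch",
    where="hallways",
    who="6th graders",
    why="to gain peer attention",
    solution_actions=[SolutionAction("Re-teach hallway expectations", "Grade 6 team", "next Friday")],
)
print(entry.what, entry.fidelity)
```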

Decision-making at many levels
• Whole school
• Small groups or school areas
• Individual student
• Same basic process
[SWIS screenshot]

Implementation Measures
• Interpreting the TIC and SAS
• Early Childhood Benchmarks of Quality (ECBoQ)*

Team Implementation Checklist, Version 3.1
• Sugai, G., Horner, R. H., Lewis-Palmer, T., & Rossetto Dickey, C. (2011). Team Implementation Checklist, Version 3.1. University of Oregon.
• The TIC consists of numbered items that focus on the universal level of prevention (Tier 1), although it includes 3 items to assess progress toward implementing Tiers 2 and 3.
• For example, item 20 asks whether "Personnel are able to provide behavior expertise for students needing Tier II and Tier III support."

Does your team need to complete a current TIC? If so, complete it now!
Pick one item that is "In Progress" or "Not Yet Started" and develop an action statement for that item (who will do what by when).
*Remember: this goes on your TIPS Meeting Form.*

Self-Assessment Survey (SAS)
• Todd, A. W., Sugai, G., & Horner, R. H. (2003). Effective Behavior Support (EBS) Survey. University of Oregon.
• The SAS is an annual assessment used by schools to identify staff perceptions of implementation status and improvement priority for school-wide, classroom, non-classroom, and individual student systems. Results of the SAS are effective in identifying staff priorities for action planning.

COMMON FEATURES: TIC AND SAS
• Establish Commitment
• Establish and Maintain Team
• Self-Assessment
• Prevention Systems (defining and teaching expectations, rewarding appropriate behavior, and responding to violations)
• Classroom System (a new category)
• Information System
• Build Capacity for Function-based Support

Graphics from PBISApps: Team Implementation Checklist Total Score (Percentage of Points)
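To make the "Percentage of Points" label concrete, here is a minimal sketch of how such a total score could be computed, assuming each TIC item is scored 0 (Not Yet Started), 1 (In Progress), or 2 (Achieved); the function and the example scores are hypothetical, not PBISApps code.

```python
# Minimal sketch: TIC total score as a percentage of possible points,
# assuming each item is scored 0 (Not Yet Started), 1 (In Progress), or 2 (Achieved).
def tic_total_score(item_scores):
    max_points = 2 * len(item_scores)          # every item can earn at most 2 points
    return 100 * sum(item_scores) / max_points

# Example: a team partway through Tier 1 implementation
scores = [2, 2, 1, 0, 2, 1, 1, 2]
print(f"TIC total score: {tic_total_score(scores):.0f}% of points")  # 69% of points
```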

TIC Subscale Report
Subscales shown: Regular Meetings, Classroom, Discipline Data, Intensive Individual

Team Implementation Checklist Items: Establish & Maintain Team
3. Team established (representative)
4. Team has regular meeting schedule, effective operating procedures: 2
5. Audit is completed for efficient integration of team with other teams/initiatives addressing behavior support: 1

SAS: Did you remember to bring yours?
• Measures implementation level and priority for improvement using 46 items across four systems:
- School-wide (18 items)
- Specific Setting (9 items)
- Classroom (11 items)
- Individual Student (8 items)
• Often used as part of staff development with initial PBIS training.
• Measures "Priority for Improvement" as well as staff perceptions of level of implementation ("In Place," "Partially in Place," or "Not in Place").
• Many schools continue to use this as a measure of progress over time.
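Because the SAS reports that follow summarize individual staff responses as percentages, a small illustration of that aggregation may help. The sketch below is hypothetical (the function name and sample ratings are mine), assuming each staff member rates an item's current status as "In Place," "Partially in Place," or "Not in Place."

```python
from collections import Counter

# Hypothetical sketch: summarize staff ratings for one SAS item as percentages.
# Assumes each respondent rated the item "In Place", "Partially in Place", or "Not in Place".
def summarize_item(ratings):
    counts = Counter(ratings)
    total = len(ratings)
    return {status: round(100 * counts[status] / total)
            for status in ("In Place", "Partially in Place", "Not in Place")}

# Example: 20 staff responses for a single item
ratings = ["In Place"] * 13 + ["Partially in Place"] * 5 + ["Not in Place"] * 2
print(summarize_item(ratings))
# {'In Place': 65, 'Partially in Place': 25, 'Not in Place': 10}
```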

Graphics from PBISApps: Self-Assessment Survey, Overall Status
Current status: In Place 59%, Partial 35%, Not in Place 6%
Improvement priority: High 19%, Medium 31%, Low 50%

Subscale Report (11/11/2011)
Expectations Defined: 97%
Expectations Taught: 100%
Reward System: 91%
Violations System: 76%
Monitoring: 80%
Management: 67%
District Support: 64%
Implementation Average: 77%

Item Analysis (System: Schoolwide)
Current Status (In Place / Partial / Not in Place) and Improvement Priority (High / Medium / Low), by feature:
1. A small number (e.g., 3-5) of positively and clearly stated student expectations or rules are defined. Status: 96% / 4% / 0%. Priority: 21% / 17% / 63%.
2. Expected student behaviors are taught directly. Status: 65% / 25% / 10%. Priority: 44% / 28% / …
5. Consequences for problem behaviors are defined clearly. Status: 50% / 46% / 4%. Priority: 28% / 40% / 32%.
6. Distinctions between office- vs. classroom-managed problem behaviors are clear. Status: 38% / … / 25%. Priority: 33% / 29% / 38%.

Self-Assessment Survey Report Analysis: Schoolwide System Only
School:            Date:
Use the worksheet below to discuss the results of your school's Self-Assessment Survey.
1. Self-Assessment Survey Report Analysis
2. Review and Update Team Action Plan
3. How will you share the SAS results with staff? (Add to your team action plan.)

Color Code Key for Current Status
- White = 80% or more of staff marked the item as "In Place."
- Yellow = 51-79% of staff marked the item as "In Place."
- Red = 0-50% of staff marked the item as "In Place."

Red Review: Review the features in red and determine 2 features that staff identified as "High" priority. List here:
Yellow Review: Review the yellow features and identify 1 or 2 features the team could address to easily improve implementation. List here:
White Review: Review the white features and identify 1 or 2 features that your school can celebrate.
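The color-code thresholds above amount to a simple decision rule. Here is a minimal sketch that applies them to each item's "In Place" percentage; the function name and the sample data are hypothetical, for illustration only.

```python
# Minimal sketch: apply the SAS color-code key to an item's "In Place" percentage.
# White = 80% or more, Yellow = 51-79%, Red = 0-50% (thresholds from the key above).
def status_color(percent_in_place):
    if percent_in_place >= 80:
        return "White"
    if percent_in_place >= 51:
        return "Yellow"
    return "Red"

# Example: flag items for the White, Yellow, and Red reviews
items = {
    "Expectations are defined": 96,
    "Expected behaviors are taught directly": 65,
    "Office vs. classroom distinctions are clear": 38,
}
for feature, pct in items.items():
    print(f"{feature}: {pct}% In Place -> {status_color(pct)}")
```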

Work Time
Complete the "Problem-Solving Implementation Issues – Part 1" section of your workbook.