SCHOOL-WIDE POSITIVE BEHAVIORAL INTERVENTIONS AND SUPPORT: ADDRESSING THE BEHAVIOR OF ALL STUDENTS Benchmarks of Quality KENTUCKY CENTER FOR INSTRUCTIONAL DISCIPLINE

Participant Expectations
Be Responsible
 Return promptly from breaks
 Be an active participant
 Use the law of two feet
Be Respectful
 Maintain cell phone etiquette
 Listen attentively to others
 Limit sidebars and stay on topic
Be Kind
 Enter discussions with an open mind
 Respond appropriately to others’ ideas

Training Objectives
To become familiar with the Benchmarks of Quality (BoQ) and understand why it is important to use it
To understand how to complete, submit, and use the BoQ

Why is Evaluation of PBIS Implementation Important?
To examine the fidelity of implementation
− “Are we really doing what we think we are doing?”
To document effectiveness of implementation
− “Is what we’re doing working?”
To identify and examine strengths and weaknesses of implementation
− Celebrate successes
− Identify areas to improve

PBIS Fidelity Measures
Team Implementation Checklist
 Completed during October team meeting
PBIS Self-Assessment Survey
 Completed between March 1 and April 30
Benchmarks of Quality
 Completed between February 1 and March 31

Team Implementation Checklist The Team Implementation Checklist (TIC) is a self-assessment completed by the school PBIS Leadership Team. It serves as a guide in appraising the status of PBIS start-up, team functioning, development of key components, and evaluation.

PBIS Self-Assessment Survey The PBIS Self-Assessment Survey (SAS) is used by school staff for initial and annual assessment of Positive Behavior Support systems in schools. The survey examines the status and need for improvement of four behavior support systems: (a) school-wide discipline systems, (b) non-classroom management systems, (c) classroom management systems, and (d) systems for individual students. Each question in the survey relates to one of the four systems. Survey results are summarized and used for a variety of purposes, including annual action planning, internal decision-making, assessment of change over time, increasing staff awareness, and team validation.

Benchmarks of Quality The Benchmarks of Quality (BoQ) is a research-validated measure that assesses the development and implementation of PBIS across 10 critical elements. The BoQ is completed annually by school PBIS Leadership Teams to assess strengths and identify areas of need. Results are used for action planning. A school that obtains a score of 70% is considered to be implementing universal systems at a minimal level.
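The arithmetic behind that criterion is straightforward: points earned across the items divided by points possible, compared against the 70% mark. The sketch below is an illustration added to this write-up, not part of the original training materials or the official BoQ scoring tools; the item scores and maximum total are hypothetical placeholders.

```python
# Minimal sketch, assuming the BoQ percentage is points earned divided by
# points possible. The item scores and maximum below are hypothetical
# placeholders, not official BoQ point values.

def boq_percentage(item_scores, max_possible):
    """Return the percentage of possible points earned across items."""
    return 100 * sum(item_scores) / max_possible

scores = [2, 3, 1, 0, 2]   # hypothetical points earned on individual items
max_points = 10            # hypothetical maximum possible for those items

pct = boq_percentage(scores, max_points)
print(f"BoQ score: {pct:.0f}% (criterion: 70%)")
print("At or above criterion" if pct >= 70 else "Below criterion")
```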

Benchmarks of Quality
Developed by Florida PBIS at the University of South Florida
Assesses development and implementation of school-wide PBIS
Completed by school teams annually to identify areas of strength and weakness
Used by states and districts to guide TA and training and to evaluate outcomes related to level of implementation

53 Items Aligned with PBIS Process Addressing 10 Critical Elements
1. PBIS team
2. Faculty commitment
3. Effective procedures for dealing with discipline
4. Data entry and analysis plan
5. Expectations and rules/procedures developed

53 Items Aligned with PBIS Process Addressing 10 Critical Elements
6. School-wide recognition system established
7. Lesson plans for teaching expectations and rules/procedures
8. Implementation plan
9. Classroom systems
10. Evaluation

Initial BoQ Validation Process
Expert review
Pilot studies in Florida and Maryland
Reliability: test-retest and inter-rater, both >.01
Concurrent validity: SET and ODRs
For more details see JPBI, Fall 2007

Completing the Benchmarks of Quality

Timeline
January – training for coach/designee on the BoQ
February to March – complete the BoQ assessment
By March 31 – enter BoQ scores into the school PBIS Assessment account

Two Methods for Completion
Method 1: Coach uses scoring rubric and team members use separate rating form (original way)
Method 2: Coach uses scoring rubric and team members also use scoring rubric (modified version)

Method 1 Steps
1. Coach completes rating using the Scoring Form and Scoring Rubric
2. Team members complete ratings using the Team Member Rating Form
3. Coach tallies most common responses from team and marks on Scoring Form
4. Coach determines areas of discrepancy and marks on Scoring Form
5. Team meets and resolves any discrepancies; determines final score
6. Findings are discussed with team; action plan items are generated

Method 1
Pros
 Less time for team members to complete ratings
 Allows for deeper discussion among team members
Cons
 Different scoring scale for coach vs. team members
 Determining discrepancies can be confusing
 Reviewing discrepancies can be time consuming

Method 2 Steps
1. Coach completes rating using the Scoring Form and Scoring Rubric
2. Team members complete rating using the Scoring Form and Scoring Rubric
3. Coach tallies most common responses from team and marks on Scoring Form
4. Findings are discussed with team; action plan items are generated

Method 2
Pros
 More streamlined, less confusing scoring method
 Everyone uses the same scoring form
 Eliminates the issue of discrepancies
Cons
 More time for team members to complete rating
 May result in less intense discussion by team about scores and meaning

Method 1: Specifics Step 1: Coach completes rating The Coach uses the Benchmarks of Quality Scoring Guide to score each of the 53 items on the Benchmarks of Quality Scoring Form

Method 1: Specifics Step 2: Team Member Ratings The Coach gives the Benchmarks of Quality Team Member Rating Form to each PBIS Leadership Team member to be filled out and returned to the Coach upon completion  All team members should complete their form independently

Method 1: Specifics Step 2: Team Member Ratings (continued) Team members should be instructed to rate each of the 53 items according to whether the component is In Place, Needs Improvement, or Not in Place. Team members place a check in the box to indicate their rating. Note: Some of the items relate to product and process development, others to action items; in order to be rated as In Place, the item must be developed and implemented (where applicable).

Method 1: Specifics Step 3: Record the Team Member Ratings Coach collects the Team Member Rating Forms and tallies the team’s most frequent response using the Tally Sheet. Record the most frequent response for each item on the Benchmarks of Quality Scoring Form in the Step 2 Column, using ++ for In Place, + for Needs Improvement, and – for Not In Place.
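Conceptually, the tally is just a count of each rating per item, with the most frequent rating carried forward. The sketch below is an illustration added here, not a replica of the official Tally Sheet; the team ratings used are hypothetical.

```python
from collections import Counter

# Symbols from the slide: ++ = In Place, + = Needs Improvement, - = Not In Place.
SYMBOL = {"In Place": "++", "Needs Improvement": "+", "Not In Place": "-"}

def most_frequent_rating(ratings):
    """Return the rating given most often by team members for one item."""
    # Ties return the first-encountered of the tied ratings here; a real team
    # would resolve a tie through discussion.
    return Counter(ratings).most_common(1)[0][0]

# Hypothetical team ratings for a single BoQ item.
item_ratings = ["In Place", "In Place", "Needs Improvement", "In Place"]
top = most_frequent_rating(item_ratings)
print(f"Most frequent response: {top} ({SYMBOL[top]})")
```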

Method 1: Specifics Step 4: Determine discrepancies In the Step 3 Column, place a check in the box where any discrepancies in scoring occur between the team and coach ratings

Discrepancy Guidelines
For items with a range of 0 to 1:
 1 = In Place
 0 = Needs Improvement or Not in Place
For items with a range of 0 to 2:
 2 = In Place
 1 = Needs Improvement
 0 = Not in Place
For items with a range of 0 to 3:
 3 = In Place
 2 or 1 = Needs Improvement
 0 = Not in Place
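These guidelines amount to a simple mapping from the coach’s numeric score to a rating category, followed by a comparison with the team’s most frequent rating. The sketch below encodes that logic as an added illustration; it is not an official scoring tool, and the item point ranges in the example calls are hypothetical.

```python
def coach_categories(score, max_points):
    """Team ratings consistent with the coach's numeric score, per the guidelines above."""
    if score == max_points:
        return {"In Place"}
    if score == 0:
        # On 0-to-1 items a score of 0 covers both of the lower categories.
        return {"Needs Improvement", "Not in Place"} if max_points == 1 else {"Not in Place"}
    return {"Needs Improvement"}

def is_discrepancy(coach_score, max_points, team_rating):
    """True when the team's most frequent rating falls outside the coach's category."""
    return team_rating not in coach_categories(coach_score, max_points)

# Examples in the spirit of the next slide; the point ranges used are hypothetical.
print(is_discrepancy(2, 2, "In Place"))           # False: top score agrees with In Place
print(is_discrepancy(0, 3, "In Place"))           # True: 0 = Not in Place vs. In Place
print(is_discrepancy(1, 3, "Needs Improvement"))  # False: mid score = Needs Improvement
```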

Discrepancy Examples
Coach scores Item 1 a 2 and the team rating is “In Place”
 Discrepancy or No Discrepancy?
Coach scores Item 8 a 0 and the team rating is “In Place”
 Discrepancy or No Discrepancy?
Coach scores Item 19 a 1 and the team rating is “Needs Improvement”
 Discrepancy or No Discrepancy?

Method 1: Specifics Step 5: Team Report The Coach completes the Team Summary on p. 3 of the Benchmarks of Quality Scoring Form, recording areas of discrepancy, strength, and weakness.

Method 1: Specifics Step 6: Reporting back to the team The coach reports back to the team using the Team Report page of the Benchmarks of Quality Scoring Form. If needed, address items of discrepancy and adjust the score.

Discrepancies If there were any items for which the team’s most frequent rating varied from the Coach’s rating based upon the Scoring Guide, the descriptions and exemplars from the guide should be shared with the team. If upon sharing areas of discrepancy, the Coach realizes that there is new information that according to the Scoring Guide would result in a different score, the item and the adjusted final score should be recorded on the Scoring Form.

Method 1: Specifics Step 6: Reporting back to the team (continued) The coach then leads the team through a discussion of the identified areas of strength (high ratings) and weakness (low ratings). The team identifies celebrations to share with staff. The team identifies areas of focus and documents them on the PBIS Tier 1/Universal Action Plan.

Method 2: Specifics Step 1: Coach completes rating The Coach uses the Benchmarks of Quality Scoring Guide to score each of the 53 items on the Benchmarks of Quality Scoring Form

Method 2: Specifics Step 2: Team members complete ratings Team members use the Benchmarks of Quality Scoring Guide to score each of the 53 items on the Benchmarks of Quality Scoring Form When completed, team members turn in their Scoring Form to the Coach

Method 2: Specifics Step 3: Record the ratings on the Scoring Form Coach collects all the Scoring Forms and tallies the team’s most frequent response using the Tally Sheet. The Coach transfers the most frequent response for each item onto a Scoring Form.

Method 2: Specifics Step 4: Team Report The Coach completes the Team Summary on p. 3 of the Benchmarks of Quality Scoring Form, recording areas of strength and weakness. The coach leads the team through a discussion of the identified areas of strength (high ratings) and weakness (low ratings). The team identifies celebrations to share with staff. The team identifies areas of focus and documents them on the PBIS Tier 1/Universal Action Plan.

Submitting Your BoQ Method 1 and Method 2 The Coach logs into the school’s PBIS Assessment account and records the final score for each item of the BoQ. This is a new feature: you no longer have to turn in a copy of your BoQ to your KyCID Area Coordinator.

Final Notes District-level PBIS team members or KYCID staff may validate BoQ findings using the PBIS Walkthrough or the SET. In addition, your team can use the PBIS Walkthrough anytime throughout the year for a quick self-assessment.

Credits