Implementing a Three-Tiered State Evaluation Structure Florida’s PBS Project Model Karen Elfner Childs, M.A., University of South Florida

Objectives
- Briefly describe Florida’s evaluation system at the state, district, and school levels
- Describe Florida’s comprehensive model for evaluating all three tiers of PBS
- Review critical questions to assist in building a scalable and sustainable system
- Review data collection procedures, tools, analysis methods, and training

Purpose of Evaluation
- To examine the extent to which teams are accurately selecting and implementing PBS systems and practices
- To allow teams to determine the extent to which target student outcomes are being achieved and/or are likely to be achieved
- To determine whether teams are accurately and consistently implementing activities and practices as specified in their individualized action plan
(PBIS Blueprint, 2005)

PBIS Evaluation Blueprint: A Work in Progress…
- Documenting Context – what was provided, who provided it, who received it
- Documenting Input – professional development, value, perspective
- Documenting Fidelity – implemented as designed, with fidelity; process evaluation
- Documenting Impact – behavior change, other schooling changes
- Replication/Sustainability Indicators – capacity, practice, policy
- Implications for Improving Practice – expanding implementation, allocating resources
(PBIS Blueprint, 2009)

Factors to Consider in Developing Comprehensive Evaluation Systems
1) Systems Preparation – readiness activities
2) Service Provision – training and technical assistance
3) Identification and Assessment of Behavior Problems – possible data sources
4) Evaluation Process – timelines, data systems
5) Evaluation Data (across all three tiers) – implementation fidelity, impact on students, attrition, client satisfaction
6) Products and Dissemination – reports, materials, presentations, etc.
(modified from Childs, Kincaid & George, in press)

Florida’s Model (overview diagram)
- Systems Preparation: District Readiness Checklist; District Action Plan; School Readiness Checklist; New School Profile (includes ODR, ISS, OSS)
- Service Provision: training and on-going technical assistance, flowing from FLPBS → Districts → Coaches → Schools
- Identification/Assessment: discipline records; ESE referrals; surveys; walkthroughs; PIC; Classroom Assessment Tool; student rank/rating; teacher requests; lack of response; BAT; Behavior Rating Scale; Daily Progress Report charts
- Evaluation Process: Mid-Year I, Mid-Year II, and End-Year reporting windows
- Evaluation Data:
  - Implementation Fidelity: PBS Implementation Checklist (PIC); Benchmarks of Quality (BoQ); Benchmarks for Advanced Tiers (BAT); School Demographic Data; School-wide Implementation Factors; Tier 3 plan fidelity checklist; BEP fidelity checklist
  - End-Year Impact: outcome data (ODR, ISS, OSS); FL Comprehensive Assessment Test; Benchmarks of Quality; School Demographic Data; PBS Walkthrough; Daily Progress Reports; Behavior Rating Scales; climate surveys
  - Project Impact: attrition survey/attrition rates; District Action Plans
  - Client Satisfaction: School-Wide Implementation Factors; District Coordinator’s Survey; training evaluations
- Products and Dissemination: annual reports; revisions to the training and technical assistance process; national, state, district, and school dissemination activities; website; on-line training modules

(1) Systems Preparation
- Tier 1: District Readiness Checklist; District Action Plan (district baseline and goals); School Readiness Checklist; baseline data
- Tier 2: school readiness; implementation of Tier 1; school infrastructure
- Tier 3: District Action Plan; systems change; evaluation of products and processes; establish vision and goals

Organizational Chart for Communication, Reporting & Training
[Org chart: RtI: Behavior Task Force and subcommittees covering Tier 1 and Tier 2, Tier 3, and Technical Assistance (Star, forms, documents); members include Amelia Van Name Larson, Scott Larson, Karen Childs (USF), Kathy Christiansen (USF), Rose Iovannone (USF), Brian Gaunt (USF), Tara Davis, Donna Vance, David Law, Cat Raulerson, Kelli Hensen, Lizette Alexander, Craig Dumais, Michael Campoamor, Sherri Dunham]

(2) Service Provision
Training and ongoing technical assistance
- Tier 1: district and multi-district on-site training
- Tier 2: district, multi-district, and web-based training
- Tier 3: post-assessment, goal setting, systems/process established

(3) Identification & Assessment
- Tier 1: discipline records, attendance, ESE referrals, baseline BoQ, action plans, climate surveys, coaches surveys, walkthrough (mini SET), PBS Implementation Checklist (PIC)
- Classroom: discipline records, teacher requests, student rankings/ratings, ESE referrals, observations, Classroom Assessment Tool
- Tier 2: discipline records, teacher requests, student rankings/ratings (SSBD, TRF, etc.), lack of response to Tier 1, Daily Progress Reports, PBS Implementation Checklist (PIC), Benchmarks for Advanced Tiers (BAT)
- Tier 3: the above items, plus lack of response to Tier 2, Behavior Rating Scale, observation data, intervention fidelity checklist, PBS Implementation Checklist (PIC), Benchmarks for Advanced Tiers (BAT)

Office Discipline Referrals (chart)

Teacher Nomination (form)
Columns: Student Initials, Grade/Period, I or E (Step 2)
- Rank the top 3 externalizing and top 3 internalizing students
- Check “YES” if you personally taught expectations to the student
- Check “YES” if you personally gave a SW-PBS reward to the student

Tier 2 Progress Monitoring (DPRs)

Behavior Rating Scale (example)

Behavior                        | Rating anchors (circled for each date)
Hitting                         | 8 or more / 6-7 times / 4-5 times / 2-3 times / 0-1 times
Profanity                       | 16 or more / 12-15 times / 8-11 times / 4-7 times / 0-3 times
Requesting Attention/Assistance | 55% or more / 40-55% / 25-40% / 10-25% / 0-10%
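As an illustration only, the sketch below shows how daily counts like those above might be converted into rating bands. The band boundaries are taken from the example rows, but the behavior names, the numeric direction of the scale (1 = least frequent, 5 = most frequent), and the function itself are assumptions, not the project’s actual tool.

```python
# Hypothetical sketch: convert a day's behavior count into a Behavior Rating
# Scale band. Boundaries mirror the example rows above; the 1-5 direction
# and the structure are illustrative assumptions.
RATING_BANDS = {
    # behavior: list of (upper_bound_inclusive, rating) checked in order
    "hitting":   [(1, 1), (3, 2), (5, 3), (7, 4), (float("inf"), 5)],
    "profanity": [(3, 1), (7, 2), (11, 3), (15, 4), (float("inf"), 5)],
}

def rate(behavior: str, count: int) -> int:
    """Return the rating band (1-5) for an observed daily count."""
    for upper, rating in RATING_BANDS[behavior]:
        if count <= upper:
            return rating
    raise ValueError("count did not fall in any band")

print(rate("hitting", 4))    # -> 3 (4-5 times)
print(rate("profanity", 2))  # -> 1 (0-3 times)
```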

(4) Evaluation Process
Timelines for state evaluation:
- Baseline (due date varies)
- Mid-Year I – due 10/31: School Profile; PBS Implementation Checklist (PIC) (Tiers 1-3)
- Mid-Year II – due 2/28: PBS Implementation Checklist (PIC) (Tiers 1-3)
- End-Year – due 6/15: Benchmarks of Quality (BoQ) (Tier 1); Benchmarks for Advanced Tiers (BAT) (Tiers 2-3); Outcome Data Summary; School-wide Implementation Factors (SWIF)
Web-based data entry and reporting: PBSES
Statewide student database – academic/behavior
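The reporting cycle above can be summarized as a small data structure for illustration. The windows, due dates, and instruments come from this slide; the structure and names (REPORTING_CYCLE, instruments_for) are hypothetical and not part of the PBSES system.

```python
# Hypothetical representation of the state reporting cycle described above.
REPORTING_CYCLE = {
    "Baseline":    {"due": "varies", "measures": ["Baseline data"]},
    "Mid-Year I":  {"due": "10/31",  "measures": ["School Profile", "PIC (Tiers 1-3)"]},
    "Mid-Year II": {"due": "2/28",   "measures": ["PIC (Tiers 1-3)"]},
    "End-Year":    {"due": "6/15",   "measures": ["BoQ (Tier 1)", "BAT (Tiers 2-3)",
                                                  "Outcome Data Summary", "SWIF"]},
}

def instruments_for(window: str) -> list[str]:
    """Return the instruments a school submits for a given reporting window."""
    return REPORTING_CYCLE[window]["measures"]

print(instruments_for("End-Year"))
```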

(5) Evaluation Data
a) Implementation Fidelity: PIC (all tiers); BoQ (Tier 1); BAT (Tiers 2-3); SWIF (all tiers); walkthrough (Tier 1); Tier 2 & 3 intervention-specific fidelity measures
b) Impact on Students: outcome data (ODR, ISS, OSS); academic achievement; school demographic data (e.g., ethnicity); attendance; DPR charting; Behavior Rating Scale
c) Attrition: Attrition Survey (all)
d) Client Satisfaction: SWIF; climate surveys; social validation

(a) Implementation Fidelity
1. Are schools trained in SWPBS implementing with fidelity? Across years? Across school types? (BoQ, BAT, School Demographic Data)
2. What factors are related to implementing with fidelity? (SWIF survey, BoQ, BAT)
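One way question 1 might be examined from a school-level dataset is sketched below. The column names (school_id, year, school_type, boq_total), the example rows, and the 70-point BoQ criterion (the cut used on the SWIF slide later) are assumptions for illustration, not the project’s analysis code.

```python
import pandas as pd

# Illustrative analysis: percent of trained schools at or above a fidelity
# criterion on the BoQ, broken out by year and school type.
boq = pd.DataFrame({
    "school_id":   [1, 2, 3, 4],
    "year":        [2008, 2008, 2009, 2009],
    "school_type": ["elementary", "middle", "elementary", "high"],
    "boq_total":   [82, 64, 75, 58],
})

boq["at_fidelity"] = boq["boq_total"] >= 70  # assumed 70-point criterion

summary = (boq.groupby(["year", "school_type"])["at_fidelity"]
              .mean()
              .mul(100)
              .round(1))
print(summary)
```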

PBIS Evaluation Blueprint: A Work in Progress… – Implementation Fidelity Tools

Research:
- SET (Tier 1) – School-wide Evaluation Tool; Sugai, Lewis-Palmer, Todd & Horner (2001)
- ISSET (Tiers 2/3) – Individual Student Systems Evaluation Tool; Anderson, Lewis-Palmer, Todd, Horner, Sugai & Sampson (2008)

Self-Assessment:
- BoQ (Tier 1) – Benchmarks of Quality; Kincaid, Childs & George (2005)
- BAT (Tiers 2/3) – Benchmarks for Advanced Tiers; Anderson, Childs, Kincaid, Horner, George, Todd, Sampson & Spaulding (2009)

Progress Monitoring:
- TIC (Tier 1) – Team Implementation Checklist; Sugai, Horner & Lewis-Palmer (2001)
- PIC (Tiers 1, 2, 3) – PBS Implementation Checklist for Schools; Childs, Kincaid & George (2009)

Tier 1 Critical Element Implementation Level chart

PBS Implementation Level chart

Brief Walk-through

Benchmarks for Advanced Tiers (BAT)
[Chart: percent of possible points earned in each BAT section]
- Tiers 2 and 3 foundations: Tier 1 implementation; Commitment; Student Identification; Monitoring and Evaluation
- Tier 2 (Targeted Interventions): Support Systems; Main Strategy Implementation; Main Strategy Monitoring and Evaluation; 2nd, 3rd, and 4th Strategy Implementation and Monitoring and Evaluation
- Tier 3: Support Systems; Assessment and Planning; Monitoring and Evaluation

Fidelity Measure: Specific to Interventions

School-Wide Implementation Factors (SWIF)

Factors MOST helpful to implementation of SWPBS:
- Higher-implementing schools (70+ on BoQ), identified as helpful by 90%+ of respondents: expectations and rules clearly defined; administrator committed to PBS, willing to teach and model PBS, willing to reward students; representative and committed PBS team; reward system works; PBS coach’s guidance with the process; students’ responses to rewards and activities
- Lower-implementing schools (below 70 on BoQ), identified as helpful by 80%+ of respondents: expectations and rules clearly defined; administrator willing to reward students; representative PBS team

Factors MOST problematic to implementation of SWPBS:
- Higher-implementing schools, identified as problematic by 25%+ of respondents: adequate funding; team recognizes faculty participation; staff stability from year to year; student stability from year to year
- Lower-implementing schools, identified as problematic by 50%+ of respondents: staff time for PBS; staff beliefs about the effectiveness of PBS; staff philosophy; staff consistency in teaching; staff consistency in discipline procedures
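A minimal sketch of the comparison behind this table: respondents are grouped by whether their school scored 70 or above on the BoQ, and the percent rating a factor as helpful is computed for each group. The record layout, field names, and example data are hypothetical, not SWIF survey data.

```python
# Hypothetical SWIF-style analysis: percent of respondents rating a factor
# "helpful", split by their school's implementation level (BoQ 70+ vs below).
responses = [
    {"school_boq": 85, "factor": "Expectations and rules clearly defined", "helpful": True},
    {"school_boq": 85, "factor": "Expectations and rules clearly defined", "helpful": True},
    {"school_boq": 62, "factor": "Expectations and rules clearly defined", "helpful": True},
    {"school_boq": 62, "factor": "Expectations and rules clearly defined", "helpful": False},
]

def percent_helpful(records, factor, higher_implementing=True):
    """Percent of respondents in one implementation group rating a factor helpful."""
    group = [r for r in records
             if r["factor"] == factor
             and (r["school_boq"] >= 70) == higher_implementing]
    if not group:
        return None
    return 100 * sum(r["helpful"] for r in group) / len(group)

factor = "Expectations and rules clearly defined"
print(percent_helpful(responses, factor, higher_implementing=True))   # 100.0
print(percent_helpful(responses, factor, higher_implementing=False))  # 50.0
```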

Descriptive Data: Teams
- Team functioning did not effectively differentiate school teams implementing with high versus low fidelity, or teams with better versus worse outcomes
- Teams implementing Tier 1 PBS with fidelity saw substantially different effects on all four outcome measures

(b) Impact on Student Behavior
1. Do schools implementing SWPBS decrease ODRs, days of ISS, and days of OSS? (ODRs, ISS, OSS)
2. Do schools implementing SWPBS realize an increase in academic achievement? (FCAT scores)
3. Is there a difference in outcomes across school types? (ODRs, ISS, OSS, FCAT scores, school demographic data)
4. Do schools implementing with high fidelity have greater outcomes than implementers with low fidelity? (BoQ, ODRs, ISS, OSS)
5. Do teams that work well together have greater outcomes than those that do not? (Team Process Evaluation, ODRs, ISS, OSS)
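The charts that follow report ODRs, ISS, and OSS as rates per 100 students and as percent change relative to the year before implementation. A minimal sketch of that arithmetic is below; the function names and numbers are illustrative.

```python
# Illustrative arithmetic behind the charts that follow: incident rates per
# 100 students and percent change from the pre-implementation year.

def rate_per_100(incidents: int, enrollment: int) -> float:
    """Incidents (ODRs, days of ISS, or days of OSS) per 100 enrolled students."""
    return incidents / enrollment * 100

def percent_change(before: float, after: float) -> float:
    """Percent change relative to the pre-implementation rate (negative = decrease)."""
    return (after - before) / before * 100

pre_odr  = rate_per_100(incidents=480, enrollment=600)   # 80.0 ODRs per 100 students
post_odr = rate_per_100(incidents=360, enrollment=600)   # 60.0 ODRs per 100 students
print(percent_change(pre_odr, post_odr))                  # -25.0, i.e., a 25% decrease
```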

Percent change in ODR, ISS and OSS rates per 100 students before and after PBS implementation

Academic Outcomes by Implementation Level

Percent decrease in ODR, ISS, OSS rates per 100 students after 1 year of implementation (by school type)

ODRs by implementation level across three years of implementation

(c) Attrition
1. Why do schools discontinue implementation of SWPBS? (Attrition Survey)

(d) Consumer Satisfaction
1. Are our consumers satisfied with the training, technical assistance, products, and support received? (SWIF survey; District Coordinators survey; training evaluations; climate surveys)

(6) Products and Dissemination
- Annual reports
- Revisions to training
- Revisions to the technical assistance process
- Dissemination activities at the national, state, district, and school levels
- Revisions to the website
- On-line training modules

Improvements Made
1. Increased emphasis on BoQ results for school- and district-level action planning
2. Increased training for District Coordinators and Coaches, with technical assistance targeted to areas of deficiency based upon data
3. Team Process Evaluation no longer used
4. Academic data used to increase visibility and political support
5. Specialized training for high schools
6. Identifying critical team variables impacted through training and technical assistance activities
7. Revised Tier 1 PBS training to include classroom strategies and the problem-solving process within an RtI framework
8. Enhanced monthly technical assistance activities

In Summary…
1. Know what you want to know
2. Comparing fidelity of implementation with outcomes presents a strong case for implementing Tier 1 PBS with fidelity
3. Additional sources of data can help a state determine not only whether the Tier 1 PBS process is working, but also why it is or is not working
4. Address state, district, and school systems issues that may impact implementation success

Contact
Karen Elfner Childs, Research & Evaluation Coordinator, Florida’s PBS Project
Phone: (813)
Website:
Bob Putnam, May Institute
Phone:
Website: