Fidelity Instruments and School Burden
Patricia Mueller, Ed.D., Brent Garrett, Ph.D., & David Merves, C.A.S.
Evergreen Evaluation & Consulting, LLC
AEA 2010

Session Overview
- Positive Behavioral Interventions & Support (PBIS) model: What is PBIS? What is RTI?
- Review of 4 key PBIS fidelity instruments
- Overview of the PBIS survey from 2 states
- PBIS survey results
- Implications of the survey findings

PBIS is…
A framework for enhancing adoption & implementation of a continuum of evidence-based interventions to achieve academically & behaviorally important outcomes for all students.

PBIS emphasizes 4 integrated elements:
- Data for decision making
- Measurable outcomes supported and evaluated by data
- Practices with evidence that these outcomes are achievable
- Systems that efficiently and effectively support implementation of these practices

Integrated Elements
[Diagram: Outcomes (supporting social competence & academic achievement) at the center, linked to Systems (supporting staff behavior), Data (supporting decision making), and Practices (supporting student behavior).]

Responsiveness to Intervention (circa 1996)
[Diagram: Parallel three-tier triangles for Academic Systems and Behavioral Systems.
- Universal Interventions (80-90%): all settings, all students; preventive, proactive
- Targeted Group Interventions (5-10%): some students (at-risk); high efficiency; rapid response
- Intensive, Individual Interventions (1-5%): individual students; assessment-based; high intensity; intense, durable procedures]

Evaluation Blueprint
- Context: goals & objectives; who provided & received support
- Input: PD provided; who participated; perceived value of the PD
- Fidelity: implemented as designed & with fidelity
- Impact: changes in student outcomes
- Replication, Sustainability & Improvement: improved state/local capacity; changes in educational/behavioral policy; systemic educational practice

Fidelity Instruments
- Team Implementation Checklist
- Self-Assessment Survey
- School-wide Evaluation Tool
- Benchmarks of Quality

Team Implementation Checklist (TIC)
- Progress monitoring measure for assessing Universal practices
- 22-item self-assessment completed by the school team & coach
- Typically administered 2-3 times per year
- Criterion: 80% or higher
- Information is used to build an action plan for improving implementation fidelity

Self-Assessment Survey (SAS)
- Formerly titled the Effective Behavior Support (EBS) Survey
- Administered to the entire school staff to assist with action planning & assessing progress over time
- Conducted annually, preferably in spring
- Assesses 4 behavior systems: school-wide discipline; non-classroom management (e.g., cafeteria, hallway, playground); classroom management; systems for individual students
- Of the 4 instruments, this is the only one completed by all school faculty and staff

School-wide Evaluation Tool (SET)
- Designed to assess & evaluate critical features across each academic year
- Conducted annually
- Requires a 2-3 hour review of PBIS systems by an external evaluator; there is often a cost for the evaluator
- One state in this study uses the SET only as a tool for determining model schools; the other state used the SET extensively until the last two years and has been transitioning to the BOQ

Benchmarks of Quality (BOQ)
- Developed by personnel at the University of South Florida
- 53-item self-assessment measure of the Universal Tier
- Completed by the school team & PBIS coach at the end of the academic year
- Takes minutes to complete
- Leads to summary scores & action planning steps
- A score of 70% or higher is considered implementing at criterion
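Both self-assessment instruments reduce item ratings to a percentage of possible points and compare that percentage to a criterion (80% or higher for the TIC, 70% or higher for the BOQ, as described above). The Python sketch below illustrates that percent-of-points summary in outline only; the item scales, the ratings, and the function names are illustrative assumptions, not the instruments' official scoring protocols.

```python
def percent_of_points(item_scores, max_per_item):
    """Summarize item ratings as a percent of possible points (0-100)."""
    possible = len(item_scores) * max_per_item
    return 100.0 * sum(item_scores) / possible

def at_criterion(percent, criterion):
    """True if the summary score meets or exceeds the instrument's criterion."""
    return percent >= criterion

# Hypothetical ratings: 22 TIC items assumed on a 0-2 scale, criterion 80%.
tic_scores = [2] * 18 + [1] * 4
tic_pct = percent_of_points(tic_scores, max_per_item=2)
print(f"TIC: {tic_pct:.0f}% -> at criterion: {at_criterion(tic_pct, 80)}")

# Hypothetical ratings: 53 BOQ items assumed on a flat 0-3 scale, criterion 70%
# (the actual BOQ assigns varying point values per item).
boq_scores = [3] * 40 + [2] * 13
boq_pct = percent_of_points(boq_scores, max_per_item=3)
print(f"BOQ: {boq_pct:.0f}% -> at criterion: {at_criterion(boq_pct, 70)}")
```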

Practical Concerns
- Differences between "research" methods and "evaluation" methods
- 3-4 PBIS instruments are being recommended, with multiple administration times for at least one of those instruments
- It is not uncommon for schools to have multiple initiatives, each with its own data collection procedures
- PBIS has been plagued in many states by a lack of comparable data across years
- Are we placing a burden on schools that impacts their ability to fully implement the model?
- Is the current system of data collection sustainable?

PBIS Instrument Use

Survey Methods
- Method of survey: SurveyMonkey invitation sent to school-based coaches, with one follow-up
- Response rates: State 1, 99/288 (34%); State 2, 15/30 (50%)
- Quantitative findings
- Qualitative findings
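The response rates above follow directly from the invitation and response counts reported on the slide; a quick check of that arithmetic is sketched below (the function name is illustrative).

```python
# Recompute the response rates reported on the Survey Methods slide.
def response_rate(responses, invited):
    """Percent of invited school-based coaches who responded."""
    return 100.0 * responses / invited

for state, (responses, invited) in {"State 1": (99, 288), "State 2": (15, 30)}.items():
    print(f"{state}: {responses}/{invited} = {response_rate(responses, invited):.0f}%")
```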

Qualitative Data
- The TIC really helps us stay on target and helps us make sure we are implementing all of the components, thereby getting the most from our PBIS.
- The SAS allows us to know where we are and how we are going to get where we are going.
- The BOQ showed our strengths and weaknesses. We saw areas that needed improving. We could see our "glows" and "grows." It gave us a vision of what needed to happen.
- The SET tool allowed us to have a framework to work from during each year. It was a great guide and helped keep you focused on the goal.

Why Respondents Don't Like Particular Instruments
SAS
- Could be more useful if staff clearly understood some of the descriptors; data is often inaccurate due to lack of understanding
- EBS is challenging to get every staff member to participate
- Hard for staff to interpret with the types of graphs used
BOQ
- Time consuming and provides similar information as the other documents
- Does not really show me anything other than what we already know
- Challenge is to get an understanding of the questions and have it filled out correctly
- Process is confusing and pits the Coach against the team
SET
- The SET was too time intensive

Challenges to Using PBIS Instruments: Time
- Often we find the various forms loathsome and time consuming to complete. In turn, we spend less time working on refining our PBIS strategies.
- We have a small staff and it is difficult to find the time to collect the information for these instruments.
- Although it does not take a lot of time, we have so many other things to manage that sometimes it is hard to find a few minutes.
- Our school's biggest challenge is finding a time to meet each month with the entire team.

Challenges to Using PBIS Instruments: Buy-In
- Lack of support and understanding of PBIS principles.
- It has been a challenge for teachers and administrators to buy in to PBIS. I believe the PBIS process can work if you have a good foundation as well as administrators who want a better school.
- Lack of administrative support and time to work as a team.
- The instruments are all great! Our only challenge involves the turnover in leadership and working to gain their support.

Discussion Points
- Evaluation versus research
- Other initiatives at school
- School-based teams
- Sustainability

References
Algozzine, B., Horner, R. H., Sugai, G., Barrett, S., Dickey, S. R., Eber, L., Kincaid, D., et al. (2010). Evaluation blueprint for school-wide positive behavior support. Eugene, OR: National Technical Assistance Center on Positive Behavior Interventions and Support.
Sugai, G. (2010, September 30). Presentation at the VT Statewide PBIS Conference.

Contact Information
Pat Mueller & David Merves, Evergreen Evaluation & Consulting, LLC
Brent Garrett, Pacific Institute for Research & Evaluation