1
Scaling and Sustaining PBIS: State, District, School Roles
Rob Horner University of Oregon
2
Goals
Current status and lessons learned from states scaling and sustaining PBIS.
Specific suggestions for state, district, and school personnel implementing PBIS (or any MTSS).
School/District Self-Assessment
State/District (District Capacity Assessment)
3
Assumptions
Knowledgeable about PBIS
Demonstrations of effective implementation
Ready to focus on sustainability, scaling, and building PBIS at all three tiers.
4
Why SWPBIS?
The fundamental purpose of SWPBIS is to make schools more effective learning environments: predictable, positive, consistent, and safe.
5
Main Messages
PBIS is a foundation for the next generation of education.
Effective (academic, behavior)
Equitable (all students succeed)
Efficient (time, cost)
6
School-wide Positive Behavioral Interventions and Supports (SWPBIS)
The social culture of a school matters.
A continuum of supports that begins with the whole school and extends to intensive, wraparound support for individual students and their families.
Effective practices with the systems needed for high fidelity and sustainability.
Multiple tiers of intensity.
7
Experimental Research on SWPBIS
SWPBIS Experimentally Related to:
Reduction in problem behavior
Increased academic performance
Increased attendance
Improved perception of safety
Reduction in bullying behaviors
Improved organizational efficiency
Reduction in staff turnover
Increased perception of teacher efficacy
Improved social-emotional competence

Bradshaw, C. P., Koth, C. W., Thornton, L. A., & Leaf, P. J. (2009). Altering school climate through school-wide Positive Behavioral Interventions and Supports: Findings from a group-randomized effectiveness trial. Prevention Science, 10(2).
Bradshaw, C. P., Koth, C. W., Bevans, K. B., Ialongo, N., & Leaf, P. J. (2008). The impact of school-wide Positive Behavioral Interventions and Supports (PBIS) on the organizational health of elementary schools. School Psychology Quarterly, 23(4).
Bradshaw, C. P., Mitchell, M. M., & Leaf, P. J. (2010). Examining the effects of School-Wide Positive Behavioral Interventions and Supports on student outcomes: Results from a randomized controlled effectiveness trial in elementary schools. Journal of Positive Behavior Interventions, 12.
Bradshaw, C. P., Reinke, W. M., Brown, L. D., Bevans, K. B., & Leaf, P. J. (2008). Implementation of school-wide Positive Behavioral Interventions and Supports (PBIS) in elementary schools: Observations from a randomized trial. Education & Treatment of Children, 31.
Bradshaw, C., Waasdorp, T., & Leaf, P. (in press). Effects of school-wide positive behavioral interventions and supports on child behavior problems and adjustment. Pediatrics.
Horner, R., Sugai, G., Smolkowski, K., Eber, L., Nakasato, J., Todd, A., & Esperanza, J. (2009). A randomized, wait-list controlled effectiveness trial assessing school-wide positive behavior support in elementary schools. Journal of Positive Behavior Interventions, 11.
Horner, R. H., Sugai, G., & Anderson, C. M. (2010). Examining the evidence base for school-wide positive behavior support. Focus on Exceptional Children, 42(8).
Ross, S. W., Endrulat, N. R., & Horner, R. H. (2012). Adult outcomes of school-wide positive behavior support. Journal of Positive Behavior Interventions, 14(2).
Waasdorp, T., Bradshaw, C., & Leaf, P. (2012). The impact of schoolwide positive behavioral interventions and supports on bullying and peer rejection: A randomized controlled effectiveness trial. Archives of Pediatrics & Adolescent Medicine, 166(2).
Bradshaw, Pas, Goldweber, Rosenberg, & Leaf (2012).
Freeman, J., Simonsen, B., McCoach, D. B., Sugai, G., Lombardi, A., & Horner (submitted). Implementation effects of school-wide positive behavior interventions and supports on academic, attendance, and behavior outcomes in high schools.
8
Current Status Multiple Models for Scaling
9
Local School Demonstrations
Visibility
Political Support
Funding
Policy
Leadership Team
Active Coordination
Training
Coaching
Technical Expertise
Evaluation
Local School Demonstrations
10
Implementation Science Frameworks
WHO: Teams
WHEN: Stages
WHAT: Interventions
HOW: Cycles
HOW: Drivers
11
Successful Student Outcomes
[Implementation Drivers diagram: a Program/Initiative/Framework (e.g., RtI) is supported by Competency Drivers (Selection, Training, Coaching, Performance Assessment/Fidelity), Organization Drivers (Decision Support Data System, Facilitative Administration, Systems Intervention), and Leadership (Adaptive and Technical). © Fixsen & Blase, 2008]
There are two categories of Implementation Drivers: Competency and Organization. When these core components are in place, they provide the support for a successful implementation that will be sustained.
Competency Drivers are mechanisms that help develop, improve, and sustain one's ability to implement an intervention to benefit students. Competency Drivers include Selection, Training, Coaching, and Performance Assessment.
Organization Drivers are mechanisms to create and sustain hospitable organizational and systems environments for effective educational services. Organization Drivers include Decision Support Data System, Facilitative Administration, and Systems Intervention.
PD is not a panacea for every problem. PD must be housed in a system that supports these effective practices, resulting in successful, sustainable student outcomes.
12
Implementation Stages
Implementation occurs in stages (2–3 years):
Exploration
Installation
Initial Implementation
Full Implementation
Fixsen, Naoom, Blase, Friedman, & Wallace, 2005
© Dean Fixsen and Karen Blase, 2004
13
Stages of Implementation
Steve Goodman
Implementation is not an event: it is a mission-oriented process involving multiple decisions, actions, and corrections.
Exploration/Adoption (Should we do it?): Decision regarding commitment to adopting the program/practices and supporting successful implementation.
Installation (Work to do it right!): Set up infrastructure so that successful implementation can take place and be supported. Establish team and data systems, conduct audit, develop plan.
Initial Implementation (Work to do it right!): Try out the practices, work out details, learn and improve before expanding to other contexts.
Full Implementation (Work to do it better!): Expand the program/practices to other locations, individuals, and times; adjust based on learning from initial implementation.
Continuous Improvement/Regeneration (Work to do it better!): Make it easier, more efficient. Embed within current practices.
14
Scaling up School-wide Positive Behavioral Interventions and Supports: The Experiences of Seven States with Documented Success
Rob Horner, Don Kincaid, George Sugai, Tim Lewis, Lucille Eber, Susan Barrett, Celeste Rossetto Dickey, Mary Richter, Erin Sullivan, Cyndi Boezio, & Nancy Johnson (2014), Journal of Positive Behavior Interventions.
Interviews and data reviews with PBIS implementers from seven states, each with at least 500 schools using PBIS.
Stages examined: Exploration, Installation, Initial Implementation, Full Implementation.
Capacity features examined: Leadership Team, Funding, Visibility, Political Support, Policy, Training, Coaching, Expertise, Evaluation, Demonstrations.
15
Exploration and Adoption / Installation / Initial Implementation / Full Implementation / Innovation and Sustainability
Leadership Team (coordination) interview questions:
Do you have a state leadership team? If you do, how was your first leadership team developed? Who were members? Who supported/led the team through the exploration process? Was any sort of self-assessment completed (e.g., the PBIS Implementation Blueprint Assessment)? What was the role of State agency personnel in the exploration phase?
What were critical issues that confronted the team as it began to install systems changes?
What were specific activities the team did to ensure success of the initial implementation efforts?
Did the team change personnel or functioning as the number of schools/districts increased?
What has the Leadership Team done to ensure sustainability?
In what areas is the State "innovating" and contributing to the research and practice of PBIS (e.g., linking PBIS with literacy or math)?
16
Descriptive Summary: Oregon
Exploration / Installation / Initial Imp /Full Imp & Innovate
17
Descriptive Summary: Missouri
Exploration / Installation /Initial Imp / Full Imp & Innovate
18
Descriptive Summary: North Carolina
Exploration / Installation / Initial & Full Imp / Innovate
19
Descriptive Summary: Colorado
Exploration / Installation / Initial & Full Imp / Innovate
20
Descriptive Summary: Florida
Exploration/ Installation/ Initial Imp / Full Imp / Innovate
21
Descriptive Summary: Maryland
Exploration / Installation / Initial Imp / Full Imp / Innovate
22
Descriptive Summary: Illinois
Exploration / Installation / Initial Imp /Full Imp & Innovate
23
Lessons Learned Multiple approaches to achieving scaled implementation
Colorado: Started with a Leadership Team.
Illinois: Started with Leadership Advocates and built a team only after implementation expanded.
Missouri: Strong initial demonstrations led to strong state support.
All states began with small "demonstrations" that documented the feasibility and impact of SWPBIS. Only after these demonstrations were in place did scaling occur.
Four core features needed for scaling:
Administrative leadership/support/funding
Technical capacity (local training, coaching, evaluation, and behavioral expertise)
Local demonstrations of feasibility and impact
Evaluation data system (to support continuous improvement)
Essential role of data: fidelity data AND outcome data
24
Lessons Learned Scaling is NOT linear
Sustained scaling requires continuous regeneration, both bottom up and top down.
Threats to scaling:
Competing initiatives
The seductive lure of the "new idea"
Leadership turnover
Legislative mandates
Fiscal constraint
Regular dissemination of fidelity and impact data is the best "protective factor" against threats to scaling.
25
Lessons Learned Scaling requires planned efficiency
The unit cost of implementation must decrease as the number of adoptions increases (see the sketch below).
Shift from external trainers to within-state/district trainers.
Use local demonstrations as exemplars.
Increased coaching capacity can decrease investment in training.
Improved "selection" of personnel decreases turnover and development costs.
Use existing professional development and evaluation resources differently.
Basic message: The implementation practices that are needed to establish initial exemplars may be different from the practices used to establish large-scale adoption.
Jennifer Coffey, 2008
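To make the unit-cost claim concrete, here is a minimal sketch assuming a hypothetical one-time investment in local training/coaching capacity plus a recurring per-school cost; the dollar figures are illustrative assumptions, not data from any state report.

```python
# Illustrative only: hypothetical cost figures, not actual state data.
def unit_cost(num_schools, fixed_capacity_cost=200_000, per_school_cost=3_000):
    """Average cost per school when a one-time investment in local
    trainers/coaches (fixed_capacity_cost) is spread across all adopting
    schools, plus a recurring per-school cost (materials, coaching time)."""
    return fixed_capacity_cost / num_schools + per_school_cost

for n in (10, 100, 500):
    print(f"{n:>4} schools: ${unit_cost(n):,.0f} per school")
# 10 schools -> $23,000; 100 -> $5,000; 500 -> $3,400 per school
```

The point of the sketch is simply that the fixed investment in local capacity is amortized as adoptions grow, which is why shifting from external trainers to within-state/district trainers lowers the per-school cost.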
26
Sustainability
Most educational innovations do not endure beyond 9 months.
The likelihood of sustaining is NOT related to effectiveness.
Achieving sustainability is essential for cost effectiveness and for scaling up to levels of social importance.
The variables that affect sustaining implementation are not necessarily the variables that affect initial adoption.
27
Schools implement and sustain regardless of school demographic characteristics
High schools
Slope or speed of putting things into place predicts sustainability (Latham, 1988).
28
Research on PBIS Sustainability
Coffey & Horner (2012). Exceptional Children, 78.
29
A PBIS Sustainability Study (Coffey & Horner, 2012)
Sample: 285 schools with SET scores
What predicted INITIAL adoption of PBIS?
What predicted SUSTAINED use of PBIS?
30
Implementers vs. Non-implementers
[Figure: mean SET subscale scores (Expectations Defined, Expectations Taught, Reward System, Responding to Violations, Monitoring and Decision-Making, Management (team and admin), District-Level Support) and SET Overall, comparing schools that met the SET criterion (≥80%) with schools that never met it (<80%).]
Expectations Taught: have they been taught, what are the rules? Monitoring: adequate referral form, data shown and used.
31
Sustainers vs. Non-sustainers
[Figure: mean SET subscale scores (Expectations Defined, Expectations Taught, Reward System, Responding to Violations, Monitoring and Decision-Making, Management (team and admin), District-Level Support) and SET Overall, comparing schools that sustained the SET criterion for ≥5 years with schools that met and then lost the criterion within 5 years.]
Management: representative team, with administrator, reported out to school, SIP. Reward System: documented system, students acknowledged, teachers acknowledge. Canary in the coal mine?
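As a rough sketch of how schools could be grouped this way, assume one overall SET percentage per school per year and the ≥80% criterion noted above; the published study's exact decision rules may differ.

```python
# Sketch: classifying schools by yearly overall SET scores.
# Assumes one overall SET percentage per school per year; the study's
# actual decision rules may differ from this simplification.
def met_criterion(score, threshold=80):
    return score >= threshold

def classify(yearly_scores, years_to_sustain=5):
    """Return 'sustainer', 'implementer', or 'non-implementer'."""
    met_years = [met_criterion(s) for s in yearly_scores]
    if not any(met_years):
        return "non-implementer"          # never met SET (<80%)
    # find the longest run of consecutive years at or above criterion
    longest, run = 0, 0
    for met in met_years:
        run = run + 1 if met else 0
        longest = max(longest, run)
    return "sustainer" if longest >= years_to_sustain else "implementer"

print(classify([85, 88, 90, 91, 86]))   # sustainer
print(classify([82, 75, 60]))           # implementer (met, then lost)
print(classify([55, 60, 70]))           # non-implementer
```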
32
Perceived Importance of Contextual Features for Sustainability of PBIS
McIntosh, K., Predy, L., Upreti, G., Hume, A. E., & Mathews, S. (2014). Perceptions of contextual features related to implementation and sustainability of School-wide Positive Behavior Support. Journal of Positive Behavior Interventions, 16.
33
Most Important Features for Sustainability
School administrators actively support PBIS.
School administrators describe PBIS as a top priority for the school.
A school administrator regularly attends and participates in PBIS team meetings.
The PBIS school team is well organized and operates efficiently.
The school administrators ensure that the PBIS team has regularly scheduled time to meet.
Pick the 4 in order.
34
Research on Sustainability
Recent studies on sustainability of PBS:
Perceptions of critical features for sustainability (McIntosh, Predy, Hume, Turri, & Mathews, in press)
Factors predicting sustainability (McIntosh et al., in press)
Critical features of PBS systems (Mathews, McIntosh, Frank, & May, under review)
35
Building Capacity: SISEP and Scaling Up
[Figure: capacity building across levels (State, District, School, Classroom), showing the progression from Traditional Behavior/Classroom Management to Initial PBIS to Current PBIS.]
36
Team Based Implementation
Districts
Coherent District Policy
Social behavior is a priority in the district improvement plan.
District commitment to selecting practices that are evidence-based.
District process for aligning multiple initiatives.
Evaluation Capacity
Data systems that inform decision-making and provide policy feedback (fidelity AND impact).
Recruitment, Hiring, Evaluation
"Preference will be given to individuals with knowledge and experience in implementation of multi-tiered academic and behavior supports."
WHAT: Interventions | HOW: Cycles | WHO: Teams | Team-Based Implementation
37
Districts Annual Faculty/Staff Orientation
HOW: Drivers
Annual Faculty/Staff Orientation
Defines PBIS as a priority.
Defines what to expect in a school using PBIS.
30-60 minutes of annual orientation.
Professional Development (Training)
PD is always tied to core improvement goals.
PD typically involves distributed training (multiple events).
PD is always linked to on-site coaching.
PD is always linked to a fidelity measure.
Coaching
38
Districts Annual Faculty/Staff Evaluations
Evaluations include assessment of "implementation of multi-tiered systems of academic and behavior support."
Build technical capacity to implement Tier II and Tier III supports.
Assessing District Capacity: The District Capacity Assessment (DCA)
39
Team
Feature: 1.1 Team Authority — Tier I planning team exists and meets at least monthly to use implementation fidelity and student outcome data to manage universal, school-wide support systems.
Data Sources: School organizational chart; Tier I systems planning team meeting minutes; fidelity tool(s).
Scoring Criteria (0 = Not implemented, 1 = Partially implemented, 2 = Fully implemented):
0 = No team identified to manage and monitor Tier I supports
1 = PBIS team in place but does not review data regularly
2 = PBIS team exists with the authority to implement at least Tier I PBIS, and has at least monthly meetings with access to fidelity and student outcome data to guide action planning
40
Implementation
Feature: 1.4 Behavioral Expectations — School has five or fewer positively stated behavioral expectations for student and staff behaviors (i.e., teaching matrix) defined and in place.
Data Sources: Staff handbook; student handbook; walk-through reports.
Scoring Criteria (0 = Not implemented, 1 = Partially implemented, 2 = Fully implemented):
0 = No clearly stated behavioral expectations have been identified and posted
1 = Behavioral expectations identified but no matrix
2 = Five or fewer behavioral expectations exist that are positive, posted, and identified for specific settings
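As a sketch of how a team might record these 0/1/2 items and roll them up into a summary score: only the two items shown above are included, and the rollup is a simple percent of points, so this is illustrative rather than the official DCA/TFI scoring procedure.

```python
# Sketch: recording 0/1/2 fidelity items and computing percent implemented.
# Only the two items shown above are included; the full instrument has more.
items = {
    "1.1 Team Authority": 2,           # fully implemented
    "1.4 Behavioral Expectations": 1,  # partially implemented (no matrix yet)
}

MAX_PER_ITEM = 2
percent_implemented = 100 * sum(items.values()) / (MAX_PER_ITEM * len(items))
print(f"Tier I percent implemented: {percent_implemented:.0f}%")  # 75%
```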
41
Action planning template — columns: Subscale and Items | Action | Who | When | Next Update
Subscales: Commitment and Leadership; Systems Alignment; Action Planning; Performance Feedback; Selection; Training; Coaching; Decision Support System; Facilitative Administration; Systems Intervention
45
Next Steps
State
Use the SCA to define state capacity.
Invest in a state leadership team.
Build a scaling plan with both "bottom up" and "top down" strategies (Fuchs et al., 2014).
District
Use the DCA to define district capacity.
Use the District Self-Assessment for targeted content areas.
Invest in a Leadership Team.
Build capacity to train, coach, evaluate, and sustain.
Collect school "fidelity" data regularly (TFI).
Collect student outcome data regularly (see the sketch below):
Tier I: ODR
Tier II: CICO points
Tier III: % students "Progressing" (ISIS)
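One way a district team might summarize the Tier I office discipline referral (ODR) data mentioned above for regular review is as referrals per 100 students per school day; this is a minimal sketch with hypothetical enrollment and referral counts, not a prescribed reporting format.

```python
# Sketch: monthly Tier I outcome summary as ODRs per 100 students per school day.
# Enrollment, school days, and referral counts below are hypothetical.
enrollment = 450
monthly = {"Sep": (20, 95), "Oct": (22, 120), "Nov": (18, 88)}  # (school days, ODRs)

for month, (days, odrs) in monthly.items():
    rate = odrs / days / enrollment * 100
    print(f"{month}: {rate:.2f} ODRs per 100 students per school day")
```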