Evaluation for School-wide PBIS
Bob Algozzine, University of North Carolina at Charlotte
Rob Horner, University of Oregon
www.pbis.org
December 9, 2011
Acknowledgements: George Sugai, Susan Barrett, Celeste Rossetto Dickey, Lucille Eber, Donald Kincaid, Timothy Lewis, Tary Tobin

Goals/Assumptions for Webinar
Goals: Participants will be able to…
– Define the core elements of an evaluation plan
– Define a schedule for evaluation data collection
– Define the core elements of an evaluation report
Assumptions: Participants already…
– Can define core features of SWPBIS
– Are implementing SWPBIS
– Have reviewed the SWPBIS Evaluation Blueprint

Assumptions
Every evaluation is unique. Any specific evaluation report will be affected by:
– Funding and political focus
– Number of years implementing PBIS
– Focus on Tier II and Tier III
– Additional emphasis areas (Wraparound, Bully Prevention, Discipline Disproportionality)

Supporting Materials
– Tobin et al., 2011 (fidelity matrix)
– Childs, George & Kincaid, 2011 (evaluation brief)
– Outline of Evaluation Report
– Evaluation Blueprint
– Implementation Blueprint
– Exemplar state evaluation reports (Illinois, Colorado, Missouri, Maryland, Florida, North Carolina)

Tobin et al., 2011 (Abstract)
Fidelity of implementation, also known as treatment integrity, has long been considered important for research purposes. Assessment of the impact of an intervention will be relevant only if the extent to which the intervention was implemented with fidelity is first determined. An emerging emphasis is being placed on the use of fidelity assessments to promote initial and sustained implementation of program interventions in schools. This paper examines how fidelity measurement is becoming part of the independent variable with respect to program implementation in schools, using School-Wide Positive Behavior Interventions and Supports (SWPBIS) as an example. The current status of fidelity measures within SWPBIS implementation is documented with respect to Messick's (1988) criteria for types of validity. Lessons learned from this example may suggest directions and considerations for future implementation efforts in terms of instrument development, use in the implementation of SWPBIS, and other educational innovations.
Key words: fidelity of implementation, school-wide, positive behavior interventions and supports, PBIS

BoQ Use (pbis.org/evaluation)
"Stability in Variant Administration Methods of the School-Wide PBS Benchmarks of Quality (BoQ)" by Karen Elfner Childs, Heather Peshak George, Don Kincaid
The School-Wide PBS Benchmarks of Quality is now being used by thousands of schools across the country. One of the most frequently asked questions regarding the BoQ has been related to the procedures used for scoring. As a result, we decided to investigate the following question: "Is the BoQ stable, and will similar scores result if it is administered in the method used in the validation study versus other methods?"

Foundations
Evaluation is the process of gathering data to:
– Answer questions
– Make decisions
Define the core questions/decisions before selecting measures.
Provide an adequate context for interpreting the data.
Effective evaluation is cyclical (repeated):
– Use the evaluation process for improvement

Evaluation Questions/Cycle
[Diagram: a recurring evaluation cycle (Measure, Perform, Plan, Compare) applied across the core question areas: Context, Input, Fidelity, Impact, Replication, Sustainability]

Evaluation Cycle
Core Questions → Evaluation Plan → Data Collection → Evaluation Report

Core Evaluation Questions
Context
– What are/were the goals and objectives for SWPBIS implementation? (state/district capacity; school adoption; student outcomes)
– Who delivered the training and technical assistance for SWPBIS implementation?
Input
– What professional development events were part of SWPBIS implementation support?
– Was the projected level of TA capacity provided (training/coaching)?
– Who received the professional development (schools/cohorts)?

Core Evaluation Questions
Fidelity
– To what extent was Tier I SWPBIS implemented with fidelity?
– To what extent were Tier II and Tier III SWPBIS implemented with fidelity?
– To what extent is the Leadership Team establishing core functions?

[Diagram: Leadership Team providing active coordination, supported by Funding, Visibility, Political Support, and Policy; driving Training, Coaching, Evaluation, and Behavioral Expertise for local school/district teams and demonstrations (Sugai et al.)]

Core Evaluation Questions
Impact
– To what extent is SWPBIS associated with changes in student behavioral outcomes?
– To what extent is SWPBIS associated with changes in academic performance and dropout rates?
– To what extent is district/state capacity established? (local training, coaching, evaluation, behavioral expertise)
– To what extent are leadership and policy structures established?

Core Evaluation Questions
Replication, Sustainability, and Improvement
– To what extent do fidelity and impact outcomes sustain across time?
– To what extent does initial SWPBIS implementation affect implementation with later cohorts?
– To what extent did SWPBIS implementation change educational/behavioral capacity/policy?
– To what extent did SWPBIS implementation affect systemic educational practice?

Evaluation Plan
Context
– Evaluation questions, stakeholders, purpose(s)
– Schedule of reports for stakeholder decision-making
Input
– Who will provide what TA/training (and how much) to whom
Fidelity
– What data, when collected, by whom, to assess:
  – Leadership Team
  – School teams: Tier I, Tier II, Tier III
Impact
– What data, when collected, by whom, to assess:
  – District/state capacity (training, coaching, expertise, evaluation)
  – Student behavior
  – Student academic performance
  – Replication, sustainability, policy

Evaluation Plan: Context
– Define the roles of the evaluation: stakeholders, implementers, adopters, evaluators
– Define the purpose of the evaluation:
  – What decisions are to be affected by the evaluation?
  – What questions need to be answered to make informed decisions?
– Define the basic goals, timeline, and constraints associated with the implementation effort

Evaluation Plan: Input
– Who will provide what training and technical assistance, on what schedule, to improve the leadership team's capacity to sustain and scale up SWPBIS?
– Who will provide what training and technical assistance to school teams to result in Tier I implementation of SWPBIS?
– Who will provide what training and technical assistance to school teams to result in Tier II and Tier III implementation of SWPBIS?

Evaluation Plan: Fidelity

Content | Measure(s) | Schedule
Tier I SWPBIS fidelity | TIC (progress monitor); BoQ (once TIC is at criterion); SET (for research) | TIC every 3-4 meetings; BoQ or SET annually in spring
Tier II and Tier III SWPBIS fidelity | MATT (progress monitor); BAT (annual); ISSET (for formal research) | MATT every 3-4 team meetings; BAT or ISSET annually in spring

Evaluation Plan: Impact

Content | Measure | Schedule
Leadership Team capacity | Implementation Self-Assessment | Annually (spring), for use in planning for fall
Behavior | ODRs (SWIS) | Continuous
Behavior | Attendance; graduation rate | Annual
Academic | Oral Reading Fluency (CBM) | Start of school, winter, early spring
Academic | Standardized assessments | Annually (spring)
Academic | Graduation | Spring
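Each plan row answers the same three questions: what is assessed, with what measure, and on what schedule. As an illustrative sketch only (the PlanEntry structure and sample rows below are our own invention, not part of the Evaluation Blueprint), a district evaluator might encode the tables above for tracking:

```python
from dataclasses import dataclass

@dataclass
class PlanEntry:
    """One row of an evaluation plan: what is assessed, with what, and when."""
    content: str       # e.g., "Tier I SWPBIS fidelity"
    measures: list     # instruments used
    schedule: str      # when data are collected
    collector: str = "school team"  # who collects the data

# Hypothetical rows mirroring the fidelity and impact tables above.
plan = [
    PlanEntry("Tier I SWPBIS fidelity", ["TIC", "BoQ or SET"],
              "TIC every 3-4 meetings; BoQ/SET annually in spring"),
    PlanEntry("Tier II/III SWPBIS fidelity", ["MATT", "BAT or ISSET"],
              "MATT every 3-4 meetings; BAT/ISSET annually in spring"),
    PlanEntry("Student behavior", ["ODRs (SWIS)", "attendance", "graduation rate"],
              "ODRs continuous; others annual", collector="district evaluator"),
]

for entry in plan:
    print(f"{entry.content}: {', '.join(entry.measures)} [{entry.schedule}]")
```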

Data Collection
Eight core PBIS measures (Tobin et al., 2011; Childs et al., 2011)
Basic logic:
– Research-quality measures
– Annual self-assessment measures
– Progress monitoring measures (to be used every 3-4 meetings/cycles)
Fidelity measure matrix (Tobin & Vincent)

Data Collection

Level of Support | Research Measures | Annual Self-Assessment Measures | Progress Monitoring Measures
Universal | School-wide Evaluation Tool (SET) | Self-Assessment Survey (SAS); Benchmarks of Quality (BoQ) | Team Implementation Checklist (TIC)
Secondary and Tertiary | Individual Student School-wide Evaluation Tool (I-SSET) | Benchmarks of Advanced Tiers (BAT) | Monitoring Advanced Tiers Tool (MATT)
Leadership Team | (none) | SWPBIS Implementation Self-Assessment | (none)

Data Collection Schedule, Years 1-3

Level | Measure | Schedule
Universal SWPBIS | Progress monitoring: TIC | Three times per year for the first two years
Universal SWPBIS | Annual self-assessment: BoQ | Annually (spring)
Universal SWPBIS | Research measure: SET | Annually (spring)
Universal SWPBIS | Self-Assessment Survey: SAS | Pre-implementation, then annually
Secondary/Tertiary | Progress monitoring: MATT | Three times per year for two years
Secondary/Tertiary | Annual self-assessment: BAT | Annually (two of the first three years)
Secondary/Tertiary | Research measure: I-SSET | Annually (two of the first three years)
Leadership Team | Implementation Self-Assessment | Pre-implementation, then annually

Data Collection Schedule, Years 4-6 (sustained implementation)

Level | Measure | Schedule
Universal SWPBIS | Annual self-assessment: BoQ | Annually (spring)
Secondary/Tertiary | Annual self-assessment: BAT | Annually (spring)
Leadership Team | Implementation Self-Assessment | Annually
(TIC, SET, SAS, MATT, and I-SSET drop off the schedule after Year 3.)
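Schedules like the two above lend themselves to a simple lookup, so a coordinator can ask which measures are due in a given term. A minimal sketch, assuming an illustrative reading of the charted schedule (the SCHEDULE layout and measures_due helper are our own, not part of any PBIS tool):

```python
# Hypothetical encoding of the two schedule tables above; the exact years
# assigned to each measure are an illustrative assumption.
SCHEDULE = {
    ("Universal", "TIC"):  {"years": {1, 2}, "seasons": {"Fall", "Winter", "Spring"}},
    ("Universal", "BoQ"):  {"years": {1, 2, 3, 4, 5, 6}, "seasons": {"Spring"}},
    ("Universal", "SET"):  {"years": {1, 2, 3}, "seasons": {"Spring"}},
    ("Secondary/Tertiary", "MATT"): {"years": {1, 2}, "seasons": {"Fall", "Winter", "Spring"}},
    ("Secondary/Tertiary", "BAT"):  {"years": {2, 3, 4, 5, 6}, "seasons": {"Spring"}},
    ("Leadership Team", "Implementation Self-Assessment"):
        {"years": {1, 2, 3, 4, 5, 6}, "seasons": {"Spring"}},
}

def measures_due(year, season):
    """List the measures scheduled for collection in a given year and season."""
    return [f"{level}: {measure}"
            for (level, measure), when in SCHEDULE.items()
            if year in when["years"] and season in when["seasons"]]

print(measures_due(2, "Spring"))  # TIC, BoQ, SET, MATT, BAT, Self-Assessment
```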

Evaluation Report
Basic outline
– Context, Input, Fidelity, Student Outcomes, Future Directions
Local adaptation
– Cohorts
– Locally important questions (disproportionality, bully prevention, cost)
– Change in evaluation focus over time (increased emphasis on Tier II and Tier III)
Examples

Evaluation Report: Context
Define SWPBIS
– SWPBIS is a framework for establishing a school-wide social culture, with the individualized supports needed for all students to achieve academic and social success.
Define goals of the specific project
– Number of schools per year implementing SWPBIS
  – How schools were selected
  – Expectation of 2-3 years for Tier I implementation to reach criterion
  – Expectation of 2 years of criterion implementation to affect academic outcomes
– Development of district/state capacity
  – Capacity needed for sustained and scaled implementation
– Behavioral and academic outcomes for students
  – Student outcomes linked to fidelity of implementation
Define stakeholders/evaluation questions
– Evaluation report is written at the request of:
– Evaluation report is focused on the following key questions:

Evaluation Report: Input
Who received what support, and from whom?
– Leadership team
– Local capacity building (training, coaching, behavioral expertise, evaluation)
– School teams

Leadership Team
– Planning dates
– Development of implementation plan
– Dates of SWPBIS Implementation Self-Assessment
Capacity Building
– Training, coaching, behavioral expertise, evaluation
– Data collection systems

Schools; Coaches; Trainers

Number of Schools, Use of Fidelity Data, and Access to ODR Data

Evaluation Report: Impact on SWPBIS Fidelity
Leadership Team
– SWPBIS Implementation Self-Assessment
Capacity Development
– Number of trainers/coaches available to support teams/districts
– Behavioral expertise available to support Tier II and Tier III implementation
– Evaluation capacity (data collection, data use, information distribution)
School Teams
– Tier I implementation (TIC, BoQ, SET, SAS), collectively and/or by training cohort
– Tier II/Tier III implementation (MATT, BAT, ISSET), collectively and/or by training cohort
– Additional measures of fidelity: Phases of Implementation, CICO checklist

Evaluation Report

Fidelity of Leadership Team development – SWPBIS Implementation Self-Assessment subscales

Evaluation Report: Fidelity (TIC, BoQ, SET, SAS) Total Score – Schools

Evaluation Report: Fidelity (cohort) Total Score, N = 17

Evaluation Report: Fidelity (subscales) TIC

Evaluation Report: BoQ Subscale Report

Evaluation Report: Tier II and Tier III Fidelity
[Chart: ISSET/BAT/MATT total scores at Time 1 and Time 2 for one cohort, N = 17]

Evaluation Report: Fidelity (Tier II & III)

Tier I, Tier II, Tier III Criterion: Count

Tier I, Tier II, Tier III Criterion: Percentage
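Slides like the two above typically chart how many, and what percentage, of participating schools meet the fidelity criterion at each tier. A minimal sketch of that tally (the school records and the 70% criterion threshold below are hypothetical placeholders, not data from this report):

```python
# Hypothetical fidelity records: one dict per school, scores as percentages.
schools = [
    {"name": "School A", "tier1_boq": 82, "tier2_bat": 75, "tier3_bat": 60},
    {"name": "School B", "tier1_boq": 65, "tier2_bat": 40, "tier3_bat": 30},
    {"name": "School C", "tier1_boq": 90, "tier2_bat": 80, "tier3_bat": 72},
]

# Assumed criterion: 70% of points on each measure (illustrative only).
CRITERION = {"tier1_boq": 70, "tier2_bat": 70, "tier3_bat": 70}

for measure, threshold in CRITERION.items():
    count = sum(1 for s in schools if s[measure] >= threshold)
    pct = 100 * count / len(schools)
    print(f"{measure}: {count} of {len(schools)} schools at criterion ({pct:.0f}%)")
```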

Evaluation Report Impact: Student outcomes

Steve Goodman (Michigan example)

Participating Schools
– 2000: Model demonstration schools (5)
– 2004: Schools (21)
– 2005: Schools (31)
– 2006: Schools (50)
– 2008: Schools (95)
– 2009: Schools (150*)
Total of 512 schools in collaboration with 45 of 57 ISDs (79%)
The strategies and organization for initial implementation need to change to meet the needs of larger-scale implementation.

Average Major Discipline Referrals per 100 Students by Cohort

Focus on Implementing with Fidelity: Benchmarks of Quality (BoQ) and ODRs, '06-'07 and '07-'08
[Chart: 8% increase vs. 14.6% decrease in ODRs]
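The metric behind these slides is office discipline referrals (ODRs) normalized per 100 students, compared year over year. A minimal sketch of both calculations (the enrollment and referral counts are made-up numbers, and the helper functions are our own, not a SWIS export):

```python
def odr_per_100(referrals, enrollment):
    """Major ODRs per 100 enrolled students (normalizes across school sizes)."""
    return 100 * referrals / enrollment

def pct_change(old, new):
    """Percent change from one school year to the next."""
    return 100 * (new - old) / old

# Hypothetical school: 450 major referrals among 600 students in '06-'07,
# then 380 referrals among 610 students in '07-'08.
rate_0607 = odr_per_100(450, 600)   # 75.0 per 100 students
rate_0708 = odr_per_100(380, 610)   # about 62.3 per 100 students
print(f"{pct_change(rate_0607, rate_0708):+.1f}% change")  # about -16.9%
```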

Percent of Students Meeting DIBELS Spring Benchmark, by Cohort (Combined Grades)
[Chart: students assessed per cohort: 5,943; 8,330; 16,078; 32,257. Spring '09: 62,608 students assessed in cohorts 1-4]

Percent of Students at DIBELS Intensive Level across Years, by Cohort

North Carolina Positive Behavior Interventions & Support Initiative
Heather R. Reynolds, NC Department of Public Instruction
Bob Algozzine, Behavior and Reading Improvement Center

Suspensions per 100 students

Cedar Creek Middle School, Franklin County, North Carolina

North Carolina Positive Behavior Support Initiative (Dr. Bob Algozzine)
[Chart: schools with low ODRs and high academic outcomes, plotting office discipline referrals per 100 students against the proportion of students meeting the state academic standard]

Evaluation Report
Implications
– Policy
– Practice
– Technical assistance
Future Directions

Summary
– Building an evaluation plan for SWPBIS implementation
– Collecting and organizing data
– Reporting data for active decision-making