
Evaluation for School-Wide PBIS. Bob Algozzine, University of North Carolina at Charlotte; Rob Horner, University of Oregon. www.pbis.org. December 9, 2011.




1 Evaluation for School-Wide PBIS. Bob Algozzine, University of North Carolina at Charlotte; Rob Horner, University of Oregon. www.pbis.org. December 9, 2011. Acknowledgements: George Sugai, Susan Barrett, Celeste Rossetto Dickey, Lucille Eber, Donald Kincaid, Timothy Lewis, Tary Tobin

2 Goals/Assumptions for Webinar Goals: Participants will be able to… – Define the core elements of an evaluation plan – Define a schedule for evaluation data collection – Define the core elements of an evaluation report Assumptions: Participants already… – Can define core features of SWPBIS – Are implementing SWPBIS – Have reviewed the SWPBIS Evaluation Blueprint

3 Assumptions Every evaluation is unique. Any specific evaluation report will be affected by: – Funding and political focus – Number of years implementing PBIS – Focus on Tier II and Tier III – Additional emphasis areas (Wraparound, Bully Prevention, Discipline Disproportionality)

4 Supporting Materials Tobin et al., 2011 Fidelity Matrix Childs, George & Kincaid, 2011 (eval brief) Outline of Evaluation Report Evaluation Blueprint Implementation Blueprint Exemplar state evaluation reports – (Illinois, Colorado, Missouri, Maryland, Florida, North Carolina)

5 Tobin et al 2011 (Abstract) Fidelity of implementation, also known as treatment integrity, has long been considered important for research purposes. Assessment of the impact of an intervention will be relevant only if the extent to which the intervention was implemented with fidelity is first determined. An emerging emphasis is being placed on the use of fidelity assessments to promote initial and sustained implementation of program interventions in schools. This paper examines how fidelity measurement is becoming part of the independent variable with respect to program implementation in schools, using School Wide Positive Behavior Interventions and Supports (SWPBIS) as an example. The current status of fidelity measures within SWPBIS implementation is documented with respect to Messick’s (1988) criteria for types of validity. Lessons learned from this example may pose directions and considerations for future implementation efforts in terms of instrument development, use in the implementation of SWPBIS, and other educational innovations. Key words: Fidelity of implementation, school-wide, positive behavior interventions and supports, PBIS

6 BoQ use (pbis.org/evaluation) Stability in Variant Administration Methods of the School-Wide PBS Benchmarks of Quality (BoQ) Karen Elfner Childs, Heather Peshak George, Don Kincaid The School-Wide PBS Benchmarks of Quality is now being used by thousands of schools across the country. One of the most frequently asked questions regarding the BoQ has been related to the procedures used for scoring. As a result, we decided to investigate the following question, “Is the BoQ stable and will similar scores result if administered in the method used in the validation study versus other methods?”

7 Foundations Evaluation is the process of gathering data to – Answer questions – Make decisions Define the core questions and decisions before selecting measures. Provide an adequate context for interpreting the data. Effective evaluation is cyclical (repeated) – Use the evaluation process for improvement

8 Evaluation Questions/Cycle Cycle: Measure, Perform, Plan, Compare. Question areas: Context, Input, Fidelity, Impact, Replication, Sustainability

9 Evaluation Cycle Core Questions Evaluation Plan Data Collection Evaluation Report

10 Core Evaluation Questions Context What are/were the goals and objectives for SWPBIS implementation? (state/district capacity; school adoption; student outcomes) Who delivered the training and technical assistance for SWPBIS implementation? Input What professional development events were part of SWPBIS implementation support? – Was the projected level of TA capacity provided (training/coaching)? Who received the professional development (schools/cohorts)?

11 Core Evaluation Questions Fidelity To what extent was Tier I SWPBIS implemented with fidelity? To what extent were Tier II and Tier III SWPBIS implemented with fidelity? To what extent is the Leadership Team establishing core functions?

12 Leadership Team (Active Coordination): Funding, Visibility, Political Support, Policy; Training, Coaching, Evaluation, Behavioral Expertise; Local School/District Teams/Demonstrations. Sugai et al., www.pbis.org

13 Core Evaluation Questions Impact To what extent is SWPBIS associated with changes in student behavioral outcomes? To what extent is SWPBIS associated with changes in academic performance, and dropout rates? To what extent is district/state capacity established? ( local training, coaching, evaluation, behavioral expertise) To what extent is leadership and policy structure established?

14 Core Evaluation Questions Replication, Sustainability, and Improvement To what extent do Fidelity and Impact outcomes sustain across time? To what extent does initial SWPBIS implementation affect Implementation with later cohorts? To what extent did SWPBIS implementation change educational/behavioral capacity/policy? To what extent did SWPBIS implementation affect systemic educational practice?

15 Evaluation Plan Context – Evaluation questions, stakeholders, purpose(s) – Schedule of reports for stakeholder decision-making Input – Who will provide what TA/Training (and how much) to whom Fidelity – What data, when collected, by whom to assess: Leadership Team, School teams Tier I, Tier II, Tier III Impact – What data, when collected, by whom to assess: District/state capacity (training, coaching, expertise, evaluation) Student behavior Student academic Replication, Sustainability, Policy

16 Evaluation Plan Context – Define the roles in the evaluation – Stakeholders, Implementers, Adopters, Evaluators – Define the purpose of the evaluation – What decisions are to be affected by the evaluation? – What questions need to be answered to make informed decisions? – Define the basic goals, timeline, and constraints associated with the implementation effort

17 Evaluation Plan Input – Who will provide what training and technical assistance… on what schedule… to improve the capacity of the leadership team to establish capacity to sustain and scale up SWPBIS? – Who will provide what training and technical assistance to school teams to result in Tier I implementation of SWPBIS? – Who will provide what training and technical assistance to school teams to result in Tier II and Tier III implementation of SWPBIS?

18 Evaluation Plan Fidelity
Content | Measure(s) | Schedule
Tier I SWPBIS Fidelity | TIC (progress monitor); BoQ (once TIC is at criterion); SET (for research measures) | TIC (every 3-4 meetings); BoQ or SET (annually in spring)
Tier II and Tier III SWPBIS Fidelity | MATT (progress monitor); BAT (annual); ISSET (for formal research) | MATT (every 3-4 team meetings); BAT or ISSET (annually in spring)

19 Evaluation Plan Impact
Content | Measure | Schedule
Leadership Team Capacity | Implementation Self-Assessment | Annually (spring), for use in planning for fall
Behavior | ODRs (SWIS); Attendance; Graduation Rate | Continuous; Annual
Academic | (CBM) Oral Reading Fluency; Standardized Assessments; Graduation | Start of school, winter, early spring; Annually (spring); Spring

20 Data Collection 8 core PBIS measures – (Tobin et al., 2011; Childs et al., 2011) Basic logic – Research quality measures – Annual self-assessment measures – Progress monitoring measures – (to be used every 3-4 meetings/cycles) Fidelity measure matrix (Tobin & Vincent)

21 Data Collection
Level of Support | Research Measures | Annual Self-Assessment Measures | Progress Monitoring Measures
Universal | School-wide Evaluation Tool (SET) | Self-Assessment Survey (SAS); Benchmarks of Quality (BoQ) | Team Implementation Checklist (TIC)
Secondary and Tertiary | Individual Student School-wide Evaluation Tool (I-SSET) | Benchmarks of Advanced Tiers (BAT) | Measure of Advanced Tiers Tool (MATT)
Leadership Team | SWPBIS Implementation Self-Assessment

22 Data Collection Schedule (Pre, then Years 1-3; each year has columns S, F, W, S)
Universal SWPBIS: Progress Monitoring (TIC): XXX XXX; Annual Self-Assessment (BoQ): X X X; Research Measure (SET): X X X; Self-Assessment Survey (SAS): X X X X
Secondary/Tertiary: Progress Monitoring (MATT): XXX XXX; Annual Self-Assessment (BAT): X X; Research Measure (I-SSET): X X
Leadership Team: Implementation Self-Assessment: XXXX

23 Data Collection Schedule (Pre, then Years 4-6; each year has columns S, F, W, S)
Universal SWPBIS: Annual Self-Assessment (BoQ): X X X (TIC, SET, and SAS not scheduled)
Secondary/Tertiary: Annual Self-Assessment (BAT): X X X (MATT and I-SSET not scheduled)
Leadership Team: Implementation Self-Assessment: XXX

24 Evaluation Report Basic Outline – Context, Input, Fidelity, Student Outcomes, Future Directions Local Adaptation – Cohorts – Locally important questions Disproportionality Bully Prevention Cost – Change in evaluation focus over time Increased emphasis on Tier II and Tier III Examples – At www.pbis.org/evaluation

25 Evaluation Report Context Define SWPBIS – SWPBIS is a framework for establishing a school-wide social culture with the individualized supports needed for all students to achieve academic and social success. Define goals of the specific project – Number of schools per year implementing SWPBIS How schools were selected Expectation of 2-3 years for Tier I implementation to reach criterion Expectation of 2 years of criterion implementation to affect academic outcomes – Development of district/state capacity Capacity needed for sustained and scaled implementation – Behavioral and academic outcomes for students Student outcomes linked to fidelity of implementation Define stakeholders/evaluation questions – Evaluation report is written at the request of: – Evaluation report is focused on the following key questions:

26 Evaluation Report Input Who received what support, and from whom? – Leadership team – Local Capacity Building Training, Coaching, Behavioral Expertise, Evaluation – School teams

27 Leadership Team – Planning Dates – Development of Implementation Plan – Dates of SWPBIS Implementation Self- Assessment Capacity Building – Training, Coaching, Behavioral Expertise, Evaluation Data collection systems

28

29 Schools; Coaches; Trainers

30 Number of Schools, Use of Fidelity Data, and Access to ODR Data

31 Evaluation Report Impact on SWPBIS Fidelity Leadership Team SWPBIS Implementation Self-Assessment Capacity Development Number of trainers/coaches available to support teams/districts Behavioral expertise available to support Tier II and Tier III implementation Evaluation capacity (data collection, data use, information distribution) School Teams Tier I Implementation (TIC, BoQ, SET, SAS) Collectively, and/or by training cohort Tier II / Tier III Implementation (MATT, BAT, ISSET) Collectively, and/or by training cohort Additional measures of fidelity Phases of Implementation CICO checklist

32 Evaluation Report

33 Fidelity of Leadership Team development – SWPBIS Implementation Self-Assessment sub-scales

34 Evaluation Report Fidelity (TIC, BoQ, SET, SAS) Total Score – Schools

35 Evaluation Report Fidelity (cohort) Total Score N = 17

36 Evaluation Report Fidelity (subscales) TIC

37 Evaluation Report BoQ Subscale Report

38 Evaluation Report Tier II and Tier III Fidelity Measure (cohort, N = 17): ISSET/BAT/MATT Total Scores, Time 1 and Time 2

39 Evaluation Report Fidelity (Tier II & III)

40 Tier I, Tier II, Tier III Criterion: Count

41 Tier I, Tier II, Tier III Criterion: Percentage

42 Evaluation Report Impact: Student outcomes

43 Steve Goodman sgoodman@oaisd.org www.cenmi.org/miblsi

44 Participating Schools 2004 Schools (21) 2005 Schools (31) 2006 Schools (50) 2000 Model Demonstration Schools (5) 2008 Schools (95) 2009 Schools (150*) Total of 512 schools in collaboration with 45 of 57 ISDs (79%) The strategies and organization for initial implementation need to change to meet the needs of larger scale implementation.

45 Average Major Discipline Referral per 100 Students by Cohort
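The cohort chart above reports major office discipline referrals (ODRs) normalized per 100 students, so schools of different enrollments can be compared on the same scale. A minimal sketch of that calculation; the school names and counts below are invented for illustration only:

```python
# Hypothetical example data: each record is one school's annual totals.
# Names, enrollments, and referral counts are illustrative, not real data.
schools = [
    {"name": "School A", "enrollment": 450, "major_odrs": 540},
    {"name": "School B", "enrollment": 620, "major_odrs": 310},
]

def odr_per_100(major_odrs, enrollment):
    """Rate of major office discipline referrals per 100 enrolled students."""
    return 100 * major_odrs / enrollment

for school in schools:
    rate = odr_per_100(school["major_odrs"], school["enrollment"])
    print(f"{school['name']}: {rate:.1f} major ODRs per 100 students")
```

The same normalization applies to the suspensions-per-100-students figures later in the report; averaging these per-school rates within a training cohort yields the cohort-level values plotted by year.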

46 Focus on Implementing with Fidelity using Benchmarks of Quality (BoQ)/ODR, ’06-’07 and ’07-’08 (chart labels: Increase 8%; Decrease 14.6%)

47 Percent of Students meeting DIBELS Spring Benchmark for Cohorts 1-4 (Combined Grades): 5,943 students assessed; 8,330 students assessed; 16,078 students assessed; 32,257 students assessed. Spring ’09: 62,608 students assessed in cohorts 1-4

48 Percent of Students at DIBELS Intensive Level across year by Cohort

49 North Carolina Positive Behavior Interventions & Support Initiative Heather R. Reynolds NC Department of Public Instruction Bob Algozzine Behavior and Reading Improvement Center http://www.dpi.state.nc.us/positivebehavior/

50 Suspensions per 100 students

51 Cedar Creek Middle School Franklin County, North Carolina

52 North Carolina Positive Behavior Support Initiative Dr. Bob Algozzine Schools with Low ODRs and High Academic Outcomes Office Discipline Referrals per 100 Students Proportion of Students Meeting State Academic Standard

53 Evaluation Report Implications – Policy – Practice – Technical Assistance Future Directions

54 Summary Building an Evaluation Plan for SWPBIS Implementation Collecting and Organizing Data Reporting Data for Active Decision-Making

