
Developing a Comprehensive State-wide Evaluation for PBS Heather Peshak George, Ph.D. Donald K. Kincaid, Ed.D.



Objectives
 Describe Florida’s evaluation system for state, district, and school levels
 Identify the critical questions that Texas needs to answer
 Describe a comprehensive model for evaluating Tier 1 PBS
 Build a scalable and sustainable system
 Review data collection procedures, tools, analysis methods, and training

Purpose of Evaluation
 To examine the extent to which teams are accurately selecting and implementing PBS systems and practices
 To allow teams to determine the extent to which targeted student outcomes are being, or are likely to be, achieved
 To determine whether teams are accurately and consistently implementing activities and practices as specified in their individualized action plan
(PBIS Blueprint, 2005)

Factors to Consider in Developing Comprehensive Evaluation Systems
1) Systems Preparation – readiness activities
2) Service Provision – training and technical assistance
3) Evaluation Process – timelines
4) Evaluation Data – implementation fidelity, impact on students, attrition, client satisfaction
5) Products and Dissemination – reports, materials, presentations, etc.
(Childs, Kincaid & George, in press)

What Questions Does Texas Need to Answer?

(1) Systems Preparation
Readiness activities:
 District Readiness Checklist
 – District Action Plan
 – School Readiness Checklist
 New School Profile
 – Baseline data: ODR, ISS, OSS, academic

(2) Service Provision
Training and ongoing technical assistance:
FLPBS → Districts → Coaches → Schools

(3) Evaluation Process
Timelines for evaluation reports:
 Mid-Year I – due 10/31
 – School Profile
 – PBS Implementation Checklist (PIC)
 Mid-Year II – due 2/28
 – PBS Implementation Checklist (PIC)
 End of Year – due 6/15
 – Benchmarks of Quality (BoQ), Benchmarks for Advanced Tiers (BAT)
 – Outcome Data Summary
 – School-wide Implementation Factors (SWIF)

(4) Evaluation Data
a) Implementation Fidelity
 – PIC
 – BoQ, BAT
 – School Demographic Data
 – SWIF
 – Team Process Survey
b) Impact on Students
 – Outcome data (ODR, ISS, OSS)
 – FCAT (state test)
 – School climate surveys
 – Referrals to ESE
 – Screening ID
 – Response to intervention
c) Attrition
 – Attrition Survey
d) Client Satisfaction
 – SWIF

(a) Implementation Fidelity
1. Are schools trained in Universal PBS implementing with fidelity? At Tiers 2 and 3? Across years? Across school types?
 – BoQ, BAT, School Demographic Data
2. What factors are related to implementing with fidelity?
 – SWIF survey, BoQ, BAT
3. Do teams that work well together implement with greater fidelity?
 – Team Process Evaluation, BoQ
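As a minimal sketch of the analysis behind question 1, schools can be split into higher- and lower-implementing groups using the 70-point BoQ cutoff referenced later in this deck. The school names and scores below are invented for illustration, not real FLPBS data.

```python
# Hypothetical sketch: classify schools by Benchmarks of Quality (BoQ) total,
# using the 70-point cutoff the deck uses to distinguish higher- from
# lower-implementing schools. All names and scores here are illustrative.

def classify_by_boq(scores, cutoff=70):
    """Split a {school: BoQ total} dict into higher/lower implementing groups."""
    higher = {s: v for s, v in scores.items() if v >= cutoff}
    lower = {s: v for s, v in scores.items() if v < cutoff}
    return higher, lower

boq_totals = {"Elementary A": 82, "Middle B": 64, "High C": 71}
higher, lower = classify_by_boq(boq_totals)
# higher contains Elementary A and High C; lower contains Middle B
```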


BoQ Totals by School Type Across Years

School-Wide Implementation Factors (SWIF)

Factors MOST Helpful to Implementation of SWPBS
 Higher-implementing schools (70+ on BoQ) – identified as helpful by 90%+ of respondents:
 – Expectations and rules clearly defined
 – Administrator committed to PBS, willing to teach and model PBS, willing to reward students
 – Representative and committed PBS Team
 – Reward system works
 – PBS Coach’s guidance with the process
 – Students’ responses to rewards and activities
 Lower-implementing schools (below 70 on BoQ) – identified as helpful by 80%+ of respondents:
 – Expectations and rules clearly defined
 – Administrator willing to reward students
 – Representative PBS Team

Factors MOST Problematic to Implementation of SWPBS
 Higher-implementing schools – identified as problematic by 25%+ of respondents:
 – Adequate funding
 – Team recognition of faculty participation
 – Staff stability from year to year
 – Student stability from year to year
 Lower-implementing schools – identified as problematic by 50%+ of respondents:
 – Staff time for PBS
 – Staff beliefs about the effectiveness of PBS
 – Staff philosophy
 – Staff consistency in teaching
 – Staff consistency in discipline procedures

Descriptive Data: Teams
 Team functioning did not effectively differentiate schools implementing with high versus low fidelity, or schools with better versus worse outcomes
 Teams implementing Tier 1 PBS with fidelity saw substantially different effects on all four outcome measures

(b) Impact on Student Behavior
1. Do schools implementing SWPBS decrease ODRs, days of ISS, and days of OSS?
 – ODRs, ISS, OSS
2. Do schools implementing SWPBS realize an increase in academic achievement?
 – FCAT scores
3. Is there a difference in outcomes across school types?
 – ODRs, ISS, OSS, FCAT scores, school demographic data
4. Do schools implementing with high fidelity have greater outcomes than implementers with low fidelity?
 – BoQ, ODRs, ISS, OSS
5. Do teams that work well together have greater outcomes than those that don’t work as well together?
 – Team Process Evaluation, ODRs, ISS, OSS

Percent change in ODR, ISS, and OSS rates per 100 students before and after PBS implementation
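A minimal sketch of the rate calculation behind this chart: incidents are converted to a rate per 100 students so that schools of different sizes are comparable, then percent change is computed against the pre-PBS baseline. The counts and enrollment below are invented for illustration.

```python
# Hypothetical sketch of the per-100-students rate and percent-change
# calculation underlying this chart. Counts and enrollment are illustrative.

def rate_per_100(count, enrollment):
    """Incidents per 100 students, so schools of different sizes compare fairly."""
    return count / enrollment * 100

def percent_change(before, after):
    """Signed percent change from the baseline value (negative = decrease)."""
    return (after - before) / before * 100

odr_before = rate_per_100(450, 600)  # 75.0 ODRs per 100 students pre-PBS
odr_after = rate_per_100(300, 600)   # 50.0 ODRs per 100 students after PBS
change = percent_change(odr_before, odr_after)  # about -33.3, i.e. a decrease
```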

Academic Outcomes by Implementation Level

Percent decrease in ODR, ISS, and OSS rates per 100 students after 1 year of implementation (by school type)

ODRs by implementation level across three years of implementation

(c) Attrition
1. Why do schools discontinue implementation of SWPBS?
 – Attrition Survey

(d) Consumer Satisfaction
1. Are our consumers satisfied with the training, technical assistance, products, and support received?
 – SWIF survey
 – District Coordinators survey
 – Training evaluation

Consumer Satisfaction

(5) Products and Dissemination
 Annual Reports
 Revisions to training
 Revisions to the technical assistance process
 Dissemination activities at national, state, district, and school levels
 Revisions to website
 On-line training modules

Improvements Made
1. Increased emphasis on BoQ results for school- and district-level action planning
2. Increased training for District Coordinators and Coaches; T.A. targeted areas of deficiency based upon data
3. Team Process Evaluation no longer used
4. Academic data used to increase visibility and political support
5. Specialized training for high schools
6. Identified critical team variables impacted via training and T.A. activities
7. Revised Tier 1 PBS training to include classroom strategies and the problem-solving process within an RtI framework
8. Enhanced monthly T.A. activities

Florida’s Service Delivery and Evaluation Model

Systems Preparation
 – District Action Plan
 – District Readiness Checklist
 – School Readiness Checklist
 – New School Profile (includes ODR, ISS, OSS)

Service Provision
 – Training
 – On-going technical assistance
 – FLPBS → Districts → Coaches → Schools

Evaluation Process
 – Mid-Year Reports
 – End-of-Year Reports

Evaluation Data
 – Implementation Fidelity: Benchmarks of Quality, BAT, School Demographic Data, School-wide Implementation Factors, Team Process Survey
 – Impact on Students: outcome data (ODR, ISS, OSS), Florida Comprehensive Assessment Test, School Demographic Data, Team Process Survey
 – Attrition: Attrition Survey
 – Client Satisfaction: School-Wide Implementation Factors

Products and Dissemination
 – Annual Reports
 – Revisions to training and technical assistance process
 – National, state, district, and school dissemination activities
 – Website
 – On-line training modules

(Childs, Kincaid & George, in press)

In Summary…
1. Know what you want to know
2. Compare fidelity of implementation with outcomes – this presents a strong case for implementing PBS with fidelity
3. Additional sources of data can help a state determine not only whether the PBS process (Tiers 1–3) is working, but also why or why not it is working
4. Address state, district, and school systems issues that may impact implementation success

Resources
Childs, K., Kincaid, D., & George, H.P. (in press). A model for statewide evaluation of a universal positive behavior support initiative. Journal of Positive Behavior Interventions.
George, H.P., & Kincaid, D. (2008). Building district-wide capacity for positive behavior support. Journal of Positive Behavior Interventions, 10(1), 20-32.
Cohen, R., Kincaid, D., & Childs, K. (2007). Measuring school-wide positive behavior support implementation: Development and validation of the Benchmarks of Quality (BoQ). Journal of Positive Behavior Interventions.

Contact
Heather Peshak George, Ph.D.
 Co-PI, Co-Director & PBIS Research Partner
Phone: (813) 974-6440
Fax: (813) 974-6115
Email: flpbs@fmhi.usf.edu
Website: http://flpbs.fmhi.usf.edu

