Using Data to Problem Solve
Susan Barrett
www.pbis.org | www.pbismaryland.org

Thanks to…
Center on PBIS
Steve Goodman (Michigan)
Tim Lewis
Rob Horner
George Sugai
Catherine Bradshaw
Don Kincaid

Adopt a Systems Perspective at the Building Level
Organizations do not "behave" … individuals behave.
"An organization is a group of individuals who behave together to achieve a common goal."
"Systems are needed to support collective use of best practices by individuals in an organization." (Horner, 2001)
Schools as Systems: the goal is to create communities in which all members share a common vision, language, and experience. (Biglan, 1995; Horner, 2002)

What a Leadership Team does…
- Communicates a common vision for schoolwide supports
- Works collaboratively to establish building capacity to support all students
- Commits resources to establish procedures for support
- Develops methods for evaluating progress toward measurable outcomes
- Plans actions based on data

Should get easier for your school over time
Handbook describes core features:
- Expectations and teaching matrix (rules for settings)
- Teaching plans and teaching schedule
- Acknowledgement system
- Continuum of consequences for problem behavior
Building Leadership Team:
- Regular meeting schedule and process
- Regular schedule for annual planning and training
- Annual calendar of activities
- On-going support for staff

Use the PBIS Maryland website as a reference.

Team Time: Your Leadership Team
Does your team understand the leadership function in managing and coordinating implementation?
On a scale of 1 (low) to 5 (high), how well is your team doing with this responsibility?

Standards and Protocols
Available on
- What is a "PBIS trained" team?
- What happens if a school does not meet criteria?
- What is required to be a PBIS implementing school?
- How does a school become inactive?
- What is required for a school to be eligible for PBIS Maryland Recognition?

Purpose of Systems Measures
Benchmarks of Quality Checklist
- Evaluates status of Tier I Positive Behavior Supports
- Completed annually; required for Recognition
- Submitted to Jerry electronically on April 10
- Submitted online at
Implementation Phase Inventory
- Required for Recognition
- Due November 10 and April 10 to Jerry
Self-Assessment Survey (not required, but…)
- Evaluates status of Schoolwide, Nonclassroom, Classroom, and Individual Student Supports
- Submitted online at

What is the BOQ?
- Lists components of PBIS programs that address the critical elements of PBIS implementation
- Completed by school teams on a yearly basis to assess, on a 100-point scale, how they score with regard to developing and implementing school-wide PBIS
- Useful in developing action plans for the following year
- One of the measures used by MSDE-SPHS-JHU to determine schools achieving Exemplar Status

Three Components of the Benchmarks of Quality
Team Member Rating Form
- Completed by team members independently
- Returned to coach
Scoring Form
- Completed by coach using the Scoring Guide
- Used for reporting back to the team
Scoring Guide
- Describes the procedure for completing the BOQ
- Includes a rubric for scoring each item

The BOQ Will Provide:
- A summary of team members' perceptions of PBIS implementation (scored: ++ in place, + needs improvement, - not in place)
- An objective assessment of the school's implementation based on criteria described in a rubric (100-point scale)
- A comparison between the two, which encourages discussion of strengths and weaknesses and provides ideas for action planning
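To make that comparison concrete, here is a minimal sketch in Python. The item names, data layout, and gap threshold are invented for illustration; the actual BOQ forms are paper or online instruments, not code:

```python
from collections import Counter

# Hypothetical layout: each team member rates each BOQ item
# "++" (in place), "+" (needs improvement), or "-" (not in place).
team_ratings = {
    "Item A (hypothetical)": ["++", "++", "+"],
    "Item B (hypothetical)": ["+", "-", "-"],
}

# Hypothetical coach rubric scores per item: (points earned, points possible).
coach_scores = {
    "Item A (hypothetical)": (3, 3),
    "Item B (hypothetical)": (1, 3),
}

for item, ratings in team_ratings.items():
    tally = Counter(ratings)
    earned, possible = coach_scores[item]
    team_in_place = tally["++"] / len(ratings)  # share of team rating "in place"
    coach_in_place = earned / possible          # coach's rubric-based score
    # Arbitrary gap threshold, purely for illustration.
    flag = "DISCUSS" if abs(team_in_place - coach_in_place) > 0.33 else "ok"
    print(f"{item}: team {team_in_place:.0%}, coach {coach_in_place:.0%} [{flag}]")
```

Items where team perception and the rubric score diverge are exactly the ones worth talking through during action planning.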

What is the IPI?
Implementation Phases Inventory
- Administered two times per year (due November 10 and April 10)
- Coach completes with the Team
- Four phases: Preparation, Initiation, Implementation, Maintenance

What is the SET?
School-wide Evaluation Tool
- One of several methods to evaluate Tier 1
- Required if a school is seeking Recognition Status
- An external certified SET Assessor will conduct a site visit
Should you be a SET Assessor?

What does it measure?
The SET measures the level of implementation of SWPBIS (it is not intended to measure everything!)
The critical features:
- Expectations Defined
- Expectations Taught
- System for Encouraging Expected Behaviors
- System for Discouraging Problem Behaviors
- Monitoring and Decision Making
- Management
- District-Level Support

Why use it?
The results help PBIS teams:
- Assess the features of PBIS in place
- Determine annual goals for school-wide effective behavior support
- Evaluate on-going efforts toward school-wide behavior support
- Design and revise procedures as needed
- Compare efforts toward school-wide effective behavior support from year to year

What does it look like?
Permanent product review:
- Office Discipline Referral (ODR) form
- Current Action Plan
- Discipline Handbook/Plan
- School Improvement Plan
- Lesson Plans & Schedule
2- to 3-hour school visit:
- Observations: classroom and non-classroom settings
- Interviews: administrator, staff, PBIS team members, and students

What is the Self-Assessment Survey?
A survey used to assess the extent to which Positive Behavior Support practices and systems are in place within a school, across four areas:
- School-wide (15 items)
- Non-classroom (Specific Setting) (9 items)
- Classroom (11 items)
- Individual Student (8 items)

Who Completes the Self-Assessment Survey?
Initially, the entire staff in a school completes the Survey. In subsequent years, and as an on-going assessment and planning tool, the Survey can be completed in several ways:
- All staff at a staff meeting
- Individuals from a representative group
- Team member-led focus group

Using the Self-Assessment Information for Decision Making
Is a system in place?
- "In place" > 66%
Is there a need to focus on a system?
- Current status of "in place" is < 66%, and
- Priority for improvement is "High" for > 50%
Which system should receive focus first?
- Always establish schoolwide as the first priority
Which features of the system need attention?
- Combine survey outcomes with information on office referrals, attendance, suspensions, vandalism, and perceptions of staff/faculty
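These thresholds lend themselves to a mechanical first pass. Below is a minimal sketch in Python, assuming the survey responses have already been tallied into percentages per system; the variable names, data layout, and numbers are invented for illustration:

```python
# Hypothetical tallies: percent of respondents answering "In Place",
# and percent rating improvement priority "High", for each system.
survey = {
    "Schoolwide":    {"in_place": 72, "high_priority": 20},
    "Non-classroom": {"in_place": 55, "high_priority": 62},
    "Classroom":     {"in_place": 48, "high_priority": 40},
    "Individual":    {"in_place": 30, "high_priority": 75},
}

def needs_focus(stats):
    """Decision rule from the slide: < 66% in place AND > 50% high priority."""
    return stats["in_place"] < 66 and stats["high_priority"] > 50

for system, stats in survey.items():
    in_place = stats["in_place"] > 66  # "Is a system in place?"
    print(f"{system}: in place={in_place}, needs focus={needs_focus(stats)}")

# Schoolwide is always the first priority when it qualifies for focus.
candidates = [s for s, st in survey.items() if needs_focus(st)]
first = "Schoolwide" if "Schoolwide" in candidates else (candidates[0] if candidates else None)
print("Focus first on:", first)
```

With the invented numbers above, Non-classroom and Individual both qualify for focus, and since Schoolwide does not, the team would start with the first qualifying system.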

Individual Summary Charts
Charts are provided for each system (school-wide, nonclassroom, classroom, and individual):
Current Status charts
- Percentage of respondents who answered "In Place", "Partially In Place", and "Not In Place"
Improvement Priority charts
- Percentage of respondents who answered "High", "Medium", and "Low"

Example of PBS Self-Assessment Survey Individual Summaries Chart

Analysis of Schoolwide System Chart
Shows a chart with bars for components of the schoolwide system:
- Expectations defined (question 1)
- Expectations taught (question 2)
- Reward system (question 3)
- Violations system (questions 4-8)
- Monitoring (questions 10-12)
- Management (questions 9, 14-16)
- District support (questions 17-18)

Analysis of Schoolwide System Chart

Example of PBS Self-Assessment Survey Individual Item Score, Schoolwide Component
Legend: White = In Place, Yellow = Partially In Place, Red = Not In Place

Why conduct the Self-Assessment Survey in addition to checklists?
Checklists are completed by the team; all or most staff complete the survey.
- Look for areas of convergence across tools: increases confidence in the data
- Look for areas of divergence across tools: decreases confidence in the data?
Possible reasons for disparity:
- Lack of understanding of the questions
- Staff not fully aware of the work of the Building Leadership Team
- Support component not fully "in place"

Differences between the BOQ action plan form and the Self-Assessment Survey

Benchmarks of Quality
- Purpose: Evaluate on-going progress toward schoolwide PBS
- When administered: Monthly, to progress monitor Tier 1
- Who completes: School Leadership Team, completed as a team
- Time involved: … minutes

EBS Self-Assessment Survey
- Purpose: Evaluate the extent to which all systems (schoolwide, nonclassroom, classroom, individual) are in place
- When administered: Annually
- Who completes: All school staff (or a representative sample), completed individually
- Time involved: 30-45 minutes

To Do List
- Review results from the BOQ Checklist
- Determine if you would like your school staff to complete the Self-Assessment Survey
- Review your school's Action Plan: what is the link to the overall School Improvement Plan?
- Based on this information, complete your action plan!
- What celebrations can you share with your school community before this year is over?
- What is your plan to strengthen your school's behavior support for the next school year?
Please take a moment to complete the appropriate section of the Follow-Up Activity Worksheet to document the work yet to be done.

Student Measures
- How do we know implementation of Tier 1 PBIS is making an impact?
- What data should our team be reviewing?
- How do we build that into the agenda so it becomes standard practice?
- Do we have a core group on our team that reviews the data prior to the monthly team meeting?

Data-Based Decision Making
1. Determine what questions you want to answer
2. Determine what data will help to answer the questions
3. Determine the simplest way to get the data
4. Put a system in place to collect the data
5. Analyze the data to answer the questions
Focus on both academic and social outcomes.

1. Determine what questions you want to answer
Examples:
- Can we predict problems/success? When/where/who?
- Possible "function" of problem behavior?
- Who needs targeted or intensive academic supports?
- What environmental changes/supports are needed?

2. Determine what data will help to answer the questions
- Existing data set(s)
- Current data collection
- Additional / new data
- Confidence in accuracy?
- Complete picture?

3. Determine the simplest way to get the data
- Agreement on definitions
- Standard forms / process
- Frequency of collection
- Target "multi-purpose" data/use
- Train ALL staff on use & provide on-going TA

4. Put a system in place to collect the data
- Build on existing systems
- Add components over time
- Central entry point
- Electronic (see the record sketch below)
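As an illustration of "agreement on definitions" combined with a standard electronic form, here is a minimal sketch of an ODR record as a Python dataclass. The field names and allowed values are invented for illustration; they are not the SWIS schema or any specific product's format:

```python
from dataclasses import dataclass
from datetime import date, time

# Agreed-upon categories keep entries consistent across all staff.
LOCATIONS = {"classroom", "playground", "hallway", "cafeteria", "bus"}
BEHAVIORS = {"disrespect", "disruption", "tardy", "aggression", "other"}

@dataclass
class OfficeDisciplineReferral:
    """One row in the school's central, electronic ODR log."""
    student_id: str
    referral_date: date
    referral_time: time
    location: str        # one of LOCATIONS
    behavior: str        # one of BEHAVIORS
    referring_staff: str
    is_major: bool       # major vs. minor, per the school's definitions

    def __post_init__(self):
        # Enforce the shared definitions at the single entry point.
        if self.location not in LOCATIONS:
            raise ValueError(f"Unknown location: {self.location}")
        if self.behavior not in BEHAVIORS:
            raise ValueError(f"Unknown behavior: {self.behavior}")
```

Validating at entry is what makes the later "Big Five" summaries trustworthy: every referral uses the same vocabulary, so counts by location or behavior mean the same thing across reporters.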

5. Analyze the data to answer the questions
- Trends
- Instruction & supports in place / not in place
- Pre/post "big outcomes"
- Comparisons (norm / local)
- Relative growth and absolute growth (see the sketch below)
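To make the distinction between absolute and relative growth concrete, a minimal sketch with invented numbers:

```python
# Hypothetical yearly major-ODR totals, pre and post implementation.
pre, post = 250, 180

absolute_growth = post - pre           # change in raw counts
relative_growth = (post - pre) / pre   # change as a fraction of baseline

print(f"Absolute change: {absolute_growth} referrals")  # -70
print(f"Relative change: {relative_growth:.0%}")        # -28%
```

Absolute change answers "how many fewer referrals?", while relative change answers "how big is that improvement compared to where we started?", which is the fairer comparison across schools of different sizes.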

By Location

By Behavior

By Student

By # of Referrals

Final Thoughts
- Don't collect data for collection's sake; make sure the data inform the process
- Don't "drown" in data; keep focused on the question
- Data without context are simply numbers

Reviewing Student Measures
Answer the "Big Five" questions:
1. How often are problem behavior events occurring?
2. Where are they happening?
3. What types of problem behaviors?
4. When are the problems occurring?
5. Who is contributing?
Using the SWIS "Big Five" reports:
1. Major Discipline Referrals per Day per Month
2. Major Discipline Referrals by Location
3. Major Discipline Referrals by Problem Behavior
4. Major Discipline Referrals by Time
5. Major Discipline Referrals by Student
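SWIS generates these reports automatically. As a rough sketch of what each one computes, here is a Python version over a hypothetical list of major-ODR records; the tuple layout and values are invented, not the SWIS schema:

```python
from collections import Counter

# Hypothetical major-ODR log:
# (month, school_days_in_month, location, behavior, time_block, student_id)
odrs = [
    ("Sep", 20, "playground", "disrespect", "lunch recess",   "S01"),
    ("Sep", 20, "classroom",  "tardy",      "morning",        "S02"),
    ("Oct", 21, "playground", "disrespect", "morning recess", "S01"),
    # ... one row per major referral
]

# 1. Referrals per day per month (a rate, so short months compare fairly).
per_month = Counter(r[0] for r in odrs)
days = {r[0]: r[1] for r in odrs}
rate = {m: n / days[m] for m, n in per_month.items()}

# 2-5. By location, by problem behavior, by time, by student.
by_location = Counter(r[2] for r in odrs)
by_behavior = Counter(r[3] for r in odrs)
by_time     = Counter(r[4] for r in odrs)
by_student  = Counter(r[5] for r in odrs)

print("ODRs per day per month:", rate)
print("Top locations:", by_location.most_common(3))
```

The point of the per-day-per-month rate is that raw monthly counts mislead: a 15-day month with 30 referrals is worse than a 22-day month with 35.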

Langley Elementary School: 478 students, grades K-5
Problem Identification (look at Major Discipline Referrals per Day per Month on the next slide)
1. Is there a problem with the absolute standard?
✔ PROBLEM: ODRs per day higher than the national average

Langley Elementary School Referrals per Day per Month

Langley Elementary School: 478 students, grades K-5
Problem Identification (look at Major Discipline Referrals per Day per Month on the next slide)
1. Is there a problem with the absolute standard?
✔ PROBLEM: ODRs per day higher than the national average
2. Are there trends or patterns?
✔ TREND: 4 consecutive months of increasing trend
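A team spots a "4 consecutive months of increasing trend" by eye on the chart; as a sketch, here is the same check in Python over hypothetical monthly rates. The benchmark value is a placeholder, not the published national norm:

```python
# Hypothetical ODRs-per-day rates by month (invented values).
monthly_rate = [("Sep", 0.8), ("Oct", 0.9), ("Nov", 1.1), ("Dec", 1.4), ("Jan", 1.6)]

NATIONAL_AVG = 1.0  # placeholder; look up the norm for your school's size and level

# Absolute standard: is the latest rate above the benchmark?
latest_month, latest_rate = monthly_rate[-1]
print("Above national average:", latest_rate > NATIONAL_AVG)

# Trend: count consecutive month-over-month increases ending at the latest month.
streak = 0
for (_, prev), (_, curr) in zip(monthly_rate, monthly_rate[1:]):
    streak = streak + 1 if curr > prev else 0
print(f"Consecutive increasing months: {streak}")  # 4 with these values
```

Both checks matter: a school can be below the national average yet trending upward, or above average but clearly improving.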

Langley Elementary

- PROBLEM: ODRs per day higher than the national average
- TREND: 4 consecutive months of increasing trend
- Happening mostly on the playground
- Tardiness a problem
- Disrespect also a problem
- Happening during morning and lunch recess periods
- About 3% of students with 2 or more ODRs; 12 students with 5 or more ODRs; 5 students with >30 ODRs (see the sketch below)
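The student-level cuts are simple threshold counts over the by-student report, and they feed directly into the solution planning on the next slide. A sketch with invented counts:

```python
from collections import Counter

# Hypothetical per-student major-ODR counts (student_id -> count).
odr_counts = Counter({"S01": 1, "S02": 3, "S03": 7, "S04": 31, "S05": 2})
enrollment = 478

two_plus  = [s for s, n in odr_counts.items() if n >= 2]
five_plus = [s for s, n in odr_counts.items() if n >= 5]

print(f"{len(two_plus) / enrollment:.1%} of students with 2+ ODRs")
print(f"{len(five_plus)} students with 5+ ODRs (candidates for targeted support)")
```

Thresholds like 2+ and 5+ referrals are common screening cut points for moving students toward targeted or intensive supports, but each team should set its own criteria.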

Using Data to Build Solutions
Prevention: How can we avoid the problem context? (Who, when, where; schedule change, curriculum change, etc.)
Teaching: How can we define, teach, and monitor what we want? (Teach appropriate behavior; use problem behavior as a negative example)
Recognition: How can we build in systematic reward for desired behavior?
Extinction: How can we prevent problem behavior from being rewarded?
Consequences: What are efficient, consistent consequences for problem behavior?
How will we collect and use data to evaluate (a) implementation fidelity and (b) impact on student outcomes?

Next Steps
- Review Standards and Protocols
- Review data requirements to be an implementing school
- Review requirements for Recognition
- Use handouts to build a best-practice routine with your school team