Using Data for Decision Making
School-wide Information System
Teri Lewis-Palmer, Anne Todd, Rob Horner, George Sugai, & Shanna Hagan-Burke

Assumptions
- School has a team focused on school-wide behavior support
- Team has an action plan
- Team meets regularly (weekly or every two weeks)
- Team has access to information about student behavior

Why Collect Discipline Information?
- Decision making
- Professional accountability
- Decisions made with data (information) are more likely to (a) be implemented, and (b) be effective

Key features of data systems that work
- The data are accurate
- The data are very easy to collect (1% of staff time)
- Data are used for decision making
  - The data must be available when decisions need to be made (weekly?)
  - Data needs at a school building differ from data needs for a district
  - The people who collect the data must see the information used for decision making

What data to collect for decision making?
- USE WHAT YOU HAVE
  - Office discipline referrals/detentions
    - A measure of the overall environment: referrals are affected by (a) student behavior, (b) staff behavior, and (c) administrative context
    - An under-estimate of what is really happening
    - Office referrals per day per month
  - Attendance
  - Suspensions/expulsions
  - Vandalism
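The "office referrals per day per month" metric above is a simple rate: total referrals in a month divided by the number of school days in that month. A minimal sketch, assuming referrals are stored as a list of dates and that you know each month's school-day count (both structures are illustrative, not a SWIS export format):

```python
from collections import Counter
from datetime import date

def referrals_per_day_per_month(referral_dates, school_days_per_month):
    """Average office referrals per school day, by month.

    referral_dates: iterable of datetime.date, one entry per referral.
    school_days_per_month: {(year, month): number of school days}.
    """
    counts = Counter((d.year, d.month) for d in referral_dates)
    return {
        month: counts.get(month, 0) / days
        for month, days in school_days_per_month.items()
    }

# Example: 3 referrals in September (20 school days), 1 in October (22 days)
dates = [date(2006, 9, 5), date(2006, 9, 12), date(2006, 9, 26), date(2006, 10, 3)]
rates = referrals_per_day_per_month(dates, {(2006, 9): 20, (2006, 10): 22})
```

Dividing by school days (rather than calendar days) keeps short months and months with breaks comparable.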

Office Discipline Referral Processes/Form
- Coherent system in place to collect office discipline referral data
  - Faculty and staff agree on categories
  - Faculty and staff agree on process
  - Office discipline referral form includes needed information
    - Name, date, time
    - Staff
    - Problem behavior
    - Location

When Should Data Be Collected?
- Continuously
- Data collection should be an embedded part of the school cycle, not something "extra"
- Data should be summarized prior to meetings of decision makers (e.g., weekly)
- Data will be inaccurate and irrelevant unless the people who collect and summarize it see the data used for decision making

Using Office Discipline Referrals for Team Planning
- School-wide systems
- Non-classroom setting systems
- Classroom systems
- Individual student support systems

Sugai, Sprague, Horner & Walker (in press)
- 11 elementary schools, 9 middle schools
- For the 9 middle schools:
  - Number of students: mean = 635
  - Office discipline referrals: mean = 1,535
  - Referrals per student: mean = 2.4
  - Referrals per school day: mean = 8.6
  - % of students with at least 10 referrals = 5.4%
  - % of referrals from top 5% of students = 40%
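The per-school statistics above can be computed from a simple per-student referral tally. A sketch under assumed inputs (a dict of referral counts by student, total enrollment, and school days; names and record shape are hypothetical):

```python
def referral_summary(referrals_by_student, enrollment, school_days):
    """Summary statistics in the style of the figures above.

    referrals_by_student: {student_id: referral count}; students with
    zero referrals may be omitted, so divide by enrollment, not len().
    Returns (referrals per student, referrals per school day,
             % of students with 10+ referrals, % of referrals from top 5% of students).
    """
    total = sum(referrals_by_student.values())
    per_student = total / enrollment
    per_day = total / school_days
    ten_plus = sum(1 for n in referrals_by_student.values() if n >= 10)
    pct_ten_plus = 100 * ten_plus / enrollment
    # Share of all referrals accounted for by the top 5% of students
    top_n = max(1, round(enrollment * 0.05))
    top = sorted(referrals_by_student.values(), reverse=True)[:top_n]
    pct_from_top5 = 100 * sum(top) / total if total else 0.0
    return per_student, per_day, pct_ten_plus, pct_from_top5

# Tiny worked example: 20 students enrolled, 10 school days, 3 students referred
stats = referral_summary({"S01": 12, "S02": 5, "S03": 3}, enrollment=20, school_days=10)
```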

Focus on School-Wide Systems if:
- More than 35% of students receive one or more referrals
- Average referrals per student is greater than 2.5

Focus on Non-Classroom Systems if:
- More than 35% of referrals come from non-classroom settings
- More than 15% of students who receive a referral are referred from non-classroom settings

Focus on Classroom Systems if:
- More than 50% of referrals are from classroom settings
- More than 40% of referrals come from fewer than 10% of the classrooms

Focus on Individual Student Systems
- Targeted group interventions
  - If 10 or more students have 10+ referrals
    - Example: check-in/check-out (BEP)
- Targeted individual interventions
  - Fewer than 10 students
    - Intense, individualized support
    - Wraparound
    - Personal futures planning
    - Functional assessment
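The four decision rules above are mechanical threshold checks, so a team could apply them in one pass over its summary numbers. A sketch using the thresholds from the preceding slides (function and parameter names are illustrative; percentages are on a 0-100 scale):

```python
def systems_to_review(pct_students_with_referral, referrals_per_student,
                      pct_referrals_nonclassroom, pct_referred_students_nonclassroom,
                      pct_referrals_classroom, pct_referrals_top10pct_classrooms,
                      students_with_10plus):
    """Return the list of systems the decision rules flag for review."""
    focus = []
    if pct_students_with_referral > 35 or referrals_per_student > 2.5:
        focus.append("school-wide")
    if pct_referrals_nonclassroom > 35 or pct_referred_students_nonclassroom > 15:
        focus.append("non-classroom")
    if pct_referrals_classroom > 50 or pct_referrals_top10pct_classrooms > 40:
        focus.append("classroom")
    # Individual student systems: 10+ students with 10+ referrals suggests a
    # targeted group intervention; fewer suggests individualized supports
    if students_with_10plus >= 10:
        focus.append("individual: targeted group (e.g., check-in/check-out)")
    elif students_with_10plus > 0:
        focus.append("individual: intensive individualized support")
    return focus

flagged = systems_to_review(40, 2.0, 10, 10, 60, 20, 12)
```

Note the rules are not mutually exclusive: a school can be flagged for several systems at once, which is why the output is a list.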

Using Data for On-Going Problem Solving
- Start with the decisions, not the data
- Use data in "decision layers"
  - Is there a problem?
  - What system(s) are problematic?
  - What individuals (individual units) are problematic?
- Don't drown in the data
- It's "OK" to be doing well
- Be efficient

The Decisions/Decision Questions
- Initial self-assessment
  - Where to focus "investment" energy/time
- On-going assessment/planning
  - Is the action plan working? Should we change?
    - Decision: maintain, modify, or terminate
  - What is the problem? Where should we focus?
    - Decision: allocation of time, money, skills
  - Do we understand the problem?
  - What is the smallest effort that will produce the biggest effect?

Interpreting Office Referral Data: Is There a Problem?
- Absolute level (depending on size of school)
  - Middle schools (> 6)
  - Elementary schools (> 1.5)
- Trends
  - Peaks before breaks?
  - Gradual increasing trend across the year?
- Compare levels to last year
  - Improvement?
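These checks can be automated against a school's running rate. A sketch assuming the benchmark levels above are referrals per school day (consistent with the per-day rates cited earlier in this presentation; the function and its inputs are hypothetical):

```python
def problem_check(referrals_per_day, school_type, last_year_per_day=None):
    """Flag possible problems using the slide benchmarks.

    school_type: "middle" or "elementary".
    last_year_per_day: same-period rate from the prior year, if available.
    """
    threshold = {"middle": 6.0, "elementary": 1.5}[school_type]
    flags = []
    if referrals_per_day > threshold:
        flags.append("above typical level")
    if last_year_per_day is not None and referrals_per_day > last_year_per_day:
        flags.append("worse than last year")
    return flags
```

An empty result supports a "maintain" decision; any flag prompts the modify/terminate discussion on the following slides.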

Is There a Problem? #1 Maintain - Modify - Terminate

Is There a Problem? #2 Maintain - Modify - Terminate

Is There a Problem? #3 Maintain - Modify - Terminate

Is There a Problem? #4 Maintain - Modify - Terminate

What systems are problematic?
- Referrals by problem behavior
  - What problem behaviors are most common?
- Referrals by location
  - Are there specific problem locations?
- Referrals by student
  - Are many students receiving referrals, or only a small number of students with many referrals?
- Referrals by time of day
  - Are there specific times when problems occur?
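Each of these breakdowns is the same operation applied to a different field of the referral record: count referrals by one dimension and sort by frequency. A sketch assuming referrals are dicts with the fields collected on the referral form (the record shape is illustrative, not a SWIS export format):

```python
from collections import Counter

def breakdown(referrals, field):
    """Count referrals by one dimension (behavior, location, student, hour),
    most frequent first -- the data behind each of the charts that follow."""
    return Counter(r[field] for r in referrals).most_common()

# Example: which problem behavior is most common?
sample = [
    {"behavior": "defiance", "location": "classroom", "student": "S14", "hour": 10},
    {"behavior": "defiance", "location": "cafeteria", "student": "S02", "hour": 12},
    {"behavior": "fighting", "location": "playground", "student": "S14", "hour": 12},
]
by_behavior = breakdown(sample, "behavior")
```

Running the same function with `"location"`, `"student"`, or `"hour"` answers the other three questions without any new code.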

Referrals by Problem Behavior

Referrals per Student

Referrals by Time of Day

Combining Information
- Is there a problem?
  - What data did you use?
- What systems are problematic?
- Where do you need to focus?
  - The next level of information needed
- What information is NOT needed?

What Individuals/Specific Units Are Problematic? Detailed Data Sources
- Individual student data
- Direct observation
- Faculty/staff report

Designing Solutions
- If many students are making the same mistake, it is typically the system that needs to change, not the students.
- Teach, monitor, and reward before relying on punishment.