Chris Borgmeier, Dave McKay, Anne Todd, Celeste Dickey, Rob Horner October 2008


 Clarify evaluation questions for effective decision making around discipline and school climate
 Define critical features for organizing the data
 Define data sources needed
◦ PBSEval for discipline, SIS for achievement & attendance
 Define the cycles and timelines for data collection and reporting

 Guide decision making
◦ Plan
◦ Progress monitor
◦ Inform change/need
 Visibility/support
 FUNDING
 Celebrate accomplishments

 How many schools are adopting PBIS?
◦ What percentage of schools in the state, region/district? (SET, TIC, BoQ)
 How many schools are implementing SWPBIS at criteria? (SET, TIC, BoQ)
 Are schools implementing SWPBIS perceived as safe? (School Safety Survey)
 What are discipline patterns? (SWIS)
 What are attendance patterns? (SIS)
 What are achievement patterns? (SIS, state testing scores, DIBELS, AIMSweb)

 Written documents
◦ Visual/graphic display
◦ Easy to read
 Oral reports/presentations
◦ Status reports
 Website
◦ Post surveys and other resources
◦ Schedule reminders to schools to complete surveys
 Make fair comparisons
◦ Account for 20% change in attendance
◦ Calculate per 100 students
◦ Percent of total
◦ Account for days of the week if used for a report
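The "calculate per 100 students" comparison above is what makes referral counts comparable across schools of different sizes; a minimal sketch (the function name and the example figures are illustrative, not from SWIS):

```python
def odr_per_100_per_day(referrals: int, enrollment: int, school_days: int) -> float:
    """Office discipline referrals per 100 students per school day."""
    return (referrals / enrollment) * 100 / school_days

# Example: 180 referrals, 450 students enrolled, 90 school days so far
rate = odr_per_100_per_day(180, 450, 90)
print(round(rate, 2))  # 0.44
```

Normalizing this way lets a 450-student school and a 1,200-student school be plotted on the same axis.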

 District/Regional Leadership Team Meetings
◦ To guide decision making (quarterly, annually)
◦ Monitor progress
◦ Annual action planning
 School Board Meetings
 State Leadership Team Meetings
 State/regional conferences
 Newsletters
 Establish a rhythm

An Evaluation Model for School-wide PBS (designed for a school or cohort of schools)

Measure                      Administrations across Years 1-3 (quarterly schedule)
EBS Survey                   1
Team Checklist (TIC)         5
SET or BoQ                   3
Classroom Self-Assessment    4
CISS                         4
I-SSET or I-BoQ              2

 SWIS + PBS Surveys + PBS Eval

 Online survey application for school teams and staff (free of charge)
◦ Complete Local Coordinator Form and submit to:
 PBS Applications Manager
 1235 UO
 Eugene, Oregon
 PBS Surveys: 5 different measures for schools to use, measuring the:
◦ Percent of implementation of SW-PBS
◦ Risk and safety factors
◦ Rates of office discipline referrals
 Each of the 5 measures is used at different times and for different purposes

 Team Implementation Checklist
 EBS Self-Assessment Survey
 School Safety Survey
 Benchmarks of Quality
 SET

 PBSEval™ is
◦ a web-based progress monitoring service that permits specific district- and state-level organizations, such as the Licensee, to monitor and review selected area-wide administrative data and generate automated reports (collectively, "Data") in a manner that does not disclose personally identifiable information
 Accessing PBSEval
◦ License agreement
 State: $1,000/year
 District: $500/year
◦ Participate in a 90-minute training

284 schools; 674 schools as of 10/24/08. There are 1,239 public schools in Oregon.
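These counts feed the earlier evaluation question about the percentage of schools adopting PBIS; the arithmetic, as a quick check:

```python
schools_adopting = 674        # Oregon schools in the system as of 10/24/08
total_public_schools = 1239   # public schools in Oregon
pct = 100 * schools_adopting / total_public_schools
print(f"{pct:.1f}% of Oregon public schools")  # 54.4% of Oregon public schools
```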

SET’s x Year Elementary Schools (K-6)

TICs by Year: OR Elementary and Middle Schools

SET’s x Year Middle School

SET’s x Year High School

Nat'l Elem Mean: .39 ODR/100/day; Nat'l MS Mean: .97 ODR/100/day

Outcome Data
 ODR
 Suspension/Expulsion
 LRE
 Attendance
 Academic Achievement
 School climate surveys
 Staff retention

Implementation Data
 Self-Assessment
 SET

Outcome Data
 Academic Achievement
◦ State test scores
◦ CBM/DIBELS/EZ CBM/AimsWeb
 LRE
 Attendance

Implementation Data
 PET-R
 Healthy Systems Checklist
 In-Program Assessments
 School/Classroom Observations

 Across Schools/District-wide Data
◦ SWIS won't do it
◦ Capabilities of new SAMI?
◦ PBS Surveys won't do it
 Can access all data but have to compile it yourself
◦ DIBELS Survey
◦ PBS Eval
◦ Other programs (eSIS? etc.)
◦ Need some Excel skills
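Since district-wide compilation is left to spreadsheet skills, a short script can do the same roll-up; a sketch assuming a hypothetical per-school CSV export with `month` and `referrals` columns (not the actual SWIS or eSIS schema):

```python
import csv
from collections import defaultdict
from pathlib import Path

def compile_district_odrs(export_dir: str) -> dict:
    """Sum referral counts per month across one CSV export per school.

    Assumes each file has 'month' and 'referrals' columns (hypothetical
    format for illustration only).
    """
    totals = defaultdict(int)
    for path in Path(export_dir).glob("*.csv"):
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                totals[row["month"]] += int(row["referrals"])
    return dict(totals)
```

Pointing this at a folder of per-school exports yields a district-level month-by-month total without hand-merging spreadsheets.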

Carol Sadler

 16 schools, 12,000 students
◦ 10 elementary, 1 charter, 3 middle, 2 high
 Special programs participation
◦ 1,200 Special Education (10%)
◦ 1,800 English Language Learners (15%)
◦ 1,500 Talented and Gifted (12%)
 Socio-economic status
◦ Title 1 in 5 elementary schools
◦ Free/Reduced ranges from 7% to 58%

T-TSD’s SET Scores, years 9 and 10!

T-TSD's rates of discipline referrals across years: .52, .45, .16

For 5th-grade students who have attended T-T since kindergarten or first grade (i.e., intact groups), 94% met or exceeded the OSA Reading/Literature benchmark in

~80% of Students, ~15%, ~5% (Walker et al., 1996)
 80% received NO referrals; 90% received 0-1 referrals
 8% received 2-5 referrals
 1% received 6 or more referrals
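The referral-distribution percentages above can be computed directly from per-student referral counts; a minimal sketch using the slide's band cut-offs (`referral_bands` is an illustrative name, not an existing tool):

```python
def referral_bands(counts):
    """Percent of students with 0-1, 2-5, and 6+ referrals."""
    n = len(counts)
    bands = {"0-1": 0, "2-5": 0, "6+": 0}
    for c in counts:
        if c <= 1:
            bands["0-1"] += 1
        elif c <= 5:
            bands["2-5"] += 1
        else:
            bands["6+"] += 1
    return {k: round(100 * v / n, 1) for k, v in bands.items()}

# Example: 10 students with varying referral counts
print(referral_bands([0, 0, 0, 0, 0, 0, 0, 1, 3, 7]))
# {'0-1': 80.0, '2-5': 10.0, '6+': 10.0}
```

Teams can run this yearly to see whether their distribution approaches the ~80/15/5 pattern.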

 Dave McKay and Scott Perry