Spring Data Review Workday School Leadership Teams May 20 & 21, 2014.


1 Spring Data Review Workday School Leadership Teams May 20 & 21, 2014

2 Acknowledgements The material for this training day was developed by Ingham ISD: Theron Blakeslee, John Endahl, Melanie Kahler, Matt Phillips, Jeanne Tomlinson, Kelly Trout, Nate Stevenson, Laura Colligan, and Mary Jo Wegenke. Content is based on the work of the MiBLSi project; George Batsche, University of South Florida; Robert Balfanz, Everyone Graduates Center and Johns Hopkins University; Roland Good and Rob Horner, University of Oregon; George Sugai, University of Connecticut; Joe Torgesen, Florida Center for Reading Research; and Dawn Miller, Shawnee Mission School District, Kansas.

3 Learning Targets Participants will be able to: analyze process data and outcome data to identify academic and/or behavior areas in need of improvement; make connections between process and outcome data and their impact on student achievement; and identify an academic and/or behavioral priority based upon the data analysis and use the Continuous Improvement Process to address the priority.

4 Where to access materials for today: 1. POMPOMS! The documents we are using today are on flash drives attached to ISD pompoms (cute as they are, please don't take them home!); OR 2. MTSS Implementers Website, http://mtss-implementers.wiki.inghamisd.org -> Building Data Review page.

5 Agenda 9:00-9:05 Welcome; 9:05-9:30 Review modifications to Data Toolkit and Problem Solving Guide, Illuminate Data Analysis; 9:30-11:45 Team Time to work through the School Improvement Continuous Improvement Cycle; 11:45-12:00 District Check-in: Celebrations and Areas to Grow; 12:00-1:00 Lunch; 1:00-3:15 Continue Team Time; 3:15-3:30 District Check-in: Celebrations and Areas to Grow, Session Evaluation.

6 Data Review Workday Working Agreements Participants: work through the Continuous Improvement Process using current data to identify school-wide and grade-level needs and create an action plan to address area(s) of concern. Administrator: assist staff to make instructional decisions based on data; assist with resource allocation so action items can be implemented. Ingham ISD: provide tools to assist with data analysis; provide guided facilitation for problem solving using the Continuous Improvement Cycle.

7 Materials you will need today: Data Review Workbook (MTSS Wiki & hardcopy); Problem Solving Guide (MTSS Wiki & hardcopy); Worked Example Problem Solving Guide (MTSS Wiki & hardcopy); log-in information: Illuminate Ed, BAA, pbisapps.org, and SWIS; Process Data: PETR/SWEPT, PET-M, BSA, BoQ, SAS.

8

9 Role of the School Leadership Team Acts on school-wide data (process data and student outcomes) on a regular basis; sends content-area-specific information to the appropriate staff to address during content area meetings; provides all stakeholders, including teachers, support staff, volunteers, and parents, with an overview of the data, areas for celebration, and areas targeted for growth; utilizes work groups to address relevant needs; follows through on action plans and updates progress along the way; sends school-wide information to district-level staff.

10 New and Improved Materials: Page 4 in the Data Toolkit; Trend Data columns on the School-wide Literacy and Mathematics Overview forms; vocabulary in the Problem Solving Guide aligned with the School Improvement Process; Spring to Fall Transition Action Plan.

11 Evaluating Previous Plan: new Page 4 in the Data Toolkit

12 Schoolwide Overview - Reading

13 Schoolwide Overview - Mathematics

14 Problem Solving Guide Data-based Problem Solving

15 Connection to School Improvement

16 Scheduling of Action Items First Few Days of School

17 Schoolwide Overview - Academics Where to find the academic data! Record information on the Illuminate note-taking form.

18 Data Analysis… Something to think about: What assumptions do we bring to this discussion? What important points seem to pop out? What patterns, categories, or trends are emerging? What seems to be surprising or unexpected? What additional data sources do we need to explore? What inferences, explanations, or conclusions might we draw? What solutions might we explore as a result of our conclusions? (got data? Now What?, Solution Tree Press, 2012)

19 5 Reasons Why the Problem Exists (got data? Now What?, Solution Tree Press, 2012)

20 Pursuing Worthy Problems An issue recurs with frequency, year after year. An issue is pervasive across multiple grade levels, student groups, or school settings. An issue consumes high levels of energy, time, and resources. Even after an improvement bump, performance plateaus and subsequent data flatline. (got data? Now What?, Solution Tree Press, 2012)

21 Remember… The Building Leadership Team does not have to solve every problem; needs to study building data to determine school-wide and grade-level priorities; and will identify the appropriate individual(s) who will address these needs (e.g., which grade-level teams need to address the identified needs).

22 Team Time (You do!) Review/update the previous action plan (Page 4 in the Data Toolkit). Use the school-wide overview sheets for problem identification. Prioritize "problems." Identify a problem, complete the Continuous Improvement Process, and create an action plan. Move on to a second (and third) problem, if able. Identify building Glows and Grows.

23 Interpreting BSA Data

24 Schoolwide Overview - Behavior

25 Process Data - Behavior Benchmarks of Quality (BoQ): completed annually by school leadership teams; a Tier 1 SWPBIS implementation fidelity check; 53 benchmarks across 10 critical elements of implementation; identifies areas of strength and need and informs problem analysis and action planning; 70% implementation goal. Self-Assessment Survey (SAS): completed annually by building staff; a fidelity check of PBIS implementation across (a) school-wide, (b) non-classroom, (c) classroom, and (d) individual students; seven key elements of the Implementation Subsystems; identifies areas of strength and need, including communication between the leadership team and staff; 70% implementation goal. (pbisapps.org)
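
Both tools summarize results as an overall implementation percentage compared against the 70% goal. As a quick illustration of that arithmetic only, here is a minimal Python sketch; the critical-element names, point values, and data layout are made-up placeholders, not the actual BoQ or SAS scoring rubric or a pbisapps.org export.

# Minimal sketch (hypothetical data): tally a fidelity percentage against the
# 70% implementation goal named on this slide. Element names and point values
# below are illustrative, not the real BoQ/SAS scoring rubric.

def implementation_percent(earned, possible):
    """Percent of possible points earned, rounded to one decimal place."""
    if possible <= 0:
        raise ValueError("possible points must be positive")
    return round(100 * earned / possible, 1)

# (points earned, points possible) per critical element -- made-up numbers
elements = {
    "PBIS Team": (5, 6),
    "Faculty Commitment": (4, 6),
    "Discipline Procedures": (8, 12),
    "Data Entry & Analysis": (6, 8),
    "Expectations & Rules": (9, 10),
}

earned = sum(e for e, _ in elements.values())
possible = sum(p for _, p in elements.values())
overall = implementation_percent(earned, possible)
print(f"Overall fidelity: {overall}% (70% goal {'met' if overall >= 70 else 'not met'})")

# An element-level view helps target action planning toward the weakest areas.
for name, (e, p) in elements.items():
    print(f"  {name}: {implementation_percent(e, p)}%")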

26 Process Data - Behavior

27 Schoolwide Overview - Behavior

28 Outcome Data - Behavior

29

30 Early Warning Signs

31 Early Warning Signs (EWS) Routinely available data, available early in the school year; a better predictor than background characteristics; cut points selected to balance yield and accuracy; helps target interventions; informs of patterns and trends.

32 Early Warning Signs (EWS) ATTENDANCE: missing more than 10% of instructional time. BEHAVIOR: suspensions (ISS or OSS) and minor or major ODRs; an ISS or OSS means 6 hours of academic instruction lost per day, and each ODR means 20 minutes of academic instruction lost for the student per referral. COURSE PERFORMANCE: course failures, grade point average, credit accrual. Combinations of academic indicators can reduce graduation likelihood to 55%.
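
To make the screening logic concrete, here is a minimal sketch of how a team's data person might flag students against these cut points. The record layout, field names, and the rule that any suspension or any course failure trips a flag are illustrative assumptions, not an Illuminate or SWIS export format.

# Hypothetical EWS screen using the cut points named above: attendance flag at
# >10% of instructional time missed; behavior flag on any suspension; course
# flag on any failed course. All names and thresholds here are assumptions.

from dataclasses import dataclass

@dataclass
class StudentRecord:
    name: str
    days_enrolled: int
    days_absent: int
    suspensions: int        # ISS + OSS events
    course_failures: int

def ews_flags(s):
    """Return the list of early-warning indicators this student trips."""
    flags = []
    if s.days_enrolled and s.days_absent / s.days_enrolled > 0.10:
        flags.append("attendance")          # missing >10% of instructional time
    if s.suspensions > 0:
        flags.append("behavior")
    if s.course_failures > 0:
        flags.append("course performance")
    return flags

students = [
    StudentRecord("Student A", days_enrolled=90, days_absent=12, suspensions=0, course_failures=1),
    StudentRecord("Student B", days_enrolled=90, days_absent=4, suspensions=0, course_failures=0),
]

for s in students:
    if ews_flags(s):
        print(f"{s.name}: flagged on {', '.join(ews_flags(s))}")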

33 EWS Outcome Data - Building Level ATTENDANCE: goal of > 90% of students not missing more than 10% of instructional time (from a State of Ohio retrospective analysis of top/bottom 10% academic outcomes; balances yield vs. accuracy). BEHAVIOR: goal of > 80% of students with 0 suspensions (ISS or OSS), drawing on "high quality instruction" research and MTSS targeted intervention. COURSE PERFORMANCE: ACT-Explore data and course failures (MTSS model of 80%, corrected for accuracy to 85-90%); credit accrual is building-specific. Combinations of academic indicators can reduce graduation likelihood to 55%.
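
A building-level summary is just the student-level indicators rolled up into percentages and compared with these goals. A minimal sketch follows, using made-up rows rather than a real student-information-system export; only the >90% and >80% thresholds come from this slide.

# Roll student-level data up to the building-level goals named above:
# >90% of students under the attendance cut and >80% with zero suspensions.
# The rows below are illustrative placeholders.

students = [
    # (days_enrolled, days_absent, suspensions)
    (90, 3, 0),
    (90, 12, 0),
    (90, 5, 1),
    (90, 2, 0),
]

total = len(students)
attending_ok = sum(1 for enrolled, absent, _ in students
                   if absent / enrolled <= 0.10)       # not missing >10% of time
zero_suspensions = sum(1 for _, _, susp in students if susp == 0)

attendance_pct = 100 * attending_ok / total
behavior_pct = 100 * zero_suspensions / total

print(f"Attendance: {attendance_pct:.1f}% of students under the cut (goal > 90%)")
print(f"Behavior:   {behavior_pct:.1f}% of students with 0 suspensions (goal > 80%)")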

34 Schoolwide Overview – Behavior Worked Example

35 Process Data Snapshots ACADEMICS & BEHAVIOR

36 PET-M SNAPSHOTS

37 BSA: Building Self-Assessment Scale: Not Started (N), In Progress (I), Achieved (A), Maintaining (M). What does BSA data tell you?

38 Process Data Snapshots BEHAVIOR Benchmarks of Quality (BoQ): a Tier 1 SWPBIS implementation fidelity check; 53 benchmarks across 10 critical elements; identifies areas of strength and need to inform action plans; completed annually by school leadership teams. Self-Assessment Survey (SAS): completed annually by building staff; a fidelity check of PBIS implementation across (a) schoolwide, (b) non-classroom, (c) classroom, and (d) individual students; seven key elements of the Implementation Subsystems; informs of areas of strength and need, including communication.

39 Process Data Snapshots BEHAVIOR Benchmarks of Quality (BoQ): a Tier 1 SWPBIS implementation fidelity check; 53 benchmarks across 10 critical elements; identifies areas of strength and need to inform action plans; completed annually by school leadership teams. Self-Assessment Survey (SAS): completed annually by building staff; a fidelity check of PBIS implementation across (a) schoolwide, (b) non-classroom, (c) classroom, and (d) individual students; seven key elements of the Implementation Subsystems; informs of areas of strength and need, including communication.

40 Process Data Snapshots: PBIS Benchmarks of Quality (BoQ)

41 Process Data Snapshots: PBIS Self-Assessment Survey (SAS) While summary data from the SAS provides a general sense of a building's PBIS systems, more focused analysis can inform a team of the most vital and influential next steps: items with low implementation status and high staff priority point to the PBIS subsystems that need targeted implementation supports.
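
As an illustration of that focused analysis, the sketch below filters survey items to those rated low on implementation status but high on improvement priority. The item names, rating fields, and 50% thresholds are invented for the example and are not the actual SAS report layout from pbisapps.org.

# Hypothetical "focused analysis": surface SAS items with low implementation
# status and high staff priority for improvement. Names, fields, and the 50%
# thresholds are illustrative assumptions.

sas_items = [
    # (subsystem, feature, % staff rating "in place", % staff rating "high priority")
    ("School-wide", "Expectations are taught", 85, 10),
    ("Classroom", "Procedures consistent with school-wide systems", 45, 60),
    ("Non-classroom", "Active supervision in common areas", 40, 55),
    ("Individual", "Team responds promptly to behavior referrals", 70, 20),
]

LOW_STATUS = 50      # below this % "in place" counts as low implementation
HIGH_PRIORITY = 50   # above this % "high priority" counts as a staff priority

targets = [(sub, feat) for sub, feat, status, priority in sas_items
           if status < LOW_STATUS and priority > HIGH_PRIORITY]

for subsystem, feature in targets:
    print(f"Targeted implementation support: [{subsystem}] {feature}")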

42 Process Data Snapshots: PBIS Self-Assessment Survey (SAS)

43 Problem Solving Guide: Step 1 Determine your (first) problem to be addressed today based on what you've derived from the previous SIP, your outcome data, and your process data and Process Data Snapshots.

44 Problem Solving Guide: Step 2 Complete a problem analysis: hypothesize what may be contributing to the problem. Again, your data and the Snapshots can inform this discussion.

45 Problem Solving Guide: Worked Example

46

47

48

49 Problem Solving Guide Data-based Problem Solving

50 Problem Solving Guide

51

52 Please take a moment to complete the Evaluation Survey: https://www.surveymonkey.com/s/elementarydays

53 THANK YOU!

