Compilation of Slides for Data Measures


1 Compilation of Slides for Data Measures
miblsi.org

2 Capacity Slides

3 Why Focus on Capacity? Recall the Definition of Sustainability
“The durable, long-term implementation of a practice at a level of fidelity that continues to produce valued outcomes” (McIntosh, Horner, & Sugai, 2009, p. 328) Trainer Notes: This slide defines sustainability. Have participants read through the definition. Additional talking points: We are looking for implementation with fidelity to last at least 5 years. We want it to be durable over time, enduring through changes in staff, administration as well as other staff.

4 Why Focus on Capacity? Recall the Definition of Scale Up
“Scaling up is defined as having enough of something so that it is useful. Scaling up is the process of moving from ‘exemplars’ to the ‘typical.’ The process of scaling involves the development of organizational capacity to move from exemplars sustained by extraordinary supports, to typical application with typical supports.” (Fixsen et al., 2008) Trainer Notes: Read the definition with participants, then continue with this additional point from the same source: “While there is no firm agreement about the level at which ‘scaling’ is achieved, we hypothesize that an organization (district) has reached the ‘tipping point’ for functional scaling when approximately 40% of the units in the organization are implementing a practice with fidelity. At that point, the education system would have changed to provide typical supports for evidence-based practices across the district.” (Fixsen et al., 2008)

5 What Are You Aiming For on the DCA?
A Total Score of 80% or higher on the District Capacity Assessment (DCA), with an increase of 10% or more at each assessment

6 Fidelity Slides

7 School-wide PBIS Tiered Fidelity Inventory (SWPBIS-TFI)
Self-assessment Monitors school-wide PBIS implementation efforts Identifies strengths for celebration Identifies needs for action planning to improve implementation efforts for the upcoming school year Data compared with Office Discipline Referral (ODR) data Trainer Notes: The TFI is a self-assessment measure that monitors implementation efforts related to school-wide PBIS. It assists teams in identifying strengths for celebration as well as areas in need of improvement for action planning to strengthen or sustain future implementation efforts. The data are compared to Office Discipline Referral data.

8 School-wide PBIS Tiered Fidelity Inventory (SWPBIS-TFI): Purpose
Provide an efficient and valid index of the extent to which PBIS core features are in place within a school Tier 1: Universal PBIS Tier 2: Targeted PBIS Tier 3: Intensive PBIS

9 Student Outcome Slides

10 SWIS: Why Discipline Referrals?
Research supports the use of discipline referral data as a measure of student outcomes It is a common metric used in schools implementing SWPBIS and in research evaluating the effectiveness of the practice Trainer Notes: This slide unpacks why we collect discipline referral data and use it for problem solving. Be sure to emphasize both bullets: the measure is supported by research, and it is a common metric for evaluating outcomes of SWPBIS implementation.

11 Student Outcome Measure: Discipline Referrals
Discipline referral data are used to monitor social behavioral outcomes for students, illustrating patterns that represent school climate The School-wide Information System (SWIS) is the data system used to collect and organize discipline referral data Trainer Notes: As mentioned previously, the form typically used to document problem behaviors is called an ODR, or Office Discipline Referral form. This name has been confusing because it seems to imply that the form is only used for behavior that results in a student being sent to the office. We are deliberately using the term “discipline referral form” to minimize this confusion. For schools that have been through PBIS training before and/or are using SWIS, the discipline referral form is the same thing as the ODR. Some schools have opted to call their form an incident report or a ticket, to name a few other examples. The ODR is used to collect building-level data for the purpose of problem solving.

12 SWIS Summary Reports Trainer Notes:
Core Reports provide information at the school-wide level. These reports are presented on the main SWIS Dashboard and are also available in the reports dashboard. The value of the Core Reports is that they help schools quickly and visually identify “red flags,” or potential problems that may need further inquiry. The Core Reports include: Average Referrals Per Day Per Month, Referrals By Location, Referrals By Problem Behavior, Referrals By Time, Referrals By Student, Referrals By Day of Week, and Referrals By Grade.

13 School Climate Survey School climate studies conducted with student populations within the United States and internationally have consistently suggested that safe, caring, and responsive school environments have a notable effect on student perceptions of school climate and are positively associated with academic performance, risk prevention, and health promotion for students Trainer Notes: Research suggests that safe, caring, and responsive school environments have a positive effect on student academic performance, risk prevention, and health promotion. This aligns with our work to implement PBIS with fidelity: improving student academic, behavior, and health outcomes, and creating an environment that is positive, responsive to students’ needs, and preventive of more intensive needs and supports.

14 School Climate Surveys
School Climate Surveys are a set of multidimensional measures of student, teacher, administrator, faculty, and family perceptions of school climate. Trainer Notes: The purpose of school climate surveys is to provide valid, reliable, and brief measures of perceptions of school climate across a variety of dimensions. In the next few slides we will unpack what multidimensional means. Keep in mind, as your Leadership Teams create annual district assessment schedules, that the School Climate Surveys can be administered once or twice a year. The recommendation is in the fall, within the first 45 days of school, and/or in the spring, within the last 45 days of school.

15 Types of Reports in School Climate Survey
Elementary & Middle/Secondary School Personnel & Family Total Score Mean Scores By Grade Subscale Mean Score by Gender Mean Scores By Race/Ethnicity Mean Scores By Gender Mean Scores By Question Items Trainer Notes: After each survey window is completed, we have access to several reports for each survey. There is a slight difference between the student surveys and the staff and family surveys. For example, the student surveys only have one subscale, School Climate, whereas the staff survey has 6 subscales and the family survey has 5 subscales. The other difference is that the student surveys have a Mean Scores By Question report, which the staff and family surveys do not have. When interpreting the scores, keep in mind that higher scores represent more positive perceptions of school climate. Also, in order for a subgroup mean score to show up on a report, 5 or more participants must be represented in the subgroup; otherwise their responses will only be included in the overall reports.

16 General Data Slides

17 Introduction: Data Sources for Implementation Plan
Reach Data: Extend the range of contact and influence Capacity Data: The ability or power to do Fidelity Data: Extent that implementation occurs as intended Student Outcome Data: Meeting educational goals and objectives Trainer Notes: This slide is here to help folks get reacquainted with the four types of data before we head further into the data review. For most districts we will not be digging into Reach data because we do not anticipate that the reach has changed significantly from the Data Review training this past fall. In this case, please be sure to explain that we will not be examining reach because we do not anticipate changes from the fall. However, if the context of the district is such that you anticipate changes in reach from the fall or if the district did not have a chance to adequately review reach data this past fall (e.g., the Dashboard in MiData wasn’t yet functioning properly), we’ve included a few hidden slides relative to reach that you can use for that portion of this data review.

18 Two Categories of Assessment
1. Program Quality/Fidelity Data Are we doing what we said we would do? 2. Student Outcome Data Is what we are doing working? Trainer Notes: Within the SWPBIS framework there are two categories of assessments at the school level: student outcome and program quality/fidelity. Program quality/fidelity measures answer the question “Are we doing what we said we would do?” The TFI fits under the category of program quality/fidelity. Program quality/fidelity measures are used to evaluate the changes in staff behavior as a school implements MTSS. Successful student outcomes may reinforce staff behavior while implementing and continually improving MTSS efforts. However, changes in student behavior may take time. Measures that are used to evaluate the items staff have accomplished may function as reinforcement for changes in adult behavior. The items achieved on the measure also serve to evaluate the school’s progress in the implementation process.

