Progress monitoring for social behavior

Presentation on theme: "Progress monitoring for social behavior"— Presentation transcript:

1 Progress monitoring for social behavior
Cynthia M. Anderson, PhD & Nadia Katul Sampson, MA University of Oregon

2 School-Wide Positive Behavior Support
[Triangle diagram]
Universal Interventions: school-/classroom-wide systems for all students, staff, & settings (~80% of students)
Targeted Interventions: specialized group systems for students with at-risk behavior (~15% of students)
Intensive Interventions: specialized, individualized systems for students with high-risk behavior (~5% of students)
The universal level of SWPBS is designed to be in place for all students and in all situations. This level of support is critical in a school, as research shows that implementation of a universal intervention, the first tier of SWPBS, increases the number of students who succeed academically and socially. In other words, this intervention reduces the number of students who will require more intensive interventions for behavior problems AND for academics. Thus, it is very important that, as schools "move up the triangle," they do not lose focus on implementation of their universal intervention. In the continuum of SWPBS, interventions for students requiring more support are grouped into two categories. First are what we call targeted interventions. Targeted interventions are the next level of support in the prevention continuum: interventions that can be implemented soon after a need is detected and that require minimal time to implement. Examples include social skills groups and CICO. Importantly, and as we will discuss further, group interventions are targeted interventions if and only if data are used to (a) guide decision-making around which students are most likely to benefit and (b) progress monitor students receiving the intervention. Finally, the last tier of support is intensive interventions. These are individualized interventions based on the results of an FBA. Intensive interventions typically require more time prior to implementation, to complete the FBA and planning meetings, and may also require significant staff time and/or other resources to implement. As with targeted interventions, data-based decision-making is key here. Thus, our focus today is on systems for progress monitoring.

3 Supporting Student Behavior
[Diagram: OUTCOMES at the center, supported by Systems (supporting staff behavior), Data (measurable outcomes supporting decision making), and Practices (supporting student behavior)]
In addition to emphasizing a continuum of supports to meet student needs within a school, SWPBS has also taught us that we will be successful if and only if we focus not just on our interventions (what they consist of) but also on what we need to do to make it more likely that an intervention is feasible, successful, and sustained over time. In other words, we need to focus on our practices (our interventions), but we also need to make sure we have systems to support what staff do, and that we have defined outcomes and can measure them.
When we talk about interventions or practices, the first consideration is that a school invest in interventions that are "evidence based." Evidence-based practices are not just those that have been shown to be effective in randomized clinical trials or single-subject research designs, however; they also must fit YOUR school. In addition, the intervention must match the needs of the student(s) for whom it is being considered. We will talk more about this later.
Successful outcomes require that we attend to what we want to see changed and that we measure it. Thus, besides just designing an intervention for a student, we want to think about "what will Jonny say or do differently in 8 weeks" and how we are going to assess whether our goal has been met.
Last but certainly not least, successfully intervening requires that we spend time considering what we need to do to make it more likely that staff can and will implement interventions. What training do they need? What materials are needed? Here we are concerned with measuring treatment integrity or fidelity: did we do what we said we would do?

4 Important Outcomes to Monitor
System outcomes: What key features of student support are in place? Are key features implemented with fidelity?
Individual student outcomes: Decision rules for starting an intervention ("Is this intervention a good fit?"); progress monitoring during an intervention ("Is the intervention resulting in the outcomes we want?"); implementation as designed ("Are we doing what we said we would do?")
The big message is that it pays to invest time up front in defining your outcomes. If you fail to do this, intervening first and measuring later, you could end up collecting a lot of data that you don't really need, or you could end up NOT measuring what is really important. In addition, you may miss a lot of the picture by not beginning measurement soon enough. When considering what to assess for Tiers II and III, think about what you want to know regarding your intervention system AND what you want to know regarding how any given student is doing. Questions about your overall system focus on how well what you are doing is working, in general: you might ask questions about what you are doing and also about what general outcomes you are seeing. For individual students, you might want to know whether the intervention is being implemented as designed and also whether it is working.

5 Systems Outcomes: Assessing Process
Self assessment: monitoring progress over time; developing an action plan
External evaluation: useful when an outside opinion is warranted
For systems outcomes focusing on whether what you are doing is working, ask yourself: how would you KNOW if you were having an effect? What would you WANT to be different? For example, maybe you would have fewer students placed out of the building or out of district due to problem behavior. Maybe you would have fewer referrals to SPED. There are no existing tools for this, because the questions you want to ask may differ from school to school or district to district. For the process questions (what are you doing?), you could consider self assessment and/or external evaluation; here we do have existing tools. Self assessment tools are completed by the team or by someone in the school; external tools are completed by someone who does not have a vested interest in the system. External evaluation tools are used mostly for research projects, either by a researcher asking questions about systems or by a district interested in some sort of program evaluation.

6 Existing Tools for Assessing Process
Universal component of SWPBS — External: School-wide Evaluation Tool (SET). Self assessment: Team Implementation Checklist (TIC), Benchmarks of Quality (BoQ), Phases of Implementation.
Targeted & intensive components of SWPBS: Individual Student Systems Evaluation Tool (ISSET), Benchmarks for Advanced Tiers (BAT).
There are external and self-assessment tools for the entire continuum of SWPBS, and you should be familiar with some of them. At the universal level, the SET is the external tool, and so on.

7 ISSET and BAT Key Features
Foundations: What needs to be in place? (Targeted interventions; intensive interventions)
For each feature: What practices are implemented? What systems are used? What outcomes are assessed?
[Diagram: Systems, Practices, Data]

8 Important Outcomes to Assess
System outcomes
Individual student outcomes: Decision rules for starting an intervention ("Is this intervention a good fit?"); progress monitoring during an intervention ("Is the intervention resulting in the outcomes we want?"); implementation as designed ("Are we doing what we said we would do?")
Our focus today is primarily on assessing individual student outcomes, so let's move on to these. There are several points at which data are used to guide decision-making for students receiving an intervention. First, we can use data to decide whether any particular intervention should even be used: is it likely to work? Once we have settled on an intervention, we want to use data to see if we are obtaining desired outcomes AND if the intervention is being implemented as designed. We will walk through each of these areas, and I will share some tools we have found useful for data-based decision-making.

9 Important Outcomes to Assess
System outcomes
Individual student outcomes: Decision rules for starting an intervention ("Is this intervention a good fit?")
So we start with wanting to find out whether any given intervention is a good fit: do data suggest it might work? It is useful to consider this up front, as it can save you a lot of time in the long run. Further, doing some general planning around existing and commonly used interventions in the school can be a really useful exercise.

10 Is an Intervention a Good Fit?
Questions about the student's behavior: What is the problem? What is the hypothesis about why the problem is occurring? What is the goal of intervention? Who will be implementing, and what are their skills and availability?
Intervention selection: Is this intervention effective for problems like this (severity, intensity, where it occurs, etc.)? For behaviors triggered and maintained by events like this one? For achieving goals like this? What resources are needed to implement?
Any time you are considering a particular intervention, you want to make sure there is a match between the needs of the student and what the intervention provides. You want to be sure that everything you know suggests this intervention will be likely to work.
Has this intervention been found to be effective for behaviors that are similar in intensity, locale, etc.? For example, CICO is a great intervention, but I wouldn't use it for a student whose behavior is dangerous to others. Similarly, because CICO is in place across the day, it is not appropriate if a student is struggling only during a small part of the day, such as during recess or one class period.
Does the intervention address relevant environmental variables (antecedents and consequences)? For example, social skills training is appropriate for a skill deficit but not for a contingency management deficit.
What is the goal, and will this intervention get you there? For example, imagine in September you have a 2nd grader reading fewer than 20 words per minute. You would like her to be at grade level by the end of 2nd grade. If you put her in a Tier II reading group that consists of only 30 minutes of additional instruction per week, the goal of that intervention DOES NOT match your own goal.
Finally, what resources are needed to implement this intervention, and do they match what is available? Here you need to consider the skills of the person implementing, whether he or she is "a fan" of the intervention, and whether they have the time to implement it. Keep in mind that actual implementation almost always requires much more training than we think it will. Rarely if ever is just going over the intervention verbally or in writing enough. What typically happens is that when the person begins to implement, something happens that wasn't covered in training; left to our own devices, bad things happen. Social skills training is a good example here.

11 Is this Intervention a Good Fit?
Evaluating outcomes requires planning before the intervention begins: What are the targeted outcomes? What is the goal (date and outcome)? How will data be collected? How will data be analyzed? How often will progress monitoring occur?
Figuring out the answers to those questions for any given student is a lot easier if a school has invested in defining key features of the intervention up front. In addition to the questions we just covered, for each intervention you want to consider these important points. This information should be a part of every intervention plan.
Group Template / Individual Template
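The planning questions above can be captured as a simple structured record. Here is a minimal sketch in Python; the field names and example values are illustrative, not taken from the presenters' Group/Individual templates:

```python
# Hypothetical record of a progress-monitoring plan; field names are
# illustrative, not part of the presenters' intervention-plan templates.
from dataclasses import dataclass

@dataclass
class MonitoringPlan:
    targeted_outcomes: str   # what should change for the student
    goal: str                # desired outcome and target date
    data_collection: str     # how data will be collected
    data_analysis: str       # how data will be analyzed
    frequency: str           # how often progress monitoring occurs

# An example completed plan (values are made up):
plan = MonitoringPlan(
    targeted_outcomes="on-task behavior during independent work",
    goal="80% of daily points by Dec 1",
    data_collection="daily point card completed by teacher",
    data_analysis="weekly review of the graphed data",
    frequency="weekly",
)
```

Writing the plan as a record like this makes it obvious when one of the required answers is still missing before the intervention starts.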

12 Important Outcomes to Assess
System outcomes Individual student outcomes Progress monitoring during an intervention “Is the intervention resulting in the outcomes we want?” Once you have made some decisions regarding whether an intervention is likely to be a good fit, you turn to how you will determine whether it is actually working—progress monitoring.

13 Students in IPBS—Is the Intervention Working?
Once the intervention has begun: progress monitoring occurs regularly and frequently (feedback from teacher(s); team feedback), and data are used to guide decision-making: continue the intervention, modify the intervention, begin a new intervention, or fade the existing intervention.
In education and in the field of behavior analysis, we have done a simply brilliant job of making data collection extremely onerous. The fact is, though, data collection MUST be fairly easy to do if it is to be used. The goal is to achieve a balance between usefulness and ease. There is no sense designing a data collection system that would produce fabulous data if only someone had the skills and time to use it. Similarly, there is no sense developing a really easy-to-use system if the information it gives you isn't useful for progress monitoring. A general rule of thumb: if you can't graph it, it isn't going to be useful! We have found it helpful to have a variety of progress monitoring tools AND a system for storing and graphing data. I am going to show you a few tools first and then show you our data analysis system last. Before we look at the tools, though, note that you want to be able to make a decision EACH time you progress monitor: either keep going, modify the intervention, stop and do something new, or fade. Thus, you must have defined decision rules to help you get there. This is easiest if you have a tool to help with data management. For CICO, SWIS does this; for other interventions, here is a spreadsheet that might be useful. We will walk through it in greater depth with interested teams.
Behavior Rating Form / Behavior Rating & Fidelity / Team Feedback / Graph System
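The four decisions above (continue, modify, begin a new intervention, fade) only work if the team has defined decision rules in advance. A minimal sketch in Python, assuming daily percent-of-points data such as a CICO card produces; the thresholds and the 20-day window are illustrative assumptions, not rules from the presentation:

```python
# Hypothetical decision-rule sketch for progress monitoring.
# The window length and thresholds are illustrative; a team would set
# its own decision rules before the intervention begins.

def decide(daily_pct_points, goal=0.80, window=20):
    """Return one of: "continue", "modify", "new_intervention", "fade".

    daily_pct_points: daily scores between 0.0 and 1.0, most recent last.
    goal: the percent-of-points criterion counted as a successful day.
    """
    recent = daily_pct_points[-window:]
    if not recent:
        return "continue"  # no data yet; keep collecting
    days_at_goal = sum(1 for p in recent if p >= goal) / len(recent)
    if days_at_goal >= 0.80:
        return "fade"              # consistently at goal: begin fading support
    if days_at_goal >= 0.50:
        return "continue"          # trending toward goal: stay the course
    if days_at_goal >= 0.25:
        return "modify"            # partial response: adjust the intervention
    return "new_intervention"      # little response: try something different
```

The point is not these particular numbers but that the rule is written down before anyone looks at the graph, so each review meeting ends in one of the four decisions.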

14 In addition to knowing whether the intervention is working, you also need to know if it is actually being implemented. Think about this: you are implementing a point-card system for a student, and it has been in place for three weeks. At the end of every day she takes the card home, and her parents let her watch television for an hour if she earned more than 20 points. If she earned less than 20, no television. You are unhappy because the intervention is not working. Here are the data. What would you do—change the intervention? Well, maybe it isn't actually being implemented. Maybe the teacher isn't making points contingent on behavior. Maybe the parents are not allowing her to watch TV when she earns points. MAYBE she gets to watch TV irrespective of points. The thing is, before you change an intervention, it is important to find out whether you actually HAVE an intervention.

15 Important Outcomes to Assess
System outcomes Individual student outcomes Is the intervention being implemented as designed? “Are we doing what we said we would do?” In other words, Do we have an intervention?

16 Fidelity
Documentation that the intervention is being implemented as designed
Measurement: teacher-completed; assessed by another person
The easiest type of fidelity data to get is self-report from the person implementing the intervention. People always think you won't get accurate data this way; I have had NO problem with this. You can also go and watch. You don't have to do this formally; just hanging out in the room is usually enough. What do you see?

17 Student Outcomes--Fidelity
What are key components of the intervention? How can fidelity be measured? Who will collect and analyze the data? How will data be used?
Fidelity data are not supposed to be shoved in a box and forgotten. We use these data either to provide feedback to improve fidelity, or as information that we need to change the intervention, not because it wouldn't work but because the implementer is either unable or unwilling to implement it. In fact, self monitoring of fidelity is a great intervention in itself. We often make a checklist that defines key features of the intervention and ask implementers to complete it either daily or weekly; this alone can improve fidelity of implementation.
Sample BSP
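A self-monitoring fidelity checklist like the one described can be scored as the percentage of key components completed. A small sketch in Python; the checklist items are hypothetical examples for a CICO-style intervention, not the presenters' actual checklist:

```python
# Hypothetical fidelity checklist; the component names are illustrative
# examples for a CICO-style intervention.
CHECKLIST = [
    "Reviewed expectations with student at check-in",
    "Delivered points contingent on behavior",
    "Provided feedback at check-out",
    "Sent point card home for parent signature",
]

def fidelity_score(completed):
    """Percent of checklist components the implementer reports completing."""
    if not CHECKLIST:
        return 0.0
    done = sum(1 for item in CHECKLIST if item in completed)
    return 100.0 * done / len(CHECKLIST)
```

Graphing this score daily or weekly alongside the student's outcome data makes it easy to see whether a "failing" intervention is actually being delivered.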

18 Monitoring Student Progress Over Time
System requirements: efficient; comprehensive; easily accessible; modifiable to meet the needs of individual students
Even when schools attempt to collect data, they often stop, or are unable to use the data to make efficient decisions, because they lack a system for organizing the data in a way that can be accessed easily. I am going to share with you a very simple system that we use to help schools progress monitor outcomes for individual students.

19 Relevant Information for Individual Students
Referral information; intervention description; modifications to intervention; easily interpretable summary of intervention results/progress
This is the information that is stored for each student. The program is an Excel file, and you have one file for each student; it is thus an electronic record of progress monitoring.

20

21

22

23

24 Progress-Monitoring in Illinois
Progress monitoring is critical at all levels.
Student: per student, for individual progress-monitoring; in aggregate, to monitor the effectiveness of the interventions themselves (e.g., is our 'problem-solving' group effective?).
Building/District: per school, to monitor building-level systems (e.g., is our HS effective at keeping youth engaged?); in aggregate, to make district-level decisions, both for the district as a whole (set goals, allocate resources) and for cohort schools vs. non-cohort schools (is an initiative working?).

25 Data-Based Decision-Making
1) Student outcome data are used: to identify youth in need of support and to identify an appropriate intervention; for ongoing progress-monitoring of response to intervention; and to exit or transition youth off of interventions.
2) Intervention integrity or process data are used: to monitor the effectiveness of the intervention itself, and to make decisions regarding the continuum/menu of interventions and supports.

26 71 Elementary Schools

27 71 Elementary Schools: Mean CICO points per school
[Chart: 71 Illinois Elementary Schools, 08-09]

28

29 Secondary Systems Planning Team Meeting Agenda
Number of youth in CICO (record on TT)? Number of youth responding (record on TT)? *Send a Reverse Request for Assistance to teachers of all youth not responding.
Number of new youth potentially entering the intervention (share the number of RFAs, universal screening info, and/or the number of youth who met the data-based decision-rule cutoffs for Secondary support)?
Repeat for S/AIG, Mentoring & Brief FBA/BIP.
If fewer than 70% of youth are responding to any of the interventions, the Secondary Systems team should review the integrity of the intervention and make adjustments as needed.
Have the audience review the Reverse RFA briefly.
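The 70% decision rule in the agenda above can be stated precisely. A minimal sketch in Python; the function name is mine, and how "responding" is defined would come from the team's own decision rules:

```python
# Sketch of the 70%-responding decision rule described in the agenda.
# "Responding" is whatever the team's decision rules define it to be;
# this function only computes the flag.

def review_intervention(n_enrolled, n_responding, threshold=0.70):
    """Return True when fewer than 70% of enrolled youth are responding,
    i.e. when the Secondary Systems team should review the intervention's
    integrity and make adjustments as needed."""
    if n_enrolled == 0:
        return False  # nothing to review with no youth enrolled
    return (n_responding / n_enrolled) < threshold
```

Note that the rule flags the intervention, not the students: a low response rate triggers a check on implementation integrity before anything else changes.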

30 3-Tiered System of Support Necessary Conversations (Teams)
[Diagram]
Universal Team: plans school-wide & class-wide supports (Universal Support)
Secondary Systems Team: uses process data; determines overall intervention effectiveness (CICO, SAIG, Group w. individual feature, Brief FBA/BIP)
Problem Solving Team: standing team; uses the FBA/BIP process for one youth at a time (Brief FBA/BIP)
Tertiary Systems Team: uses process data; determines overall intervention effectiveness (Complex FBA/BIP, WRAP)
Sept. 1, 2009

31 Comparison: Elementary School A FY 2009 CISS Data and IS-SET Data

32 FY 2009 IS-SET Data Comparison: Elementary School A - District

33 Mean Percentage of Students by Major ODRs 06-07
Elementary School B (677 students) 164

34 Mean Percentage of Students by Major ODRs 07-08
Elementary School B (707 students) 71

35 Mean Percentage of Students by Major ODRs 08-09
Elementary School B (695 students) 61

36 FY 2009 IS-SET Data Comparison: Elementary School B - District
No CISS score for this school this year.

37 Comments/Questions

