1
Quick Review Agree on Approach to Discipline Identify Expectations
Teach Expectations Encourage Appropriate Behavior Discourage Inappropriate Behavior Monitor and Evaluate
2
Using Data to Make Decisions
3
Purpose Provide considerations for using data within a school
Brief review of the purpose and uses of data Provide data decision rules for office discipline referrals
4
Problem Features Efficiency, effectiveness, & sustainability of individual student behavior support systems affected by efficiency & effectiveness of school-wide discipline systems Schools need rules and processes for data-based decision making that are efficient and effective inefficiencies associated with matching intervention to problem context
5
Response Establish rules for guiding schools with intervention decision making
6
Improving Decision-Making
From: Problem → Solution (problem solving)
To: Problem → Information → Solution
7
Key Features The data are accurate The data are very easy to collect
Data are used for decision-making The data must be available when decisions need to be made The people who collect the data must see the information used for decision-making
8
Guiding Considerations
Use what schools have access to Handle data as few times as possible Build data collection into daily routines Establish data collection & use as a conditioned positive reinforcer (e.g., information)
9
Monthly E-mails (December)
Dear Staff, Thought I’d send this along before we go home ‘till 1999. Through 11/30/98 there were 179 referrals involving 62 students (6.7%). 858 students (93.3%) have no referrals. 27 students (2.9%) are responsible for 80% of all referrals through 11/30. The top 13 have earned 59% of the referrals. Thank you for your efforts this fall in helping to carry a positive surge in momentum through the year’s end. Have a refreshing break. Happiest Holiday Wishes!
10
Monthly E-mails (February)
Ever have that feeling like you wondered if someone had gotten the license plate of the truck that hit you? February had a bit of that feel to it. Approximately 1/3 of the year’s referrals to date (143 out of 457) took place in February…In perspective, the month was truly out of character with the rest of the year. Thank you for your perseverance. 85% of our students continue their good work and have no referrals. The 457 referrals (9/98-2/99) are down 22% from the 581 referrals last year. In April we will be seeking staff input through our EBS survey to help build a focus for next school year. Keep up your good work--
11
Types of Questions Initial Assessment Questions
What type, or which program do we need? Where should we focus our efforts? On-going Evaluation Questions Is the program working? If not, can it be changed? If not, should we end the program? Do we need this program any more?
12
What Data Should be Collected?
Data that will answer your question
Easy, available, reliable (balance between reliability and accessibility)
Systems approach
Logistics: who, when, where, how
Two levels: what is readily accessible, what requires extra resources
13
When Should Data Decisions Be Made?
Natural cycles, meeting times Weekly, monthly, quarterly Level of System addressed Individual: daily, weekly School-wide: monthly, quarterly
14
PBS Evaluation Measures
Office Discipline Referrals Staff Survey and Action Plans Checklist(s) Team Coaches School-wide Evaluation Tool (SET)
15
Office Discipline Referral =
Indicator of a behavioral event requiring administrative involvement
Three behavioral elements: student act/behavior, staff response, office response
An underestimate of actual behavioral events
16
Usefulness of Office Discipline Referral Data
Useful as general measure of status of school Research support Colvin, Kameenui, & Sugai (1993) Skiba, Peterson, & Williams (1997) Taylor-Greene et al. (1997) Tobin & Sugai (1999a, 1999b) Tobin, Sugai, & Colvin (1996, in press) Wright & Dusek (1998)
17
Office Discipline Referral Summaries
Graphs # ODR/day/month # ODR/student # ODR by type of problem behavior # ODR by location # ODR by consequence/action # ODR/staff member
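The summaries above can be produced from a simple referral log. A minimal sketch, assuming each referral is recorded as a (date, student, behavior, location) record; the field layout and sample entries are illustrative, not from the source:

```python
from collections import Counter
from datetime import date

# Illustrative referral log: (date, student, behavior, location)
referrals = [
    (date(1998, 9, 14), "S001", "disruption", "classroom"),
    (date(1998, 9, 21), "S002", "defiance", "playground"),
    (date(1998, 10, 5), "S001", "aggression", "hallway"),
    (date(1998, 10, 19), "S003", "disruption", "classroom"),
]

# Counts behind the graphs: ODR per month, per student, and by location
per_month = Counter(d.strftime("%Y-%m") for d, _, _, _ in referrals)
per_student = Counter(s for _, s, _, _ in referrals)
per_location = Counter(loc for _, _, _, loc in referrals)
```

The same pattern extends to ODR by problem behavior, by consequence, or by referring staff member: one `Counter` per question the team wants the graph to answer.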
18
Some Findings Elementary averages (n=11) 567 students/year (240-1065)
283 ODR per year (79-607) 0.5 ODR per student per year 1.7 ODR per school day 21% of students received 1 or more ODR per year 0.5% students with >10 ODR 59% ODR from 5% of students with most ODR
19
Middle/junior averages (n=9)
635 students/year 1535 ODR per year 2.4 ODR per student per year 8.6 ODR per school day 47% of students received 1 or more ODR per year 5.4% students with >10 ODR 40% of ODR from 5% of students with most referrals
20
HS database
# of schools:
Total # students: 10,622
Total # referrals: 7,171
Total # OSS: 1,844
Total # OSS days: 8,066
Total # ISS: 471
23
Cost Benefit Example (Scott, 2000)
ODR: 182 to 67 (115 fewer)
ISS: 169 to 45 (124 fewer)
OSS: 17 to 11 (6 fewer)
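Reductions like these translate directly into recovered staff time. A hedged sketch of the arithmetic; the minutes-per-event figures are illustrative assumptions, not the values used by Scott (2000):

```python
# Before/after counts from the slide (Scott, 2000)
before = {"ODR": 182, "ISS": 169, "OSS": 17}
after = {"ODR": 67, "ISS": 45, "OSS": 11}

# Assumed minutes of adult time per event -- illustrative only
minutes_per_event = {"ODR": 15, "ISS": 60, "OSS": 45}

savings = {}
for kind in before:
    fewer = before[kind] - after[kind]
    savings[kind] = fewer * minutes_per_event[kind] / 60  # staff hours recovered
    print(f"{kind}: {fewer} fewer events, ~{savings[kind]:.1f} staff hours")
```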
24
Cost-Benefit Analysis (Scott, 2000)
25
General Data Decision Rules
26
1. School-wide systems if…
>40% of students received 1+ ODR >2.5 ODR/student Modify universal interventions (proactive school-wide discipline) to improve overall discipline system Teach, precorrect, & positively reinforce expected behavior
27
2. Classroom system if… >60% of referrals come from classroom
>50% of ODR come from <10% of classrooms Enhance universal and/or targeted classroom management practices Examine academic engagement & success Teach, precorrect for, & positively reinforce expected classroom behavior & routines
28
3. Non-classroom systems if…
>35% of referrals come from non-classroom settings >15% of students referred from non-classroom settings Enhance universal behavior management practices teach, precorrect for, & positively reinforce expected behavior & routines increase active supervision (move, scan, interact)
29
4. Targeted group interventions if….
>10-15 students receive >5 ODR Provide functional assessment-based, but group-based targeted interventions Standardize & increase daily monitoring, opportunities & frequency of positive reinforcement
30
5. Individualized action team system if...
<10 students with >10 ODR <10 students continue rate of referrals after receiving targeted group support Provide highly individualized functional-assessment-based behavior support planning
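Decision rules 1-5 above can be sketched as a single screening function. The thresholds come from the slides; the input keys are illustrative, and rule 2's secondary condition (>50% of ODR from <10% of classrooms) is omitted for brevity:

```python
def flag_systems(stats):
    """Return which support systems the ODR data point to.

    `stats` keys (illustrative names):
      pct_students_with_1plus_odr, odr_per_student,
      pct_odr_from_classroom, pct_odr_from_nonclassroom,
      pct_students_referred_nonclassroom,
      n_students_over_5_odr, n_students_over_10_odr
    """
    flags = []
    # Rule 1: school-wide system
    if stats["pct_students_with_1plus_odr"] > 40 or stats["odr_per_student"] > 2.5:
        flags.append("school-wide")
    # Rule 2: classroom system
    if stats["pct_odr_from_classroom"] > 60:
        flags.append("classroom")
    # Rule 3: non-classroom systems
    if (stats["pct_odr_from_nonclassroom"] > 35
            or stats["pct_students_referred_nonclassroom"] > 15):
        flags.append("non-classroom")
    # Rule 4: targeted group interventions
    if stats["n_students_over_5_odr"] >= 10:
        flags.append("targeted group")
    # Rule 5: individualized action team
    if stats["n_students_over_10_odr"] > 0:
        flags.append("individualized")
    return flags
```

Using the middle-school averages from earlier in the deck (47% with 1+ ODR, 2.4 ODR/student) plus hypothetical counts for the remaining keys, the function would flag the school-wide, targeted-group, and individualized systems.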
31
Interpreting Office Referral Data: Is there a problem?
Absolute level (depending on size of school) Middle Schools (>5 per day) Elementary Schools (>1.5-2 per day) Trends Peaks before breaks? Gradual increasing trend across year? Compare levels to last year Improvement?
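The absolute-level check can be sketched as a daily-rate comparison using the benchmarks above (the elementary threshold here uses the upper bound of the 1.5-2 range; the school-day count in the usage note is an assumption):

```python
def odr_rate_flag(total_odr, school_days, school_type):
    """Return (ODR per day, whether the rate exceeds the rough benchmark)."""
    per_day = total_odr / school_days
    thresholds = {"middle": 5.0, "elementary": 2.0}
    return per_day, per_day > thresholds[school_type]
```

For example, the 457 referrals from the February e-mail over roughly 110 school days would give about 4.2 ODR per day, under the middle-school benchmark of 5.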
32
Is There a Problem? #1 Maintain - Modify - Terminate
33
Is There a Problem? #2 Maintain - Modify - Terminate
34
Is There a Problem? #3 Maintain - Modify - Terminate
35
Is There a Problem? #4 Maintain - Modify - Terminate
36
SET: measures level of implementation for school-wide PBS
Expectations defined
Expectations taught
Appropriate behavior encouraged
Inappropriate behavior discouraged
Monitoring and decision-making
District-level support
37
Process Guidelines Two- to three-hour visit
Product review Observations Interviews Guidelines For evaluation purposes, not day-to-day decision-making Interpreted with consideration for action plan Always combined with multiple measures
38
PBS Staff Survey Used for initial and on-going assessment of EBS in school. School-wide Specific setting Classroom Individual student Used to develop action plan.
39
General Process Who completes? When and how completed?
Initially entire staff Subsequent years (all staff, group, team) When and how completed? Annually (beginning or end of year)
40
Using the results Summarizing the results
Analyze and prioritize the results Develop the action plan
41
Summarizing the results
42
Analyzing the results Goal is to narrow focus of activities
Use other data sources (e.g., ODR) Steps: Identify system(s) Identify specific features within the system Prioritize features
43
Developing an action plan
Develop as a team Specify implementation activities Set timelines and responsibilities Set follow-up meetings/progress checks Consider three types of areas Development Maintenance Management & evaluation
44
Using multiple data sources
Confirm each other More confidence Disconfirm Why? Matter of perspective Breadth and depth Start broad Use other data sources to provide detail E.g., ODR- recess, Survey- features
45
Additional Data Sources
What other data sources do you have available? What do they tell you about? Attendance Achievement Vandalism Spec. ed. referral rates ISS/OSS/Detention “Gotcha’s”
46
Big Ideas Data can provide information for initial and on-going decisions Data should be collected to answer specific questions Data collection procedures must be efficient and effective
47
Data System Work as a team
Use sample forms and worksheets to design/refine data system and evaluation plan Add activities to action plan as needed