Direct Behavior Rating: An Assessment and Intervention Tool for Improving Student Engagement Class-wide Rose Jaffery, Lindsay M. Fallon, Sandra M. Chafouleas, and Lisa M. Hagermoser Sanetti

Rose Jaffery, Lindsay M. Fallon, Sandra M. Chafouleas, and Lisa M. Hagermoser Sanetti
University of Connecticut

Introduction

Direct Behavior Rating (DBR) is a form of behavioral assessment that combines the efficiency of behavior rating scales with the repeatability of systematic direct observation (SDO). Procedures involve making a brief rating of behavior immediately following a pre-specified observation period. Previous research has demonstrated the technical adequacy of DBR as a method of assessing student behavior, as well as its effectiveness as a behavioral intervention (Chafouleas, Riley-Tillman, & Christ, 2009).

The purpose of this study was to evaluate the effectiveness of a class-wide Direct Behavior Rating–Self-Monitoring (DBR-SM) intervention with an interdependent group contingency reward system for improving academic engagement in eighth-grade students. Researchers were also interested in assessing the inter-method reliability of DBR and SDO in order to draw conclusions about the use of DBR as a measurement tool for academic engagement. This study was part of a larger project that assessed the effectiveness of the intervention for improving several student behaviors (Chafouleas, Hagermoser Sanetti, Jaffery, & Fallon, 2010); however, this presentation discusses results specifically pertaining to students' academic engagement.

Method

Participants
Two eighth-grade teachers (Ms. S and Ms. B) from a suburban public middle school in the northeastern United States participated in the study. Ms. S implemented the DBR-SM intervention in her first- and fifth-period science classes, whereas Ms. B implemented the intervention in her third-period social studies class (17-24 students in each class).

Design
A multiple baseline design across the three eighth-grade classrooms was used to evaluate the class-wide intervention.
Procedures
The behavioral intervention was designed for students to monitor their performance on behavioral goals, including academic engagement. Classes were divided into teams of 3-5 students, and researchers trained students to use the DBR-SM sheet. At the end of class each day, students used single-item scales on the DBR-SM sheet to rate, from 0 to 10, how much they performed each target behavior (0 = Not at all, 5 = Some, 10 = Totally). To encourage students to make accurate self-ratings, teachers also rated each scale directly on the students' DBR-SM sheets and awarded bonus points when a student's rating fell within one point of the teacher's rating. A total score was then summed for each student using the teacher's ratings. Teachers calculated and recorded the average score for each team on a Team Tally Sheet. If a team met or exceeded the weekly goal, all members of the team earned a reward (e.g., snack, gift card, pizza party). During baseline, students and teachers provided ratings on the DBR-SM sheet daily; during the intervention phase, the interdependent group contingency reward system was introduced.

Dependent Variables
DBR-SM ratings of academic engagement were monitored to evaluate the effectiveness of the intervention on student outcomes. Research assistants also conducted systematic direct observations (SDO) once per week for 15 minutes in each class to obtain overall ratings of student engagement. Inter-observer agreement (IOA) was assessed for 33% of classroom observations. Agreement among highly trained raters (graduate student consultants) ranged from 78% to 100%, with an average of 92.4%.

Results

Team Results
The teams in Ms. S's Period 5 class met the goal sporadically; however, all four teams met the goal during the last three weeks of the intervention. In Ms. B's Period 3 class, all teams met the goal 100% of the weeks, except for one team that met the goal 78% of the weeks. In Ms. S's Period 1 class, five teams met the goal 75% of the weeks, whereas one team met the goal only 25% of the weeks.

Visual Analysis
Overall, DBR-SM data for academic engagement increased upon intervention implementation, particularly after a phase change was implemented (i.e., the weekly goal was increased). However, Ms. S's Period 1 class showed a decreasing trend in engagement upon implementation of the phase change (Figure 1). SDO data for engagement showed a pattern similar to DBR-SM data for each class (Figure 2).

Figure 1. Class average of daily student engagement ratings on the DBR-SM.
Figure 2. Percentage of intervals in which students were observed to be engaged via SDO.
Note. Data were not collected on the following dates: 2/16 (Winter Break), 3/2 (Snow Day), 3/25 & 3/27 (no class due to statewide testing), and 4/20-4/24 (Spring Break).

Conclusions
Overall, according to visual analysis of baseline and intervention data, DBR-SM and SDO data demonstrated a similar pattern for each class, lending support to the reliability of data obtained through DBR to measure student engagement. Results also suggest that integrating DBR into a self-monitoring intervention can lead to increased levels of desired academic behaviors, particularly student engagement. These results are important for demonstrating the technical adequacy and practicality of using DBR for both intervention and assessment purposes.

Discussion
Based on the results obtained from DBR-SM ratings and SDO, the self-monitoring interdependent group contingency intervention increased class-wide levels of academic engagement. However, results from Ms. S's Period 1 class show that academic engagement decreased after a phase change (i.e., an increase in the weekly goal) was implemented, indicating that the phase change may have been inappropriate for this class. Ms. S's Period 1 class may have benefited more from retraining, in which the importance of and incentives for meeting the behavioral goals, as well as the instructions for self-monitoring, were restated.

The current study was conducted in an applied setting (i.e., a public middle school), which made it difficult to control all threats to internal validity. For example, scheduling conflicts due to shortened days for statewide testing, teacher absences, severe weather, and school vacations affected implementation, resulting in several breaks in the data path. Additionally, researcher involvement in teachers' implementation was extensive, particularly in terms of data collection and summarization as well as provision of rewards; thus, it is difficult to know whether similar results would be found without researcher involvement. To limit threats to internal validity, experimental control was demonstrated using a multiple baseline design. Three demonstrations of effect support the internal validity of the study's findings (i.e., three AB designs with staggered intervention start dates showed that behavior did not change until the intervention and/or phase change was implemented). However, future investigations should explore solutions to scheduling conflicts and ease of data summarization for educators in applied settings.

References
Chafouleas, S. M., Riley-Tillman, T. C., & Christ, T. J. (2009). Direct Behavior Rating: An emerging method for assessing social behavior within a tiered intervention system. Assessment for Effective Intervention, 34,
Chafouleas, S. M., Hagermoser Sanetti, L. M., Jaffery, R., & Fallon, L. M. (2010). Incorporating Direct Behavior Rating in an intervention package involving self-monitoring and group contingency to improve classroom behavior of middle school students. Manuscript in preparation.

This poster can be cited as: Jaffery, R., Fallon, L. M., Chafouleas, S. M., & Sanetti, L. M. H. (2010, Oct). Direct Behavior Rating: An assessment and intervention tool for improving student engagement class-wide. Poster presented at the Berkshire Association for Behavior Analysis and Therapy Annual Conference, Amherst, MA.
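As a concrete illustration of the DBR-SM scoring and interdependent group contingency described in the Method, the sketch below re-expresses the procedure in Python. The behavior names, ratings, bonus value, and weekly goal are invented for the example and are not the study's actual materials or parameters.

```python
# Hypothetical sketch of the DBR-SM team scoring procedure.
# Behavior names, ratings, the bonus value, and the weekly goal are
# illustrative assumptions, not the study's actual values.

def score_student(teacher_ratings, student_ratings, bonus=1):
    """Sum the teacher's 0-10 ratings, adding one bonus point per scale
    on which the student's self-rating fell within one point of the
    teacher's rating (the accuracy incentive)."""
    total = sum(teacher_ratings.values())
    for behavior, t in teacher_ratings.items():
        if abs(t - student_ratings[behavior]) <= 1:
            total += bonus
    return total

def team_average(team_scores):
    """Average score recorded on the Team Tally Sheet."""
    return sum(team_scores) / len(team_scores)

# Example: a team of three students rated on two target behaviors.
teacher = [{"engagement": 8, "respect": 9},
           {"engagement": 6, "respect": 7},
           {"engagement": 9, "respect": 9}]
student = [{"engagement": 8, "respect": 7},   # accurate on engagement only
           {"engagement": 6, "respect": 7},   # accurate on both scales
           {"engagement": 5, "respect": 9}]   # accurate on respect only

scores = [score_student(t, s) for t, s in zip(teacher, student)]
avg = team_average(scores)
weekly_goal = 15            # assumed goal; the study raised goals at a phase change
met_goal = avg >= weekly_goal   # the whole team earns the reward together
```

Because the total is built from the teacher's ratings and the bonus rewards self-ratings that match them, the contingency encourages rating accuracy rather than inflated self-ratings.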
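The two interval-based quantities reported for the SDO data, the percentage of intervals in which students were engaged (Figure 2) and interval-by-interval inter-observer agreement, can be sketched as follows. The ten-interval observation records below are invented for illustration.

```python
# Illustrative interval-recording calculations for the SDO data.
# The observation records are invented for the example.

def interval_ioa(obs1, obs2):
    """Percent of intervals on which two observers agreed about whether
    the student was academically engaged."""
    agreements = sum(a == b for a, b in zip(obs1, obs2))
    return 100.0 * agreements / len(obs1)

def percent_engaged(obs):
    """Percent of intervals in which the student was recorded as engaged."""
    return 100.0 * sum(obs) / len(obs)

# Two observers' records across ten intervals (1 = engaged, 0 = not).
primary   = [1, 1, 0, 1, 1, 1, 0, 0, 1, 1]
secondary = [1, 1, 0, 1, 0, 1, 0, 1, 1, 1]

ioa = interval_ioa(primary, secondary)   # observers agree on 8 of 10 intervals
engaged = percent_engaged(primary)       # engagement level plotted over sessions
```

In the study, values like `ioa` were computed for 33% of observations (yielding the reported 78%-100% range), and values like `engaged` form the data paths plotted in Figure 2.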