
1 Response to Intervention www.interventioncentral.org RTI: Schoolwide Academic Screening Jim Wright www.interventioncentral.org

2 Response to Intervention www.interventioncentral.org 2 RTI Literacy: Assessment & Progress-Monitoring To measure student ‘response to instruction/intervention’ effectively, the RTI model measures students’ academic performance and progress on schedules matched to each student’s risk profile and intervention Tier membership. Benchmarking/Universal Screening. All children in a grade level are assessed at least 3 times per year on a common collection of academic assessments. Strategic Monitoring. Students placed in Tier 2 (supplemental) reading groups are assessed 1-2 times per month to gauge their progress with this intervention. Intensive Monitoring. Students who participate in an intensive, individualized Tier 3 intervention are assessed at least once per week. Source: Burns, M. K., & Gibbons, K. A. (2008). Implementing response-to-intervention in elementary and secondary schools: Procedures to assure scientific-based practices. New York: Routledge.
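The tier-by-tier schedules above lend themselves to a simple lookup that a data team could keep next to its screening calendar. The sketch below is an illustrative assumption, not part of the original presentation; the tier labels and frequencies are taken from the slide, and the structure and function name are hypothetical.

```python
# Minimal sketch: encode the assessment schedules described on this slide as a
# lookup table. Names and structure are illustrative, not from the source.
MONITORING_SCHEDULE = {
    "Tier 1": "benchmarking/universal screening at least 3 times per year (fall, winter, spring)",
    "Tier 2": "strategic monitoring 1-2 times per month",
    "Tier 3": "intensive monitoring at least once per week",
}

def schedule_for(tier: str) -> str:
    """Return the monitoring schedule for a given tier label."""
    return MONITORING_SCHEDULE[tier]

print(schedule_for("Tier 2"))  # strategic monitoring 1-2 times per month
```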

3 Response to Intervention www.interventioncentral.org 3 Educational Decisions and Corresponding Types of Assessment SCREENING/BENCHMARKING DECISIONS: Tier 1: Brief screenings to quickly indicate whether students in the general-education population are academically proficient or at risk. PROGRESS-MONITORING DECISIONS: At Tiers 2 and 3, ongoing ‘formative’ assessments to judge whether students on intervention are making adequate progress. INSTRUCTIONAL/DIAGNOSTIC DECISIONS: At any Tier, detailed assessment to map out specific academic deficits and discover the root cause(s) of a student’s academic problem. OUTCOME DECISIONS: Summative assessment (e.g., state tests) to evaluate the effectiveness of a program. Source: Hosp, M. K., Hosp, J. L., & Howell, K. W. (2007). The ABCs of CBM: A practical guide to curriculum-based measurement. New York: Guilford Press.

4 Response to Intervention www.interventioncentral.org Curriculum-Based Measurement: An Introduction

5 Response to Intervention www.interventioncentral.org 5 Curriculum-Based Measurement: Advantages as a Set of Tools to Monitor RTI/Academic Cases Aligns with curriculum goals and materials Is reliable and valid (has ‘technical adequacy’) Is criterion-referenced: sets specific performance levels for specific tasks Uses standard procedures to prepare materials, administer, and score Samples student performance to give objective, observable ‘low-inference’ information about student performance Has decision rules to help educators interpret student data and make appropriate instructional decisions Is efficient to implement in schools (e.g., training can be done quickly; the measures are brief and feasible for classrooms, etc.) Provides data that can be converted into visual displays for ease of communication Source: Hosp, M.K., Hosp, J. L., & Howell, K. W. (2007). The ABCs of CBM. New York: Guilford.

6 Response to Intervention www.interventioncentral.org 6 Among other areas, CBM techniques have been developed to assess: reading fluency, reading comprehension, math computation, writing, spelling, phonemic awareness skills, and early math skills.

7 Response to Intervention www.interventioncentral.org 7 Measuring General vs. Specific Academic Outcomes General Outcome Measures… Track the student’s increasing proficiency on general curriculum goals such as reading fluency. Example: CBM-Oral Reading Fluency (Hintz et al., 2006). Are most useful for longer-term measurement (e.g., to set and track IEP goals over the timespan of a school year). Sources: Burns, M. K., & Gibbons, K. A. (2008). Implementing response-to-intervention in elementary and secondary schools: Procedures to assure scientific-based practices. New York: Routledge. Hintz, J. M., Christ, T. J., & Methe, S. A. (2006). Curriculum-based assessment. Psychology in the Schools, 43, 45-56.

8 Response to Intervention www.interventioncentral.org 8 Measuring General vs. Specific Academic Outcomes Specific Sub-Skill Mastery Measures… Track short-term student academic progress with clear criteria for mastery (Burns & Gibbons, 2008). Example: Letter Identification. Are helpful in assessing whether the student has acquired short-term skills whose acquisition may require weeks rather than months. Sources: Burns, M. K., & Gibbons, K. A. (2008). Implementing response-to-intervention in elementary and secondary schools: Procedures to assure scientific-based practices. New York: Routledge. Hintz, J. M., Christ, T. J., & Methe, S. A. (2006). Curriculum-based assessment. Psychology in the Schools, 43, 45-56.

9 Response to Intervention www.interventioncentral.org CBM: Developing a Process to Screen and Collect Local Norms Jim Wright www.interventioncentral.org

10 Response to Intervention www.interventioncentral.org 10 Building-Wide Screening: Assessing All Students (Stewart & Silberglit, 2008) Screening data in basic academic skills are collected at least 3 times per year (fall, winter, spring). Schools should consider using ‘curriculum-linked’ measures such as Curriculum-Based Measurement that will show generalized student growth in response to learning. If possible, schools should consider avoiding ‘curriculum-locked’ measures that are tied to a single commercial instructional program. Source: Stewart, L. H. & Silberglit, B. (2008). Best practices in developing academic local norms. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 225-242). Bethesda, MD: National Association of School Psychologists.

11 Response to Intervention www.interventioncentral.org 11 Building-Wide Screening: Using a Wide Variety of Data (Stewart & Silberglit, 2008) Screenings can be compiled using: Fluency measures such as Curriculum-Based Measurement. Existing data, such as office disciplinary referrals. Computer-delivered assessments, e.g., Measures of Academic Progress (MAP) from www.nwea.org Source: Stewart, L. H. & Silberglit, B. (2008). Best practices in developing academic local norms. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 225-242). Bethesda, MD: National Association of School Psychologists.

12 Response to Intervention www.interventioncentral.org 12 Measures of Academic Progress (MAP) www.nwea.org

13 Response to Intervention www.interventioncentral.org 13 Applications of Screening Data (Stewart & Silberglit, 2008) Screening data can be used to: Evaluate and improve the current core instructional program. Allocate resources to classrooms, grades, and buildings where student academic needs are greatest. Guide the creation of targeted Tier 2 (supplemental intervention) groups. Set academic goals for improvement for students on Tier 2 and Tier 3 interventions. Move students across levels of intervention, based on performance relative to that of peers (local norms). Source: Stewart, L. H. & Silberglit, B. (2008). Best practices in developing academic local norms. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 225-242). Bethesda, MD: National Association of School Psychologists.
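To make the final bullet concrete, the sketch below shows one way a data team might turn grade-level screening scores into local percentile ranks and flag students falling below a cutoff for possible Tier 2 support. The scores, the 25th-percentile cutoff, and the helper function are illustrative assumptions, not recommendations from Stewart & Silberglit (2008).

```python
# Minimal sketch: compute local-norm percentile ranks from one grade level's
# screening scores and flag students below a hypothetical cutoff.

def percentile_rank(score: float, all_scores: list[float]) -> float:
    """Percent of screened peers scoring at or below this score."""
    return 100.0 * sum(s <= score for s in all_scores) / len(all_scores)

grade3_orf_wcpm = [42, 55, 61, 70, 71, 78, 82, 90, 95, 104]  # hypothetical words correct per minute
CUTOFF_PERCENTILE = 25  # hypothetical cutoff for supplemental (Tier 2) support

flagged = [s for s in grade3_orf_wcpm
           if percentile_rank(s, grade3_orf_wcpm) <= CUTOFF_PERCENTILE]
print(flagged)  # [42, 55]: lowest-performing readers relative to local grade-level peers
```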

14 Response to Intervention www.interventioncentral.org 14 Screening Data: Supplement With Additional Academic Testing as Needed (Stewart & Silberglit, 2008) “ At the individual student level, local norm data are just the first step toward determining why a student may be experiencing academic difficulty. Because local norms are collected on brief indicators of core academic skills, other sources of information and additional testing using the local norm measures or other tests are needed to validate the problem and determine why the student is having difficulty. … Percentage correct and rate information provide clues regarding automaticity and accuracy of skills. Error types, error patterns, and qualitative data provide clues about how a student approached the task. Patterns of strengths and weaknesses on subtests of an assessment can provide information about the concepts in which a student or group of students may need greater instructional support, provided these subtests are equated and reliable for these purposes.” p. 237 Source: Stewart, L. H. & Silberglit, B. (2008). Best practices in developing academic local norms. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 225-242). Bethesda, MD: National Association of School Psychologists.

15 Response to Intervention www.interventioncentral.org 15 Steps in Creating a Process for Local Norming/Screening Using CBM Measures 1. Identify personnel to assist in collecting data. A range of staff and school stakeholders can assist in the school norming, including: Administrators Support staff (e.g., school psychologist, school social worker, specials teachers, paraprofessionals) Parents and adult volunteers Field placement students from graduate programs Source: Harn, B. (2000). Approaches and considerations of collecting schoolwide early literacy and reading performance data. University of Oregon. Retrieved from https://dibels.uoregon.edu/logistics/data_collection.pdf

16 Response to Intervention www.interventioncentral.org 16 Steps in Creating a Process for Local Norming/Screening Using CBM Measures 2. Determine the method for screening data collection. The school can have teachers collect data in the classroom or designate a team to conduct the screening: In-Class: Teaching staff in the classroom collect the data over a calendar week. Schoolwide/Single Day: A trained team of 6-10 sets up a testing area, cycles students through, and collects all data in one school day. Schoolwide/Multiple Days: A trained team of 4-8 either goes to classrooms or creates a central testing location, completing the assessment over multiple days. Within-Grade: Data collectors at a grade level norm the entire grade, with students kept busy with another activity (e.g., video) when not being screened. Source: Harn, B. (2000). Approaches and considerations of collecting schoolwide early literacy and reading performance data. University of Oregon. Retrieved from https://dibels.uoregon.edu/logistics/data_collection.pdf

17 Response to Intervention www.interventioncentral.org 17 Steps in Creating a Process for Local Norming/Screening Using CBM Measures 3. Select dates for screening data collection. Data collection should occur a minimum of three times per year, in fall, winter, and spring. Consider the following: Avoid screening dates within two weeks of a major student break (e.g., summer or winter break). Coordinate the screenings to avoid state testing periods and other major scheduling conflicts. Source: Harn, B. (2000). Approaches and considerations of collecting schoolwide early literacy and reading performance data. University of Oregon. Retrieved from https://dibels.uoregon.edu/logistics/data_collection.pdf
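As a rough illustration of the scheduling check described above, the sketch below tests whether a proposed screening date falls within two weeks of a school break. The break dates, the 14-day buffer, and the function name are hypothetical; they are not part of the original presentation.

```python
# Minimal sketch (hypothetical dates): flag proposed screening dates that fall
# within two weeks of a major student break.
from datetime import date

# Hypothetical calendar anchors: start of winter break, last day of school.
BREAKS = [date(2024, 12, 23), date(2025, 6, 26)]

def too_close_to_break(screening_date: date, buffer_days: int = 14) -> bool:
    """Return True if the proposed date is within buffer_days of any break."""
    return any(abs((screening_date - b).days) < buffer_days for b in BREAKS)

print(too_close_to_break(date(2025, 1, 2)))   # True: 10 days after winter break begins
print(too_close_to_break(date(2025, 1, 21)))  # False: outside the two-week buffer
```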

18 Response to Intervention www.interventioncentral.org 18 Steps in Creating a Process for Local Norming/Screening Using CBM Measures 4. Create a preparation checklist. Important preparation steps are carried out, including: Selecting the location of the screening Recruiting screening personnel Ensuring that training occurs for all data collectors Lining up data-entry personnel (e.g., for rapid computer data entry) Source: Harn, B. (2000). Approaches and considerations of collecting schoolwide early literacy and reading performance data. University of Oregon. Retrieved from https://dibels.uoregon.edu/logistics/data_collection.pdf

19 Response to Intervention www.interventioncentral.org Methods of Classroom Data Collection Jim Wright www.interventioncentral.org

20 Response to Intervention www.interventioncentral.org Activity: Classroom Methods of Data Collection In your teams: Review the potential sources of classroom data that can be used to monitor Tier 1 interventions. What questions do you have about any of these data sources? How can your school make full use of these data sources to ensure that every Tier 1 intervention is monitored? Classroom Data Sources: Existing records Global skills checklist Rating scales Behavioral frequency count Behavioral log Student work samples Work performance logs Timed tasks (e.g., CBM)

21 Response to Intervention www.interventioncentral.org 21 RTI ‘Pyramid of Interventions’ Tier 1 Tier 2 Tier 3 Tier 1: Universal interventions. Available to all students in a classroom or school. Can consist of whole-group or individual strategies or supports. Tier 2: Individualized interventions. A subset of students receives interventions targeting specific needs. Tier 3: Intensive interventions. Students who are ‘non-responders’ to Tiers 1 & 2 are referred to the RTI Team for more intensive interventions.

23 Response to Intervention www.interventioncentral.org Existing Records Description: The teacher uses information already being collected in the classroom that is relevant to the identified student problem. Examples of existing records that can be used to track student problems include: –Grades –Absences and incidents of tardiness –Homework turned in 23

24 Response to Intervention www.interventioncentral.org Global Skills Checklists Description: The teacher selects a global skill. The teacher then breaks that global skill down into specific, observable ‘subskills’. Each subskill can be verified as ‘done’ or ‘not done’. 24

25 Response to Intervention www.interventioncentral.org Global Skills Checklists: Example The teacher selects the global skill ‘organizational skills’. That global skill is defined as having the following components, each of which can be observed: arriving to class on time; bringing work materials to class; following teacher directions in a timely manner; knowing how to request teacher assistance when needed; having an uncluttered desk with only essential work materials. 25
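One way to picture such a checklist is as a simple mapping of observable subskills to 'done'/'not done' flags. The sketch below is purely illustrative (the subskills come from the slide; the variable names and True/False values are assumptions).

```python
# Minimal sketch: represent a global-skill checklist as subskill -> done flag.
organizational_skills = {
    "arrives to class on time": True,
    "brings work materials to class": True,
    "follows teacher directions in a timely manner": False,
    "knows how to request teacher assistance when needed": True,
    "has an uncluttered desk with only essential work materials": False,
}

done = sum(organizational_skills.values())
print(f"{done} of {len(organizational_skills)} subskills verified as done")
```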

26 Response to Intervention www.interventioncentral.org Behavioral Frequency Count Description: The teacher observes a student behavior and keeps a cumulative tally of the number of times that the behavior is observed during a given period. Behaviors that are best measured using frequency counts have clearly observable beginning and end points and are of relatively short duration. Examples include: –Student call-outs. –Requests for teacher help during independent seatwork. –Raising one’s hand to make a contribution to large-group discussion. 26

27 Response to Intervention www.interventioncentral.org Behavioral Frequency Count: How to Record Teachers can collect data on the frequency of student behaviors in several ways: Keeping a mental tally of the frequency of target behaviors occurring during a class period. Recording behaviors on paper (e.g., simple tally marks) as they occur. Using a golf counter, stitch counter, or other mechanical counter device to keep an accurate tally of behaviors. 27

28 Response to Intervention www.interventioncentral.org Behavioral Frequency Count: How to Compute If student behaviors are being tallied during a class period, frequency-count data can be reported as ‘X number of behaviors per class period’. If frequency-count data are collected in different spans of time on different days, however, schools can use the following method to standardize frequency-count data: –Record the total number of behaviors observed. –Record the number of minutes in the observation period. –Divide the total number of behaviors observed by the total minutes in the observation period. Example: 5 callouts observed during a 10-minute period = 0.5 callouts per minute. 28
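A minimal sketch of the standardization arithmetic described above follows; it simply divides the behavior tally by the length of the observation period. The function name is hypothetical, and the example numbers come from the slide.

```python
# Minimal sketch: convert a raw behavior tally into a rate per minute so that
# counts from observation periods of different lengths can be compared.

def behaviors_per_minute(total_behaviors: int, observation_minutes: float) -> float:
    """Divide the behavior tally by the length of the observation period."""
    return total_behaviors / observation_minutes

# Example from the slide: 5 callouts observed during a 10-minute period.
print(behaviors_per_minute(5, 10))  # 0.5 callouts per minute
```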

29 Response to Intervention www.interventioncentral.org Behavior Log Description: The teacher makes a log entry each time that a behavior is observed. An advantage of behavior logs is that they can provide information about the context within which a behavior occurs. (Disciplinary office referrals are a specialized example of a behavior log.) Behavior logs are useful for tracking ‘low-incidence’ problem behaviors. 29

30 Response to Intervention www.interventioncentral.org Behavior Log: Sample Form 30

31 Response to Intervention www.interventioncentral.org Rating Scales Description: A scale is developed that a rater can use to complete a global rating of a behavior. Often the rating scale is completed at the conclusion of a fixed observation period (e.g., after each class period). Daily / Direct Behavior Report Cards are one example of rating scales. 31

32 Response to Intervention www.interventioncentral.org Daily Behavior Report Card: Daily Version [Sample form entries: Jim Blalock, May 5, Mrs. Williams, Rm 108]

33 Response to Intervention www.interventioncentral.org Student Work Samples Description: Work samples are collected for information about the student’s basic academic skills, mastery of course content, etc. Recommendation: When collecting work samples: –Record the date that the sample was collected. –If the work sample was produced in class, note the amount of time needed to complete the sample (students can calculate and record this information). –If possible, collect 1-2 work samples from typical students as well to provide a standard of peer comparison. 33

34 Response to Intervention www.interventioncentral.org Work Performance Logs Description: Information about student academic performance is collected to provide insight into growth in student skills or use of skills in appropriate situations. Example: A teacher implementing a vocabulary- building intervention keeps a cumulative log noting date and vocabulary words mastered. Example: A student keeps a journal with dated entries logging books read or the amount of ‘seat time’ that she spends on math homework. 34

35 Response to Intervention www.interventioncentral.org Timed Tasks (e.g., Curriculum-Based Measurement) Description: The teacher administers structured, timed tasks to assess student accuracy and fluency. Example: The student completes a 2-minute CBM single-skill math computation probe. Example: The student completes a 3-minute CBM writing probe that is scored for total words written. 35

36 Response to Intervention www.interventioncentral.org Combining Classroom Monitoring Methods Often, methods of classroom data collection and progress-monitoring can be combined to track a single student problem. Example: A teacher can use a rubric (checklist) to rate the quality of student work samples. Example: A teacher may keep a running tally (behavioral frequency count) of student callouts. At the same time, the student may be self-monitoring his rate of callouts on a Daily Behavior Report Card (rating scale). 36

37 Response to Intervention www.interventioncentral.org Activity: Classroom Methods of Data Collection In your teams: Review the potential sources of classroom data that can be used to monitor Tier 1 interventions. What questions do you have about any of these data sources? How can your school make full use of these data sources to ensure that every Tier 1 intervention is monitored? Classroom Data Sources: Existing records Global skills checklist Rating scales Behavioral frequency count Behavioral log Student work samples Work performance logs Timed tasks (e.g., CBM)

