Supporting Teachers Within RTI: The Role of the School Psychologist
Jon Potter & Lisa Bates, OrRTI Project
OSPA/WSASP Conference, Fall 2010
Objectives
Develop awareness of the potential role of the school psychologist in an RTI model within the domains of:
– Assessment/Evaluation
– Consultation
Core Principles of RTI
All children can be taught effectively
Focus on prevention and early intervention
Provide services using a tiered model
Use a problem-solving method to make decisions
Use research-based interventions
Monitor student progress to inform instruction
Use data to make decisions
Use assessment for different purposes: screening, skill diagnostics, and progress monitoring
(NASDSE, 2006)
Essential Components of an RTI Model
High quality instruction and intervention materials
System for collecting data
Data-based decision making using a problem-solving method
(NASDSE, 2006)
The Role of School Psychologists
Assessment/Evaluation
Consultation/Coaching
How much time do you currently spend on these activities?
Assessment/Evaluation?
Consultation/Coaching?
Assessment/Evaluation
What to Assess: ICEL
Instruction: How content is taught
Curriculum: What content is taught
Environment: Accommodations, modifications, & other environmental considerations
Learner: Things specific to the student
How to Assess: RIOT
Review: existing information
Interview: parents, teachers, student
Observe: student during instruction
Test: student skills
(Sources range from least to most intrusive and from indirect to direct.)
Goal: Convergent Data from Multiple Sources
Applying RIOT across each domain (Instruction, Curriculum, Environment, Learner) yields multiple sources and domains of data about why a student is struggling.
The Role of School Psychologists: Assessment
Assisting in the collection and analysis of academic screening and progress data
Screening – Given to all students regularly to determine who receives extra support
Progress Monitoring – Given to those students who are receiving extra support
Diagnostic – Given to a smaller number of students for whom more information is needed to create an intervention matched to the student’s needs
Additional Diagnostic Data
The major purpose for administering diagnostic tests is to provide information that is useful in planning more effective instruction. Diagnostic tests should only be given when there is a clear expectation that they will provide new information about a child’s difficulties learning to read that can be used to provide more focused, or more powerful, instruction.
Diagnostic Assessment Questions “Why is the student not performing at the expected level?” (Defining the Problem) “What is the student’s instructional need?” (Designing an Intervention)
Assessing Enabling Skills
Phonemic Awareness, Phonics (Alphabetic Principle), Oral Reading Fluency & Accuracy, Vocabulary, Reading Comprehension
Diagnostic Assessments
Quick Phonics Screener (Jan Hasbrouck)
Digging Deeper (Wendy Robinson)
CORE Multiple Measures
Error Analysis
Curriculum-Based Evaluation procedures (Ken Howell)
Digging Deeper Questions http://www.aea11.k12.ia.us/educators/idm/Day5_10/Digging_Questions_k8.pdf
CORE Multiple Measures
Error Analysis
Assisting in the collection and analysis of academic progress data
Assessment activities occur before a referral is made, to remediate a problem (screening, progress monitoring, diagnostic data)
Linked to comprehensive evaluation:
– Student has a disability (screening & progress monitoring data)
– The disability impacts their education (screening & progress monitoring data)
– Student needs specially designed instruction
Linked to IEP development:
– Develop goals (screening, diagnostic, and progress monitoring data)
– Monitor progress on goals (screening, diagnostic, and progress monitoring data)
Assisting in the collection and analysis of academic progress data Makes your job easier!!!!!!
How do these assessment activities (screening, progress monitoring, & diagnostic assessment) compare to what you are currently doing in the area of assessment? 26
The Role of School Psychologists: Assessment
Observing Instruction
2. Ensuring high quality instruction and intervention programs by assessing instructional contexts
Observing the critical components of effective teaching
Focus on teacher behaviors shown to improve student outcomes
(Brophy & Good, 1986; Gunter, Hummell, & Conroy, 1998)
ACTIVE ENGAGEMENT – MASTERY – OPPORTUNITIES TO LEARN
Instructional Delivery Features to Examine
Instructor provides multiple opportunities for students to practice instructional tasks
Students are successful completing activities at a high criterion level of performance
Students are engaged in the lesson during teacher-led instruction
Multiple Opportunities to Practice
Provides more than one opportunity to practice each new skill
Provides opportunities for practice after each step in instruction
Elicits group responses when feasible
Provides extra practice based on accuracy of student responses
Assess Opportunities to Respond
What is an Opportunity to Respond (OTR)? It needs to be operationally defined. For example: "An instructional question, statement, or gesture made by the teacher that seeks an academic response (e.g., 'What sound?', 'Sound it out', 'Point to the /a/ sound'). OTRs can include behavior-related statements or directives as long as they have an academic component (e.g., 'Write the answer in your workbook')."
Be clear!
Opportunities to Respond: How many times it takes to learn something new (Jo Robinson, 2008)
Accelerated learner: 4-14 times
Everybody else: 14-250 times
Truly disabled student: 250-350 times
Instructional Delivery Features to Examine
Instructor provides multiple opportunities for students to practice instructional tasks
Students are successful completing activities at a high criterion level of performance
Students are engaged in the lesson during teacher-led instruction
Students are successful completing activities at a high criterion level of performance
The group of students demonstrates a high percentage of accurate responses
Individual students demonstrate a high percentage of accurate responses
The teacher holds the same standard of accuracy for high performers and low performers
Students Are Successful
Levels of Mastery:
– 90% first-time correct on new material
– 95% subsequent responding (after the first time)
First Time Correct = How many errors are students making the first time they respond to the new tasks?
Successful Student Engagement
Correct Academic Responding (CAR): 90% first-time responding; 95% subsequent responding
Ensures that students are not practicing errors
Practice to automaticity/mastery
Provides practice at a high level of success to build accuracy and fluency
How do you measure CAR?
CAR = (# of correct student responses) / (# of opportunities to respond)
(Brophy & Good, 1986; Lyon, 1998; adapted from Martin & Tobin, 2006)
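A minimal sketch of the CAR calculation, assuming hypothetical observation tallies and a hypothetical split between first-time and subsequent responses; only the formula and the 90%/95% criteria come from the slides above.

```python
def correct_academic_responding(correct_responses, opportunities_to_respond):
    """CAR = # of correct student responses / # of opportunities to respond."""
    if opportunities_to_respond == 0:
        return 0.0
    return correct_responses / opportunities_to_respond

# Hypothetical tallies from a short observation of teacher-led instruction
first_time = {"correct": 17, "opportunities": 20}   # responses to brand-new material
subsequent = {"correct": 58, "opportunities": 60}   # responses after the first attempt

first_time_car = correct_academic_responding(first_time["correct"], first_time["opportunities"])
subsequent_car = correct_academic_responding(subsequent["correct"], subsequent["opportunities"])

# Criteria from the slide: 90% on first-time responding, 95% on subsequent responding
print(f"First-time CAR: {first_time_car:.0%} (criterion: 90%)")
print(f"Subsequent CAR: {subsequent_car:.0%} (criterion: 95%)")
```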
Error Correction
Should occur after ALL errors
Prevents students from learning misrules
Positively correlated with student achievement and ratings of teacher effectiveness
(Adapted from Martin & Tobin, 2006)
Error Correction Does the teacher correct errors? Does the teacher provide opportunities for the students to respond again to that item? 39
Students are successful completing activities at a high criterion level of performance
The group of students demonstrates a high percentage of accurate responses
Individual students demonstrate a high percentage of accurate responses
The teacher holds the same standard of accuracy for high performers and low performers
Holds same standard of accuracy for high performers and low performers
Teachers hold the same expectations for low achievers and high achievers – no excuses!
Instructional Delivery Features to Examine
Instructor provides multiple opportunities for students to practice instructional tasks
Students are successful completing activities at a high criterion level of performance
Students are engaged in the lesson during teacher-led instruction
Students are engaged in the lesson during teacher-led instruction
The teacher gains student attention before initiating instruction
Paces the lesson to maintain attention
Maintains close proximity to students
Transitions quickly between tasks
Intervenes with off-task students to maintain their focus
Instructional Pacing: 8-12 Opportunities to Respond per Minute
Opportunity to learn
Provides massed-trial practice to build fluency and achieve mastery
Provides opportunity to monitor student performance
Positively correlated with student on-task behavior and student academic achievement
Pacing = (# of opportunities to respond) / (# of minutes observed)
(Adapted from Martin & Tobin, 2006)
Tools for Measuring Effective Teaching: Data

Instructional Variable | Observation Data | Recommended Criteria
Pacing | (# of opportunities to respond) / (# of minutes observed) | 8-12 OTRs per minute for the most intensive instruction
Student Accuracy | (# of correct responses) / (# of opportunities to respond) | 90% or above
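As a rough illustration of how these two metrics could be summarized from an observation tally, here is a minimal sketch. The function and the sample numbers (150 OTRs, 128 correct responses, 15 minutes) are hypothetical; only the 8-12 OTRs per minute and 90% accuracy criteria come from the table above.

```python
def summarize_observation(otr_count, correct_count, minutes_observed):
    """Summarize observation tallies against the recommended criteria above."""
    pacing = otr_count / minutes_observed        # opportunities to respond per minute
    accuracy = correct_count / otr_count         # proportion of correct responses

    return {
        "pacing_per_minute": round(pacing, 1),
        "pacing_meets_criterion": 8 <= pacing <= 12,   # 8-12 OTRs/min (intensive instruction)
        "student_accuracy": round(accuracy, 2),
        "accuracy_meets_criterion": accuracy >= 0.90,  # 90% or above
    }

# Hypothetical 15-minute small-group observation: 150 OTRs, 128 correct responses
print(summarize_observation(otr_count=150, correct_count=128, minutes_observed=15))
```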
How does this compare to your current practices in the area of instructional observation/assessment? 46
Consultation/Coaching
The Role of School Psychologists: Consultation/Coaching
School/District Leadership Team
School Data Teams
Individual Teachers
School/District Leadership Team
Provide input on district-wide decisions around:
1. Curriculum/Interventions
2. Assessments (screening, progress monitoring, diagnostic)
3. Decision rules
School Data Teams
Schoolwide Data Team
Monthly RTI Team
Individual Problem Solving Team
IEP Team
Schoolwide Data Team
Evaluate effectiveness of Tier I (Core) programming for ALL students
Determine areas of need and provide support for implementation of the Core
Fidelity to Core instruction
– Develop and implement fidelity monitoring systems
Evaluate effectiveness of Core programming for ALL students
[Chart: 68% / 17% / 15% of students]
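One way a schoolwide data team might produce percentages like these is to roll universal screening scores up into tier proportions. The sketch below is illustrative only: the cut scores and student scores are hypothetical, and the ~80% guideline in the final comment is a commonly cited rule of thumb rather than something stated in this presentation.

```python
def tier_percentages(screening_scores, benchmark_cut, intensive_cut):
    """Summarize universal screening scores into core/strategic/intensive percentages.

    The cut scores here are placeholders; a real team would use the published
    benchmarks for its screening measure (e.g., DIBELS or easyCBM).
    """
    total = len(screening_scores)
    core = sum(score >= benchmark_cut for score in screening_scores)
    intensive = sum(score < intensive_cut for score in screening_scores)
    strategic = total - core - intensive
    return {
        "core_pct": round(100 * core / total),
        "strategic_pct": round(100 * strategic / total),
        "intensive_pct": round(100 * intensive / total),
    }

# Hypothetical winter oral reading fluency screening scores for one grade level
scores = [92, 110, 45, 78, 130, 38, 85, 64, 101, 72, 55, 118, 97, 41, 88]
print(tier_percentages(scores, benchmark_cut=90, intensive_cut=60))
# Rule of thumb (not from this presentation): if far fewer than ~80% of students
# are at benchmark, strengthen Core instruction before layering on interventions.
```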
Grade Level Data Team
Use screening, progress monitoring, and diagnostic data to place students in interventions
Determine progress monitoring tools and appropriate student goals
Develop and help implement progress monitoring
Evaluate effectiveness of interventions
Examining Adequate Progress: 4 Points Below the Goal Line
[Graph: oral reading fluency scores plotted against an aimline; example response – add 15 minutes to intervention, reduce group size to 3 students]
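The "4 points below the goal line" rule lends itself to a simple programmatic check. The sketch below is a minimal illustration, assuming a hypothetical baseline, goal, timeline, and weekly oral reading fluency scores; it flags a student when the four most recent progress monitoring points all fall below the aimline.

```python
def aimline_value(week, baseline, goal, total_weeks):
    """Expected score at a given week on a straight aimline from baseline to goal."""
    return baseline + (goal - baseline) * week / total_weeks

def needs_plan_change(scores, baseline, goal, total_weeks, rule=4):
    """True if the last `rule` consecutive scores all fall below the aimline."""
    if len(scores) < rule:
        return False
    recent = list(enumerate(scores, start=1))[-rule:]
    return all(score < aimline_value(week, baseline, goal, total_weeks)
               for week, score in recent)

# Hypothetical data: baseline 42 wcpm, goal 72 wcpm over an 18-week intervention
weekly_scores = [44, 47, 45, 48, 47, 49, 50, 50]
if needs_plan_change(weekly_scores, baseline=42, goal=72, total_weeks=18):
    print("4 consecutive points below the aimline - consider changing the plan")
    print("(e.g., add 15 minutes to the intervention or reduce the group size)")
```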
Examining Intervention Cohort Data
[Graph: progress monitoring data for an intervention group (Amy, Chase, Mary, Isaiah) plotted against the aimline]
Individual Problem Solving Team
Coordinate additional data collection
– Diagnostic testing, record reviews, parent/teacher interviews, student observations
Create individualized interventions through problem-solving
Evaluate effectiveness of individualized interventions
The Problem Solving Model
1. Define the Problem: What is the problem and why is it happening?
2. Design Intervention: What are we going to do about the problem?
3. Implement and Monitor: Are we doing what we intended to do?
4. Evaluate Effectiveness: Did our plan work?
IEP Team
Evaluate student needs using diagnostic assessments
Evaluate student progress data
Assist in developing IEP services
Attend annual IEP meetings
Instructional Consultation with Individual Teachers
Consultation can occur at any level in the system
– Tier 1
– Tier 2
– Tier 3
Focus on observable teaching behaviors
– What can WE change? (alterable variables)
Alterable Variables How do we know what to change when students are not making adequate progress? Follow the data 61
What do we change? TTSD Example
Time
Group Size
Different program
Individual problem-solving
Time/Engagement
Alterable Variables Chart 63 http://oregonreadingfirst.uoregon.edu/downloads/Alt_Var_Chart_2.pdf
Time
http://oregonreadingfirst.uoregon.edu/downloads/Alt_Var_Chart_2.pdf
Time
Possible data sources/questions to consider:
– Initial screening and progress monitoring data: Is progress being made, but not closing the gap quickly enough? (see the sketch after this list)
– Program placement tests: Is the student placed appropriately?
– Daily student accuracy data: Is the student fairly accurate in daily lessons? (>85-90%)
– Lesson checkouts: Is the student passing checkouts regularly?
– Instructional observation data: Is a lot of time spent in transitions/non-academic activities? Is the student actively engaged and responding? Does the student get sufficient opportunities to respond (8-12 per minute in intensive interventions)?
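One way to frame the first question (progress is happening, but is the gap closing?) is to compare the student's actual rate of improvement to the rate needed to reach the goal on time. This sketch is illustrative only; the scores, goal, and timeline are hypothetical.

```python
def rate_of_improvement(scores, weeks):
    """Average weekly gain from the first to the most recent progress monitoring score."""
    return (scores[-1] - scores[0]) / weeks

def needed_rate(current, goal, weeks_remaining):
    """Weekly gain required to reach the goal by the target date."""
    return (goal - current) / weeks_remaining

# Hypothetical data: 8 weeks of intervention so far, goal of 90 wcpm in 16 more weeks
scores = [38, 40, 41, 43, 44, 46, 47, 49]
actual = rate_of_improvement(scores, weeks=8)
required = needed_rate(scores[-1], goal=90, weeks_remaining=16)

if 0 < actual < required:
    print(f"Gaining {actual:.1f} wcpm/week, but {required:.1f} wcpm/week is needed:")
    print("progress is being made without closing the gap - consider adding time.")
```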
Group Size
http://oregonreadingfirst.uoregon.edu/downloads/Alt_Var_Chart_2.pdf
Group Size
Possible data sources/questions to consider:
– Initial screening and progress monitoring data: Is progress being made, but not closing the gap quickly enough?
– Daily student accuracy data: Is the student fairly accurate in daily lessons? (>85-90%)
– Lesson checkouts: Is the student passing checkouts regularly?
– Instructional observation data: Does the student get enough individual attention (i.e., opportunities to respond, corrective feedback, praise, etc.)? Is the student at the same instructional level as other students in the group?
Different Program
http://oregonreadingfirst.uoregon.edu/downloads/Alt_Var_Chart_2.pdf
Different Program
Possible data sources/questions to consider:
– Initial screening, progress monitoring, and diagnostic data: Is the intervention matched to student need? Does the student have multiple instructional needs? Is the student not making any progress? Are there prerequisite skills the student is lacking?
– Daily student accuracy data: Is the student inaccurate in daily lessons (below ~85-90%), even when provided regular corrective feedback?
– Lesson checkouts: Is the student not passing these consistently?
– Instructional observation data: Is the student frequently off-task?
Fidelity; Communication/Meetings
http://oregonreadingfirst.uoregon.edu/downloads/Alt_Var_Chart_2.pdf
Big Ideas
Use the skills you already have
Focus on evaluating instructional environments
Use data to guide your practice
If something isn't working, change it
Build capacity
Questions/Comments
Jon Potter – jpotter@ttsd.k12.or.us
Lisa Bates – lbates@ttsd.k12.or.us