Oregon Reading First: Statewide Mentor Coach Meeting February 18, 2005 © 2005 by the Oregon Reading First Center Center on Teaching and Learning
Overview of the Data-Based Leadership Model
Student Performance and Implementation Questions and Data Sources
Coach’s Role in the Data-Based Leadership Model
Linking to the Outcomes-Driven Model Low Risk: progress monitoring 3 times per year; At Risk: frequent progress monitoring
How Are We Doing? By grade and within each class, which students made adequate reading progress from the beginning of the year to the middle of the year? Explanation of New Summary of Effectiveness Reports
How Are We Doing? A school-based normative context for evaluating the effectiveness of instruction: projectwide and national
R. Good (2004) Model of Big Ideas, Indicators, and Timeline. Adapted from Good, R. H., Simmons, D. C., & Kame'enui, E. J. (2001). The importance and decision-making utility of a continuum of fluency-based indicators of foundational reading skills for third-grade high-stakes outcomes. Scientific Studies of Reading, 5, 257-288.
R. Good III (2004) Summary of Effectiveness by School, District, or Project Provides a quick summary of the effectiveness of core, supplemental, and intervention programs for students who require benchmark, strategic, or intensive support. It examines a step in time: Beginning to Middle of Year, Middle to End of Year, or Beginning to End of Year. It divides students by Instructional Recommendation: Benchmark, Strategic, or Intensive.
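The tabulation a Summary of Effectiveness report performs can be sketched in a few lines. This is a hypothetical illustration only: the field names, the sample roster, and the notion of "adequate progress" (reaching Benchmark status at the later time point) are assumptions, not the official DIBELS report logic.

```python
# Hypothetical sketch of a Summary of Effectiveness tabulation: for each
# starting Instructional Recommendation group, what percent of students
# reached Benchmark status at the later time point?
from collections import defaultdict

def summarize_effectiveness(students):
    """students: list of (recommendation_at_start, status_at_later_point)
    pairs, e.g. ("Strategic", "Benchmark")."""
    counts = defaultdict(lambda: [0, 0])  # group -> [reached_benchmark, total]
    for start_group, later_status in students:
        counts[start_group][1] += 1
        if later_status == "Benchmark":
            counts[start_group][0] += 1
    return {g: round(100 * hit / total, 1)
            for g, (hit, total) in counts.items()}

# Illustrative six-student roster, beginning-of-year group vs. mid-year status.
roster = [
    ("Benchmark", "Benchmark"), ("Benchmark", "Strategic"),
    ("Strategic", "Benchmark"), ("Strategic", "Strategic"),
    ("Intensive", "Strategic"), ("Intensive", "Benchmark"),
]
print(summarize_effectiveness(roster))
# -> {'Benchmark': 50.0, 'Strategic': 50.0, 'Intensive': 50.0}
```

The same percentages can then be compared across the Beginning-to-Middle, Middle-to-End, and Beginning-to-End windows the report describes.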
School Effectiveness Reports - Kindergarten
School Effectiveness Reports - First Grade
School Effectiveness Reports - Second Grade
School Effectiveness Reports - Third Grade
DIBELS National Norms Summary tables including percentages. See handouts.
Guiding Questions How are we doing compared to the Oregon Reading First schools? How are we doing compared to national standards of DIBELS users? Where do we want to focus our efforts for improvement (e.g., purchasing an intervention program, refining implementation)?
How Are We Doing? How Do We Get There? By grade and within each class, how are students performing in the middle of the year on essential components of Reading First? Are the reading programs in use effective? Examples of Using Data to Drive Instruction
How Do We Get There? Are the reading programs and materials being used as intended? Are efforts to improve fidelity working? Issues Around the Coaching Cycle
How Do We Get There? By grade and within each classroom, are the reading programs and materials in use effective in teaching the full range of students? A Plan to Build Capacity for Program-Specific Training
How Do We Get There? How should students be grouped? Do we need to reschedule adequate instructional time for the different reading groups? Do we need to revise who will deliver reading instruction? Using LPRs (Lesson Progress Reports) as a Data Source
Why Use LPRs? Regional Coordinators, Principals, Coaches: To analyze the overall status of the implementation. To continuously monitor mastery and lesson progress. To determine areas that require change, and to identify solutions. Teachers, Specialists, Assistants: To summarize and report lesson gains, in-program tests, and results. To communicate questions or comments to the coach. (NIFDI LPC Procedures, 2000)
Questions to Consider: 1. Is instruction differentiated? 2. Is lesson progress adequate? 3. Are students at a high level of mastery as measured by in-program tests? 4. What information or concerns has the teacher communicated?
1. Is Instruction Differentiated? Are the group sizes appropriate? Are programs matched to student performance level? Are all of the groups on the same lesson? (Is teacher treating all groups the same?) Are high, medium, and low groups completing lessons at optimum rates? Does the data indicate the need for acceleration for some students? (NIFDI Coaching Manual: Level I, 1999)
2. Is Lesson Progress Adequate? Does the data reveal potential problems with use of time? (Slow progress may indicate that teacher is (a) not following the schedule, (b) not teaching the program as specified, or (c) struggling with presentation skills or behavior management issues.) Are some lessons being repeated too many times? Will projections be met if current rate of lesson progress is continued? If projections will not be met, do justifiable reasons exist for not meeting them? Do the projections need to be changed? (NIFDI Coaching Manual: Level I, 1999)
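The projection question above is simple arithmetic: at the current pace, will the group reach the target lesson in the weeks remaining? A minimal sketch, with illustrative numbers and a hypothetical function name (not drawn from the NIFDI materials):

```python
# Hypothetical projection check: at the current rate of lessons per week,
# will the group reach its target lesson by the target date?
def projection_met(current_lesson, target_lesson, weeks_remaining,
                   lessons_per_week):
    projected = current_lesson + weeks_remaining * lessons_per_week
    return projected >= target_lesson

# A group on lesson 60 of a 120-lesson program with 14 weeks remaining:
print(projection_met(60, 120, 14, 4))    # 60 + 56 = 116 < 120 -> False
print(projection_met(60, 120, 14, 4.5))  # 60 + 63 = 123 >= 120 -> True
```

A result of False prompts exactly the follow-up questions on the slide: is there a justifiable reason, or does the projection need to change?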
3. Are students at a high level of mastery as measured by in-program tests? Did the teacher indicate the number of students who passed the in-program test(s)? Did the teacher miss an opportunity to give an in-program test? Did the teacher remediate and retest students who failed the test on the first try? Consider group performance: How many students overall passed the in-program test? Consider individual student performance: Who are the students who failed one test, or two consecutive tests? Which tests? Are the same students failing from time to time? Does the data indicate a possible need for a change in placement? Is lesson gain being achieved at the expense of mastery? (NIFDI Coaching Manual: Level I, 1999)
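The individual-student questions above amount to scanning each student's sequence of in-program test results for failures and consecutive failures. A hypothetical helper, assuming nothing about the actual NIFDI LPR format beyond a pass/fail result per test:

```python
# Hypothetical mastery scan: flag students with any in-program test failure,
# and students who failed two tests in a row.
def flag_mastery_concerns(results):
    """results: dict of student -> list of bools (True = passed, test order)."""
    failed_once, failed_consecutive = set(), set()
    for student, outcomes in results.items():
        if not all(outcomes):
            failed_once.add(student)
        if any(not a and not b for a, b in zip(outcomes, outcomes[1:])):
            failed_consecutive.add(student)
    return failed_once, failed_consecutive

tests = {"Ana": [True, True, True],
         "Ben": [True, False, True],
         "Cy":  [False, False, True]}
once, consecutive = flag_mastery_concerns(tests)
print(sorted(once), sorted(consecutive))  # -> ['Ben', 'Cy'] ['Cy']
```

Students flagged in the second set are the ones the slide singles out for a possible change in placement.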
4. What additional information or concerns has the teacher communicated? Did the teacher list types of items missed on in-program tests? Did the teacher include information on remediation and retesting? Did the teacher indicate a concern about an individual student? (NIFDI Coaching Manual: Level I, 1999)
Lesson Progress Organizer
Lesson Progress Report
Question and Answer / Large Group Sharing