I-RtI Network Tier 2: Progress Monitoring and Evaluation Tools. Facilitated/Presented by: Ruth Poage-Gaines and Terry Schuster. The Illinois RtI Network is a State Personnel Development Grant (SPDG) project of the Illinois State Board of Education. All funding (100%) is from federal sources. The contents of this presentation were developed under a grant from the U.S. Department of Education, #H325A. However, those contents do not necessarily represent the policy of the U.S. Department of Education, and you should not assume endorsement by the Federal Government. (OSEP Project Officer: Grace Zamora Durán)
What’s happening in your district?
Making connections: applying what we've learned. What coaching questions do you have? What's happening in your district? 10-minute check-in.
Progress Monitoring and Evaluation
What I know / What I want to know. 5 minutes: Find a coach you don't often talk with. Ask him or her what he or she knows, and wants to know, about progress monitoring and evaluation tools. Take 2 minutes, then reverse roles.
Review of November. Participants will:
Share and problem-solve around Fidelity Checklist/coaching activities. Use self-assessment (SAPSI-D), fidelity, and student outcome data to identify district strengths, needs, and action plans in order to make necessary changes to the District Improvement Plan (DIP/Rising Star). Create or refine a Tier 2 parent involvement communication feedback loop structure for their district, and share it with their district. Establish or refine their own current written procedures for Tier 2 intervention structures and logistics (manual).
August/September meeting's outcomes: EC meeting content and EC documents; definition of critical components at Tier 2; components of a communication plan; purpose and format of a district RtI manual; development of an RtI manual glossary; strategies for consensus building at Tier 2.
One of the best ways to remember something is to test yourself. 6 minutes: Time to review last month's outcomes.
November’s Fidelity Checklist Action Plans
DISCUSS & SHARE: Successes. Barriers. Additional supports and/or resources needed. Next step(s). 5 minutes: Have coaches discuss the progress that was made on their extension activities from last month. For some, this would be developing or refining the instruction and interventions section of their manual, including work on scheduling, logistics, and alignment.
Outcomes Participants will:
Identify critical features of research-based progress monitoring tools. Evaluate currently used PM tools for reading and math. Identify evaluation tools and processes at Tier 2. Plan for administration of the SAPSI-S using identified administration procedures. 2 minutes: Share outcomes for the day. (Tip: Turn toward the screen and allow participants one minute to read and reflect on these outcomes.)
Review Pre-Meeting Survey Results
1 minute: Share results of the pre-meeting survey for your area.
Progress monitoring tools
1 minute: Transition to next content area. I-RtI Network: Progress monitoring tools
25 minutes: Modified jigsaw read of Fuchs and Fuchs article
There should be at least 3 groups of 2. Each group reads the same 2-page section (sections are identified in the handout). Each reader should mark the text with a check mark if what they read confirms what they already know, an exclamation point if something is surprising or new, and a question mark if they still have questions they would like to explore. Take 5 minutes to read, then 3 minutes to compare responses for each category. After the pairs talk, report out to the larger group with an overview of the section, using the three categories as highlights: Something I knew was... Something that surprised me was... Something I would like to know more about is...
OPTION: If you have a large group, or many who want to know about HS progress monitoring, you can use the PM section from the Shinn article, the CBM Secondary Research article, and/or the PDF of the NHSC PowerPoint on using the EWS to progress monitor. Divide the HS readings up at your own discretion. Read and Respond
Big Idea: Teachers can use systematic progress monitoring in reading, mathematics, and spelling to identify students in need of additional or different forms of instruction, to design stronger instructional programs, and to effect better achievement outcomes for their students. 5 minutes: Quote from the end of the Fuchs article. Give 1 minute to read and reflect, then ask coaches to share how they see this happening in their districts at Tier 2, and where their district is strong or weak. Ask each coach to share one strength and one challenge. Record these on a whiteboard, chart paper, or the following slide. (Remember to 'unhide' it first.) Afterwards, you may ask if anyone has a 'coaching suggestion' to address any of the challenges mentioned.
The science of monitoring progress
Progress monitoring has a scientific base in assessment, with over 30 years of research. These tools have technical qualities that make them adequate for monitoring the academic progress of students. 2 minutes: We are going to look at the scientific standards from a body of research presented on the National Center on Student Progress Monitoring website, as well as those used in the rubrics developed by the Iowa Department of Education. As we encounter tools being used in our districts, we should have a solid basis for evaluating them. In a few minutes we will create a checklist that we can take back and use to evaluate the strengths and weaknesses of what is being used for progress monitoring. This will make for a more accurate use of the data in decision making around intensity of supports and movement through tiers.
Core Standards of Technical Adequacy
Foundational psychometric standards: Reliability. Validity. Sufficient number of alternate forms. Sensitivity to learning. Evidence of instructional utility. Specification of adequate growth. Description of benchmarks for adequate end-of-year performance or goal-setting process.
5-20 minutes: Standards for progress monitoring tools. (This information comes from a presentation titled "Choosing a Progress Monitoring Tool That Works for You" by Silvia Wen-Yu Lee and Sarah Short of the National Center on Student Progress Monitoring.) These are the standards by which the National Center evaluated the tools that were submitted. This represents the best science accumulated over 30 years of research. If we use a PM tool, it should be based on these guidelines. If we are looking at alternatives, like MAP or existing data, these factors are still important.
ASK: If you had three pennies to spend to indicate which of these is most important, how would you spend them? Coaches should write down their 'vote' on a sticky note, then share how they spent their coins and why. (Review definitions only if needed.)
(Optional, if more basic information is needed: Divide coaches into pairs, and have each pair take one item to define. If you have a smaller number, coaches can count off by 1s and 2s, with 1s taking the odd-numbered items and 2s taking the even items. Develop a definition, an example, and a reason why the item is important. SAMPLE: Reliability is the ability of an assessment to provide the same score over multiple iterations and scorers. A PM tool has high reliability when a student gets a similar ORF score with several different ORF probes. This is important because when progress monitoring, we need to know that a student's skills are actually growing, as opposed to the student merely appearing stronger because an ORF passage was easier.)
1. Reliability: the ability of a tool to produce strongly similar scores across scorers and administrations of the tool.
2. Validity: the ability of a tool to measure the identified skill or construct. 3. Sufficient number of alternate forms. 4. Sensitivity to learning: the capacity of the measure to reveal intervention effects. 5. Evidence of instructional utility: the extent to which use of the measure helps teachers plan more effectively so that student achievement increases. 6. Specification of adequate growth: slope of improvement, or average weekly increase in score, by grade level. 7. Description of benchmarks for adequate end-of-year performance, or a goal-setting process.
Progress Monitoring Rubric
Header on cover page. Descriptive info on each work group's section. 7 minutes: This is an example of a tool the Iowa Department of Education used to evaluate progress monitoring tools for use in the state. This was a very comprehensive work, and more information is available online. The items in the work group section are the sources of information for the evaluation of the tool and the standards they chose to use to begin evaluation. This is not the entire tool, but it is included here as an example of what the coaches will be creating for their own use. (The Buros Mental Measurements Yearbook is a compilation of evaluations of assessment tools published by the Buros Center for Testing.) This is information they will use in a moment to create their own rubric or checklist for evaluating their district's PM tools.
Create a Rubric to Evaluate Progress Monitoring Tools
Break into groups of four, by large/small district or elementary/MS-HS. Look at sample tools. Work together to create a checklist or rubric for evaluating PM tools, based on the samples and information from today.
25 minutes: Have copies of the tools: "Characteristics of Effective Measurement for RtI Worksheet," "Academic Progress Monitoring Rubric," "Evaluating RTI Tool for Monitoring Academic Progress" (Knight), "Choosing a PM Tool Activity," and the Progress Monitoring Characteristics Evaluation Grid. (There are additional tools available online. If they are not already familiar, have coaches look at the current location of the Tools Chart.) Group coaches by large and small districts, or by elementary and high school, depending on the interests in your group. Keep group size to four. Ask coaches to create a rubric or checklist to assess the tools/data being used to make progress monitoring decisions. Pass out the templates and let coaches create an evaluation rubric or checklist that includes the categories most important to them. Each group should share their finished product (using the electronic version of the template and sending it to the presenter is one option; a document camera can be used to share a finished paper version).
Use Existing Data: What existing data could you use to progress monitor in reading? In math? How could you use the rubric you just created to assess the strengths and weaknesses of that data? 7 minutes: The ideal approach is to use scientifically research-based PM tools, but sometimes it is more efficient to use existing data. It is important to know the limitations of existing tools. For example: Is there a high level of validity? Are the assessments at the same or different levels? What requirements of a good assessment (type of items/questions asked, etc.) are met? In the same groups, give the coaches 5 minutes to talk about existing data they could use to progress monitor. They should evaluate this data using their rubric or checklist. (E.g., some MS and HS have developed common assessments. These can vary in reliability and validity, but if there are like items on common assessments that we can compare over time, could this be a source of progress monitoring for some of the more difficult-to-assess skills, like informational text?) **Ask coaches to share some existing data being used within their districts. What are some of the pros and cons of doing this? How can we apply the characteristics for evaluating PM tools to improve the use of existing data?
Time to Reflect Questions/Comments
One thing I learned during this section… One thing I would like to have clarified is… One way I could apply this learning is… 5 minutes: To reflect and clarify understanding, each coach writes the completion to one of these stems and then shares with an elbow partner. Clarifications may be shared with the group. Questions/Comments
Coaching Progress Monitoring
Coaching Progress Monitoring
Partnership Principle – Dialogue. Be humble about what you know. Balance advocacy with inquiry. Ask open-ended questions that prompt thinking. 15 minutes: According to Jim Knight's partnership principles, dialogue is an important component of a coaching relationship. One technique for facilitating dialogue is asking open-ended questions, which cannot be answered with a 'yes' or 'no'. Activity: Have coaches think, then pair, then share, generating 3-5 open-ended questions that could be used to facilitate dialogue and discussion around progress monitoring practices. These could be captured on chart paper or on the blank ppt slide following this one. (Remember to 'unhide' it first.)
Evaluation Tools I-RtI Network
1 minute: How will we know if the Tier 2 supports we have in place are effective and efficient? Evaluation Tools
RESPONSE TYPES Positive Questionable Poor 2 minutes:
To assess whether an intervention is working, we look at a student's response to the intervention and make decisions accordingly. PM data allow this evaluation. We compare these data to the student's goal or, at Tier 2, the group goal. We also need to look at the fidelity of the intervention's delivery before we continue, modify, or change the instruction or intervention.
Parts of a Goal… Timeline Condition Behavior Criterion
Timeline: when the expected progress will be accomplished. Condition: the specific circumstances under which the behavior will occur. Behavior: the specific action that is expected. Criterion: the standard to which the behavior is expected to be performed.
2 minutes: A review of the structure of a written goal for student performance. Decision rules are needed to determine how a district, building, or grade/department is going to establish goals. Develop criteria for consistent goal setting, based either on a cut-off score (all students will reach ___ (33 CD) based on a standard) or on ROI, but make goal setting basic and systematic to be time efficient. Stem for setting a goal: In (#) weeks, when (condition) occurs, (student) will (behavior) to (criterion). EX: In 32 weeks, when given a 4th grade Reading Curriculum Based Measure (RCBM), Sarah will read 80 correct words per minute with 3 or fewer errors, 2 out of 3 trials.
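The criterion in a goal like Sarah's can be derived systematically from a baseline score and a chosen ROI. A minimal sketch follows; the baseline of 32 wcpm and the 1.5 wcpm/week ROI are assumed values for illustration, not figures from the slide.

```python
# Hedged sketch: projecting a goal criterion from baseline + weekly growth,
# matching the timeline/condition/behavior/criterion structure above.
def goal_score(baseline, roi_per_week, weeks):
    """Project the end-of-timeline criterion from baseline plus weekly ROI."""
    return baseline + roi_per_week * weeks

# E.g., a student reading 32 wcpm with an ambitious ROI of 1.5 wcpm/week
# over a 32-week timeline:
print(goal_score(baseline=32, roi_per_week=1.5, weeks=32))  # → 80.0
```

Keeping the computation this mechanical is what makes district-wide goal setting "basic and systematic," as the notes recommend.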
Setting Group-Level Goals
Determine an ambitious yet achievable ROI for your grade level (1.5-2 times the expected ROI) = ambitious ROI. Multiply the ambitious ROI by the total number of weeks between benchmark periods (Danville CCSD 118, Fall to Winter = 13 weeks) = expected gain. Subtract this expected gain from the next benchmark goal = target score. Determine the percentage of students at the target score for the previous benchmark period = expected percentage at benchmark.
2 minutes: A method for setting group-level goals based on rate of improvement. For example, the expected ROI for math computation at 3rd grade is 0.3 digits per week. In order to close the achievement gap, the average ROI for the group will need to be higher than the expected ROI. The expected gain for a 3rd grade student after 8 weeks is 2.4 digits; for a math computation intervention group that lasts 8 weeks, an ambitious goal would be 3.6-4.8 digits. This would be a weekly goal of 0.45-0.6 digits per week.
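The four steps above can be sketched numerically. The 0.3 digits/week expected ROI and the 13-week Fall-to-Winter window come from the slide; the 1.5x multiplier and the Winter benchmark of 25 digits correct are assumptions chosen for illustration.

```python
# Sketch of the four group-level goal-setting steps above.
# The multiplier (1.5x) and Winter benchmark (25) are assumed values.
expected_roi = 0.3                           # digits correct/week, 3rd grade
ambitious_roi = 1.5 * expected_roi           # step 1 → 0.45
weeks = 13                                   # Fall-to-Winter benchmark window
expected_gain = ambitious_roi * weeks        # step 2 → ≈ 5.85 digits
winter_benchmark = 25                        # assumed next benchmark goal
target_score = winter_benchmark - expected_gain  # step 3 → ≈ 19.15

print(f"Students at or above {target_score:.1f} digits in the Fall are "
      f"on track for the Winter benchmark of {winter_benchmark}.")
# Step 4: compute the percentage of students who met this target score at
# the previous benchmark period = expected percentage at benchmark.
```

Running the same arithmetic with each grade level's published ROI keeps the target scores consistent across buildings.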
Setting Group-Level Goals
2 minutes: What type of response is expected for the group receiving the intervention? An intervention is deemed effective if 60-70% of students are making sufficient progress to close the gap between the average of the group and the median of the other students in Tier 1. In addition to looking at individual student goals, we can look at the median of the group and set goals for the group over time. What data are you using to assess this, and where does this type of analysis occur? Your decision-making team should also look at fidelity data when evaluating these data. Remember: an at-risk student's rate of improvement must be greater than the rate of improvement of a typical student in order to "close the gap" and return to grade-level functioning.
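A quick arithmetic check makes that reminder concrete: if the at-risk ROI only equals the typical ROI, the gap never closes, and the closing time depends on the difference between the two rates. Every number below is invented for the example.

```python
# Illustration only: all numbers here are invented for the example.
typical_roi = 1.0    # wcpm gained per week by a typical grade-level peer
at_risk_roi = 1.6    # wcpm gained per week by the at-risk student
gap = 20             # wcpm the student is behind grade-level peers

closing_rate = at_risk_roi - typical_roi  # must be > 0 or the gap never closes
weeks_to_close = gap / closing_rate
print(f"Gap closes in about {weeks_to_close:.0f} weeks")  # → about 33 weeks
```

This is why an intervention whose students grow at exactly the typical rate can look "successful" on raw scores while leaving every student just as far behind.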
How well is Tier 2 working?
Curriculum: Are we adhering to the curriculum as planned? Instruction: Are we using instructional strategies/routines as planned? Assessment: Are we administering and scoring assessments reliably? Process: Are we adhering to the process as planned? 3 minutes: If we see that intervention groups are not making progress, what else can we look at? Four 'types' of fidelity: a review of April 2013's EC meeting.
2 minutes: An Intervention Documentation Worksheet is one tool for evaluating intervention at Tier 2. This version tracks the time, the program used, and the focus of instruction (reading, in this case). From this we can assess whether the student was present for the intervention, how many minutes of additional instruction were delivered, and what the student received. At this level, we look at fidelity data alongside PM data to assess how well an intervention is working and to make decisions about whether to continue, modify, or discontinue it.
Resource: Intervention Evaluation & Alignment Chart
2 minutes: Some examples of fidelity tools. For the following examples, give coaches time to look at each and process, then talk about how these could be used to evaluate the effectiveness of our Tier 2 interventions or processes.
Implementation Check. Teacher:___________ Date:___________ Location:___________ Group:___________ Comments by:___________ Time:___________
Organization (Yes / No / N/A, Comments): Materials organized and ready. Begins lesson promptly. Finishes lesson on time. Students on task.
2 minutes: Is the intervention delivered as designed? A checklist of required steps for implementation of the selected intervention or strategy. Ask coaches to share how they assess this for interventions in reading and math.
Collaborative Team Progress Planning Tool
Collaborative Team Progress Planning Tool: Tier 2. Is supplemental support sufficient? (Analyze the number and percentage of students needing less supplemental support. For DIBELS and AIMSweb, this can be found on the Summary of Effectiveness Report or Summary of Impact Report.) How many students have met benchmark as a result of supplemental support? How many students have not made adequate progress? Respond to the following questions to reflect on the sufficiency of specific aspects of supplemental support: For each student who has not made adequate progress, were decision rules followed and appropriate instructional adjustments made (refer to decision rules documentation and ICEL tool)? Are students receiving adequate time for supplemental support (e.g., 30 minutes for supplemental reading support in addition to the core)? Is group size appropriate for supplemental support (e.g., for reading, 3-5 students per targeted intervention group)? Is supplemental instruction aligned with core instruction? Are there targeted interventions for all of the "Big Ideas" of reading, math, or behavior? Is there a correct match between student need and the instructional focus of supplemental support? Are we implementing supplemental support as intended (use of materials, sequencing, pacing, instructional strategies or routines, differentiating the supplemental instruction, sufficient time for student practice, other practices)? Should any curricular materials or instructional practices be added, discontinued, or replaced due to lack of evidence, need, or lack of effectiveness with our students? 5 minutes: Another tool that could be used by a DLT or BLT to evaluate Tier 2 supports. Give coaches 3 minutes to look at this tool and talk with a partner about (1) potential uses and (2) its strengths and weaknesses.
Take a look at the process
To consider whether a practice has been implemented with fidelity, the practice must first be clearly defined (Century, Rudnick & Freeman, 2010). By clearly defining a practice, expectations are spelled out, creating an understanding of what needs to be accomplished. From that understanding, educators are able to reflect on the integrity of their work and then plan next steps to improve implementation (Fixsen, Blase, Horner & Sugai, 2009). 10 minutes: Link to manual work: where in your RtI/MTSS manual have you addressed these processes? From Colorado Department of Education RtI Implementation Rubrics, 2010.
Time to Reflect Questions/Comments
One thing I learned during this section… One thing I would like to have clarified is… One way I could apply this learning is… 5 minutes: To reflect and clarify understanding, each coach writes the completion to one of these stems and then shares with an elbow partner. Clarifications may be shared with the group. Questions/Comments
SAPSI-S Timelines BLT Administration Measurement of Growth
Evaluation and Action Planning. 15 minutes: Review procedures and ask coaches who have given the SAPSI-S for their best 'coaching tips' for this process. Ask: What other administration concerns do you have?
Other reminders to all: Timeline = January to February 28 (must submit before March 1, 2014). Administer with a Building Leadership Team. To mark a "yes," the item must have been in place for 6 or more months. Sources of documentation need to be identified for each "yes" response. Administration length varies depending on the level of discussion associated with each item; expect at least 1 hour.
Coaching tips: Set the stage for ECs to administer this with LC support. This data will be compared to last year's baseline data to assess growth in SAPSI-S key components. Afterwards, an action plan will be developed to continue the RtI/MTSS improvement process.
Time-saving tips: Print out or open your responses from last year and use them as your starting point. Quickly ascertain whether the items that were marked are still in place, then concentrate on what additional items can be checked. Especially if you are working with different team members, you may want to provide the SAPSI-S to the team ahead of time and ask that they preview the items, think about the school's status, and mark it. (You can jigsaw this and assign each team member a portion of the SAPSI-S to review prior to the meeting.)
During administration: Use a projector to display the SAPSI-S as you complete it as a team, or at least make sure everyone has a copy. As the facilitator, ask follow-up and clarifying questions as you go to ensure that the ratings are as accurate as possible. Use the SAPSI-S Glossary of Terms, Crosswalk, and Language Differences documents to help with this. At the end of the administration, consider asking the team if there are major items they believe ought to be addressed this year.
Explain that at your next meeting, you will present graphs of their data and develop an action plan. However, it is nice to get a list of key items now, while the items have been recently discussed. This will make action planning more efficient next time.
Key Ideas from Today: PM data can be used to identify at-risk students and strengthen Tier 2 supports. Effective PM tools are built on research-based standards. Open-ended questions facilitate dialogue. Evaluation tools can be used to continuously improve Tier 2. 3 minutes
Closing Activities Online Post-Meeting Survey
External Coach Fidelity Checklist (No new item this month) Network and ISBE Evaluations 10 minutes
On-Site Coaching: Complete the participant agenda and communicate your needs to your LC. The purpose of on-site coaching is the application of the content to your work and your school/district needs. 5 minutes