Thinking about progress monitoring: Decisions and instructional change strategies Dr. Lisa Habedank Stewart Minnesota State Univ Moorhead 218.477.4081.


Thinking about progress monitoring: Decisions and instructional change strategies. Dr. Lisa Habedank Stewart, Minnesota State University Moorhead. 218.477.4081, stewart@mnstate.edu. 1:15-3:30pm.

Credits: Minnesota Reading Corps; U of O folks (Dr. Mark Shinn, Dr. Roland Good); U of M folks (Dr. Matt Burns, Dr. Ted Christ); Aimsweb (www.aimsweb.com); DIBELS Data System (dibels.uoregon.edu); and the graduate students, practicum sites, and school districts I’ve worked with: Moorhead, Fargo, West Fargo, St Croix River Ed District, Fergus Falls Sped Coop, …

Progress Monitoring & RTI. Data-based decision making improves student outcomes. The more “at risk” a student is (and the more intensive our interventions), the more important frequent progress monitoring becomes. Not all CBM measures and not all DIBELS measures have adequate reliability and validity!

“Good” Progress Monitoring. Progress monitoring uses:
- reliable, valid measures tied to important educational outcomes
- long term measurement, not just short term/mastery measurement
- measures that are sensitive to student growth
- frequent administration (1 to 4x per month)
- data collected with fidelity
My examples will use General Outcome Measures (CBM or DIBELS) in reading. Sensitivity is part of validity but warrants its own mention when discussing progress monitoring. Measures should also be brief, cheap, etc., but those are practical issues.

Most Tools that Meet Standards are Members of the Curriculum-Based Measurement (CBM) “Family”. Main point: some of the best tools out there for progress monitoring in reading are in the CBM “family” (see the General Outcome Measure CBM module). AIMSweb and DIBELS are both in the CBM “family,” as are the MBSP measures, Progress Pro, and other tools on the list that do not appear in this screen shot. Note that this graphic shows that Accelerated Reader-Reading does NOT have adequate reliability, and that some of the measures are “better” (meet all standards) than others. The graphic is an excerpt from the December 2007 version of the studentprogress.org Tools review (page 1 of 3 on the website). Some tools reviewed are not in the CBM family (e.g., the TOWRE, the TOSWRF, and the STAR), and some of those do not meet standards. Some of the CBM “family” measures (e.g., DIBELS Retell Fluency, Word Use Fluency) also do not meet standards. www.studentprogress.org, click on “Tools”.

Short Term (Mastery) and Long Term Progress Monitoring. Mastery monitoring tests subskill mastery and individual lesson effectiveness. Examples: Q and A, worksheets, following directions, unit tests, “hot” reads, accuracy and skills “checks,” CBE/CBA. Long term General Outcome Measures test retention, generalization, and progress toward an overall general outcome (such as reading). Examples: CBM, DIBELS. Main point: schools need to use long term measurement for high quality progress monitoring. You can use lots of things to monitor progress, but for RTI you need at least ONE method of progress monitoring that is a GOM (General Outcome Measure), a term used to describe the “CBM family” (see the General Outcome Measure CBM module for more information).
Both types of progress monitoring (short term and long term) are very important for the professional teacher. Mastery monitoring includes getting 10 of 10 math facts correct, picking out the key parts of a story for comprehension, completing an in-class writing assignment with all the key writing components, passing the unit test on the Native Americans of NW Minnesota, etc. Mastery monitoring is also needed and provides good data about whether a student can demonstrate understanding of the concepts and skills presented in that lesson or unit. This is very useful information for lesson planning, scaffolding for students, differentiating instruction, etc. However, this type of progress monitoring does not tell us whether our students are retaining and generalizing those skills over time to make progress toward long-term key educational outcomes like becoming successful overall readers.
You can lose the forest for the trees… Long term monitoring of student progress allows the teacher to decide whether the intervention is working over time: is the student becoming a better reader, writer, problem solver, etc.? The next slide gives another example of the difference between the two and why monitoring progress with general outcome measures matters.

Both Mastery Monitoring and Long Term Progress Monitoring are Important. Sometimes mastering subskills doesn’t generalize to the general outcome, or students don’t retain the information over time. For example: Melissa is very good at decoding letters and reading individual words, but is not generalizing these skills to reading text with automaticity and comprehension. Adam was really good at using his comprehension strategies while the class was working on those skills (showed mastery), but when they moved on to another unit he quit using them. Main point: mastery monitoring is great, but you need general outcome measures too.

How often? Informally, we collect progress monitoring data all the time. On standardized general outcome measures such as oral reading fluency: weekly, or a median of 3 passages every 3 weeks (Jenkins, Graff, & Miglioretti (2009). Estimated reading growth using intermittent CBM progress monitoring. Exceptional Children, 75(2), 151-163). What about other measures: NWF? Maze? IGDIs?

Why is Progress Monitoring Important? We do NOT know ahead of time whether an intervention will be successful for an individual student. Do they assume in the hospital that your heart is working just fine after your bypass surgery? After all, the surgery works well for MOST patients… We need to be efficient and effective with our time. Some of these students are in academic intensive care! And some just have a cold, but that could turn into pneumonia if we don’t watch it and do the right thing.

Individual Progress Data are CRITICAL: Small Group Segmenting Progress (2002-03). [Graph] Benchmark = 35-45 by spring of Kindergarten. Note: K n=5, Gr 1 n=5. Credit: Moorhead American Indian Prereferral Project.

Segmenting – Individual Results from the Same Small Groups. Even though the intervention was effective “overall” (on average), the student who started the highest (purple) didn’t make progress, and neither did the lowest student. Credit: Moorhead American Indian Prereferral Project.

Survey Level Assessment (SLA). What is it? Start with grade level material (if possible) and, if the student isn’t at benchmark or other criteria, test down through successively “easier” grade levels of passages or probes, using several probes per grade level (over different days if possible). Why do it? To identify and begin to validate the extent of the skills problem; to find the “instructional level” and “measurement level” (which may differ from each other and from “grade level”); and to look at how behavior/skills change in easier material. Note: this is easiest to do in reading (and spelling), somewhat possible in math, and hard in writing.

A comment on materials… In READING: when possible, students are monitored using grade level materials. If this is not possible due to frustration or lack of sensitivity, “test down” and use the highest grade level of measures possible. Periodically “check” how the student is doing on grade-level materials and move back into them as soon as possible. The instructional level may be different again! In math, things may be different…

Gus’ Reading “Survey Level” Data (Gus is in Grade 4). Note: error rates are high (5-15) in Gr 4, slightly lower (4-8) in Gr 3 and 2, and much lower (1-4) in Gr 1. Walking through what is on this slide: Gus doesn’t read much better in Gr 4, 3, or even 2; he reads better in Gr 1. Errors disrupt meaning in Grades 4 and 3, somewhat in 2, but not in Gr 1.

What Material Should we use… To “instruct” Gus? To monitor Gus’ progress?

Systems for “Using” the data:
- A culture of professionalism and of using data to inform decisions
- Easy access to data and reports
- Time set aside to look at and “use” the data
- Professional development in data collection and use
- Professional development and support in learning new and varied interventions

Use Graphs! Label them as clearly as possible, and treat them as confidential data.

Looking at the Graphs. Is there “go upness”? Is there ENOUGH “go upness”?

Basic Visual Analysis: “Go Upness”?

Aimline. Shows the general trajectory needed for the student to reach his/her goal. Typically set so the student gets back “on target” within a set amount of time (e.g., by the end of the year). Tier 2: meet the next benchmark or the end-of-year benchmark. Tier 3: depends… remember Gus?
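The aimline arithmetic is simple enough to sketch. Here is a minimal, hypothetical Python example (the function name and the sample numbers are illustrative, not from any vendor tool): a straight line from the student's baseline score to the goal score over the intervention period.

```python
# Illustrative sketch: an aimline is a straight line from baseline to goal.
def aimline(baseline, goal, weeks):
    """Expected score at each week (0..weeks) if the student stays on target."""
    weekly_growth = (goal - baseline) / weeks
    return [baseline + weekly_growth * w for w in range(weeks + 1)]

# A 2nd grader reading 43 wcpm in fall, aiming for 90 wcpm 18 weeks later:
line = aimline(43, 90, 18)
print(round(line[9], 1))  # expected score at the midpoint: 66.5
```

A data point sitting above this line means the student is ahead of the pace needed to reach the goal on time; a point below it means the student is behind that pace.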

Outcomes: DIBELS® Benchmark Goals (80%-100% chance of getting to the next goal):
- Initial Sound Fluency: 25 sounds per minute by winter of Kindergarten
- Phoneme Segmentation Fluency: 35 sounds per minute by spring of Kindergarten
- Nonsense Word Fluency: 50 sounds per minute, with at least 15 words recoded, by winter of First Grade
- DIBELS® Oral Reading Fluency: 40 words correct per minute by spring of First Grade; 90 by spring of Second Grade; 110 by spring of Third Grade; 118 by spring of Fourth Grade; 124 by spring of Fifth Grade; 125 by spring of Sixth Grade
Credit: Based on Kaminski, R., Good, R.H., & Knutson, N. (2006). Mentoring Workshop Manual. DMG.
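The spring Oral Reading Fluency goals above lend themselves to a simple lookup. A hedged sketch in Python (the dictionary and function are illustrative; the values are transcribed from the slide):

```python
# Spring DIBELS ORF goals by grade, transcribed from the slide above.
ORF_SPRING_GOALS = {1: 40, 2: 90, 3: 110, 4: 118, 5: 124, 6: 125}

def met_spring_goal(grade, wcpm):
    """Did the student's words-correct-per-minute score reach the spring goal?"""
    return wcpm >= ORF_SPRING_GOALS[grade]

print(met_spring_goal(2, 85))  # 85 wcpm in spring of Grade 2 -> False (goal is 90)
```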

Can correlate your data with state tests: AIMSweb R-CBM cut scores correlated with passing the MN Reading MCAs. [Graph: words correct per minute on grade-level passages, by benchmark grade and date from Grade 1 winter through Grade 6 spring, showing the cut scores correlated with passing the Grade 3 and Grade 5 MN Reading MCA.]

Minnesota Reading Corps Target Scores, based on SCRED targets tied to an 80% likelihood of passing the MN MCA-II:
- Grade K, Letter Sounds: fall 8, winter 16, spring 36
- Grade 1, Nonsense Word Fluency (NWF): fall 28, winter 52
- Grade 1, Oral Reading Fluency (ORF): don’t do ORF in fall; winter 22, spring 49
- Grade 2, Oral Reading Fluency: fall 43, winter 72, spring 90
- Grade 3, Oral Reading Fluency: fall 70, winter 91, spring 107
Lisa starts here… Note: AIMSweb materials are used.

Using an Aimline

Data Decision Guidelines:
- If the student has some data points above and some below the aimline (doing the “aimline hug”), keep doing what you are doing!
- If the student has 4 consecutive data points above the aimline, consider moving the student to less intervention (e.g., decreasing minutes, or moving from Tier 2 to Tier 1 or from Tier 3 to Tier 2). Also use other pieces of information.
- Continue to progress monitor.

Data Decision Guidelines, Cont’d. If the student has 4 consecutive data points below the aimline, ASK THE FOLLOWING QUESTIONS (and continue to progress monitor):
- What does the “other” evidence available suggest about the student’s progress? Error rates? Behavior during the intervention?
- What is the general “trend” of the data? Is the student likely to get where we want if this continues? Use visual analysis and other evidence; use “trendlines” and “aimlines.”
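The four-consecutive-points guidelines amount to a small decision procedure. A sketch (the function name and message strings are illustrative, not from AIMSweb or DIBELS):

```python
# Four-consecutive-points decision rule, as described in the guidelines above.
def aimline_decision(scores, aimline_points, run_length=4):
    above = below = 0
    for score, target in zip(scores, aimline_points):
        if score > target:
            above, below = above + 1, 0
        elif score < target:
            above, below = 0, below + 1
        else:
            above = below = 0  # a point right on the line breaks both runs
        if above >= run_length:
            return "consider less intensive intervention"
        if below >= run_length:
            return "problem solve and change something"
    return "keep doing what you are doing"

# Four straight points below the aimline:
print(aimline_decision([40, 39, 41, 42], [44, 46, 48, 50]))
```

As the slides stress, these are guidelines: the convergence of other evidence (error rates, behavior during the intervention) still matters.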

Trendline. Shows the general “trend” or trajectory of the student’s data so far. Web-based programs (AIMSweb, the DIBELS Data System, Excel) typically use an OLS regression line. You need approximately 7 to 9 data points; trendlines fit to few data points or to highly variable data are NOT reliable! Christ, T. (2006). Short term estimates of growth using CBM ORF: Estimating standard error of slope to construct confidence intervals. School Psychology Review, 35(1), 128-133.
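The OLS trendline those systems draw can be sketched in a few lines of Python (ordinary least squares on equally spaced weekly scores; the function name and data are illustrative):

```python
# Ordinary least squares slope of weekly scores: the "trendline" growth rate.
def ols_slope(scores):
    n = len(scores)
    xs = range(n)                          # week index 0, 1, ..., n-1
    mean_x = sum(xs) / n
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, scores))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den                       # growth in score units per week

# Nine weekly ORF scores (roughly the 7-9 points needed for a stable trend):
weekly_orf = [42, 45, 44, 48, 50, 49, 53, 55, 56]
print(round(ols_slope(weekly_orf), 2))  # 1.75 words correct per week
```

Christ (2006) makes the slide's caution quantitative: with few or bouncy points the standard error of this slope is large, so the estimate should not be trusted.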

Refer to the Christ (2006) article.

How much progress is “enough”? What is “adequate” progress?
- Criterion referenced: will the student meet the goal in a reasonable amount of time? Growth is at or above the “target” growth rate.
- Norm referenced: growth is at or above the growth of grade level peers.
- Individually referenced: growth is better than before.
- “Intervention”/research referenced: growth is similar to what was seen in research on this intervention (with a similar population).
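The criterion-referenced question, "will the student meet the goal in a reasonable amount of time?", can be operationalized as a simple projection. A sketch with made-up numbers:

```python
# Project the current growth rate forward: will the student reach the goal?
def on_track(current_score, goal, weeks_left, weekly_growth):
    projected = current_score + weekly_growth * weeks_left
    return projected >= goal

# 55 wcpm now, goal of 90 wcpm in 16 weeks, growing 1.75 words/week:
print(on_track(55, 90, 16, 1.75))  # False: projects to 83, short of 90
```

The other referents on the slide (peer norms, the student's own prior growth, research on the intervention) are the same comparison made against different yardsticks.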

MRC Target Growth Rates, 2007-08.

Remember to use your brain! (and eyes and ears). These are guidelines; THINKING is REQUIRED. If the overall trend of progress is good but the student happens to have 4 data points just barely below the aimline, you may decide to continue your intervention for a week and see what happens. Use convergence of data (teacher report, mastery monitoring, behavioral indicators).

Practice Exercises: Is there go upness? Is there enough go upness? What else would you like to know? What would you do: exit to less intense service, keep going and collect more data, or problem solve and change something?

Finn Gr 2 ORF Reading Links 1:5 for 15 min. Aimline

Reading Links 1:5 For 15 min. Added distributed practice and preteaching Aimline

Justan Gr 1 NWF

And now?

And now????

On track…

What decision would you make?

Is there “enough” go upness…

What happened here?

Enough go upness?

Can also make decisions about exiting to less intensive service! (and celebrate!)

What can you do about “bounce” in the data?

Dealing with bounce… Is there a “measurement” problem? Check the fidelity of administration and scoring; whether materials aren’t well designed or are too difficult; who, where, and when measurement takes place (this can matter, especially for some kids); and motivation issues (can’t do vs. won’t do).

Dealing with bounce. Other ways to minimize bounce or make decisions despite bounce:
- Do more probes at one time and take the median or average score
- Do more frequent measurement (e.g., weekly or 2x/week)
- Look at the trend over time with many data points
- Look at ALL the data together (errors, mastery data, etc.)
- Use the least dangerous assumption…
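The first tactic, more probes with a median score, is one line in Python. The numbers below are made up to show why the median is preferred over the mean for bouncy data:

```python
from statistics import mean, median

# Three passages read in one sitting; one passage happened to be a bad fit.
probes = [62, 38, 55]
print(median(probes))            # 55: barely moved by the outlying passage
print(round(mean(probes), 1))    # 51.7: dragged down by the single low probe
```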

What if there isn’t adequate progress? If you keep doing what you’ve been doing then you will keep getting what you’ve got.

Back to Problem Solving

What if there isn’t adequate progress? Is the intervention being done with fidelity? Has a fidelity check been done? Is the student in the right level of materials? Has the student been in school? Are they getting enough minutes of intervention per week?

What if there isn’t adequate progress? Cont’d. Should the intervention be “tweaked” or changed? Is there an intervention better “matched” to this student’s needs? Changes could include trying a different intervention or just “tweaking” the current one, such as adding a 5th repeat to a repeated reading or a sticker incentive for accurate reading. Grade level or problem solving team members work together to discuss the data, the student, and which intervention changes would have the best chance of success. Whether to tweak or change depends, in part, on how concerned you are…

Problem Analysis: What do we know? RIOT (Review, Interview, Observe, Test) and ICEL (Instruction, Curriculum, Environment, Learner).

What could we change? Instructional procedures: materials, arrangements, time, motivational strategies, focus or skill, teaching strategies.

What could we change?
- Focus or skill
- Teaching strategies: more explicit, more modeling, more practice, more previewing, better matched with the core
- Materials: easier, better matched (cultural, interests, etc.)
- Arrangements: group size, location, who is teaching
- Time: amount of time, days per week, time of day
- Motivation: interests, goals, rewards, home/school
You could use any basic teaching best practices framework… BUT HAVE ONE!!!

Tatiana Example: Fall Grade 2 Data… Is the Core (Tier 1) working? How can we group students and differentiate in Tier 1? Do some students need “more than the core”? Is Tatiana in trouble? Do others have similar difficulty? Where would we like her to be? Add qualitative data.

Tatiana, Tier 2. The grade level team put Tatiana in a Tier 2 small group working on reading rate… is it working? Went to the Problem Solving Team…

What should we do?
- Look at existing data/information: why? (problem analysis)
- Increase instructional integrity (of the Core? of Tier 2?) if that was a problem
- Collect more information if needed for intervention planning
- Decide to change the intervention (develop a plan). Change SOMETHING: group size, focus, instructional strategies, level of explicitness, motivation, time/timing, parent involvement, etc. Still Tier 2, or now Tier 3?
- Implement the plan, monitor progress, and evaluate…

Problem Solving Team… [SCRED graphic]

Complete the Problem Solving Cycle: Did we do it? (integrity) Did it work?

Adam, Grade 4. Winter benchmark data = 85 wrc (target = 114). Fall benchmark data = 89 (target = 93). Error rate moderate (4, 4, & 6 errors). Very inconsistent academically; good attendance, but attention, accuracy, and work completion issues; basic decoding skills OK; can correct errors; can read better (with expression and meaning) in high interest material. The Grade Level Team put Adam in a Tier 2 intervention: working with MRC 1:1 on a repeated reading intervention, 20 min per day.

Adam Decisions: What should we change? What else would you want to know? What are at least 5 different ideas for changes that could be made? Is this likely to be a tweak or a major shift? How would you know if you made a good decision?

Resources.
Web Resources:
- www.studentprogress.org
- http://www.rti4success.org/ (click on Progress Monitoring on the right side)
- www.interventioncentral.org (look for information on CBM, graphing, etc.)
Print Resources:
- Riley-Tillman & Burns (2009). Evaluating Educational Interventions. Guilford Press.
- Safer & Fleishman (2005). Research matters: How student progress monitoring improves instruction. Educational Leadership, 62(5), 81-83.

Why do this? When teachers USE progress monitoring: students learn more! Teachers design better instructional programs, teacher decision making improves, and students become more aware of their performance (Safer & Fleishman, 2005).