

Abstract
In a response to intervention (RTI) model, educators in both general and special education use curriculum-based measurement (CBM) oral reading fluency (ORF) assessments across grades to monitor the progress of students receiving reading intervention. Researchers have explored average growth in ORF and found decelerating, nonlinear growth rates across grade levels (e.g., Christ et al., 2010; Nese et al., 2012; Nese et al., 2013). In this study, we examine growth for students in Grades 1-8 specifically identified for reading intervention and receiving regular progress monitoring, and explore two growth models to determine (a) the within-year ORF growth, and (b) the growth across Tier II/III intervention.

Results
- Table 1: Trajectory fixed effect parameters (intercept and slopes) were similar across the Within-year and Tier II/III growth models.
- Figure 1: The first testing occasion for Tier II/III students was nearly always in the early fall for all grades but first (winter).
- Figure 2: Decelerating Tier II/III growth in Grades 1-5; linear Tier II/III growth in Grades 6-8. Growth of progress-monitored students was similar to that of the easyCBM 20th percentile group, but with lower intercepts and greater yearly gains in most grades.
Describing the Reading Fluency Growth of Progress Monitored Students
Joseph F. T. Nese, Julie Alonzo, Gerald Tindal
National Center on Assessment and Accountability for Special Education, 2013

References
Christ, T. J., Silberglitt, B., Yeo, S., & Cormier, D. (2010). Curriculum-based measurement of oral reading: An evaluation of growth rates and seasonal effects among students served in general and special education. School Psychology Review, 39.
Muthén, L. K., & Muthén, B. O. Mplus user's guide (7th ed.). Los Angeles, CA: Muthén & Muthén.
Nese, J. F. T., Biancarosa, G., Anderson, D., Lai, C. F., & Tindal, G. (2012). Within-year oral reading fluency with CBM: A comparison of models. Reading and Writing, 25.
Nese, J. F. T., Biancarosa, G., Cummings, K., Kennedy, P., Alonzo, J., & Tindal, G. (2013). In search of average growth: Describing within-year oral reading fluency growth for Grades 1-8.
For More Information
Please contact Joe Nese: More information on this and related projects can be obtained at

Funding Source
This project was funded through the National Center on Assessment and Accountability for Special Education (NCAASE) grant (R324C110004) from the U.S. Department of Education, Institute of Education Sciences. Opinions expressed do not necessarily reflect the opinions or policies of the U.S. Department of Education.

Selection Rules for ORF Progress Monitoring Sample

Step  Rule                                                              Students  Scores
--    Original data                                                     139,165   495,174
1)    Delete out-of-range scores (e.g., 365)                            -         495,112
2)    Delete students in grades other than 1-8                          ,848      -
3)    Delete students with any instance of off-grade-level testing      129,617   -
4)    Delete students who scored > 30th percentile on the first
      benchmark or progress monitoring assessment                       35,923    -
5)    Delete scores from assessments beyond the 20th testing occasion   -         145,478
6)    Delete "invalid" scores: if two scores are less than two weeks
      (14 days) apart AND differ by > 35 WCPM, delete the score that
      is least like its adjacent scores. Based on the following rule
      of growth: [(expected growth per week + SE_g) * 2 weeks] +
      (SEM * 2 weeks) = [(1.5 WCPM + 1.0 SE_g) * 2] + (15 * 2)          -         57,787
      Take the median of scores that are within 7 days; keep scores
      that are > 7 and <= 28 days apart                                 -         34,550
7)    Delete students with < 3 progress monitoring tests                5,373     30,628

Method
We conducted two latent growth models for each grade using Mplus version 7.11 (Muthén & Muthén): (a) Within-year growth, where the time metric was the calendar year, such that the intercept was either late September or early October; and (b) Tier II/III growth, where the time metric was the assessment occasion, such that the intercept was the first assessment occasion.

Discussion
- Tier II/III interventions begin in the fall; there is less identification of at-risk students beyond fall. Implications for winter/spring benchmarks in an RTI system: lost opportunities for educators and students for lack of RTI system training?
- As hypothesized, growth rates were lower than the 50th percentile and average empirical trajectories (e.g., Nese et al., 2013), except for Grades 4 and 5.
- In all grades but sixth, linear growth rates were greater in magnitude for the Tier II/III metric than the Within-year metric: tenuous evidence for effective interventions.
- Limitations: the selected sample is very specific, and intervention information (if any) is unknown.
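The 35 WCPM cutoff in the "invalid" score rule follows from the stated rule of growth. A minimal sketch of that arithmetic and of the pairwise flagging step (my reconstruction, not the authors' code; the flagging helper and its data shape are assumptions):

```python
# Reconstruction of the "invalid" score rule from the selection table.
# Constants come from the poster: expected growth of 1.5 WCPM per week
# (SE_g = 1.0) over a 2-week window, plus a SEM of 15 WCPM per week.

EXPECTED_GROWTH = 1.5  # WCPM per week
SE_G = 1.0             # standard error of the growth estimate
SEM = 15               # standard error of measurement, in WCPM
WINDOW_WEEKS = 2

def invalid_score_threshold():
    """[(expected growth per week + SE_g) * 2 weeks] + (SEM * 2 weeks)."""
    return (EXPECTED_GROWTH + SE_G) * WINDOW_WEEKS + SEM * WINDOW_WEEKS

def flag_suspect_pairs(scores):
    """Flag consecutive (day, wcpm) observations that are fewer than
    14 days apart but differ by more than the threshold."""
    threshold = invalid_score_threshold()
    return [
        (d1, d2)
        for (d1, s1), (d2, s2) in zip(scores, scores[1:])
        if (d2 - d1) < 14 and abs(s2 - s1) > threshold
    ]

print(invalid_score_threshold())                        # 35.0
print(flag_suspect_pairs([(0, 52), (10, 90), (30, 95)]))  # [(0, 10)]
```

The poster's full rule also replaces the flagged score with the value most like its neighbors; the sketch stops at flagging.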
Table 1. Fixed Effects and Within- and Between-Model Correlations for the Quadratic Within-year and Tier II/III Growth Models
Figure 1. Percent of students' first ORF assessment for each month across Grades 1-8
Figure 2. Predicted mean ORF scores in words correct per minute (WCPM) by grade, based on the Within-year and Tier II/III time metrics, and observed mean ORF benchmark scores at the 20th and 50th percentiles for the easyCBM system
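The growth models are quadratic latent growth curves fit in Mplus, so the predicted mean trajectories plotted in Figure 2 have the form ORF(t) = b0 + b1*t + b2*t^2, where t is either calendar time (Within-year metric) or assessment occasion (Tier II/III metric). A minimal sketch of evaluating such a trajectory, using hypothetical placeholder coefficients rather than the Table 1 estimates:

```python
# Illustrative only: the poster's models were fit in Mplus 7.11; this sketch
# just evaluates the quadratic mean trajectory ORF(t) = b0 + b1*t + b2*t^2.
# The coefficients below are hypothetical, not values from Table 1.

def predicted_orf(b0, b1, b2, t):
    """Predicted mean words correct per minute (WCPM) at time t, where t is
    weeks since the fall intercept (Within-year metric) or the assessment
    occasion number (Tier II/III metric)."""
    return b0 + b1 * t + b2 * t ** 2

# A hypothetical decelerating trajectory: starts at 40 WCPM, and the
# negative quadratic term flattens the growth over time.
trajectory = [predicted_orf(40.0, 2.0, -0.05, t) for t in range(11)]
print(trajectory[0], trajectory[10])  # 40.0 55.0
```

A negative b2 produces the decelerating shape reported for Grades 1-5; b2 near zero gives the roughly linear growth reported for Grades 6-8.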