Relationship Between Reading Inventory Instructional Level and Student Reading Performance
Sandra M. Pulles, Kathrin E. Maki, & Matthew K. Burns
University of Minnesota, Department of Educational Psychology, School Psychology
College of Education + Human Development, 250 Education Sciences Building, 56 E. River Road, Minneapolis, MN

Introduction
- Instructional match is closely associated with improved student learning (Burns, 2007; Daly, Martens, Kilmer, & Massie, 1996).
- Instructional level occurs when students have sufficient background knowledge to interact with the material yet still experience some challenge (Betts, 1946):
  Frustrational: < 93% accuracy
  Instructional: 93-97% accuracy
  Independent: > 97% accuracy
- Two ways to assess instructional level:
  1. Informal Reading Inventories (IRI), e.g., Fountas & Pinnell (1996). Lengthy administration per student; low psychometric properties.
  2. Curriculum-Based Assessment for Instructional Design (CBA-ID; Gickling & Havertape, 1981). Students read from instructional-level text for 1 minute; the number of words read correctly is recorded and accuracy is computed.

Method
Participants. 64 second- (43.8%) and third-grade (56.3%) students; 51.6% female and 48.4% male.
Measures.
- Oral Reading Fluency (ORF): administered by school personnel three times per year (fall, winter, and spring); spring scores were used for analyses. Students read three one-minute grade-level passages, and the median score was recorded.
- Fountas & Pinnell Benchmark Assessment System (BAS): instructional level determined from spring scores based on fluency, accuracy, and comprehension.
- Curriculum-Based Assessment for Instructional Design (CBA-ID): administered by researchers once in the spring. Students read from three books (1 minute each) at their BAS instructional level. Accuracy was the number of words read correctly divided by the total words read, and the median of the three scores was used for analyses.
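The CBA-ID scoring described above — accuracy as words read correctly over total words read, the median of three one-minute readings, and the Betts (1946) cutoffs — can be sketched as follows. The reading counts are hypothetical illustration, not study data.

```python
from statistics import median

def accuracy(words_correct, total_words):
    """Percent accuracy: words read correctly divided by total words read."""
    return 100.0 * words_correct / total_words

def instructional_level(pct):
    """Classify a percent-accuracy score using the Betts (1946) cutoffs."""
    if pct < 93.0:
        return "frustrational"
    if pct <= 97.0:
        return "instructional"
    return "independent"

# Hypothetical student: (words correct, total words) for three 1-minute
# readings from books at the student's BAS instructional level.
readings = [(142, 150), (118, 121), (95, 103)]
scores = [accuracy(c, t) for c, t in readings]
print(round(median(scores), 1), instructional_level(median(scores)))  # 94.7 instructional
```

Taking the median of three readings, as in the study, dampens the book-to-book inconsistency that Table 2 and Table 3 document.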
  About 5 minutes per student; high psychometric properties.

Research Questions
1. What level of agreement is there between instructional-level estimates from reading three books at the same reading level?
2. To what extent does the estimate of instructional level from a reading inventory agree with instructional-level estimates from reading the corresponding leveled book?
3. How do reading skills affect agreement between the reading-inventory estimate of instructional level and estimates from reading the corresponding leveled book?

Results

Table 1. Descriptive Statistics and Frequency Data

Measure                 Mean (SD)        Frustration n (%)   Instructional n (%)   Independent n (%)
Spring Benchmark ORF    (48.71)          NA                  NA                    NA
Reading 1 Accuracy      96.7% (3.27%)    8 (12.5%)           23 (35.9%)            33 (51.6%)
Reading 2 Accuracy      96.4% (4.91%)    10 (15.6%)          15 (23.4%)            39 (60.9%)
Reading 3 Accuracy      96.1% (5.9%)     11 (17.2%)          17 (26.6%)            36 (56.3%)
Median Accuracy         96.7% (3.6%)     10 (15.6%)          18 (28.1%)            36 (56.3%)

Note. The ORF mean did not survive extraction; only the SD (48.71) is recoverable.

Table 2. Correlations Among Accuracy Measures from Three Reading Performance Assessments

                      Reading 1 Accuracy   Reading 2 Accuracy   Reading 3 Accuracy
Reading 1 Accuracy    --                   r = .47*             r = .61*
Reading 2 Accuracy    τ = .67*             --                   r = .68*
Reading 3 Accuracy    τ = .67*             τ = .59*             --

Note. Pearson r above the diagonal; Kendall's τ below.

Table 3. Percent Agreement and Kappa Among Accuracy Measures from Three Reading Performance Assessments

                      Reading 1 Accuracy   Reading 2 Accuracy   Reading 3 Accuracy
Reading 1 Accuracy    --                   70.3%                68.8%
Reading 2 Accuracy    κ = .49*             --                   67.2%
Reading 3 Accuracy    κ = .47*             κ = .42*             --

Note. Percent agreement above the diagonal; Cohen's kappa below.

Table 4. Correlation between IRI Instructional Level and CBA-ID Accuracy and Categorical Score

                          Median Percent Accurate   CBA-ID Categorical Score
IRI Instructional Level   r = .65                   τ = .65

Table 5. Number and Percentage of Median Accuracy Scores from Three Reading Performance Assessments that Fell within the Frustration, Instructional, and Independent Level by Skill Group

Group                              Frustration n (%)   Instructional n (%)   Independent n (%)
Low (25th percentile or less)      7 (58.3%)           5 (41.7%)             0 (0.0%)
Middle (26th to 75th percentile)   2 (9.5%)            4 (19.0%)             15 (71.4%)
High (76th percentile or higher)   1 (3.2%)            9 (29.0%)             21 (67.7%)

[Skill-group cutoffs (Low / Average / High ORF ranges) for Grade 2 and Grade 3; the numeric cutoffs did not survive extraction apart from the final value, 140.]

Discussion
- Students did not consistently read accurately from books rated at their IRI instructional level:
  - They read with 93-97% accuracy only about 28% of the time.
  - Struggling readers frequently failed to reach 93% accuracy.
  - High readers were not challenged enough by their IRI instructional level.
- Psychometric issues associated with IRIs make it difficult to obtain an accurate instructional level:
  - Reliability: inconsistency across books.
  - Validity: questionable use of IRIs for determining instructional level.
- Matching instructional material to student skill level improves student outcomes (Burns, 2007); students should therefore read at their instructional level to ensure adequate reading growth.

Limitations
- Many students were higher readers, limiting generalizability to other skill levels.
- No direct measure of comprehension was used.
- Prior exposure was not controlled, so it is unknown whether students were already familiar with the material.
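Table 3 reports percent agreement and Cohen's kappa between the instructional-level classifications produced by pairs of readings. A minimal sketch of both statistics, using illustrative classification lists rather than the study's data:

```python
from collections import Counter

def percent_agreement(a, b):
    """Percentage of cases classified identically by two readings."""
    return 100.0 * sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Cohen's kappa: agreement between two classifications, corrected
    for the agreement expected by chance from the marginal frequencies."""
    n = len(a)
    p_obs = sum(x == y for x, y in zip(a, b)) / n
    ca, cb = Counter(a), Counter(b)
    p_exp = sum(ca[k] * cb[k] for k in set(a) | set(b)) / (n * n)
    return (p_obs - p_exp) / (1 - p_exp)

# Illustrative instructional-level classifications from two readings.
r1 = ["inst", "ind", "frust", "ind", "inst",  "ind"]
r2 = ["inst", "ind", "ind",   "ind", "frust", "ind"]
print(round(percent_agreement(r1, r2), 1))  # 66.7
print(round(cohens_kappa(r1, r2), 3))       # 0.429
```

Kappa below about .5, as in Table 3, indicates only moderate agreement across books from the same nominal level, which is the reliability concern raised in the Discussion.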