Module 10 Assessment Logistics

DIBELS® Training Institute: Essential Workshop
Module 10: Assessment Logistics

Logistics of DIBELS® Data Collection
Plan and schedule data collection.
Organize resources.
Collect the data.
Enter the data.
Use the data for educational decision making.
© 2005c, Dynamic Measurement Group

DIBELS® Data Collection
Benchmark Assessment: Assess all children three times a year (e.g., fall, winter, spring).
Progress Monitoring: Assess children needing strategic support/monitoring more frequently (e.g., monthly); assess students needing more intensive intervention weekly.
© 2005c, Dynamic Measurement Group

Planning for Benchmark Assessment: Decisions
When will the data be collected?
Who will collect the data?
How will the data be collected?
What data collection approach will be used?
© 2005c, Dynamic Measurement Group

When Will the Benchmark Data Be Collected?
Three times a year:
Fall (or first quarter): September, October, or November
Winter (or second quarter): December, January, or February
Spring (or third quarter): March, April, or May
© 2005c, Dynamic Measurement Group

Considerations for Scheduling
Schedule data collection for at least two weeks after a major break.
Prevent overlap with other major events (e.g., state-level testing).
Coordinate with other events (e.g., schedule prior to parent conferences if you would like the data by then).
Plan time for training/updating the data collection team.
© 2005c, Dynamic Measurement Group

Schedule Benchmark Assessment
Schedule the assessment time and the final date by which all data must be collected. (D)
Schedule assessment at each school. (D/B/C)
D = district-level decision; B = building-level decision; C = classroom-level decision
© 2005c, Dynamic Measurement Group

Who Will Collect the Benchmark Data?
Options: classroom teachers and assistants; specialists and support staff; principals and administrators; trained volunteers; practicum students.
Considerations: resources (number and availability of staff, interest, budgetary resources, training needs); timeline; approach.
© 2005c, Dynamic Measurement Group

What Data Collection Approach?
Options: within classroom; school-wide, one day; school-wide, multiple days.
Considerations for choosing among these appear on the following slides.
© 2005c, Dynamic Measurement Group

Within-Classroom Approach
Who collects the data? Classroom teachers and assistants.
Where? In the classroom.
How? Teachers set aside time (e.g., 30 minutes a day for 4 days) to assess each child in the room.
Advantages: teachers test their own students; less disruptive to the school in general.
Disadvantages: detracts from instructional time; requires more days; may be more difficult to organize and use data system-wide; does not improve communication or encourage a team approach to decision making.
Adapted from B. Harn (2000) © 2005c, Dynamic Measurement Group

One-Day School-Wide Approach
Who collects the data? A data collection team: support staff, trained volunteers, specialists, educational assistants, teachers (one data collector per 20-25 students).
Where? A large, central location with many tables and places for students to be assessed and to wait their turn (e.g., library, multipurpose room, cafeteria).
How? A schedule is set for classrooms to come to the central location, where the team assesses all day.
Advantages: minimal classroom disruption (e.g., 30 minutes per class); all data collected in one day; the school-wide effort improves communication and enhances the effort to organize and use data system-wide.
Disadvantages: requires a large team; scheduling and logistics issues (e.g., a location is needed for assessments, and services such as the computer lab or library may be disrupted for all students); if using support staff, specialized services for students may be disrupted; teachers are not involved in assessing their own students.
Adapted from B. Harn (2000) © 2005c, Dynamic Measurement Group

Multi-Day School-Wide Approach
Who collects the data? A data collection team: support staff, trained volunteers, specialists, educational assistants, teachers.
Where? A central location or a temporary area in or near the classroom.
How? The team either goes to each classroom and tests students in or near it, or classrooms come to the team's location.
Advantages: requires a smaller core team; teachers may be more involved in the data collection process; less disruption to the school in general; no need for a central location; the school-wide effort improves communication and enhances the effort to organize and use data system-wide.
Disadvantages: takes longer to collect data on all students; the testing team may need to move from one location to another; if using support staff, specialized services for students may be disrupted.
Adapted from B. Harn (2000) © 2005c, Dynamic Measurement Group

Considerations
Goal: to collect valid and reliable benchmark data as efficiently and economically as possible, with minimal disruption.
Before deciding on an approach, consider:
Number of students to be assessed
School calendar and events
Timeline for completion of assessment
Availability of resources
© 2005c, Dynamic Measurement Group

Time to Assess (approximate administration time per student)
Initial Sound Fluency (ISF): 3 minutes
Letter Naming Fluency (LNF): 1.5 minutes
Phoneme Segmentation Fluency (PSF): 1.5 minutes
Nonsense Word Fluency (NWF): 2 minutes
Oral Reading Fluency (ORF), 3 passages: 4 minutes
Retell Fluency (RTF), 3 passages: 2 minutes
Word Use Fluency (WUF): 1.5 minutes
Adapted from B. Harn (2000) © 2005c, Dynamic Measurement Group

Benchmark Assessment Time: Kindergarten and First Grade
Kindergarten, fall: ISF, LNF (4.5 minutes); with optional measures: ISF, LNF, WUF (6 minutes)
Kindergarten, winter: ISF, LNF, PSF, NWF (8 minutes); with optional measures: ISF, LNF, PSF, NWF, WUF (9.5 minutes)
Kindergarten, spring: LNF, PSF, NWF (5 minutes); with optional measures: LNF, PSF, NWF, WUF (6.5 minutes)
First grade: PSF, NWF, ORF (7.5 minutes); with optional measures: PSF, NWF, ORF, RTF, WUF (11 minutes)
Adapted from B. Harn (2000) © 2005c, Dynamic Measurement Group

Benchmark Assessment Time: Second and Third Grade
Second grade, fall: NWF, ORF (6 minutes); with optional measures: NWF, ORF, RTF, WUF (9.5 minutes)
Second grade, winter and spring: ORF (4 minutes); with optional measures: ORF, RTF, WUF (7.5 minutes)
Third grade, fall through spring: ORF (4 minutes); with optional measures: ORF, RTF, WUF (7.5 minutes)
Adapted from B. Harn (2000) © 2005c, Dynamic Measurement Group

Benchmark Assessment Time: Fourth through Sixth Grade
Grades 4-6, fall through spring: ORF (4 minutes); with optional measures: ORF, RTF (6 minutes)
Adapted from B. Harn (2000) © 2005c, Dynamic Measurement Group
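The grade-level totals on the three preceding slides are simply the per-measure administration times added together. Below is a minimal Python sketch of that arithmetic; it is an illustration only, not part of the DIBELS or DMG materials, and the measure abbreviations and times come from the Time to Assess table above.

# Approximate per-student administration time for each measure, in minutes
# (from the Time to Assess table above).
MINUTES = {
    "ISF": 3.0, "LNF": 1.5, "PSF": 1.5, "NWF": 2.0,
    "ORF": 4.0, "RTF": 2.0, "WUF": 1.5,
}

def benchmark_minutes(measures):
    """Total per-student benchmark time for a set of measures."""
    return sum(MINUTES[m] for m in measures)

# Kindergarten winter, without and with the optional WUF measure.
print(benchmark_minutes(["ISF", "LNF", "PSF", "NWF"]))          # 8.0
print(benchmark_minutes(["ISF", "LNF", "PSF", "NWF", "WUF"]))   # 9.5
# First grade winter with all optional measures.
print(benchmark_minutes(["PSF", "NWF", "ORF", "RTF", "WUF"]))   # 11.0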

Data Collection Conversion (with optional measures)
Minutes per student, and the number of students one data collector can assess in 30 minutes:
Kindergarten, fall: 6 minutes per student (about 5 students per 30 minutes)
Kindergarten, winter: 9.5 minutes (3-4 students)
Kindergarten, spring: 6.5 minutes (4-5 students)
First grade: 11 minutes (2-3 students)
Second grade: 7.5 minutes (about 4 students)
Third grade and above: apply the same conversion (30 minutes divided by the minutes per student)
Adapted from B. Harn (2000) © 2005c, Dynamic Measurement Group

Data Collection Conversion (no optional measures)
Minutes per student, and the number of students one data collector can assess in 30 minutes:
Kindergarten, fall: 4.5 minutes per student (6-7 students per 30 minutes)
Kindergarten, winter: 8 minutes (3-4 students)
Kindergarten, spring: 5 minutes (about 6 students)
First grade: 7.5 minutes (about 4 students)
Second grade and above: 7-8 students per 30 minutes
© 2005c, Dynamic Measurement Group
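The conversion in these two tables is plain arithmetic: one data collector working a 30-minute block can assess roughly 30 divided by the per-student time, and the number of collectors needed to finish a class within one block follows from the class size. A minimal sketch, assuming a 30-minute block; the function names and the example class size of 24 are illustrative only:

import math

def students_per_block(minutes_per_student, block_minutes=30):
    """How many students one data collector can assess in one block."""
    return block_minutes / minutes_per_student

def collectors_needed(class_size, minutes_per_student, block_minutes=30):
    """Data collectors needed to assess a whole class within one block."""
    return math.ceil(class_size * minutes_per_student / block_minutes)

# Kindergarten in winter without optional measures: 8 minutes per student.
print(students_per_block(8))     # 3.75, i.e., 3-4 students per 30 minutes
print(collectors_needed(24, 8))  # 7 collectors to finish a class of 24 in 30 minutes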

Planning Activity
How much time is needed for benchmark assessment (with and without optional measures) for:
A first grade classroom with 20 children in winter?
A kindergarten classroom with 18 children in spring?
A second grade classroom with 26 children in fall?
© 2005c, Dynamic Measurement Group

Planning Results (time without optional measures / time with optional measures)
First grade (20 children, winter): 2.5 hours / 3 hours 40 minutes
Kindergarten (18 children, spring): 1.5 hours / 2 hours
Second grade (26 children, fall): 2 hours 36 minutes / 4 hours
© 2005c, Dynamic Measurement Group
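These results come from multiplying the per-student time for the relevant grade and season by the number of children; dividing by the number of data collectors gives the time for a team. The sketch below, with an illustrative function name, reproduces the table for a single assessor:

def class_assessment_time(num_children, minutes_per_student, collectors=1):
    """Total benchmark assessment time for a classroom, in minutes."""
    return num_children * minutes_per_student / collectors

# First grade, winter: 7.5 minutes without / 11 minutes with optional measures.
print(class_assessment_time(20, 7.5))  # 150 minutes = 2.5 hours
print(class_assessment_time(20, 11))   # 220 minutes = 3 hours 40 minutes
# Kindergarten, spring: 5 minutes without / 6.5 minutes with optional measures.
print(class_assessment_time(18, 5))    # 90 minutes = 1.5 hours
print(class_assessment_time(18, 6.5))  # 117 minutes, about 2 hours
# Second grade, fall: 6 minutes without / 9.5 minutes with optional measures.
print(class_assessment_time(26, 6))    # 156 minutes = 2 hours 36 minutes
print(class_assessment_time(26, 9.5))  # 247 minutes, about 4 hours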

Logistics of DIBELS® Data Collection
Identify and organize resources.
Collect the data.
Enter the data.
Use the data for educational decision making.
© 2005c, Dynamic Measurement Group

Identify and Organize Resources
Arrange logistics for data collection. (B/C)
Materials: who will access, prepare, and organize them; how will materials be organized; where will they be stored?
Where will the assessment take place? In the classroom, a hallway/breezeway, a pod, the library, or the cafeteria.
Prepare assessment "stations" (two chairs, a small desk or table).
Assign assessors to classrooms/stations.
Allot time (about 30 minutes for one assessor to assess five children).
Specialists/volunteers: at which school, and when?
D = district-level decision; B = building-level decision; C = classroom-level decision
© 2005c, Dynamic Measurement Group

Organize Resources
Prepare assessment materials. (D/B)
Download/copy all assessment materials: child assessment booklets (1 per child); assessment administration and scoring manual and stimulus materials (1 per assessor).
Get class lists.* (D/B)
Label child assessment booklets.*
Collate child assessment booklets.*
D = district-level decision; B = building-level decision; C = classroom-level decision
© 2005c, Dynamic Measurement Group

Next Steps
Collect the data.
Enter the data into the computer.
Use the data to make decisions:
Download reports.
Share data/reports with teachers (meetings by grade, individual meetings).
Make decisions about individual children.
Make system-wide decisions.
© 2005c, Dynamic Measurement Group

Team Assessment Advantages
Team assessment is efficient: five people can assess a class in about 30 minutes.
Team assessment distributes the investment.
Team assessment shares ownership and skills.
Team assessment engages the educator in us all.
Team assessment makes the results vivid and meaningful: scores of 7 words per minute and 40 words per minute are NOT just a little bit different.
© 2005c, Dynamic Measurement Group

TEAMWORK is not always about winning, but about working together to make things happen.
© 2005c, Dynamic Measurement Group