Reading CBM Data Measures Cathy Claes Anna Harms Jennifer Rollenhagen MiBLSi State Implementer’s Conference, 2012.


Agenda Updates from 2012 DIBELS Summit AIMSweb updates Universal Screening Flowchart Advanced Report Generation Reading Data Coordinator Listserv 2 minutes: Share with your elbow partner which agenda item you are most interested in and why.

Updates from 2012 DIBELS Summit and DMG What’s next after Next DIBELSnet Webinars with Wireless Generation

What’s next after Next IDAPEL – French version IDEL – revision of Spanish version will be released soon CRTIEC – Center on Response to Intervention in Early Childhood: –Descriptive study of Tier 1 –Tier 2 & 3 Interventions –Progress Monitoring –Goal to disseminate findings and provide national leadership

What’s next after Next CFOL – Comprehension Fluency Oral Language –Is in development WUF-R –Changes were made Modified administration – all words administered, total time recorded Modified word pool Words were assigned theoretical categories Words randomly stratified –After initial research, changes were made again. Next study will include modified scoring rules to include partial credit

What’s next after Next DIBELS Deep –Brief Diagnostic assessments –Linked to DIBELS Next –For students who are not yet at benchmark or for those who are very inaccurate –Fits in the plan/support phase of the outcomes driven model –Aligned with the Common Core State Standards –Assessments include Phonemic Awareness and WRD Reading –Not all assessments are timed –Teaching suggestions are contained within to help get a student to the concept –Routing form

What’s next after Next DIBELS Survey –Based on DIBELS Next Benchmark Assessment scores –Administered to students who have not reached benchmark goals Takes 5-20 minutes per student to administer Includes four DIBELS Next Measures: –First Sound Fluency (FSF) –Phoneme Segmentation Fluency (PSF) –Nonsense Word Fluency (NWF) –DIBELS Oral Reading Fluency (DORF) Includes a manual, a flipbook with assessor directions and student materials and scoring booklets –Assesses off grade-level progress monitoring –Accurately identifies the level of materials needed for progress monitoring –Supports teachers in selecting intervention materials at the right level

What’s next after Next DIBELS Math –This will be different than other CBM measures currently in the field –Aligned with Common Core State Standards –Still determining correct timing and amount of growth –Early Numeracy and Computation measures: K – Beginning Quantity Discrimination; K-1 – Number Identification Fluency (1-99); K-1 – Next Number Fluency; 1 – Advanced Quantity Discrimination; 1 – Missing Number Fluency; 1-5 – Computation (addition, subtraction, multiplication, division, & fractions)

What’s next after Next DIBELS 7-9 –The goal is to assess a broad range of critical reading comprehension skills –Currently 190 science, social studies, and prose passages are being written. Must have good flow, content, and be factually accurate Passages are currently going through a rigorous process of review and revision –Comprehension pieces Daze Recall (passage specific questions for vocabulary, events, and summarizing) Multiple choice (passage specific questions for vocabulary, events, and summarizing) –Goal Readability studies in , Comprehension studies in , Benchmark studies in

What’s next with Next PELI – Preschool Early Literacy Indicators –Screening and progress monitoring for ages 3-6 –Book format –5-7 minutes to administer –Skills assessed Comprehension – literal questions, predictions, inferences Alphabet Knowledge – Name upper case letters Phonemic Awareness Vocabulary/Oral Language Story Retell –Current & Future work Development of new books – up to 10 Benchmark goal study Sensitivity to intervention study Version study

Research Opportunities with DMG Contact: –DIBELS Deep Kelly Powell-Smith –Math and WUF-R Courtney Wheeler –PELI and 7-9 Mary Abbott

DIBELSnet Sample Reports

Objectives Introduce a new data service from the authors of DIBELS Next, called DIBELSnet Present features of the system Review sample reports Show data entry fields Discuss frequently asked questions

Features of DIBELSnet Created and managed by Dynamic Measurement Group, the authors of DIBELS Next Supports data entry of: DIBELS Next, DIBELS 6th Edition, IDEL (Spanish), IDAPEL (French), PELI (Preschool), DIBELS Math Built to support the use of DIBELS within an Outcomes-Driven Model

Log In Screen

School or District Overview Report Can be generated by school or district Identifies school, grade, and year Shows the % in each benchmark status category

School or District Overview Report

Status by Grade Identifies district Shows the % in each benchmark status category

Status by Grade

Status by Measure Identifies district and grade

Histogram and Box Plot Identifies district and grade Shows the # of students who scored a certain way

Histogram and Box Plot Shows the range of scores relative to the benchmark goal at the beginning, middle, and end of the year for one measure

Effectiveness of Instructional Levels by School

Classroom Report

Grouping Report

Progress Monitoring Report

Student Benchmark Assessment History Report

Frequently Asked Questions What is the cost? $1 per student per year. Can historical data be imported? Yes. Will historical data for DIBELS 6th Edition be displayed with DIBELS Next? Yes; a line will delineate when the transition occurred. Can I determine who has access to what data? Yes; local personnel assign passwords and determine level of access.

Frequently Asked Questions Do I have to sign an agreement? Yes; it can be downloaded from dibels.net. What support is available? A manual can be downloaded from dibels.net. Who do I contact? Josh Wallin, Customer Support

DMG Monthly Webinars March 27, 3-4 p.m. EST: Data Driven Decision Making April 24, 3-4 p.m. EST: DIBELS and the Common Core May 22, 3-4 p.m. EST: Role of Retell in DIBELS

AIMSweb Updates Browser-Based Scoring Review of AIMSweb National Norms and Default Criteria Webinar dates/Resources

Browser-Based Scoring Online scoring Eliminates the need to print student scoring booklets, hand-score, and then enter scores online. Does require that each screener have access to a computer or other electronic device with internet access. Used to be an additional cost per student, but is now included as part of the Reading Pro package. Can be used for the following reading subtests: Letter Naming Fluency, Letter Sound Fluency, Nonsense Word Fluency, Phoneme Segmentation Fluency, Reading CBM

Browser-Based Scoring Tutorial Videos Developed by: Lisa Langell, M.A., S.Psy.S. National Manager of AIMSweb Professional Development Assessment & Information BBS Public Overview: R-CBM Training BBS: TEL Training BBS:

AIMSweb National Norms

7 Sources of Info on the National Norms and Default Cut Scores

What are the National Norms? A set of Raw Scores and Percentile Ranks that was designed to be representative of the national student population. Data were selected from schools with an AIMSweb account. Most data were from the school year.

What are the National Norms? Only schools conducting universal screening in the Fall, Winter, and Spring were used to develop the National Norms (excluding schools using AIMSweb with only a subpopulation). A school and grade level were included if at least 95% of the enrolled population (based on NCES numbers) was screened in Fall, Winter, and Spring.
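The inclusion rule on this slide is mechanical enough to sketch directly. A minimal illustration, assuming enrollment and per-season screened counts are already known (the function name and data shape are hypothetical, not AIMSweb's):

```python
def include_in_norms(screened_counts, enrolled, threshold=0.95):
    """Apply the slide's inclusion rule: a school/grade enters the norm
    sample only if at least 95% of enrolled students (per NCES counts)
    were screened in Fall, Winter, AND Spring.

    screened_counts maps season name -> number screened.
    """
    return all(screened_counts[season] / enrolled >= threshold
               for season in ("fall", "winter", "spring"))

# A school of 100 students screening 97/96/95 across the year qualifies
print(include_in_norms({"fall": 97, "winter": 96, "spring": 95}, 100))  # True
```

A school missing the 95% bar in any single season is excluded, which is why the slide stresses all three screening windows.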

What are the National Norms? Available for TEL, R-CBM, and Maze (among others); they replace the AIMSweb Aggregate Norms for these measures. Not available for DIBELS 6th Ed., DIBELS Next, or R-SPAN. The AIMSweb Aggregate Norms represented all data in the AIMSweb system from year to year. In most cases, the raw scores for the National Norms are slightly higher than the Aggregate Norms from previous years.
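The norms pair raw scores with percentile ranks. As a rough illustration of that idea only (not DMG's or Pearson's actual norming procedure), a percentile rank for one raw score can be computed from a pool of screening scores with the common mid-rank convention:

```python
from bisect import bisect_left, bisect_right

def percentile_rank(scores, raw_score):
    """Percentile rank of raw_score within a pool of screening scores.

    Mid-rank convention: percent of scores strictly below raw_score,
    plus half the percent of scores equal to it.
    """
    ordered = sorted(scores)
    below = bisect_left(ordered, raw_score)
    equal = bisect_right(ordered, raw_score) - below
    return 100.0 * (below + 0.5 * equal) / len(ordered)

# Hypothetical pool of winter R-CBM scores (words read correctly)
pool = [12, 25, 40, 52, 61, 61, 70, 88, 95, 110]
print(round(percentile_rank(pool, 61)))  # → 50
```

With a national pool instead of this toy list, the same lookup is what lets a raw score be reported as "the Nth percentile."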

AIMSweb National Norms

AIMSweb Defaults

How were the Default Criteria Developed? Using data in AIMSweb accounts from 20 states (Michigan not included). The cut scores represent averages across the states providing data. Tier 1 cut scores for R-CBM: 80% probability of passing the “typical” state test. Tier 2 cut scores for R-CBM: 50% probability of passing the “typical” state test. Because the cuts for R-CBM and M-CAP consistently fell around the 15th and 45th percentiles of the National Norms, the cut scores for Maze and other math measures were set at the same percentiles of the National Norms. The higher cut score (35th percentile) for TEL and TEN is based on a success-probability study of TEL done by Silberglitt.
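Setting a cut score at a percentile of the norms amounts to reading off the score at that percentile of the score distribution. A minimal sketch, assuming a simple nearest-rank percentile definition and a made-up score pool (this is not AIMSweb's published computation):

```python
def score_at_percentile(scores, pct):
    """Nearest-rank score at an integer percentile (1-100)."""
    ordered = sorted(scores)
    rank = max(1, -(-pct * len(ordered) // 100))  # integer ceiling, avoids float error
    return ordered[rank - 1]

def default_cuts(scores, tier2_pct=15, tier1_pct=45):
    """Illustrative Tier 2 / Tier 1 cut scores at the percentiles the
    slides report the R-CBM cuts clustered around (15th and 45th)."""
    return {
        "tier2_cut": score_at_percentile(scores, tier2_pct),
        "tier1_cut": score_at_percentile(scores, tier1_pct),
    }

norms = list(range(10, 210, 2))  # hypothetical pool of 100 raw scores
print(default_cuts(norms))  # → {'tier2_cut': 38, 'tier1_cut': 98}
```

Students below the Tier 2 cut would be flagged for the most intensive support; those between the cuts for strategic support.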

AIMSweb Defaults

How to Run Reports using the AIMSweb Defaults

Tier Transition Report

Select AIMSweb Defaults as the Criteria

Tier Transition Report

Scores and Percentile (Rainbow)

Select AIMSweb Defaults as the Criteria Select Criterion as the Report Method

Scores and Percentile (Rainbow)

Norm Chart/ Comparison Report

Select AIMSweb Defaults as the Target Sets

Norm Chart/ Comparison Report Targets based on AIMSweb Defaults

The AIMSweb Defaults CAN be applied to previous data.

Universal Screening & Fidelity Checks Why do we want to do fidelity checks? –To check the accuracy of administration of universal screeners for Benchmark and Progress Monitoring –To ensure Reading CBM data are accurate for decision-making Universal Screening Flowchart Examples of AIMSweb AIRS and DIBELS Next Accuracy Checks
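Accuracy checks like these are often summarized as percent agreement between the screener and a shadow scorer who marks the same probe. A generic sketch of that calculation (an illustration, not the MiBLSi or AIMSweb AIRS checklist itself):

```python
def percent_agreement(scorer_a, scorer_b):
    """Item-level percent agreement between two scorers' records
    (e.g. per-word correct/incorrect marks on the same R-CBM probe)."""
    if len(scorer_a) != len(scorer_b):
        raise ValueError("score records must cover the same items")
    matches = sum(a == b for a, b in zip(scorer_a, scorer_b))
    return 100.0 * matches / len(scorer_a)

# Hypothetical example: 20 word-level judgments (1 = correct, 0 = error)
screener = [1] * 15 + [0] * 5
shadow = [1] * 14 + [0] * 6
print(percent_agreement(screener, shadow))  # → 95.0
```

Teams typically set a minimum agreement level (often 95% or higher) before trusting the data for decision-making; the exact criterion is a local choice.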

Examples of Accuracy Checks

Early Warning Signs (EWS) Tool Middle & High School Universal Screener

EWS for Middle & High Schools A Universal Tool: Enables schools/districts to identify students who may be at risk for academic failure Monitors students’ responses to interventions Relies on student-level data available at the school or district, including indicators for attendance, course failures, and behavior, to calculate potential risk of eventually dropping out Purpose: To support students with an increased risk of academic failure, in order to get them back on track for academic success and eventual graduation
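The indicator logic described above can be sketched as a simple rule-based flag. The thresholds below (attendance under 90%, any course failure, two or more behavior referrals) are illustrative assumptions, not the EWS tool's published cut points:

```python
def ews_flags(student):
    """Return the early-warning indicators a student trips.

    `student` is a dict with attendance_rate (0-1), course_failures (int),
    and behavior_referrals (int). All thresholds are hypothetical.
    """
    flags = []
    if student["attendance_rate"] < 0.90:
        flags.append("attendance")
    if student["course_failures"] >= 1:
        flags.append("course failures")
    if student["behavior_referrals"] >= 2:
        flags.append("behavior")
    return flags

student = {"attendance_rate": 0.87, "course_failures": 0, "behavior_referrals": 3}
print(ews_flags(student))  # → ['attendance', 'behavior']
```

A student tripping any flag would be reviewed by the team; the same record re-run each marking period is what lets the tool monitor response to intervention.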

EWS Indicators

Early Warning Signs

Advanced Report Generation and Data Analysis Why conduct a subgroup analysis with CBM measures? –Distribution Report (DIBELS Data System) –Tier Transition Report (AIMSweb) Class List Reports (DIBELS Data System)

Reading Data Summary Pink Assessment Binder

DIBELS Distribution Reports Disaggregates results by school, class, or demographics Example shows disaggregation by school
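Under the hood, disaggregation like the Distribution Report's is a group-by: tally benchmark status within each subgroup and report percentages. A stdlib-only sketch with made-up records (the field names are assumptions, not the DIBELS Data System's export format):

```python
from collections import defaultdict, Counter

def disaggregate(records, group_key):
    """Percent of students in each benchmark status, per subgroup."""
    groups = defaultdict(Counter)
    for rec in records:
        groups[rec[group_key]][rec["status"]] += 1
    result = {}
    for group, counts in groups.items():
        total = sum(counts.values())
        result[group] = {s: round(100.0 * n / total, 1) for s, n in counts.items()}
    return result

records = [
    {"school": "Elm", "status": "benchmark"},
    {"school": "Elm", "status": "strategic"},
    {"school": "Oak", "status": "benchmark"},
    {"school": "Oak", "status": "benchmark"},
]
print(disaggregate(records, "school"))
```

Swapping `group_key` to a class or demographic field gives the other disaggregations the slide mentions.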

AIMSweb Subgroup Analysis Tier Transition Report: Grade Level Choose Reports and Grade tabs

AIMSweb Analysis Tier Transition Report: Grade Level Click on Expand

AIMSweb Subgroup Analysis Tier Transition Report: Grade Level Select: Ethnicity or Meal Status Report Criteria should be set to AIMSweb Defaults Grade: Click Display

AIMSweb Subgroup Analysis Tier Transition: Grade Level Demographic information has been filtered based on selection criteria

Data Coordinator Listserv Purpose of listserv: Share information related to DIBELS and AIMSweb measures, including: –Measurement/system updates –Tools and resources for data collection and analysis –Networking with other Reading Data Coordinators If you want to be added, contact either: –Nikki Matthews –Jennifer Rollenhagen

3-2-1 Processing From today’s session: What 3 big ideas are you going to take away? What 2 questions do you have? What 1 action are you going to implement when you return to your district?