First Sound Fluency & Phoneme Segmentation Fluency Phonemic Awareness


First Sound Fluency & Phoneme Segmentation Fluency Phonemic Awareness DIBELS Next Measure Big 5 Idea in Reading First Sound Fluency & Phoneme Segmentation Fluency Phonemic Awareness Nonsense Word Fluency -Correct Letter Sounds -Whole Words Read Basic Phonics Skills Oral Reading Fluency -Accuracy Advanced Phonics -Correct Words/Minute Fluency -Retell -Retell Quality DAZE Comprehension

Essential Questions
How do we use DIBELS with an outcomes-driven model?
How are the data results different?
How do we read the new reports?
How can we use the information to plan instruction and change reading outcomes?
How can we use the results to evaluate progress?
How can we use the results to evaluate the effectiveness of our programs?

Composite Score
Replaces the old instructional recommendation.
Composite Score Benchmarks
Composite Score Formula Worksheets
Red Partner activity: figuring the 1st and 4th grade scores. Meet with your Red Partner and practice figuring the composite score for the 1st and 4th grade students, using the samples in the data sample handouts.
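To make the arithmetic concrete, here is a minimal sketch (in Python) of how a composite score is tallied from a formula worksheet. The worksheet spec, measure names, and multipliers below are placeholders for illustration, not the actual DMG formulas, which vary by grade and benchmark period.

# Minimal sketch: tallying a DIBELS Next composite score from a formula worksheet.
# The formula spec below is a placeholder, NOT the official DMG formula; the real
# components and any conversions come from the grade- and season-specific worksheet.
def composite_score(measure_scores, formula):
    # measure_scores: dict of measure name -> student's raw (or converted) score
    # formula: list of (measure name, multiplier) pairs copied from the worksheet
    total = 0
    for measure, multiplier in formula:
        total += measure_scores.get(measure, 0) * multiplier
    return total
# Hypothetical example (measure names and multipliers are illustrative only):
example_formula = [("LNF", 1), ("PSF", 1), ("NWF-CLS", 1)]
scores = {"LNF": 42, "PSF": 35, "NWF-CLS": 28}
print(composite_score(scores, example_formula))  # -> 105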

New Benchmarks
Look at the new benchmarks. What's different?
There are benchmarks for each measure, but only certain measures factor into the overall composite score (see the formula worksheets). For example, Retell Quality doesn't figure into the formula.
Whole group discussion.

How do we use DIBELS with an outcomes-driven model?
Identify the Need for Support (Benchmark Assessment)
Validate the Need for Support
Plan Support
Implement Support
Evaluate Effectiveness of Support (Progress Monitoring)
Review Outcomes (Benchmark Assessment)

Outcomes-Driven Model
Identify Need. Questions: Are there students who may need support? How many? Which students? Data: benchmark data (Histograms, Box Plots, Class List Report, Class Progress Summary, Distribution Report).
Validate Need. Question: Are we confident that the identified students need support? Data: benchmark data and additional information (repeat assessment, additional data, knowledge of/information about the student).
Plan Support. Questions: What level of support for which students? How should students be grouped? What goals, specific skills, curriculum/program, and instructional strategies? Data: benchmark data and additional information (individual student booklets, additional diagnostic information, knowledge of/information about the student).
Evaluate Support. Question: Is the support effective for individual students? Data: progress monitoring data (individual student progress graphs, class progress graphs, student history report).
Evaluate Outcomes. Questions (as a school/district): How effective is our core (benchmark) support? How effective is our supplemental (strategic) support? How effective is our intervention (intensive) support? Data: benchmark data (Summary Report, Histograms, Cross-Year Box Plots, Summary of Effectiveness Reports, Distribution Report).

Step 1: Identify Need for Support
What do you know?
Are there students who may need additional instructional support to achieve benchmark goals?
How many students may need additional instructional support?
Which students may need additional instructional support?
What data can you use? Histograms, Box Plots, Class Lists.

Histograms
Histograms summarize the distribution of scores of all children in a grade within a school or district relative to the progressive benchmark goal for that time of year. Student performance is depicted in three categories: students who have met the benchmark, students making progress toward the benchmark, and students seriously below the benchmark.

Yearly Box Plot
Summarizes the distribution of performance at a single point in time. The box depicts the range of scores for a school or district relative to the benchmark.

Practice Activity
Blue Partner: Review the histogram and box plot samples for 1st grade. Be prepared to report out on the following:
What do you know from the data?
What are the implications for curriculum and instruction, professional development, and teacher support?
Use the 1st grade data in the data packet to review the histogram and box plot samples. Then fill out the "Reviewing Histograms and Box Plots" sheet and be ready to share out with the whole group.

Class List & Grade List Reports Provide information on individual students at a given assessment period. 3 types of reports Need for Support District Percentile National DIBELS Data System Percentile How do we get to them on the website? Show how to navigate to them on the website.

Class List: Need for Support Need for Support recommendations are provided for individual measures and the DIBELS Next Composite Score. Let them look at their reports or sample

Class List: District Percentile Calculated based on the scores of students in your district's DIBELS Data System account during the selected year. Let them look at their reports or sample

Class List: National DIBELS Data System Percentile
Calculated based on student scores from the Sentinel Schools Project conducted by the Center on Teaching and Learning at the University of Oregon during the 2010-2011 school year. Let them look at their reports or sample

Highlight Reports?
Remember how some of us used to highlight the reports so we could identify areas of strength and weakness? Well, now the class progress summaries do this for you.

Class Progress Summary Student scores for one class over an academic year

Practice Activity
Take 5 minutes to analyze your own class list reports.
What do we know from the data?
Which students may need additional support?
What current instructional recommendations do you have for students who are struggling?
Ask if anyone is willing to share out what they discovered.

Distribution Report
Disaggregates results by school, class, or demographics. You would need to add demographic information for each student through the "edit class demographics" function.
Show a sample and how they can get that information.

Outcomes-Driven Model
Identify Need. Questions: Are there students who may need support? How many? Which students? Data: benchmark data (Histograms, Box Plots, Class List Report, Class Progress Summary, Distribution Report).
Validate Need. Question: Are we confident that the identified students need support? Data: benchmark data and additional information (repeat assessment, additional data, knowledge of/information about the student).
Plan Support. Questions: What level of support for which students? How should students be grouped? What goals, specific skills, curriculum/program, and instructional strategies? Data: benchmark data and additional information (individual student booklets, additional diagnostic information, knowledge of/information about the student).
Evaluate Support. Question: Is the support effective for individual students? Data: progress monitoring data (individual student progress graphs, class progress graphs, student history report).
Evaluate Outcomes. Questions (as a school/district): How effective is our core (benchmark) support? How effective is our supplemental (strategic) support? How effective is our intervention (intensive) support? Data: benchmark data (Summary Report, Histograms, Cross-Year Box Plots, Summary of Effectiveness Reports, Distribution Report).

Step 2: Validate the Need for Support
What do you need to know? Are we reasonably confident the student needs instructional support? Rule out other reasons for poor performance, such as a bad day, confusion about the directions, illness, shyness, etc.
What data can you use? Repeat the assessment using progress monitoring booklets, at least 2 more times, not on the same day but within 1 week.
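One way to operationalize this validation step is sketched below. It compares the median of the benchmark score and the retest scores to the benchmark goal; the slide only requires repeating the assessment at least two more times within a week, so the median rule is an illustrative assumption, not an official DIBELS Next procedure.

from statistics import median
# Sketch of a validation check: retest a flagged student at least twice (different
# days, within a week) and compare a typical score to the benchmark goal. Using the
# median as the deciding value is an illustrative assumption, not a DIBELS Next rule.
def need_is_validated(benchmark_score, retest_scores, benchmark_goal):
    if len(retest_scores) < 2:
        raise ValueError("Repeat the assessment at least 2 more times before deciding.")
    typical = median([benchmark_score, *retest_scores])
    return typical < benchmark_goal
# Example: flagged at 18 on PSF with a hypothetical goal of 40; retests of 22 and 20.
print(need_is_validated(18, [22, 20], 40))  # -> True: the need for support is validated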

Outcomes-Driven Model
Identify Need. Questions: Are there students who may need support? How many? Which students? Data: benchmark data (Histograms, Box Plots, Class List Report, Class Progress Summary, Distribution Report).
Validate Need. Question: Are we confident that the identified students need support? Data: benchmark data and additional information (repeat assessment, additional data, knowledge of/information about the student).
Plan Support. Questions: What level of support for which students? How should students be grouped? What goals, specific skills, curriculum/program, and instructional strategies? Data: benchmark data and additional information (individual student booklets, additional diagnostic information, knowledge of/information about the student).
Evaluate Support. Question: Is the support effective for individual students? Data: progress monitoring data (individual student progress graphs, class progress graphs, student history report).
Evaluate Outcomes. Questions (as a school/district): How effective is our core (benchmark) support? How effective is our supplemental (strategic) support? How effective is our intervention (intensive) support? Data: benchmark data (Summary Report, Histograms, Cross-Year Box Plots, Summary of Effectiveness Reports, Distribution Report).

Step 3: Plan for Support
Decisions/Questions:
What are the goals of instruction?
How much instructional support is needed?
How will children be grouped for support?
What specific skills should we teach?
What instructional curriculum/program should we use?
What specific instructional strategies should we use?

Big Ideas & Instructional Goals Instructional goals should be guided by the five Big Ideas: Phonemic Awareness Alphabetic Principal (Phonics) Accuracy and Fluency with Connected Text (Fluency) Vocabulary Reading Comprehension Remind them of the big idea puzzle and how each area assessed on dibels really asses a big 5 idea in reading

Considerations in Planning Instruction
Think about what the DIBELS results indicate: Are my students on track? What do I need to target in my instruction?
Look at student booklets and patterns of errors for additional direction.
Example 1: Are students not reaching benchmark on NWF because they don't know letter-sound correspondences, or because they are not blending sounds together?
Example 2: Are students not reaching benchmark on ORF because they are accurate but not fluent, or because they have low accuracy?

Try it out… (Green Partner)
Kindergarten classroom, spring: 50% at benchmark on PSF and NWF. What goal would you set?
Increase phonemic awareness (because they're behind) and also work on alphabetic principle skills, because this is the time of year they need to develop them.

Try it out… (Green Partner)
First grade classroom: 80% at benchmark on ORF, 40% on NWF. What goal would you set?
Increase alphabetic principle skills plus accuracy and fluency with connected text, so they don't nosedive in 2nd grade.

Try it out… (Green Partner)
Second grade classroom, fall: 90% at benchmark on NWF, 40% at benchmark on ORF. What goal would you set?
You would want to know whether they barely hit the NWF benchmark or were well over it. Increase accuracy and fluency with connected text; also work on advanced phonics skills.

Try it out… (Green Partner)
Third grade classroom: 95% at benchmark on ORF, 45% at benchmark on Daze. What goal would you set?
Work on comprehension skills. Determine the instructional needs of the 5% who didn't make it on ORF.

How much instructional support?
What level of support is needed to change your students' reading trajectory? A double dose of reading instruction? Before/after school tutoring? Preteach/reteach? Different materials?
What factors can YOU alter to meet your students' needs?

RtI Framework (Academic and Behavioral Systems)
80-90%: School-Wide Instruction
5-10%: Targeted Interventions
1-5%: Intensive, Individualized Interventions

How will students be grouped for instruction?
Students with the same composite score or overall instructional recommendation DO NOT necessarily have the same instructional needs. Students whose scores fall within the same range on a measure DO NOT necessarily have the same instructional needs.
Ask them why this would be.

Grouping Students
Analyze student performance across all measures and group students with similar instructional needs. It's important to consider how each DIBELS measure relates to the Big Ideas of reading instruction and to the other measures.

Considerations for Groupings
You MUST look at the scoring protocol; a number alone is NOT enough information for grouping purposes.
Ask yourself: Is the student accurate but slow? How accurate? Are there any error patterns? Is the problem fluency-based? Is the student making multiple errors and performing at a slow pace?

Considerations for Groupings
Are additional diagnostic assessments, placement tests, and/or work samples needed?
What student factors do I need to consider (behavioral needs, attendance, etc.)?
What personnel resources do I have, and what does my schedule/time allotment for instruction look like?

Grouping Worksheets
Explain how they developed some for the 6th edition, but they're not ready yet for the new DIBELS Next. However, you can do something similar.
Sample 1st grade: http://esu6-readingnews.wikispaces.com/

Practice Activity
Choose one of your class list reports (or Class Progress Summary) and look for students who have similar skill needs. See if you can think through which students might belong in the same groups.

Planning Support
DIBELS may be used to identify the general area(s) in need of instruction (e.g., alphabetic principle). Additional data may be needed for deciding which particular skills within a Big Idea you should target (e.g., short u and e).

Diagnostics?
For specific skill-level information, use: error analysis of DIBELS, knowledge of student performance in class, program assessments, and supplementary assessments.
The primary questions are: What can the child do? What specific skills does the child need?
Ask them to share what diagnostics they've used. Reference the CORE Phonics Survey.

What skills should we teach? Scenarios (Yellow Partner)
1. What if a student is low on First Sound Fluency and Phoneme Segmentation Fluency? Target phonemic awareness.
2. What if a student is low on Nonsense Word Fluency?
If NWF accuracy is below 97%, target accuracy with beginning phonics.
If NWF accuracy is at/above 97% but recoding is low, target blending, along with phonemic awareness blending skills.
If NWF accuracy is at/above 97%, target building automaticity (fluency).
Have them discuss each scenario with their Yellow Partner, then share out with the whole group.

What skills should we teach? Scenarios (Yellow Partner)
3. What if a student is low on Oral Reading Fluency?
Target fluency with connected text if accuracy is greater than 95%.
Target the alphabetic principle if accuracy is less than 95%.
Target comprehension and/or vocabulary if the student is making meaning-distortion errors.
4. What if a student is low on ORF and Daze? Teach fluency and comprehension.
Discuss with your partner, then share out with the whole group.
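The accuracy thresholds in scenarios 2 and 3 amount to simple decision rules, sketched below. The function and parameter names are illustrative; only the 97% NWF and 95% ORF accuracy cut points come from the slides.

# Sketch of the skill-targeting decision rules from these two scenario slides.
# Function and parameter names are illustrative; the 97% NWF and 95% ORF accuracy
# thresholds come from the slides themselves.
def nwf_target(accuracy_pct, recoding_is_low):
    # Nonsense Word Fluency: check accuracy first, then blending, then automaticity.
    if accuracy_pct < 97:
        return "accuracy with beginning phonics"
    if recoding_is_low:
        return "blending (with phonemic awareness blending skills)"
    return "building automaticity (fluency)"
def orf_target(accuracy_pct, meaning_distortion_errors=False):
    # Oral Reading Fluency: the order of these checks is an assumption.
    if meaning_distortion_errors:
        return "comprehension and/or vocabulary"
    if accuracy_pct < 95:
        return "alphabetic principle"
    return "fluency with connected text"
# Examples
print(nwf_target(92, recoding_is_low=False))  # -> accuracy with beginning phonics
print(orf_target(98))                         # -> fluency with connected text

Students who come out of these rules with the same target are natural candidates for the same instructional group, which connects back to the grouping discussion above.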

Practice Activity
Identify 2 students on your class list and determine what they need for instructional support. Share out with your shoulder partner.

Outcomes-Driven Model
Identify Need. Questions: Are there students who may need support? How many? Which students? Data: benchmark data (Histograms, Box Plots, Class List Report, Class Progress Summary, Distribution Report).
Validate Need. Question: Are we confident that the identified students need support? Data: benchmark data and additional information (repeat assessment, additional data, knowledge of/information about the student).
Plan Support. Questions: What level of support for which students? How should students be grouped? What goals, specific skills, curriculum/program, and instructional strategies? Data: benchmark data and additional information (individual student booklets, additional diagnostic information, knowledge of/information about the student).
Evaluate Support. Question: Is the support effective for individual students? Data: progress monitoring data (individual student progress graphs, class progress graphs, student history report).
Evaluate Outcomes. Questions (as a school/district): How effective is our core (benchmark) support? How effective is our supplemental (strategic) support? How effective is our intervention (intensive) support? Data: benchmark data (Summary Report, Histograms, Cross-Year Box Plots, Summary of Effectiveness Reports, Distribution Report).

Step 4: Evaluate & Modify Support What do you need to know? Is the additional instructional support effective in getting the students on track to achieve the next benchmark goal? What data can you use? Progress Monitoring Booklets or Graphs Individual Student Performance Profiles (not ready yet) Class Progress Graph

DIBELS PM Graphs
Review a sample graph. What advantages are there to using their aimline? What disadvantages?
Have them discuss how some students may need an individual growth rate drawn in their booklet, using the ambitious growth rates.

Expected ORF Growth Rates (Fuchs, Fuchs, Hamlett, Walz, & Germann, 1993)
Grade / Realistic Goal / Ambitious Goal
1st: 2.0 words/week / 3.0 words/week
2nd: 1.5 words/week / 2.0 words/week
3rd: 1.0 words/week / 1.5 words/week
4th: 0.85 words/week / 1.1 words/week
5th: 0.50 words/week / 0.80 words/week
6th: 0.30 words/week / 0.65 words/week
Explain that some school districts are setting their own growth rates for students based on these ambitious growth rates.
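To show how a weekly growth rate becomes an individual aimline on a progress monitoring graph, here is a minimal sketch. It assumes the simple arithmetic of baseline plus weeks times the chosen rate; the function name and example numbers are illustrative.

# Minimal sketch: turning a baseline score and a weekly growth rate into an aimline
# for a progress monitoring graph. The arithmetic is baseline + week * growth rate;
# the function name and example values are illustrative.
def aimline(baseline_score, weekly_growth_rate, num_weeks):
    # Expected score for each week from 0 (baseline) through num_weeks.
    return [round(baseline_score + week * weekly_growth_rate, 1)
            for week in range(num_weeks + 1)]
# Example: a 2nd grader at 40 words correct/minute, using the ambitious goal of
# 2.0 words/week over an 18-week intervention.
print(aimline(40, 2.0, 18))  # week 0 -> 40.0 ... week 18 -> 76.0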

Growth Rates for NWF & PSF No scientific guidelines on ambitious growth rates for NWF or PSF at this time Tentative Guidelines: PSF – 2-3 segments/week NWF – 2-3 letter sounds/week

Individual Student Problem Solving Agenda April’s Sample Agenda

Class Progress Graph
Student scores for one class and one measure, graphed over time.
Look at the class progress graph you brought. How would this report provide you with information about individual students? Discuss as a whole group.

Student History Report
Brown Partner: Look at the Student History Report. What does this report tell you? How might it be helpful in evaluating the support provided?

Outcomes-Driven Model
Identify Need. Questions: Are there students who may need support? How many? Which students? Data: benchmark data (Histograms, Box Plots, Class List Report, Class Progress Summary, Distribution Report).
Validate Need. Question: Are we confident that the identified students need support? Data: benchmark data and additional information (repeat assessment, additional data, knowledge of/information about the student).
Plan Support. Questions: What level of support for which students? How should students be grouped? What goals, specific skills, curriculum/program, and instructional strategies? Data: benchmark data and additional information (individual student booklets, additional diagnostic information, knowledge of/information about the student).
Evaluate Support. Question: Is the support effective for individual students? Data: progress monitoring data (individual student progress graphs, class progress graphs, student history report).
Evaluate Outcomes. Questions (as a school/district): How effective is our core (benchmark) support? How effective is our supplemental (strategic) support? How effective is our intervention (intensive) support? Data: benchmark data (Summary Report, Histograms, Cross-Year Box Plots, Summary of Effectiveness Reports, Distribution Report).

Review Systems-Level Outcomes
What do we need to know? How is the curriculum/program working? Who is the curriculum/program working for? Are we doing better this year than last year?
What data can you use? Summary Report, Histograms, Cross-Year Box Plots, Summary of Effectiveness Reports.

Summary Report
Means and progress over time by school or district.
Take 5 minutes to analyze your own report. Ask yourself: What do the data tell you? Why might this be happening? What do we need to do about it?
Show a sample and explain how to read it.

Histograms
Histograms summarize the distribution of scores of all children in a grade within a school or district relative to the progressive benchmark goal for that time of year. Student performance is depicted in three categories: students who have met the benchmark, students making progress toward the benchmark, and students seriously below the benchmark. Compare histograms from the beginning, middle, and end of the year.
We've looked at these already. Just a reminder that these can be used to see how many of your students are reaching the goal and the distribution of students who aren't meeting it.

Cross-Year Box Plots
Distribution of benchmark scores by measure across multiple assessment periods and years.
Orange Partner: Look at the sample cross-year box plot. What do the data tell us? What do we need to do about it? Discuss with your partner and share out with the whole group.

Summary of Effectiveness Reports
Progress of students by Composite Score or Instructional Recommendation over time. These reports help us determine three things: How effective is our core instruction? How effective is our strategic instruction? How effective is our intervention instruction?
Show a sample report and explain how to read it. Give them a bit of time to think and process how to read it.

How effective is our core instruction?
The core program is effective if it meets the needs of 80% of all students in the school, and supports 95-100% of benchmark students to make adequate progress and achieve the benchmark.

How effective is our supplemental support?
The supplemental program is effective if it meets the needs of the 5-10% of students who need supplemental instruction, and supports 80-100% of strategic students to achieve the benchmark goal.

How effective is our intervention support?
The intervention program is effective if it meets the needs of the 5% of students in the school who need intensive instruction, and supports 80-100% of intensive students to reduce their risk of reading difficulty to strategic or to achieve the benchmark goal.

What is adequate progress?
Benchmark students: effective core instruction should support 95% of benchmark students to maintain benchmark.
Strategic students: effective supplemental support should support 80% of strategic students to achieve benchmark.
Intensive students: effective interventions should support 80% of intensive students to reach strategic or benchmark.
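As a concrete illustration of these targets, the sketch below tallies a Summary of Effectiveness style calculation from beginning- and end-of-period support levels. The data layout and function name are assumptions for illustration; only the 95% and 80% targets come from the slides above.

# Sketch of a Summary of Effectiveness style tally. Each student record holds the
# support level at the start and end of the period. The data layout is illustrative;
# the 95% and 80% targets come from the adequate-progress slide above.
def summary_of_effectiveness(students):
    # Percent of students in each starting group who made adequate progress.
    adequate = {
        "benchmark": {"benchmark"},               # maintain benchmark
        "strategic": {"benchmark"},               # reach benchmark
        "intensive": {"benchmark", "strategic"},  # reach strategic or benchmark
    }
    results = {}
    for start, ok_endings in adequate.items():
        cohort = [s for s in students if s["start"] == start]
        if not cohort:
            results[start] = None
            continue
        made_it = sum(1 for s in cohort if s["end"] in ok_endings)
        results[start] = round(100 * made_it / len(cohort), 1)
    return results
students = [
    {"start": "benchmark", "end": "benchmark"},
    {"start": "benchmark", "end": "strategic"},
    {"start": "strategic", "end": "benchmark"},
    {"start": "intensive", "end": "strategic"},
    {"start": "intensive", "end": "intensive"},
]
print(summary_of_effectiveness(students))
# -> {'benchmark': 50.0, 'strategic': 100.0, 'intensive': 50.0}
# Compare against the targets: 95% for benchmark, 80% for strategic and intensive.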

Partner Activity
Purple Partner: Look at the Summary of Effectiveness sample report and determine the effectiveness of core instruction, supplemental instruction, and intervention instruction. What else did you notice?
Have them discuss with their partner and be prepared to share out their findings.

Outcomes-Driven Model ODM Step Questions Data Identify Need Are there students who may need support? How many? Which students? Benchmark data: Histograms, Box Plots, Class List Report, Class Progress Summary, Distribution Report Validate Need Are we confident that the indentified students need support? Benchmark data and additional information: Repeat assessment, use additional data, knowledge of/information about student Plan Support What level of support for which students? How to group students? What goals, specific skills, curriculum/program, instructional strategies? Benchmark data and additional information: Individual student booklets, additional diagnostic information, knowledge of/information about student Evaluate Support Is the support effective for individual students? Progress monitoring data: Individual student progress graphs, class progress graphs, student history report Evaluate Outcomes As a school/district: How effectives is our core (benchmark) support? How effective is our supplemental (strategic) support? How effective is our intervention (intensive) support? Benchmark data: Summary Report, Histograms, Cross-Year Box Plots, Summary of Effectiveness Reports, Distribution Report

What information do you plan to share with your district?
Give them time to think about this and jot down some notes for themselves. Go around and share out.

Thank you! Evaluation at www.esu6pdsurveys.wikispaces.com