David Putnam, Jr., Ph.D. Associate Director, C & I Tigard Tualatin School District.


1 David Putnam, Jr., Ph.D. Associate Director, C & I Tigard Tualatin School District

2  Provide a rationale and framework for literacy intervention at the secondary level
 Examine the Maze and its relationship to OAKS
 Detail the process for using the Maze for universal screening, program evaluation, and identifying students in need of additional support
 Describe progress monitoring effectiveness and procedures for analyzing performance and making instructional decisions

3  Teaching reading is often considered an elementary school task, despite the following:
 More than 8 million students in grades 4-12 are struggling readers (USDoE, 2003).
 40% of HS students cannot read well enough to benefit from their textbooks (NAEP, 2005).
 In Oregon in 2006-07, 33% of 8th graders and 35% of 10th graders did not meet OAKS reading.
 The problem is more severe when we disaggregate data by racial and special program subgroups.

4 High Expectations for Student Achievement -- And Always Increasing
Students with Moderate to Severe Educational and/or Behavioral Needs -- Big Prerequisite Skill Deficits
Students with a Long History of Failure -- Poor Motivation and Lots of Escape-Driven Behavior
General Education Teachers with Limited Support Skills and Instructional Materials
Students’ Programs Being Driven by Graduation Requirements Rather Than Instructional Needs
Mark R. Shinn, Ph.D. & Madi Phillips, Ph.D., NASP, 2007

5  Focus resources on teaching literacy strategies proven to increase achievement for all students across all content areas
 Execute a comprehensive literacy intervention model to address students in need of strategic and intensive interventions
 Use a Three-Tier Model adapted to secondary schools

6  All students, IN EVERY TIER, have access to embedded literacy strategies across content areas
 Strategies:
 Frayer Model
 Anticipation Guide
 Word Sorts
 DR/TA or KWL
 Group Summarizing
 Definition Word Chart
 Differentiated Assessment
[Pyramid diagram: Tier I, Tier II, Tier III]

7  Core Curriculum
 Access to Content Literacy Strategies
 A limited number of students are monitored by the Literacy Specialist
Target = 80% of student population

8  Content Literacy Strategies Across the Content Areas
 Strategic Intervention
 Middle School: Soar to Success
 High School: Literacy Strategies Classes
Target = >15% of student population

9  Content Literacy Strategies Across the Content Areas
 Comprehensive reading and writing support
 LANGUAGE! (High School)
 LANGUAGE! (Middle School)
Target: <5% of student population

10  Universal screening is the process of efficiently assessing ALL students in a timely manner to analyze:
 The effectiveness of curriculum, instruction, and school organization
 Students’ level of proficiency in essential academic areas
 Which students MAY need additional help

11  Are 80% of our students meeting the benchmark?
 80% by ethnicity?
 By program sub-group?
 By subject?
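When the screening data live in a spreadsheet or CSV, this disaggregation is easy to script. Below is a minimal Python/pandas sketch; the file name, column names, and benchmark cut score are all hypothetical placeholders, not the district's actual data layout.

```python
# Minimal sketch: percent of students at benchmark, overall and by subgroup.
# File name, column names, and the benchmark cut are hypothetical.
import pandas as pd

BENCHMARK = 33  # placeholder Maze cut score

df = pd.read_csv("maze_scores.csv")  # assumed columns: student_id, maze, ethnicity, program
df["meets"] = df["maze"] >= BENCHMARK

print(f"Overall: {df['meets'].mean() * 100:.1f}% at benchmark (target: 80%)")
for group_col in ("ethnicity", "program"):
    pct = df.groupby(group_col)["meets"].mean().mul(100).round(1)
    print(f"\nPercent at benchmark by {group_col}:\n{pct}")
```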

12  Helps you determine whether the core curriculum needs to be addressed:
 Intensity
 Fidelity
 Targeting
 Group size
 Instructional skills

13  Periodic and universal screening ensures that no students “fall through the cracks”
 Strategic support: Students are placed in a program that provides moderate intervention and are progress monitored every 2 weeks
 Intensive support: Students are placed in an intervention that is intense and are progress monitored every 2 weeks

14  MAZE
 OAKS
 Grades
 Attendance
 Office Discipline Referrals (ODRs)

15  Multiple-choice cloze task
 Grade-level passage with every 7th word replaced by 3 word choices in parentheses
 Student reads silently and selects as many correct words as they can in 3 minutes
 Curriculum-Based Measurement test that is an “INDICATOR” of overall reading health
 Combines fluency, comprehension, and all other subsumed reading skills
 Can be administered to a group and scored later
 Easy & quick to administer; multiple forms
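Because the score is simply the count of correct choices made in 3 minutes, scoring can be automated when responses are captured electronically. A minimal sketch with hypothetical names; the three-consecutive-errors discontinue rule is a common CBM scoring convention, not something stated on this slide.

```python
# Count correct Maze choices; stop after three consecutive errors
# (a common CBM scoring convention, assumed here).
def score_maze(selections, answer_key):
    correct = 0
    errors_in_a_row = 0
    for chosen, right in zip(selections, answer_key):
        if chosen == right:
            correct += 1
            errors_in_a_row = 0
        else:
            errors_in_a_row += 1
            if errors_in_a_row == 3:
                break
    return correct

print(score_maze(["dog", "ran", "home"], ["dog", "ran", "away"]))  # -> 2
```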

16

17  Allows for screening/assessing ALL students, ALL groups of students, and school-wide literacy in time for intervention
 Can use the same test to monitor progress
 Frequent progress monitoring increases academic achievement
 Maze scores are a predictor of performance on OAKS and now of HS graduation

18 Critical values corresponding to likelihood of passing the 8th grade Minnesota Basic Skills Test (Doug Marston, et al.):

Maze Correct Choices   Writing: CWS minus IWS   Probability of Passing
(3 minutes)            (7 minutes)              Minnesota Basic Skills Test
 4                       9                       10%
 7                      33                       20%
10                      53                       30%
12                      70                       40%
14                      83                       50%
16                     100                       60%
19                     116                       70%
22                     137                       80%
26                     162                       90%
37                     210                      100%
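Read as a lookup table, the Maze column translates directly into code. A hedged sketch using the critical values above; the step-function interpretation (report the highest probability band whose critical value is met) is an assumption about how the table is meant to be applied.

```python
# Critical Maze values from the Marston et al. table, paired with the
# probability (%) of passing the Minnesota Basic Skills Test.
MAZE_CRITICAL_VALUES = [(4, 10), (7, 20), (10, 30), (12, 40), (14, 50),
                        (16, 60), (19, 70), (22, 80), (26, 90), (37, 100)]

def passing_probability(maze_score):
    """Return the highest probability band whose critical value is met."""
    prob = 0
    for cutoff, pct in MAZE_CRITICAL_VALUES:
        if maze_score >= cutoff:
            prob = pct
        else:
            break
    return prob

print(passing_probability(15))  # -> 50 (meets the 14-correct value, not 16)
```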

19 Maze correlations by grade and passage:

Grade   Median Score   Passage 1   Passage 2   Passage 3
6       .660           .607        .668        .636
7       .689           .615        .649        .706
8       .684           .634        .701        .661

All correlations moderate to high. Relatively consistent across passages. Median correlations “in the middle.”

20 Maze cut scores for an 85% probability of passing OAKS Reading in spring:

Grade   Fall Maze Score Needed   Spring Maze Score Needed
6       20                       33
7       20                       33
8       21                       37
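These cut scores make the screening decision mechanical. A minimal sketch encoding the table above; the function and variable names are illustrative, not part of the district's tooling.

```python
# Grade -> (fall cut, spring cut) for an 85% probability of passing
# OAKS Reading in spring, from the table above.
OAKS_MAZE_CUTS = {6: (20, 33), 7: (20, 33), 8: (21, 37)}

def on_track_in_fall(grade, maze_score):
    fall_cut, _ = OAKS_MAZE_CUTS[grade]
    return maze_score >= fall_cut

print(on_track_in_fall(8, 18))  # False: below the grade 8 fall cut of 21
```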

21

22  All students screened 3 times per year
 Three 3-minute tests will be given each time
 Screening assessment will occur in (Matrix/Trek/LA class)
 Tests will be scored and data entered by (Classified Staff/Parent volunteers/Electronically)
 Data will be used for program evaluation and to place students in support
 Students in support will be monitored

23  Focused on MAZE, OAKS, and Grades
 Queried ESIS for a demographic file with student name, ID #, ethnicity, and program subgroup
 Merged the demographic file with the data file for each measure
 Created an Excel template organized by all subgroups
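The merge step itself is a routine join on student ID. A hedged Python/pandas sketch, assuming hypothetical CSV exports from ESIS and from each measure; the district's actual workflow used an Excel template, so this is an illustration of the joining logic, not its implementation.

```python
# Join an ESIS demographic export with one data file per measure, then
# write one worksheet per program subgroup. All file/column names assumed.
import pandas as pd

merged = pd.read_csv("esis_demographics.csv")  # student_id, name, ethnicity, program
for measure in ("maze", "oaks", "grades"):
    scores = pd.read_csv(f"{measure}_scores.csv")  # student_id + score columns
    merged = merged.merge(scores, on="student_id", how="left")

with pd.ExcelWriter("core_data_analysis.xlsx") as writer:
    for subgroup, rows in merged.groupby("program"):
        rows.to_excel(writer, sheet_name=str(subgroup)[:31], index=False)
```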

24  Core Data Analysis
 MAZE, OAKS, Grades blank template
 MAZE, OAKS, Grades Data Example

25  Initial Screening:
 Screening process initiated when academic skills fall at or below the 35th percentile on OAKS, AND/OR
 In Middle Schools: bottom 20% of students on the MAZE-CBM/Maze benchmark scores
 Screen further with San Diego Quick, SRAI, and curriculum placement tests, when appropriate
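The trigger logic on this slide reduces to a boolean test per student. A minimal sketch with hypothetical field names; the bottom-20% cut is computed from the tested population itself, which is an assumption about how the cut was set.

```python
# Flag students for further screening: OAKS at/below the 35th percentile,
# and/or (middle school) bottom 20% on the Maze benchmark. Names assumed.
import pandas as pd

df = pd.read_csv("screening_data.csv")  # student_id, oaks_percentile, maze, is_middle_school

maze_bottom20 = df["maze"].quantile(0.20)
df["screen_further"] = (df["oaks_percentile"] <= 35) | (
    df["is_middle_school"].astype(bool) & (df["maze"] <= maze_bottom20)
)
print(df.loc[df["screen_further"], "student_id"].tolist())
```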

26  Post-Screening Diagnostics and Placement:
 6-Minute Solution -- check for fluency & accuracy; then,
 San Diego Quick to determine the level of SRAI to use; then,
 SRAI to gauge comprehension skills; then,
 LANGUAGE! placement tests are administered for students with the most significant reading needs

27  Example Excel file
 Example of IPAS School Student list

28  What is progress monitoring?
 What are the effects of progress monitoring?
 How do you conduct progress monitoring at the secondary level?
 How do you decide if the intervention is working?

29  An on-going, systematic approach to gathering academic and behavioral data to:
 evaluate response to intervention, thereby allowing data-based decision making regarding instruction and learning outcomes on a frequent basis
 help schools establish more effective programs for children who have not benefited from previous programming
 In other words, it tells us if our interventions are working

30 Progress monitoring has been extensively researched in Special Education (Fuchs & Fuchs, 1986). Students showed improved reading scores when teachers:
 monitored their progress (+.70 effect size; ≈ 25th → 50th %ile. Like it!)
 graphed their reading scores (+.80 effect size. Love it!)
 used decision rules to determine whether to make a change in instruction (+.90 effect size. Gotta have it!)

31 CBM with decision rules (Fletcher et al., 2006):
 “goal raising rule” for students responding well: effect size .52 (≈ 25th → 40th %ile)
 “change the program rule” for students not responding well: effect size .72 (≈ 25th → 50th %ile)
 Results in teachers planning more comprehensive reading programs
Additional support for effectiveness in General Education (Fuchs et al., 1994)

32  Select assessment tools
 Maze
 Determine how often to progress monitor
 Every 2 weeks
 Identify & train staff to:
 Administer & score (Reading Teacher)
 Input & analyze data (Instructional Coordinator)
 Use the data
 Intervention planning at “20%” monthly meetings
 Student feedback
Sanford & Putnam (2007)

33 1. Continuing (student is making progress but continues to need support)
2. Intensifying (intervention is not working and should be revised)
3. Referring for Special Education Evaluation (intensive intervention is proving unsuccessful), or
4. Exiting (intervention no longer needed)

34 [Progress monitoring graph] Intervention Change: LANGUAGE! C. 3-4 data points below the aimline!

35 [Progress monitoring graph] Intervention Change: LANGUAGE! C. Now that’s WORKIN’!

36 [Progress monitoring graph] Intervention Change: LANGUAGE! C. 3-4 data points below the aimline! Consider SPED referral.

37 Maze scores indicate 4 or more data points above the aimline AND are at or above the 50th percentile; AND
Grade+ scores are at or above the 5th stanine; AND
OAKS scores are at or above the 35th percentile

38  Progress monitoring indicates 4 data points below the aimline (Maze).
 Slope is flat or decreasing AND scores are below the 50th percentile (Maze).
 Grade+ scores at or below the 3rd stanine.
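Slides 37-38 amount to a small decision procedure over the last four data points relative to the aimline. A simplified sketch: it encodes only the aimline rules (the stanine, slope, and OAKS checks would need additional inputs), and the numbers in the example are invented for illustration.

```python
# Aimline from baseline to goal across equally spaced assessments, plus a
# simplified decision over the last four data points.
def aimline(baseline, goal, n_points):
    step = (goal - baseline) / (n_points - 1)
    return [baseline + step * i for i in range(n_points)]

def decide(scores, baseline, goal, pct50_cut):
    expected = aimline(baseline, goal, len(scores))
    last4 = list(zip(scores, expected))[-4:]
    if all(s >= e for s, e in last4) and scores[-1] >= pct50_cut:
        return "consider exiting"      # 4 points above aimline, >= 50th %ile
    if all(s < e for s, e in last4):
        return "intensify or refer"    # 4 points below aimline
    return "continue intervention"

print(decide([18, 21, 24, 27, 29, 31], baseline=18, goal=30, pct50_cut=25))
# -> "consider exiting"
```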

39  Select Measures
 Decide:
 Who will assess students?
 Who will record & graph the information?
 Who will make instructional decisions?
 Get Training
 Establish:
 Decision rules
 Team process
 Schedule for assessment

40  AIMSweb www.aimsweb.org
 Easy CBM http://easycbm.com/
 National Center on Student Progress Monitoring http://www.studentprogress.org/
 Intervention Central www.interventioncentral.org
 David Putnam, Jr., Ph.D. dputnam@ttsd.k12.or.us

41

