“Dabbling” in DIBELS with DataDirector St. Clair County RESA August 21, 2009.


1 “Dabbling” in DIBELS with DataDirector St. Clair County RESA August 21, 2009

2 Background Our district selected and purchased DataDirector in Summer 2008. We first learned of its DIBELS entry and reporting capabilities (then in beta) at the April 2008 DataDirector Users Conference. This became an additional “selling point” for us in selecting DataDirector.

3 Background cont’d In late spring 2008 we began the “clean-up” of DIBELS data for export/import –Accurate, complete student data (UICs, student ID #s). Errors had previously been made in entering student information (e.g., misspelled names, nicknames used). In some cases the student’s previous school within the district had not “unassigned” the student, and the new school created a whole new record when the student enrolled there –Merging of individual/separate records where possible (after data “clean-up”)
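The merge step above can be sketched as follows. This is a minimal, hypothetical illustration of collapsing duplicate student records; the field names (`uic`, `updated`) and the fill-missing-fields rule are assumptions for the example, not DataDirector's actual schema or import logic.

```python
# Hypothetical sketch of the duplicate-record clean-up described above.
# Field names and the merge rule are illustrative assumptions.

def merge_duplicates(records):
    """Group student records by UIC and collapse each group into one record."""
    by_uic = {}
    for rec in records:
        by_uic.setdefault(rec["uic"], []).append(rec)

    merged = []
    for uic, group in by_uic.items():
        # Keep the most recently updated record as the "survivor" and
        # fill any missing fields from the older duplicates.
        group.sort(key=lambda r: r["updated"], reverse=True)
        survivor = dict(group[0])
        for older in group[1:]:
            for field, value in older.items():
                if not survivor.get(field):
                    survivor[field] = value
        merged.append(survivor)
    return merged
```

In practice the harder part, as the slide notes, is deciding that two records with different name spellings really are the same student before the merge is allowed to happen.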

4 Benefits of DIBELS in DataDirector Cost “savings” “All” student data in one location instead of multiple databases with different data/assessments that are not connected to each other Allows a more complete “profile” of student assessment history/performance: DIBELS scores are all linked to the same student No need to create a new (separate) student record for retentions, changes in student demographics, etc. Schools no longer have to “unassign” students when they move (e.g., to another school in the district) –This is done for them when the student information system “syncs” with DataDirector every few days

5 Benefits of DIBELS in DataDirector cont’d “Automatic” entry of student information: schools enter only the DIBELS score(s), not the student identifying information (an area where we previously had many problems) Enhancements to, and “gaps” in, the existing DIBELS reports in DataDirector have encouraged more detailed, in-depth discussions within the district on reading assessments, student progress, and intervention effectiveness

6 DIBELS Reporting in DataDirector: Where we are, how we use it, what we’ve learned, and how we’re changing

7 Where we are… DIBELS benchmark data from the last two years was exported/imported prior to the start of the 2008-09 school year. All elementary schools now enter benchmark and progress monitoring data directly into DataDirector. DIBELS data/reports are currently used as part of RTI and in examining/monitoring core support interventions –“Point in time” reports –“Performance over time” reports

8 Direct Entry of DIBELS into DataDirector DIBELS is one of the “Assessment Structures” already set up in DD. Assessment structures in DD allow the user to upload or hand-enter summative data (e.g., an overall score, not item-level data). When DIBELS data is entered into the DD assessment structure for this assessment, reports are automatically available through the pre-built reports section in DD. Benchmark (beginning, middle, and end) and progress monitoring (post-beginning, post-middle) assessment structures exist in DD. You must ask DD to “share” with you any of the assessment structures already in DD that you want to use –A point person in the district then shares those within the district

9 [screenshot]

10 [screenshot]

11 [screenshot]

12 [screenshot]

13 Existing DIBELS reports in DataDirector: Class Progress Report; Distribution Summary; Progress Monitoring; Summary of Effectiveness (K-3); Instructional Recommendation Report

14 “Point in Time” Reports: Instructional Recommendation Report; Distribution Summary Report

15 Instructional Recommendation Report After importing our DIBELS data into DataDirector over the summer, we began verifying the loaded assessment data by running different reports –For DIBELS, we began with the Instructional Recommendation report There was initial “panic” because the instructional recommendation is “missing” for 4th and 5th grade –The instructional recommendation is included for these grade levels on the University of Oregon site, but… »The goals and cutpoints for these grade levels are preliminary estimates only »In DataDirector, ORF scores are color-coded to reflect these estimates

16 Instructional Recommendation Report—4th grade

17 Instructional Recommendation Report—3rd grade

18 Instructional Recommendation Report cont’d As a result of what we did and didn’t have for 4th and 5th grade ORF on the DD reports, we began to reflect on and debate the alignment of the measure at the upper elementary grade levels with the Michigan GLCEs. We also debated findings by some researchers that ORF and comprehension are correlated. We continue to use DIBELS as a screening measure at upper elementary (and lower elementary as well), but at these upper grade levels we are especially careful to use it in combination with other measures more closely aligned with the GLCEs.

19 Instructional Recommendation Report cont’d We also use some caution at 3rd grade –For example, R.WS.03.07: “apply the following aspects of fluency: pauses and emphasis, punctuation cues, intonation, and automatic recognition of identified grade-level specific words and sight words while reading aloud familiar grade-level text.” –Research by Michael Pressley and others

20 Instructional Recommendation Report cont’d We use the DIBELS Instructional Recommendation Report (and other DIBELS reports) as one piece of data for identifying reading interventions (RTI) for individual students –Link to student profile report in DD (click on name) provides us with additional data we use in this process of identification and monitoring

21 Instructional Recommendation Report cont’d The ability to sort students in the DataDirector DIBELS reports by Instructional Recommendation (K-3) and/or scores is extremely helpful in identifying groups of children with similar strengths and/or areas in need of additional support/intervention

22 Distribution Summary Report

23 [screenshot]

24 Distribution Summary Report cont’d Provides aggregate information on “risk status” and instructional recommendation for a grade level at a given benchmark time point –Can also be used to identify the % of students at each “tier” (goal: 5-15-80). The report is also “interactive” and can be used to identify who is at a particular risk status within a particular area
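The tier check described above (roughly 5% intensive, 15% strategic, 80% benchmark) amounts to a percentage breakdown by risk status. A minimal sketch, with made-up status labels and counts for illustration:

```python
# Illustrative sketch of comparing a grade level's risk-status
# distribution to the 5-15-80 goal. Labels and data are hypothetical.
from collections import Counter

def tier_percentages(statuses):
    """Return the percent of students at each risk status."""
    counts = Counter(statuses)
    total = len(statuses)
    return {tier: round(100 * n / total, 1) for tier, n in counts.items()}

# Example grade level of 100 students:
statuses = ["benchmark"] * 78 + ["strategic"] * 16 + ["intensive"] * 6
print(tier_percentages(statuses))
# {'benchmark': 78.0, 'strategic': 16.0, 'intensive': 6.0}
```

Here the intensive group (6%) is slightly above the ~5% goal, the kind of gap the Distribution Summary makes visible at a glance.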

25 [screenshot]

26 “Performance over time” reports Summary of Effectiveness Report Class Progress Report Progress Monitoring Graphs –Individual –Group (e.g., intervention group)

27 Summary of Effectiveness Report

28 Allows you to select the two benchmark times you want to compare (beginning, middle, end) Can be used to identify whether a student is making the needed progress (vs. staying in the same intensive or strategic category)

29 Summary of Effectiveness Report cont’d The Summary of Effectiveness report in DD is very similar to the University of Oregon’s –The addition of “color coding” makes the results “jump out” It is easier to see progress over time by student when comparing time points (you have the score/performance as well as the color coding) The report prompted difficult conversations within the district regarding the interventions –Effectiveness of the intervention (e.g., number of students making benchmark or moving from intensive to strategic) –Fidelity of the intervention (e.g., length) –Staff training on the different intervention models –Match between student need, intervention, and the measures being used to monitor progress

30 Class Progress Report Allows you to track the progress of students in a class (or grade level) across the multiple benchmark time points during the year Color coding of student scores allows you to easily “track” progress There is also a class progress graph available –Color coding on this report is based on benchmark time point (not risk status or instructional recommendation as in the other DIBELS reports)

31 [screenshot]

32 [screenshot]

33 Progress Monitoring Graphs Can be run for an individual student or for small groups of students at the same time (e.g., an intervention group) Select the start and end points (can show multiple years or a single year) Unlike the University of Oregon reports, these do not show an aimline or benchmark goals –Because this information is not on the DataDirector reports, we had to look more closely at the data and what the gains were really “saying” (e.g., not just whether there are gains, but whether they are enough to get the student to benchmark, especially given the intervention provided) Not having the histogram report in DataDirector also encouraged us to do this.
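The aimline the slide refers to is the straight line from a student's starting score to the benchmark goal; "enough gain" means weekly scores staying at or above that line. A hypothetical sketch of that check (function names and inputs are illustrative, not a DataDirector feature):

```python
# Illustrative aimline check: is a student's weekly progress enough
# to reach the benchmark goal, not just positive? Names are assumptions.

def aimline(start_score, goal_score, weeks):
    """Score the student must reach each week to stay on track for the goal."""
    slope = (goal_score - start_score) / weeks
    return [round(start_score + slope * w, 1) for w in range(weeks + 1)]

def on_track(observed_scores, line):
    """True if each observed weekly score meets or exceeds the aimline."""
    return all(s >= a for s, a in zip(observed_scores, line))

# A student at 20 who must reach 40 in 10 weeks needs ~2 points/week:
line = aimline(20, 40, 10)
# Gaining 1 point/week shows "gains" but is still off track:
print(on_track([20, 21, 22, 23], line))  # False
```

This is exactly the distinction the slide draws: a rising graph alone does not tell you whether the intervention is closing the gap by the benchmark date.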

34 [screenshot]

35 What we’ve learned DataDirector DIBELS reports are not identical to the University of Oregon’s –Some do not exist at all in DD (but enhancement requests have been put in for some, such as the Individual Student Performance Profile report) –Some exist but are not quite the same (e.g., Progress Monitoring reports)…but enhancement requests have been put in –Some exist and are similar, but have additional features that make them more helpful “Interactive” links to the student profile report Color coding of risk status or instructional recommendation

36 How we’re changing As a result of not having some reports available, and of having additional features on some DD reports, we’ve looked much more carefully at the data/results and the effectiveness of interventions, examining more carefully what the data says about student needs and student progress. As a result of not having some “features” on some reports (e.g., Instructional Recommendation for 4th and 5th grade), we’ve looked more closely at all of our assessments, their alignment with the GLCEs, and their use/application within the district.

37 How we’re changing cont’d We are currently creating district assessments for grades 2-5 that measure student growth relative to the GLCEs across the school year. Given 3 times a year Using DataDirector to create, scan, score, and report the new assessments.

38 Questions?

39 Contact Information: Erika Bolig Director of Assessment & Data Analysis West Ottawa Public Schools bolige@westottawa.net

