DOES THE USE OF DATA ANALYSIS TEAMING FOR STUDENT ACHIEVEMENT AND LEVEL OF STUDENT WORK IMPROVE STUDENT PERFORMANCE IN READING?
Christina M. Marco-Fies
Indiana University of Pennsylvania
Dissertation Defense
March 11, 2013
Dissertation Defense
- Rationale for study
- Literature review
- School district data collection timeline
- Study design
- Research questions and variables
- Procedures
- Statistical analyses and results
- Conclusions
- Committee discussion
Rationale for Study
- A Nation at Risk (NCEE, 1983)
- PISA results (Lemke et al., 2001)
- U.S. Dept. of Ed., Office of Special Education Programs (2005)
- NCLB (2001)
- ARRA/Race to the Top (U.S. Dept. of Ed., 2009)
Literature Review
- Response to Intervention (Batsche et al., 2006)
- Assessment
- Data analysis teaming
- Fidelity
- Reading research (NRP, 2000)
- Classroom walkthroughs (Teachscape, 2010)
School District Data Collection Timeline
Date            Activity
September 2006  School district begins collecting DIBELS data
January 2008    School district begins training and initiation of data analysis teaming for DIBELS data
Fall 2009       School district trains select staff with the Teachscape Classroom Walkthrough system
November 2009   School district staff begin to collect classroom walkthrough data
January 2010    School district data analysis teaming occurs for both DIBELS and classroom walkthrough data
Study Design
Each condition includes the variables Sex, Pre-ORF, and Post-ORF:
- Collecting DIBELS
- Data teaming for DIBELS
- Data teaming for DIBELS and walkthrough
Research Questions and Variables
Question 1: Does collecting DIBELS data increase the percentage of students reaching benchmark in reading compared to a national sample of students? Does student performance differ depending on sex?
Latent Variable                    Observed Variable                    Instrument or Source   Validity/Reliability
Reading data teaming strategy      No initiation of DAT                 School records         Excellent
Pre- and post-reading performance  ORF score, Winter to Spring '07      DIBELS ORF             Very good
Research Questions and Variables
Question 2: Does using data analysis teaming to discuss DIBELS data improve student performance in reading beyond levels that were attained when data were collected and not analyzed? Does student performance differ depending on sex?
Latent Variable                    Observed Variable                          Instrument or Source   Validity/Reliability
Sex                                Male/female                                School records         Excellent
Reading data teaming strategy      DIBELS DAT                                 School records         Excellent
Pre- and post-reading performance  ORF score, Winter to Spring '07 & '09      DIBELS ORF             Very good
Research Questions and Variables
Question 3: Does analyzing DIBELS data and walkthrough data for data analysis teaming improve student reading performance beyond no data analysis teaming or data analysis teaming for DIBELS data only? Does student performance differ depending on sex?
Latent Variable                    Observed Variable                                Instrument or Source   Validity/Reliability
Sex                                Male/female                                      School records         Excellent
Reading data teaming strategy      DIBELS and walkthrough DAT                       School records         Excellent
Pre- and post-reading performance  ORF score, Winter to Spring '07, '09, and '10    DIBELS ORF             Very good
Procedures
1. Archival data gathered from school district
2. Review data teaming logs
3. Analyze data
   - Winter '07 and Spring '07 DIBELS
   - Winter to Spring '07 and Winter to Spring '09 DIBELS
   - Winter to Spring '07, Winter to Spring '09, and Winter to Spring '10 DIBELS
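A minimal data-preparation sketch in Python, assuming a hypothetical archival export with columns such as student_id, period, and orf_wcpm (none of these names come from the study), showing how Winter-to-Spring improvement scores could be computed before analysis:

```python
# Hypothetical sketch, not the study's actual procedure: compute
# Winter-to-Spring DIBELS ORF improvement scores from an archival export.
import pandas as pd

# Example archival records: one row per student per benchmark period.
records = pd.DataFrame({
    "student_id": [1, 1, 2, 2],
    "period":     ["winter_2007", "spring_2007", "winter_2007", "spring_2007"],
    "orf_wcpm":   [42, 61, 55, 78],   # made-up scores in words correct per minute
})

# Pivot to one row per student, then subtract Winter from Spring.
wide = records.pivot(index="student_id", columns="period", values="orf_wcpm")
wide["improvement"] = wide["spring_2007"] - wide["winter_2007"]
print(wide)
```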
Statistical Analysis and Results
Sample
- 174 elementary school students, 1st through 4th grade
- Demographics
Complications
- Lack of data log availability
- Lack of national norms
- Lack of sample for all grades
- Lack of demographic information
Statistical Analysis and Results
Question 1: Does collecting DIBELS data increase the percentage of students reaching benchmark in reading compared to a national sample of students? Does student performance differ depending on sex?
Hypothesis: Collecting DIBELS data will not increase the percentage of students at benchmark.
Statistic: One-sample t-test
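As a rough illustration of the statistic named above, a one-sample t-test can compare the study sample's Winter-to-Spring improvement scores with a fixed norm value; the improvement scores below are invented, and 23.76 wcpm is the DIBELS norm improvement figure that appears on the next slide:

```python
# Illustrative sketch only: one-sample t-test of improvement scores against
# a national norm value. The improvement scores here are made up.
import numpy as np
from scipy import stats

improvement = np.array([12, 18, 25, 9, 30, 22, 15, 19])  # hypothetical wcpm gains
norm_improvement = 23.76                                  # DIBELS norm value from the results slide

t_stat, p_value = stats.ttest_1samp(improvement, popmean=norm_improvement)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```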
Statistical Analysis and Results
One-Sample t-Test for DIBELS Oral Reading Fluency: Winter to Spring 2007 Improvement and DIBELS Norms
Descriptive statistics (n, Mean, SD) were reported for the study sample (Winter 2007 ORF, Spring 2007 ORF, improvement) and for the DIBELS norms (Winter 2002 ORF, Spring 2002 ORF, improvement: 23.76).
One-sample t-test results:
- Improvement: p < .001
- Winter: p > .05
- Spring: p > .05
Note. Mean numbers are expressed in words correct per minute (wcpm).
Statistical Analysis and Results
Hypothesis was not supported
- Students in the study did not show as much improvement as the national sample
Possible reasons
- Demographic differences
- Instruction received
Statistical Analysis and Results
Question 2: Does using data analysis teaming to discuss DIBELS data improve student performance in reading beyond levels that were attained when data were collected and not analyzed? Does student performance differ depending on sex?
Hypothesis: Using DIBELS for DAT will improve student performance in reading.
Statistic: ANOVA-RM
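One way this kind of repeated-measures design could be analyzed in Python is with a mixed ANOVA (Time as the within-subject factor, Sex as the between-subjects factor) from the pingouin package; the data frame below is entirely made up and is not the study's data:

```python
# Hedged sketch of an ANOVA-RM with a between-subjects Sex factor using
# pingouin's mixed_anova. All values and column names are hypothetical.
import pandas as pd
import pingouin as pg

long = pd.DataFrame({
    "student": [1, 1, 2, 2, 3, 3, 4, 4],
    "sex":     ["M", "M", "F", "F", "M", "M", "F", "F"],
    "time":    ["winter", "spring"] * 4,
    "orf":     [40, 62, 51, 70, 38, 55, 47, 69],   # made-up wcpm scores
})

aov = pg.mixed_anova(data=long, dv="orf", within="time",
                     subject="student", between="sex")
print(aov[["Source", "F", "p-unc", "np2"]])  # np2 = partial eta squared
```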
Statistical Analysis and Results
Analysis of Variance – Repeated Measures for DIBELS Oral Reading Fluency, Winter to Spring 2007 and 2009
Descriptive statistics (n, Mean, SD) were reported for Winter 2007, Spring 2007, Winter 2009, and Spring 2009 ORF, each broken out by male and female students.
Statistical Analysis and Results
Analysis of Variance – Repeated Measures for DIBELS Oral Reading Fluency, Winter to Spring 2007 and 2009
ANOVA-RM: significant main effect of Time; the Sex effect and the Time*Sex interaction were not significant (p > .05).
Post Hoc Comparison of Means
          Sp 2007   W 2009   Sp 2009
W 2007       *       61.5*    78.2*
Sp 2007                *      60.9*
W 2009                          *
Note. Mean numbers are expressed in words correct per minute (wcpm). * Significant at the .001 level.
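The post hoc comparisons above could be reproduced, under the assumption that they are pairwise comparisons of period means, with paired t-tests evaluated at the .001 level reported on the slide; the scores below are hypothetical:

```python
# Assumed approach (not confirmed as the study's exact post hoc procedure):
# paired t-tests between all benchmark periods, judged at the .001 level.
from itertools import combinations
import numpy as np
from scipy import stats

scores = {   # hypothetical wcpm scores for the same five students
    "W2007":  np.array([40, 51, 38, 47, 44]),
    "Sp2007": np.array([62, 70, 55, 69, 60]),
    "W2009":  np.array([95, 110, 88, 102, 99]),
    "Sp2009": np.array([118, 130, 105, 121, 117]),
}

alpha = 0.001  # significance level used on the slide; a Bonferroni correction could also be applied
for a, b in combinations(scores, 2):
    diff = np.mean(scores[b] - scores[a])
    t, p = stats.ttest_rel(scores[a], scores[b])
    print(f"{a} vs {b}: mean diff = {diff:.1f} wcpm, p = {p:.4f}, "
          f"significant = {p < alpha}")
```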
Statistical Analysis and Results
Analysis of Variance – Repeated Measures for DIBELS Oral Reading Fluency Improvement Scores, 2007 and 2009
Descriptive statistics (n, Mean, SD) were reported for 2007 and 2009 improvement scores, broken out by male and female students.
ANOVA-RM: no significant effects of Year, Sex, or the Year*Sex interaction (p > .05).
Note. Mean numbers are expressed in words correct per minute (wcpm).
Statistical Analysis and Results
DIBELS ORF Winter and Spring Percentage of Students at Benchmark Levels, 2007 and 2009
                     AR Winter  AR Spring  SR Winter  SR Spring  LR Winter  LR Spring
First Grade (2007)      3%         4%        25%        28%        72%        68%
  Male                  3%         3%        22%        26%        75%        70%
  Female                3%         5%        28%        29%        69%        66%
Third Grade (2009)     14%        10%        26%        29%        60%        61%
  Male                 13%        10%        27%        28%        60%        61%
  Female               16%        10%        24%        29%        59%        60%
Friedman test: Overall p .05; Male p .05; Female p = .001
Wilcoxon test: Females p > .05
Note. AR = at risk; SR = some risk; LR = low risk.
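A sketch of the nonparametric tests reported above: a Friedman test on each student's risk category across the benchmark periods, with a Wilcoxon signed-rank test as a pairwise follow-up. The 0/1/2 risk coding and all values are assumptions for illustration, not taken from the study:

```python
# Hedged illustration of the Friedman and Wilcoxon tests on risk categories.
# Risk codes (0 = low risk, 1 = some risk, 2 = at risk) and data are made up.
import numpy as np
from scipy import stats

# Rows = students; columns = W2007, Sp2007, W2009, Sp2009 risk codes.
risk = np.array([
    [2, 1, 1, 0],
    [1, 1, 0, 0],
    [2, 2, 1, 1],
    [0, 0, 0, 0],
    [1, 0, 1, 0],
    [2, 1, 1, 1],
])

chi2, p = stats.friedmanchisquare(*risk.T)
print(f"Friedman: chi2 = {chi2:.2f}, p = {p:.4f}")

# Follow up one pair of periods only if the omnibus test is significant.
if p < 0.05:
    w, p_w = stats.wilcoxon(risk[:, 0], risk[:, 3])
    print(f"Wilcoxon W2007 vs Sp2009: W = {w:.1f}, p = {p_w:.4f}")
```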
Statistical Analysis and Results
Hypothesis was not supported
- Students showed growth in reading over time
- Students did not show significant improvement after DAT for DIBELS began
- Differences in benchmark levels
- No differences in improvement in risk levels
Possible reasons
- DAT is not effective
- Fidelity of strategies
Statistical Analysis and Results
Question 3: Does analyzing DIBELS data and walkthrough data for data analysis teaming improve student reading performance beyond no data analysis teaming or data analysis teaming for DIBELS data only? Does student performance differ depending on sex?
Hypothesis: DIBELS and walkthrough DAT will add to the improvement of student performance.
Statistic: ANOVA-RM
Statistical Analysis and Results
Analysis of Variance – Repeated Measures for DIBELS Oral Reading Fluency, Winter to Spring 2007 and 2010
Descriptive statistics (n, Mean, SD) were reported for Winter 2007, Spring 2007, Winter 2010, and Spring 2010 ORF, each broken out by male and female students.
Statistical Analysis and Results
Analysis of Variance – Repeated Measures for DIBELS Oral Reading Fluency, Winter to Spring 2007 and 2010
ANOVA-RM: significant main effect of Time; the Sex effect and the Time*Sex interaction were not significant (p > .05).
Post Hoc Comparison of Means
          Sp 2007   W 2010   Sp 2010
W 2007       *       68.5*    92.1*
Sp 2007                *      74.8*
W 2010                          *
Note. Mean numbers are expressed in words correct per minute (wcpm). * Significant at the .001 level.
Statistical Analysis and Results
Analysis of Variance – Repeated Measures for DIBELS Oral Reading Fluency Improvement Scores, 2007 and 2010
Descriptive statistics (n, Mean, SD) were reported for 2007 and 2010 improvement scores, broken out by male and female students.
ANOVA-RM: significant effect of Year; the Sex effect and the Year*Sex interaction were not significant (p > .05).
Note. Mean numbers are expressed in words correct per minute (wcpm).
Statistical Analysis and Results
DIBELS ORF Winter and Spring Percentage of Students at Benchmark Levels, 2007 and 2010
                      AR Winter  AR Spring  SR Winter  SR Spring  LR Winter  LR Spring
First Grade (2007)       3%         4%        25%        28%        72%        68%
  Male                   3%         3%        22%        26%        75%        70%
  Female                 3%         5%        28%        29%        69%        66%
Fourth Grade (2010)     12%        12%        27%        28%        61%        60%
  Male                  12.5%      13%        25%        28%        62.5%      59%
  Female                12%        12%        29%        27%        59%        62%
Friedman test: Overall p .05; Male p = .001; Female p .05
Wilcoxon test: Males p > .05
Note. AR = at risk; SR = some risk; LR = low risk.
Statistical Analysis and Results
Analysis of Variance – Repeated Measures for DIBELS Oral Reading Fluency, Winter to Spring 2009 and 2010
Descriptive statistics (n, Mean, SD) were reported for Winter 2009, Spring 2009, Winter 2010, and Spring 2010 ORF, each broken out by male and female students.
Statistical Analysis and Results
Analysis of Variance – Repeated Measures for DIBELS Oral Reading Fluency, Winter to Spring 2009 and 2010
ANOVA-RM: significant main effect of Time; the Sex effect and the Time*Sex interaction were not significant (p > .05).
Post Hoc Comparison of Means
          Sp 2009   W 2010   Sp 2010
W 2009       *        7.0*    30.6*
Sp 2009                *      13.8*
W 2010                          *
Note. Mean numbers are expressed in words correct per minute (wcpm). * Significant at the .001 level.
Statistical Analysis and Results
Analysis of Variance – Repeated Measures for DIBELS Oral Reading Fluency Improvement Scores, 2009 and 2010
Descriptive statistics (n, Mean, SD) were reported for 2009 and 2010 improvement scores, broken out by male and female students.
ANOVA-RM: significant effect of Year; the Sex effect and the Year*Sex interaction were not significant (p > .05).
Note. Mean numbers are expressed in words correct per minute (wcpm).
Statistical Analysis and Results
DIBELS ORF Winter and Spring Percentage of Students at Benchmark Levels, 2009 and 2010
                      AR Winter  AR Spring  SR Winter  SR Spring  LR Winter  LR Spring
Third Grade (2009)      14%        10%        26%        29%        60%        61%
  Male                  13%        10%        27%        28%        60%        61%
  Female                16%        10%        24%        29%        59%        60%
Fourth Grade (2010)     12%        12%        27%        28%        61%        60%
  Male                  12.5%      13%        25%        28%        62.5%      59%
  Female                12%        12%        29%        27%        59%        62%
Friedman test: Overall p > .05; Male p > .05; Female p > .05
Wilcoxon test: Overall p > .05; Males p > .05; Females p > .05
Note. AR = at risk; SR = some risk; LR = low risk.
Statistical Analysis and Results
Hypothesis was not supported
- Students showed growth in reading over time
- Significant improvement after DAT for DIBELS and walkthroughs began
- Differences in benchmark levels
- No differences in improvement in risk levels
Possible reasons
- Walkthrough DAT is not effective
- Fidelity of walkthrough DAT
Statistical Analysis and Results
Fidelity of data analysis teaming for DIBELS
- First Grade (2007): no data teaming
- Third Grade (2009): 99% fidelity, 5 data logs
- Fourth Grade (2010): 100% fidelity, 2 data logs
Fidelity of data analysis teaming for walkthroughs
- Fourth Grade (2010): no data logs found
Conclusions
Limitations
- Data not independent
- Fidelity of DAT
- History/treatment interaction
- Convenience sample
- Student differences
Implications for practice
Conclusions
Future research directions
- Fidelity of process
  - Assess fidelity
  - Component effectiveness
- Strategies
  - Selected strategy
  - Strategy fidelity
  - Time
- Implementing DAT for other areas
- Walkthroughs and achievement
  - Walkthrough models
- Student variables
- Replication studies
Committee Discussion
References
Batsche, G., Elliott, J., Graden, J., Grimes, J., Kovaleski, J., Prasse, D., ... Tilly, W. D. (2006). Response to intervention: Policy considerations and implementation. Alexandria, VA: National Association of State Directors of Special Education.
Lemke, M., Calsyn, C., Lippman, L., Jocelyn, L., Kastberg, D., Liu, Y., ... Bairu, G. (2001). Highlights from the 2000 Program for International Student Assessment. Washington, DC: National Center for Education Statistics.
National Commission on Excellence in Education (NCEE). (1983). A nation at risk: The imperative for educational reform. Washington, DC: U.S. Government Printing Office.
National Reading Panel. (2000). Teaching children to read: An evidence-based assessment of the scientific literature on reading and its implications for reading instruction. Bethesda, MD: National Institute of Child Health and Human Development.
References
No Child Left Behind Act of 2001, PL 107-110, 115 Stat. 1425, 20 U.S.C. §§ 6301 et seq.
Teachscape. (2010). Classroom walkthrough. Retrieved August 1, 2010, from walkthrough.html
United States Department of Education. (2009). Race to the Top program executive summary. Retrieved November 7, 2010, from summary.pdf
United States Department of Education, Office of Special Education Programs. (2005). Reading rockets: Toolkit for school psychologists. Washington, DC: Greater Washington Educational Telecommunications Association, Inc.