Lessons Learned from the Evaluation of an Inquiry Science Project
Presented by Sarah Brasiel, Leslie Grunden, and Eric Rolfhus

Outline of Presentation
- Project and Evaluation Overview
- Ensuring Quality Data for Informative Analysis
- Problem-Solving Data Collection Issues
- Presenting Findings that are of Utility for Program Improvement
- Documenting Implementation Issues
- Project Management
Project and Evaluation Overview
Inquiry-Based Science Kit Instruction Project
Based on the National Science Resources Center (NSRC) inquiry-centered science education reform model:
- High-quality K–8 curriculum
- Sustained professional development for teachers
- Material support
- Community and administrative support
- Assessment and evaluation
The NSRC was formed by the Smithsonian Institution and the National Academies in 1985.

Participants
- 4 Southwest school districts
- 10 schools
- Elementary students (n = 1,514) and teachers (n = 77) in grades 3–6
- Science coaches
Inquiry-Based Science Kits: Theory of Change
Teacher professional development and coaching, together with the inquiry-based science kits, are expected to increase teacher content and pedagogical knowledge and skills in science; this supports effective delivery of classroom instruction, which in turn improves student achievement.

Evaluation Components
- Teacher Knowledge and Skills: curriculum assessment performance
- Delivery of Classroom Instruction: fidelity of implementation
- Student Achievement: state standardized science assessment performance
Inquiry-Based Science Kits
- Full Option Science System (FOSS), developed by the Lawrence Hall of Science, University of California, Berkeley
- Science and Technology for Children (STC), developed by the National Science Resources Center (NSRC)
Ensuring Quality Data for Informative Analysis
Student Data
Issue: Unusual patterns in student kit assessment performance data
Resolution:
- Spent time documenting the issues and discussing them with the project leader to correct the data
- Requested item-level data
- Re-ran analyses and problem-solved data issues
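As one hedged illustration of what problem-solving data issues at the item level can look like (not the project's actual workflow), the Python sketch below screens a hypothetical item-level file for duplicate student records, out-of-range scores, and extreme item difficulties; the file name and column names (student_id, item_*) are assumptions.

```python
import pandas as pd

# Hypothetical item-level kit assessment file: one row per student,
# one 0/1 column per scored item, plus a student identifier.
df = pd.read_csv("kit_assessment_items.csv")
item_cols = [c for c in df.columns if c.startswith("item_")]

# Flag duplicate student records, which can distort total scores.
dupes = df[df.duplicated("student_id", keep=False)]
print(f"Duplicate student rows: {len(dupes)}")

# Count out-of-range item scores (anything other than 0 or 1).
invalid = (~df[item_cols].isin([0, 1]) & df[item_cols].notna()).sum()
print("Items with out-of-range values:\n", invalid[invalid > 0])

# Flag items almost everyone or almost no one answered correctly;
# extreme proportions often signal keying or data-entry problems.
p_values = df[item_cols].mean()
suspect = p_values[(p_values < 0.05) | (p_values > 0.95)]
print("Items with extreme difficulty:\n", suspect)
```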
Problem-Solving Data Collection Issues
Classroom Observations
Issue: If observers did not see a behavior, they were to leave the form item blank, resulting in skewed data
Recommended Resolution for the Future:
- Require a rating per item to document expected teacher and student behaviors (see the sketch below)
- Determine an acceptable benchmark of low versus high implementation
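A minimal sketch of the recommended approach, assuming items rated 0–3 and a placeholder benchmark of 60 (on a 0–100 scale) for high implementation; the rating scale, benchmark value, and column names are all assumptions, not the protocol's actual values.

```python
import pandas as pd

# Hypothetical observation ratings: one row per observation, one column
# per protocol item, each rated 0-3 (no blanks allowed under the
# recommended protocol).
obs = pd.read_csv("observation_ratings.csv")
item_cols = [c for c in obs.columns if c.startswith("item_")]

# Any remaining blank rating violates the "rating per item" rule.
missing = obs[item_cols].isna().sum().sum()
if missing:
    print(f"{missing} blank ratings; the protocol requires a rating per item")

# Implementation score: mean rating across items, scaled to 0-100.
obs["fidelity"] = obs[item_cols].mean(axis=1) / 3 * 100

# Assumed benchmark (placeholder): 60 or above counts as high implementation.
BENCHMARK = 60
obs["implementation"] = (obs["fidelity"] >= BENCHMARK).map(
    {True: "high", False: "low"}
)
print(obs[["fidelity", "implementation"]].describe(include="all"))
```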
Availability of Data
Issue: Not granted access to state data to perform propensity score matching
Resolution: Requested and was granted access to data within the participating districts (plus one additional district) to perform student-level propensity score matching (a sketch follows)
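As a hedged illustration of the matching step (not the evaluators' actual code), the sketch below estimates propensity scores with logistic regression and performs 1:1 nearest-neighbor matching with replacement; the file name, the covariate and outcome names (treated, prior_score, grade, frl, ell, science_score), and the matching design are all assumptions.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

# Hypothetical district extract: one row per student, with a binary
# treatment flag (kit instruction vs. not) and numeric baseline covariates.
df = pd.read_csv("district_students.csv")
covariates = ["prior_score", "grade", "frl", "ell"]  # assumed names

# Step 1: estimate propensity scores with logistic regression.
model = LogisticRegression(max_iter=1000).fit(df[covariates], df["treated"])
df["pscore"] = model.predict_proba(df[covariates])[:, 1]

# Step 2: 1:1 nearest-neighbor match on the propensity score.
treated = df[df["treated"] == 1]
control = df[df["treated"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
_, idx = nn.kneighbors(treated[["pscore"]])
matched_control = control.iloc[idx.ravel()]

# The matched sample supports a treated-vs-comparison contrast on the
# outcome (here a hypothetical state science score).
effect = treated["science_score"].mean() - matched_control["science_score"].mean()
print(f"Matched mean difference: {effect:.2f}")
```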
Presenting Findings that are of Utility for Program Improvement
Issue: Presentation of Findings Not Useful
The main users of the report did not find typical statistical displays useful.
Example: distribution of student performance at pretest and posttest
Resolution: Item Analysis with Stars for Areas of Low Effect
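A minimal sketch of the star-flagging idea, assuming paired 0/1 item scores at pretest and posttest matched by student and an illustrative small-effect threshold of d < 0.2; the file names, column layout, and threshold are assumptions, not values from the report.

```python
import pandas as pd

# Hypothetical item-level scores (0/1) at pretest and posttest,
# aligned by a shared student identifier.
pre = pd.read_csv("pretest_items.csv", index_col="student_id")
post = pd.read_csv("posttest_items.csv", index_col="student_id")

rows = []
for item in pre.columns:
    gain = post[item] - pre[item]
    # Paired effect size: mean gain over the SD of gains.
    sd = gain.std()
    d = gain.mean() / sd if sd > 0 else 0.0
    # Star items with a small effect so program staff can target
    # them for reflection and instructional follow-up.
    rows.append({"item": item, "effect": round(d, 2),
                 "flag": "*" if d < 0.2 else ""})

print(pd.DataFrame(rows).to_string(index=False))
```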
Resolution: Reflection for Areas of Low Effect
Take a look at this item. This item had a small effect for both students and teachers.
- What makes this item difficult?
- What instructional strategies might be used to improve understanding related to this item?
Classroom Observations
HLM Analysis Findings
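The slide names the analysis but not its specification, so as a hedged sketch: a two-level hierarchical linear model (students at level 1 nested within classrooms at level 2) could be fit as below with statsmodels; the file and variable names are placeholders, not the study's actual model.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical student-level file with a classroom identifier; the
# random intercept for classroom captures the nesting of students.
df = pd.read_csv("student_outcomes.csv")

# Fixed effects for treatment status and pretest score; variable
# names are assumed for illustration only.
model = smf.mixedlm(
    "science_score ~ treated + pretest_score",
    data=df,
    groups=df["classroom_id"],
)
result = model.fit()
print(result.summary())
```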
Documenting Implementation Issues
Issues Resolved Through Communication
- Biweekly meetings with the client
- Interim reports
- Discussion of potential implementation biases
- Recommendations for future improvement of both the program and the study design
Project Management
Deliverables
Issue: Late delivery of data to evaluators
Resolution: Changed the statement of work to replace hard deadlines (exact dates) for deliverables with deadlines expressed as a number of days following receipt of data