Fidelity of Implementation in Scaling-up Highly Rated Science Curriculum Units for Diverse Populations Carol O’Donnell and Sharon Lynch The George Washington University Michael Szesze, Suzanne Merchlinsky Montgomery County Public Schools Prepared for the IERI PI Meeting September 10, 2004 Do not cite, quote, or distribute without permission from authors.

SCALE-uP: A Quasi-Experimental Design Design: 5 pairs of middle schools matched on demographic characteristics. Highly rated curriculum units randomly assigned to treatment or matched comparison schools. Pre- and posttesting. Sample: Approximately 3,000 MCPS middle school students for each grade level (6-8) per year.
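The within-pair random assignment described above can be sketched in a few lines of Python. This is a minimal illustration of the design logic only; the school names and the seed are placeholders, not the actual MCPS schools or procedure.

```python
import random

def assign_within_pairs(pairs, seed=2004):
    """For each matched pair of schools, randomly assign one school to
    treatment and the other to its matched comparison condition."""
    rng = random.Random(seed)
    assignment = {}
    for school_a, school_b in pairs:
        treated = rng.choice([school_a, school_b])
        comparison = school_b if treated == school_a else school_a
        assignment[treated] = "treatment"
        assignment[comparison] = "comparison"
    return assignment

# Hypothetical matched pairs (placeholder names).
pairs = [("School A1", "School A2"), ("School B1", "School B2"),
         ("School C1", "School C2"), ("School D1", "School D2"),
         ("School E1", "School E2")]
assignment = assign_within_pairs(pairs)
```

Randomizing within matched pairs, rather than across the full pool, preserves the demographic matching while still giving each school an equal chance of receiving the treatment units.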

Implementation Research Questions Experimental - Are highly rated science curriculum units more effective than the “standard menu”? If yes, what happens when data are disaggregated (gender, ethnicity, and FARMS, ESOL, and SPED eligibility)? Ethnographic - How do these units function in middle school classrooms?
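The disaggregation step above amounts to computing subgroup means of an outcome measure. A minimal sketch, assuming hypothetical field names and illustrative scores (the actual SCALE-uP analysis would use its own variables and inferential tests):

```python
from collections import defaultdict

def disaggregate(records, group_keys):
    """Mean posttest score per subgroup, keyed by the chosen grouping
    variables (e.g. condition crossed with FARMS eligibility)."""
    totals = defaultdict(lambda: [0.0, 0])
    for rec in records:
        key = tuple(rec[k] for k in group_keys)
        totals[key][0] += rec["posttest"]
        totals[key][1] += 1
    return {key: total / n for key, (total, n) in totals.items()}

# Illustrative records only; not project data.
records = [
    {"condition": "treatment", "farms": True, "posttest": 72.0},
    {"condition": "treatment", "farms": False, "posttest": 81.0},
    {"condition": "comparison", "farms": True, "posttest": 65.0},
    {"condition": "comparison", "farms": False, "posttest": 78.0},
]
means = disaggregate(records, ["condition", "farms"])
```

The same function disaggregates by any combination of subgroup variables simply by changing `group_keys`.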

Scale-up Research Questions Experience--Do students in schools in the first year of implementation have better outcomes than those in the second year? Scale--Do students in schools at small scale (5 schools) have better outcomes than students in schools at large scale (37 schools)? Fidelity of Implementation (FOI)--Do students whose teachers enact the curriculum as intended have higher outcomes than those whose teachers enact it with less fidelity of implementation (measured by a classroom observation instrument currently under development)?

Fidelity of Implementation: Definitions Determination of how well a program is implemented in comparison with the original program design (Mihalic, 2002). Degree to which program providers implement programs as intended by the program developer (Dusenbury, Brannigan, Falco, & Hansen, 2003).

Work to Date Developed a classroom observation protocol to capture quality of delivery and participant responsiveness. Piloted the protocol before implementation, then during implementation (analysis in process). Piloted pre- and post-observation interviews between teachers and observers. Preliminary observations show that some aspects of classroom implementation are consistent with the unit’s instructional “intent” (high fidelity); others are not (lower fidelity).

Teachers’ Questions about FOI Can we modify the unit to meet the needs of diverse student populations (SPED, ESOL, etc.)? What if we have requirements to meet state indicators (e.g. vocabulary) not covered by unit? Can we use instructional practices we typically use in the classroom (e.g. exit cards, warm ups)? How do we deal with student behavior issues? Can we add supplemental readings?

Teachers' Questions about FOI Prompted a Set of FOI Guidelines: Fidelity Is… Adhering to unit and lesson purpose, goals, and objectives. Adhering to unit pedagogical approaches. Following lesson sequence. Using the recommended equipment or materials. Making an adaptation to the lesson that does NOT change the nature or intent of the lesson.

Fidelity Is Not… Reducing or modifying unit goals and objectives. Reconfiguring the lesson so that other instructional practices gradually replace parts of the new unit. Reducing student expectations inherent to the unit. Varying grouping strategies outlined in the unit. Changing the unit’s organizational patterns. Substituting other curriculum materials or lessons for those described by the unit.

Teachers' Questions Prompted a More Expansive Definition of FOI: Components Adherence to the unit – Unit delivered as designed. Exposure – Number of lessons; length of time. Quality of delivery – The manner in which a teacher implements an intervention. Participant responsiveness – The extent to which students are engaged by the intervention. Program differentiation – Whether critical features (e.g. instructional strategies) that distinguish the intervention from the traditional curriculum are present or absent from implementation. Adapted from: Dane & Schneider (1998); Dusenbury, Brannigan, Falco, & Hansen (2003)
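The five components above can be recorded as one structured score per classroom observation. The sketch below is illustrative only: the 0-4 scale and the unweighted composite are assumptions for demonstration, not the project's actual rubric or weighting.

```python
from dataclasses import dataclass

@dataclass
class FOIObservation:
    """One classroom observation scored on the five FOI components
    (component names follow Dane & Schneider, 1998; the 0-4 scale
    is an illustrative assumption)."""
    adherence: int
    exposure: int
    quality_of_delivery: int
    participant_responsiveness: int
    program_differentiation: int

    def composite(self):
        # Simple unweighted mean as a placeholder composite score.
        scores = [self.adherence, self.exposure, self.quality_of_delivery,
                  self.participant_responsiveness, self.program_differentiation]
        return sum(scores) / len(scores)

# Hypothetical observation: strong adherence, weaker responsiveness.
example = FOIObservation(adherence=4, exposure=3, quality_of_delivery=3,
                         participant_responsiveness=2, program_differentiation=3)
```

Keeping the components as separate fields, rather than collapsing them at data entry, lets later analyses weight or examine each dimension of fidelity on its own.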

Questions to Ponder Can we expect teachers and students to only exhibit instructional qualities inherent to the intervention? How does a measure of FOI account for pre-existing good teaching practices vs. those prompted by the intervention curriculum units? Why are certain aspects of instructional delivery consistently absent, despite unit support?

Next Steps Finalize the instrument that captures quality of delivery and participant responsiveness. Develop means of measuring adherence, exposure, and program differentiation. Validate all FOI instruments. Test for reliability. Quantify classroom FOI observation scores. Analyze the correlation between FOI and student outcomes.
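The final step above, relating quantified FOI scores to student outcomes, could begin with a simple Pearson correlation. A minimal stdlib sketch with illustrative numbers (not project results; the actual analysis would likely use multilevel models given students nested in classrooms):

```python
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical classroom-level data: composite FOI score and mean
# pre-to-post gain for each of five observed classrooms.
foi_scores = [2.1, 3.4, 2.8, 3.9, 1.7]
gain_scores = [4.0, 7.5, 5.2, 8.1, 3.3]
r = pearson_r(foi_scores, gain_scores)
```

A positive r would be consistent with the FOI research question on the earlier slide: students whose teachers enact the unit with higher fidelity showing higher outcomes.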