Scientific Validation Of A Set Of Instruments Measuring Fidelity Of Implementation (FOI) Of Reform-based Science And Mathematics Instructional Materials.


Scientific Validation of a Set of Instruments Measuring Fidelity of Implementation (FOI) of Reform-based Science and Mathematics Instructional Materials

Dae Y. Kim, Principal Investigator
Amy Cassata, Co-Principal Investigator
Jeanne Century, Co-Principal Investigator

IES Grantee Panel on Fidelity of Implementation, September 7, 2011

Instrument Development

NSF-Funded Development Project
- Developed a framework for measuring the enactment of five reform-based mathematics and science curricula
- Created a suite of instruments based on this framework

Driving Research Questions
- What are the enacted components of reform-based instructional materials in mathematics and science that are common across specific programs?
- How can we rigorously and specifically measure multiple components of implementation?
- How can we collect data on and accumulate knowledge about implementation across multiple programs?

Instrument Development

Reform-based Curricula
- Full Option Science System (FOSS), Science and Technology for Children (STC), Science Companion, Science Education for Public Understanding Program, and Everyday Mathematics

"Critical Components"
- The measurable elements of the intended program model that are essential to its implementation
- May be shared across programs or unique to specific programs
- Can be organized into general categories:
  - Structural: design, procedures, organization, and built-in supports
  - Instructional: expectations for participants' actions, behaviors, and interactions as they enact and engage in the intervention (users and recipients)

Framework for Measurement

Process of Critical Component Identification
- Review of written materials to identify explicit and implicit components
- Interviews with program developers to articulate the "intended" program model and critical components
- Conversations with program users about critical components
- Iterative revision, reconciliation, and modification of the critical components list and the framework for organizing them

FOI Framework for Instructional Materials

Measuring Implementation

Suite of instruments measuring the operationalization of the identified critical components:
- Teacher Instructional Questionnaire
- Teacher Instructional Log
- Teacher Instructional Observation Protocol
- Teacher Interview Protocol
- School Leader Questionnaire
- School Leader Interview Protocol
- School-Wide Observation Protocol

Pilot tested in 200 classrooms in CPS (2008) and revised; field-tested in 265 classrooms in CPS (2009).


Classroom-Level Instruments

[Table: number of items per critical component category (Structural: Procedural; Structural: Educative; Instructional: Pedagogical; Instructional: Student Engagement) on each of the Teacher Instructional Questionnaire, the Teacher Instructional Observation Protocol, and the Teacher Instructional Log]

Next Steps: Validation

IES Funding: Goal 5 (Measurement)

Goals
- Establish validity and reliability within and across the three classroom-level instruments
- Develop a student questionnaire (grades 3-5) to triangulate measures of student engagement

Participants
- 50 schools across 4 sites (IL, WA, MA)
- All teachers, grades K-5 (N ≈ 1,000)
- All consented students, grades 3-5 (N ≈ 4,500)
- Schools use both Everyday Mathematics and a reform-based science curriculum

Year 1: Student Questionnaire
- 20- to 25-item questionnaire
- Literature review for items measuring Student Engagement critical components
- Item development and pilot testing
  - Cognitive interviews
  - Consultation with experts
- Field testing
  - 300 students in grades 3-5
  - Analysis of factorial validity and internal consistency
- Instrument revision
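Internal consistency of a set of questionnaire items is commonly screened with Cronbach's alpha. A minimal sketch of that computation (the score matrix below is illustrative, not data from this study):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Illustrative: 5 students answering 4 Likert-type engagement items
scores = np.array([
    [4, 4, 5, 4],
    [2, 3, 2, 2],
    [5, 5, 4, 5],
    [3, 3, 3, 2],
    [1, 2, 1, 1],
])
alpha = cronbach_alpha(scores)
```

Values near 1 indicate that the items vary together, consistent with their measuring a single construct; factorial validity would be examined separately (e.g., with a factor analysis of the item correlations).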

Year 2: Data Collection for Validation
- All teachers will complete:
  - Teacher Instructional Questionnaire – Math
  - Teacher Instructional Questionnaire – Science
- All students (grades 3-5) will complete:
  - Student Questionnaire – Math
  - Student Questionnaire – Science
- 300 teachers with additional observations and logs:
  - 100 teachers: 2 math observations and 4 math logs
  - 100 teachers: 2 science observations and 4 science logs
  - 100 teachers: 1 math and 1 science observation; 4 math and 4 science logs
- Student achievement data (grades 3-5)

Year 3: Analysis
- Inter-rater agreement (observation)
- Internal consistency of items
- Factorial structure of items
- Differences in scores across groups/content areas:
  - Science vs. mathematics
  - Grade level
  - Study site
- Cross-instrument consistency:
  - Teacher Instructional Questionnaire vs. Student Questionnaire
  - Teacher Instructional Observation Protocol vs. Teacher Instructional Log
- Predictive validity:
  - Mathematics and science achievement (grades 3-5)
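Inter-rater agreement for categorical observation codes is often quantified with Cohen's kappa, which corrects raw percent agreement for agreement expected by chance. A minimal sketch for two raters coding the same lessons (the codes below are illustrative, not study data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical codes of the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed proportion of exact agreements
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance-expected agreement from each rater's marginal code frequencies
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / n**2
    return (observed - expected) / (1 - expected)

a = ["high", "high", "low", "mid", "low", "high", "mid", "low"]
b = ["high", "mid",  "low", "mid", "low", "high", "mid", "mid"]
kappa = cohens_kappa(a, b)
```

Kappa ranges from below 0 (worse than chance) to 1 (perfect agreement); scale items, by contrast, would typically be checked with a weighted kappa or an intraclass correlation.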

Predicting Student Achievement

To what extent do composite indices on the four subcategories (SP, SE, IP, ISE) predict student achievement in science and mathematics?

Planned analysis for predictive validity:
- A 3-level fixed effects HLM model will explore the effect of the FOI composite indices, measured at the teacher (classroom) level, on individual students' science and mathematics achievement (grades 4-5)
- The prior year's science or math score will be entered as a covariate
- Other variables: student demographics, classroom characteristics
- Data for science and mathematics will be analyzed separately, then pooled
- Data for questionnaires, logs, and observations will be analyzed separately
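A 3-level model of this kind can be sketched as follows. The notation is illustrative only; the study's actual specification, covariates, and centering choices may differ:

```latex
% Level 1: student i in classroom j in school k
Y_{ijk} = \pi_{0jk} + \pi_{1jk}\,\mathrm{PriorScore}_{ijk}
        + \pi_{2jk}\,\mathrm{Demog}_{ijk} + e_{ijk}

% Level 2: classroom -- the FOI composite index enters here
\pi_{0jk} = \beta_{00k} + \beta_{01k}\,\mathrm{FOI}_{jk}
          + \beta_{02k}\,\mathrm{ClassChar}_{jk} + r_{0jk}

% Level 3: school
\beta_{00k} = \gamma_{000} + u_{00k}
```

The coefficient on the classroom-level FOI composite ($\beta_{01k}$) carries the predictive-validity question: whether higher measured fidelity is associated with higher student achievement after adjusting for prior scores and demographics.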

Summary
- The FOI framework clearly and specifically describes the nature of reform-based mathematics and science instructional materials so that these elements can be measured
- The framework has resulted in a set of instruments that are currently being validated
- The framework and related instruments have the potential to inform our understanding of reform-based STEM curricula and to accumulate knowledge about the elements that make them most effective