Melanie Taylor Horizon Research, Inc.

Similar presentations
Initiative on K-12 Teacher Preparation Natasha Speer, Univ. of Maine Tim Scott and Omah Williams, Texas A & M Noah Finkelstein, Univ. Colorado-Boulder.

Providing On-going Support for STEM Teachers Joan D. Pasley Horizon Research, Inc.
Designs to Estimate Impacts of MSP Projects with Confidence. Ellen Bobronnikov March 29, 2010.
ESTEEMS (ESTablishing Excellence in Education of Mathematics and Science) Project Overview and Evaluation Dr. Deborah H. Cook, Director, NJ SSI MSP Regional.
Neil Naftzger Principal Researcher Washington 21st CCLC Evaluation February 2015 Copyright © 20XX American Institutes for Research. All rights reserved.
Psychometric Properties of the Job Search Self-Efficacy Scale Investigators: Jeff Christianson Cody Foster Jon Ingram Dan Neighbors Faculty Mentor: Dr.
National Science Foundation: Transforming Undergraduate Education in Science, Technology, Engineering, and Mathematics (TUES)
PISA Partnership to Improve Student Achievement through Real World Learning in Engineering, Science, Mathematics and Technology.
What is program success? Wendy Tackett, Ph.D., Evaluator Valerie L. Mills, Project Director Adele Sobania, STEM Oakland Schools MSP, Michigan.
Evaluation 101: Everything You Need to Know to Get Started Evaluating Informal Science Education Media.
UNIVERSITY OF CALIFORNIA, IRVINE CAOMP ONLINE TOOL BOX.
What We Know About Effective Professional Development: Implications for State MSPs Part 2 Iris R. Weiss June 11, 2008.
Local Evaluation Overview and Preliminary Findings Diane Schilder, EdD.
IES e-PATT Grant e-PATT: Parents and Teachers Together.
1 Developing an Evaluation Plan _____________________ The Mathematically- Connected Communities MSP Developed for the February, MSP Conference Dr.
Project P.O.S.T. Preparing Outstanding Science Teachers: A Partnership of GCS & UNCG.
Measuring Changes in Teachers’ Mathematics Content Knowledge Dr. Amy Germuth Compass Consulting Group, LLC.
Measuring Changes in Teachers’ Science Content Knowledge Dr. Anne D’Agostino Compass Consulting Group, LLC.
Frances Lawrenz and The Noyce evaluation team University of Minnesota 1 Acknowledgement: This project was funded by National Science Foundation (Grant#REC )
Project Director – Dr. Mark Lung Dept of Natural & Environmental Sciences Western State College of Colorado Project Evaluator – Dr. Dave Shannon Educational.
Ensuring that Professional Development Leads to Improved Mathematics Teaching & Learning Kristen Malzahn Horizon Research, Inc. TDG Leadership Seminar.
Mathematics and Science Partnerships, Title II, Part B, NCLB.
Mathematics and Science Partnerships: Summary of the Performance Period 2008 Annual Reports U.S. Department of Education.
California Afterschool Outcome Measures Project UNIVERSITY OF CALIFORNIA, IRVINE PRINCIPAL INVESTIGATOR:
Lessons Learned about Going to Scale with Effective Professional Development Iris R. Weiss Horizon Research, Inc. February 2011.
Math and Science Partnership Program Approaches to State Longitudinal Evaluation March 21, 2011 San Francisco MSP Regional Meeting Patty O’Driscoll Public.
WORKING TOGETHER TO IMPROVE SCIENCE EDUCATION PRESENTED BY GIBSON & ASSOCIATES A CALIFORNIA MATH AND SCIENCE PARTNERSHIP RESEARCH GRANT WISE II Evaluation.
Evaluation of the Noyce Teacher Scholarship Program 2010 NSF Noyce Conference Abt Associates Inc. July 9, 2010.
South Jersey Math/Science Partnership at Rowan University Dr. Eric Milou Dr. Jill Perry SJMP.
Passport to Science MSP Science Program Indianapolis Public Schools.
Changes in Professional licensure Teacher evaluation system Training at Coastal Carolina University.
Education Performance Measures Session. Session Overview Combination of presentation and interactive components Time at the end of the session for Q&A.
MSP Program Evaluation Carol L. Fletcher, Ph.D. TRC Project Director Meetings 1/27/09 and 2/5/09.
Evaluation Requirements for MSP and Characteristics of Designs to Estimate Impacts with Confidence Ellen Bobronnikov February 16, 2011.
Course, Curriculum, and Laboratory Improvement (CCLI) Transforming Undergraduate Education in Science, Technology, Engineering and Mathematics PROGRAM.
Mathematics Performance Tasks Applying a Program Logic Model to a Professional Development Series California Educational Research Association December.
AIM: K–8 Science Iris Weiss Eric Banilower Horizon Research, Inc.
COLLEGE OF EDUCATION & HUMAN DEVELOPMENT Measuring Teacher Science Knowledge Tom Tretter University of Louisville
1 Innovative Teaching and Learning (ITL) Research Corinne Singleton SRI International.
CaMSP Science Assessment Webinar Public Works, Inc. Sharing Lessons Learned in the Development and Use of Science Assessments for CaMSP Teachers and Students.
MSP Summary of First Year Annual Report FY 2004 Projects.
Stimulating Research and Innovation for Preservice Education of STEM Teachers in High-Need Schools W. James Lewis Deputy Assistant Director, Education.
Preparing to Facilitate Mathematics Professional Development: Aiming for Alignment Between the Program and the Facilitator Nanette Seago Karen Koellner.
CAEP Standard 4 Program Impact Case Study
SNRPDP Self Evaluation
Evaluation Requirements for MSP and Characteristics of Designs to Estimate Impacts with Confidence Ellen Bobronnikov March 23, 2011.
Interview Responses: Job Satisfaction
Assessments for Monitoring and Improving the Quality of Education
Evaluation of An Urban Natural Science Initiative
Governor’s Teacher Network
TEACHER PERCEPTIONS OF TECHNOLOGY INTEGRATION IN A RURAL COMMUNITY
ASSESSMENT OF STUDENT LEARNING
AICE Sociology - Chapter 3
Teaching and Educational Psychology
Social Studies Coalition Professional Development Cohorts
PDS Coalition Meeting April 22, 2016
Overview of Student Learning Objectives (SLOs) for
The Ongoing Assessment Project (OGAP). Impact Findings
Connecticut Core Standards for Mathematics
Measuring Teachers’ Fidelity of Implementation
Texas Regional Collaboratives for Excellence in Science and Mathematics Teaching Grant Year
Introduction to the SEC
Common Core State Standards AB 250 and the Professional Learning Modules Phil Lafontaine, Director Professional Learning and Support Division.
Project: Assessing Teacher Learning About Science Teaching (ATLAST)
Evaluating the relationship between intensity of teacher professional development and student achievement in a Northern New Mexico inquiry-based science.
Presentation transcript:

AIM: K–8 Science Melanie Taylor Horizon Research, Inc.

MSP Theory of Action (Big Picture) This is the basic theory of action for the MSP program. PD is intended to lead to increased teacher knowledge and skills, which should lead to improved classroom practice, and ultimately greater student learning. Our goal is to better understand how different approaches to PD affect downstream outcomes.

AIM: K–8 Science AIM is an NSF-funded MSP RETA. AIM has the opportunity to develop instruments and collect data that no single MSP project has the resources to develop on its own.

Study Component 1 For Component 1 of the study, we need information on the PD and pre-/post-PD teacher knowledge measures.

Study Component 2 For Component 2, we want a teacher knowledge measure prior to the teaching of the unit; if a teacher was in Component 1, the post-PD assessment serves as this data point. The teacher survey will include items on instructional practices, teacher beliefs about teaching and learning, and contextual factors (e.g., alignment of MSP efforts with school/district priorities). Classroom observations will not be done on a large scale, but will be done in select areas.

Topics Force and Motion Populations and Ecosystems (i.e., Interdependence) Climate and Weather Evolution and Diversity Forms of Energy Properties of and Changes in Matter

Instruments

PD-Provider Log Captures what teachers experience in PD. PD providers complete a log at the end of each day of PD on the targeted topic; each log takes 15 minutes or less to complete and carries an honorarium of $15 per completed log. Our plan is for the log to be web-based, though a paper version is possible if needed.

PD Observations HRI will observe a sample of PD sessions.

Teacher Tracking System In addition to knowing what happens in PD, AIM needs to know which teachers attended each session.
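The tracking data described above amounts to a simple attendance record linking each teacher to the PD sessions they attended. As a hypothetical sketch (the field names and helper below are illustrative, not AIM's actual tracking format):

```python
# Hypothetical sketch of teacher-tracking data: which teachers
# attended which PD sessions. Field names are illustrative only.
from dataclasses import dataclass

@dataclass(frozen=True)
class Attendance:
    teacher_id: str   # stable identifier for the teacher
    session_id: str   # identifier for the PD session
    topic: str        # targeted topic, e.g., "Force and Motion"

def sessions_for_teacher(records, teacher_id):
    """Return the PD sessions a given teacher attended, sorted."""
    return sorted(r.session_id for r in records if r.teacher_id == teacher_id)

records = [
    Attendance("T001", "S1", "Force and Motion"),
    Attendance("T001", "S2", "Force and Motion"),
    Attendance("T002", "S1", "Force and Motion"),
]
print(sessions_for_teacher(records, "T001"))  # ['S1', 'S2']
```

Records of this shape would let the study join PD exposure (from the provider logs) to each teacher's pre/post assessment results.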

Teacher and Student Assessments Each assessment will take about 30 minutes to administer (all multiple choice). All teacher assessment items are set in the context of work that teachers do, e.g., using content knowledge to analyze student thinking.

Classroom Practice Teacher questionnaire: instructional practices, beliefs about effective instruction, teacher efficacy, and contextual factors that affect science instruction. Classroom observations: only for a subset of teachers.

Timeline Teacher assessments for Force and Motion and Populations and Ecosystems will be ready by Summer 2010. Student assessments for these two topics will be ready for use by Spring of the 2010–11 academic year. Data collection will begin in Summer 2010.

Timeline Assessments for other topics will be added in following years. We anticipate data collection continuing through at least the 2011-12 academic year.

Re-Cap: Two Main Components Relationship between PD and teacher content knowledge Relationships among teacher content knowledge, beliefs, classroom practice, and student learning Projects can participate in either or both components for one or more topics.

What’s Required to Participate Component 1: Complete PD-provider logs Submit teacher tracking data Administer content assessment to teachers pre- and post-PD Allow PD to be observed

What’s Required to Participate Component 2: Administer the content assessment to teachers prior to their teaching of the unit on the targeted topic (if a teacher is participating in Component 1, the post-PD assessment may be used). Administer the student content assessment at the beginning and end of the unit on the targeted topic. Administer the teacher questionnaire; teachers complete it while students are taking their post-test.

What Do Projects Gain by Participating? Opportunity to contribute to knowledge generation that will help the field in the future. Additional data for project-specific studies: PD-provider log data; Teacher assessment data; Teacher questionnaire data; and Student assessment data.

We are looking for a set of projects that vary in their approaches to professional development and that have different school contexts. We will try to include as many projects as possible in the study, but need to make sure we get the right mix.

Projects Not Participating for a Given Topic We will be recruiting teachers to pilot instruments (on-line). If you are willing to disseminate information to teachers, we will send you emails and ask that you forward them to potential participants. Teachers will be paid an honorarium for participating in a pilot and their identities will be kept confidential.

How to Contact Us If you might be interested in participating: http://www.horizon-research.com/aim/signup How to contact AIM: aim@horizon-research.com

Also of Possible Interest MSP-KMD has also developed a series of “knowledge reviews” that share research findings and practice-based insights: Deepening teacher content knowledge Preparing and deploying teacher leaders Involving STEM faculty

Also of Possible Interest MSP-KMD has developed a searchable, on-line, database with information about instruments used to assess teacher content knowledge (mathematics and science, K–12). The database currently contains summaries of 144 instruments.

Can Search By: Content area; Grade levels; Nature of instrument, e.g., multiple-choice/constructed-response assessments, assessments that include a scale score with information about reliability and validity, interview protocols, and observation protocols.
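Filtering instrument summaries by these facets could be pictured as below. This is a hypothetical sketch only: the instrument entries, field names, and `search` helper are illustrative and do not reflect the actual MSP-KMD database schema.

```python
# Hypothetical sketch of faceted search over instrument summaries
# (content area, grade level, instrument type). Data is illustrative.
instruments = [
    {"name": "Instrument A", "content": "science",
     "grades": range(3, 9), "type": "multiple-choice assessment"},
    {"name": "Instrument B", "content": "mathematics",
     "grades": range(6, 13), "type": "observation protocol"},
]

def search(instruments, content=None, grade=None, type_=None):
    """Keep instruments matching every facet the caller specifies."""
    hits = []
    for inst in instruments:
        if content is not None and inst["content"] != content:
            continue
        if grade is not None and grade not in inst["grades"]:
            continue
        if type_ is not None and inst["type"] != type_:
            continue
        hits.append(inst["name"])
    return hits

print(search(instruments, content="science", grade=5))  # ['Instrument A']
```

Each unspecified facet is simply skipped, so a query can combine any subset of the search criteria.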

These resources are located at: http://www.mspkmd.net/

Disclaimer The instructional practices and assessments discussed or shown in this presentation are not intended as an endorsement by the U.S. Department of Education.