Measuring Changes in Teachers’ Science Content Knowledge Dr. Anne D’Agostino Compass Consulting Group, LLC.



2 Overview of Session
• Introductions
• Importance of Measuring Teacher Content Knowledge
• Measuring Teacher Content Knowledge
• Discussion

3 Importance of Measuring Teacher Content Knowledge
US ED MSP Program Office Perspective
The Government Performance and Results Act (GPRA) is a law passed in 1993 that requires federally funded agencies to develop and implement an accountability system based on performance measurement.

4 GPRA requires all federally funded programs to:
• Outline long-term and annual performance goals that include outcomes,
• Develop indicators to assess performance goals,
• Collect and analyze data on the indicators, and
• Report progress toward achieving performance goals based on the data collected and analyzed.
MSP GPRA indicator related to content knowledge: the percentage of MSP teachers who significantly increase their content knowledge, as reflected in project-level pre- and post-assessments.

5 GPRA and PART are used by Congress to inform appropriations; Congress wants concrete evidence of results.
More uniform and higher-quality data will enable the MSP Program Office to assess progress toward MSP performance goals more accurately and convincingly.
This is challenging when it comes to measuring changes in teachers’ content knowledge.

6 MSP Grantee Perspective
• Measuring teacher content knowledge has the potential to affect funding of the program at the federal level
• It is a US ED annual reporting requirement
• It is one of the more direct variables individual partnerships can use to measure and demonstrate program outcomes

7 Measuring Teacher Content Knowledge
Because of the variability in the activities projects are implementing to improve teacher content knowledge, the use of a single instrument across all projects is not a viable solution. Some instruments are better than others. Search for or develop an instrument that:
• Is aligned with the content of your MSP professional development activity;
• Has been piloted with teachers similar to those participating in your program;

8 • Provides evidence of reliability, or the extent to which an instrument yields consistent, stable, and uniform results over repeated administrations under the same conditions each time;
• Provides evidence of validity, or the extent to which 1) the instrument measures the skills it sets out to measure and 2) the inferences and actions made on the basis of the scores are appropriate and accurate.
Discussion Question: What instruments have grantees been using? What are the properties of those instruments?
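Reliability, as defined above, can be estimated empirically. As an illustration not drawn from these slides, test-retest reliability is commonly computed as the Pearson correlation between scores from two administrations of the same instrument; the teacher scores below are hypothetical.

```python
# Sketch: test-retest reliability as the Pearson correlation between
# two administrations of the same instrument (scores are hypothetical).

def pearson_r(x, y):
    """Pearson correlation coefficient for two equal-length score lists."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    var_x = sum((a - mean_x) ** 2 for a in x)
    var_y = sum((b - mean_y) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical scores from two administrations under the same conditions
first_admin = [12, 15, 9, 18, 14, 11]
second_admin = [13, 14, 10, 17, 15, 10]

print(f"Test-retest reliability: {pearson_r(first_admin, second_admin):.2f}")
```

A coefficient near 1.0 indicates stable scores across administrations; values much below that suggest the instrument is too noisy for pre/post comparisons.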

9 Potential Tests
Assessing Teacher Learning About Science Teaching (ATLAST) - Horizon Research, Inc.
• Three instruments developed to assess changes in teacher content knowledge in the following content areas: force and motion, plate tectonics, and flow of matter and energy in living systems
Diagnostic Teacher Assessments in Mathematics and Science (D-TAMS) - University of Louisville
• Three pre-post assessments of middle school science teachers’ content knowledge in physical science, earth/space science, and life science
Misconception Oriented Standards-based Assessment Resource for Teachers (MOSART) - National Science Foundation
• Multiple-choice instruments designed to measure gains in teacher content knowledge in settings such as summer institutes

10 Resources
Overview of reliability and validity:
• Crocker, L. & Algina, J. (1986). Introduction to classical and modern test theory. Belmont, CA: Wadsworth.
• Shultz, K. S. & Whitney, D. W. (2005). Measurement theory in action: Case studies and exercises. Thousand Oaks, CA: Sage.
Information on established instruments

11 Information on developing instruments:
• Crocker, L. & Algina, J. (1986). Introduction to classical and modern test theory. Belmont, CA: Wadsworth.
• Haladyna, T. M. (1999). Developing and validating multiple-choice test items (2nd ed.). Mahwah, NJ: Erlbaum.
• Osterlind, S. J. (1998). Constructing test items: Multiple-choice, constructed response, performance, and other formats. Boston, MA: Kluwer.
• Shultz, K. S. & Whitney, D. W. (2005). Measurement theory in action: Case studies and exercises. Thousand Oaks, CA: Sage.

12 When To Administer Your Test
• Administer the pretest immediately prior to the MSP activity or college course.
• Administer the posttest after the MSP activity and follow-up are complete, but within the same school year as the initial MSP activity.

13 How to Determine Whether Teachers Have Made Significant Gains
To report for GPRA, US ED needs a way to tell whether the observed gains in teacher content knowledge between the pretest and the posttest are significant.
US ED is proposing that projects use a statistical test called a dependent t-test to determine whether teachers have made significant gains.
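A minimal sketch of the dependent (paired) t-test proposed above, assuming only that each teacher has a matched pretest and posttest score; the scores below are hypothetical, and US ED’s actual spreadsheet formulas are not reproduced here.

```python
import math

def dependent_t_test(pre, post):
    """Paired (dependent) t-test on matched pre/post scores.
    Returns the t statistic and degrees of freedom."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    # Sample variance of the differences (n - 1 in the denominator)
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
    se = math.sqrt(var_d / n)  # standard error of the mean difference
    return mean_d / se, n - 1

# Hypothetical pre/post content-knowledge scores for eight teachers
pre = [52, 60, 48, 55, 63, 50, 58, 45]
post = [58, 66, 55, 57, 70, 54, 65, 50]

t, df = dependent_t_test(pre, post)
print(f"t = {t:.2f} with {df} degrees of freedom")
```

The resulting t is compared against the critical value of a t distribution (roughly 2.36 for df = 7 at a two-tailed alpha of .05); a larger |t| indicates a statistically significant mean gain.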

14 Proposed process
• The federal program office would supply grantees with an Excel spreadsheet with embedded formulas to do the needed calculations
• Grantees would enter into the spreadsheet the pretest and posttest scores for the teachers they test
• Grantees would use one spreadsheet per test
• The spreadsheet would calculate the needed statistics and produce a report showing the total number of teachers and the number who showed significant gains
• Grantees would report this information to US ED through the Annual Performance Report
• US ED would aggregate the information from all grantees and use it for GPRA reporting

15 Discussion
• Experiences developing content knowledge instruments
• Challenges encountered in measuring changes in content knowledge
• Questions about measuring changes in content knowledge