Professional Development Activity Log: Comparing Teacher Log and Survey Approaches to Evaluating Professional Development (AERA Annual Meeting, Montreal)

Presentation transcript:

Professional Development Activity Log: Comparing Teacher Log and Survey Approaches to Evaluating Professional Development
AERA Annual Meeting, Montreal, April 11, 2005
Symposium: Evaluating the Quality of Professional Development: Implications for Districts and States
Kwang Suk Yoon and Reuben Jacobson, American Institutes for Research

Overview
Comparisons between teacher logs and surveys
Teachers' experience with logs
Lessons learned
Next steps

Professional Development Activity Log (PDAL)
The PDAL is a web-based, self-administered, longitudinal data collection tool that lets teachers record their professional development experiences in detail, guided by a series of structured prompts.
Teachers log on to a password-protected web account and fill out their PDAL at regular intervals.
Visit www.PDAL.net for more information.

PDAL Entries
Name of activity
Number of hours spent on each activity and its duration
Whether the activity is a one-time or continuous event (e.g., recurring over a number of months)
Type of activity (e.g., workshop, summer institute, study group)
Purpose of activity (e.g., strengthening subject matter knowledge)
PD quality features (e.g., active learning, coherence, collective participation)
Content focus (e.g., algebraic concepts: absolute values, use of variables)
Instructional practice – instructional topics covered in each activity (e.g., use of calculators, computers, or other educational technology)
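
To make the structure of a log entry concrete, here is a minimal sketch of how the fields above could be represented as a record. The class, field names, and example values are hypothetical illustrations, not the actual PDAL data model.

```python
# A minimal sketch (not the actual PDAL schema) of one log entry,
# using hypothetical field names drawn from the entry list above.
from dataclasses import dataclass, field
from enum import Enum


class ActivityType(Enum):
    WORKSHOP = "workshop"
    SUMMER_INSTITUTE = "summer institute"
    STUDY_GROUP = "study group"
    OTHER = "other"


@dataclass
class PDALEntry:
    teacher_id: str                  # anonymized teacher identifier
    activity_name: str               # name of the PD activity
    month: str                       # reporting month, e.g. "2004-09"
    contact_hours: float             # hours spent during this month
    is_recurring: bool               # one-time vs. continuing activity
    activity_type: ActivityType      # workshop, summer institute, ...
    purposes: list[str] = field(default_factory=list)              # e.g. "strengthen subject matter knowledge"
    quality_features: list[str] = field(default_factory=list)      # e.g. "active learning", "coherence"
    content_focus: list[str] = field(default_factory=list)         # e.g. "algebraic concepts: use of variables"
    instructional_topics: list[str] = field(default_factory=list)  # e.g. "use of calculators"


# Example: a single monthly entry for one teacher.
entry = PDALEntry(
    teacher_id="T001",
    activity_name="District algebra workshop",
    month="2004-09",
    contact_hours=6.0,
    is_recurring=True,
    activity_type=ActivityType.WORKSHOP,
    purposes=["strengthen subject matter knowledge"],
    quality_features=["active learning", "collective participation"],
    content_focus=["algebraic concepts: use of variables"],
    instructional_topics=["use of calculators"],
)
print(entry.activity_name, entry.contact_hours)
```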

Why PDAL?
Collects disaggregated information about specific PD activities
– Increases the level of specificity of PD data and reduces bias introduced by gross data aggregation
Gathers accurate, reliable, and time-sensitive information
– Minimizes the recall problems of retrospective reports
Tailors technical assistance to teachers based on their response patterns
Allows teachers to review their own logs
– Teachers can reflect on their own PD experiences
Generates context-sensitive questions (see the sketch below)
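
As one illustration of what "context-sensitive questions" could look like in practice, the sketch below branches the next prompt on a teacher's previous responses. The rules, function name, and question wording are hypothetical; the presentation does not describe PDAL's actual prompting logic.

```python
# A hypothetical sketch of context-sensitive prompting: the follow-up
# question shown to a teacher depends on what was logged previously.
# These rules and question texts are illustrative only.

def next_prompt(previous_entry: dict) -> str:
    """Return a follow-up question based on the previous log entry."""
    if previous_entry.get("is_recurring"):
        # A continuing activity: ask about this month's installment.
        return (f"Last month you logged '{previous_entry['activity_name']}' as a "
                f"continuing activity. How many contact hours did it involve this month?")
    if previous_entry.get("contact_hours", 0) == 0:
        # No PD logged: confirm rather than assume missing data.
        return "You reported no professional development last month. Is that correct?"
    # Default prompt for a newly reported, one-time activity.
    return "Did you participate in any new professional development activities this month?"


print(next_prompt({"activity_name": "District algebra workshop",
                   "is_recurring": True, "contact_hours": 6.0}))
```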

“Black Box” of Survey Data [figure omitted: hypothetical data]

Fine-grained log-level data on contact hours: disaggregated by teacher, by activity, and by time [figure omitted: hypothetical data]
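
The contrast between the survey "black box" and fine-grained log data can be shown with a short aggregation example: log records keep each teacher-activity-month cell, and the survey-style total is what remains after rolling them up. The data and column names below are hypothetical.

```python
# Hypothetical log-level records: one row per teacher, activity, and month.
import pandas as pd

logs = pd.DataFrame([
    {"teacher": "T001", "activity": "Algebra workshop", "month": "2004-09", "hours": 6.0},
    {"teacher": "T001", "activity": "Lesson study group", "month": "2004-10", "hours": 2.5},
    {"teacher": "T002", "activity": "Summer institute", "month": "2004-07", "hours": 30.0},
    {"teacher": "T002", "activity": "Lesson study group", "month": "2004-10", "hours": 2.5},
])

# Fine-grained view: hours disaggregated by teacher, activity, and month.
by_cell = logs.groupby(["teacher", "activity", "month"])["hours"].sum()
print(by_cell)

# Survey-style "black box": one overall total per teacher, with the
# activity- and time-level detail collapsed away.
annual_total = logs.groupby("teacher")["hours"].sum()
print(annual_total)
```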

Validation of Teacher Logs as an Alternative Data Collection Method
Comparing PDAL and exit survey results
– Measurement properties
– Correlations
– Mean levels
Assessing the relative efficacy of teacher logs and surveys
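
A minimal sketch of the kind of agreement analysis this implies: for teachers who completed both instruments, correlate the log-based and survey-based estimates of the same quantity (e.g., total contact hours) and compare their mean levels with a paired test. The data and variable names are hypothetical, and the analyses actually reported in the presentation may differ.

```python
# Hypothetical agreement check between log-based and survey-based
# estimates of total PD contact hours for the same teachers.
from statistics import mean
from scipy import stats

log_hours = [42.0, 10.5, 65.0, 28.0, 12.0, 55.5, 30.0, 8.0]      # from PDAL
survey_hours = [36.0, 14.0, 60.0, 35.0, 10.0, 48.0, 33.0, 12.0]  # from exit survey

# Between-method agreement: Pearson correlation across teachers.
r, r_pvalue = stats.pearsonr(log_hours, survey_hours)
print(f"correlation between methods: r = {r:.2f} (p = {r_pvalue:.3f})")

# Mean-level agreement: paired t-test on the same teachers.
t, t_pvalue = stats.ttest_rel(log_hours, survey_hours)
print(f"mean hours: log = {mean(log_hours):.1f}, survey = {mean(survey_hours):.1f}, "
      f"paired t = {t:.2f} (p = {t_pvalue:.3f})")
```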

Method
Instruments
– PDAL: conducted over 15 months
– Exit Survey: follow-up questions about PD activities over the same 15-month period, plus questions about teachers' experiences with PDAL
Sample
– 4 Math-Science Partnership Program projects
– Sample: 476 math and science teachers, mostly in middle or high schools
– Participants: 326 teachers completed at least one month of PDAL entries; 165 teachers participated in the Exit Survey

Measurement Properties of PDAL

Agreement between Log & Survey Methods: Correlations between methods

Agreement between Log & Survey Methods: Mean Levels

Teachers’ Experience with or Opinions about PDAL: Results from the PDAL Exit Survey

Unique Benefits of PDAL
Rich, in-depth data with a high level of specificity
– Differences in features between MSP-sponsored PD and other PD
– Topic intensity (i.e., contact hours per topic)
Time-dependent measures (see the sketch below)
– Percent of months with PD
– Average contact hours by month
– Average span of activity
Can be used for ongoing formative evaluation to continuously improve PD
– Episode-specific comments and feedback
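
The time-dependent measures listed above can be derived directly from monthly log records. The sketch below computes percent of months with PD, average contact hours per month, and average activity span from hypothetical data; the definitions are illustrative and may not match those used in the study.

```python
# Hypothetical monthly log data for one teacher over a 15-month window.
import pandas as pd

logs = pd.DataFrame([
    {"activity": "Algebra workshop", "month": "2004-02", "hours": 6.0},
    {"activity": "Algebra workshop", "month": "2004-03", "hours": 4.0},
    {"activity": "Summer institute", "month": "2004-07", "hours": 30.0},
    {"activity": "Lesson study group", "month": "2004-10", "hours": 2.5},
    {"activity": "Lesson study group", "month": "2004-11", "hours": 2.5},
])
window_months = 15  # length of the data collection period

# Percent of months with any PD activity logged.
pct_months_with_pd = logs["month"].nunique() / window_months * 100

# Average contact hours per month, averaged over the whole window.
avg_hours_per_month = logs["hours"].sum() / window_months

# Average span of an activity: number of distinct months each activity
# appears in, averaged across activities.
avg_span = logs.groupby("activity")["month"].nunique().mean()

print(f"% of months with PD: {pct_months_with_pd:.0f}%")
print(f"average contact hours per month: {avg_hours_per_month:.1f}")
print(f"average activity span: {avg_span:.1f} months")
```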

Average Contact Hours by Month

Lessons Learned & Implications Complementary uses of logs & surveys for different purposes –Globally estimating the mean level of PD activities –Investigating the variability of specific PD elements and relating it to other outcomes –Improving PD design Cost and benefits of different data collection methods Improving survey method: Increase the level of specificity

Next Steps
Final phase of data analysis
– Using both PDAL and survey data to assess change in teaching practice and assess their relative predictive validity (see the sketch below)
PDAL users focus group (May 2005)
PDAL usability study
Need for follow-up studies
– Such as a new CCSSO-AIR study on improving evaluation of professional development in math and science at state and local levels
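
One way to read "relative predictive validity" is to ask which measure of PD exposure, log-based or survey-based, better predicts a downstream outcome such as change in teaching practice. The sketch below fits two simple regressions on hypothetical data and compares their fit; the outcome, variables, and model are illustrative assumptions, not the study's actual analysis.

```python
# Hypothetical comparison of predictive validity: which PD measure
# (log-based vs. survey-based contact hours) better predicts change
# in a teaching-practice score?
import numpy as np
from scipy import stats

log_hours = np.array([42.0, 10.5, 65.0, 28.0, 12.0, 55.5, 30.0, 8.0])
survey_hours = np.array([36.0, 14.0, 60.0, 35.0, 10.0, 48.0, 33.0, 12.0])
practice_change = np.array([0.8, 0.1, 1.2, 0.5, 0.2, 0.9, 0.6, 0.0])

for label, predictor in [("log-based", log_hours), ("survey-based", survey_hours)]:
    result = stats.linregress(predictor, practice_change)
    # R-squared summarizes how much of the variation in practice change
    # each predictor accounts for in this toy data.
    print(f"{label:13s}: slope = {result.slope:.3f}, R^2 = {result.rvalue**2:.2f}")
```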

Contact Information
Kwang Suk Yoon (202)
Reuben Jacobson (202)
Visit us