
Seattle’s Teacher Evaluation Reform in Context

Two questions

– How does PG&E’s design compare to evaluation systems nationwide?
– What can SPS learn about implementing PG&E from similar efforts in other districts?

Bottom lines

Do: Realign instructional and operational systems to support the evaluation system. Don’t: Treat evaluation as a stand-alone reform.
Do: Communicate constantly about structure and purpose. Don’t: Assume people understand.
Do: Train principals to work with teachers on improving practice. Don’t: Focus only on “calibrating” observations.
Do: Monitor the reliability and validity of measures. Don’t: Assume you’re measuring what you want.

Today’s briefing on implementation analysis

– Approach
– Findings
– Implications
– Discussion

Our approach

– Reviewed empirical studies on implementation of PG&E-like reforms
– Looked for evidence on what districts are actually doing, not what they should be doing

Presentation includes information from studies on:

– Chicago (2), Denver, Washington, D.C., Coventry, RI, Washoe County (Reno), NV, and Cincinnati
– The Measures of Effective Teaching (MET) Project
– The Teacher Advancement Program (TAP)

What the research base covers

Studies focus on
– Implementation dynamics and fidelity
– Validity and reliability of performance ratings

But generally no evidence on
– Effects on the teacher workforce or classroom practice
– Effects on student learning

Four key findings

1. Evaluation reforms can expose problems in other district-wide systems
2. Teachers and principals often struggle to understand and carry out the reforms
3. Observation-based ratings can identify “effective” teachers, but there’s room to improve
4. Observation-based ratings are more reliable when based on multiple observations

Reforms expose problems in other district-wide systems

The teaching focus of the reform highlights misalignments in instructional and operational systems:
– Are PD and curriculum aligned with instructional frameworks and assessments?
– Are training, hiring, and payroll aligned in HR?
– Do data systems speak to each other (e.g., compensation and evaluation)?

People struggle to understand and implement the reforms

Teachers struggle to understand the structure and purpose of new evaluation systems
– Especially financial incentives

Principals struggle to work with teachers to improve teaching practice
– Most training focuses on calibrating observations and ratings
– Time constraints are a big issue

Observation-based ratings “work,” but could be better

– Teachers who do well on observation ratings also tend to have higher value-added model (VAM) scores
– Ratings are better at identifying “effective” teachers when combined with other measures

Combining measures adds predictive power (chart: Kane & Staiger, 2012, p. 9)

More observations = more reliable (chart: Kane & Staiger, 2012, p. 37)
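The gain from averaging more observations follows the standard Spearman-Brown relationship. A minimal sketch (the single-observation reliability of 0.37 is a hypothetical value, not a MET estimate):

```python
# Sketch of the Spearman-Brown prophecy formula: reliability of the average
# of n parallel observations. The 0.37 starting value is hypothetical.

def spearman_brown(single_obs_reliability: float, n_observations: int) -> float:
    """Reliability of the mean of n parallel, equally reliable observations."""
    r = single_obs_reliability
    return n_observations * r / (1 + (n_observations - 1) * r)

if __name__ == "__main__":
    r1 = 0.37  # hypothetical reliability of a single classroom observation
    for n in (1, 2, 4, 8):
        print(f"{n} observation(s): reliability = {spearman_brown(r1, n):.2f}")
```

Note the diminishing returns: doubling from one to two observations helps far more than doubling from four to eight.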

Implications

– Ensure district improvement initiatives complement and support PG&E implementation (e.g., the work of EDs, HR, C&I, and DoTS)
– Assess how well people understand PG&E and redouble communication efforts

Implications (cont’d)

– Train principals not only in observation and rating but also in working with teachers to improve practice
  – Place a premium on hiring and developing leadership talent
– Create a systematic process to monitor the reliability and validity of PG&E evaluations
  – Double ratings
  – Compare ratings to VAM

Bottom lines

Do: Realign instructional and operational systems to support the evaluation system. Don’t: Treat evaluation as a stand-alone reform.
Do: Communicate constantly about structure and purpose. Don’t: Assume people understand.
Do: Train principals to work with teachers on improving practice. Don’t: Focus only on “calibrating” observations.
Do: Monitor the reliability and validity of measures. Don’t: Assume you’re measuring what you want.

Inclusion criteria

– Research must be on programs with teacher evaluation systems, not simply pay reform systems
– Research must evaluate a domestic reform at the district, county, or state level
– The study must examine student outcomes, instructional practice, or effects on staffing (recruitment, retention, dismissal)
– Studies must clearly state the methodology, the research sample, and the sources of data used
– Authors must explain and justify the construction of their samples (i.e., reports must not simply use convenience samples)
– The study must include quantitative or qualitative data that represent reform outcomes throughout the geographic area of implementation
– The research must compare measured outcomes to a control group, the school’s past performance, or both

Studies in review

– Curtis, R. (2011). District of Columbia Public Schools: Defining instructional expectations and aligning accountability and support. Washington, DC: The Aspen Institute.
– Glazerman, S., & Seifullah, A. (2012). An evaluation of the Chicago Teacher Advancement Program (Chicago TAP) after four years. Washington, DC: Mathematica Policy Research.
– Kane, T. J., & Staiger, D. O. (2012). Gathering feedback for teaching: Combining high-quality observations with student surveys and achievement gains. Seattle, WA: Bill & Melinda Gates Foundation.
– Kimball, S. M., White, B., Milanowski, A. T., & Borman, G. (2004). Examining the relationship between teacher evaluation and student assessment results in Washoe County. Peabody Journal of Education, 79(4).
– Milanowski, A. T. (2004). The relationship between teacher performance evaluation scores and student achievement: Evidence from Cincinnati. Peabody Journal of Education, 79(4).
– Proctor, D., Walters, B., Reichardt, R., Goldhaber, D., & Walch, J. (2011). Making a difference in education reform: ProComp external evaluation report. University of Colorado Denver Center for Education Data and Research.
– Sartain, L., Stoelinga, S. R., & Brown, E. R. (2011). Rethinking teacher evaluation in Chicago: Lessons learned from classroom observations, principal-teacher conferences, and district implementation. Consortium on Chicago School Research at the University of Chicago Urban Education Institute.
– Springer, M. G. (2008). Impact of the Teacher Advancement Program on student test score gains: Findings from an independent appraisal. National Center on Performance Incentives, Peabody College of Vanderbilt University.
– White, B. (2004). The relationship between teacher evaluation scores and student achievement: Evidence from Coventry, RI. Madison, WI: Consortium for Policy Research in Education.