Discussion and Vote to Amend the Regulations


Educator Evaluation: Discussion and Vote to Amend the Regulations
Heather Peske, Senior Associate Commissioner
Ron Noble, Director of Integration and Strategy

HP: Let me start by thanking the Board for its commitment to the continuous improvement of the Massachusetts Educator Evaluation Framework. This topic has been on the Board's agenda consistently for the past several years, from the launch of the task force in 2010 through these first years of implementation. More recently, we have been before you several times since last spring to describe the progress districts are making in implementing the framework and some of the challenges we have encountered along the way. We are here today following a period of public comment. We have reviewed and synthesized that comment and made changes to the amendments we originally proposed. Today, we will review the final proposed amendments to the regulations and ask you to vote on them.

Educator Evaluation in MA: Five Key Priorities
- Promote Growth and Development
- Place Student Learning at the Center
- Recognize Excellence
- Set a High Bar for Tenure
- Shorten Timelines for Improvement

HP: As a reminder, the Massachusetts Educator Evaluation Framework is designed to meet these five key priorities. As we developed the proposed changes, we continued to be guided by them.

Proposed Framework Design: One Rating

Current framework ratings:
- Summative Performance Rating. Purpose: provide educators with feedback on professional practice and impact on student learning. Rating scale: Exemplary, Proficient, Needs Improvement, Unsatisfactory.
- Student Impact Rating. Purpose: provide educators with feedback on impact on student learning. Rating scale: High, Moderate, Low.

HP: As you will recall from the November Board meeting, the current framework includes two separate ratings: the Summative Performance Rating and the Student Impact Rating. These two ratings provide opportunities for evaluators and educators to engage in focused dialogue about both professional practice and impact on student learning. In response to stakeholder feedback, including what you heard at the June meeting, the proposed changes to the regulations you are considering tonight eliminate the separate Student Impact Rating. If the Board votes to accept the proposal, only the Summative Performance Rating would remain. As we discussed at your November meeting, in designing the proposed changes we had to consider how best to modify the Summative Performance Rating: keeping what is working well while adding a focus on giving educators feedback about their impact on student learning, in keeping with the priority of placing student learning at the center. To ensure that evaluators and educators remain attentive to measures of student learning, the proposed changes add a Student Learning Indicator to the Standards of Effective Practice that undergird the framework.

Massachusetts Department of Elementary and Secondary Education

Keeping Student Learning at the Center

"Student learning indicator: Demonstrates expected impact on student learning based on multiple measures of student learning, growth, and achievement. For teachers who are responsible for direct instruction, these measures must include student progress on common assessments and, where available, statewide student growth measures."

RN: To ensure that the framework continues to serve the priority of keeping student learning at the center of the evaluation process, the proposed changes add a new Student Learning Indicator. The purpose of this change is to ensure that the focused conversation about student outcomes that would have occurred in determining a Student Impact Rating is not lost with the elimination of that rating. Absent the separate rating, we believe this conversation fits, for teachers, under Standard II, which addresses meeting the needs of all students, and, for administrators, under Standard I, which addresses promoting the learning and growth of all students and the success of all staff. [CLICK] Evaluators would consider whether the educator met the expectations of the student learning indicator when determining the educator's rating on Standard II; ratings are not typically provided at the indicator level. [CLICK] The proposed changes add a parallel indicator to the Standards used in administrator evaluation.

Framework Design: Summative Performance Rating

RN: The new indicator capitalizes on existing sources of evidence. The proposed changes retain the three categories of evidence originally included in the framework: products of practice, multiple measures of student learning, and other evidence, such as student feedback. The proposed changes also add specificity about multiple measures of student learning. The goal is for educators and evaluators to be on the same page about how best to use multiple measures to draw conclusions about student learning: how much learning is enough? To facilitate these conversations, and in light of public comment, the proposed changes include a process for educators and evaluators to be clear about which measures will be used as evidence in the evaluation process and the anticipated student learning gains for each measure. [CLICK] Three broad types of measures will play a part. First, there are the classroom assessments that educators use regularly. Because these are often chosen or created by individual educators, decisions about how much learning students should demonstrate are best left to educators and evaluators. Second, there are district common assessments, which allow for consistency across schools and promote collaboration. Here, anticipated student learning gains are best set at the district level, typically by the teams of educators who selected or created the common assessment. Finally, there are student growth percentiles from the state assessment; here, it is the Department's responsibility to set anticipated student learning gains. Being clear about anticipated student learning gains before the measures are administered promotes meaningful conversations about the extent to which student outcomes match expectations.

Summary of Proposed Changes
- Eliminate the separate Student Impact Rating.
- Retain the 5-step cycle.
- Establish a "student learning indicator" under Standard II: Teaching All Students (for teachers) and under Standard I: Instructional Leadership (for administrators).
- Add a new definition of "expected impact" on student learning.

To sum up, the proposed changes are responsive to stakeholder concerns about a separate Student Impact Rating but retain a focus on providing educators with feedback about their impact on student learning. The 5-step evaluation cycle is retained, and the addition of a student learning indicator ensures that evaluators and educators are looking at whether the educator is demonstrating expected impact throughout the cycle. The indicator also ensures that an educator's impact on student learning is factored into the Summative Performance Rating.

From the MDC memo to the Board, the proposed regulatory amendments include the following:
- Eliminating the separate Student Impact Rating of high, moderate, or low, which was to be reported to the Department at the educator level.
- Retaining the 5-step cycle, which culminates in a summative performance rating based on multiple categories of evidence and the professional judgment of the evaluator.
- Establishing a "student learning indicator" under Standard II: Teaching All Students (for teachers) and under Standard I: Instructional Leadership (for administrators). Standards describe the broad categories of knowledge, skills, and performance of effective practice, and indicators describe specific knowledge, skills, and performance for each standard. By including a student learning indicator, impact on student learning becomes a component of the rating for the Standard; that rating, in turn, is a component of the Summative Performance Rating. The student learning indicator does not result in a separate, stand-alone rating.
- Providing additional information about the types of measures that can be used as evidence of educator impact on student learning. The amendments include a definition of common assessments, permit flexibility in the use of common assessments and statewide growth measures for educators who are not responsible for direct instruction, and clarify that other evidence of student learning beyond common assessments and statewide growth measures may be considered.
- Adding a new definition of "expected impact" on student learning.

Summary of Proposed Changes (cont'd)

Additional proposed modifications informed by public comment:
- Educators and evaluators discuss anticipated student learning gains during the development of the Educator Plan.
- An evaluator's professional judgment determines the length of a self-directed growth plan.

HP: In addition, public comment prompted a few further modifications. Comments indicated that the proposed changes could be construed to retain a requirement that evaluators make a separate judgment about an educator's impact on student learning, apart from the educator's performance rating on Standard II, because of language that divided an educator's impact into three categories: more than expected, expected, and less than expected. Commenters pointed out similarities to the high, moderate, and low categories associated with the Student Impact Rating. We have modified the proposal to eliminate references to three categories of expected impact and to embed the process mentioned earlier for setting clear anticipated student learning gains for the assessments that will be used as evidence in an educator's evaluation. Another modification concerns the evaluator's professional judgment determining the length of a self-directed growth plan, the plan reserved for educators rated Proficient or Exemplary. The earlier proposal conditioned a two-year plan on the educator demonstrating at least expected impact; the modified proposal allows evaluators to consider a variety of evidence in making that decision, and it parallels how the length of plans is determined for educators rated less than Proficient. In total, we believe these changes are responsive to feedback we have heard from the field and should promote a well-rounded evaluation process.

Questions? Discussion.