
The New MA Educator Evaluation Framework: District-Determined Measures and Student and Staff Feedback
ASE June Statewide Conference, June 10, 2013
Ron Noble, Educator Evaluation Project Co-Lead

Agenda
- Setting the Stage
- Implementation
- Lessons Learned
- On the Horizon
- District-Determined Measures
- Student and Staff Feedback
- Q&A

Setting the Stage
When policy and practice must move faster than research and development, where do you begin? ESE philosophy:
- Don't let perfection be the enemy of the good: the work is too important to delay.
- Understand this is just the beginning: we will be able to do this work with increasing sophistication each year.
- Phase in implementation: take advantage of emerging research, resources, and feedback from the field.

Questions for Policy Makers
- Attribution: "When crediting teachers for student learning, how should the individual contributions of teachers acting in a co-teaching or consultant role be determined?"
- Assessments: "How can the contributions to student achievement be accurately measured for teachers instructing special populations for which alternative standards and/or assessments are used?"
- Educator differentiation: "Are the key features of teacher effectiveness for specialized personnel, such as special education teachers, different… and should those unique features lead to additional or different content on observation protocols, student growth assessments, or alternative instruments?"
- Evaluator training: "When rating special education teachers… using an observation protocol or alternative instrument, what special training, if any, do evaluators need?"
Source: Holdheide, L. R., Goe, L., & Reschly, D. J. (2010). Challenges in Evaluating Special Education Teachers and English Language Learner Specialists. National Comprehensive Center for Teacher Quality.
Discussion prompts:
- Who IS the evaluator?
- Are variations in contributions measurable?
- How should we use the MCAS Alternate Assessment?
- How do we differentiate without creating "two systems"?

Implementation Timeline
- June 2011: MA Board of Education passed new educator evaluation regulations.
- September 2011: Implementation began in 34 "Level 4" schools, 11 "Early Adopter" districts, and 4 Special Education Collaboratives.
- January 2012: MA Department of Elementary and Secondary Education (ESE) published the MA Model System for Educator Evaluation.
- September 2012: RTTT districts began implementation with at least 50% of educators.
- September 2013: RTTT districts begin implementation with remaining educators; non-RTTT districts begin implementation with at least 50% of educators.
- 2013-14 school year: All districts pilot District-Determined Measures; selected districts pilot student and staff surveys.
- 2014-15 school year: All districts implement District-Determined Measures and student and staff surveys.
- June 2016: Districts determine Student Impact Ratings for all educators.

Implementation
- 234 Race to the Top districts
- At least 50% of educators
- Summative Performance Rating only
- 5-Step Evaluation Cycle
- June data reporting (EPIMS), with 6 data elements:
  1. Rating on Standard I
  2. Rating on Standard II
  3. Rating on Standard III
  4. Rating on Standard IV
  5. Overall Summative Performance Rating
  6. Professional Teacher Status (Y/N)
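
Those six elements amount to a small data record per educator. Below is a minimal sketch of how a district system might represent and sanity-check one such row; the field names and validation rule are illustrative assumptions, not the official EPIMS element codes. The comments name the four teacher rubric standards.

```python
# Hypothetical representation of the six reported evaluation data elements.
from dataclasses import dataclass

VALID_RATINGS = {"Exemplary", "Proficient", "Needs Improvement", "Unsatisfactory"}

@dataclass
class EvaluationRecord:
    standard_1: str   # Curriculum, Planning, and Assessment
    standard_2: str   # Teaching All Students
    standard_3: str   # Family and Community Engagement
    standard_4: str   # Professional Culture
    overall: str      # Overall Summative Performance Rating
    professional_teacher_status: bool  # PTS (Y/N)

    def validate(self) -> None:
        """Reject any rating outside the framework's four-level vocabulary."""
        for name in ("standard_1", "standard_2", "standard_3", "standard_4", "overall"):
            if getattr(self, name) not in VALID_RATINGS:
                raise ValueError(f"{name}: unknown rating {getattr(self, name)!r}")
```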

5-Step Evaluation Cycle
- Every educator is an active participant in an evaluation
- The process promotes collaboration and continuous learning
- The process applies to all educators

Summative Performance Rating
Each Summative Rating carries a corresponding educator plan:
- Exemplary: Self-Directed Growth Plan
- Proficient: Self-Directed Growth Plan
- Needs Improvement: Directed Growth Plan
- Unsatisfactory: Improvement Plan

Educator Evaluation Spring Convening: Connecting Policy, Practice, and Practitioners
- May 29, 2013
- Over 700 participants from district teams (RTTT and non-RTTT) and educator preparation programs
- Key messages:
  - Integrate with other key district initiatives
  - An opportunity to strengthen labor-management relations
  - Albeit difficult, it's the right work

On the Horizon
- District-Determined Measures
- Student and Staff Feedback

District-Determined Measures: Key Terms
- Student Impact Rating: a rating of high, moderate, or low for an educator's impact on student learning
- District-Determined Measures: measures of student learning, growth, and achievement that inform an educator's Student Impact Rating

Student Impact Rating Regulations
- Evaluators must assign a rating based on trends (at least 2 years of data) and patterns (at least 2 measures).
- Options under 603 CMR 35.07(1)(a)(3-5):
  - Statewide growth measure(s)*
  - District-Determined Measure(s) of student learning comparable across grade or subject district-wide
  - For educators whose primary role is not as a classroom teacher, the appropriate measures of the educator's contribution to student learning, growth, and achievement set by the district
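
The regulation fixes the inputs (at least two years of data, at least two measures) but leaves the overall judgment to evaluators. Here is a minimal sketch of the data shape this implies, assuming a hypothetical modal-rating rule that ESE does not prescribe:

```python
# Sketch only: the regulations require >=2 years (trend) and >=2 measures
# (pattern); the aggregation rule below (most common rating wins) is an assumption.
from collections import Counter

def impact_rating(results: dict[int, dict[str, str]]) -> str:
    """results maps year -> {measure name: 'low' | 'moderate' | 'high'}."""
    measures = {m for year in results.values() for m in year}
    if len(results) < 2 or len(measures) < 2:
        raise ValueError("need at least 2 years of data and at least 2 measures")
    ratings = [r for year in results.values() for r in year.values()]
    return Counter(ratings).most_common(1)[0][0]

# Example: two years, two measures -> 'moderate'
print(impact_rating({
    2014: {"DDM: unit portfolio": "moderate", "median SGP": "moderate"},
    2015: {"DDM: unit portfolio": "high", "median SGP": "moderate"},
}))
```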

Two Ratings
The Summative Performance Rating combines with the Rating of Impact on Student Learning (Low, Moderate, or High) to determine the educator plan:
- Unsatisfactory: Improvement Plan
- Needs Improvement: Directed Growth Plan
- Proficient or Exemplary, with Low impact: 1-yr Self-Directed Growth Plan
- Proficient or Exemplary, with Moderate or High impact: 2-yr Self-Directed Growth Plan
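
Read as a lookup, the matrix above reduces to a short function. A sketch of that mapping, with the rating strings as assumed spellings:

```python
def educator_plan(summative: str, impact: str) -> str:
    """Map the two ratings to an educator plan, per the matrix above."""
    if summative == "Unsatisfactory":
        return "Improvement Plan"
    if summative == "Needs Improvement":
        return "Directed Growth Plan"
    # Proficient or Exemplary: plan length depends on the impact rating.
    return ("1-yr Self-Directed Growth Plan" if impact == "Low"
            else "2-yr Self-Directed Growth Plan")
```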

Student Impact Rating Regulations
- Why focus on growth?
  - A level playing field
  - Fairness
- Achievement measures may be acceptable when the district judges them to be the most appropriate and feasible measure for certain educators.

Revised Implementation Timeline
- Commissioner's Memo, 4/12/13:
  - 2013-14: districts pilot and identify DDMs
  - 2014-15: districts implement DDMs and collect the first year of trend data
  - 2015-16: districts collect the second year of trend data and issue Student Impact Ratings for all educators
- Districts positioned to accelerate the timeline should proceed as planned.
- Guidance and resources to support districts with the identification of DDMs are available here:

Revised Implementation Timeline: Minimum Piloting Requirements
- Early grade (K-3) literacy
- Early grade (K-3) math
- Middle grade (5-8) math
- High school writing to text
- Traditionally non-tested grades and subjects (e.g., fine arts, music, physical education)
If a district is unable to identify a DDM in the grades and subjects listed above, the district must pilot one of ESE's exemplar DDMs, to be released in summer 2013.

Recommended Steps for Districts
- Identify a team of administrators, teachers, and specialists to focus and plan the district's work on District-Determined Measures.
- Complete an inventory of existing assessments used in the district's schools.
- Identify and coordinate with partners that have the capacity to assist in identifying and evaluating assessments that may serve as District-Determined Measures.
See the Quick Reference Guide: District-Determined Measures.

ESE Supports
- WestEd is supporting ESE with next steps in implementing the Commonwealth's Model System for Educator Evaluation.
- Two broad categories of work:
  - Supporting development of anchor standards in almost 100 separate grades, subjects, or courses
  - Identifying and evaluating promising measures, tools, tests, and rubrics
- Work to be completed by mid-August.

ESE Supports
- Supplemental guidance on the selection of DDMs and the process of determining an Impact Rating
- DDM and Assessment Literacy Webinar Series (March through December)
- Technical Guide A (released May 2013) focuses on selecting high-quality assessments; it includes the Assessment Quality Checklist and Tracking Tool
- Technical Guide B (expected August 2013) will focus on measuring growth

ESE Supports: Assessment Quality Checklist Tool
- General Information: grade and subject or course; potential DDM name; potential DDM source; type of assessment; item types
- Step #1, Evaluate Content Alignment (describe the process used to determine ratings):
  - Alignment: Alignment to Curriculum
  - Rigor: Alignment to Intended Rigor
  - Total Score and % of Possible Score
- Step #2, Evaluate Remaining Evidence of Assessment Quality (describe the process used to determine ratings):
  - Utility & Feasibility: Utility; Feasibility
  - Assessment Components: Table of Test Specifications; Administration Protocol; Instrument; Scoring Method; Technical Documentation
  - Reliability: Reliability Evidence Collection Approach; Reliability Evidence Quality
  - Validity: Validity Evidence Collection Approach; Validity Evidence Quality
  - Non-Bias: Gathered Evidence of Non-Bias
  - Item Quality: Range of Item Difficulties; Positively Discriminating Items; No Floor/Ceiling Effects
  - Total Score and % of Possible Score
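
The checklist's bottom line for each step is a total score and a percent of the possible score. A minimal sketch of that tally, assuming a hypothetical 0-2 point scale per criterion (the tool's actual point scale is not shown in this excerpt):

```python
# Illustrative scoring of the checklist; criterion names follow the tool above.
STEP1 = ["Alignment to Curriculum", "Alignment to Intended Rigor"]
STEP2 = [
    "Utility", "Feasibility", "Table of Test Specifications",
    "Administration Protocol", "Instrument", "Scoring Method",
    "Technical Documentation", "Reliability Evidence Collection Approach",
    "Reliability Evidence Quality", "Validity Evidence Collection Approach",
    "Validity Evidence Quality", "Gathered Evidence of Non-Bias",
    "Range of Item Difficulties", "Positively Discriminating Items",
    "No Floor/Ceiling Effects",
]
MAX_POINTS = 2  # assumed per-criterion maximum

def step_score(ratings: dict[str, int], criteria: list[str]) -> tuple[int, float]:
    """Return (total score, fraction of possible score) for one step."""
    total = sum(ratings.get(c, 0) for c in criteria)
    return total, total / (MAX_POINTS * len(criteria))

total, pct = step_score({"Alignment to Curriculum": 2, "Alignment to Intended Rigor": 1}, STEP1)
print(total, f"{pct:.0%}")  # 3 75%
```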

DDMs: Request for Feedback
- Attribution: How can ESE best support districts in developing attribution policies related to the determination of Student Impact Ratings, particularly for co-teachers, consulting teachers, and other scenarios where more than one teacher contributes to student learning, growth, and achievement?
- Movement of Students: Due to highly specialized and often changing needs, the population of children identified as needing special education services fluctuates annually, sometimes significantly, and mostly in the elementary grades. This fluctuation means students move in and out of special education classes and may not receive special education instruction for an entire year. How should ESE recommend districts take student movement into account when determining special educators' Student Impact Ratings?
- Selecting Assessments: What considerations should ESE be aware of when providing guidance on the selection of measures of student growth to be used in determining special educators' Student Impact Ratings? Please include specific examples of measures that would or would not be appropriate, and why.

Student and Staff Feedback
- Revised implementation timeline: Beginning in the 2014-15 school year, districts will include student feedback in the evaluation of all educators and staff feedback in the evaluation of all administrators.
- During the 2013-14 school year, ESE will work with districts to pilot and field test model survey instruments.

Multiple sources of evidence inform the summative rating.

National Overview
A growing number of states are currently using or preparing to use student surveys in educator evaluations: Alaska, Arizona, Colorado, Delaware, Georgia, Hawaii, Idaho, Kentucky, Maine, Massachusetts, Michigan, Mississippi, Missouri, New Jersey, New York, North Carolina, Rhode Island, and Washington.

Why Use Student Surveys in Educator Evaluations?
- Perception surveys round out a multiple-measure evaluation system.
- Research also finds student survey results are correlated with student achievement.
- The Measures of Effective Teaching project found students' perceptions are reliable, stable, valid, and predictive.
- Surveys may be the best gauge of student engagement.
- When asked which measures are good or excellent at assessing teacher effectiveness, teachers reported:
  - District standardized tests (56 percent)
  - Principal feedback (71 percent)
  - Students' level of engagement (92 percent)

What Students Say…
- MA's State Student Advisory Council and six regional student advisory councils provide a unique feedback loop for students.
- MA Student Advisory Council focus groups were overwhelmingly positive toward soliciting student input through surveys.
- MA students want to help teachers improve.
- MA students are excited about the prospect of being surveyed for this purpose.
- MA students offered thoughtful precautions about survey use:
  - Use surveys for teacher goal-setting.
  - Consider making survey feedback visible only to teachers.
  - Provide third-party screeners for any open-ended questions.

Surveys as a Form of Feedback
- Benefits of surveys of classroom/school experiences:
  - They offer valuable insight from those with first-hand experience.
  - They empower and engage respondents, sending a signal that their input is valued.
  - They are comparatively inexpensive.
- Considerations when using surveys of classroom/school experiences:
  - Students may lack the cognitive ability or maturity to respond reliably.
  - Surveys could become a popularity contest or "rate-your-teacher.com."
  - Survey results could be misused by evaluators.

National Perspective – Lessons Learned
- The more immediate the feedback, the better.
- The more flexibility teachers have to administer surveys when they wish, the better.
- Surveys for early grades and special populations require special attention.
- To the extent that surveys are used for high-stakes decisions at all, this should not happen until they have been used effectively and reliably, and educators have grown comfortable with them, in a low-stakes setting.
- When used for formative purposes, surveys are generally seen as a good thing.

Perspectives & Considerations
Key areas for state or district consideration (a sketch for item 3 follows below):
1. Determining survey samples
2. Timing of survey administration
3. Reporting of survey results
4. Using survey results in evaluations
5. Considerations for pre-readers, special education students, and English Learners
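
One common safeguard for item 3 is to suppress teacher-level survey results that rest on too few responses, so individual students cannot be identified. A hypothetical sketch; the threshold of 10 is an illustrative assumption, not ESE guidance:

```python
# Suppress small-n results to protect student anonymity (threshold assumed).
MIN_RESPONSES = 10

def report_item_mean(responses: list[int]) -> float | None:
    """Return the mean response for a survey item, or None if too few responses."""
    if len(responses) < MIN_RESPONSES:
        return None  # too few respondents: do not report at teacher level
    return sum(responses) / len(responses)
```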

Student and Staff Feedback: Request for Feedback
- Source of Evidence: In what way or ways should ESE recommend that student and staff feedback be used as a source of additional evidence relevant to one or more Performance Standards?
- Accommodations: What types of arrangements are most appropriate for special populations (i.e., pre-readers, students with limited English proficiency, and students with disabilities) so that their feedback can be taken into account as well?
- Data Collection Tools: In addition to perception surveys, what other types of data collection tools for capturing student feedback should ESE recommend, and for which populations would these tools be most useful?

Additional Questions?
Ron Noble