Educator Evaluation Workshop: Gathering Evidence, Conducting Observations & Providing Feedback. MSSAA Summer Institute, July 26, 2012. Massachusetts Department of Elementary and Secondary Education.


Educator Evaluation Workshop: Gathering Evidence, Conducting Observations & Providing Feedback. MSSAA Summer Institute, July 26, 2012. Massachusetts Department of Elementary and Secondary Education

Agenda
- The Role of Evidence in the 5-Step Cycle
  - Three types of evidence
  - Roles & responsibilities
- Artifacts of Practice
- Observations & Feedback
- Tips & Strategies
- Resources

Intended Outcomes
At the end of this session, participants will be able to:
- Define "evidence of practice" and understand the role of artifacts, observations, and feedback in the 5-Step Cycle
- Understand the value of frequent, unannounced observations with targeted feedback
- Identify tools and processes for gathering and organizing evidence that will make evidence collection and feedback more doable in their schools

Collaboration and Continuous Learning Are the Focus
- Every educator is an active participant in the evaluation process
- Every educator and evaluator collects evidence and assesses progress

"I was evaluated today..." Sound familiar? Evaluation ≠ Observations: an observation is one source of evidence, not the evaluation itself.

Multiple Sources of Evidence Inform the Performance Rating
Three categories of evidence must be collected for each educator:
1. Multiple measures of student learning, growth, and achievement
2. Judgments based on observations and artifacts of professional practice
3. Additional evidence relevant to the Standards, including student/staff feedback (beginning in 2013-2014)

Multiple sources of evidence inform the performance rating
This graphic explains the summative performance rating in the new educator evaluation framework. The left column includes the three categories of evidence used in evaluation: Products of Practice (including observations, unit plans, schedules, and the like), Multiple Measures of Student Learning (ranging from classroom assessments to MCAS Growth Percentile scores and MEPA, when available), and Other Evidence, including, eventually, student feedback. (A note here: MEPA, the Massachusetts English Proficiency Assessment for English language learners, is being replaced by a better assessment being developed by a national consortium of states called WIDA, pronounced "WEE-duh.") The middle column represents the translation of that evidence into an assessment of performance on each of four Standards, in addition to an assessment of progress on goals. The right column is the single summative performance rating. Starting from the left, an evaluator uses a rubric to line up evidence from the three categories to determine a rating on each of the four Standards (for teachers, the four Standards are Curriculum, Planning, and Assessment; Teaching All Students; Family and Community Engagement; and Professional Culture) and an assessment of progress on both student learning and professional practice goals.

What does this look like?

Products of Practice related to Standards:
- Artifacts: teacher-developed unit assessments; grade-level meeting notes; parent/teacher communication log; PLC meeting notes
- Observations: notes/feedback from short, frequent observations (inside and outside classrooms); notes and feedback from announced observations

Multiple Measures of Student Learning:
- Student work (quizzes, homework, presentations, etc.)
- Portfolios
- Performance assessments (including arts, vocational, health & wellness)
- Interim assessments
- State or district assessments

Other Evidence related to Standards:
- Student and staff feedback (beginning in the 2013-2014 school year)

Implementation Responsibility

Educator responsibilities:
- Documenting action steps completed
- Collecting, organizing, and submitting evidence to demonstrate progress toward professional practice and student learning goals

Evaluator responsibilities:
- Observing practice on a regular basis and providing targeted feedback on performance
- Making resources and supports available
- Identifying common artifacts/evidence

Products of Practice: Artifacts

It starts with the Educator Plan...

Student Learning Goal: In order to ensure mathematical literacy in each of the three content areas for 8th grade geometry (8.G), I will incorporate at least one essay question into each unit assessment that requires elaboration of mathematical reasoning, so that 80% or more of my ELL students demonstrate proficiency on essay questions on the end-of-year 8th grade geometry assessment.

Planned Activities/Actions:
- By October 1, I will assess ELL student comprehension and knowledge with formative assessments.
- By October 15, I will share this data with my department team and instructional coach and solicit feedback on instructional strategies related to teaching mathematical literacy to ELL students.
- By October 30, I will develop writing objectives for each unit and integrate them into unit assessments.
- From November through May, after each unit, I will disaggregate assessment data for ELL students, focusing on mathematical literacy. I will track their progress and adjust instruction as necessary.

Supports/Resources from School/District:
- Formative geometry assessment
- Monthly department team meetings
- Monthly one-on-one data analysis with instructional coach and ELL specialist
- Unit assessments

Timeline/Benchmarks:
1. Oct. 1: Review formative assessment results for my ELL students. Oct. 15: Share formative assessment results with department team and instructional coach and identify at least three instructional strategies related to building mathematical literacy with ELL students. (Evidence: meeting notes, three strategies)
2. Oct. 30: Develop writing objectives for each unit. (Evidence: written objectives, essay questions)
3. November-June: Administer unit assessments in three content areas and analyze student performance on essay questions. (Evidence: student data from essay questions in at least three unit assessments)
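The tracking step in the plan above (disaggregating unit-assessment data for ELL students and measuring progress toward the 80% proficiency target) amounts to a simple calculation, sketched below. This is a hypothetical illustration only: the field names, the 0-4 essay rubric, and the proficiency cutoff of 3 are invented for the example, not part of the Model System.

```python
# Hypothetical sketch of the plan's tracking step: compute the share of ELL
# students scoring "proficient" on a unit assessment's essay question.
# Field names and the 0-4 rubric with a cutoff of 3 are assumptions.

def percent_proficient(scores, cutoff=3, ell_only=True):
    """Percentage of students (ELL-only by default) at or above the cutoff."""
    pool = [s for s in scores if s["ell"]] if ell_only else scores
    if not pool:
        return 0.0
    proficient = sum(1 for s in pool if s["essay_score"] >= cutoff)
    return 100.0 * proficient / len(pool)

# One unit's made-up essay-question results:
unit3 = [
    {"student": "A", "ell": True,  "essay_score": 4},
    {"student": "B", "ell": True,  "essay_score": 2},
    {"student": "C", "ell": False, "essay_score": 3},
    {"student": "D", "ell": True,  "essay_score": 3},
]

print(round(percent_proficient(unit3), 1))  # 2 of 3 ELL students proficient -> 66.7
```

Comparing this figure unit by unit against the 80% goal is the "track their progress and adjust instruction" loop the plan describes.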

Importance of Strategically Collecting Artifacts
- Artifacts should be a sample that demonstrates educator performance and impact
- Artifacts should be aligned with educator goals, the Model System Teacher Rubric, or school goals
- The number of artifacts to collect varies by educator
- Artifacts can provide evidence of more than one Standard-Indicator: an annotated summary of Grade 5 unit assessment results, for example, can include evidence of practice related to I.C. (Analysis), IV.A.1 (Reflective Practice), and IV.C. (Collaboration)

Lessons from Early Adopters: Collecting Evidence
- Quality, not quantity
- Guidelines and exemplars will help
- Prioritize based on focus areas

Products of Practice: Observations & Feedback

Observations
- The regulations require a minimum of one unannounced observation. The Model System recommends short, frequent unannounced observations for all educators, as well as at least one announced observation for non-PTS educators and struggling educators.
- Observations are one more source of evidence, alongside artifacts of practice. Like artifacts, they serve both a formative and a summative purpose.
- They are the vehicle through which evaluators can provide targeted, ongoing feedback to educators and maintain a dialogue around teaching and learning.

Why short, frequent observations?
- More opportunities to see patterns of practice
- Flexibility in scheduling
- Promotes ongoing conversation around teaching and learning
- Facilitates observations beyond the classroom
Is 5-15 minutes enough?

Observation and Feedback
School-Level Administrator Rubric (I-D-2): Typically makes at least two unannounced visits to classrooms each day and provides targeted, constructive feedback to all educators. Acknowledges effective practice and provides redirection and support for those whose practice is less than proficient.
Superintendent Rubric (I-D-2): Typically makes at least three unannounced visits to each school to observe principal practice every year and provides targeted, constructive feedback to all administrators. Acknowledges effective practice and provides redirection and support for those whose practice is less than proficient.
We need to go into more detail on observation here, since it is an important source of evidence about educator performance. Here are two more excerpts from the model rubrics, for principals (school-level administrators) and superintendents. Going forward, observations will look quite different from the observations done in many schools today. In the Model System, principals will be making at least two unannounced visits to classrooms every day, and superintendents are expected to make at least three unannounced visits to each school over the course of the year. The school-level classroom observations described here may sound undoable, especially if you are thinking in terms of the "classic" observation that requires scheduling an announced, 45-minute visit to a classroom and setting up a pre- and post-conference.
HANDOUT: Strategies & Suggestions for Observations. Take a moment to read the section on frequent, unannounced observations. Think about how this type of observation differs from what you currently do, and how it will reshape the way we think about observations with regard to educator evaluation. Will this be a big culture shift for you?

Principles of high-quality observations
- Frequent
- Focused
- Inside/outside the classroom
- Useful & timely feedback
The rubric is not an evaluation tool but a guide to help identify trends and patterns of practice over time.

Feedback “[O]bservers must learn how to capture classroom events in literal notes, and to talk productively with the teacher about it afterward in a way that is evidence-based and productively points toward actionable improvement.” –John Saphier

Principles of Good Feedback
- Verbal as well as written
- Focused on a few key areas
- Based on evidence
- Tied to Standards of effective practice
- Offers reinforcement for areas of effective practice
- Facilitates self-reflection on areas of practice that need refinement, and guides the teacher in thinking beyond the lesson observed
"The Proficient performance descriptors represent the expected Standard: the bar we expect all experienced teachers to demonstrate over time."
Facilitator notes: Instruct school teams that they will now have the opportunity to begin looking at the performance descriptors for a particular Indicator and its corresponding elements in Standards I and II, starting with Proficient. Ask participants to count off from 1 to 7 until every individual has been assigned a number. Place the numbered tabletop cards at each table, and instruct participants to temporarily re-sort into these new teams, with the 1's gathering together, the 2's gathering together, and so on.

Tips & Strategies

1. PLAN
2. COMMUNICATE EXPECTATIONS
3. ORGANIZE

1. PLAN The more concrete the Educator Plan, the easier it is to identify and collect artifacts Identify common artifacts all or most educators will be expected to collect (unit assessments, parent-teacher logs, etc.) Share examples of high-quality, valuable evidence during faculty or team meetings Demonstrate example artifacts that provide evidence of more than one Standard-Indicator

2. Communicate Expectations
- Artifacts should be a sample that demonstrates educator performance and impact
- Submitted evidence should be tied to educator goals, Standards or Indicators, or school goals
- Provide everyone with a clear idea of how and when to share products of practice: email? paper? an online repository?

3. Organize
- Calendar observations
- Adopt a process for organizing artifacts and observation notes by Standard/Indicator and/or goals (paper-based, email-driven, or an online repository)
- Use sample tools for evidence collection and organization

Sample Tools for Evidence Collection and Organization (included in your packet)
- Artifact Cover Page: completed by the educator or evaluator (the person who identifies the artifact)
- Observation Evidence Collection Tool: completed by the observer/evaluator

Next Steps: Suggestions for Principals
- Read "Strategies and Suggestions for Observations" (p. 39 of the School-Level Planning & Implementation Guide)
- Identify options for collecting and organizing evidence at your school, and establish a protocol for all educators
- Work with your administrative team to set a calendar for observations and evaluations based on the distribution of educators by plan type at your school

Resources: The Massachusetts Model System for Educator Evaluation

School-Level Planning & Implementation Guide: Content Overview
- The Massachusetts Model System for Educator Evaluation
- Step 1: Self-Assessment
- Step 2: Goal Setting and Plan Development
- Step 3: Implementation of the Plan
- Step 4: Formative Assessment and Evaluation
- Step 5: Summative Evaluation
- Appendices: Forms for Educator Evaluation, Setting S.M.A.R.T. Goals
This is a quick snapshot of what you will see in the School Guide. The guide itself is structured around the five steps of the evaluation cycle: Self-Assessment; Goal Setting & Plan Development; Implementation; Formative Assessment/Evaluation; and Summative Evaluation. The related forms appear in the Appendix, along with other resources, including guidance on setting S.M.A.R.T. goals.
HANDOUT: The 5-Step Educator Evaluation Cycle: Train-the-Trainer Modules. One additional resource will be available to every district and school this spring. ESE will be publishing seven train-the-trainer modules with facilitator guides designed to help school-level leadership teams learn about the 5-Step Cycle. ESE will subsidize approved vendors to offer these modules regionally. The seven sequential modules are:
1. Overview
2. Unpacking Rubrics
3. Self-Assessment and Goal Proposal
4. S.M.A.R.T. Goal and Educator Plan Development
5. Gathering Evidence through Artifacts
6. Gathering Evidence through Observation
7. Rating Educator Performance
An eighth module, on Rating Educator Impact on Student Learning based on District-Determined Measures, will follow after ESE publishes guidance on Phase 2 of the Educator Evaluation Framework.

ESE Evaluation Resources: What's Coming, Summer 2012
- Guidance on District-Determined Measures
- Training modules with facilitator guides, PowerPoint presentations, and participant handouts
- List of approved vendors
- Updated website with new Resources section
- Newsletter

ESE Evaluation Resources: What's Coming, Fall/Winter 2012
- Solicit and review feedback on the Model System, and update it accordingly
- Research and develop student and staff feedback instruments
- Collect and disseminate best practices
- Collect and vet assessments to build a repository of district measures
- Internal collaboration to support cross-initiative alignment (for example, support for use of the rubric for teachers of ELLs, aligned to the RETELL initiative)

For More Information and Resources
- Visit the ESE educator evaluation website: www.doe.mass.edu/edeval
- Contact ESE with questions and suggestions: EducatorEvaluation@doe.mass.edu
- Presenter: Claire Abbott – cabbott@doe.mass.edu