DDMs for School Counselors RTTT Final Summit April 7, 2014 Craig Waterman & Kate Ducharme.

Agenda
 Overview of DDMs (30 minutes)
 Introduction to DDMs
 Q&A with Craig Waterman
 DDM Considerations for School Counselors – Group Planning Activity (40 minutes)
 Selecting Measures (15 minutes)
 Closing and Next Steps
Massachusetts Department of Elementary and Secondary Education

The Educator Evaluation Framework
 Everyone earns two ratings:
 Summative Performance Rating: Exemplary, Proficient, Needs Improvement, or Unsatisfactory
 Student Impact Rating: High, Moderate, or Low

Two Ratings – Intersection of Practice and Impact
The Summative Rating and the Rating of Impact on Student Learning together determine the educator's plan:
 Exemplary or Proficient, with Moderate or High impact: 2-yr Self-Directed Growth Plan
 Exemplary or Proficient, with Low impact: 1-yr Self-Directed Growth Plan
 Needs Improvement (any impact rating): Directed Growth Plan
 Unsatisfactory (any impact rating): Improvement Plan

Identifying DDMs – Key Questions
 Is the measure aligned to content?
 Does it assess what the educators intend to teach and what is most important for students to learn?
 Is the measure informative?
 Do the results tell educators whether students are making the desired progress, falling short, or excelling?
 Do the results provide valuable information to schools and districts about their educators?
See Technical Guide B for more about these key questions: /ddm/TechnicalGuideB.pdf

Key Messages for Stakeholders

Difference Between Direct and Indirect Measures
 Classroom-based educators can directly measure the impact of their instruction on student learning.
 Classroom-based Educator Responsibilities: the educator provides direct instruction to students → Measures of Student Learning: students learn; they acquire knowledge and skills. This link is a Direct Measure.

Difference Between Direct and Indirect Measures
 For SISP who do not directly instruct students, there is often an intermediary step between SISP responsibilities and student learning, making the SISP contribution indirectly tied to student learning.
 SISP Responsibilities: the SISP educator provides specialized support services to students → students can access the general education curriculum → Measures of Student Learning: students learn; they acquire knowledge and skills. These links are Indirect Measures.

Do DDMs have to be identical?
 DDMs must be “comparable across schools, grades, and subject matter district-wide.” (603 CMR 35.09(2)(a))
 Type 1: Comparable within a grade, subject, or course across schools within a district. Identical measures are recommended; investigate fairness for all students and educators. Example: one elementary 5th grade teacher and another elementary 5th grade teacher; one music teacher and another music teacher.
 Type 2: Comparable across grades or subject levels district-wide. Impact Ratings should have a consistent meaning across educators; therefore, DDMs should not have significantly different levels of rigor.
See the Investigating Fairness Implementation Brief: m/Fairness.pdf
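The "investigate fairness" step above can be made concrete with a simple check. A minimal sketch (the function name, data, and 5-point threshold are all hypothetical, not from ESE guidance): compare each student subgroup's average growth score to the overall average and flag subgroups with large gaps for closer review.

```python
# Hypothetical fairness check: flag subgroups whose average growth
# score diverges from the overall average by more than a threshold.
from statistics import mean

def fairness_flags(scores, groups, threshold=5.0):
    """scores: list of growth scores; groups: parallel subgroup labels.
    Returns {label: subgroup mean} for subgroups whose mean differs
    from the overall mean by more than `threshold` points."""
    overall = mean(scores)
    by_group = {}
    for score, group in zip(scores, groups):
        by_group.setdefault(group, []).append(score)
    return {g: mean(v) for g, v in by_group.items()
            if abs(mean(v) - overall) > threshold}

# Toy data: group A grows far more than group B, so both are flagged
# as diverging from the overall average of 40.
scores = [50, 55, 60, 20, 25, 30]
groups = ["A", "A", "A", "B", "B", "B"]
print(fairness_flags(scores, groups))  # {'A': 55, 'B': 25}
```

A flagged gap does not by itself prove the measure is unfair; it is a signal that the district should look at whether all students had an equal chance to demonstrate growth.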

Identify Commonalities
 One goal of DDMs is to support common measures across a district.
 One approach to selecting DDMs for SISP educators is to identify commonalities across multiple roles.
 For example, a district-wide DDM might involve collecting feedback from stakeholders on the quality, usefulness, and timeliness of communications.

DDMs for District “Singletons”
 Districts should consider the following options to identify and develop DDMs for “singletons”:
 Work with neighboring districts to identify and/or create DDMs.
 Group educators within the district who have similar responsibilities, albeit different roles, to collaborate on DDMs.
 Look to outside rating entities (such as state or national organization standards).
 Use DDMs that can be easily interpreted (e.g., for a school counselor, the number of seniors who applied to college rather than the number of students who are prepared to apply to college).

DDM Implementation Plan – June 1, 2014

Resources:
 Implementation Briefs
 Technical Guide B
 Webinar Series
 Commissioner’s Memoranda
 Educator Evaluation Newsletter
 Technical Assistance and Networking Sessions
 Using Current Assessments in DDMs (Curriculum Summit)
 Example Assessments
 Other ESE documents (Technical Guide A, Part VII, Regulations)

Questions and Answers

Who scores the DDM?
External:
 Outside organizations
 Commercial assessments
 Automated methods
 Paid raters (e.g., college students, retired teachers)
Internal:
 Teams of teachers (e.g., all 5th grade teachers): team members rate each other’s students’ responses; multiple raters score each response
 Individual teachers, with random auditing (rechecking)
See the Scoring and Parameter Setting Implementation Brief: du/edeval/ddm/Scoring&PSetting.pdf
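The "random auditing" option above can be sketched in a few lines. A minimal illustration (function names and data are hypothetical): draw a random sample of individually scored responses, have a second rater rescore them, and report the fraction of exact agreement.

```python
# Hypothetical random-audit sketch: rescore a random sample of
# teacher-scored responses and compute exact-agreement rate.
import random

def audit_agreement(original_scores, rescore, sample_size, seed=0):
    """original_scores: dict response_id -> teacher's score.
    rescore: callable response_id -> auditing rater's score.
    Returns (audited ids, fraction of exact score agreement)."""
    rng = random.Random(seed)  # fixed seed so the audit is reproducible
    sample = rng.sample(sorted(original_scores), sample_size)
    matches = sum(1 for rid in sample
                  if rescore(rid) == original_scores[rid])
    return sample, matches / sample_size

# Toy usage: ten responses all scored 3, and an auditor who agrees.
originals = {f"resp{i}": 3 for i in range(10)}
_, agreement = audit_agreement(originals, lambda rid: 3, sample_size=5)
print(agreement)  # prints 1.0 (every audited score matches)
```

In practice a district would set a minimum agreement rate in advance and rescore a larger sample, or all responses, when the audit falls below it.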

How do I determine high, moderate, or low growth?
Approaches span the qualitative and the quantitative: Previous Results, Variability, Fairness, Educator Review, Identification, Agreement.
 Ask educators: how much growth is moderate on this assessment?
 Does the DDM make meaningful distinctions?
 Do all students have an equal chance to demonstrate growth?
See the Scoring and Parameter Setting Implementation Brief: du/edeval/ddm/Scoring&PSetting.pdf
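One quantitative use of "Previous Results" is to set cut points from a prior administration. A minimal sketch (the tercile split and all names are a hypothetical illustration, not an ESE-prescribed method): split last year's growth scores into thirds and use the two cut points to classify new scores as low, moderate, or high.

```python
# Hypothetical parameter-setting sketch: derive low/high cut points
# from previous results, then categorize new growth scores.
from statistics import quantiles

def make_cuts(previous_scores):
    """Return (low_cut, high_cut) splitting previous results into thirds."""
    low_cut, high_cut = quantiles(previous_scores, n=3)
    return low_cut, high_cut

def categorize(score, low_cut, high_cut):
    """Classify one growth score against the cut points."""
    if score < low_cut:
        return "low"
    if score > high_cut:
        return "high"
    return "moderate"

# Toy previous-year growth scores; cuts land near 33.3 and 66.7.
prev = [10, 20, 30, 40, 50, 60, 70, 80, 90]
low_cut, high_cut = make_cuts(prev)
print([categorize(s, low_cut, high_cut) for s in (15, 50, 85)])
# prints ['low', 'moderate', 'high']
```

Mechanical cut points like these are only a starting point; the educator-review questions above (is moderate growth really moderate? does the measure distinguish fairly?) should still be applied before the categories are adopted.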

DDMs for School Counselors

Identifying DDMs – Key Questions for Specialized Instructional Support Personnel
 Is the measure aligned to content?
 For SISP educators (including school counselors), content should reflect their job functions and responsibilities and should align to what they do to support students, educators, administrators, and/or parents.
 Instructions: With your table, identify the key content (i.e., key job functions and responsibilities) of your role.

Identifying DDMs – Key Questions for Specialized Instructional Support Personnel
 What does success look like? What would you see if someone were successful at these job functions and responsibilities?
 Instructions: With your table, identify what success looks like for each of the job functions and responsibilities you identified.

Identifying DDMs – Key Questions for Specialized Instructional Support Personnel
 What is measurable?
 Measurement is the systematic process of assigning a number to an observation.
 Example: How tall is my friend?
 1. Place a measuring tape next to my friend and read the number next to the line closest to the top of his or her head.
 2. Stand next to my friend and estimate how much taller or shorter they are compared to myself.
 3. Place a ruler flat on my friend’s head and mark a line on the wall where the ruler hits it. Then measure how far the line is off the ground.
 Instructions: With your table, identify approaches you could use to measure the job responsibilities and functions you identified.

Identifying DDMs – Key Questions for Specialized Instructional Support Personnel
 Is the measure informative?
 Chosen measures should provide both educators and districts with actionable information that informs practice and identifies areas of strength as well as areas where more support is needed.

Key Messages for Stakeholders