District Determined Measures


District Determined Measures
Professional Development Day, January 27, 2014
White Brook Middle School Staff and Elementary Specialists
Shirley Gilfether

District Determined Measures
Measures of student learning, growth, and achievement related to the Massachusetts Curriculum Frameworks, Massachusetts Vocational Technical Education Frameworks, or other relevant frameworks, that are comparable across grade or subject level district-wide.

The Educator Evaluation Framework
Beginning in 2015-16, educators earn two ratings. (Massachusetts Department of Elementary and Secondary Education)

Student Impact Rating Regulations
For each educator there must be at least two measures. Options under 603 CMR 35.07(1)(a)(3-5):
- Statewide growth measure(s) [MCAS SGP must be used if applicable]
- District-determined measure(s) of student learning comparable across grade or subject district-wide
- For educators whose primary role is not as a classroom teacher, appropriate measures of the educator's contribution to student learning, growth, and achievement, set by the district

Rating of Impact on Student Learning: Two Ratings
The Summative Rating determines the type of educator plan; the Rating of Impact on Student Learning (Low, Moderate, High) determines its duration:
- Exemplary or Proficient: 1-year Self-Directed Growth Plan (Low impact) or 2-year Self-Directed Growth Plan (Moderate or High impact)
- Needs Improvement: Directed Growth Plan
- Unsatisfactory: Improvement Plan
A minimal code sketch of this matrix follows.
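As a quick illustration, here is a minimal sketch (not an official DESE tool) of how the plan matrix above could be encoded; the function name and the handling of the rating strings are assumptions made for illustration only.

    def educator_plan(summative: str, impact: str) -> str:
        """Return the educator plan implied by the two ratings (per the matrix above)."""
        summative, impact = summative.lower(), impact.lower()
        if summative in ("exemplary", "proficient"):
            # The impact rating only sets the duration of the Self-Directed Growth Plan.
            return ("1-year Self-Directed Growth Plan" if impact == "low"
                    else "2-year Self-Directed Growth Plan")
        if summative == "needs improvement":
            return "Directed Growth Plan"
        if summative == "unsatisfactory":
            return "Improvement Plan"
        raise ValueError(f"Unknown summative rating: {summative}")

    print(educator_plan("Proficient", "High"))  # 2-year Self-Directed Growth Plan
    print(educator_plan("Proficient", "Low"))   # 1-year Self-Directed Growth Plan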

Multiple sources of evidence inform the summative rating:
- Administrator observations; teacher-supplied evidence
- Classroom assessments; benchmarks from SLG; recurring assessments (BAS, writing prompts, etc.)
- Other evidence related to EES standards; student feedback
Explain: Here is a visual representation of the entire process. You can see the three categories of evidence along the left, and how they inform practice associated with each of the four Standards (the rubric is the frame), as well as an assessment of progress on educator goals. In its totality, an evaluator uses this information to determine an overall summative performance rating of Exemplary, Proficient, Needs Improvement, or Unsatisfactory. More information on determining a Summative Performance Rating is available in ESE's related guidance document, located on its website.

District-Determined Measures
DDMs may inform both an educator's summative performance rating and impact rating.
Summative Performance Rating evidence:
- Products of practice (e.g., observations)
- Multiple measures of student learning, growth, and achievement, including measures of student progress on classroom assessments and measures of student progress on learning goals set between the educator and evaluator
- Other evidence relevant to one or more of the four Standards of practice (e.g., student surveys)
Student Impact Rating evidence:
- Trends and patterns in student learning, growth, and achievement
- At least two years of data
- At least two measures: statewide growth measures, where available (including MCAS SGP), and additional DDMs comparable across schools, grades, and subject matter district-wide
Explain: Let's look at precisely how and where DDMs will inform an educator's evaluation. Here we have the required sources of evidence for both evaluation ratings. With respect to the Summative Performance Rating, multiple measures of student learning, growth, and achievement comprise one of the three categories of evidence that must be taken into account, and must include measures of student progress on classroom assessments, as well as measures of student progress related to an individual educator's learning goals. District-determined measures can serve this role for the Summative Performance Rating, depending on the nature of the educator's goals or educator plan. With respect to the Student Impact Rating, data from at least two state or district-wide measures of student learning gains over at least two years are used to determine each educator's impact on student growth. MCAS SGP must be one of these measures for educators with students taking the MCAS, and can be both where available. Bottom line: the Impact Rating is always based on a trend over time of at least two years, and it should reflect a pattern in the results on at least two different assessments, which is where district-determined measures come into play.

The two ratings work together
Explain: To wrap up, here you can see how these processes ultimately work together. Across the top, you'll see the evidence categories that inform an evaluator's judgment of practice associated with the four Standards, as well as the goals, and ultimately, the development of a Summative Performance Rating. On the bottom, you see how trends and patterns of student performance, evident within at least two measures of student learning, inform the Rating of Impact. These two ratings come together to determine the type and duration of the educator plan, as well as when and how to recognize excellent teachers and leaders.

Student Impact Rating Regulations
Evaluators must assign a rating based on trends (at least 2 years) and patterns (at least 2 measures). Options under 603 CMR 35.09(3)(a-c):
- High indicates significantly higher than one year's growth relative to academic peers in the grade or subject.
- Moderate indicates one year's growth relative to academic peers in the grade or subject.
- Low indicates significantly lower than one year's growth relative to academic peers in the grade or subject.

SGP from MCAS as a Growth Measure
Scores are not available until the fall of the following year. The Student Impact Rating only determines whether you are on a one- or two-year plan; it is a separate rating from the Summative Rating, which determines the type of plan you will be on. The State has set the SGP ranges for the rating: Low = 34 or below; Moderate = 35 to 65; High = 66 or above (a simple mapping is sketched below).
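For concreteness, here is a minimal sketch of that mapping; the function name is a hypothetical placeholder, and reading the bands as Low at 34 and below and High at 66 and above is our interpretation of the slide.

    def sgp_impact_band(sgp: float) -> str:
        """Map a student growth percentile to the state's Low/Moderate/High bands."""
        if sgp <= 34:
            return "Low"
        if sgp <= 65:
            return "Moderate"
        return "High"

    print(sgp_impact_band(50))  # Moderate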

District-Determined Measures
- DDMs should measure growth, not achievement. Student growth measures answer the fundamental question, "Where did my students start and where did they end?"
- All DDMs must have baseline data: some point of origin for the growth, some measure of the same core objectives.
- Assessments should be administered across all schools in the district where the same grade or subject is taught (e.g., 2nd grade ELA will use the BAS for everyone).
- DDMs should assess learning as directly as possible.

Measures of Growth – 4 Options
Every measure MUST have a BASELINE.
- Pre-Test / Post-Test: Pre- and post-tests can be identical measures administered twice, or comparable versions (a worked example follows below).
- Repeated Measures Design: Some teachers use short measures throughout the year to monitor student growth on a set of skills.
- Holistic Evaluation: A holistic evaluation of student growth combines aspects of a pre- and post-test model with the regularity of a repeated measures approach. These use a rubric that describes growth over time.
- Post-Test Only: Not really feasible for a locally made assessment. This only applies to MCAS and some commercial products, because there is no baseline.
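As an illustration of the pre-/post-test option, here is a minimal sketch of two common ways a growth score might be computed from a baseline and an end-of-year score; the choice of metric (raw-point gain versus share of possible gain) is a local decision and is not prescribed by the framework.

    def raw_gain(pre: float, post: float) -> float:
        """Growth as raw points gained from the baseline to the post-test."""
        return post - pre

    def share_of_possible_gain(pre: float, post: float, max_score: float) -> float:
        """Growth as the share of points still available at baseline (in percent)."""
        if max_score <= pre:
            return 0.0
        return (post - pre) / (max_score - pre) * 100

    print(raw_gain(12, 20))                              # 8 raw points
    print(round(share_of_possible_gain(12, 20, 30), 1))  # 44.4 percent of possible gain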

Two fundamental questions should be the guideposts for selecting DDMs as a measure of student learning:
1. Is the measure aligned to content? Does it assess what is most important for students to learn and be able to do? Does it assess what the educators intend to teach?
2. Is the measure informative? Do the results inform educators about curriculum, instruction, and practice? Does it provide valuable information to educators about their students, helping them identify whether students are making the desired progress, falling short, or excelling? Does it provide valuable information to schools and districts about their educators?

Measures of Growth with Specific Assessment Types
- Portfolios: If a portfolio is to be used as a DDM that measures growth, it must be designed to capture progress rather than to showcase accomplishments.
- Unit Assessments: While these are a common form of assessment, baseline data are needed for comparison. Also, one unit alone is not enough of a measure.
- End-of-Course Exams: While many courses have these already, baseline data are again necessary. For this reason, these are rarely used as DDMs unless pre-assessment data are collected.
- Capstone Projects: Capstone projects are large-scale student projects that represent a culmination of the work completed in a course. Perhaps the biggest challenge in using capstone projects as DDMs is the difficulty of measuring growth, since the goal of DDMs is to measure student growth over the year.

Developing a DDM Assessment
Step 1 – Identify the key content (CCO, key standards, concepts, or skills): it may be taught repeatedly across the year or once in the year; be sure that it is informative (informs teachers, students, and administration).
Step 2 – Ensure that a change in performance represents student growth: the measure starts with a baseline, measures similar content, and demonstrates what students know and don't know.
Step 3 – Select an approach for measuring growth: pre/post, repeated measures, or holistic, which might include portfolios, performance tasks, unit assessments, capstone projects, etc. Think about the type of assessments coming with PARCC.

Developing a DDM Assessment (cont.)
Step 4 – Begin to build assessments: align to content, seek ideas from other resources as available, and think about weighting certain questions.
Step 5 – Decide on a scoring protocol: raw score, percent score, or rubric score, and how growth will be determined (raw to raw, % to %, etc.).
Step 6 – Draft a scale for low, moderate, and high impact: moderate is based on what would be the expectation for most students, or a year's worth of growth; low is below the expectation; and high is significantly higher than the expectation (see the sketch below).
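To make Steps 5 and 6 concrete, here is a hedged sketch of one way a drafted scale might be applied to a class's growth scores; the expected gain, the tolerance band, and the use of the median are all hypothetical placeholders that a district would set for itself.

    from statistics import median

    EXPECTED_GAIN = 10   # hypothetical "year's worth of growth" in raw points
    BAND = 5             # hypothetical tolerance around that expectation

    def class_impact(gains: list[float]) -> str:
        """Classify a class's typical growth as low, moderate, or high impact."""
        typical = median(gains)  # one possible summary of the class
        if typical < EXPECTED_GAIN - BAND:
            return "low"
        if typical > EXPECTED_GAIN + BAND:
            return "high"
        return "moderate"

    print(class_impact([4, 9, 11, 13, 16]))  # moderate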

Assessment Protocols
It is important to know that there must be a set of protocols for the assessments used as DDMs. For example, assessments should be given on the same day, have the same set of directions, use the same scoring methods, etc. The protocols are similar to the steps taken to administer MCAS and ensure the reliability of the assessment. More information will become available once DDMs have been chosen for courses.

For More Information
- MTA YouTube piece on the Student Growth Percentile
- DESE website – Educator Evaluation – District Determined Measures
- Technical Guide B
- Example DDMs, based on Core Course Objectives; the examples include Core Course Objectives for many levels
- Go to the Assessment Literacy Webinar Series
- Go to Presentations on the left menu instead of DDM, and look for the Getting Started ppt for Educator Evaluation
- Visit the PARCC or Smarter Balanced websites for samples
- Materials and information are stored on the NEW district website, including this ppt and the DDM packet of info
- Contact Shirley Gilfether, who will get answers to your questions

Questions and Answers
IMPORTANT NOTES:
- DDM pilot going on now: grade 10 ELA, grade 7 music, and grades 3 and 5 mathematics
- February 3rd: DDMs should be decided for each course (only the basics of course/level and type of DDM)
- April 4th: Drafts of DDMs to administrators