Presentation transcript:

Defining and Measuring Educator Effectiveness Part II
Thursday, March 3, 2011, 2:30–4:30 p.m. EST
If you cannot be called directly, dial 888-803-6395.

Agenda
– Overview and Introduction: Pamela Buffington
– Featured Presentation: Laura Goe
– Regional Context: Pamela Buffington
– State and Local Context: Barbara Moody, Patrick Phillips, Portland Public Schools
– Roundtable Discussion: All Panelists
– What's Next: Pamela Buffington

Adobe Connect Pro

Who's With Us Today?

What Is a Regional Educational Laboratory?
– Network of ten RELs
– Serves regional needs:
 – Applied research
 – Development projects
 – Studies
 – Technical assistance

REL-NEI States and Territories
Connecticut, Maine, Massachusetts, New Hampshire, New York, Puerto Rico, Rhode Island, US Virgin Islands, Vermont
Five-million-plus students; nearly 10,000 schools and 2,000 districts

Defining and Measuring Educator Effectiveness: Working Together
– REL Northeast and Islands
– National Comprehensive Center for Teacher Quality
– New England Comprehensive Center
– Maine Department of Education


Featured Presenter
Dr. Laura Goe
Research Scientist, Educational Testing Service
Principal Investigator, National Comprehensive Center for Teacher Quality

National Comprehensive Center for Teacher Quality (the TQ Center)
A federally funded partnership whose mission is to help states carry out the teacher quality mandates of ESEA:
– Vanderbilt University: students with special needs, at-risk students
– Learning Point Associates: technical assistance, research, fiscal agent
– Educational Testing Service: technical assistance, research, dissemination

The Goal of Teacher Evaluation
The ultimate goal of all teacher evaluation should be… TO IMPROVE TEACHING AND LEARNING

Federal Priorities (August 2010)
From "Race to the Top," reiterated in the August 5, 2010 Federal Register (Vol. 75, No. 150), "Secretary's Priorities for Discretionary Grant Programs":
– Teachers should be evaluated using state standardized tests where possible.
– For non-tested subjects, other measures (including pre- and post-tests) can be used, but they must be "rigorous and comparable across classrooms" and must measure growth "between two points in time."
– Multiple measures should be used, such as multiple classroom evaluations.
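Read literally, the growth requirement is simple arithmetic. Below is a minimal sketch, with made-up scores and a hypothetical classroom_growth helper; the Federal Register sets criteria ("rigorous and comparable," "between two points in time") but prescribes no formula.

```python
# Minimal sketch: per-student growth between two points in time,
# aggregated to a classroom mean. Data and names are hypothetical.
from statistics import mean

def classroom_growth(pre_scores, post_scores):
    """Mean of (post - pre) across students in one classroom."""
    return mean(post - pre for pre, post in zip(pre_scores, post_scores))

pre = [42, 55, 61, 38, 70]    # fall pre-test
post = [51, 60, 72, 45, 78]   # spring post-test
print(f"Mean growth: {classroom_growth(pre, post):.1f} points")  # 8.0
```

The hard part, as the priorities note, is not the subtraction but making the two assessments rigorous and comparable across classrooms.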

Questions to Ask About Models
– Are they "rigorous and comparable across classrooms"?
– Do they show student learning growth "between two points in time"?
– Are they based on grade-level and subject standards?
– Do they allow teachers from all subjects and grades (not just grades 4–8 math and ELA) to be evaluated with evidence of student learning growth?

Evaluation Models
– Austin, TX
– Delaware
– Georgia
– Hillsborough, FL
– New Haven, CT
– Rhode Island
– TAP (Teacher Advancement Program)
– Washington, DC

Evaluation System Models
– Austin: Student learning objectives with pay-for-performance; group and individual SLOs assessed with a comprehensive rubric.
– Delaware Model: Teachers participate in identifying grade/subject measures, which then must be approved by the state.
– Georgia CLASS Keys: Comprehensive rubric that includes student achievement (see the last few pages of the rubric).

Evaluation System Models
– Hillsborough, Florida: Creating assessments/tests for all subjects.
– New Haven, CT: SLO model with a strong teacher development component and matrix scoring (see the Teacher Evaluation & Development System).
– Rhode Island DOE Model: Student learning objectives combined with teacher observations and professionalism.

Evaluation System Models
– Teacher Advancement Program (TAP): Value-added for tested grades only; no information on other subjects/grades; multiple observations for all teachers.
– Washington, DC IMPACT Guidebooks: Variation in how groups of teachers are measured: 50% standardized tests for some groups, 10% other assessments for non-tested subjects and grades.

Austin Independent School District
Student Learning Objectives:
– Teachers determine two SLOs for the semester/year.
– One SLO must address all students; the other may be targeted.
– Use a broad array of assessments.
– Assess student needs more directly.
– Align classroom, campus, and district expectations.
– Aligned to state standards and campus improvement plans.
– Based on multiple sources of student data.
– Assessed with pre- and post-assessments.
– Targets of student growth.
– Peer collaboration.
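To show what an Austin-style SLO record might contain, here is a hypothetical data structure reflecting the bullets above (an all-students flag, a growth target, and pre/post assessment scores). All field names are illustrative, not Austin ISD's.

```python
# Hypothetical sketch of an SLO record. A teacher sets two of these
# per semester/year; one must cover all students. "Met" is judged
# here as mean pre-to-post growth reaching the target (an assumed
# rule, for illustration only).
from dataclasses import dataclass

@dataclass
class StudentLearningObjective:
    description: str
    covers_all_students: bool   # one of the two SLOs must be True
    growth_target: float        # expected mean score gain
    pre_scores: list
    post_scores: list

    def met(self) -> bool:
        """True if mean per-student growth reaches the target."""
        gains = [post - pre
                 for pre, post in zip(self.pre_scores, self.post_scores)]
        return sum(gains) / len(gains) >= self.growth_target

slo = StudentLearningObjective(
    description="Improve expository writing, aligned to state standards",
    covers_all_students=True,
    growth_target=8.0,
    pre_scores=[40, 52, 47],
    post_scores=[50, 61, 58],
)
print(slo.met())  # True: mean gain 10.0 >= target 8.0
```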

Rubric for Student Learning Objectives
RIGOR: How rigorous is your SLO?
– Level 4 (Approval): Content is challenging and complex and progressively deepens knowledge of core content. Content is thought-provoking, requiring high thinking demand. Requires analytical thinking and active use of knowledge. Content is relevant to life/experiences.
– Level 3 (Approval): Content is challenging and complex for most, but not all, students. Requires analytical thinking. Content is relevant for most, but not all, students.
– Level 2 (Needs Revision): Content is challenging and complex for some students. Does not require analytical thinking. Content is relevant for some students.
– Level 1 (Needs Revision): Content is not challenging. Does not require analytical thinking. Content is not relevant to life and learning experiences.

SLO Model Strengths/Weaknesses
Strengths:
– Teachers take an active role in determining student learning goals.
– Good professional growth opportunity for teachers.
– If objectives are of high quality and teachers plan instruction to meet them, students should benefit.
Weaknesses:
– Heavily dependent on administrator understanding of, and time commitment to, supervision.
– Not "comparable across classrooms," because teachers set the objectives and they will vary widely.
– "Rigor" depends on evaluators' understanding and/or having an appropriate rubric.

The "Rhode Island Model" Is Another Example of an SLO Model
Under consideration, not yet implemented:
– Teachers measure student growth by setting student academic goals aligned to standards.
– During the goal-setting process, principals confer with teachers to establish each goal's degree of ambition and select the appropriate assessments for measuring progress against the goals.
– Teacher evaluation will be based on students' progress on the established goals, as determined by an end-of-the-year principal review of the pre-determined assessments and their results.

The "Rhode Island Model"
The Rhode Island Model (RI Model) comprises:
1. Impact on student learning
2. Professional practice (including content knowledge)
3. Professional responsibilities
"…each teacher's Student Learning (SL) rating will be determined by a combination of state-wide standardized tests, district-selected standardized tests, and local school-based measures of student learning whenever possible."

RIDE Model: Impact on Student Learning
– Category 1: Student growth on state standardized tests that are developed and/or scored by RIDE.
– Category 2: Student performance (as measured by growth) on standardized district-wide tests that are developed and/or scored by either the district or an external party but not by RIDE (e.g., NWEA, AP exams, Stanford-10, ACCESS).
– Category 3: Other, more subjective measures of student performance (growth measures and others, as appropriate) that would likely be developed and/or scored at the district or school level (e.g., student performance on school- or teacher-selected assessments, administrator review of student work, attainment of student learning goals developed and approved by both teacher and evaluator).

Rhode Island DOE Model: Framework for Applying Multiple Measures of Student Learning
The student learning rating is determined by a combination of different sources of evidence of student learning. These sources fall into three categories:
– Category 1: Student growth on state standardized tests (e.g., NECAP, PARCC)
– Category 2: Student growth on standardized district-wide tests (e.g., NWEA, AP exams, Stanford-10, ACCESS)
– Category 3: Other local school-, administrator-, or teacher-selected measures of student performance
Student learning rating + professional practice rating + professional responsibilities rating = final evaluation rating

"Rhode Island Model": Student Learning Group Guiding Principles
"Not all teachers' impact on student learning will be measured by the same mix of assessments, and the mix of assessments used for any given teacher group may vary from year to year."
– Teacher A (5th grade): Category 1 (growth on NECAP) + Category 2 (e.g., growth on NWEA) + Category 3 (e.g., principal review of student work over a six-month span) = Teacher A's student learning rating
– Teacher B (11th grade English): Category 2 (e.g., AP English exam) + Category 3 (e.g., joint review of critical essay portfolio) = Teacher B's student learning rating
– Teacher C (middle school art): may use several Category 3 assessments
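A minimal sketch of the guiding principle above: each teacher's student learning rating is built from whatever mix of Category 1–3 evidence applies to that teacher. The equal weighting of categories is an assumption for illustration; RIDE's actual combination rules are not given in the slides.

```python
# Sketch: different teachers bring different mixes of category
# evidence; the rating simply averages whatever evidence exists.
# Scores on a hypothetical 1-4 scale; equal weights are assumed.
def student_learning_rating(category_scores: dict) -> float:
    """Average the category evidence available for one teacher."""
    return sum(category_scores.values()) / len(category_scores)

teacher_a = {"cat1_necap_growth": 3.0,       # 5th grade
             "cat2_nwea_growth": 3.5,
             "cat3_student_work_review": 4.0}
teacher_b = {"cat2_ap_english": 3.0,         # 11th grade English
             "cat3_essay_portfolio": 3.5}
teacher_c = {"cat3_portfolio": 3.0,          # middle school art
             "cat3_exhibit": 4.0}

for name, mix in [("A", teacher_a), ("B", teacher_b), ("C", teacher_c)]:
    print(f"Teacher {name}: {student_learning_rating(mix):.2f}")
```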

"Rhode Island Model" Strengths and Weaknesses
Strengths:
– Includes teachers in the evaluation of student learning (outside of standardized tests).
– Teachers will benefit from having assessment of student learning at the classroom level.
Weaknesses:
– Heavily administrator/evaluator-driven process.
– Teachers can weigh in on assessments but do not determine student growth.

New Haven Evaluators and Support Providers
– Instructional managers are responsible for giving the final rating. They may be principals, assistant principals, or, "as necessary and appropriate, a designee."
– There are also coaches (instructional and content), lead teachers, and mentors, who may have no teaching load or a reduced load and may be itinerant or school-based.

New Haven "Matrix"
"The ratings for the three evaluation components will be synthesized into a final summative rating at the end of each year. Student growth outcomes will play a preponderant role in the synthesis."
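The NHPS matrix itself is not reproduced in the slides. As a stand-in, the sketch below uses a weighted average in which student growth carries a preponderant (here, 60%) share; the real system is a matrix lookup, and these weights are assumptions, not New Haven's rule.

```python
# Illustrative stand-in for the New Haven synthesis: three component
# ratings on a 1-5 scale are combined so that student growth
# dominates. The 60/20/20 weights are assumed for demonstration.
def final_rating(growth: float, practice: float,
                 responsibilities: float) -> float:
    """Weighted synthesis of the three evaluation components."""
    return 0.6 * growth + 0.2 * practice + 0.2 * responsibilities

print(final_rating(5, 1, 1))  # 3.4: strong growth pulls the rating up
print(final_rating(1, 5, 5))  # 2.6: weak growth holds it down
```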

New Haven Goal-Setting Process
– Teachers administer formative/diagnostic assessments to each of their groups of students prior to the Goal-Setting Conference.
– During the Goal-Setting Conference, teachers set appropriate academic goals for students in collaboration with the instructional manager.
– Secondary level: goals for each of the teacher's individual classes, with academic goals focused solely on the knowledge and skills relevant to the content area.
– Elementary level: where a teacher works primarily with one group of students (or a class) across multiple disciplines, the teacher will devise academic goals that cover the breadth of instruction with a focus on the priority learning areas.

New Haven Goal-Setting Process
– Teachers, in collaboration with their instructional manager, will determine the appropriate number of goals as well as whether the goals set are "acceptable", i.e., aligned to standards, challenging but attainable, measurable, and based on assessment(s) that meet district criteria.
– If the teacher and instructional manager cannot agree on an appropriate set of goals, a third party (e.g., a district supervisor) will mediate and, if necessary, act as the final decision-maker.

New Haven Measures by "Group"

New Haven Assessment Examples
Examples of assessments/measures:
– Basic literacy assessments, DRA
– District benchmark assessments
– Connecticut Mastery Test
– LAS Links (English language proficiency for ELLs)
– Unit tests from NHPS-approved textbooks
– Off-the-shelf standardized assessments (aligned to standards)
– Teacher-created assessments (aligned to standards)
– Portfolios of student work (aligned to standards)
– AP and International Baccalaureate exams

New Haven Strengths/Weaknesses
Strengths:
– Focus on support and growth opportunities.
– Good documentation/communication.
– Teacher and union buy-in/participation.
– Student learning growth assessed for all teachers.
Weaknesses:
– Not comparable in many cases, because teachers choose their own measures.
– Rigor may vary depending on the guidance of instructional managers.

Teacher Advancement Program (TAP) Model
– TAP requires that teachers in tested subjects be evaluated with value-added models.
– All teachers are observed in their classrooms (using a Charlotte Danielson-type instrument) at least three times per year by different observers (usually one administrator and two teachers appointed to the role).
– Teacher effectiveness (for performance awards) is determined by a combination of value-added and observations.
– Teachers in non-tested subjects are given the school-wide average for their value-added component, which is combined with their observation scores (a sketch follows below).
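A minimal sketch of the TAP combination just described: teachers in non-tested subjects inherit the school-wide average value-added. The 50/50 weighting is an assumption for illustration; TAP's actual weights are not given in the presentation.

```python
# Sketch of the TAP combination: individual value-added where it
# exists, otherwise the school-wide average, blended with the
# teacher's observation average. Weights are assumed, not TAP's.
def tap_score(observation_avg, value_added=None, school_avg_va=0.0):
    """Combine observations with individual or school-wide value-added."""
    va = value_added if value_added is not None else school_avg_va
    return 0.5 * observation_avg + 0.5 * va

print(tap_score(observation_avg=4.0, value_added=3.2))    # tested subject: 3.6
print(tap_score(observation_avg=4.0, school_avg_va=3.5))  # non-tested: 3.75
```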

TAP Strengths/Weaknesses
Strengths:
– Value-added becomes everyone's responsibility, which should encourage teachers in non-tested subjects to support teachers in tested subjects.
– Multiple yearly observations should be more informative and produce more reliable information about practice.
– Professional development aligned with results is required.
Weaknesses:
– Concerns about "fairness" when only a few teachers' student achievement and progress toward learning goals "count."
– Tells you nothing about how teachers in other subjects are performing in terms of student learning growth (grades are not always good indicators).

IMPACT Sorts Teachers into Groups That Are Evaluated Differently
– Group 1: general education teachers for whom value-added data can be generated
– Group 2: general education teachers for whom value-added data cannot be generated
– Group 3: special education teachers
– Group 4: non-itinerant English Language Learner (ELL) teachers and bilingual teachers
– Group 5: itinerant ELL teachers
– Etc.

Score Comparison for Groups 1 & 2
Component: Group 1 (tested subjects) / Group 2 (non-tested subjects)
– Teacher value-added (based on test scores): 50% / 0%
– Teacher-assessed student achievement (based on non-VAM assessments): 0% / 10%
– Teaching and Learning Framework (observations): 35% / 75%
– Commitment to School Community: 10% / 10%
– School-wide value-added: 5% / 5%
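The weights in this table can be applied directly. The sketch below uses the slide's percentages; the component scores themselves are invented for illustration.

```python
# Weighted IMPACT-style score using the Group 1 / Group 2 weights
# from the table above. Component scores are hypothetical values on
# a common scale; the weights are from the slide.
WEIGHTS = {
    "group1": {"value_added": 0.50, "teacher_assessed": 0.00,
               "tlf_observations": 0.35, "commitment": 0.10,
               "school_value_added": 0.05},
    "group2": {"value_added": 0.00, "teacher_assessed": 0.10,
               "tlf_observations": 0.75, "commitment": 0.10,
               "school_value_added": 0.05},
}

def impact_score(group: str, components: dict) -> float:
    """Dot product of a group's weights with its component scores."""
    return sum(w * components[k] for k, w in WEIGHTS[group].items())

scores = {"value_added": 3.0, "teacher_assessed": 3.4,
          "tlf_observations": 3.2, "commitment": 4.0,
          "school_value_added": 2.8}
print(impact_score("group1", scores))  # 3.16: value-added dominates
print(impact_score("group2", scores))  # 3.28: observations dominate
```

The same component scores produce different final scores for the two groups, which is exactly the "huge differences in how teachers are measured" weakness noted later.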

Group 2 Assessment Rubric
– Three "cycles" of data are collected and averaged per year.
– Highest level of the rubric: "Teacher has at least 1 high-quality source of evidence (i.e., one that is rigorous and reliable) demonstrating that approximately 90% or more of her/his students are on track to make significant learning growth (i.e., at least a year's worth) towards mastery of the DCPS content standards over the course of the year."

Non-VAM Tests (accepted under Washington, DC's IMPACT evaluation system)
– DC Benchmark Assessment System (DC BAS)
– Dynamic Indicators of Basic Early Literacy Skills (DIBELS)
– Developmental Reading Assessment (DRA)
– Curriculum-based assessments (e.g., Everyday Mathematics)
– Unit tests from DCPS-approved textbooks
– Off-the-shelf standardized assessments that are aligned to the DCPS content standards
– Rigorous teacher-created assessments that are aligned to the DCPS content standards
– Rigorous portfolios of student work that are aligned to the DCPS content standards

DC IMPACT Strengths/Weaknesses
Strengths:
– Uses multiple measures to assess effectiveness.
– Permits the use of many types of assessment for students in non-tested subjects and grades.
– Includes what is important in the system (in order to encourage specific teacher behaviors).
Weaknesses:
– No multiple measures of student learning growth for teachers in tested subjects and grades.
– Huge differences in how teachers are measured.

Georgia KEYS

Georgia KEYS for Non-Tested Subjects

Georgia KEYS Strengths/Weaknesses
Strengths:
– The rubric for measuring teacher contribution is easy to understand.
– Includes examples of multiple measures of student learning for all teachers, including those in tested grades and subjects.
Weaknesses:
– The rubric (including observation and other information) is about 100 pages long.
– Might be a challenge to implement.

Hillsborough, FL
– Stated goal is to evaluate every teacher's effectiveness with student achievement growth, even teachers in non-tested subjects and grades.
– Undertaking to create pre- and post-assessments for all subjects and grades.
– Expanding state standardized tests and using value-added to evaluate more teachers.
– Part of a multiple-measures system.

Hillsborough Strengths/Weaknesses
Strengths:
– Teacher and union involvement in evaluation system decisions.
– Teachers may be able to recommend tests they are already using.
– All teachers are included, not just those in tested subjects.
Weaknesses:
– Very expensive to create tests for all grades and subjects.
– Takes teachers out of the assessing/scoring/improving-instruction loop.

Delaware Model
– Standardized tests will be used as part of teachers' scores in some grades/subjects.
– "Group-alike" teachers, meeting with facilitators, determine which assessments, rubrics, and processes can be used in their subjects/grades (multiple measures).
– Assessments must focus on standards and be given in a "standardized" way, i.e., giving the pre-test on the same day, for the same length of time, with the same preparation.
– Teachers recommend assessments to the state for approval.
– Teachers and groups of teachers take primary responsibility for determining student growth.
– The state will monitor how assessments are "working."

Delaware Model: Strengths/Weaknesses
Strengths:
– Teacher-driven process (assumes teachers are the experts in assessing their students' learning growth).
– Great professional growth opportunity as teachers work together across schools to determine assessments, score student work, etc.
Weaknesses:
– Validity issues (how the assessments are given and scored, teacher training to score, etc.).
– Time must be built in for teachers to work together on scoring (particularly for rubric-based assessments).

Final Thoughts
– Policy is way ahead of the research in teacher evaluation. It is too early to tell which model and combination of measures will provide the most accurate and useful information about teacher effectiveness.
– Inclusion of student achievement growth data represents a huge culture shift in evaluation. Communication and training are essential, and teacher/administrator participation and buy-in are crucial to ensure change.
– Focus on models and measures that may help districts, schools, and teachers improve performance.

More Final Thoughts
– Consider implementing measures in stages: by subject and grade, by tested vs. non-tested subjects, or by particular measures (observations, student achievement).
– Analyze data from various performance measures before setting cut scores; check measures for the "widget effect."
– Build in frequent "system evaluations" to examine whether measures are working as intended. Good measures used properly should differentiate among teachers (see the sketch below).
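One way a district might operationalize the widget-effect check: flag a system in which nearly all teachers land in the top rating category, since well-functioning measures should differentiate. The 90% threshold and the data below are illustrative, not from the presentation.

```python
# Sketch of a "system evaluation" for the widget effect: if almost
# everyone receives the top rating, the measure is not
# differentiating among teachers. Threshold and data are made up.
from collections import Counter

def widget_effect_check(ratings, top_rating, threshold=0.90):
    """Return (share at top rating, True if it exceeds threshold)."""
    counts = Counter(ratings)
    top_share = counts[top_rating] / len(ratings)
    return top_share, top_share > threshold

ratings = ["exemplary"] * 460 + ["effective"] * 30 + ["developing"] * 10
share, flagged = widget_effect_check(ratings, top_rating="exemplary")
print(f"{share:.0%} rated at the top; widget effect suspected: {flagged}")
```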

Questions or Comments

Bridging Research and Practice in Maine

Which of the presented models have characteristics or attributes that align to your local policy and practice? (Check all that apply.)


Regional Context
New England Collaborative for Educator Effectiveness (NECEE):
– Regional partnership established in July 2009 by state education leaders from the six New England states
– Focused on defining, measuring, and evaluating educator effectiveness
– Supported by REL-NEI, NECC, and NCCTQ

REL-NEI Technical Assistance to NECEE
– Regional conference: Bridging Research and Practice: Multiple Measures of Teacher Effectiveness (June 2010)
– Webinar: Measuring Teacher Effectiveness in New England (May 2010)
– Key Considerations When Measuring Teacher Effectiveness: A Framework for Validating Teachers' Professional Practices (February 2011)

REL-NEI Related Work
– Race to the Top states: sharing learning from MA, NY, and RI
– State-specific work:
 – New Hampshire: Teacher Effectiveness Task Force video conference
 – Connecticut: Rethinking the Role of Teacher Evaluation in Improving Teacher and Principal Effectiveness
 – Maine: Defining and Measuring Teacher Effectiveness conference
 – New York: series of TA projects on educator effectiveness

Questions or Comments


Panelist
Barbara Moody, Maine Department of Education
– Title II Coordinator
– Math/Science Partnership Grants
– State Agency for Higher Education Grants

Maine's Path Toward Improving Educator Effectiveness
– Removed barriers to linking student achievement data to evaluations
– Established the Evaluation Model Stakeholders Group
– Applied for a Teacher Incentive Fund grant
– Joined the New England Collaborative for Educator Effectiveness
– Joined the States' Collaborative for Educator Quality (national)

Evaluation Model Stakeholders Group
Charge: approve a model of a "qualifying evaluation system" that local districts can use if they choose to link student achievement to principal and teacher evaluation systems.

Teacher Incentive Fund
– 5 districts, 15 schools
– Three areas of focus:
 – Mentoring and induction
 – PD through the National Board for Professional Teaching Standards
 – Performance-based compensation system

Maine's Model for Evaluating and Developing Great Teachers and Leaders
– Student achievement/growth
– Standards for effective teachers
– Standards for effective leaders


Panelist
Patrick Phillips, Superintendent of Schools, MSAD #61 (Lake Region School District), Bridgton, Maine
– Campaign for the Civic Mission of Schools, Council for Excellence in Government
– Deputy Commissioner, Maine DOE
– Distinguished Educator for Learning Results, Maine DOE
– Co-chair, Maine Task Force on Citizenship Education
– Co-chair, Maine Task Force on Gender Equity in Education

Teacher Evaluation, SAD #61
– The local process began four years ago but stalled for lack of a clearly better model.
– A broad-based committee was re-energized in 2009–10 when the State of Maine stakeholders group was convened.
– Our research identified the Kim Marshall rubrics and approach to "walkthroughs" as the most promising approach and the one most consistent with our philosophy.

Teacher Evaluation, SAD #61
– The Marshall rubrics are based on Jon Saphier's The Skillful Teacher.
– SAD #61 has one school participating in the School Improvement Grant process with the State of Maine and USED.
– The SIG carries significant funds and a mandate to improve teacher evaluation.
– Grant funds permitted us to seek technical assistance to support the work.

Teacher Evaluation, SAD #61
– Our belief was to start with a clear definition of what excellent teaching looks like (the rubrics), then build knowledge of the rubrics through coursework and extensive professional development.
– To accomplish this work, we hired Research for Better Teaching to teach an initial course in fall 2010 for 12 administrators and 12 teachers.
– The course included training administrators in observing using the rubrics.

Teacher Evaluation, SAD #61
– An in-service day in March will also be devoted to Skillful Teacher training; a summer course is planned as well.
– The new evaluation system is in place this year for volunteers and will be implemented for all next year.
– The evaluation system is "holistic" by design and will include a broad look at student achievement.
– We continue to wait for further development at the state level.

Questions or Comments


Panelists
– Markos Miller, Spanish Language Teacher, High School
– Suellyn Santiago, Assistant Principal, Lincoln Middle School
– Kate Theriault, School Improvement Grant Coordinator

Educator Evaluation in Portland: Context
– Prompted and supported by a School Improvement Grant (Riverton School)
– Led by a district-level workgroup (n = 11) with teacher, administrator, and Central Office representatives; K–5, 6–8, and 9–12 are represented; meets twice monthly
– Timeline:
 – 2010–11: Develop plan; seek State Board approval
 – 2011–12: Riverton and select schools will pilot
 – 2012–13: District-wide implementation

Educator Evaluation in Portland: Vision
"The Portland Public Schools will develop an evaluation system for professional staff and administrators that focuses on student learning as the primary component of decision-making and support. This system builds on the professionalism of teachers, specialists, and administrators, and is based on current research and models in other districts."

Educator Evaluation in Portland: Standards
Charlotte Danielson's Enhancing Professional Practice, with local adaptation. Domains:
– Student Growth and Proficiency: school-wide targets based on state assessments, plus student learning goals established with teams or individual teachers and measured by various assessments
– Instructional Practice: planning and preparation, classroom environment, and instruction
– Professional Practice: includes work with families and community

Educator Evaluation in Portland: Resources
– Student Growth and Proficiency: partnership with USM's CEPARE
– New Haven, CT, and other districts
– Portland Evaluation Workgroup website
– Teacher Union Reform Network (TURN)
– Contact: Kate Theriault


Roundtable Discussion: General Questions
Panelists: Laura Goe, Patrick Phillips, Barbara Moody, Portland Public Schools

What Are the Policy Issues and Implications?
– Data systems (student and teacher)
– Compensation
– Teacher preparation and recruitment
Panelists: Laura Goe, Patrick Phillips, Barbara Moody, Portland Public Schools


Online Discussion Forum

Relnei.org

Reference Desk
The Reference Desk service provides brief responses to your education-related questions.

Dr. Laura Goe
Laura Goe, Ph.D.
National Comprehensive Center for Teacher Quality
th Street NW, Suite 500, Washington, DC

REL-NEI's Maine State Team
– Peter Tierney-Fife, Maine State Researcher, REL Northeast and Islands
– Pamela Buffington, Maine State Liaison, REL Northeast and Islands

Panelists
– Patrick Phillips, Superintendent of Schools, MSAD #61 (Lake Region School District), Bridgton, Maine
– Barbara Moody, Title IID Coordinator, Maine Department of Education