DELAWARE COUNTY CURRICULUM DIRECTORS EDUCATOR EFFECTIVENESS UPDATE August 19, 2014.


What happened?
- EE Admin Manual updated
- Regs for PE (Principal Effectiveness) and NTPs (Non-Teaching Professionals) formalized
- Teacher specific data "clarified"

Timelines

Teacher Effectiveness components:
- Observation (in effect )
- Building level (in effect eval year+)
- SLOs (in effect )
- Teacher specific (in effect 14-15, except PVAAS)

Principal Effectiveness components:
- Observation – FFL (in effect )
- Correlation data (in effect )
- SLOs (in effect , optional for 14-15)

NTP Effectiveness components:
- Observation – rubric-specific (in effect 14-15)
- Building level (in effect eval year+)

Keywords for TSD Elements
- LEAs must use these elements when they are AVAILABLE AND APPLICABLE to teachers.
- Recommendation: work closely with the bargaining unit and solicitor in determining which TSD elements apply, when, and to whom.

Reference Material
- EE Administrative Manual (overviews of EE)
- PA Bulletin Rules and Regulations (conversion tables)
- Teacher Effectiveness Student Performance Data FAQ
- Act 82 (original language)

Teacher Effectiveness – Teacher Specific Data (TSD)

Formerly this was addressed as teacher-level PVAAS only. Act 82 includes 4 elements of TSD.

Section 1123: Rating System, (b)(1)(ii)(B):
"(B) Fifteen per centum (15%) teacher-specific data, including, but not limited to, student achievement attributable to a specific teacher as measured by all of the following:
(I) Student performance on assessments.
(II) Value-added assessment system data made available by the department under section 221.
(III) Progress in meeting the goals of student individualized education plans required under the Individuals With Disabilities Education Act (Public Law , 20 U.S.C. § 1400 et seq.).
(IV) Locally developed school district rubrics."

Teacher Effectiveness – Teacher Specific Data (TSD)

Student performance on assessments
- Based upon the % of students who score PROF/ADV on state standardized assessments (PSSA, Keystone Exams)
- Cannot count for more than 5% of TSD
- Convert the % to a 0-3 scale using Table H (from PA Bulletin); the mechanics are sketched below
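The Table H conversion is a banded lookup: a percent Proficient/Advanced falls into a band, and the band determines the 0-3 earned points. A minimal Python sketch of those mechanics; the band boundaries here are hypothetical placeholders, since Table H's actual cut points are not reproduced in this update:

```python
# Sketch of a Table H-style conversion: % of students scoring
# Proficient/Advanced -> points on the 0-3 scale. The thresholds below
# are HYPOTHETICAL; substitute the actual Table H bands from the
# PA Bulletin.
HYPOTHETICAL_BANDS = [
    (90.0, 3.0),  # >= 90% Prof/Adv -> 3.0 (placeholder)
    (70.0, 2.0),  # >= 70%          -> 2.0 (placeholder)
    (50.0, 1.0),  # >= 50%          -> 1.0 (placeholder)
]

def percent_to_score(pct_prof_adv: float) -> float:
    """Map a percent Proficient/Advanced (0-100) onto the 0-3 scale."""
    for threshold, score in HYPOTHETICAL_BANDS:
        if pct_prof_adv >= threshold:
            return score
    return 0.0

print(percent_to_score(82.5))  # -> 2.0 under these placeholder bands
```

Table I for PVAAS (next slide) is the same kind of conversion, keyed to the value-added trend data rather than percent Proficient/Advanced.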

Teacher Effectiveness – Teacher Specific Data (TSD)
Value-added assessment system data (PVAAS)
- 3-year trend data per eligible teacher
- In effect for the evaluation year (3rd year)
- Cannot count for less than 10% of TSD
- Convert to a 0-3 scale using Table I (from PA Bulletin)

Teacher Effectiveness – Teacher Specific Data (TSD) Value-added assessment system data

Teacher Effectiveness – Teacher Specific Data (TSD)
Progress in meeting the goals of student individualized education plans (IEPs)
- Use the SLO process for determination
- Cannot count for more than 5% of TSD

"[This element] is a measure of growth and student performance related to special education students meeting IEP goals. Any measure based upon progress made in meeting students’ IEPs may be developed by the local LEA, if applicable to a particular classroom teacher, and shall be validated through a Student Learning Objective (SLO) process to compile a score for such measure." (from FAQ)

Teacher Effectiveness – Teacher Specific Data (TSD)
Progress in meeting the goals of student individualized education plans (IEPs)

"Teachers may use aggregated case load data of the percentage of students meeting IEP goals through documented progress monitoring. The supervising administrator should work in collaboration with the special education teacher to set the performance measures and indicators and should meet frequently with the special education teacher to review progress monitoring data (e.g., select a targeted subject area and grade level)."

"Per IDEA, it is expected that if students’ progress monitoring data indicates a student is not making progress, the IEP team must be reconvened to consider all data and make adjustments to the student’s program. This IEP progress monitoring aggregate data for a targeted subject and targeted case load should only be used in circumstances that preclude the use of the general education performance measures." (from FAQ)

Teacher Effectiveness – Teacher Specific Data (TSD)
Locally developed school district rubrics (LDR)
- Use the SLO process for determination
- For teachers with PVAAS: cannot count for more than 5% of TSD
- For teachers without PVAAS: cannot count for more than 15% of TSD
(from PA Bulletin)

Teacher Effectiveness – Teacher Specific Data (TSD)
Locally developed school district rubrics (LDR)

"Because LDR is not defined in Act 82, pursuant to regulation, LEAs may choose to utilize a measure from the list of elective data measures as the LDR. It is recommended that a classroom teacher’s evaluation which utilizes an elective data measure as the LDR also include an additional and separate elective data measure attributed to the 20% elective data measure."

"Although it is recommended that classroom teachers be given a separate LDR/elective measure and separate elective data measure, an LEA and classroom teacher may agree to use a single elective data measure chosen from the list of elective measures to comprise both the 15% teacher specific data score and the 20% elective score, which would account for 35% of an educator’s evaluation if no other teacher specific data elements are available or applicable." (from FAQ)

Teacher Effectiveness – Teacher Specific Data (TSD)
Locally developed school district rubrics (LDR)

"The LEA should consult with its solicitor regarding any possible agreement made between the LEA and classroom teachers to utilize a single elective/LDR measure as the combined teacher specific data and elective data measure in light of the requirements of Act 82 and current regulation." (from FAQ)

Teacher Effectiveness – Teacher Specific Data (TSD)
Who decides the proportion and weights of TSD elements?

From the FAQ: "The LEA determines what teacher specific data elements are utilized for a classroom teacher based upon the availability of the data and applicability to the individual classroom teacher in accordance with Act 82 and regulation. The LEA also determines the final weight allotted to each applicable teacher specific data component in accordance with regulation and as explained in these FAQs." The weighting arithmetic itself is straightforward, as sketched below.
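Once the weights are fixed, the final rating is a weighted sum of 0-3 component scores. A minimal Python sketch of that arithmetic, using the Act 82 component weights for a classroom teacher with all measures available (50% observation/practice, 15% building level, 15% teacher specific, 20% elective); the function name and example scores are illustrative, not from any PDE tool:

```python
# Illustrative weighted-sum rating for a classroom teacher with all
# measures available. Weights per Act 82: 50% observation/practice,
# 15% building level, 15% teacher-specific, 20% elective.
# All component scores are on the 0-3 scale; names are hypothetical.
WEIGHTS = {
    "observation": 0.50,
    "building_level": 0.15,
    "teacher_specific": 0.15,
    "elective": 0.20,
}

def overall_rating(scores: dict) -> float:
    """Combine 0-3 component scores into one weighted 0-3 score."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights must total 100%
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

example = {"observation": 2.4, "building_level": 2.0,
           "teacher_specific": 2.5, "elective": 3.0}
print(round(overall_rating(example), 3))  # 1.2 + 0.3 + 0.375 + 0.6 = 2.475
```

If an LEA and teacher agreed to the single combined LDR/elective measure described above, the "teacher_specific" and "elective" entries would simply share one underlying score, covering 35% of the evaluation.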

Principal Effectiveness

Correlation Data

Formerly, this was the "Correlation to Teacher PVAAS" piece of Principal Effectiveness. Correlation based on teacher-level measures includes "any combination of one or more of the following data for classroom teachers who are evaluated by the Principal/School Leader:
(i) Building level data
(ii) Teacher specific data
(iii) Elective data" (from PA Bulletin)

Principal Effectiveness

The Correlation Data Performance Level Descriptors in Table H (from PA Bulletin, reproduced below) are provided for the rater to use as a basis for developing a rating of 0, 1, 2, or 3 for the Correlation Rating. Discussions should take place between the supervising administrator and the principal/school leader.

Correlation Rating (15%) considers:
- Degree of understanding of evidence presented regarding the relationship between teacher-level measures and teacher observation and practice ratings.
- Quality of explanation provided for observed relationships between teacher-level measures and teacher observation and practice ratings.
- Plans for how the data will be used to support school and LEA goals.

0 - Failing: The Principal/School Leader's responses demonstrate no understanding of the three aspects of correlation (Degree, Quality, and Planning).
- Does not disaggregate teacher observation/practice ratings and teacher-level measures.
- Cannot cite plausible causes for connections among teacher observation/practice ratings and teacher-level measures.
- Cannot articulate why plausible connections may have occurred among teacher observation/practice ratings and teacher-level measures.
- Does not identify elements for an effective plan for increasing student performance based upon the analysis of teacher observation/practice ratings and teacher-level measures.

1 - Needs Improvement: The Principal/School Leader's responses demonstrate limited understanding of the three aspects of correlation (Degree, Quality, and Planning).
- Attempts to disaggregate and/or analyze teacher observation/practice ratings and teacher-level measures.
- Attempts to cite plausible causes for the connections among teacher observation/practice ratings and teacher-level measures.
- Attempts to articulate why the plausible connections may have occurred among teacher observation/practice ratings and teacher-level measures.
- Attempts to identify elements for an effective plan for increasing student performance based upon the analysis of teacher observation/practice ratings and teacher-level measures.

2 - Proficient: The Principal/School Leader's responses demonstrate solid understanding of the three aspects of correlation (Degree, Quality, and Planning).
- Disaggregates and conducts an analysis of teacher observation/practice ratings and teacher-level measures.
- Cites plausible causes for the connections among teacher observation/practice ratings and teacher-level measures.
- Articulates why the plausible connections may have occurred among teacher observation/practice ratings and teacher-level measures.
- Identifies elements for an effective plan for increasing student performance based upon the analysis of teacher observation/practice ratings and teacher-level measures.

3 - Distinguished: The Principal/School Leader's responses demonstrate comprehensive understanding of the three aspects of correlation (Degree, Quality, and Planning).
- Disaggregates teacher observation/practice ratings and teacher-level measures, as well as conducts an analysis to determine plausible connections among the data.
- Cites plausible causes for the connections among teacher observation/practice ratings and teacher-level measures.
- Articulates why the plausible connections may have occurred among teacher observation/practice ratings and teacher-level measures.
- Establishes an effective plan for increasing student performance based upon the analysis of teacher observation/practice ratings and teacher-level measures.
- Incorporates the results from the correlational section of the Principal Rating Form into the other aspects of Principal Effectiveness (e.g., Elective Data - Principal SLOs).

Sample items of data to be considered (including but not limited to the following):

Examples of aggregate data:
- Average teacher ratings for the building
- Teacher-level measures:
  - School Performance Profile (SPP)
  - Teacher Specific Data / PVAAS
  - Elective Data / SLO

Examples of disaggregated data from teacher observation and practice ratings:
- Teacher observation and practice ratings:
  - Analysis by rating category
  - Analysis by department
  - Analysis by grade level
  - Analysis by years of teaching experience
  - Analysis by years of service in building
- Other teacher observation and practice data

Examples of disaggregated data from SPP:
- PSSA scores
- Keystone scores
- Graduation rates
- Closing achievement gap
- NOCTI performance by department
- Other measures of student performance

Principal Effectiveness

SLOs for Principals:

Principal Effectiveness Training
- 2 components to the training:
  - Framework for Leadership (FFL)
  - Data elements (Correlation and SLOs)
- DCIU will be hosting upcoming PE training sessions in the fall.
- If you have completed FFL training already, there is no requirement to attend again.
- PIL hours: 30 PIL hours will be available for each training; awaiting the job-embedded assignment from PDE.

NTP Effectiveness

Who are NTPs?

1. CSPG-defined Specialists:
- CSPG 75: Dental Hygienist
- CSPG 76: Elementary and Secondary School Counselor
- CSPG 77: Home and School Visitor
- CSPG 78: Instructional Technology Specialists
- CSPG 80: School Nurse
- CSPG 81: School Psychologists

2. Supervisors (not Principals):
- CSPG 88: Supervisor of Curriculum and Instruction
- CSPG 89: Supervisor of Pupil Services
- CSPG 90: Supervisor of a Single Area (Subject)
- CSPG 91: Supervisor of Special Education
- CSPG 92: Supervisor of Vocational Education

NTP Effectiveness
Who are NTPs?

3. Instructionally Certified but provide No Direct Instruction (ICNDI): "Teaching Professionals with Unique Roles and Functions" (from EE Admin Manual)

Remember the 2-prong test. To determine whether you are a teaching professional, you must be able to answer yes to both of the following questions (a sketch of this logic follows below):
1) Are you working under your instructional certification?
2) Do you provide direct instruction* to students in a particular subject or grade level?

*Direct instruction is defined as planning and providing the instruction, and assessing the effectiveness of the instruction.
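The two-prong test is a strict conjunction: both answers must be yes, or the educator falls outside the teaching-professional group. A minimal Python sketch of that logic (the function name is illustrative, not from the EE Admin Manual):

```python
# Illustrative encoding of the two-prong test: a teaching professional
# must satisfy BOTH prongs; holding instructional certification while
# failing prong 2 is what defines the ICNDI group.
def is_teaching_professional(works_under_instructional_cert: bool,
                             provides_direct_instruction: bool) -> bool:
    """Prong 1 AND prong 2. 'Direct instruction' means planning and
    providing the instruction, and assessing its effectiveness."""
    return works_under_instructional_cert and provides_direct_instruction

# e.g., an instructionally certified staff member who does not plan,
# provide, and assess instruction fails prong 2 -> ICNDI, rated as an NTP.
print(is_teaching_professional(True, False))  # False
```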

NTP Effectiveness

Rubrics
- Rubrics for CSPG-defined specialists are finalized.
- Supervisors (not Principals) are included as NTPs with Guiding Questions and Examples, but no rubrics.
- Staff members who are Instructionally Certified but provide No Direct Instruction (ICNDI) would be evaluated based on their corresponding rubric; note, however, the disparity between the ICNDI Framework for Teaching (FFT) and NTP rating form 83-3: the rubric for ICNDI may utilize domains different from those found on the NTP summative evaluation form 83-3.

Training
- The process is the same as for teachers.
- Training is not required if you have already completed it.
- DCIU will be offering NTP Effectiveness training in 2 parts: the Danielson process and a rubric review.