DELAWARE COUNTY CURRICULUM DIRECTORS EDUCATOR EFFECTIVENESS UPDATE August 19, 2014
What happened?
- EE Administrative Manual updated
- Regulations for Principal Effectiveness (PE) and Non-Teaching Professionals (NTPs) formalized
- Teacher-specific data "clarified"
Timelines
Teacher Effectiveness components:
- Observation (in effect 13-14 and beyond)
- Building level (in effect 13-14 evaluation year and beyond)
- SLOs (in effect 14-15 and beyond)
- Teacher specific (in effect 14-15 and beyond, except PVAAS)
Principal Effectiveness components:
- Observation, using the Framework for Leadership (FFL) (in effect 14-15 and beyond)
- Correlation data (in effect 14-15 and beyond)
- SLOs (in effect 15-16 and beyond; optional for 14-15)
NTP Effectiveness components:
- Observation, rubric-specific (in effect 14-15)
- Building level (in effect 14-15 evaluation year and beyond)
Keywords for TSD Elements
- LEAs must use these elements when they are AVAILABLE AND APPLICABLE to teachers.
- Recommendation: work closely with the bargaining unit and solicitor in determining which TSD elements apply, when, and to whom.
Reference Material
- EE Administrative Manual (overviews of EE)
- PA Bulletin Rules and Regulations (conversion tables)
- Teacher Effectiveness Student Performance Data FAQ
- Act 82 (original language)
Teacher Effectiveness – Teacher Specific Data (TSD)
Formerly this was addressed as teacher-level PVAAS only. Act 82 includes 4 elements of TSD (Section 1123: Rating System, (b)(1)(ii)(B)):
"(B) Fifteen per centum (15%) teacher-specific data, including, but not limited to, student achievement attributable to a specific teacher as measured by all of the following:
(I) Student performance on assessments.
(II) Value-added assessment system data made available by the department under section 221.
(III) Progress in meeting the goals of student individualized education plans required under the Individuals With Disabilities Education Act (Public Law 91-230, 20 U.S.C. § 1400 et seq.).
(IV) Locally developed school district rubrics."
7
Teacher Effectiveness – Teacher Specific Data (TSD)
Student performance on assessments
- Based upon the percentage of students who score Proficient/Advanced on state standardized assessments (PSSA, Keystone Exams)
- Cannot count for more than 5% of TSD
- Convert the percentage to a 0-3 scale using Table H (from the PA Bulletin)
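The conversion step is a banded table lookup: a percent Proficient/Advanced figure maps to a band carrying a 0-3 value. A minimal sketch of that lookup, assuming placeholder cutoffs (the actual bands are defined in Table H of the PA Bulletin and are not reproduced in this deck):

```python
# Banded lookup converting a percent Proficient/Advanced figure to the
# 0-3 scale. NOTE: the cutoffs below are illustrative placeholders only,
# NOT the real Table H values from the PA Bulletin.
PLACEHOLDER_BANDS = [
    (90.0, 3.0),  # placeholder cutoff, not the real Table H value
    (70.0, 2.0),  # placeholder
    (50.0, 1.0),  # placeholder
    (0.0, 0.0),
]

def percent_to_rating(pct_prof_adv, bands=PLACEHOLDER_BANDS):
    """Return the rating for the highest band the percentage reaches."""
    for cutoff, rating in bands:
        if pct_prof_adv >= cutoff:
            return rating
    return 0.0
```

The same shape of lookup applies to the Table I conversion for value-added data; only the cutoffs differ.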
Teacher Effectiveness – Teacher Specific Data (TSD)
Value-added assessment system data
- 3-year trend data per eligible teacher
- In effect for evaluation year 15-16 (the 3rd year)
- Cannot count for less than 10% of TSD
- Convert to a 0-3 scale using Table I (from the PA Bulletin)
10
Teacher Effectiveness – Teacher Specific Data (TSD) Value-added assessment system data
Teacher Effectiveness – Teacher Specific Data (TSD)
Progress in meeting the goals of student individualized education plans
- Use the SLO process for determination
- Cannot count for more than 5% of TSD
- "[This element] is a measure of growth and student performance related to special education students meeting IEP goals. Any measure based upon progress made in meeting students' IEPs may be developed by the local LEA, if applicable to a particular classroom teacher, and shall be validated through a Student Learning Objective (SLO) process to compile a score for such measure." (from FAQ)
Teacher Effectiveness – Teacher Specific Data (TSD)
Progress in meeting the goals of student individualized education plans
"Teachers may use aggregated case load data of the percentage of students meeting IEP goals through documented progress monitoring. The supervising administrator should work in collaboration with the special education teacher to set the performance measures and indicators and should meet frequently with the special education teacher to review progress monitoring data (e.g., select a targeted subject area and grade level)."
"Per IDEA, it is expected that if a student's progress monitoring data indicates the student is not making progress, the IEP team must be reconvened to consider all data and make adjustments to the student's program. This IEP progress monitoring aggregate data for a targeted subject and targeted case load should only be used in circumstances that preclude the use of the general education performance measures." (from FAQ)
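The aggregated case-load figure the FAQ describes is a straightforward percentage over the targeted case load. A hedged sketch, where the function name and its input format are illustrative assumptions rather than any PDE-specified format:

```python
def percent_meeting_iep_goals(case_load):
    """case_load: one boolean per student in the targeted subject/grade
    case load, True if documented progress monitoring shows the student
    is meeting IEP goals. Returns the aggregate percentage."""
    if not case_load:
        raise ValueError("empty case load")
    return 100.0 * sum(case_load) / len(case_load)
```

For example, a targeted case load where 3 of 4 students are meeting their IEP goals yields 75.0, which would then feed into the SLO process described above.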
Teacher Effectiveness – Teacher Specific Data (TSD)
Locally developed school district rubrics
- Use the SLO process for determination
- For teachers with PVAAS: cannot count for more than 5% of TSD
- For teachers without PVAAS: cannot count for more than 15% of TSD
(from PA Bulletin)
Teacher Effectiveness – Teacher Specific Data (TSD)
Locally developed school district rubrics
"Because LDR is not defined in Act 82, pursuant to regulation, LEAs may choose to utilize a measure from the list of elective data measures as the LDR. It is recommended that a classroom teacher's evaluation which utilizes an elective data measure as the LDR also include an additional and separate elective data measure attributed to the 20% elective data measure."
"Although it is recommended that classroom teachers be given a separate LDR/elective measure and separate elective data measure, a LEA and classroom teacher may agree to use a single elective data measure chosen from the list of elective measures to comprise both the 15% teacher specific data score and the 20% elective score, which would account for 35% of an educator's evaluation if no other teacher specific data elements are available or applicable." (from FAQ)
15
Teacher Effectiveness – Teacher Specific Data (TSD) Locally developed school district rubrics “The LEA should consult with its solicitor regarding any possible agreement made between the LEA and classroom teachers to utilize a single elective/LDR measure as the combined teacher specific data and elective data measure in light of the requirements of Act 82 and current regulation.” (from FAQ)
Teacher Effectiveness – Teacher Specific Data (TSD)
Who decides the proportion and weights of TSD elements?
The LEA determines which teacher-specific data elements are utilized for a classroom teacher, based upon the availability of the data and applicability to the individual classroom teacher, in accordance with Act 82 and regulation. The LEA also determines the final weight allotted to each applicable teacher-specific data component, in accordance with regulation and as explained in the FAQs. (from FAQ)
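Once the LEA fixes the element weights, the TSD contribution is a weighted sum of the applicable 0-3 element ratings. A sketch under an example allocation consistent with the limits above (assessments at their 5% cap, PVAAS at its 10% minimum), assuming the element percentages are expressed as points of the overall evaluation within the 15% teacher-specific share; the weights and element names are illustrative assumptions, not prescribed values:

```python
def tsd_contribution(ratings, weights):
    """ratings: dict of element name -> 0-3 rating.
    weights: dict of element name -> fraction of the overall evaluation;
    must total the 15% teacher-specific share."""
    if abs(sum(weights.values()) - 0.15) > 1e-9:
        raise ValueError("TSD weights must total 15% of the evaluation")
    return sum(ratings[name] * w for name, w in weights.items())

# Example allocation: assessments capped at 5%, PVAAS at its 10% minimum.
score = tsd_contribution(
    {"assessments": 2.0, "pvaas": 3.0},
    {"assessments": 0.05, "pvaas": 0.10},
)
# 2.0 * 0.05 + 3.0 * 0.10 = 0.40 weighted points toward the overall rating
```

An LEA using a different mix of available elements (e.g., adding an IEP or LDR measure) would simply pass a different weights dict that still totals 0.15.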
Principal Effectiveness
Correlation Data
Formerly, this was the "Correlation to Teacher PVAAS" piece of Principal Effectiveness.
Correlation based on teacher-level measures includes "any combination of one or more of the following data for classroom teachers who are evaluated by the Principal/School Leader:
(i) Building level data
(ii) Teacher specific data
(iii) Elective data"
(from PA Bulletin)
Principal Effectiveness
The Correlation Data Performance Level Descriptors in Table H below are provided for the rater to use as a basis for developing a rating of 0, 1, 2, or 3 for the Correlation Rating (from PA Bulletin; see next slide). Discussions should take place between the supervising administrator and principal/school leader.

Correlation Rating (15%). The rating is based on three aspects:
- Degree of understanding of evidence presented regarding the relationship between teacher-level measures and teacher observation and practice ratings.
- Quality of explanation provided for observed relationships between teacher-level measures and teacher observation and practice ratings.
- Plans for how the data will be used to support school and LEA goals.

0 - Failing: The Principal/School Leader's responses demonstrate no understanding of the three aspects of correlation: Degree, Quality, and Planning. Does not disaggregate teacher observation/practice ratings and teacher-level measures. Cannot cite plausible causes for connections among teacher observation/practice ratings and teacher-level measures. Cannot articulate why plausible connections may have occurred among teacher observation/practice ratings and teacher-level measures. Does not identify elements for an effective plan for increasing student performance based upon the analysis of teacher observation/practice ratings and teacher-level measures.

1 - Needs Improvement: The Principal/School Leader's responses demonstrate limited understanding of the three aspects of correlation: Degree, Quality, and Planning. Attempts to disaggregate and/or analyze teacher observation/practice ratings and teacher-level measures. Attempts to cite plausible causes for the connections among teacher observation/practice ratings and teacher-level measures. Attempts to articulate why the plausible connections may have occurred among teacher observation/practice ratings and teacher-level measures. Attempts to identify elements for an effective plan for increasing student performance based upon the analysis of teacher observation/practice ratings and teacher-level measures.

2 - Proficient: The Principal/School Leader's responses demonstrate solid understanding of the three aspects of correlation: Degree, Quality, and Planning. Disaggregates and conducts an analysis of teacher observation/practice ratings and teacher-level measures. Cites plausible causes for the connections among teacher observation/practice ratings and teacher-level measures. Articulates why the plausible connections may have occurred among teacher observation/practice ratings and teacher-level measures. Identifies elements for an effective plan for increasing student performance based upon the analysis of teacher observation/practice ratings and teacher-level measures.

3 - Distinguished: The Principal/School Leader's responses demonstrate comprehensive understanding of the three aspects of correlation: Degree, Quality, and Planning. Disaggregates teacher observation/practice ratings and teacher-level measures, as well as conducts an analysis to determine plausible connections among the data. Cites plausible causes for the connections among teacher observation/practice ratings and teacher-level measures. Articulates why the plausible connections may have occurred among teacher observation/practice ratings and teacher-level measures. Establishes an effective plan for increasing student performance based upon the analysis of teacher observation/practice ratings and teacher-level measures. Incorporates the results from the correlational section of the Principal Rating Form into the other aspects of Principal Effectiveness (e.g., Elective Data - Principal SLOs).
Sample items of data to be considered (including but not limited to the following):
Examples of aggregate data:
- Average teacher ratings for the building
- Teacher-level measures: School Performance Profile (SPP); Teacher Specific Data / PVAAS; Elective Data / SLO
Examples of disaggregated data from teacher observation and practice ratings:
- Teacher observation and practice ratings, analyzed by rating category, department, grade level, years of teaching experience, and years of service in the building
- Other teacher observation and practice data
Examples of disaggregated data from SPP:
- PSSA scores
- Keystone scores
- Graduation rates
- Closing the achievement gap
- NOCTI performance by department
- Other measures of student performance
Principal Effectiveness
SLOs for Principals:
Principal Effectiveness Training
2 components to training:
- Framework for Leadership
- Data elements (Correlation and SLOs)
DCIU will be hosting upcoming PE training sessions in the fall. If you have completed FFL training already, there is no requirement to attend again.
PIL hours: 30 PIL hours will be available for each training. Awaiting the job-embedded assignment from PDE.
23
NTP Effectiveness
Who are NTPs?
1. CSPG-defined Specialists:
- CSPG – 75: Dental Hygienist
- CSPG – 76: Elementary and Secondary School Counselor
- CSPG – 77: Home and School Visitor
- CSPG – 78: Instructional Technology Specialist
- CSPG – 80: School Nurse
- CSPG – 81: School Psychologist
2. Supervisors (not Principals):
- CSPG – 88: Supervisor of Curriculum and Instruction
- CSPG – 89: Supervisor of Pupil Services
- CSPG – 90: Supervisor of a Single Area (Subject)
- CSPG – 91: Supervisor of Special Education
- CSPG – 92: Supervisor of Vocational Education
NTP Effectiveness
Who are NTPs? (continued)
3. Instructionally Certified but provide No Direct Instruction (ICNDI)
Remember the 2-prong test. To determine whether you are a teaching professional, you must be able to answer yes to both of the following questions:
1) Are you working under your instructional certification?
2) Do you provide direct instruction* to students in a particular subject or grade level?
*Direct instruction is defined as planning and providing the instruction, and assessing the effectiveness of the instruction.
See "Teaching Professionals with Unique Roles and Functions" (from EE Admin Manual).
NTP Effectiveness
Rubrics
- Rubrics for CSPG-defined specialists are finalized.
- Supervisors (not Principals) are included as NTPs with Guiding Questions and Examples, but no rubrics.
- Staff members who are Instructionally Certified but provide No Direct Instruction (ICNDI) would be evaluated based on their corresponding rubric, but note the disparity between the ICNDI FFT and NTP form 83-3: the rubric for ICNDI may utilize domains different from those found on the NTP summative evaluation form 83-3.
Training
- The process is the same as for teachers; training is not required if you have already completed it.
- DCIU will be offering NTP Effectiveness training in 2 parts: the Danielson process, and rubric review.