... and what it means for teachers of non-tested subjects
Johanna J. Siebert, Ph.D.
NAfME Symposium on Assessment, June 24-25, 2012
What do we want our students to know and be able to do? How will we know?
Ensuring high-quality instruction in every classroom
Annual Professional Performance Review (APPR)
Inspired by Race to the Top legislation; the new APPR is a condition of the award
Some portions of the evaluation process are negotiated between the district and its teachers' union; some portions are state-mandated
The evaluation process results in a teacher “HEDI” score:
◦ Highly Effective
◦ Effective
◦ Developing
◦ Ineffective
Can lead to an expedited 3020-a process for teacher termination
60 Points: Teacher Practice
◦ NYS, national, and/or district teaching standards
◦ Multiple supervision models, including performance rubrics
◦ Observations, surveys, evidence of student learning
20 Points: Student Growth
◦ Growth on state assessments (state-provided score for grades 4-8 ELA and Math), OR
◦ Growth using comparable measures: Student Learning Objectives (SLOs)
20 Points: Local Assessment
◦ Student achievement: locally determined measures across grade levels, teams, and buildings
◦ Can use third-party, state-approved assessments (which can measure growth) or district- or BOCES-developed assessments (rigorous, comparable)
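The three components above combine into a single 100-point composite. A minimal sketch of that arithmetic (the function name and the example point values are illustrative, not taken from the regulation):

```python
def appr_composite(practice_60: float, growth_20: float, local_20: float) -> float:
    """Sum the three APPR components into a 100-point composite score.

    practice_60 -- teacher practice score, 0-60 points
    growth_20   -- student growth score, 0-20 points
    local_20    -- local assessment score, 0-20 points
    """
    assert 0 <= practice_60 <= 60, "teacher practice is capped at 60 points"
    assert 0 <= growth_20 <= 20, "student growth is capped at 20 points"
    assert 0 <= local_20 <= 20, "local assessment is capped at 20 points"
    return practice_60 + growth_20 + local_20

# Hypothetical teacher scoring well on all three components.
print(appr_composite(52, 16, 15))  # 83
```

The point here is only that the three measures are additive and independently capped; how a composite maps to a HEDI rating is negotiated separately.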
NY – districts can make individual decisions regarding:
◦ The specific supervision model to be used
◦ Priorities and academic need
◦ Which subjects/teachers will use state-provided ELA/Math scores and which will have SLOs
◦ In-house processes for SLO assessing, scoring, and implementation
Other states: similar conditions, but the entire state interprets them uniformly
The First 60%
Select a teacher practice rubric from the state-approved list, or apply for a variance:
◦ Danielson’s Framework for Teaching
◦ Marzano’s Causal Teacher Evaluation Model
◦ NYSTCE Framework for the Observation of Effective Teaching
◦ NYSUT Teacher Practice Rubric
Collective bargaining considerations apply
Agree on the definition of “classroom observation” and any additional measures in the 60-point category (40 points must come from multiple observations). Choose one or more of the following other measures of teacher practice:
◦ A portfolio or evidence binder (student work or teacher artifacts)
◦ Feedback from students, parents, and/or other teachers, gathered with a survey
◦ Professional growth goals using self-reflection (maximum of 5 points)
One “observation” = 2 learning walks (15-minute informal walk-throughs with follow-up conversations) OR one formal, class-length observation. Multiple observations (2) are needed, so the combination could be:
◦ 2 class-length observations
◦ 1 class-length observation + 2 learning walks
◦ 4 learning walks
◦ A portfolio or evidence binder (student work and/or teacher artifacts)
◦ Professional growth goals using self-reflection (Professional Learning Plan, PLP)
WCSD selects 9 components from the 4 Domains; teachers select an additional 5 components
“The governing body of each school district and BOCES is responsible for ensuring that evaluators have appropriate training—including training on the application and use of the rubrics—before conducting an evaluation. The governing body is also responsible for certifying a lead evaluator as qualified before that lead evaluator conducts or completes a teacher’s or principal’s evaluation.”
NYS Commissioner’s Regulations
Evaluators are responsible for carrying out observations and summative evaluations, and must be trained and calibrated across each school district in the selected model:
◦ Knowledge of the model
◦ Walk-through and observation protocols
◦ Evidence-based reports
◦ Use and knowledge of the specific rubrics
◦ Forms and feedback for teachers
◦ Professional Learning Plans
Districts design a plan for:
◦ Training for all evaluators
◦ Certification for lead evaluators
◦ Role clarification
◦ Subcomponent and overall scoring
◦ Improvement plans
◦ Knowledge of appeals procedures (i.e., the NYSED model appeals procedure in guidance)
Develop a Professional Learning Plan (PLP) with attention to multiple professional areas (the 4 Domains):
◦ Preparation
◦ Classroom Environment
◦ Instruction
◦ Reflection and Professional Responsibilities
Student-centered aspects:
◦ Individual SLOs / student growth
◦ Student achievement
◦ Data-driven instruction
Select an observation protocol:
◦ Traditional observation (class length)
◦ Walk-through / learning walk
More frequent interactions between the teacher and the supervising administrator:
◦ Mid-October: describe and set the PLP and content-area SLO(s)
◦ November-December: observation and follow-up
◦ January: midterm check-in on PLP and SLO progress
◦ February-March: observation and follow-up
◦ May-June: summative evaluation conference
Districts are choosing specific models and scheduling and implementing administrator training. Administrators and teachers are at various stages:
◦ Learning new protocols
◦ Scheduling workshops
◦ Goal-setting
◦ Setting district calendars
Implementation begins in September 2012.
Student Learning Objectives
A student learning objective:
◦ Is an academic goal for a teacher’s students that is set at the start of a course
◦ Represents the most important learning for the year (or semester, where applicable)
◦ Is specific and measurable, based upon available prior student learning data
◦ Is aligned with the Common Core AND state or national standards, as well as any other school and district priorities
◦ Represents growth from the beginning to the end of the course
Teachers’ scores are based upon the degree to which their goals are attained.
◦ Common assessments are needed for individual growth across grade levels and content areas
◦ The 50% rule is applied to the teacher’s total student load
◦ The teacher sets individual growth targets per student
◦ Cross-scoring of summative assessments is needed to ensure equity in HEDI scoring (inter-rater reliability)
SLOs apply to any teacher who does not use a state growth measure (the ELA/Math assessments, grades 4-8): the “non-tested” subjects (70%)
◦ SLOs must cover 50% or more of the teacher’s student load
◦ Full-credit courses carry more weight than part-credit or semester courses
◦ A teacher will likely have multiple SLOs
◦ The teacher tracks and monitors the progress of each student in SLO classes to impact growth
For growth, start with EVIDENCE:
◦ The teacher sets each student’s baseline using historical data (e.g., the prior year’s grades) and pre-assessment performance
◦ The teacher predicts individual student growth in his or her course and sets individual growth targets for students
◦ A post-assessment is given at the end of the course (this can be a state assessment)
◦ Data analysis yields the students’ success rate, and the teacher’s score on this section
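The growth steps above boil down to a small calculation: count how many students scored at or above their individual targets on the post-assessment. The student records and the met-target rule below are illustrative assumptions, not prescribed by the state:

```python
# Each record: (baseline, growth target, post-assessment score).
# All values are hypothetical examples.
students = [
    (55, 70, 74),   # met target
    (60, 72, 71),   # missed by one point
    (40, 60, 65),   # met target
    (80, 88, 90),   # met target
]

# Assumed rule: a student "meets" the target when the post-assessment
# score is at or above the individual target set by the teacher.
met = sum(1 for baseline, target, post in students if post >= target)
success_rate = 100.0 * met / len(students)
print(f"{met} of {len(students)} students met their targets ({success_rate:.0f}%)")
```

Running this prints a 75% success rate for the sample data; that percentage is what feeds the teacher’s score on this section.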
There are NO state assessments in the arts and NO common opportunity-to-learn standards. Regional BOCES are sponsoring writing sessions to design SLOs and assessments in the arts; local districts design and implement their own.
... are using ELA and/or Math state test scores IN PLACE OF assessment data in non-tested subjects (the district-based SLO model), due to:
◦ Lack of common assessments
◦ Lack of inter-rater reliability
◦ Lack of content oversight by content specialists
◦ Lack of an effective data system for monitoring and tabulating results
◦ An ambitious timeline for implementation
◦ NYS Learning Objective per grade selected
◦ Specific population/grade level
◦ Learning content
◦ Interval of instructional time (usually the full year)
◦ Evidence to be used/collected (three forms):
◦ Historical
◦ Pre-assessment
◦ Post-assessment
◦ Individual students’ baselines
◦ Individual student targets (set by the teacher)
◦ Teacher goal set
◦ Teacher scoring range, by HEDI ratings
◦ Rationale for the SLO and targets
Eventually:
◦ Final individual student growth score
◦ % of students meeting their individual targets
◦ Student % aligned with a specific scoring band for the HEDI rating
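The last step above maps the percentage of students meeting their targets onto a HEDI scoring band. The cut points in this sketch are invented for illustration; the actual bands are set in each district's negotiated plan:

```python
# Illustrative scoring bands: (minimum % of students meeting targets, rating).
# These cut points are hypothetical; districts negotiate the real bands.
HEDI_BANDS = [
    (90, "Highly Effective"),
    (75, "Effective"),
    (60, "Developing"),
    (0,  "Ineffective"),
]

def hedi_rating(pct_meeting_targets: float) -> str:
    """Return the HEDI rating for a given % of students meeting targets."""
    for cutoff, rating in HEDI_BANDS:
        if pct_meeting_targets >= cutoff:
            return rating
    return "Ineffective"

print(hedi_rating(82))  # Effective
```

Because the bands are checked from highest cutoff down, each percentage falls into exactly one rating; only the cut points themselves are negotiable.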
Local Achievement Assessment
◦ Must be common across districts for grade levels and content areas
◦ Should represent a summative measure of the course
◦ Not to be applied to the SLO course(s)
◦ The teacher sets a target for students
◦ Can NOT be scored by the teacher of record
... between SLO and achievement measures?
◦ The SLO involves setting a target for students based upon previous performance data, i.e., measuring students’ growth; it applies to 50% of the student load
◦ The achievement measure does not measure “growth” over the length of the course, but the teacher still needs to set a group target; it applies to one other course
◦ Addition of a value-added growth model
◦ Inclusion of other data in targeting growth:
◦ Demographic
◦ Graduation
◦ Attendance
Planned for the 2013-2014 school year
Plans to release individual teacher evaluation ratings (HEDI: Highly Effective, Effective, Developing, Ineffective) to the public.
“Teacher evaluations can be viewed as the equivalent of a Carfax report, empowering parents to attempt to avoid the ‘lemons.’”
B. Jason Brooks, Foundation for Education Reform and Accountability
June:
◦ Determine next year’s SLO courses and populations
◦ Design pre- and summative assessments
July and August:
◦ Summer workshops and planning with like-SLO teachers
◦ Calibrate scorers
◦ Design post-assessments and local common measures
◦ Administrators’ training in APPR forms and protocols
September:
◦ Meet students; get historical achievement data
◦ Administer and grade pre-assessments
◦ Set goal targets for students and self
◦ Meet with administration to review goals, etc.
October:
◦ Set SLOs and student targets
◦ Start applying strategies to gain student growth!