1
March 28, 2011
2
What does the new law require?
3
20% State student growth data (increases to 25% upon implementation of the value-added growth model)
20% Locally selected (and agreed upon) measures (decreases to 15%)
60% Multiple measures based on standards (TBD)
4
Being referred to as HEDI (pronounced Heidi)
Highly effective (possibly >90)
Effective (possibly 80-90)
Developing (possibly 65-79)
Ineffective (possibly 0-64)
5
A single composite score of teacher (or principal) effectiveness
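As a rough illustration of how these pieces could combine, here is a minimal Python sketch that rolls the three weighted components into a single composite score and maps it to a HEDI band. The weights and cutoffs simply mirror the "possibly" figures above; they are assumptions, not adopted regulation.

```python
# Hypothetical sketch: combine the three APPR components into a composite
# score and map it to a HEDI rating. Weights and cutoffs mirror the
# "possibly" figures from the earlier slides and are NOT final regulation.

def composite_score(state_growth, local_measures, other_measures,
                    weights=(0.20, 0.20, 0.60)):
    """Each component is assumed to already be on a 0-100 scale."""
    w_state, w_local, w_other = weights
    return (w_state * state_growth +
            w_local * local_measures +
            w_other * other_measures)

def hedi_rating(score):
    """Map a 0-100 composite to a HEDI band (cutoffs are illustrative)."""
    if score > 90:
        return "Highly effective"
    elif score >= 80:
        return "Effective"
    elif score >= 65:
        return "Developing"
    else:
        return "Ineffective"

# Example: 85 on state growth, 78 on local measures, 82 on the 60% measures
score = composite_score(85, 78, 82)
print(score, hedi_rating(score))   # roughly 81.8, "Effective"
```

Once the value-added growth model is implemented, the weights would presumably shift toward (0.25, 0.15, 0.60), per the percentages on the earlier slide.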
6
Training for all evaluators (through Network Teams – after first week of August)
Use of improvement plans for developing and ineffective ratings
Utilize in other decisions (merit, etc.)
Locally-developed appeals process
Expedited 3020-a process after two ineffective ratings
7
All agreements after July 1, 2010
For agreements prior to July 1, 2010, it depends on specific language in the agreement
4-8 math and ELA (and principals): July 2011
Everyone else: July 2012
Implementation of the value-added growth model (20% increases to 25%): 2012-2013
9
Board of Regents Agenda
10
January: 60% discussion
February: Local 20% discussion
March: Value added 20% discussion and ratings/scores
April: Regents Task Force recommendations (4th)
May: Draft Regulations
June: Emergency Adoption of Regulations
11
20% increasing to 25%
12
Value Added/Growth model: Annual achievement is more about the students than the teacher.
Teacher A: 680 (2015)
Teacher B: 670 (2015)
13
Value Added/Growth model: Adding average prior achievement for the same students shows growth.
Teacher A: 660 (2014) → 680 (2015) = +20 growth
Teacher B: 645 (2014) → 670 (2015) = +25 growth
15
Value Added/Growth model: But what growth should students have shown? What growth did similar students obtain? What is the difference between the expected growth and the actual growth?
16
Value Added/Growth model: Comparing growth to the average growth of similar students is the value-added.
Teacher A: 660 (2014) → 680 (2015) = +20 growth; similar students averaged 665 in 2015 (+5 expected growth), so +15 value added
Teacher B: 645 (2014) → 670 (2015) = +25 growth; similar students averaged 665 in 2015 (+20 expected growth), so +5 value added
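A minimal Python sketch of the arithmetic in this example, using the slide's illustrative numbers (the actual model involves far more statistical adjustment, as the next slide notes):

```python
# Value-added sketch using the slide's illustrative numbers:
# value added = actual growth - growth of the average similar student
def value_added(prior, current, similar_avg_current):
    actual_growth = current - prior                  # this teacher's students
    expected_growth = similar_avg_current - prior    # similar students
    return actual_growth - expected_growth

print(value_added(660, 680, 665))   # Teacher A: +15
print(value_added(645, 670, 665))   # Teacher B: +5
```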
18
Calculating similar student growth:
Lots of statistical analysis (a rough sketch follows below)
Student characteristics such as academic history, poverty, special ed. status, ELL status, etc.
Classroom or school characteristics such as class percentages of needs, class size, etc.
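One common way such an analysis is set up (in general, not necessarily the model New York will adopt) is a regression that predicts each student's current score from prior achievement and background characteristics; a teacher's value-added is then roughly the average gap between actual and predicted scores for her students. The sketch below uses scikit-learn with made-up data and column names, purely to illustrate the idea.

```python
# Illustrative only: predict each student's expected 2015 score from prior
# achievement and background characteristics, then average the gaps
# (actual - expected) by teacher. Data and columns are hypothetical.
import pandas as pd
from sklearn.linear_model import LinearRegression

df = pd.DataFrame({
    "score_2014": [660, 655, 665, 645, 650, 640],
    "poverty":    [0, 1, 0, 1, 1, 0],
    "ell":        [0, 0, 1, 0, 1, 0],
    "swd":        [0, 0, 0, 1, 0, 0],   # students with disabilities
    "score_2015": [682, 671, 684, 668, 673, 662],
    "teacher":    ["A", "A", "A", "B", "B", "B"],
})

X = df[["score_2014", "poverty", "ell", "swd"]]
model = LinearRegression().fit(X, df["score_2015"])

df["expected_2015"] = model.predict(X)
df["gap"] = df["score_2015"] - df["expected_2015"]

# Average gap per teacher ~ a crude value-added estimate
print(df.groupby("teacher")["gap"].mean())
```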
19
Data collection and policy options
Linking students, teachers, and courses
Who is the teacher of record?
▪ Scenario 1: Same Teacher the Entire Year
▪ Scenario 2: Team Teaching
▪ Scenario 3: Teacher for Part of the Year
▪ Scenario 4: Student for Part of the Year
▪ Scenario 5: Student Supplemental Instruction
▪ Additional Scenarios???
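One way a district might operationalize these scenarios (purely a hypothetical sketch; the State had not yet settled the teacher-of-record rules) is to store each student-teacher-course link with the share of instructional time it covers, so growth can later be apportioned across co-teachers or mid-year changes:

```python
# Hypothetical student-teacher-course linkage records. Each link carries
# the fraction of the course's instructional time the pairing covers.
from dataclasses import dataclass

@dataclass
class Link:
    student_id: str
    teacher_id: str
    course_id: str
    share: float   # fraction of instructional time (0.0 - 1.0)

links = [
    Link("S1", "T1", "MATH7", 1.0),   # Scenario 1: same teacher all year
    Link("S2", "T1", "MATH7", 0.5),   # Scenario 2: team teaching
    Link("S2", "T2", "MATH7", 0.5),
    Link("S3", "T1", "MATH7", 0.4),   # Scenario 3: teacher for part of the year
    Link("S3", "T3", "MATH7", 0.6),
]

# Total weight attributed to each teacher for this course
weights = {}
for link in links:
    weights[link.teacher_id] = weights.get(link.teacher_id, 0.0) + link.share
print(weights)   # {'T1': 1.9, 'T2': 0.5, 'T3': 0.6}
```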
20
Non-tested areas
21
Teachers of classes with only one state test administration
K-12 educators
High school (no test) educators
Middle and elementary (no test) educators
Performance courses
Others
22
Use existing assessments in other content areas to create a baseline for science tests and Regents examinations
Use commercially available tests to create a baseline and measure growth
23
Add more state tests, such as:
Science 6-8
Social studies 6-8
ELA 9-11 (2011-2012)
PARCC ELA 3-11 (2014-2015)
PARCC math 3-11 (2014-2015)
24
Add more state tests, according to December 2009 Regents Item; discussed and approved prior to inclusion in SED’s plans: ELA 9-11 (2011-2012)
25
Add more state tests, subject to funding availability and approval, such as:
Science 6-7
Social studies 6-8
26
The growth model also can be used for school accountability measures
Collaborate with state-wide professional associations or a multi-state coalition
Empower local-level resources to create and carry out a solution that meets state requirements
27
Use a group metric that measures the school's (or grade's) overall impact.
In other states where this is implemented, it tends to be tied to performance bonuses.
28
20% decreasing to 15%
29
Objectives include:
Provide a broader picture of student achievement by assessing more
Provide a broader picture by assessing differently
Verify performance of state measures
30
Reality check:
Balance state/regional/BOCES consistency while accounting for local context
School-based choice might appeal to teachers
Districts must be able to defend their decisions about the tests
31
Considerations include:
Rigor
Validity and reliability
Growth or achievement measures (may be either)
Cost
Feasibility
32
Options under consideration:
Districts choose or develop assessments for courses/grades
Commercially available products
Group metric of school or grade performance
Other options that meet the criteria (previous slide)
33
Multiple measures
34
Begins with the teaching standards:
1. Knowledge of Students and Student Learning
2. Knowledge of Content and Instructional Planning
3. Instructional Practice
4. Learning Environment
5. Assessment for Student Learning
6. Professional Responsibilities and Collaboration
7. Professional Growth
35
Begins with the teaching standards:
Some things are observable
Some are not observable, thus requiring some other form of documentation or artifact collection
36
Teacher practice rubrics:
Describe differences in the four performance levels
Articulate specific, observable differences in student and teacher behavior
Not known whether there will be a single rubric, menu to choose from, or total local option
38
Other items that might be included:
Teacher attendance
Goal setting
Student surveys
Portfolios/Evidence binders
Other observers
39
Board of Regents Agenda
40
January: 60% discussion
February: Local 20% discussion
March: Value added 20% discussion and ratings/scores
April: Regents Task Force recommendations
May: Draft Regulations
June: Emergency Adoption of Regulations
41
August: NT Training (includes evaluator training)
September: NT turns training over to local evaluators
Implementation for covered teachers
42
Tentative dates set (with multiple options):
August 15, Rodax 8 Large Conference Room
August 22, McEvoy Conference Center
August 29, Rodax 8 Large Conference Room
Ongoing training during year (TBD)
43
Tentative dates set (with multiple options):
August 19, Rodax 8 Small Conference Room
August 26, McEvoy Conference Center
Ongoing training during year (TBD)
44
Regional/BOCES collaboration:
Share data
Share APPR Plans
Build common understanding
Work on parts under local jurisdiction
Avoid duplication of work
Have a common voice
45
APPR sub-site: APPR button under “for school districts” at ocmboces.org or leadership.ocmboces.org
User name: lrldocs
Password: CBA1011
47
Regional/BOCES collaboration:
Development of local 20% protocol
Achievement in non-tested areas
Qualities of effective improvement plans, and examples
Appeals process
Frameworks/models
Summative evaluation (examples, best practices, share practices)
Principal Evaluation (added back)
48
Share results of this afternoon’s work
Gather again on __________
Updates
Continue collaboration