www.engageNY.org | New York State Growth Model for Educator Evaluation 2011–12 | July 2012 | PRESENTATION as of 7/9/12

Presentation transcript:

1 New York State Growth Model for Educator Evaluation 2011–12 July 2012 PRESENTATION as of 7/9/12

2 Today’s Agenda
– Background
– The What, Why, and How of Growth Models and Measures
– Using Growth Measures for Educator Evaluation
– What Data Will Be Available and When?

3 Background

4 Evaluating Educator Effectiveness
– Growth (20%): student growth on state assessments (state-provided) or student learning objectives
– Locally Selected Measures (20%): student growth or achievement; options selected through collective bargaining
– Other Measures (60%): rubrics; sources of evidence: observations, visits, surveys, etc.

5 The What, Why, and How of Growth Models and Measures

6 By the End of This Section…
You should be able to:
– Explain why the state is measuring student growth and not achievement
– Describe how the state is measuring growth compared to similar students
– Define a student growth percentile and mean growth percentile

7 Prior Year Performance for Students in Two Teachers’ Classrooms ─ Proficiency

8 Current Year Performance of Same Students ─ Proficiency

9 Prior and Current Year Performance for Ms. Smith’s Students
Ms. Smith’s Class | Prior Score | Current Score
Student A | 450 |
Student B | |
Student C | |
Student D | |
Student E | 600 | 650

Student A’s Current Year Performance Compared to “Similar” Students
[Chart: Student A’s ELA scale score (prior score 450) compared with the current-year scores of similar students, ranging from low SGPs to high SGPs]
If we compare student A’s current score to other students who had the same prior score (450), we can measure her growth relative to other students. We describe her growth as a “student growth percentile” (SGP). Student A’s SGP is the result of a statistical model and in this example is 45, meaning she performed better in the current year than 45% of similar students.
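To make the percentile idea concrete, here is a minimal sketch that simply ranks a student against peers who had the same prior score. It is an illustration only; the actual state SGP comes from a statistical model that also handles measurement error and, as described later, student characteristics.

```python
# Illustrative only: rank one student's current score against students with the
# same prior score. The actual NYSED model is a statistical model, not a simple rank.
def student_growth_percentile(current_score, similar_students_scores):
    """Percent of 'similar' students (same prior score) whom this student outperformed."""
    if not similar_students_scores:
        raise ValueError("Need at least one similar student to compute an SGP.")
    outperformed = sum(1 for s in similar_students_scores if s < current_score)
    return round(100 * outperformed / len(similar_students_scores))
```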

Comparing Performance of “Similar” Students
[Chart: current year score plotted against prior year score]
Given any prior score, we see a range of current year scores, which give us SGPs of 1 to 99.

SGPs for Ms. Smith’s Students
Ms. Smith’s Class | Prior Score | Current Score | SGP
Student A | 450 | | 45
Student B | | | 40
Student C | | | 70
Student D | | | 60
Student E | 600 | 650 | 40

Student Growth Percentiles: True or False?
1. A student with an SGP of 50 performed better than 50% of similar students.
2. A student with an SGP of 80 must be proficient.
3. A student with an SGP of 20 grew less than a student with an SGP of ___.
4. The highest SGP that a student can receive is ___.
5. A student with an SGP of 80 grew twice as much as a student with an SGP of 40.

From Student Growth to Teachers and Principals
Ms. Smith’s Class | SGP
Student A | 45
Student B | 40
Student C | 70
Student D | 60
Student E | 40
To measure teacher performance, we find the mean growth percentile (MGP) for his or her students. To find an educator’s mean growth percentile, take the average of the SGPs in the classroom. In this case:
Step 1: 45 + 40 + 70 + 60 + 40 = 255
Step 2: 255 / 5 = 51
Ms. Smith’s mean growth percentile (MGP) is 51, meaning on average her students performed better than 51% of similar students. A principal’s performance is measured by finding the mean growth percentile for all students in the school.
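The same arithmetic as the two steps above, as a tiny sketch (illustrative, not NYSED's implementation):

```python
# Minimal illustration of the MGP arithmetic above (not NYSED's implementation).
def mean_growth_percentile(sgps):
    """Average a list of student growth percentiles into a mean growth percentile."""
    if not sgps:
        raise ValueError("An educator needs at least one student SGP to compute an MGP.")
    return sum(sgps) / len(sgps)

# Ms. Smith's class from the slide above:
ms_smith_sgps = [45, 40, 70, 60, 40]
print(mean_growth_percentile(ms_smith_sgps))  # 51.0
```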

Which Students Count in a Teacher’s or Principal’s MGP for 2011–12?
1. Does the student have valid test scores for at least 2011–12 and 2010–11?
   – No: the student’s scores do not count for 2011–12.
   – Yes: go to step 2.
2. Does the student meet the continuous enrollment standard for 2011–12?
   – No: the student’s scores do not count for 2011–12.
   – Yes: the student’s growth is attributed to the teacher and the school.
Expected for 2012–13: students weighted by duration of instructional linkage.
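For illustration, the attribution rule above can be read as a simple check (the field names here are hypothetical, not NYSED data elements):

```python
# Hypothetical field names; illustrates the 2011-12 attribution rule sketched above.
def counts_toward_mgp(has_2011_12_score, has_2010_11_score, continuously_enrolled):
    """True if the student's growth is attributed to the teacher and school for 2011-12."""
    if not (has_2011_12_score and has_2010_11_score):
        return False  # needs valid current- and prior-year test scores
    return continuously_enrolled  # must also meet the continuous enrollment standard
```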

From Student Growth to Teachers and Principals
– In order for an educator to receive a growth score, he or she must have a minimum sample size of 16 student scores in ELA or mathematics across all grades taught. Examples:
  – A teacher has a self-contained classroom with 8 students who take the 4th grade ELA and math assessments; this teacher would then have 16 student scores contributing to his or her growth score.
  – A teacher has a class with 12 students in varied grades (4th, 5th, 6th) who take the ELA and math assessments for their respective enrolled grade level; this teacher would then have 24 student scores contributing to his or her growth score.
– If an educator does not have 16 student scores, he or she will not receive a growth score from the state and will not receive information in the reporting system.
  – Educators likely to have fewer than 16 scores should use student learning objectives (SLOs).

MGPs and Statistical Confidence
[Diagram: an example MGP shown with the upper and lower limits of its confidence range]
NYSED will provide a 95% confidence range, meaning we can be 95% confident that an educator’s “true” MGP lies within that range. Upper and lower limits of MGPs will also be provided. An educator’s confidence range depends on a number of factors, including the number of student scores in their MGP and the variability of student performance in the classroom.
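As a rough illustration of how the confidence range relates to those factors, a simplified form (an assumption for exposition; the state's actual standard errors come out of its statistical model) is:

\[
\text{MGP} \pm 1.96 \times SE_{\text{MGP}}, \qquad SE_{\text{MGP}} \approx \frac{s_{\text{SGP}}}{\sqrt{n}}
\]

where n is the number of student scores in the MGP and s_SGP is the spread (standard deviation) of those students' SGPs: more student scores and less variability both narrow the range.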

Pause and Reflect: Mean Growth Percentiles  We talked about: –How to find a mean growth percentile (MGP) –How to interpret an MGP –What students are counted in an MGP –How many student scores are needed to provide an MGP –How a measure of statistical confidence (upper and lower limits of a 95% confidence range) will be provided with MGPs and why

Expanding the Definition of “Similar” Students
– So far we have been talking about “similar” students as those with the same prior year assessment score.
– We will now add two additional features to the conversation:
  – Two additional years of prior assessment scores (remember: a student MUST have current-year and prior-year assessment scores to be included)
  – Student-level factors: economic disadvantage (ED), students with disabilities (SWDs), English language learners (ELLs)

Adjustments for Three Student-Level Factors in Measuring Student Growth
[Diagram labels: student performance; teacher instruction; other factors (12–13); economic disadvantage; language proficiency; disability]

Student A’s Current Year Performance Compared to “Similar” Students
[Chart: Student A’s ELA scale score (prior score 450) compared with the current-year scores of similar students, ranging from low SGPs to high SGPs]
If we compare student A’s current score to other students who had the same prior score (450), we can measure his or her growth relative to other students. We describe that growth as a student growth percentile (SGP). Student A’s SGP is the result of a statistical model and in this example is 45, meaning student A performed better in the current year than 45% of similar students.

Expanding the Definition of “Similar” Students to Include Economically Disadvantaged—An Example
[Chart: Student A’s ELA scale score (prior score 450) compared with the current-year scores of similar, economically disadvantaged students, ranging from low SGPs to high SGPs]
Now if student A is economically disadvantaged, we compare student A’s current score to other students who had the same prior score (450) AND who are also economically disadvantaged. In this new comparison group, we see that student A now has an SGP of 48.
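Extending the earlier illustrative sketch (again a simplification, not the state's statistical model; the field names are hypothetical), restricting the comparison group to students with the same economic disadvantage status might look like:

```python
# Illustrative only: "similar" now means same prior score AND same economic-disadvantage status.
def adjusted_sgp(student, all_students):
    """Percent of peers with the same prior score and ED status whom this student outperformed."""
    peers = [s for s in all_students
             if s["prior_score"] == student["prior_score"]
             and s["econ_disadvantaged"] == student["econ_disadvantaged"]
             and s is not student]
    if not peers:
        raise ValueError("No comparison group available.")
    outperformed = sum(1 for s in peers if s["current_score"] < student["current_score"])
    return round(100 * outperformed / len(peers))
```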

Further Information on Including Student Characteristics in the Growth Model  The following slides were developed using sample data from 2010–2011. –The “combined” MGPs on the charts have been calculated at the educator level (combining all grades and subjects). –Not all districts provided data linked to teachers for grades 4–8 ELA/Math in 2010–11.

Teacher MGPs after Accounting for Economic Disadvantage Taking student-level characteristics into account helps ensure educators with many students with those characteristics have a fair chance to achieve high or low MGPs. For example, note that for teachers with any percent of economically disadvantaged students, teacher MGPs range from 1 to 99. NOTE: Beta results using available 2010–11 data.

Teacher MGPs after Accounting for SWD NOTE: Beta results using available 2010–2011 data.

Teacher MGPs after Accounting for ELL
[Chart: teacher MGPs by percent of ELL students in class]
NOTE: Beta results using available 2010–2011 data.

“Similar” Students: A Summary
“Similar” Student Characteristics | Unadjusted Mean Growth Percentiles | Adjusted Mean Growth Percentiles
Up to three years of prior achievement | ✓ | ✓
English language learner (ELL) status | | ✓
Students with disabilities (SWD) status | | ✓
Economic disadvantage | | ✓
Reported to educators | ✓ | ✓
Used for evaluation | | ✓

One Last Feature of the Growth Model…
– All tests contain measurement error, with greater uncertainty for the highest and lowest achieving students.
– The New York growth model accounts for measurement error in computing student growth percentiles.

State Growth Model Summary
Regulations allow | Model includes
Prior years of student test results | Up to three years, as available
Three student-level variables: SWD, ELL, economic disadvantage | All three
Measurement error correction | Measurement error correction
The growth model for 2011–12 is only for grades 4–8 ELA/math, for teachers and principals.

By the End of This Section….  You should be able to: –Explain why the state is measuring student growth and not achievement –Describe how the state is measuring growth compared to similar students –Define a student growth percentile and mean growth percentile

31 Using Growth Measures for Educator Evaluation

By the End of This Section….  You should be able to: –Explain how growth ratings and scores will be obtained, using illustrative data

Growth Ratings and Score Ranges
Growth Rating | Description | Growth Score Range (2011–12)
Highly Effective | Well above state average for similar students | 18–20
Effective | Results meet state average for similar students | 9–17
Developing | Below state average for similar students | 3–8
Ineffective | Well below state average for similar students | 0–2
The growth scores and ratings are based on an educator’s combined MGP.
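For reference, a small lookup that mirrors the table above; note that how NYSED assigns the specific 0–20 points within each band from an educator's MGP is not shown here.

```python
# Maps a 2011-12 growth score (0-20) to its HEDI growth rating, per the table above.
def growth_rating(growth_score):
    if not 0 <= growth_score <= 20:
        raise ValueError("Growth scores range from 0 to 20.")
    if growth_score >= 18:
        return "Highly Effective"
    if growth_score >= 9:
        return "Effective"
    if growth_score >= 3:
        return "Developing"
    return "Ineffective"
```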

Distribution of 2010–11 Teacher-Level MGPs
[Chart: distribution of mean student growth percentiles (teacher level); axes: MGP and number/percent of teacher MGPs]
NOTE: Beta results using available 2010–2011 data. For illustrative purposes only.

MGPs and Statistical Confidence
[Diagram: an example MGP shown with the upper and lower limits of its confidence range]
NYSED will provide a 95% confidence range, meaning we can be 95% confident that an educator’s “true” MGP lies within that range. Upper and lower limits of MGPs will also be provided. An educator’s confidence range depends on a number of factors, including the number of student scores included in his or her MGP and the variability of student performance in the classroom.

HEDI Classification Approach for Teachers (using 2010–11 sample data)
– Effective requires MGPs within 1 standard deviation of the average MGP of 51.
  – MGPs between 40 and 61 will earn Effective ratings.
– Well Above Average (Highly Effective) requires:
  – MGP of 62 or higher
  – AND confidence range entirely above 51. (If not, rating is Effective.)
– Well Below Average (Ineffective) requires:
  – MGP of 39 or lower
  – AND confidence range entirely below 51. (If not, rating is Developing.)

From MGPs to Growth Ratings: Teachers
Mean Growth Percentile | Confidence Range | Growth Rating
≥ 62 | Lower limit > 51: Yes | Highly Effective: Results are well above state average for similar students
≥ 62 | Lower limit > 51: No | Effective: Results equal state average for similar students
40–61 | Any | Effective: Results equal state average for similar students
≤ 39 | Upper limit < 51: Yes | Ineffective: Results are well below state average for similar students
≤ 39 | Upper limit < 51: No | Developing: Results are below state average for similar students

HEDI Classification Approach for Principals (using 2010–11 sample data)
Same methodology as for teachers; slightly different cut scores.
– Effective requires MGPs within 1 standard deviation of the average MGP of 50.
  – MGPs between 43 and 57 will earn Effective ratings.
– Well Above Average (Highly Effective) requires:
  – MGP of 58 or higher
  – AND confidence range entirely above 50. (If not, rating is Effective.)
– Well Below Average (Ineffective) requires:
  – MGP of 42 or lower
  – AND confidence range entirely below 50. (If not, rating is Developing.)

From MGPs to Growth Ratings: Principals
Mean Growth Percentile | Confidence Range | Growth Rating
≥ 58 | Lower limit > 50: Yes | Highly Effective: Results are well above state average for similar students
≥ 58 | Lower limit > 50: No | Effective: Results equal state average for similar students
43–57 | Any | Effective: Results equal state average for similar students
≤ 42 | Upper limit < 50: Yes | Ineffective: Results are well below state average for similar students
≤ 42 | Upper limit < 50: No | Developing: Results are below state average for similar students
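A compact sketch of the decision rules in the two tables above, parameterized so that both the teacher cut scores (62 / 39, average MGP 51) and the principal cut scores (58 / 42, average MGP 50) fit. This mirrors the slides' logic, not NYSED's production code.

```python
# Mirrors the HEDI decision rules shown above; not NYSED's production implementation.
def hedi_rating(mgp, lower_limit, upper_limit, high_cut, low_cut, average_mgp):
    """Classify an educator's MGP given its 95% confidence limits and the cut scores."""
    if mgp >= high_cut:
        # Well above average only if the whole confidence range sits above the average MGP.
        return "Highly Effective" if lower_limit > average_mgp else "Effective"
    if mgp <= low_cut:
        # Well below average only if the whole confidence range sits below the average MGP.
        return "Ineffective" if upper_limit < average_mgp else "Developing"
    return "Effective"  # MGPs between the cuts earn Effective regardless of the range

# Teacher cut scores from the slides (2010-11 sample data):
print(hedi_rating(mgp=65, lower_limit=55, upper_limit=75,
                  high_cut=62, low_cut=39, average_mgp=51))  # -> "Highly Effective"
# Principal cut scores: high_cut=58, low_cut=42, average_mgp=50
```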

Illustrating Possible Teacher Growth Ratings
[Diagram sequence: example MGPs and their confidence ranges shown on a number line from MGP 1 to MGP 99, with markers at Well Below Average (39), Average (51), and Well Above Average (62); the line is divided into Ineffective, Developing, Effective, and Highly Effective regions, and the examples shown illustrate Developing and Effective ratings]

Illustrative Results: Teachers (Using 2010–11 sample data)
Rating & Points (2011–12) | Number of Teacher MGPs | Percent of Teacher MGPs
Highly Effective 18–20 | |
Effective 9–17 | 16,681 | 76%
Developing 3–8 | |
Ineffective 0–2 | |
Points available within each HEDI category will be assigned based on educator MGP.

Illustrative Results: Principals (Using 2010–11 sample data)
Rating & Points (2011–12) | Number of Principal MGPs | Percent of Principal MGPs
Highly Effective 18–20 | |
Effective 9–17 | |
Developing 3–8 | |
Ineffective 0–2 | 241 | 7%
Points available within each HEDI category will be assigned based on educator MGP.

By the End of This Section….  You should be able to: –Explain how growth ratings and scores are obtained

47 What Data Will Be Available and When

Data — What to Expect When
– Mid-July: Growth scores provided to districts
– Mid-August: Online reporting system available
– Early fall: Test scores finalized and teacher linkage data final submission

Data — What to Expect in August
Data Elements (for teachers and schools)
– Unadjusted mean growth percentiles (unadjusted MGPs)
– Adjusted mean growth percentiles (adjusted MGPs, plus upper and lower limits based on the confidence range for these adjusted MGPs)
– Percent of students above the State median: provided at the teacher and school level; can be used as a local measure in APPR
– Number of student scores included
– Growth rating (HEDI)
– Growth score (0–20)
Breakdowns (by teacher and school)
– MGPs by subject, grade, and overall (not HEDI); can be used with SLOs as part of the Comparable Measures or Locally Selected subcomponent
– Overall MGPs for subgroups — ELL, SWD, economic disadvantage, high- and low-achieving; subgroup scores will not be included on reports if there are fewer than 16 student scores

One Teacher’s Information — August
[Sample report for an illustrative teacher, Jane Smith. The top table shows her overall results: number of student scores, percent of students above the State median, unadjusted MGP, adjusted MGP with lower and upper limits, growth rating (Highly Effective), and growth score (18). The bottom table breaks the same data elements down by subject (Math and ELA, overall and by grade) and by subgroup: students with disabilities (4 scores), English language learners (0), economically disadvantaged (2), low achieving/Level 1 (4), and high achieving/Level 4 (4); subgroup results are suppressed (*) because each subgroup has fewer than 16 student scores.]

One Teacher’s Information — August
The same sample report for Jane Smith, with callouts:
– Total number of student scores: 56 (Math: 28, ELA: 28)
– Unadjusted MGP: 70
– Adjusted MGP: 75, with lower and upper limits of 65 and 85
– Growth rating of Highly Effective and growth score of 18
– Percent of students above the State median: ___
– Unadjusted and adjusted MGPs by subject; the adjusted MGPs by subject can be used in an SLO for the Comparable Measures subcomponent
– Subgroup breakdowns: 2 SWD students, 0 ELL students, 1 economically disadvantaged student, 2 high- and 2 low-achieving students; no subgroup scores are reported since each sub-group has fewer than 16 student scores

Scavenger Hunt and Quiz
1. What is Ms. Smith’s overall adjusted MGP?
2. What are the upper and lower confidence limits for Ms. Smith’s overall MGP, and what do they represent?
3. How many scores are included from Ms. Smith’s class for ELA?
4. What is the adjusted MGP for Ms. Smith’s class in ELA?
5. How do Ms. Smith’s high-achieving students compare to her low-achieving students in terms of growth?
6. What score is Ms. Smith’s growth rating based on?

Definitions
– SGP (student growth percentile): the result of a statistical model that calculates each student’s change in achievement between two or more points in time on a state assessment or other comparable measure and compares each student’s performance to that of similarly achieving students
– Similar students: students with the same prior test scores and the same ELL, SWD, and economic disadvantage status
– ELLs: English language learners
– SWD: students with disabilities
– Economic disadvantage: a student who participates in, or whose family participates in, economic assistance programs such as the Free- or Reduced-Price Lunch program (FRPL), Supplemental Security Income (SSI), Food Stamps, Foster Care, and others

Definitions
– High-achieving, low-achieving: defined by the performance of students based on prior year state assessment scores (i.e., Level 1 = low-achieving, Level 4 = high-achieving)
– MGP (mean growth percentile): the average of the student growth percentiles attributed to a given educator
– “Unadjusted” MGP: an MGP based on SGPs that have NOT accounted for ELL, SWD, and economic disadvantage status
– “Adjusted” MGP: an MGP based on SGPs that HAVE accounted for ELL, SWD, and economic disadvantage status
– Growth rating: HEDI rating based on growth
– Growth score: growth subcomponent points from 0–20

Definitions
– Measurement error: uncertainty in test scores due to sampling of content and other factors
– Standard error: a measure of the statistical uncertainty surrounding a score
– Standard deviation: a measure that shows the spread of scores around the mean
– Upper/lower limit: the highest and lowest possible MGP, taking statistical confidence into account
– Confidence range: the range of MGPs within which we have a given level of statistical confidence that the true MGP falls (a 95% statistical confidence level is used for the state growth measure)