www.engageNY.org
New York State Education Department
Interpreting and Using Your New York State-Provided Growth Scores
August 2012

Presentation transcript:

New York State Education Department: Interpreting and Using Your New York State-Provided Growth Scores (August 2012)

By the End of This Presentation…

You should be able to:
- Explain what information is included in a teacher growth report
- Answer some common questions about the information
- Describe the differences between a teacher-level, school-level, and district-level report
- Understand how districts, principals, and teachers can use the information on the reports as one source of data that can help improve instruction

Evaluating Educator Effectiveness (grades 4-8 ELA and Math teachers and their principals)

- Growth (20%): student growth on state assessments (state-provided) or student learning objectives
- Locally Selected Measures (20%): student growth or achievement; options selected through collective bargaining
- Other Measures (60%): rubrics; sources of evidence include observations, visits, surveys, etc.

Key Points about NYS Growth Measures

- We are measuring student growth, not achievement. This allows teachers to achieve high ratings regardless of the incoming achievement levels of their students.
- We are measuring growth compared to similar students. Similar students share up to three years of the same prior achievement and three student-level characteristics (economic disadvantage, SWD, and ELL status).
- Every educator has a fair chance to demonstrate effectiveness on these measures regardless of the composition of his or her class or school.

Review of Terms

- SGP (student growth percentile): the result of a statistical model that calculates each student's change in achievement between two or more points in time on a State assessment or other comparable measure, and compares each student's performance to that of similarly achieving students.
- Unadjusted and adjusted MGP (mean growth percentile): the average of the student growth percentiles attributed to a given educator. For evaluation purposes, the overall adjusted MGP is used: the MGP that includes all of a teacher's or principal's students and takes into account student demographics (ELL, SWD, and economic disadvantage status).
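The averaging step that turns SGPs into an MGP can be sketched in a few lines of Python. This is only an illustration of the arithmetic described above, assuming the 16-score reporting minimum mentioned later in the deck; the function name is ours, and NYSED's actual model additionally adjusts for demographics.

```python
def mean_growth_percentile(sgps, minimum_scores=16):
    """Average the student growth percentiles attributed to one educator.

    Returns None when there are fewer scores than the reporting minimum
    (NYSED reports no MGP for fewer than 16 student scores).
    """
    if len(sgps) < minimum_scores:
        return None
    return sum(sgps) / len(sgps)
```

For example, an educator linked to sixteen students who all earned an SGP of 50 would have an MGP of 50, while an educator with only fifteen linked scores would have no MGP reported.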

MGPs and Statistical Confidence

[Figure: an example MGP of 87 shown with its confidence range, upper limit, and lower limit]

NYSED will provide a 95% confidence range, meaning we can be 95% confident that an educator's "true" MGP lies within that range. Upper and lower limits of MGPs will also be provided. An educator's confidence range depends on a number of factors, including the number of student scores included in his or her MGP and the variability of student performance in the classroom.
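Why do more scores and less variability produce a narrower range? A minimal sketch using a normal approximation (mean plus or minus z standard errors) makes the dependence visible. This is an assumption for illustration only, not NYSED's published method.

```python
import statistics


def mgp_confidence_range(sgps, z=1.96):
    """Illustrative 95% confidence range for an MGP.

    Uses a simple normal approximation (mean +/- z * standard error).
    NYSED's actual procedure differs; this sketch only shows why more
    student scores and less variable performance narrow the range.
    """
    mgp = sum(sgps) / len(sgps)
    std_err = statistics.stdev(sgps) / len(sgps) ** 0.5
    return mgp - z * std_err, mgp, mgp + z * std_err
```

With the same spread of SGPs, a class of 64 scores yields a noticeably narrower range than a class of 16, even though both produce the same MGP.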

Growth Ratings and Score Ranges

Growth Rating     Description                                      Growth Score Range (2011-12)
Highly Effective  Well above state average for similar students    18-20
Effective         Results meet state average for similar students  9-17
Developing        Below state average for similar students         3-8
Ineffective       Well below state average for similar students    0-2

The growth scores and ratings are based on an educator's combined MGP. For detailed information, see the webinar posted here: evaluation-in /
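The table above is a simple band lookup, which can be expressed directly in code. A minimal sketch, using the 2011-12 ranges from the table (the function name is ours):

```python
def growth_rating(score):
    """Map a 2011-12 combined growth score (0-20) to its rating band,
    following the Growth Ratings and Score Ranges table."""
    if not 0 <= score <= 20:
        raise ValueError("growth scores range from 0 to 20")
    if score >= 18:
        return "Highly Effective"
    if score >= 9:
        return "Effective"
    if score >= 3:
        return "Developing"
    return "Ineffective"
```

For instance, the principal growth scores of 14 and 6 that appear later in this deck fall in the Effective and Developing bands, respectively.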

First, let's look at a growth report about a teacher: Jane Eyre.

Teacher-level Report (District X, School #1: Jane Eyre)

- Teacher 1D's Growth Score and Growth Rating are listed here.
- Teacher 1D has a higher adjusted MGP in Math than in ELA.
- Teacher 1D does not have any growth data reported for any of the subgroups, because 16 student scores are required to report any data.
- Jane's MGP = 47 (this is what is used to determine the growth score and growth rating).
- Jane's Upper Limit = 55 and Lower Limit = 39.

Common Questions about Teacher-level Reports

Will an educator's adjusted MGP always be higher than the unadjusted, since the adjusted MGP takes into account not just prior student achievement but also economic disadvantage, ELL, and SWD status?

Not necessarily. Let's take the following simplified example of a relay race (some cell values were not legible in the source slide and are left blank):

Team  Order of finish  Overall percentile  Division  Division finish  Division percentile
L     1                99                  A         1
M     2                66                  B         1                99
N     3                50                  B         2                33
O     4                                    A         2
P     5                17                  B         3                1
Q     6                1                   A         3                1

Common Questions about Teacher-level Reports

What is the difference between an educator's MGP and the percent of an educator's students above the state median? Let's take the following simplified example:

Student SGP   Above state median of 50?
20            No
30            No
55            Yes
55            Yes
60            Yes

Teacher MGP (average of all SGPs): (20 + 30 + 55 + 55 + 60) divided by 5 = 44
Students above the state median: 3 out of 5 = 60%

Note: in NYSED reports, no MGP is calculated for fewer than 16 student scores.
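The worked example above can be reproduced in a couple of lines, which makes the distinction between the two statistics concrete:

```python
# The five student SGPs from the simplified example above.
sgps = [20, 30, 55, 55, 60]

# MGP: the mean of the SGPs.
mgp = sum(sgps) / len(sgps)

# Percent of students above the state median of 50 (a different statistic).
percent_above_median = 100 * sum(sgp > 50 for sgp in sgps) / len(sgps)

print(mgp)                   # 44.0
print(percent_above_median)  # 60.0
```

The two low SGPs (20 and 30) pull the MGP well below the median even though most students are above it, which is why the two numbers can tell different stories about the same classroom.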

School-level Report (District X, School #1)

- An adjusted MGP and associated confidence range will be reported for each subject and grade level within the school.
- 49% of students at School #1 scored above the State median.
- The Growth Score and Growth Rating for the Principal of School #1 are listed here.
- School #1 has scores broken out by subject for grades ___. ___% of the student scores are from economically disadvantaged students, and no scores are from English language learners.

See the Summary of Revised APPR Provisions Memo.

School-level Report: Detailed View (District X, School #1, Teachers 1A through 1L)

- School #1 has 12 teachers who teach grades 4-8 ELA and Math.
- Teacher 1B has the most student scores linked to him (43 scores).
- 43 student scores could not be linked to any of the teachers.
- Each teacher receives an adjusted MGP and associated confidence range that are used to determine the growth rating and growth score.
- Teachers 1E and 1G did not receive any growth data because they are linked to fewer than 16 student scores.

Frequently Asked Questions

Why do some teachers have a combined MGP but no subject-level MGPs?
- A minimum of 16 student scores is required.
- A student must be continuously enrolled in a course that leads to an assessment for 195 calendar days for ELA, or 203 calendar days for Math.

Frequently Asked Questions

Why are some students unassigned? How did the State define "unassigned"?

Students are considered unassigned if the district did not provide a valid teacher-of-record for that student, or if the student-teacher linkage did not meet the continuous enrollment guidelines set forth in the APPR guidance (professional-performance-review-law-and-regulations/). To meet the continuous enrollment guidelines, a student needed to be linked to a teacher for 195 calendar days for ELA, or 203 calendar days for Math.

District-level View, Page 1: NY State Summary

- NYS summary data is included on ALL district reports.
- The number of student scores included in the calculation of the State MGP is shown.
- NY Statewide Adjusted MGP = 52; State Median = 50.
- Statewide, about 50% of ELL, SWD, and economically disadvantaged students scored above the State median.

District-level View, Pages 1-2: District Summary

- District X summary data, continued on the next page of the report.
- The number of student scores included in the calculation of the district-wide MGP is shown.
- The district-wide Adjusted MGP is shown.

District-level View, Page 3: List of Schools

- District X has two schools with grades 4-8 ELA and Math scores: School #1 and School #2.
- Principal of School #1: Growth Score = 14, Growth Rating = Effective.
- Principal of School #2: Growth Score = 6, Growth Rating = Developing.

Using Growth Score Results

Beyond evaluation, growth scores can provide additional information to help teachers, principals, and districts with instructional improvement.
- These measures are only one of multiple sources of evidence to use for this purpose.
- The best insight comes from considering the results in the context of other information about a teacher, group of teachers, principal, or group of schools.

Districts may want to: Analyze district-level information using these reflective questions:
- How much did our students grow, on average, compared to similar students? Is this higher, lower, or about what we would have expected? Why?
- How do our MGPs for each reported subgroup (ELL, SWD, economically disadvantaged students, high- and low-achieving students) compare to each other and to our overall MGP? Are there any patterns? Are the MGPs higher, lower, or about what we would have expected? Why?
- How do the MGPs compare by subject and across grade levels? Why might they be similar or different?
- What should we do to understand any surprises, using other information and evidence?
- Do we have the right plans in place to aid in professional growth and learning for our educators?

Districts may want to: Convene principals to reflect upon their school growth results in the context of other information about student learning and teacher effectiveness in their schools:
- Use BOCES trainers and/or SED online resources to ensure basic understanding of the measures and what information is found on reports.
- Engage principals individually or in a group to reflect on questions about their school information in the context of other evidence of teacher effectiveness:
  - How much did the students of my teachers grow, on average, compared to similar students, and how does this differ across teachers? Are there differences across grades or subjects?
  - How do my teachers' MGPs differ across each reported subgroup? Do I see any patterns?

Districts may want to: Plan for communicating with teachers about their results:
- How will teachers get general information about the growth measures? (District or school-level training, or self-directed use of SED resources.)
- How and when will teachers receive their individual reports? (Remember, SED will provide online access for individual educators later in the fall.)
- Use BOCES trainers and/or SED online resources to ensure basic understanding of the measures and what information is found on reports.

Principals may want to:
- Consider the reflective questions in their school-level reports.
- See the Principal's Guide to Interpreting Growth Scores: content/uploads/2012/06/Principals_Guide_to_Interpreting_Your_Growth_Score.pdf
- See the Sample Principal Report (Annotated): content/uploads/2012/06/Principal_Sample_Growth_Report.pdf
- Plan how teachers will get the information they need to understand their own growth reports.

Teachers may want to:
- Review materials from SED about growth measures.
- View the "Growth Model for Educator Evaluation" webinar.
- View the "Using Growth Measures for Educator Evaluation" webinar.
- See the Teacher's Guide to Interpreting Growth Scores: content/uploads/2012/06/Teachers_Guide_to_Interpreting_Your_Growth_Score.pdf
- See the Sample Teacher Report (Annotated): content/uploads/2012/06/Teacher_Sample_Growth_Report.pdf
- Consider the following reflective questions:
  - How much did my students grow, on average, compared to similar students? Is this higher, lower, or about what I would have expected? Why?
  - How does this information about student growth align with information about my instructional practice received through observations or other measures? Why might this be?

For More Information…

- Please review our posted Guides for Interpreting Your Growth Scores: measures/
- See the guidance on NYS's APPR Law and Regulations: professional-performance-review-law-and-regulations/