Measuring Principals’ Effectiveness: Results from New Jersey’s Principal Evaluation Study


Measuring Principals’ Effectiveness: Results from New Jersey’s Principal Evaluation Study
SREE Spring Conference, March 2, 2017
Christine Ross • Mariesa Herrmann

Research partnership with the New Jersey Department of Education
- NJ developed a principal evaluation system, but little research was available to guide its design
- NJ’s principal evaluation system includes measures of professional practice and student achievement growth, analogous to the measures used in teacher evaluation systems
- Problem: little evidence on the reliability and validity of these measures
- NJDOE requested an assessment of the new principal evaluation system
  - Pilot year: 2012/13
  - Statewide implementation: 2013/14

Main questions for the study
- Variation in ratings. To what extent did ratings overall and on each of the component measures vary across principals?
- Stability of measures. How stable were school median student growth percentiles (SGPs) across years?
  - Schools with the same principal and those that changed principals
  - Smaller and larger schools
- Correlations with schoolwide student characteristics. What were the correlations between principals’ ratings and the student characteristics of the schools they led?
- Correlations among component measures. What were the correlations among component measure ratings?

Weights on principal evaluation components in overall ratings, by type of school, 2013-14
[Table: component weights shown by type of school]
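Each component rating contributes to the overall rating according to its weight. As a rough illustration of the arithmetic only, the sketch below computes a weighted average of component scores; the scores and weights are hypothetical and are not the actual AchieveNJ weights, which vary by type of school.

```python
# Hypothetical sketch of compiling an overall principal rating from weighted
# component scores. The weights below are illustrative only; they are NOT the
# actual AchieveNJ weights, which vary by type of school.

component_scores = {          # each component scored on a 1-4 scale
    "school_median_sgp": 2.5,
    "principal_practice": 3.4,
    "evaluation_leadership": 3.2,
    "principal_goals": 3.8,
    "teacher_sgo_average": 3.6,
}

illustrative_weights = {      # must sum to 1.0
    "school_median_sgp": 0.30,
    "principal_practice": 0.30,
    "evaluation_leadership": 0.10,
    "principal_goals": 0.10,
    "teacher_sgo_average": 0.20,
}

overall = sum(component_scores[k] * illustrative_weights[k] for k in component_scores)
print(f"Overall rating score: {overall:.2f}")  # prints 3.19 on the 1-4 scale
```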

Nearly all principals received overall ratings of effective or highly effective
[Chart: percentage of principals by overall rating category]
Note: 1,656 principals
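A distribution like the one charted here is a simple tabulation of rating categories. A minimal pandas sketch, using made-up ratings rather than the study’s data:

```python
# Sketch: tabulating the percentage of principals in each overall rating
# category. The ratings below are illustrative, not the study's data.
import pandas as pd

overall_ratings = pd.Series(
    ["Effective"] * 60 + ["Highly effective"] * 35 +
    ["Partially effective"] * 4 + ["Ineffective"] * 1
)

percentages = overall_ratings.value_counts(normalize=True).mul(100).round(1)
print(percentages)
```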

Few principals were rated highly effective on SGPs; most were rated highly effective on principal goals and teacher SGOs
Note: 1,183 principals

School median SGPs were stable across years, even for new principals
- Same principals in 2011-12 and 2012-13 (1,374 principals/schools): correlation 0.69
- New principals in 2012-13 (352 principals in 356 schools): correlation 0.63
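The stability figures here are year-to-year correlations of school median SGPs. A minimal sketch of that calculation with pandas; the column names and values are hypothetical:

```python
# Sketch: year-to-year stability of school median SGPs as a Pearson correlation.
# Assumes a DataFrame with one row per school and hypothetical column names.
import pandas as pd

df = pd.DataFrame({
    "school_id": [1, 2, 3, 4, 5],
    "median_sgp_2012": [38.0, 52.5, 61.0, 45.5, 70.0],   # illustrative values
    "median_sgp_2013": [41.0, 50.0, 58.5, 47.0, 66.0],
})

stability = df["median_sgp_2012"].corr(df["median_sgp_2013"])  # Pearson r
print(f"Year-to-year correlation of school median SGPs: {stability:.2f}")
```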

School median SGPs were less stable for smaller schools than for larger schools
[Chart: change in school median SGPs, 2012/13-2013/14 (percentile points), by number of tested students in grades 4–8]
Note: 1,267 principals; findings from 3 years
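Greater volatility in smaller schools is consistent with sampling variability: a median taken over fewer tested students bounces around more from year to year. The simulation sketch below illustrates the point; the uniform SGP distribution and school sizes are assumptions for illustration, not the study’s actual analysis.

```python
# Simulation sketch: the school median of student growth percentiles is noisier
# when fewer students are tested. All distributions and sizes are assumed for
# illustration; this is not the study's analysis.
import numpy as np

rng = np.random.default_rng(0)

def sd_of_school_median(n_students, n_sims=5000):
    # Draw student SGPs uniformly from 1-99 and track how much the school
    # median varies across simulated "years".
    medians = [np.median(rng.integers(1, 100, size=n_students)) for _ in range(n_sims)]
    return np.std(medians)

for n in (25, 100, 400):
    print(f"{n:>4} tested students -> SD of school median SGP ~ {sd_of_school_median(n):.1f}")
```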

Principals leading schools with larger proportions of economically disadvantaged students tended to receive lower ratings than other principals
Correlations of component measures with schoolwide student characteristics (percentage economically disadvantaged / percentage English learner students):
- School Median SGP Rating: –.33* / –.05*
- Principal Practice Instrument Rating: –.20* / –.14*
- Evaluation Leadership Instrument Rating: –.11*
- Principal Goals Rating: –.15* / –.10*
- Teachers’ Student Growth Objectives Rating: –.29*
- Overall Rating: –.24* / –.12*
* Statistically significant at p < .05, two-tailed test.
Note: 1,450 to 1,781 principals

Component measures modestly correlated with each other
Correlations among component measure ratings:
- Principal Practice Instrument Rating with School Median SGP Rating: .16*
- Evaluation Leadership Instrument Rating with School Median SGP Rating: .08*; with Principal Practice Instrument Rating: .61*
- Principal Goals Rating with School Median SGP Rating: .10*; with Principal Practice Instrument Rating: .32*
- Teachers’ Student Growth Objectives Rating with School Median SGP Rating: .27*; with Principal Practice Instrument Rating: .23*; with Evaluation Leadership Instrument Rating: .25*
* Statistically significant at p < .05, two-tailed test.
Note: 1,183 to 1,752 principals
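A matrix of pairwise correlations like this one can be produced directly from the component ratings. A minimal sketch with pandas; the column names and values are hypothetical:

```python
# Sketch: pairwise correlations among component ratings. Column names and
# values are hypothetical, not the study's data.
import pandas as pd

ratings = pd.DataFrame({
    "median_sgp_rating":      [2, 3, 3, 4, 2, 3],
    "practice_rating":        [3, 3, 4, 4, 2, 3],
    "eval_leadership_rating": [3, 4, 4, 4, 3, 3],
    "principal_goals_rating": [4, 3, 4, 4, 3, 4],
    "teacher_sgo_rating":     [3, 4, 4, 4, 3, 4],
})

print(ratings.corr(method="pearson").round(2))  # full correlation matrix
```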

Additional research could further assess measures’ reliability and validity
Topics for further research on principal evaluation:
- Implementation quality
- Inter-rater reliability of practice instruments, and the training needed to attain high rates of inter-rater reliability
- Internal consistency of practice instruments
  - Requires item-level data on the principal practice instruments and the evaluation leadership instrument
- Measures of principals’ contributions to student achievement growth
  - Requires student achievement data and principal school assignment data over multiple years
- Confirming findings with an additional year of ratings data

For More Information
Christine Ross, CRoss@mathematica-mpr.com
Mariesa Herrmann, MHerrmann@mathematica-mpr.com
REL 2016-156: Herrmann, M., & Ross, C. (2016). Measuring principals’ effectiveness: Results from New Jersey’s first year of statewide principal evaluation.
REL 2015-089: Ross, C., Herrmann, M., & Angus, M. H. (2015). Measuring principals’ effectiveness: Results from New Jersey’s principal evaluation pilot.

Additional slides

References
Branch, G., Hanushek, E., & Rivkin, S. (2012). Estimating the effect of leaders on public sector productivity: The case of school principals. Working paper. Cambridge, MA: National Bureau of Economic Research.
Chiang, H., Lipscomb, S., & Gill, B. (2016). Is school value-added indicative of principal quality? Education Finance and Policy.
Coelli, M., & Green, D. (2012). Leadership effects: School principals and student outcomes. Economics of Education Review, 31(1), 92–109.
Dhuey, E., & Smith, J. (2012). How important are school principals in the production of student achievement? Working paper. Toronto, ON: University of Toronto.

Most school median SGPs were transformed into a rating of “effective”
[Chart: transforms school median SGP into a school median SGP rating of ineffective, partially effective, effective, or highly effective]
Note: The number of principals with school median SGPs is 1,742.
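Converting a school median SGP into a rating category amounts to applying cut points along the 1–99 SGP scale. A minimal sketch of that mapping; the cut points below are hypothetical, not New Jersey’s actual thresholds:

```python
# Sketch: converting a school median SGP into a four-category rating.
# The cut points below are hypothetical; they are not NJ's actual thresholds.
def sgp_to_rating(median_sgp, cuts=(20, 35, 65)):  # assumed boundaries on the 1-99 SGP scale
    if median_sgp < cuts[0]:
        return "Ineffective"
    elif median_sgp < cuts[1]:
        return "Partially effective"
    elif median_sgp < cuts[2]:
        return "Effective"
    return "Highly effective"

for m in (15, 30, 50, 75):
    print(m, "->", sgp_to_rating(m))
```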

Districts mainly selected commercially available principal practice instruments
Note: Four other instruments are the New Jersey LoTi Principal Evaluation Instrument, the Rhode Island Model: Building Administrator Evaluation and Support Model, the Principal Evaluation and Improvement Instrument, and the Thoughtful Classroom Principal Effectiveness Framework.
Source: New Jersey Department of Education survey of school districts, February 2013 and October 2014.

Instrument developers provided qualitative information on validity
- All seven indicated that the practice instruments are consistent with ISLLC standards for principal leadership
- Five indicated that the instruments were developed and informed by research on the relationship between principal practice and school performance or student achievement
- None provided information on the statistical relationship between scores on the instrument and student achievement

Most instrument developers provided incomplete or no information on reliability
- Internal consistency reliability: One developer indicated that analyses of internal consistency reliability confirmed the desired constructs
- Inter-rater reliability: One developer provided information on inter-rater reliability standards and the training required to meet those standards
- Two other developers provided ongoing inter-rater reliability refresher training but did not indicate standards for reliability
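For context on the reliability evidence being requested here: internal consistency is commonly summarized with Cronbach’s alpha computed from item-level scores. A minimal sketch with hypothetical item scores (not data from the study):

```python
# Sketch: Cronbach's alpha for internal consistency of an instrument's items.
# The item scores below are hypothetical, not data from the NJ study.
import numpy as np

def cronbach_alpha(item_scores):
    # item_scores: 2-D array, rows = principals being rated, columns = items
    item_scores = np.asarray(item_scores, dtype=float)
    k = item_scores.shape[1]
    item_vars = item_scores.var(axis=0, ddof=1)
    total_var = item_scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

scores = [
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 3],
    [3, 3, 2, 3],
    [4, 3, 4, 4],
]
print(f"Cronbach's alpha: {cronbach_alpha(scores):.2f}")
```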

Statewide principal SGP ratings were relatively stable across years among principals in the same school
[Chart: SGP rating categories (ineffective, partially effective, effective, highly effective) across years]
Note: The analysis is based on principals who were in the same school in 2011-12 and 2012-13 (1,374 principals/schools).

School median SGPs were less stable for smaller schools than for larger schools
[Chart: change in school median SGPs, 2012/13-2013/14 (percentile points), by number of tested students in grades 4–8, with markers for schools whose median SGP increased or decreased by more than 5 percentile points]
Note: 1,267 principals

Large changes in school median SGP between years 1 and 2 were less persistent in year 3 for smaller schools than for larger ones
Note: 808 principals