Portability of Teacher Effectiveness across School Settings
Zeyu Xu, Umut Ozek, Matthew Corritore
May 29, 2016
Bill & Melinda Gates Foundation Evaluation of the Intensive Partnership Sites initiative

Motivation
- Redistributing effective teachers is at the center of several education policy initiatives
  - Teachers are the most important school input related to student learning
  - The distribution of effective teachers is uneven (recruiting, who moves, and to where)
- Key assumption: teacher effectiveness is portable
  - Students face different challenges in learning
  - School culture, environment, and working conditions may affect teacher learning, practices, effort, burnout, etc.
- Literature
  - Jackson (2010), Jackson & Bruegmann (2009), Goldhaber & Hansen (2010)
  - Sanders, Wright & Langevin (2009)

Research Questions
- Do teachers retain their effectiveness when they change schools?
  - On average
  - Across schools with similar settings
  - Across schools with different settings (by the direction of the change)
- Teacher effectiveness is measured by value-added (VA)
- Settings are defined by
  - School performance levels
  - School poverty levels
- All analyses are conditional on teachers switching schools

Preview of Findings
- Among teachers who changed schools, average VA was unchanged or slightly improved after the move
- The same conclusion holds regardless of the similarity or difference between the sending and receiving schools, and regardless of the direction of the move
- High-VA teachers' VA dropped and low-VA teachers' VA rose in post-move years
  - This pattern is mostly driven by regression to the within-teacher mean and has little to do with school moves
- Despite this pattern, high-VA teachers still performed at a higher level than low-VA teachers in post-move years

Organization
- Data and samples
- Methodology
- Findings
- Summary and discussion

Data
- North Carolina
  - Elementary level: 4th and 5th grade math and reading teachers in self-contained classrooms
  - Secondary level: Algebra I and English I teachers ("Algebra I", "Algebra I-B", "Integrated Math II", "English I" classrooms)
- Florida
  - Elementary level: 4th and 5th grade math and reading teachers, "core courses" in a given subject
  - Secondary level: 9th and 10th grade math and reading teachers, "core courses" in a given subject

Sample Restrictions
- Remove charter schools
- Remove students and teachers who changed schools during a school year (about 2-4% of observations)
- Remove students with missing values on covariates
- Keep classrooms with 10-40 students
- Remove classrooms with more than 50% special education students
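As a rough illustration of these restrictions, here is a minimal pandas sketch. The column names (charter_school, switched_midyear, class_id, special_ed, student_id) are assumptions for the example, not the variable names in the actual study files.

```python
import pandas as pd

def apply_sample_restrictions(df: pd.DataFrame, covariates: list[str]) -> pd.DataFrame:
    """Apply the slide's sample restrictions to a student-classroom-level file.

    Column names (charter_school, switched_midyear, class_id, special_ed,
    student_id) are illustrative assumptions, not the study's actual names.
    """
    # Remove charter schools
    df = df[~df["charter_school"]]
    # Remove students/teachers who changed schools during the school year
    df = df[~df["switched_midyear"]]
    # Remove students with missing values on any covariate
    df = df.dropna(subset=covariates)
    # Keep classrooms with 10-40 students
    class_size = df.groupby("class_id")["student_id"].transform("nunique")
    df = df[class_size.between(10, 40)]
    # Remove classrooms with more than 50% special education students
    sped_share = df.groupby("class_id")["special_ed"].transform("mean")
    df = df[sped_share <= 0.5]
    return df
```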

Sample Sizes: Number of Unique Teachers in the Analytic Samples

           North Carolina              Florida
           Elementary    Secondary     Elementary    Secondary
Math       21,119        4,999         29,989        9,101
Reading    21,119        3,775         29,354        9,681

Two-Stage Analysis
1. Estimate teacher-year value-added
2. Difference-in-differences analysis

Estimate Teacher VA
- Test scores standardized by year, grade, and subject (mean = 0, SD = 1)
- Covariates (X) include: grade repetition, FRPL status, sex, race/ethnicity, gifted status, special education status, student school mobility, and grade level
- Bias: no school fixed effects included
- Noise: empirical Bayes (EB) adjustment
- Alternative model specifications (achievement-levels model)
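A minimal sketch of this first stage, assuming a prior-score covariate-adjustment model, teacher-year mean residuals, and a simple EB shrinkage toward zero. All column names and the crude signal-variance approximation are assumptions for illustration, not the study's exact estimator.

```python
import pandas as pd
import statsmodels.formula.api as smf

def estimate_teacher_year_va(df: pd.DataFrame) -> pd.DataFrame:
    """Illustrative teacher-year VA: covariate-adjusted residuals, then EB shrinkage.

    Assumes columns: score_std, prior_score_std, repeat_grade, frpl, female,
    race, gifted, sped, moved_schools, grade, teacher_id, year (assumed names),
    with no missing values left after the sample restrictions.
    """
    # Step 1: regress the standardized score on the prior score and student covariates
    model = smf.ols(
        "score_std ~ prior_score_std + repeat_grade + frpl + female + C(race)"
        " + gifted + sped + moved_schools + C(grade)",
        data=df,
    ).fit()
    df = df.assign(resid=model.resid)

    # Step 2: raw teacher-year VA = mean residual within each teacher-year cell
    cell = df.groupby(["teacher_id", "year"])["resid"].agg(["mean", "var", "count"]).reset_index()
    cell = cell.rename(columns={"mean": "va_raw"})

    # Step 3: EB shrinkage toward zero using a reliability weight
    total_var = cell["va_raw"].var()
    noise_var = cell["var"] / cell["count"]                 # sampling variance of each cell mean
    signal_var = max(total_var - noise_var.mean(), 1e-6)    # crude signal-variance estimate (assumption)
    reliability = signal_var / (signal_var + noise_var)
    cell["va_eb"] = reliability * cell["va_raw"]
    return cell[["teacher_id", "year", "va_raw", "va_eb"]]
```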

DiD
- Three groups: non-movers, movers to a similar school setting, movers to a different school setting
- FGLS, with standard errors clustered at the teacher level
- (Y) Year and (T) teacher fixed effects
- (X) Teacher experience (0-2, 3-5, 6-12, 13 or more years of experience)
- (S) School quality (average peer VA)
- (C) Classroom characteristics (FRL %, mean pretest score, SD of pretest score)
- (Post) Indicator for post-move years
- (DP, DN) Indicators for school-setting differences (positive / negative direction)
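The sketch below sets up the mover DiD regression implied by this slide. For readability it uses OLS with teacher and year fixed effects and teacher-clustered standard errors as a stand-in for the FGLS estimator described above; the column names (va, post, dp, dn, exp_bin, peer_va, frl_pct, pretest_mean, pretest_sd) are assumed.

```python
import pandas as pd
import statsmodels.formula.api as smf

def fit_mover_did(panel: pd.DataFrame):
    """Teacher-year panel DiD sketch (OLS + fixed effects, not the paper's FGLS).

    Assumed columns: va (teacher-year value-added), post (1 in post-move years),
    dp / dn (moved to a higher / lower setting), exp_bin, peer_va,
    frl_pct, pretest_mean, pretest_sd, teacher_id, year.
    """
    formula = (
        "va ~ post + post:dp + post:dn"           # average post-move change and setting-direction shifts
        " + C(exp_bin) + peer_va"                 # teacher experience bins, school quality
        " + frl_pct + pretest_mean + pretest_sd"  # classroom characteristics
        " + C(teacher_id) + C(year)"              # teacher and year fixed effects
    )
    result = smf.ols(formula, data=panel).fit(
        cov_type="cluster", cov_kwds={"groups": panel["teacher_id"]}
    )
    return result
```

With tens of thousands of teachers one would absorb the teacher fixed effects (a within transformation or a dedicated panel estimator) rather than expanding C(teacher_id) into dummies; the dummy form is kept here only to make the specification explicit.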

Define School Settings
- School performance
  - NC: % of students performing at or above grade level
  - FL: school performance scores based on both levels and growth
  - Standardized by year and aggregated across all years
- School poverty
  - % FRPL
  - Aggregated across all years in which a teacher taught in that school
- Change in school-setting measures
  - ∆ = receiving school - sending school
  - Similar setting = within half an SD around the mean of the ∆ distribution
  - DP = 1 if ∆ > 0.25 (performance) or ∆ > 0.15 (poverty)
  - DN = 1 if ∆ is below the corresponding negative thresholds for performance and poverty
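A small sketch of how these setting-change indicators could be constructed. The positive cutoff matches the slide's performance threshold; the negative cutoff is assumed to be symmetric, which the transcript does not confirm, and the column names are illustrative.

```python
import pandas as pd

def setting_change_indicators(moves: pd.DataFrame,
                              pos_cut: float = 0.25,
                              neg_cut: float = -0.25) -> pd.DataFrame:
    """Flag moves to higher (DP) or lower (DN) setting schools.

    `moves` has one row per school move with assumed columns `sending_setting`
    and `receiving_setting` (e.g., standardized school performance). The 0.25
    cutoff matches the slide; the negative cutoff is an assumed mirror image.
    """
    delta = moves["receiving_setting"] - moves["sending_setting"]
    return moves.assign(
        delta=delta,
        dp=(delta > pos_cut).astype(int),   # moved to a higher-setting school
        dn=(delta < neg_cut).astype(int),   # moved to a lower-setting school
        # "similar" treated here as neither DP nor DN; the slide defines it
        # via half an SD around the mean of the delta distribution
        similar=((delta >= neg_cut) & (delta <= pos_cut)).astype(int),
    )
```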

Alternative DiD Specifications
- Compare only the last pre-move year and the first post-move year
- Between- vs. within-district moves
- Replace the post-move indicator with individual year dummies (I_{t-1}, I_{t-2}, I_{t-3}, ...; I_{t+1}, I_{t+2}, I_{t+3}, ...)
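For the last specification, relative-year (event-time) dummies can be built as in the sketch below; the columns `year` and `move_year` are assumed names, and the sketch applies to movers only.

```python
import pandas as pd

def add_event_time_dummies(panel: pd.DataFrame, window: int = 3) -> pd.DataFrame:
    """Replace the single post-move indicator with event-time dummies I_{t-k}, ..., I_{t+k}.

    Assumes mover rows with columns `year` and `move_year` (first year in the
    receiving school); relative years beyond the window are binned at +/- window.
    """
    rel = (panel["year"] - panel["move_year"]).clip(-window, window)
    dummies = pd.get_dummies(rel, prefix="rel_year")   # one indicator per relative year
    return pd.concat([panel, dummies], axis=1)
```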

Distribution of Movers (figure): by school-performance setting change

Distribution of Movers (figure): by school-poverty setting change

Mover Characteristics: NC elementary school teachers, by mobility status

Pre-Post Change in VA (elementary): by state (North Carolina, Florida) and subject (math, reading), for all movers and by change in school performance and school poverty setting (higher to lower, similar, lower to higher)

Pre-Post Change in VA (secondary): by state (North Carolina, Florida) and subject (math, reading), for all movers and by change in school performance and school poverty setting (higher to lower, similar, lower to higher)

By Pre-Move VA (figures): actual year of move vs. "pseudo" move

By Pre-Move VA (figure): elementary math teachers, actual move vs. pseudo move

By Pre-Move VA (figure): elementary reading teachers, actual move vs. pseudo move

By Pre-Move VA (figure): secondary math teachers, actual move vs. pseudo move

By Pre-Move VA (figure): secondary reading teachers, actual move vs. pseudo move

Adjacent Year Correlations

                North Carolina                      Florida
Correlation     Math            Reading             Math            Reading
Y_{t-2}, Y_t    (0.426, 0.535)  (0.232, 0.362)      (0.314, 0.443)  (0.111, 0.260)
Y_{t-1}, Y_t    (0.256, 0.420)  (0.182, 0.354)      (0.231, 0.369)  (0.061, 0.213)
Y_{t+1}, Y_t    (0.381, 0.537)  (0.175, 0.358)      (0.363, 0.487)  (0.115, 0.264)
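As an illustration of how such adjacent-year stability figures can be computed from teacher-year VA estimates, here is a small pandas sketch. The column names are assumptions, and it reports plain Pearson correlations rather than whatever interval construction underlies the table above.

```python
import pandas as pd

def adjacent_year_correlation(va: pd.DataFrame, lag: int = 1) -> float:
    """Correlation between a teacher's VA in year t and in year t - lag.

    `va` has assumed columns teacher_id, year, va_eb.
    """
    # Shift the lagged observations forward so that merging on `year`
    # pairs each teacher-year with the same teacher's VA `lag` years earlier.
    lagged = va.assign(year=va["year"] + lag).rename(columns={"va_eb": "va_lag"})
    merged = va.merge(lagged[["teacher_id", "year", "va_lag"]], on=["teacher_id", "year"])
    return merged["va_eb"].corr(merged["va_lag"])
```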

Pre-Post Comparisons of VA (figure): North Carolina

Pre-Post Comparisons of VA (figure): Florida

Summary
- Among teachers who changed schools, average VA was unchanged or slightly improved after the move
- The same conclusion holds regardless of the similarity or difference between the sending and receiving schools, and regardless of the direction of the move
- High-VA teachers' VA dropped and low-VA teachers' VA rose in post-move years
  - This pattern is mostly driven by regression to the within-teacher mean and has little to do with school moves
- Despite this pattern, high-VA teachers still performed at a higher level than low-VA teachers in post-move years

Discussion
- Teacher effectiveness does not appear to be hurt by moving to schools with different settings.
- Multiple years of VA estimates can be combined with other teacher evaluation data to identify effective teachers, capturing persistent teacher performance better and reducing post-move shrinkage in estimated VA.
- All results take teachers' school changes as given.