Strategies for estimating the effects of teacher credentials
Helen F. Ladd
Based on joint work with Charles Clotfelter and Jacob Vigdor
CALDER Conference, Oct. 4, 2007

Basic value-added model
Definition: A_it = a·A_i,t-1 + b·TQ_it + c·X_it + error_it, where A = student achievement (i.e., test score); TQ = teacher qualifications; X = control variables.
Justification: education is a cumulative process.
a = estimate of the persistence of knowledge from one year to the next: a = 1 => complete persistence (no decay); a = 0 => no persistence (100 percent decay).
b = estimate of the effect of the qualifications of the student’s teacher in year t on her achievement in year t.
(The model assumes a and b are constant across years.)
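The estimating equation above can be sketched in code. The following is a purely hypothetical simulation (made-up coefficients, a binary credential dummy, and invented noise — not the papers’ NC data): generate achievement from the model, then recover a and b by ordinary least squares.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data from A_it = a*A_i,t-1 + b*TQ_it + e_it (X omitted for brevity).
n, a_true, b_true = 5000, 0.7, 0.05
tq = rng.integers(0, 2, size=n).astype(float)    # e.g., a 0/1 credential dummy
a_prev = rng.normal(0.0, 1.0, size=n)            # prior-year achievement
a_curr = a_true * a_prev + b_true * tq + rng.normal(0.0, 0.3, size=n)

# OLS: regress current achievement on lagged achievement and TQ (plus a constant)
X = np.column_stack([a_prev, tq, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, a_curr, rcond=None)
a_hat, b_hat = coef[0], coef[1]
```

With random assignment of TQ, as simulated here, OLS recovers both the persistence parameter and the credential effect; the sorting problems discussed on the later slides arise precisely because TQ is not randomly assigned in real classrooms.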

Three papers – NC data
Cross-sectional data – fifth graders: “Teacher-Student Matching and the Assessment of Teacher Effectiveness”
Longitudinal data – fourth and fifth graders, multiple cohorts of students: “How and Why Do Teacher Credentials Matter for Student Achievement?”
Course-specific achievement in high school courses – multiple cohorts: “Teacher Credentials and Student Achievement in High School: A Cross-Subject Analysis with Student Fixed Effects”
Note: student achievement is normalized by grade, year, and subject so that the mean is 0 and the SD is 1.
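The normalization in the note can be illustrated directly. This hypothetical sketch (random raw scores, arbitrary cell labels) standardizes scores within each grade-year-subject cell so each cell has mean 0 and SD 1:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical raw scale scores and cell ids; each cell id stands for
# one (grade, year, subject) combination.
raw = rng.normal(350.0, 25.0, size=300)
cell = rng.integers(0, 3, size=300)

# Standardize within each cell: subtract the cell mean, divide by the cell SD.
z = np.empty_like(raw)
for c in np.unique(cell):
    mask = cell == c
    z[mask] = (raw[mask] - raw[mask].mean()) / raw[mask].std()
```

After this transformation a coefficient of, say, 0.05 reads directly as five hundredths of a standard deviation of achievement, comparable across grades, years, and subjects.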

Challenges for all three papers
Data – identification of each student’s teacher:
Elementary schools – EOG tests; high schools – EOC tests. In both cases we start with the proctor of the test but keep the observation only if we are quite confident that the proctor is the relevant teacher (> 75% match rate in both elementary and high schools). Middle schools – identification not feasible.
Estimation – non-random sorting of teachers and students among classrooms. “Positive” sorting => upward-biased coefficients on teacher credentials.

Cross-sectional model – 5th graders
Strategies to reduce the bias of the estimates:
Add an extensive set of student covariates – a rich set is available in the NC data, e.g., education level of parents, TV watching.
Include school fixed effects – rules out bias from teacher-student sorting across schools.
Restrict the sample to schools with evenly balanced classrooms – reduces bias from sorting across classrooms within schools.
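The school-fixed-effects strategy amounts to demeaning within schools. In this hypothetical simulation (all parameters invented), teacher experience is positively sorted with unobserved school quality, so pooled OLS overstates the experience effect while the within-school estimator recovers it:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical data: better schools attract more experienced teachers,
# and school quality also raises scores directly.
n_schools, per_school, b_true = 200, 30, 0.05
school = np.repeat(np.arange(n_schools), per_school)
school_effect = rng.normal(0.0, 0.5, size=n_schools)
exper = rng.normal(10.0, 3.0, size=n_schools * per_school) + 2.0 * school_effect[school]
score = b_true * exper + school_effect[school] + rng.normal(0.0, 0.3, size=n_schools * per_school)

# Pooled OLS (no fixed effects): picks up the school-quality confound.
X = np.column_stack([exper, np.ones(exper.size)])
b_ols = np.linalg.lstsq(X, score, rcond=None)[0][0]

# School fixed effects via within-school demeaning.
def demean_by(group, x):
    totals = np.zeros(group.max() + 1)
    np.add.at(totals, group, x)             # per-group sums
    counts = np.bincount(group).astype(float)
    return x - (totals / counts)[group]     # subtract each group's mean

ex_d = demean_by(school, exper)
sc_d = demean_by(school, score)
b_fe = (ex_d @ sc_d) / (ex_d @ ex_d)
```

Demeaning sweeps out anything constant within a school, so sorting across schools no longer biases the coefficient; sorting across classrooms within a school, however, survives the transformation, which is what motivates the balanced-classroom sample restriction.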

Coefficients of teacher experience – math (all coefficients are statistically significant)
[Table: coefficients by years of experience (base = 0 years) under three specifications – student covariates, school fixed effects, and the restricted sample with school fixed effects; observations: 60,656 and 25,711.]

Longitudinal – grades 4 and 5
Achievement levels (A_it) or achievement gains (A_it - A_i,t-1). Models 1–3 (of 5):
No fixed effects – 1. Levels (with prior-year achievement): upward-biased coefficients because of teacher-student matching; potential bias from lagged achievement.
With school fixed effects – 2. Levels: better, but the problem of matching within schools remains, and there is potential bias from lagged achievement; the direction of bias is unclear (see earlier paper). 3. Gains: downward bias from the misspecified persistence variable.
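Model 3’s downward bias can be demonstrated in a toy simulation (all numbers hypothetical): when true persistence a < 1 but the gains specification imposes a = 1, positive matching of credentials to prior achievement pushes the credential coefficient down. In this sketch the lagged score is exogenous by construction, so the levels model with the lag recovers b — unlike in real data, where the lag itself can be a source of bias, as the slide notes.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical data with true persistence a = 0.6 and "positive" sorting:
# students with higher prior achievement tend to get more credentialed teachers.
n, a_true, b_true = 20000, 0.6, 0.05
a_prev = rng.normal(0.0, 1.0, size=n)
tq = 0.5 * a_prev + rng.normal(0.0, 1.0, size=n)
a_curr = a_true * a_prev + b_true * tq + rng.normal(0.0, 0.3, size=n)

# Gains model: (A_t - A_t-1) on TQ. Imposing a = 1 leaves (a-1)*A_t-1 in the
# error, which is negatively correlated with TQ here => downward bias.
gain = a_curr - a_prev
b_gain = np.linalg.lstsq(np.column_stack([tq, np.ones(n)]), gain, rcond=None)[0][0]

# Levels model with the lag included recovers b in this toy setup.
X = np.column_stack([a_prev, tq, np.ones(n)])
b_levels = np.linalg.lstsq(X, a_curr, rcond=None)[0][1]
```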

Longitudinal data (cont.)
Models 4 and 5 (preferred) make full use of the longitudinal aspect of the data, with student fixed effects:
4. Levels (but no lagged achievement): lower-bound estimates of the effects of teacher credentials.
5. Gains: upper-bound estimates of the effects of teacher credentials.

Teacher experience
[Table: coefficients from models 4 and 5 (shown as model 4 / model 5) for math and reading, by years-of-experience bands (base = no experience); all are statistically significant.]

High school cross-subject analysis
Subjects – Algebra I, English I, biology, geometry, ELP.
Strategy – at least three test scores for every student; include student fixed effects. Equivalent to estimating: (A_is - A_i*) = b(TQ_is - TQ_i*) + (e_is - e_i*), where A_i* is the mean for the student.
Consider the potentially problematic error term (e_is - e_i*). Think of e as unmeasured student ability. This is a concern if ability for a given student differs by subject AND teachers are distributed in a systematic way by the relative ability of students. Based on empirical tests reported in the paper, we have reasonable confidence in our approach.
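The demeaned equation above corresponds to a within-student transformation. A hypothetical sketch (invented data, three subjects per student): demean both scores and credentials within student, then regress the deviations on each other; the fixed student-ability term drops out of the regression.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical data: each student has one score per subject, a fixed
# unobserved ability, and a teacher credential measure per subject.
n_students, n_subjects, b_true = 2000, 3, 0.04
student = np.repeat(np.arange(n_students), n_subjects)
ability = rng.normal(0.0, 1.0, size=n_students)          # student fixed effect
tq = rng.normal(10.0, 3.0, size=n_students * n_subjects)
score = b_true * tq + ability[student] + rng.normal(0.0, 0.2, size=n_students * n_subjects)

# Within-student demeaning: (A_is - A_i*) and (TQ_is - TQ_i*).
tq_mean = tq.reshape(n_students, n_subjects).mean(axis=1)
sc_mean = score.reshape(n_students, n_subjects).mean(axis=1)
tq_dev = tq - tq_mean[student]
sc_dev = score - sc_mean[student]

# b is the slope of the demeaned regression; ability_i has been swept out.
b_hat = (tq_dev @ sc_dev) / (tq_dev @ tq_dev)
```

Here ability is constant across subjects, so the transformation removes it exactly; the residual concern on the slide is the case where ability varies by subject and correlates with teacher assignment, which no within-student transformation can absorb.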

Coefficients of teacher experience in high school courses
[Table: coefficients by years of experience (base = 0 years), from the model with student fixed effects, including a “more than …” top category. Cf. rising coefficients with teacher fixed effects.]