School Accountability and the Distribution of Student Achievement Randall Reback Barnard College Economics Department and Teachers College, Columbia University

No Child Left Behind
States must adopt accountability systems that assign ratings to schools based on student pass rates on exams in elementary, middle school, and high school grades
A school is not making 'Adequate Yearly Progress' if a pass rate is not sufficiently high; the required pass rate increases each year
Consequences
–Stigma, financial rewards/penalties, loss of local control, changes in property values
–Intra-district public school choice provision
–Tutoring for economically disadvantaged children

Texas Accountability Program
Precursor to No Child Left Behind
Assigns schools one of four ratings based on
–Dropout rates
–Attendance rates
–Fraction of students who pass exams (overall and within subgroups by race and family income)
Testing incentives are based on pass rates, not value-added measures of student achievement

Key Provisions of the Texas Accountability System

Previous Research Related to School Accountability/Minimum Proficiency
Relative performance of students at different points in the distribution (Holmes, 2004; Deere & Strayer, 2001)
Achievement trends
–Grissmer & Flanagan (1998): Math NAEP in TX
–Hanushek & Raymond: Math NAEP
–Carnoy, Loeb, Smith (2002): TX improvements didn't correspond with improved grade transitions, SAT participation, SAT performance
Low-performing versus high-performing schools (Jacob, forthcoming)
States with or without HS graduation exams (Jacobsen, 1993)
Gaming
–Exemptions: Figlio & Getzler, Cullen & Reback
–School meals: Figlio & Winicki
–Disciplinary practices: Figlio

Theoretical Framework
Subject-specific but not student-specific inputs (a_s)
Not subject-specific but student-specific inputs (b_i)
Subject-specific and student-specific inputs (c_is)
Assume only campus-wide Math (m) and Reading (r) pass rates count. Call all other subjects (z).
Schools want to maximize:
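The maximand itself appears on the original slide as an image that does not survive in this transcript. As a rough sketch only (the functional form, the weight lambda, and the resource constraint R are assumptions, not the slide's notation), the school's problem can be written as:

```latex
% Sketch only: V(.) is the school's valuation of its rating, which depends on the
% campus-wide Math and Reading pass rates P_m and P_r; student i's achievement in
% subject s is produced by the inputs a_s, b_i, and c_is; R is the school's
% total resource constraint.
\max_{\{a_s\},\, \{b_i\},\, \{c_{is}\}} \;
  V\big( P_m, \, P_r \big) \;+\; \lambda \sum_i \sum_{s \in \{m,\,r,\,z\}} A_{is}
\quad \text{s.t.} \quad
  \sum_s a_s \;+\; \sum_i b_i \;+\; \sum_{i,s} c_{is} \;\le\; R ,
\qquad A_{is} = f(a_s,\, b_i,\, c_{is})
```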

The Data
Texas Assessment of Academic Skills
–Math tested in grades 3-8 and 10
–Writing tested in grades 4, 8, and 10
–Test documents submitted for every student
–Includes student descriptors
Campus-level data on attendance/dropouts
Texas Learning Index
–Measures how a student performs compared to grade level
–I do not measure test score gains for observations with prior year's scores below 30 or above 84

Pass Rate Probabilities Based on Prior Year Test Score Range (Passing Score = 70)

Estimating the Marginal Benefit to the School from a Moderate Increase in a Student's Expected Performance
(1) estimate the probability that each student passes by grouping students based on their performance during other years
(2) use these student-level pass probabilities to compute the probability that the school will obtain each rating
(3) find the marginal effect of a moderate improvement in the expected achievement of a particular student on the probability that the school obtains the various ratings
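A minimal code sketch of these three steps, under simplifying assumptions (a single pass-rate cutoff determines the rating, student outcomes are independent, and step 3 uses a flat probability boost rather than the truncation rule described on the next slide; all names are illustrative, not the paper's code):

```python
import numpy as np
import pandas as pd

def pass_probabilities(df):
    """Step 1: each student's pass probability = empirical pass rate of
    students with the same prior-year score."""
    return df.groupby("prior_score")["passed"].transform("mean")

def prob_rating(pass_probs, cutoff=0.70, n_sims=20_000, seed=0):
    """Step 2: probability that the school's pass rate clears the cutoff,
    simulated from the student-level pass probabilities."""
    rng = np.random.default_rng(seed)
    draws = rng.random((n_sims, len(pass_probs))) < pass_probs.to_numpy()
    return (draws.mean(axis=1) >= cutoff).mean()

def marginal_benefit(df, student_pos, boost=0.10, cutoff=0.70):
    """Step 3: change in the school's rating probability when one student's
    expected performance improves moderately."""
    p = pass_probabilities(df)
    base = prob_rating(p, cutoff)
    p_boosted = p.copy()
    p_boosted.iloc[student_pos] = min(1.0, p_boosted.iloc[student_pos] + boost)
    return prob_rating(p_boosted, cutoff) - base

# Toy usage with fake data
rng = np.random.default_rng(1)
df = pd.DataFrame({"prior_score": rng.integers(40, 90, size=200),
                   "passed": rng.random(200) < 0.7})
print(marginal_benefit(df, student_pos=0))
```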

How a 'moderate improvement' in a student's achievement is determined
The hypothetical pass probability is found by re-estimating the student's pass probability after dropping the bottom X% of the current year score distribution among students with identical prior year scores
For example: distribution of this year's Math scores for students scoring 53 last year in Math
0%: 36   20%: 49   40%: 55   60%: 59   80%: 70   100%: 86
Actual pass probability = .20
Pass probability with X% set at 20%: .2/.8 = .25
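A small sketch of that truncation rule using the example above (the intermediate scores between the listed percentiles are made up for illustration):

```python
import numpy as np

def truncated_pass_prob(current_scores, passing_score=70, drop_frac=0.20):
    """Pass probability after dropping the bottom drop_frac of the current-year
    score distribution among students with identical prior-year scores."""
    scores = np.sort(np.asarray(current_scores, dtype=float))
    cut = int(np.floor(drop_frac * len(scores)))
    return float((scores[cut:] >= passing_score).mean())

# Illustrative sample consistent with the slide's percentiles for students who
# scored 53 last year (values between the listed percentiles are invented):
scores = [36, 42, 49, 52, 55, 57, 59, 65, 70, 86]
print((np.asarray(scores) >= 70).mean())   # actual pass probability = 0.20
print(truncated_pass_prob(scores))         # hypothetical = 0.20 / 0.80 = 0.25
```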

Dependent Variable
Year-to-year variation in test scores might be greater at certain points of the achievement distribution
Value-added models examining distributional effects SHOULD NOT simply look at changes in levels or in relative place in the distribution
Instead, use a conditional Z-score: the Z-score among students with similar prior year scores
This way, results are compared to typical progress at that place in the test score distribution
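A brief sketch of the conditional Z-score construction (column names are assumptions, not the paper's):

```python
import pandas as pd

def conditional_z(df, score_col="score", prior_col="prior_score"):
    """Z-score of the current-year score computed within groups of students who
    had the same prior-year score, so gains are measured against typical
    progress at that point in the test-score distribution."""
    grouped = df.groupby(prior_col)[score_col]
    return (df[score_col] - grouped.transform("mean")) / grouped.transform("std")

# Toy usage
df = pd.DataFrame({"prior_score": [53, 53, 53, 60, 60, 60],
                   "score": [49.0, 55.0, 70.0, 58.0, 66.0, 74.0]})
print(conditional_z(df))
```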

Empirical Model #1: Campus-Year Fixed Effects
Student i in grade g during year y at school s
S_{i,t} includes control variables for student characteristics:
–Cubic terms for prior year scores in the other subject
–Racial dummy variables, low-income family dummy, and race-income interactions
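The estimating equation on the slide is an image that does not survive in the transcript; a specification consistent with the surrounding text would look roughly like the following (the notation is an assumption):

```latex
% Z_{igys}: conditional Z-score of student i, grade g, year y, school s;
% Incentive_{igys}: the student-level marginal accountability incentive;
% S_{i,t}: the student controls listed above; delta_{sy}: campus-year fixed effect.
Z_{igys} \;=\; \beta \,\mathit{Incentive}_{igys} \;+\; S_{i,t}\,\gamma
          \;+\; \delta_{sy} \;+\; \varepsilon_{igys}
```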

Achievement Gains and Marginal Accountability Incentives within Schools and within the Same Year

Model #2: Response to Infra-marginal Incentives (Cross-sectional Comparisons)
Schools might consider the impact of improving the expected performance of 5% of the students
Define the group-level incentive as the marginal change in the school's probability of a higher rating if all students in the 'group' are expected to do better (see the sketch below)
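Building on the earlier code sketch (this assumes pass_probabilities() and prob_rating() from that sketch are in scope, and again substitutes a flat probability boost for the paper's truncation rule), the group-level incentive could be computed as:

```python
def group_incentive(df, group_mask, boost=0.10, cutoff=0.70):
    """Infra-marginal incentive: change in the school's rating probability when
    every student in the group (e.g., a 5% slice of the students) is expected
    to do moderately better. group_mask is a boolean indexer over df's rows."""
    p = pass_probabilities(df)
    base = prob_rating(p, cutoff)
    p_boosted = p.copy()
    p_boosted[group_mask] = (p_boosted[group_mask] + boost).clip(upper=1.0)
    return prob_rating(p_boosted, cutoff) - base
```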

Achievement Gains and Infra-marginal Accountability Incentives

Model #3: Incentive to Improve Performance within a Grade Level
Schools might use inputs that simultaneously affect multiple students
Define the grade-level incentive as the change in the school's probability of receiving a higher rating if all students in student i's grade at the school improve

Approximate Effect Sizes (SD Change in Statewide Achievement Distribution from a 1 SD Increase in Accountability Incentive)

Effects of Sample Selection Due to Student Exemptions & Grade Repetition
Exemptions
–Negative relationship between the student-level accountability incentive and the likelihood that a student is exempted from the accountability pool
–Suggests that the estimated effect of the student-level accountability incentive may understate the true effect
–Also suggests that the estimated effects of grade-level incentives overstate the effects for the lowest achievers and understate them for others
Grade Retention
–Small, positive relationship between the student-level accountability incentives and the probability of grade retention
–Effect on main results is unclear, but likely small

Conclusions
Schools respond to the specific incentives of a rating system
They appear to respond with broad changes in teaching or resource allocation rather than narrowly-targeted changes
Current findings may understate distributional effects
–High achievers (top 50% Reading, top 33% Math) are not included
–Responses may be permanent changes rather than reactions to short-run incentives
NCLB-style ratings: are they good or bad?... it depends on one's preferences