Impact Evaluation of the Potential for Teacher Incentives to Improve Outcomes
Deon Filmer, Development Research Group, The World Bank
Evidence-Based Decision-Making in Education Workshop
Africa Program for Education Impact Evaluation (APEIE)
Accra, Ghana, May

Teacher incentives
How can we ensure that teachers exert the greatest possible effort?

Teacher absenteeism
[Chart: percent of teachers absent on the day of an unannounced visit to the school]

Teacher incentives
Intrinsic motivation vs. extrinsic motivation
Models for improving incentives:
– Higher standards for entry, higher average pay and pay gradients, and career progression linked to skills and performance
– "Contract teachers": local hiring, no tenure, performance evaluated directly by the school community
– "Pay for performance": bonus pay linked to effort and/or results

How could teacher incentives lead to better outcomes?
Quality of staff
– At entry: accreditation and merit-based incentives lead to higher-quality teaching professionals joining the education system
– In remote areas: locality-based financial incentives improve the equity of teacher placements
Increased attendance and effort
– Decentralized hiring and monitoring of teacher performance lead to higher teacher attendance rates and greater teacher effort
– Pay-for-performance systems incentivize improved quality and quantity of teaching (at least in the short term) and thus improve student test scores
Sustained effort
– Teacher career advancement incentives stimulate sustained teacher effort, thereby improving student performance
The (empirically testable) assumption is that increased quality and effort increase learning

Focus today: teacher pay-for-performance schemes
Linking pay to performance
– As measured by tests
– Note: pay could also be linked to effort, e.g. presence in the classroom at the beginning and end of the day

Teacher pay-for-performance schemes: potential downsides
Assumes teachers know how to improve their teaching
Difficulty of accounting for the characteristics of the student body
Perverse impacts at the level of students:
– Teaching to the test
– Manipulating who takes the test
Perverse impacts at the level of teachers:
– Demoralization
– Undermining intrinsic motivation
Impact evaluation will help us understand the trade-offs between the potential upsides and downsides of incentives

Case Study 1: Teacher incentives in India
"Teacher Performance Pay: Experimental Evidence from India"
By Karthik Muralidharan (University of California, San Diego) and Venkatesh Sundararaman (World Bank)

Location of study
Indian state of Andhra Pradesh (AP)
– 5th most populous state of India; population of 80 million
– 23 districts (2–4 million people each)
– Close to all-India averages on many measures of human development

India vs. AP comparison (table):
– Gross enrollment, ages 6–11 (%)
– Literacy (%)
– Teacher absence (%)
– Infant mortality (per 1,000): 63 vs. 62

Incentive design
Teachers were given bonus payments, over and above their regular salary, based on the average improvement in test scores of all students in their grade/school over the baseline
– Subjects covered were math and language
– Assessment papers were designed by an independent testing agency (EI)
– All assessments were conducted by an independent NGO (APF)
Bonus formula (a stylized calculation is sketched below):
– Rs. 500 bonus for every 1 percentage point improvement in average scores
– Calibrated to be around 3% of annual pay (and equal in cost to the input treatments)
Both group-level and individual-level incentives were studied
– Free-riding / peer monitoring / gains to cooperation
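A minimal sketch of the bonus formula on this slide. Only the Rs. 500 per percentage point rate and the improvement-over-baseline logic come from the slide; the function name, the 0–100 score scale, the no-negative-bonus rule, and the example numbers are illustrative assumptions.

```python
# Stylized sketch of the AP bonus formula described on the slide.
# Assumption (not from the source): scores are percentages on a 0-100 scale.

RS_PER_PERCENTAGE_POINT = 500  # Rs. 500 per 1 percentage point gain (from the slide)

def teacher_bonus(baseline_avg: float, endline_avg: float) -> float:
    """Bonus in rupees for the average test-score gain over baseline."""
    gain = endline_avg - baseline_avg                 # percentage-point improvement
    return max(gain, 0.0) * RS_PER_PERCENTAGE_POINT   # assume no negative bonus

# Example: class average rises from 42.0 to 48.5 -> 6.5 points -> Rs. 3,250
print(teacher_bonus(42.0, 48.5))
```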

Design overview (number of schools per cell)

Inputs (unconditional)    | Incentives (conditional on improvement in student learning)
                          | None                  | Group bonus | Individual bonus
None                      | Control (100 schools) | 100 schools | 100 schools
Extra contract teacher    | 100 schools           |             |
Extra block grant         | 100 schools           |             |

Summary of experimental design
– Study conducted across a representative sample of 500 primary schools in 5 districts of AP
– Baseline tests conducted in these schools (June/July 2005)
– Stratified random allocation of 100 schools to each treatment, with 2 schools in each mandal assigned to each treatment (August 2005); a sketch of this assignment logic follows below
– Process variables monitored over the course of the year via unannounced monthly tracking surveys (September 2005 – February 2006)
– Two rounds of follow-up tests conducted to assess the impact of the various interventions on learning outcomes (March/April 2006)
– Teachers interviewed after the program but before outcomes were communicated to them (August 2006)
– Bonus payments provided and continuation of the program communicated (September 2006)
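A minimal sketch of the stratified assignment described above: within each mandal, sampled schools are shuffled and 2 are allocated to each of the 5 treatment arms. The data layout and the assumption of exactly 10 sampled schools per mandal are illustrative simplifications, not details from the study.

```python
import random

# Treatment arms taken from the design overview slide.
ARMS = ["control", "group_bonus", "individual_bonus", "contract_teacher", "block_grant"]

def assign_treatments(schools_by_mandal, seed=0):
    """Stratified assignment: within each mandal, 2 schools go to each arm.

    `schools_by_mandal` maps a mandal name to a list of 10 school IDs
    (a simplifying assumption for this sketch).
    """
    rng = random.Random(seed)
    assignment = {}
    for mandal, schools in schools_by_mandal.items():
        schools = list(schools)
        rng.shuffle(schools)
        for i, school in enumerate(schools):
            assignment[school] = ARMS[i // 2]  # consecutive blocks of 2 per arm
    return assignment

# Toy example with two mandals of 10 hypothetical schools each.
example = {
    "mandal_A": [f"A{i}" for i in range(10)],
    "mandal_B": [f"B{i}" for i in range(10)],
}
print(assign_treatments(example))
```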

Results
[Chart]
Note: a smaller impact was also found on non-incentivized subjects (science, social studies)

Group versus individual incentives
[Chart]

How did teacher behavior change?
[Chart]

Summary of results
– Incentive schools perform significantly better (0.22 SD)
  – Improvements are across the board (all grades, districts, and baseline scores)
  – Limited evidence of heterogeneous treatment effects
  – Children in incentive schools perform better on both the mechanical and conceptual components of the test, and also on non-incentive subjects
– No difference between group and individual incentives in the first year, but in the second year the individual incentives start to outperform the group incentives
– Teacher absence does not change, but incentive-school teachers report higher levels of teaching activity conditional on attendance
– These differences in behavior are correlated with learning outcomes
– Much more cost-effective than inputs of the same value

Case Study 2: Teacher incentives in Brazil
"Encouraging Quality: Evaluating the Effects of a Bonus for Performance in Education in Brazil"
By Claudio Ferraz (PUC-Rio) and Barbara Bruns (World Bank)

Brazil: the study aims to understand…
– The effects of introducing a system of bonuses for students' performance based on standardized tests
– Variation in the impact of the bonus according to the characteristics of schools (e.g. social cohesion, teacher profiles)
– The strategies schools used to improve performance

Study area: Pernambuco State, Brazil

Features of the Brazil case study
– The Educational Performance Bonus Program in Pernambuco was created by a 2008 law
– Its goal was to create incentives for improving the quality of education by rewarding employees of schools that meet school-specific performance targets
– In the first year, targets were based on an index* of performance in 2005 (the last available information), defining three groups of schools:
  – High performance
  – Mid performance
  – Low performance
* Index = average test score × pass rate (a stylized calculation is sketched below)
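A minimal worked example of the index defined in the footnote (average test score times pass rate). The function name, the score scale, and the example values are illustrative assumptions, not figures from the program.

```python
def school_index(avg_test_score: float, pass_rate: float) -> float:
    """Index from the slide's footnote: average test score times pass rate.

    Assumptions (not from the source): avg_test_score is on the assessment's
    own scale and pass_rate is a fraction between 0 and 1.
    """
    return avg_test_score * pass_rate

# Example: a school averaging 180 points with an 85% pass rate.
print(school_index(180.0, 0.85))  # -> 153.0
```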

Features of the Brazil case study
– System-wide implementation (not an experiment)
– Causal analysis of impacts is possible using:
  – Differences-in-differences
  – Regression discontinuity designs exploiting the annual targets and the rules for the bonus

Impact evaluation methodology
Differences-in-differences
– Compare the performance of state schools in Pernambuco with state schools in neighbouring states, before the bonus program ( ) and after (2009); a simple numeric sketch follows below
Regression discontinuity
– Targets are set according to whether the school was in the low, middle, or high category:
  – Low: reach the average score for the state of Pernambuco
  – Middle: reach 10% over the average index level for the Northeast region
  – High: reach the average index level for all Brazilian states
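A minimal sketch of the difference-in-differences comparison described above, computed from group means. The data layout (records with state, year, and score), the pre-period year, and the comparison state are illustrative assumptions; the slide only specifies 2009 as the post-bonus year and Pernambuco as the treated state.

```python
from statistics import mean

def diff_in_diff(records, treated_states, pre_years, post_years):
    """DiD estimate: (treated post - treated pre) - (comparison post - comparison pre).

    `records` is a list of dicts with keys 'state', 'year', and 'score'
    (an illustrative layout, not the study's actual data structure).
    """
    def avg(treated: bool, years) -> float:
        scores = [r["score"] for r in records
                  if (r["state"] in treated_states) == treated and r["year"] in years]
        return mean(scores)

    treated_change = avg(True, post_years) - avg(True, pre_years)
    comparison_change = avg(False, post_years) - avg(False, pre_years)
    return treated_change - comparison_change

# Toy example: Pernambuco (PE) vs. a hypothetical neighbouring state (PB).
data = [
    {"state": "PE", "year": 2007, "score": 230}, {"state": "PE", "year": 2009, "score": 245},
    {"state": "PB", "year": 2007, "score": 228}, {"state": "PB", "year": 2009, "score": 233},
]
print(diff_in_diff(data, treated_states={"PE"}, pre_years={2007}, post_years={2009}))  # -> 10
```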

Illustration of the RD design
[Chart: goals for 2008 (in Portuguese) for each school, according to its 2005 level; the design compares outcomes in schools on one side of the threshold with outcomes in schools on the other side]

Bonus determination
– The proportion of the goal reached by a school is calculated as PI = (actual progress) / (required progress)
– Schools reaching at least 50% of the goal earn the bonus
– The bonus amount is determined by the initial salary and the percentage of the target achieved
(A stylized version of this rule is sketched below.)
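A minimal sketch of the bonus rule on this slide: compute PI as actual over required progress, pay nothing below 50% of the goal, and scale the payout with the share of the target achieved. The exact payout formula, the `bonus_share` parameter, and the cap at PI = 1 are assumptions for illustration; the slide only states the PI definition, the 50% threshold, and the two determinants (initial salary and percentage of target achieved).

```python
def proportion_of_goal(actual_progress: float, required_progress: float) -> float:
    """PI = (actual progress) / (required progress), as defined on the slide."""
    return actual_progress / required_progress

def bonus_payment(initial_salary: float, actual_progress: float,
                  required_progress: float, bonus_share: float = 1.0) -> float:
    """Stylized payout: zero below 50% of the goal, otherwise proportional to PI.

    `bonus_share` (the fraction of the initial salary paid at 100% of the
    target) and the cap at PI = 1 are illustrative assumptions.
    """
    pi = proportion_of_goal(actual_progress, required_progress)
    if pi < 0.5:  # the 50% eligibility threshold from the slide
        return 0.0
    return initial_salary * bonus_share * min(pi, 1.0)

# Example: a school achieving 8 points of a required 10-point gain (PI = 0.8).
print(bonus_payment(initial_salary=2000.0, actual_progress=8.0, required_progress=10.0))
```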

Brazil: outcome measures
– Student learning and repetition, teacher attendance, and school-level planning activities
– School-level trust and social capital
– Teacher behavior "inside the black box", via standardized classroom observations
– Dynamic effects of schools receiving/not receiving the bonus on subsequent years' strategy and effort… and do schools know what to do?

Preliminary results on Portuguese test scores
[Chart]

Brazil: the "Stallings" method of classroom observation
Used in all study schools to measure potential changes in in-classroom behavior

Brazil: example of the data generated
[Chart]

Thank you