Evaluating a test-based subsidy program for low-cost private schools: Regression-discontinuity evidence from Pakistan
Felipe Barrera-Osorio, Dhushyanth Raju

Presentation transcript:

Evaluating a test-based subsidy program for low-cost private schools: Regression-discontinuity evidence from Pakistan
Felipe Barrera-Osorio, Dhushyanth Raju
HDN Learning Week, November 10, 2008

Presentation plan
– Program context and motivation
– Program design
– Research questions
– Regression-discontinuity research design
– Data and sample
– Findings
– Caveats and potential threats to internal validity

Foundation-Assisted Schools (FAS) program: Overview
Administration: Designed and administered by a semi-autonomous organization, the Punjab Education Foundation (PEF).
Objectives: Increase school participation among children from disadvantaged backgrounds, as well as their achievement levels.
Timeline: FAS program established in 2005 and expanded in phases (4 phases completed; applicant screening for phase 5 underway).
Coverage:
– Districts: 18 of the 35 districts in Punjab; 87% of schools are in 7 districts.
– Schools: 1,082 low-cost private schools (grades 1-10).
– Students: 474,000.
– Largest PPP program in Pakistan.
Program school characteristics:
– Mean school size: 351 students.
– Majority of schools are middle level (59%), coeducational (83%), registered (87%), and in rural areas (55%).

FAS program: Benefits
Subsidy: Rs. 300 (US$4.3) per student per month, transferred every month for 12 months of the year. Subsidy level set at the upper end of the price range for the low-cost sector. Use of the subsidy is largely unfettered.
Teacher bonus: Rs. 10,000 (US$143) per teacher, for a maximum of 5 teachers, in each program school that attains a minimum pass rate of 90% on the Quality Assurance Test (QAT). Offered once a year. Bonus size: 370% of the mean maximum monthly teacher salary at baseline.
School bonus: Rs. 50,000 (US$714) to the school in each district with the highest QAT pass rate. Offered once a year. Bonus size: 76% of the mean monthly subsidy payment to schools, given mean enrollment size at baseline.

FAS program: Continued benefit eligibility conditions
– Maintain a minimum enrollment of 100 students.
– Eliminate all tuition and fees for all students.
– Place a PEF-issued signboard outside the school gate announcing tuition-free schooling and providing PEF contact information.
– Report enrollment figures to PEF every month.
– Attain a minimum student pass rate of 67% on the Quality Assurance Test (QAT).
First violation: penalties (e.g., frozen enrollment). Second violation: permanent program disqualification.

FAS program: Continued benefit eligibility conditions (continued)
– Maximum student-teacher and student-classroom ratios of 35:1.
– One class per classroom in any period.
– Registration of the school with local authorities within one year of program entry.
– Maintenance or upgrading of school infrastructure quality per PEF instructions.
– Adequate furniture and teaching and learning materials and tools, as determined by PEF.
These conditions are applied less stringently: schools that violate them are given a grace period within which to comply (determined on a case-by-case basis).

FAS program: Quality Assurance Test (QAT)
– Used to assess eligibility for continued benefit receipt.
– Developed in-house by PEF; administered by contracted organizations.
– Based on the common syllabi and textbooks used in low-cost private schools.
– Selected subjects: English, Urdu, Mathematics, and Science.
– Tests almost the full range of cognitive learning levels (from knowledge to synthesis).
– Offered twice a year (in Oct/Nov and Mar/Apr). To date, 5 QATs offered.
– Offered to two to three grades (number depends on school level).
– At least 67% of students in the tested grades must score 40% or higher on the QAT.

FAS program: Shortlisting Quality Assurance Test (SLQAT)
– Used by PEF to assess initial eligibility for FAS program benefits.
– Pared-down version of the QAT (fewer questions, fewer subjects, lower levels of learning only).
– Developed and administered by PEF.
– Offered to two to three grades (number depends on school level).
– At least 67% of students tested must score 33% or higher on the test.
– Used as the final screening device for phase-3 and phase-4 applicants.
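A minimal sketch of this school-level pass-rate rule, assuming a hypothetical table of individual student scores (the column names, figures, and pandas layout are illustrative, not PEF's actual data):

import pandas as pd

# Hypothetical student-level SLQAT scores (percent correct).
scores = pd.DataFrame({
    "school_id": [1, 1, 1, 2, 2, 2],
    "score_pct": [55, 30, 40, 20, 25, 70],
})

STUDENT_PASS_MARK = 33    # SLQAT: a student passes with 33% or higher (the QAT uses 40%)
SCHOOL_PASS_CUTOFF = 67   # share of passing students a school needs for eligibility

# School-level pass rate: share of tested students at or above the mark.
pass_rate = (
    scores.assign(passed=scores["score_pct"] >= STUDENT_PASS_MARK)
          .groupby("school_id")["passed"].mean() * 100
)
eligible = pass_rate >= SCHOOL_PASS_CUTOFF
print(pass_rate)
print(eligible)

This same school-level pass rate is the running variable in the regression-discontinuity design described below.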

Research questions
What is the causal effect of the FAS program on:
– Number of students
– Average student learning
– Inputs: teachers, classrooms, blackboards, toilets
– Student-input ratios
Student learning data are not yet available.

Identification
Entry process:
– Schools apply when a call for applications is issued.
– Schools with qualifying applications are subject to a physical inspection by PEF.
– In phases 3 and 4, schools that pass the physical inspection are offered the SLQAT.
Assignment to the FAS program (treatment) is based ultimately on the school's SLQAT pass rate: if the school attains at least 67% (the cutoff), it is eligible; otherwise it is not.
Virtually all schools that attained the cutoff accepted treatment (take-up rate of approximately 100%). No treatment dropouts to date.

Identification: Phase-4 SLQAT takers
– Applicant schools with pass rates at or above the cutoff were eligible for treatment and took it up.
– Applicant schools with pass rates below the cutoff were ineligible and not offered treatment.
– The data neatly fit a sharp RD design: the probability of treatment is zero below the cutoff and one above it.
– Treatment parameter: ATT at the cutoff.
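In standard sharp-RD notation (a textbook expression, not one taken from the slides): with running variable X (the SLQAT pass rate), cutoff c = 67, and treatment D = 1{X >= c}, the estimated quantity is the jump in the conditional mean of the outcome Y at the cutoff,

\tau_{SRD} = \lim_{x \downarrow c} E[Y \mid X = x] - \lim_{x \uparrow c} E[Y \mid X = x],

which, given full take-up above the cutoff, is interpreted as the ATT for schools at the cutoff.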

Identification: Phase-3 SLQAT takers
– Phase-3 SLQAT failers had the opportunity to reapply and take the phase-4 SLQAT.
– Phase-3 SLQAT takers can therefore be disaggregated into three subgroups: (1) schools that did not enter the program in either phase 3 or phase 4; (2) schools that entered in phase 3; (3) schools that entered in phase 4.
– Because the schools in (3) failed the phase-3 SLQAT, they can be treated as crossovers.
– The data fit a fuzzy RD design with one-way noncompliance (crossovers): the probability of treatment is positive but less than one below the cutoff, and one above it.
– Treatment parameter: LATE at the cutoff.
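In the corresponding fuzzy-RD notation (again a standard expression rather than one from the slides), the LATE at the cutoff is the jump in the outcome divided by the jump in the probability of treatment:

\tau_{FRD} = \frac{\lim_{x \downarrow c} E[Y \mid X = x] - \lim_{x \uparrow c} E[Y \mid X = x]}{\lim_{x \downarrow c} \Pr(D = 1 \mid X = x) - \lim_{x \uparrow c} \Pr(D = 1 \mid X = x)}.

With one-way noncompliance of this kind, the denominator is one minus the crossover rate just below the cutoff, so a high crossover rate shrinks the denominator and inflates the standard errors (consistent with the weak identification reported in the findings).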


Estimation
Both data designs:
– Estimator: ATT and LATE at the cutoff estimated nonparametrically using local linear regression (LLR).
– Kernel (weighting function): triangular.
– Bandwidth (window width): optimal size heuristically set so that the kernel covers at least 30 observations on either side of the cutoff.
– Sensitivity analysis: fixed kernel; selected increases in bandwidth size.
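A minimal sketch of the local-linear step for the sharp design, assuming a school-level data frame with columns pass_rate (the running variable, in percent) and y (an outcome such as enrollment); the column names, the statsmodels implementation, and the fixed bandwidth of 15 points are illustrative assumptions, not the authors' actual code:

import numpy as np
import statsmodels.api as sm

CUTOFF = 67.0

def llr_limit(df, h, above):
    """Local linear fit on one side of the cutoff with a triangular kernel;
    the fitted intercept is the estimated limit of E[y | pass_rate] at the cutoff."""
    side = df[df["pass_rate"] >= CUTOFF] if above else df[df["pass_rate"] < CUTOFF]
    dist = side["pass_rate"] - CUTOFF
    weights = np.clip(1 - np.abs(dist) / h, 0, None)   # triangular weights, zero outside the window
    keep = weights > 0
    X = sm.add_constant(dist[keep])
    fit = sm.WLS(side.loc[keep, "y"], X, weights=weights[keep]).fit()
    return fit.params["const"]

def sharp_rd_effect(df, h=15.0):
    """Sharp-RD estimate at the cutoff: right-hand limit minus left-hand limit."""
    return llr_limit(df, h, above=True) - llr_limit(df, h, above=False)

The fuzzy case would divide the same outcome discontinuity by the analogous discontinuity in the treatment indicator (or, equivalently, use local two-stage least squares).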

Baseline data
Source: administrative data.
– Data on school characteristics and non-learning outcomes for all phase-3 and phase-4 applicants obtained from applications maintained electronically by PEF.
– Data on SLQAT pass rates and average school test performance for phase-3 and phase-4 SLQAT takers obtained from individual student test score data maintained electronically by PEF.
– Test score and application databases linked by visually matching on school name and location. 94% and 97% of schools in the test database were linked to the application database in phases 3 and 4, respectively.
Full SLQAT-taking samples: Phase 3: 747. Phase 4:

Endline data
Source: phone interviews (schools provided phone numbers in their applications).
Sample: phase-3 and phase-4 test takers with pass rates within ±15 percentage points of the cutoff (drawn from the linked application-test score database). Referred to as the neighborhood samples.
Original neighborhood sample sizes: Phase 3: 268 (36% of the SLQAT-taking sample). Phase 4: 319 (38%).
Treatment period: data collected 14 and 10 months after the first subsidy payment to phase-3 and phase-4 entrants, respectively (partially spans two academic years).
Unit nonresponse rate: Phase 3: 28%. Phase 4: 26%.
Nonresponse bias analysis: no significant correlation with treatment assignment; no strong evidence of correlations with school characteristics measured at baseline.
Neighborhood sample sizes with endline data: Phase 3: 192. Phase 4:
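One simple way to run such a nonresponse check, sketched here on synthetic data (the column names and the OLS specification are assumptions, not the authors' analysis):

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the neighborhood sample.
rng = np.random.default_rng(0)
n = 300
sample = pd.DataFrame({
    "responded": rng.integers(0, 2, n),             # 1 if the endline phone interview was completed
    "above_cutoff": rng.integers(0, 2, n),          # treatment assignment at the SLQAT cutoff
    "baseline_enrollment": rng.normal(230, 60, n),  # a baseline school characteristic
})

# Regress the response indicator on assignment and baseline characteristics;
# insignificant coefficients correspond to the "no correlation" pattern reported on the slide.
check = smf.ols("responded ~ above_cutoff + baseline_enrollment", data=sample).fit()
print(check.summary())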

Impact findings
Application of the partially fuzzy RD design to the phase-3 SLQAT neighborhood sample:
– No evidence of significant program impacts.
– RD LATE estimates are sensitive to bandwidth choice.
– Empirical standard errors are inordinately large, suggesting weak identification.
Application of the sharp RD design to the phase-4 SLQAT neighborhood sample:
– Large positive effects on enrollment, teachers, classrooms, and blackboards, within a short treatment period of 10 months.
– RD ATT estimates are somewhat sensitive to bandwidth choice.
– Conservative estimates: +85 children (37% relative to the baseline mean); +3.4 teachers (37%); +4 classrooms (47%); and +2.8 blackboards (27%).

LLR estimates of post-treatment discontinuities (figure slide)

Cost-effectiveness findings
Annual cost of one additional student induced by the FAS program:
– Underlying data: 3,600 rupees per student per year; mean baseline enrollment in phase-4 neighborhood schools: 232; impact of 85 students.
– 13,426 rupees (US$189) to induce an additional student per year.
Annual cost per student of increasing enrollment by 1%:
– Underlying data: 3,600 rupees per student; impact: 37%.
– 97 rupees (US$1.4).
– Among the lowest cost-effectiveness ratios estimated.
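The arithmetic behind these two ratios, reproduced as a quick check using only the figures on the slide (treating the subsidy as paid for both baseline and induced students is an assumption of this sketch):

subsidy_per_student_year = 3_600    # rupees
baseline_enrollment = 232           # mean, phase-4 neighborhood schools
enrollment_impact = 85              # additional students induced by the program

# Annual subsidy bill per school divided by the number of induced students.
cost_per_induced_student = (
    subsidy_per_student_year * (baseline_enrollment + enrollment_impact) / enrollment_impact
)
print(round(cost_per_induced_student))        # about 13,426 rupees (US$189)

# Annual cost per student of raising enrollment by 1%, given the 37% impact.
print(round(subsidy_per_student_year / 37))   # about 97 rupees (US$1.4)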