TRANSLATING RESEARCH INTO ACTION
Randomized Evaluation: Start-to-finish
Abdul Latif Jameel Poverty Action Lab
povertyactionlab.org
Course Overview
1. Why evaluate? What is evaluation?
2. Outcomes, indicators, and measuring impact
3. Impact evaluation – why randomize
4. How to randomize
5. Sampling and sample size
6. Implementing an evaluation
7. Analysis and inference
8. Randomized Evaluation: Start-to-finish
The Setting
– Punjab, Pakistan
– LEAPS
– Information dissemination
– Educational market
The Need
– Primary enrollment is low
– Quality of education is poor
– Learning outcomes are poor
– The private school revolution
– School fees not reflective of quality
– Public vs. private differential
Learning Outcomes: Math
Children are performing significantly below curricular standards in math.
Learning Outcomes: Urdu
– Most children manage only basic recognition of simple words with pictures
– Only 33% can make a sentence with common words
Learning Outcomes: English
– Most children can recognize letters of the alphabet
– Only 29% can complete a simple word
Public-Private Gap
Children in private schools are years ahead of children in government schools.
Program Theory
– Demand: informed decision making
– Supply: price-adjusted quality
– Investment
  – By households
  – By schools
Report Cards: Information Provided
– Child report card
  – Child test score and quintile rank
  – School average test score
  – Village average test score
– School report card
  – Average score and quintile rank for all schools in the village
  – Number of children tested
– Instructions on how to read the report card
Child Report Card
School Report Card
Why Evaluate?
– Need to know what we are diving into
– Theory is great, but empirical evidence is even better
– Informed policy making now saves "face-saving" later
– This works! How can we make it work better?
Goals
– Informed decision-making
– Better education quality
– Better learning outcomes
– More competitive educational market
– Better insight into the educational system
Measurement
– Administered tests to children
  – English, Urdu, Math
– Surveyed schools and teachers
  – Infrastructure, prices, costs, facilities
– Surveyed households
Planning and Design
– Identify the problem and the proposed solution
  – Define the problem through qualitative work and your own academic background research
  – Define the intervention
  – Learn the key "hurdles" in the design of operations
– Identify key players
  – Top management
  – Field staff
  – Donors
Planning and Design (continued)
– Identify key operational questions to include in the study
  – How to best present the information?
  – How to ensure the information is comprehensible?
  – How to accurately measure outcomes?
    – Testing tactics
    – Avoiding cheating
  – Types or extent of training?
Process
– Extensive piloting
  – Pilots vary in size and rigor
  – Pilots and qualitative steps are important
  – Sometimes a "pilot" is the evaluation
  – Other times they are pilots for the evaluation
Why Randomize?
The need for an appropriate counterfactual.
Planning and Design
– Design the randomization strategy
  – Basic strategy
  – Sample frame
  – Unit of randomization
  – Stratification
– Define the data collection plan
Study Design: Basic Strategy
Villages, stratified by district, are randomly assigned to either the treatment group (which receives report cards) or the control group.
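The assignment above (villages grouped into district strata, then randomly split within each stratum) can be sketched as follows. The village IDs, district names, and seed are illustrative placeholders, not the study's actual data:

```python
import random

def stratified_assignment(villages, seed=0):
    """Randomly assign villages to treatment/control within each district.

    `villages` is a list of (village_id, district) pairs. Within each
    district, half the villages (one extra to treatment when the count
    is odd) receive report cards.
    """
    rng = random.Random(seed)  # fixed seed keeps the draw reproducible and auditable
    by_district = {}
    for vid, district in villages:
        by_district.setdefault(district, []).append(vid)

    assignment = {}
    for district, vids in by_district.items():
        rng.shuffle(vids)                 # random order within the stratum
        n_treat = (len(vids) + 1) // 2    # half to treatment, rounded up
        for i, vid in enumerate(vids):
            assignment[vid] = "treatment" if i < n_treat else "control"
    return assignment

# Hypothetical example: 6 villages across 2 districts
villages = [(f"V{i}", "District A" if i < 3 else "District B") for i in range(6)]
assignment = stratified_assignment(villages)
```

Stratifying before assignment guarantees the treatment/control split is balanced within every district, rather than only in expectation.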
Study Design: Randomization Unit
– School? Village?
– The village is the educational marketplace: a closed and complete market
– Sample size?
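When randomizing at the village rather than the child level, the sample-size question depends on how correlated outcomes are within a village. A rough sketch of the standard cluster-randomization calculation follows; the effect size, intra-cluster correlation, and children-per-village numbers are purely illustrative assumptions, not the study's parameters:

```python
import math

def clusters_needed(delta, icc, m, ):
    """Approximate villages per arm needed to detect an effect of `delta`
    standard deviations when randomizing at the village level.

    Uses the standard two-sample normal-approximation formula, inflated
    by the design effect 1 + (m - 1) * icc, where m is the number of
    children tested per village. z-values are hardcoded for a two-sided
    5% test with 80% power.
    """
    z_alpha, z_beta = 1.96, 0.84
    n_individual = 2 * (z_alpha + z_beta) ** 2 / delta ** 2  # children per arm, outcome in SD units
    deff = 1 + (m - 1) * icc                                 # inflation from clustering
    return math.ceil(n_individual * deff / m)

# Illustrative inputs: 0.2 SD effect, ICC of 0.1, 20 tested children per village
n_villages = clusters_needed(delta=0.2, icc=0.1, m=20)
```

The design effect is why village-level randomization needs many more children than child-level randomization would: with an ICC of 0.1 and 20 children per village, each village contributes far less than 20 independent observations.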
Study Design: Sample Frame

Unit      | Number | Instrument   | Criteria
Village   |        | Census       | At least one private school
School    | 823    | Survey       | Government and private
Child     | 12,000 | Test, Survey | Grade III
Teacher   | 5,000  | Survey       | Current or previous, head teacher, Grade III teacher
Household | 1,800  | Survey       | Has Grade III children; school-age children not in school
A Typical Village in the Sample
Baseline Survey: Two Purposes
– Randomization balance
  – Compare treatment and control
– Pre-intervention measurements, in order to measure changes in outcomes:
  – Education quality
  – School fees
  – Learning outcomes
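A balance check of the kind described (comparing treatment and control on a baseline variable) can be sketched with a Welch t-statistic. The baseline scores below are made-up illustrative values:

```python
import math
import statistics

def balance_check(treat, control):
    """Compare a baseline variable across arms.

    Returns (difference in means, Welch t-statistic). A |t| above
    roughly 1.96 flags an imbalance worth examining before attributing
    endline differences to the intervention.
    """
    m_t, m_c = statistics.fmean(treat), statistics.fmean(control)
    v_t, v_c = statistics.variance(treat), statistics.variance(control)
    se = math.sqrt(v_t / len(treat) + v_c / len(control))  # Welch standard error
    t = (m_t - m_c) / se
    return m_t - m_c, t

# Hypothetical baseline test scores (in SD units) for the two arms
treat = [0.12, -0.30, 0.05, 0.40, -0.11, 0.08]
control = [0.10, -0.25, 0.00, 0.35, -0.20, 0.02]
diff, t = balance_check(treat, control)
```

In practice this comparison is run for every key baseline variable (scores, fees, enrollment), and with clustering the standard errors should account for the village-level design; the sketch above ignores that for brevity.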
Implementation
1. Choose a representative sample and collect baseline data
2. Randomize
   – Real-time randomization
   – All-at-once randomization
   – Waves
3. Implement the intervention in the treatment group
   – Ensure internal control
4. Measure impact after a delay long enough for the impact to occur
   – Common question: "How long should we wait?"
   – Operational considerations must be traded off; there is no one-size-fits-all answer
   – Wait long enough to make sure the impacts materialize
Report Card Distribution
– Report cards delivered through discussion groups
  – Comprehension of the information is as important as its distribution
– Discussion groups focus on the positive aspects of the card
  – Understand what influences test scores before blaming the child
Preview of the Warts!
– Differential attrition
  – Were children who gained less more likely to drop out or be absent in treatment villages?
  – The attrition rate was 18%, with no differential attrition between treatment and control
– School switching
  – Were parents more likely to switch their children to better schools in treatment villages?
  – Switching was only 5%, with no additional switching in treatment compared to control villages
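The differential-attrition check described above can be sketched as a two-proportion z-test on follow-up rates. The counts below are hypothetical, chosen only to match the reported 18% overall attrition:

```python
import math

def attrition_gap(followed_t, n_t, followed_c, n_c):
    """Two-proportion z-test for differential attrition.

    Compares the share of baseline children missing at endline in
    treatment vs. control. Returns (attrition gap, z-statistic);
    |z| above roughly 1.96 flags differential attrition.
    """
    p_t = 1 - followed_t / n_t  # attrition rate, treatment arm
    p_c = 1 - followed_c / n_c  # attrition rate, control arm
    p = ((n_t - followed_t) + (n_c - followed_c)) / (n_t + n_c)  # pooled rate
    se = math.sqrt(p * (1 - p) * (1 / n_t + 1 / n_c))
    return p_t - p_c, (p_t - p_c) / se

# Hypothetical counts: 18% attrition in both arms (no differential attrition)
gap, z = attrition_gap(followed_t=4920, n_t=6000, followed_c=4920, n_c=6000)
```

Equal attrition *rates* are necessary but not sufficient: one should also check, as the slide suggests, whether the *kind* of child who leaves differs across arms (e.g. by baseline score).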
Preview of Good Results
– Impact:
  – 0.1 standard deviation gain in average child learning, a third of the average yearly gain experienced by children
  – 21% drop in the school fees of private schools (public schools are free)
– Impact heterogeneity:
  – Learning improved for initially low-performing private schools, by well over 0.3 standard deviations
  – Learning did not increase for initially better-performing private schools, but their fees dropped by 23%
  – Learning improved by 0.1 standard deviations for government schools
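Effects like "0.1 standard deviations" are computed by scaling the raw treatment-control difference by the spread of scores. A minimal sketch, with made-up endline scores for illustration:

```python
import statistics

def effect_size_sd(treat, control):
    """Express the average treatment effect in standard-deviation units:
    (mean treatment score - mean control score) / control-group SD.
    """
    diff = statistics.fmean(treat) - statistics.fmean(control)
    return diff / statistics.stdev(control)

# Hypothetical endline test scores (raw points)
treat = [52.0, 48.5, 55.0, 50.5, 49.0]
control = [50.0, 47.0, 53.0, 48.0, 47.0]
es = effect_size_sd(treat, control)
```

Standardizing this way lets gains be compared across subjects and studies, and compared to benchmarks such as the average yearly learning gain.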
Score Gains: Treatment Heterogeneity
Investments
– Schools increase investment
  – More likely to have the tested class's teacher improve qualifications
  – Bad private schools see a large and significant effect on teaching materials
  – Bad private schools decrease total break time
– Households don't change investment
  – No significant change in the number of hours parents spend helping their children with schoolwork
  – There is a decrease in annual spending on children's education
Cost-Benefit Analysis
– The cost of providing information to the entire population is comparable to the fee reduction in initially well-performing schools
– Household welfare: the benefit of increased learning at (essentially) zero cost
– Gains were highest for low-performing private schools, but government schools improved too
Conclusions
– The net benefits of increasing quality decrease at higher quality levels
  – Concavity in the quality/effort trade-off
– Increased information leads to greater pooling in price-adjusted quality across schools
  – After the intervention, we are left with a lower-quality but free public sector and a higher-quality, somewhat more expensive private sector
– Forward-looking arrangements between parents and schools
  – Little switching in equilibrium, yet the private sector responds as if it faces greater competition