Impact Evaluation and Administrative Data Systems
N. Krishnan, C. Rockmore
Africa Impact Evaluation Initiative
Dakar, 18 December 2008
Opportunities and challenges
- Most impact evaluations use survey data
- Unique opportunity in education to use administrative and test data
- Need to improve the quality and usability of administrative data
- Integrate with test score data
- Case studies
Data sources
- Administrative data
  - Beneficiary only
  - Output and process data
  - Facility level of disaggregation
  - Systems in place, low additional cost
- Test data
  - Individual level
  - Main outcome
- School surveys
  - Independent verification
  - Complementary information
  - Output and outcome data
  - Costly
Administrative data quality
- Relevance: feedback loop from IE and others
- Accuracy: identify systematic misreporting
- Timeliness: September vs. January
- Accessibility: can it be integrated and used?
- Interpretability: methods and documentation
- Coherence: does it match other sources? Single ID
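A minimal sketch of how some of these quality dimensions could be checked in practice, assuming a hypothetical administrative file and an independent survey sample; column names, the 10% coherence threshold, and the September deadline are illustrative assumptions, not part of the original material.

```python
# Illustrative quality checks on a hypothetical administrative dataset.
import pandas as pd

admin = pd.DataFrame({
    "school_id": ["SCH001", "SCH002", "SCH003"],
    "enrol_admin": [420, 515, 260],
    "report_date": pd.to_datetime(["2008-09-15", "2009-01-20", "2008-09-30"]),
})
survey = pd.DataFrame({
    "school_id": ["SCH001", "SCH002", "SCH003"],
    "enrol_survey": [410, 455, 262],
})

merged = admin.merge(survey, on="school_id", how="left")

# Coherence: flag schools whose administrative enrolment differs from the
# independent survey figure by more than 10% (threshold is an assumption).
merged["pct_gap"] = (merged["enrol_admin"] - merged["enrol_survey"]).abs() / merged["enrol_survey"]
merged["coherence_flag"] = merged["pct_gap"] > 0.10

# Timeliness: flag reports submitted after an assumed end-of-September deadline.
merged["late_report"] = merged["report_date"] > pd.Timestamp("2008-09-30")

print(merged[["school_id", "pct_gap", "coherence_flag", "late_report"]])
```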
Build within existing frameworks and plans
- Integrated and modular data systems
- Use common identifiers for the same concepts (schools, children, and teachers)
- Identifiers (codes) should have meaning
- Harmonize data collection efforts
- Look at sector plans and work within them
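A small sketch of what a "meaningful" identifier could look like, assuming a hypothetical REGION-DISTRICT-SERIAL coding scheme; the scheme and the example codes are illustrative, not an actual EMIS standard.

```python
# Hypothetical structured school code that encodes region and district, so a
# code can be read and validated without a lookup table.
import re

SCHOOL_ID_PATTERN = re.compile(r"^(?P<region>[A-Z]{2})-(?P<district>\d{2})-(?P<serial>\d{4})$")

def build_school_id(region: str, district: int, serial: int) -> str:
    """Build a structured school code such as 'GA-03-0127'."""
    return f"{region.upper()}-{district:02d}-{serial:04d}"

def parse_school_id(code: str) -> dict:
    """Validate a code and return its components; raise if it is malformed."""
    match = SCHOOL_ID_PATTERN.match(code)
    if match is None:
        raise ValueError(f"Malformed school code: {code!r}")
    return match.groupdict()

print(build_school_id("GA", 3, 127))   # GA-03-0127
print(parse_school_id("GA-03-0127"))   # {'region': 'GA', 'district': '03', 'serial': '0127'}
```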
From evaluation to data systems
Eritrea: supporting data for malaria
- The 2004 evaluation entered district-level data in a database
- The Ministry of Health malaria programme continued maintaining the database with facility-level data on a monthly basis:
  - Intervention output data (nets distributed, medicines, insecticide, by facility and date of delivery)
  - In- and out-patient outcome data on fever cases
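A minimal sketch of the kind of facility-month panel such a database supports, linking intervention outputs to outcomes; the table layout, column names, and figures are illustrative assumptions, not the Eritrean data.

```python
# Hypothetical facility-month panel combining output and outcome data.
import pandas as pd

outputs = pd.DataFrame({
    "facility_id": ["F01", "F01", "F02"],
    "month": ["2004-06", "2004-07", "2004-06"],
    "nets_distributed": [350, 120, 500],
})
outcomes = pd.DataFrame({
    "facility_id": ["F01", "F01", "F02"],
    "month": ["2004-06", "2004-07", "2004-06"],
    "fever_cases": [88, 61, 140],
})

# One row per facility-month, linking intervention outputs to outcomes.
panel = outputs.merge(outcomes, on=["facility_id", "month"], how="outer")
print(panel)
```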
Use of administrative data for assigning grants
- Niger: mitigating misreporting
  - School grants allocated on the basis of classroom use, to avoid misreporting of pupil numbers
- Uganda
  - District-level statistics used to allocate rewards
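A sketch of the logic behind basing grants on a verifiable quantity (classrooms in use) rather than self-reported pupil counts; the per-classroom amount, school records, and plausibility check are illustrative assumptions, not the Niger formula.

```python
# Hypothetical grant rule keyed to classrooms in use, which field visits can verify.
GRANT_PER_CLASSROOM = 100_000  # assumed amount in local currency units

schools = [
    {"school_id": "NE-01-0001", "classrooms_in_use": 6, "reported_pupils": 410},
    {"school_id": "NE-01-0002", "classrooms_in_use": 4, "reported_pupils": 780},
]

for school in schools:
    # The grant depends only on classrooms in use, not on reported pupils.
    grant = school["classrooms_in_use"] * GRANT_PER_CLASSROOM
    # Flag implausible pupil-to-classroom ratios for follow-up (threshold assumed).
    suspicious = school["reported_pupils"] / school["classrooms_in_use"] > 120
    print(school["school_id"], grant, "check report" if suspicious else "ok")
```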
Score cards (tableaux de bord)
- Link input and output data to results
- Compare across administrative levels
- Feedback loop possible for lower levels
- Can identify the low-input, high-result and high-input, low-result cases
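A minimal sketch of a score-card style quadrant classification, splitting schools by comparing inputs and results to group medians so that low-input/high-result and high-input/low-result cases stand out; the input and result measures and the data are illustrative assumptions.

```python
# Hypothetical quadrant classification for a score card.
from statistics import median

schools = [
    {"id": "A", "spending_per_pupil": 45, "pass_rate": 0.81},
    {"id": "B", "spending_per_pupil": 90, "pass_rate": 0.52},
    {"id": "C", "spending_per_pupil": 40, "pass_rate": 0.48},
    {"id": "D", "spending_per_pupil": 95, "pass_rate": 0.85},
]

input_median = median(s["spending_per_pupil"] for s in schools)
result_median = median(s["pass_rate"] for s in schools)

for s in schools:
    # Compare each school's input and result to the group medians.
    label_input = "high" if s["spending_per_pupil"] > input_median else "low"
    label_result = "high" if s["pass_rate"] > result_median else "low"
    print(s["id"], f"{label_input}-input / {label_result}-result")
```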
Use of data in impact evaluations
South Africa
- IE question: Did the Dinaledi schools program improve student enrolment and performance in Mathematics and Physical Science relative to similar schools?
- Outcome of interest: number of students entering, writing, and passing Mathematics and Physical Science in the senior certificate (school-leaving) exam
- Data used:
  - School characteristics from school-level EMIS data
  - Outcome data from exam-centre-level testing data
- Issues:
  - School codes and exam centre codes are not the same and often bear no relation to each other
  - A school can be associated with more than one exam centre, and vice versa
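A sketch of how school-level EMIS records could be linked to exam-centre-level results through a crosswalk and aggregated back to the school level; all codes, column names, and numbers are illustrative assumptions, not the actual South African data.

```python
# Hypothetical school-to-exam-centre crosswalk and school-level aggregation.
import pandas as pd

emis = pd.DataFrame({
    "school_code": ["S100", "S200"],
    "province": ["Gauteng", "Gauteng"],
})
exam_centres = pd.DataFrame({
    "centre_code": ["C01", "C02", "C03"],
    "wrote_math": [80, 35, 60],
    "passed_math": [52, 20, 41],
})
# Crosswalk mapping schools to exam centres; a school may feed several centres
# and a centre may serve several schools (the latter case would require
# apportioning results, which this sketch ignores).
crosswalk = pd.DataFrame({
    "school_code": ["S100", "S100", "S200"],
    "centre_code": ["C01", "C02", "C03"],
})

linked = crosswalk.merge(exam_centres, on="centre_code").merge(emis, on="school_code")
school_level = linked.groupby("school_code")[["wrote_math", "passed_math"]].sum()
school_level["pass_rate"] = school_level["passed_math"] / school_level["wrote_math"]
print(school_level)
```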
Dinaledi results (Gauteng province)
Pass rate out of students who entered

Subject      Dinaledi Schools (Average)   Matched Schools (Average)   Difference: Treatment - Control
HG Math      0.72                         0.62                        0.10***
SG Math      0.64                         0.60                        0.04**
HG Physics   0.83                         0.74                        0.09***
SG Physics   0.68                                                     0.08***

*** significant at 1%, ** significant at 5%
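A sketch of how a treatment-control difference in average pass rates and its significance could be computed, here with a two-sample t-test on school-level pass rates; the numbers are made up and do not reproduce the table above.

```python
# Hypothetical comparison of treatment and matched-control pass rates.
from scipy import stats

dinaledi_pass_rates = [0.75, 0.70, 0.68, 0.74, 0.73]  # illustrative school-level pass rates
matched_pass_rates = [0.63, 0.60, 0.65, 0.58, 0.64]

difference = (sum(dinaledi_pass_rates) / len(dinaledi_pass_rates)
              - sum(matched_pass_rates) / len(matched_pass_rates))
t_stat, p_value = stats.ttest_ind(dinaledi_pass_rates, matched_pass_rates)

print(f"Difference: {difference:.2f}, p-value: {p_value:.3f}")
```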