1
How to Get the Most Out of Education Impact Evaluations
Most of my experience draws on education evaluations from low- and middle-income countries, but I strongly believe that many of the same lessons will apply. David Evans, World Bank
2
The Learning Crisis
3
Kenya, Tanzania, and Uganda
Grade 3 students could not understand the sentence “The name of the dog is Puppy.” WDR 2018
4
Rural India: Grade 5 students could not solve 46 − 17. WDR 2018
5
England: Grade 2 students could not understand the sentence “Molly didn’t understand.” WDR 2018
6
Brazil: Progress in math (PISA, 15-year-olds)
If Brazilian 15-year-olds continue to improve at their current rate, they will not reach the rich-country average score in math for 75 years. In reading, it will take 263 years. WDR 2018
7
Brazil: 75 years to reach the OECD average in math (PISA, 15-year-olds). WDR 2018
8
Brazil: Time to reach the OECD average (PISA, 15-year-olds): 75 years in math, >260 years in reading. WDR 2018
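The headline figures come from simple projection arithmetic: divide the remaining score gap by the annual rate of improvement. A minimal sketch of that calculation (the scores and rate below are illustrative placeholders, not actual PISA values):

```python
def years_to_close_gap(current_score, target_score, annual_gain):
    """Linear projection: years for a score improving at a constant
    annual rate to reach a target score."""
    return (target_score - current_score) / annual_gain

# Illustrative numbers only: a 75-point gap closed at
# 1 point per year takes 75 years.
print(years_to_close_gap(415, 490, 1.0))  # 75.0
```

The 263-year reading figure follows the same logic with a slower rate of improvement relative to the gap.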
9
Years of Schooling are not the same as Learning
Average years of schooling, unadjusted and adjusted for learning. The adjustment multiplies years of schooling by the ratio of a country’s TIMSS score to Singapore’s (the best performer). WDR 2018
10
There is a learning crisis.
Solving it requires learning to improve learning.
11
The Rise of Education Impact Evaluations
12
We have more evidence on improving learning than ever before.
Evans & Popova 2016; WDR 2018
13
But there is still so much to learn!
Cash transfers and student enrollment: 19 studies. Materials and learning outcomes: 4 studies. Remedial education and student learning: 4 studies. These counts have probably grown, but there is huge variation. In some areas, such as pedagogical interventions, the interventions themselves are almost as varied, so even 10 evaluations may mean 5 totally different interventions, tested twice each. Adapted from 3ie 2015
14
No one will buy it without the price tag!
McEwan 2015
15
That which seemed obvious…
…proves untrue.
16
Providing textbooks to students… did not improve average student learning.
Sierra Leone Kenya Sabarwal, Evans, & Marshak 2014; Glewwe, Kremer, & Moulin 2009
17
Training teachers in active learning… …reduced learning in Costa Rica
Highlights from the paper:
- Experiment designed to affect the ability to reason and argue using mathematics.
- Structured pedagogical intervention aimed at secondary school students.
- The intervention was implemented with high fidelity and was internally valid.
- The control group learned more than the treatment group.
Berlinski & Busso 2017
18
Small, cheap interventions can have a substantive, positive impact
Parent-teacher conferences in Bangladesh. Information on the returns to schooling in the Dominican Republic. Islam 2016; Jensen 2010
19
Some really changed our minds.
20
Ability tracking USA Kenya India
USA: high-achiever classes. “Minorities gain 0.5 standard deviation units in fourth-grade reading and math scores, with persistent gains through sixth grade.” Kenya: tracking improved outcomes for both higher- and lower-achieving students. India: one dedicated hour of instruction at the child’s level led to learning gains. Card & Giuliano 2016; Duflo, Dupas, & Kremer 2011; Banerjee et al. 2016
21
How can we get more from these evaluations?
Learn from null results Learn about mechanisms Synthesize effectively Inform policy
22
Learn from Null Results
23
What does a null result mean?
Maybe the evaluation was poorly designed!
24
Evaluation design problems
Sample size too small. Outcomes measured at the wrong time. Evaluated while the program was still resolving fundamental problems. If any of these apply, we can’t necessarily learn much. It is incumbent on us, the evaluators, to make sure these elements of the evaluation are right. And if we can’t do an evaluation right, then we shouldn’t do it.
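The "sample size too small" failure is the most mechanical of these, and the easiest to check before fielding a study. A minimal power-calculation sketch using the standard normal-approximation formula for a two-arm trial (the function name `n_per_arm` and the example effect sizes, ICC, and cluster size are illustrative assumptions, not values from any study cited here):

```python
import math
from statistics import NormalDist

def n_per_arm(mde_sd, alpha=0.05, power=0.8, icc=0.0, cluster_size=1):
    """Approximate sample size per arm for a two-arm trial with equal
    variances, via the normal approximation.

    mde_sd: minimum detectable effect in standard-deviation units.
    icc, cluster_size: inflate the sample for clustered (e.g. school-level)
    randomization using the design effect 1 + (m - 1) * ICC.
    """
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided test
    z_beta = NormalDist().inv_cdf(power)
    n = 2 * (z_alpha + z_beta) ** 2 / mde_sd ** 2  # individual randomization
    deff = 1 + (cluster_size - 1) * icc            # design effect
    return math.ceil(n * deff)

# Detecting a 0.2 SD effect with individual randomization:
print(n_per_arm(0.2))                              # 393 students per arm
# Same effect, randomizing schools of 30 students with ICC = 0.1:
print(n_per_arm(0.2, icc=0.1, cluster_size=30))    # 1531 students per arm
```

Note how clustering, which is the norm in education evaluations that randomize at the school level, roughly quadruples the required sample here: an evaluation powered as if students were randomized individually may be badly underpowered.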
25
Evaluation design problems
“The trend to measure impact has brought with it a proliferation of poor methods of doing so, resulting in organizations wasting huge amounts of money on bad ‘impact evaluations.’” – Gugerty & Karlan 2018
26
What does a null result mean?
Maybe the intervention was implemented poorly! Adapted from Glewwe & Muralidharan 2015
27
Textbooks in Sierra Leone
Mobile phone monitoring in Haiti. Sabarwal, Evans, & Marshak 2014; Adelman et al. 2017
28
What does a null result mean?
Maybe the intervention led to compensating behavior! Adapted from Glewwe & Muralidharan 2015
29
School grants in Zambia
Das et al. 2013
30
What does a null result mean?
Maybe it only worked for some beneficiaries! Adapted from Glewwe & Muralidharan 2015
31
Textbooks in western Kenya
Only the highest performers benefited. Literacy instruction in eastern Kenya: for word identification, benefits only for girls. Glewwe, Kremer, & Moulin 2009; Jukes et al. 2017
32
What does a null result mean?
Maybe it only worked with complementary programs! Adapted from Glewwe & Muralidharan 2015
33
Teacher pay-for-performance in Tanzania
Grants alone? No impact. Incentives alone? No impact. Together? Student learning gains. Mbiti et al. 2017
34
What does a null result mean? Maybe the intervention doesn’t work!
Adapted from Glewwe & Muralidharan 2015
35
Unconditional increases in teacher salaries don’t increase student learning.
Indonesia: an unconditional doubling of teacher salaries. Uruguay: regression discontinuity across disadvantaged schools. Zambia: compared schools that just qualified for the hardship allowance (a 20% salary increase) to those that just missed it. De Ree et al. 2017
36
Understand Mechanisms
37
Ask: “What is the biggest change required in behavior?” “What changes are likely to make the biggest difference to outcomes?” Then measure those. If they don’t work, then we understand why things went wrong. Multi-arm trials are a great way to do this, but where those aren’t possible, we can still learn. Jukes 2018
38
Synthesize
39
Policymakers want more than just the results of your study. They want all the results. Clearly synthesized. Simply explained. Easy, right?
40
Two rules of research synthesis: Be complete. Be intelligent.
41
Inform Policy
42
Their challenge, not your research
Build relationships: multiple interactions. Use clear, non-technical language. Deliver findings at the right time.
43
Get the Most Out of Education Impact Evaluations
Learn from null results Learn about mechanisms Synthesize effectively Inform policy