Measuring the impact of education interventions
Stephen Taylor
Stellenbosch, August 2015

PLAN
– Locating impact evaluation
– Menu of methods
– Case study: a randomised experiment
– Challenges with RCTs in education
– Opportunities for research in government

Locating impact evaluation
– Qualitative work – Hoadley (2003, 2007); Ensor et al. (2009)
– Systemic analysis with mixed methods – Taylor, Vinjevold & Muller (2003); Fleisch (2008)
– Descriptive quantitative work – Reddy (2006); Taylor & Yu (2008); Spaull (2011)
– Correlational analysis – Crouch & Mabogoane (1998); Van der Berg (2008); Gustafsson (2007); Spaull (2012); Shepherd (2011)
– Moving toward causal quantitative analysis

The evaluation problem: knowing the counterfactual
– We cannot observe the counterfactual: the two alternative scenarios for the same person or group.
– So we have to identify or construct comparison groups as a "pseudo-counterfactual" – an estimate of the counterfactual.
– The big question: when is a comparison group a valid estimate of the counterfactual? (Internal validity)
– Selection bias (endogeneity): e.g. years of schooling and IQ; libraries and learning outcomes.
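In potential-outcomes notation (a sketch added for clarity; the notation is not from the original slides), the problem and the resulting bias can be written as:

```latex
% Y_i(1): outcome for unit i if treated; Y_i(0): outcome for the same unit if untreated.
% Only one of the two is ever observed, so a naive treated-vs-untreated comparison
% mixes the true effect with selection bias:
E[Y_i(1) \mid D_i = 1] - E[Y_i(0) \mid D_i = 0]
  = \underbrace{E[Y_i(1) - Y_i(0) \mid D_i = 1]}_{\text{effect on the treated}}
  + \underbrace{E[Y_i(0) \mid D_i = 1] - E[Y_i(0) \mid D_i = 0]}_{\text{selection bias}}
```

Random assignment makes the selection-bias term zero in expectation, which is exactly what makes a comparison group a valid estimate of the counterfactual.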

A menu of methods
– Non-experimental (observed data): Pre & Post; Simple Difference; Difference-in-differences; Regression & matching; Fixed effects
– Experimental: RCT
– Quasi-experimental: RDD; IV
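As an illustration of two items on the menu (a sketch, not taken from the slides): the simple difference compares post-intervention means only, while difference-in-differences also nets out any pre-existing gap between the two groups.

```latex
\hat{\tau}_{\text{simple}} = \bar{Y}^{\,\text{post}}_{T} - \bar{Y}^{\,\text{post}}_{C}
\qquad\qquad
\hat{\tau}_{\text{DiD}} = \bigl(\bar{Y}^{\,\text{post}}_{T} - \bar{Y}^{\,\text{pre}}_{T}\bigr)
                        - \bigl(\bar{Y}^{\,\text{post}}_{C} - \bar{Y}^{\,\text{pre}}_{C}\bigr)
```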

Case Study: The impact of study guides on matric performance: Evidence from a randomised experiment

Background to the "Mind the Gap" study
– Mind the Gap study guides developed during 2012
– Aimed at helping learners acquire the basic knowledge and skills necessary to pass the matric exam
– Distributed to schools in some parts of the country – mainly underperforming districts in the Eastern Cape and Northern Cape, a bit in Gauteng and elsewhere, but not in Mpumalanga
– Impact evaluation using 4 subjects in Mpumalanga: Accounting (ACCN), Economics (ECON), Geography (GEOG) and Life Sciences (LFSC)

The sampling frame
– National list of schools enrolled for the matric 2012 examination
– Restricted to schools in Mpumalanga only
– Further restricted to schools registered to write the matric 2012 exam in English
– Final sampling frame: 318 schools
– Guides randomly allocated to 79 schools (books were couriered, so delivery was reliable), leaving 239 control schools
– Books delivered late in the year: September
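A minimal sketch of how such a school-level allocation could be drawn (illustrative only; the file and column names are hypothetical, not the study's actual code):

```python
import numpy as np
import pandas as pd

# Hypothetical sampling frame: one row per eligible Mpumalanga school (318 in the study)
frame = pd.read_csv("sampling_frame_mpumalanga.csv")

rng = np.random.default_rng(seed=2012)  # fixed seed so the allocation is reproducible

# Draw 79 schools to receive the Mind the Gap guides; the remaining 239 are controls
treated_ids = rng.choice(frame["school_id"].to_numpy(), size=79, replace=False)
frame["treatment"] = frame["school_id"].isin(treated_ids).astype(int)

print(frame["treatment"].value_counts())  # expect 79 treated and 239 control schools
```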

Main results: OLS regressions with baseline controls
– To summarise: no significant impact in Accounting and Economics; impacts of roughly 2 percentage points in Geography and Life Sciences
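A sketch of the kind of regression this implies (variable and file names are assumed; clustering standard errors at the school level is an assumption consistent with school-level assignment):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical learner-level data with a baseline (e.g. the school's prior result)
df = pd.read_csv("geography_matric_2012.csv")

# OLS of the 2012 exam mark on treatment status, controlling for the baseline;
# standard errors clustered by school because guides were allocated at school level
model = smf.ols("exam_mark_2012 ~ treatment + baseline_2011", data=df)
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["school_id"]})

print(result.summary())  # the coefficient on 'treatment' is the estimated impact
```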

Heterogeneous effects

Did impact vary by school functionality? (panels for Geography and Life Sciences)

Matric 2010 simulation
– Roughly a 1 percentage point increase in the matric pass rate
– 5 609: the number of children who did not pass matric in 2010 but would have passed had Mind the Gap been nationally available in Geography and Life Sciences
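The arithmetic behind these two figures (a back-of-the-envelope check using only the numbers on the slide; the implied candidate count is not stated there):

```latex
\text{additional passes} \approx 0.01 \times N_{\text{candidates}}
\quad\Rightarrow\quad
N_{\text{candidates}} \approx \frac{5\,609}{0.01} \approx 561\,000
```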

Interpreting the size of the impact
– Very rough rule of thumb: 1 year of learning = 0.4 to 0.5 standard deviations of test scores
– Geography: 13.5% of a SD; Life Sciences: 14.4% of a SD – roughly a third of a year of learning
– The unit cost per study guide (reflecting material development, printing and distribution) is estimated to be R41.82
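The "third of a year" figure follows from the rule of thumb; a worked check using its midpoint (0.45 SD per year):

```latex
\frac{0.135\ \text{SD}}{0.45\ \text{SD/year}} \approx 0.30\ \text{years (Geography)},
\qquad
\frac{0.144\ \text{SD}}{0.45\ \text{SD/year}} \approx 0.32\ \text{years (Life Sciences)}
```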

Cost-effectiveness comparison (Kremer, Brannen & Glennerster, 2013) – Mind the Gap: 3.04 SD per $100
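One rough way a figure of this order arises from the numbers above (a sketch only; the exchange rate of roughly R9 to the dollar and the use of the average of the two significant effects are assumptions, not stated in the slides):

```latex
\frac{\$100}{R41.82 \,/\, (R9/\$)} \times 0.14\ \text{SD per learner}
  \approx 21.5 \times 0.14
  \approx 3\ \text{SD per } \$100
```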

Interpretation of results
– Two guides had no impact: interventions do not always affect the desired outcomes, and interventions are not uniform in effectiveness
– The quality of the ACCN & ECON material? Or of the GEOG & LFSC materials?
– Contextual factors predisposing LFSC & GEOG to show an impact but not ACCN & ECON?
– A certain level of school functionality / managerial capacity may be needed for resources to be effective
– Timing of the delivery of the guides
– External validity: we are more certain about delivery in Mpumalanga than if this were taken to scale; awareness campaigns could increase the impact at scale

Critiques of RCTs: external validity
– Internal and external validity are the necessary and sufficient conditions for impact evaluations
– Internal validity = causal inference; external validity = transferability to the population
– Context: geography, time, etc. – e.g. private schools, class size
– Special experimental conditions: Hawthorne effects, the implementation agent, system support

External validity: recommendations
– Choose a representative and relevant study population
– Investigate heterogeneous impacts
– Investigate intermediate outcomes
– Use a realistic (scalable) model of implementation and cost structure
– Work with government... but be careful
– No pre-test? Or use administrative data (ANA and NSC provide an opportunity here for DBE collaboration)

RCTs in education: practical challenges
– Fundraising
– Stakeholder engagement
– Test development
– Fieldwork quality
– Project management

Evaluations with government: advantages
– Accountability: shifts the focus from inputs (e.g. number of teachers trained) to outcomes; from form to function (mimicry)
– Cooperation between government and other actors (researchers, NGOs, etc.)
– Encourages policy-makers to interact with research and evidence
– Thinking about theories of change: shifts the focus from "did government programme X succeed or fail?" to "why?"
– The agency of programme recipients to change behaviour
– Benefits for research: reduces publication bias

Evaluations in government: opportunities
– Low-hanging RCTs: 1000 libraries; EGRA
– RDD
– Encouragement designs: online tools; winter schools
– Good analysis of existing data: Grade R evaluation; LOLT paper

Concluding thought: broader benefits of an evaluation culture
– Not all programmes/policies can be subjected to a quantitative impact evaluation
– Theories of change
– Accountability
– Binding constraints
– Interaction between government and researchers