Methods and Approaches to investigate the UK Education System Sandra McNally, University of Surrey and Centre for Economic Performance, London School of Economics

Overview
- Some questions addressed in the CEP Education Programme
- Data
- Core methodological issues

Some current themes in the CEP Education Research Programme
- What works in schools…
- Teachers
- Peers, neighbours and compositional effects
- Parental preferences and admissions
- Higher education in the UK

What works (or not) in schools: CEP research

Most important data sets
- The National Pupil Database (potentially linkable with UCAS and HE data)
- Longitudinal data sets which can be linked with administrative data, e.g. the Longitudinal Study of Young People in England; the Millennium Cohort Study

Core methodological issues
1. What is an appropriate counterfactual? – and therefore, can we establish causality?
2. For whom is the causal effect identified?
3. Can we look at longer-term effects?
4. Can we do a cost-benefit analysis?
5. Can we extrapolate outside the study context?
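One way to state issue 1 precisely is the standard potential-outcomes notation below (a textbook formulation added here for clarity; it is not on the slide):

```latex
% Average causal effect of a treatment, where Y_i(1) and Y_i(0) are the
% outcomes individual i would obtain with and without the treatment:
\tau = \mathbb{E}\left[\, Y_i(1) - Y_i(0) \,\right]
```

Only one of the two potential outcomes is ever observed for any individual, so each approach below is, at heart, a way of constructing a credible estimate of the missing counterfactual.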

Approaches
- Randomised Controlled Trials (e.g. those funded by the Education Endowment Foundation)
- Difference-in-differences approaches
- Instrumental variable approaches
- Regression discontinuity approaches
- OLS/propensity score matching

Randomised Controlled Trials
'Treatment' is randomly assigned amongst the target population.
An example: design of an 'information campaign' about the costs and benefits of staying on in education, targeted at Year 10 students in London schools (McGuigan, McNally and Wyness, 2012).
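A minimal sketch of why randomisation identifies the causal effect (the data, effect size, and variable names below are simulated and invented, not from the study cited): under random assignment, a simple difference in group means is an unbiased estimate of the treatment effect.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated target population: exam scores with a true treatment effect of 2.0.
n = 1_000
treat = rng.integers(0, 2, size=n)           # random assignment: 0 = control, 1 = treated
scores = 50 + 2.0 * treat + rng.normal(0, 10, size=n)

# Randomisation balances all other characteristics in expectation, so the
# difference in group means estimates the causal effect.
effect = scores[treat == 1].mean() - scores[treat == 0].mean()
se = np.sqrt(scores[treat == 1].var(ddof=1) / (treat == 1).sum()
             + scores[treat == 0].var(ddof=1) / (treat == 0).sum())
print(f"Estimated effect: {effect:.2f} (SE {se:.2f})")
```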

Difference-in-Differences Approaches
Compare the outcomes of a treatment group and a control group before and after the policy.
An example: a structured way to teach reading (the 'literacy hour') was piloted in schools in some Local Authorities before being rolled out to the rest of England (Machin and McNally, 2008).
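A sketch of the estimator on simulated data (all numbers and variable names are hypothetical, not taken from Machin and McNally): the coefficient on the treated-by-post interaction nets out both the pre-existing level difference and the common time trend.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)

# Simulated pilot: treated schools improve by 1.5 points more than controls
# after the policy, on top of a level difference and a common time trend.
n = 2_000
treated = rng.integers(0, 2, size=n)
post = rng.integers(0, 2, size=n)
score = (50 + 3.0 * treated            # pre-existing level difference
         + 2.0 * post                  # common trend affecting everyone
         + 1.5 * treated * post        # the policy effect we want to recover
         + rng.normal(0, 5, size=n))
df = pd.DataFrame({"score": score, "treated": treated, "post": post})

# The interaction coefficient is the difference-in-differences estimate.
fit = smf.ols("score ~ treated * post", data=df).fit()
print(fit.params["treated:post"])
```

The key identifying assumption is parallel trends: absent the policy, treated and control groups would have followed the same path over time.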

Instrumental Variable Approaches
Find a variable that affects the outcome only through its impact on the 'treatment' variable.
An example: using changes in the minimum school-leaving age to estimate the impact of additional education on future earnings (Harmon and Walker, 1995).
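A two-stage least squares sketch on simulated data (the setting is loosely inspired by the schooling-and-earnings example, but every number and variable name here is invented): unobserved ability biases OLS, while the instrument recovers the true effect.

```python
import numpy as np

rng = np.random.default_rng(2)

# Unobserved 'ability' raises both schooling and earnings, so OLS of earnings
# on schooling is biased. A law change (the instrument z) shifts schooling
# but affects earnings only through schooling.
n = 5_000
ability = rng.normal(size=n)
z = rng.integers(0, 2, size=n)                   # e.g. facing a higher leaving age
school = 10 + 1.0 * z + 1.0 * ability + rng.normal(size=n)
earn = 5 + 0.5 * school + 2.0 * ability + rng.normal(size=n)  # true effect = 0.5

# Two-stage least squares by hand (point estimate only; valid standard
# errors need a correction that naive second-stage OLS does not provide).
X1 = np.column_stack([np.ones(n), z])
school_hat = X1 @ np.linalg.lstsq(X1, school, rcond=None)[0]  # first stage
X2 = np.column_stack([np.ones(n), school_hat])
beta = np.linalg.lstsq(X2, earn, rcond=None)[0]               # second stage
print(f"OLS (biased): {np.polyfit(school, earn, 1)[0]:.2f}, 2SLS: {beta[1]:.2f}")
```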

Regression Discontinuity Designs
Make use of a 'cut-off' for eligibility for a programme or policy, comparing the outcomes of people on either side of this 'cut-off'.
An example: using administrative cut-offs in school admissions policies to identify the relationship between when children start school and their later outcomes (Crawford, Dearden and Greaves, 2013).
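A local-linear sketch on simulated data (the cut-off, bandwidth, and jump size are all invented for illustration): fitting separate lines just below and just above the cut-off and comparing their intercepts at the cut-off estimates the effect.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated admissions cut-off: units with running variable x >= 0 get the
# programme. Outcomes jump by 4.0 at the cut-off but vary smoothly elsewhere.
n = 4_000
x = rng.uniform(-1, 1, size=n)                  # distance from the cut-off
d = (x >= 0).astype(float)                      # treatment status
y = 20 + 5.0 * x + 4.0 * d + rng.normal(0, 2, size=n)

# Local linear regression within a bandwidth h on each side of the cut-off;
# the difference in intercepts at x = 0 is the RD estimate.
h = 0.25
left = (x < 0) & (x > -h)
right = (x >= 0) & (x < h)
a_left = np.polyfit(x[left], y[left], 1)[1]     # intercept approaching from below
a_right = np.polyfit(x[right], y[right], 1)[1]  # intercept approaching from above
print(f"RD estimate at the cut-off: {a_right - a_left:.2f}")
```

The design is credible only locally: it identifies the effect for people close to the cut-off, which is exactly issue 2 ('for whom?') on the earlier slide.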

OLS/Propensity Score Matching
Measure the effect of a 'treatment' while controlling for all observable characteristics.
An example: looking at whether the increase in 'non-native' English speakers has an influence on the educational attainment of native English speakers (Geay, McNally and Telhaj, 2012).
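A nearest-neighbour matching sketch on simulated data (a single observed covariate and an invented effect size; none of this is from the study cited): each treated unit is paired with the most similar control, so the comparison holds observables fixed. The method rests on the strong assumption that nothing unobserved confounds treatment.

```python
import numpy as np

rng = np.random.default_rng(4)

# One observed covariate drives both 'treatment' and attainment, so matching
# on it removes the confounding, assuming nothing unobserved remains.
n = 3_000
cov = rng.normal(size=n)                                # observed characteristic
p_treat = 1 / (1 + np.exp(-cov))                        # true propensity score
treat = (rng.uniform(size=n) < p_treat).astype(int)
y = 30 + 2.0 * cov + 1.0 * treat + rng.normal(size=n)   # true effect = 1.0

# Nearest-neighbour matching on the covariate (here the propensity score is a
# monotone function of cov, so matching on cov is equivalent).
treated_idx = np.where(treat == 1)[0]
control_idx = np.where(treat == 0)[0]
matches = control_idx[np.abs(cov[control_idx][None, :]
                             - cov[treated_idx][:, None]).argmin(axis=1)]
att = (y[treated_idx] - y[matches]).mean()
print(f"Matched estimate of the effect on the treated: {att:.2f}")
```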