The rise of RCTs in education: statistical and practical challenges
Kevan Collins

The EEF
- Summarise the existing evidence
- Make grants
- Evaluate projects
- Share and promote the use of evidence
- Rigorous, independent evaluations
- Founded with a grant from the DfE, with the aim of spending over £220 million over 15 years
- Focus on projects with a good evidence base; 115 grants made to date
- EEF and Sutton Trust Toolkit, reports, scale-up campaigns

EEF’s Approach to Evaluation
Rigorous, independent evaluations:
- Independent evaluation
- Robust counterfactual
- Focus on attainment outcomes
- Impact and process evaluations

A typical EEF evaluation
Set-up / planning, recruit schools, pre-test / pupil data / consent, randomise, intervention delivery, post-test, with a process evaluation running alongside.
- 94/115 projects evaluated using RCTs (115 grants made to date); a minimal analysis sketch follows below
- When an RCT is not possible: regression discontinuity design, propensity score matching, synthetic control analysis
- Focus on attainment (end of Key Stage assessments, standardised commercial tests)
- Keen to include non-cognitive outcomes
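To make the shape of such an evaluation concrete, here is a minimal, purely illustrative sketch (not from the slides) of the intention-to-treat analysis this design supports: a post-test regressed on random allocation, adjusting for the pre-test. The simulated numbers, the individual-level randomisation, and the effect-size convention are all assumptions; real EEF trials typically randomise whole schools and account for clustering.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Illustrative data only: 2,000 pupils, a pre-test that predicts the
# post-test, and a true intervention effect of 0.1 standard deviations.
n = 2000
pre = rng.normal(0.0, 1.0, n)
treat = rng.integers(0, 2, n)        # simple individual randomisation
post = 0.7 * pre + 0.1 * treat + rng.normal(0.0, 0.7, n)

# Intention-to-treat estimate: analyse pupils as randomised, adjusting
# for the pre-test to gain precision.
X = sm.add_constant(np.column_stack([treat, pre]))
fit = sm.OLS(post, X).fit()

# One common convention: divide the treatment coefficient by the
# control group's outcome standard deviation.
effect_size = fit.params[1] / post[treat == 0].std(ddof=1)
print(f"Estimated ITT effect size: {effect_size:.2f}")
```

The same regression framework carries over, with extra assumptions, to the quasi-experimental fallbacks named on the slide when randomisation is not possible.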

All EEF evaluation data will be submitted to a national data archive
This data archive is still being established. The aim is to allow:
...the EEF to:
- track the impact of projects longitudinally
- look at the cumulative impact of projects
- better understand its target group
...the research community to:
- verify the results of EEF evaluations
- conduct further analysis on subgroups and interventions (see the sketch below)
- link to other datasets for research purposes
Both overarching evaluators and the wider research community will need access.
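As a sketch of how the archive might be used once it exists, the snippet below imagines pupil-level trial outcomes being linked to another dataset and re-analysed by subgroup. Every file name and column here (pupil_id, group, post_test, fsm) is hypothetical; as the slide says, the archive's structure was still being established.

```python
import pandas as pd

# Hypothetical files and columns, for illustration only.
trial = pd.read_csv("eef_trial_outcomes.csv")   # pupil_id, group, post_test
other = pd.read_csv("linked_admin_data.csv")    # pupil_id, fsm

linked = trial.merge(other, on="pupil_id", how="inner")

# Example secondary question: does the average post-test differ by
# allocation within FSM and non-FSM pupils?
subgroup_means = (
    linked.groupby(["fsm", "group"])["post_test"].mean().unstack("group")
)
print(subgroup_means)
```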

Some lessons from our first 4 years…
- Schools are willing to take part in RCTs. When we first set up, some people were sceptical.
- But getting them to do what you need them to do is difficult: testing, passing on data, sticking to the intervention, not contaminating the control group…
- Effect sizes are often smaller than anticipated, perhaps because of the independence of the evaluation, the "real world" nature of the trials, and intention-to-treat analysis.
- This means that trials need to be very large to detect an impact, so we are increasingly reliant on NPD data rather than testing thousands of pupils (see the power calculation sketch below).
- Next big questions: Will schools listen to the evidence? And policy makers? Will researchers use the data archive?
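The link between small effect sizes and very large trials can be made concrete with a standard power calculation. The sketch below is not from the talk: it assumes a continuous outcome, 80% power, 5% two-sided significance, and illustrative cluster sizes and intra-cluster correlation, but it shows why halving the detectable effect roughly quadruples the required sample.

```python
from scipy.stats import norm

def n_per_arm(effect_size, power=0.80, alpha=0.05):
    """Approximate pupils per arm needed to detect a standardised
    effect size in a simple two-arm individually randomised trial."""
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    return 2 * (z_alpha + z_beta) ** 2 / effect_size ** 2

def design_effect(cluster_size, icc):
    """Inflation factor when whole schools or classes are randomised,
    given average cluster size and intra-cluster correlation."""
    return 1 + (cluster_size - 1) * icc

for d in (0.5, 0.2, 0.1):
    simple = n_per_arm(d)
    clustered = simple * design_effect(cluster_size=25, icc=0.15)
    print(f"d = {d}: ~{simple:.0f} pupils per arm individually randomised, "
          f"~{clustered:.0f} per arm with cluster randomisation")
```

On these illustrative assumptions, detecting an effect of 0.1 standard deviations already requires thousands of pupils per arm once clustering is accounted for, which is the practical case for relying on NPD outcomes rather than administering bespoke tests at that scale.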