Evaluation in Education: 'new' approaches, different perspectives, design challenges. Camilla Nevill, Head of Evaluation, Education Endowment Foundation.

Presentation transcript:

Evaluation in Education: 'new' approaches, different perspectives, design challenges
Camilla Nevill, Head of Evaluation, Education Endowment Foundation
24th January 2017
camilla.nevill@eefoundation.org.uk
www.educationendowmentfoundation.org.uk
@EducEndowFoundn

Introduction
The EEF is an independent charity dedicated to breaking the link between family income and educational achievement. It was set up in 2011 by the Sutton Trust, as lead charity, in partnership with the Impetus Trust. The EEF is funded by a Department for Education grant of £125m and will spend over £220m over its fifteen-year lifespan. In 2013, the EEF and the Sutton Trust were jointly named as the government-designated 'What Works' centre for improving education outcomes for school-aged children.

The EEF
Two aims:
1. Break the link between family income and school attainment.
2. Build the evidence base on the most promising ways of closing the attainment gap.

The Teaching and Learning Toolkit
A meta-analysis of education research. Contains c.10,000 studies. Cost, impact and security are included to aid comparison.
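
The Toolkit's actual methodology is the EEF's own; purely as a minimal sketch of how a meta-analysis combines study results, the following shows fixed-effect (inverse-variance) pooling of standardised effect sizes. All numbers are invented for illustration.

```python
import math

def pooled_effect(effects, variances):
    """Fixed-effect (inverse-variance) pooling of study effect sizes."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Three hypothetical studies of one intervention strand
effects = [0.21, 0.27, 0.10]          # standardised mean differences
variances = [0.011, 0.012, 0.002]     # squared standard errors
estimate, ci = pooled_effect(effects, variances)
print(f"pooled d = {estimate:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```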

EEF, March 2016
133 projects funded to date
7,500 schools currently participating in projects
750,000 pupils currently involved in EEF projects
£220m estimated spend over the lifetime of the EEF
£82m funding awarded to date
26 independent evaluation teams
100 RCTs
66 published reports

New approach, different perspectives, design challenges
Design with the end user in mind.
There is no one right answer – communicate and compromise.

New approach: evaluate projects
Rigorous, independent evaluations
Longitudinal outcomes
Robust counterfactual (RCTs)
Impact and process evaluations

Education v other fields
How does this compare to evaluation in your field?

Trials: Education v public health

Education | Public health / development
Some independent evaluation | Usually not independently funded?
Mostly cluster and multi-site trials; clusters clearly defined | Mostly cluster and multi-site trials; clusters less clearly defined?
High ICC | Low ICC?
Obtaining consent can be easy | Obtaining consent can be complex and difficult
Follow-up is in theory easy: children must attend school | Follow-up can be harder
Administrative data (NPD) | Depends on outcome
Unfamiliarity with method | More familiar; more respect in medicine
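
The ICC matters because it determines how much clustering inflates the variance of a trial's estimate. As a minimal sketch (the cluster sizes and ICCs below are illustrative, not figures from any EEF trial), the Kish design effect shows why a high-ICC trial needs many more pupils than a low-ICC one:

```python
def design_effect(cluster_size: float, icc: float) -> float:
    """Kish design effect for cluster designs: 1 + (m - 1) * ICC."""
    return 1 + (cluster_size - 1) * icc

def effective_n(total_n: int, cluster_size: float, icc: float) -> float:
    """Sample size after deflating for clustering."""
    return total_n / design_effect(cluster_size, icc)

# 1,200 pupils in clusters of ~30
print(effective_n(1200, 30, 0.15))  # high, education-style ICC -> ~224
print(effective_n(1200, 30, 0.02))  # low ICC -> ~759
```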

Main messages
Design with the end user in mind.
There is no right answer – communicate and compromise.

Process for appointing evaluators
1. Grants team identify projects; first Grants Committee shortlist.
2. Evaluation teams receive one-page project descriptions.
3. Teams submit a two-page expression of interest (EoI).
4. Teams are chosen to submit a full proposal.
5. Teams submit an eight-page proposal; second Grants Committee shortlist.
6. Teams are chosen to evaluate projects.
7. First set-up meeting with the evaluation team, project team and EEF: share an understanding of the intervention logic; decide the overall design, timeline, sample size and control group condition; developer (and evaluator) budgets are set.
8. Second set-up meeting with the evaluation team, project team and EEF: finalise the evaluation design; decide eligibility criteria, details of the protocol, and process evaluation measures linked to the logic model.

Different perspectives
Three parties come together at the set-up meeting: the EEF, the evaluator and the developer.

Different perspectives
EEF: useful results; quick results; keep costs down.
Evaluator: publications; funding to do research; personal interests.
Developer: funding to deliver the programme; demonstrate impact; good relationships with schools; publications?

Design challenges: Improving Working Memory
Teaching memory strategies by playing computer games
For 5-year-olds struggling at maths
Delivered by teaching assistants
Developed by Oxford University educational psychologists
Evidence of improvement in working memory from two small controlled studies (30 and 150 children)

Design challenges: how many arms?
1. Working Memory (WM)
2. WM blended with maths
3. Matched-time maths support
4. Business as usual (BAU)

Design challenges: when would you randomise?
Logic model (Oxford University): school recruited → identify TAs and a link teacher → one-day training for TAs → identify pupils (bottom third in maths) → deliver the programme (10 hours: one-to-one support for 20–30 minutes totalling 5 hours, plus computer games for 5 hours) → improved working memory → maths attainment.

Design challenges
The same logic model with the evaluation overlaid: randomisation once TAs and pupils have been identified; a delivery log plus a survey, observations and interviews alongside programme delivery; a working memory test for the intermediate outcome and a maths test for the final outcome.
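
The slide does not prescribe the mechanics of allocation; in EEF trials these follow a pre-registered, often stratified procedure. Purely as an illustrative sketch of simple random allocation across the four arms discussed above (all names hypothetical):

```python
import random

def randomise(units, arms, seed=2017):
    """Shuffle units and deal them round-robin into equal-sized arms."""
    rng = random.Random(seed)  # fixed seed so the allocation is reproducible
    shuffled = list(units)
    rng.shuffle(shuffled)
    allocation = {arm: [] for arm in arms}
    for i, unit in enumerate(shuffled):
        allocation[arms[i % len(arms)]].append(unit)
    return allocation

pupils = [f"pupil_{i:03d}" for i in range(120)]
arms = ["WM", "WM + maths", "matched-time maths", "BAU"]
groups = randomise(pupils, arms)
print({arm: len(g) for arm, g in groups.items()})  # 30 pupils per arm
```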

Design challenges: Catch Up Numeracy
For 4- to 11-year-olds struggling at maths
Delivered by teaching assistants
10 modules of tailored support
Flexible delivery model (no fixed length)
Evidence from an EEF pupil-randomised efficacy trial:

Group | Number of pupils | Effect size (95% CI) | Estimated months' progress
Catch Up v BAU control | 108 | 0.21 (0.01, 0.42) | +3
Matched-time support v BAU control | 102 | 0.27 (0.06, 0.49) | +4
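
A standardised effect size like those above is the difference in group means divided by the pooled standard deviation. As a back-of-envelope sketch only (the evaluators' actual analysis will have been model-based and pre-specified; the test scores below are invented):

```python
import math

def effect_size(mean_t, mean_c, sd_pooled, n_t, n_c):
    """Cohen's d with an approximate 95% confidence interval."""
    d = (mean_t - mean_c) / sd_pooled
    se = math.sqrt((n_t + n_c) / (n_t * n_c) + d**2 / (2 * (n_t + n_c)))
    return d, (d - 1.96 * se, d + 1.96 * se)

# Invented scores: treatment mean 102.1, control mean 100, pooled SD 10
d, ci = effect_size(102.1, 100.0, 10.0, 54, 54)
print(f"d = {d:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```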

Design challenges
What control group would you use?

Design challenges: Catch Up Numeracy
150 schools recruited → each identifies TAs and ~8 children in years 3–5 who are behind in maths → randomise → 75 schools (600 children) deliver the flexible Catch Up model; 75 schools (600 children) form a business-as-usual control group → follow-up maths test.
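
Whether 150 schools is enough turns on the minimum detectable effect size (MDES). A rough sketch using Bloom's approximation, with an assumed ICC of 0.15, no covariates, 80% power and 5% two-sided significance (multiplier ≈ 2.8); these inputs are illustrative, not the trial's actual power calculation:

```python
import math

def mdes_cluster(n_clusters, cluster_size, icc, p_treat=0.5, multiplier=2.8):
    """Bloom's approximation to the MDES of a two-arm cluster-randomised trial."""
    j, n, rho, p = n_clusters, cluster_size, icc, p_treat
    return multiplier * math.sqrt(
        rho / (p * (1 - p) * j) + (1 - rho) / (p * (1 - p) * j * n)
    )

# 150 schools, ~8 pupils each, ICC 0.15 -> MDES of roughly 0.23
print(round(mdes_cluster(150, 8, 0.15), 2))
```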

Problems with interpretation
What if we see no effect of Catch Up, but the control group received lots more support?
What if we see a big effect of Catch Up, but the control group received lots less support?

A radical idea: pre-specify the interpretation!

Control longer than Catch Up:
- Positive effect: Catch Up is more effective, even against more active control time. → Do Catch Up (continuing active control without an appropriate stopping point may have a harmful effect).
- No effect: both worked, or neither did; probably both did, given the existing evidence? → Do Catch Up, because it gives the same effect in less time.
- Negative effect: Catch Up is less effective than providing longer active control. → Assess the cost of each and do active control if it is not much more expensive.

Matched time:
- Positive effect: Catch Up is more effective than active control. → Do Catch Up.
- No effect: both worked, or neither did; probably both did, given the existing evidence? → Do Catch Up or active control.
- Negative effect: Catch Up is less effective than active control. → Do active control.

Control shorter than Catch Up:
- Positive effect: Catch Up is more effective than a shorter dose of active control. → Do Catch Up, because a structure is needed to stop TAs stopping too early.
- No effect: both gave the same effect. → Do active control, as it gives the same effect in less time.
- Negative effect: Catch Up is less effective than providing less active control. → Do active control.
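
Pre-specifying interpretation amounts to committing to a decision rule before results are seen. A hypothetical encoding of the matrix above as a simple lookup (the keys and labels are invented for illustration):

```python
# Hypothetical encoding of the pre-specified decision rules above.
DECISIONS = {
    ("control longer",  "positive"): "Do Catch Up",
    ("control longer",  "none"):     "Do Catch Up: same effect, less time",
    ("control longer",  "negative"): "Do active control if not much more expensive",
    ("matched time",    "positive"): "Do Catch Up",
    ("matched time",    "none"):     "Do Catch Up or active control",
    ("matched time",    "negative"): "Do active control",
    ("control shorter", "positive"): "Do Catch Up",
    ("control shorter", "none"):     "Do active control: same effect, less time",
    ("control shorter", "negative"): "Do active control",
}

def interpret(control_duration: str, effect: str) -> str:
    """Look up the pre-registered interpretation for an observed result."""
    return DECISIONS[(control_duration, effect)]

print(interpret("matched time", "none"))
```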

Design challenges: interventions that are hard to randomise
Boarding school: children in need, at risk of going into care, referred by local authorities.
Teenage Sleep: changing school start times to later; positive effects from US trials (8am v 11am start).

Main messages (and sub-messages)
Design with the end user in mind:
- Test the right intervention
- Make sure your comparison is relevant
- Measure implementation and cost
There is no right answer – communicate and compromise:
- Use the logic model to understand the intervention
- Pre-specify the interpretation to aid decision making
- Not all interventions can be randomised

Thank you
camilla.nevill@eefoundation.org.uk
www.educationendowmentfoundation.org.uk
@EducEndowFoundn

Measuring the security of trials
Summary of the security of evaluation findings: 'padlocks', developed in consultation with evaluators. Five categories are combined to create an overall rating. Example report row:

Group | Number of pupils | Effect size (95% CI) | Estimated months' progress | Evidence strength
Literacy intervention | 550 | 0.10 (0.03, 0.18) | +2 | padlock icons (not reproduced)

Rating | 1. Design | 2. Power (MDES) | 3. Attrition | 4. Balance | 5. Threats to validity
5 | Fair and clear experimental design (RCT) | < 0.2 | < 10% | Well-balanced on observables | No threats to validity
4 | Fair and clear experimental design (RCT, RDD) | < 0.3 | < 20% | |
3 | Well-matched comparison (quasi-experiment) | < 0.4 | < 30% | |
2 | Matched comparison (quasi-experiment) | < 0.5 | < 40% | |
1 | Comparison group with poor or no matching | < 0.6 | < 50% | |
0 | No comparator | > 0.6 | > 50% | Imbalanced on observables | Significant threats
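
A hypothetical sketch of the threshold logic in the table, for the two numeric columns only; the real EEF rating also weighs design, balance and threats to validity, and the combination rule here is an assumption:

```python
# Threshold bounds for padlocks 5 down to 1 (from the table above)
MDES_BOUNDS = [0.2, 0.3, 0.4, 0.5, 0.6]
ATTRITION_BOUNDS = [0.10, 0.20, 0.30, 0.40, 0.50]

def padlocks(mdes: float, attrition: float) -> int:
    """Highest rating whose MDES and attrition thresholds are both met."""
    def score(value, bounds):
        for rating, bound in zip(range(5, 0, -1), bounds):
            if value < bound:
                return rating
        return 0
    return min(score(mdes, MDES_BOUNDS), score(attrition, ATTRITION_BOUNDS))

print(padlocks(0.25, 0.15))  # MDES 0.25 and 15% attrition -> 4 padlocks
```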