The English RCT of ‘Families and Schools Together’


The English RCT of ‘Families and Schools Together’
Pippa Lord and Dr Ben Styles, NFER’s Education Trials Unit
RCTs in the Social Sciences 11th Annual Conference, 7th-9th September 2016

About this presentation
- Outline of the FAST programme
- FAST trial main design features
- Integrating the FAST programme and trial design
- Analytical considerations (sample size, power, dilution)
- Quasi-experimental design (QED)
- Conclusions for trials with this kind of theory of change (ToC)

Rationale
FAST is ready for an effectiveness trial in England, given:
- the existing US evidence base on softer outcomes
- meta-analyses of the effect of parental engagement on attainment
- the whole-school nature of FAST as applied in England

The FAST programme in England
- Founded in the US by Prof. Lynn McDonald in 1988
- UK licence holder: Middlesex University
- Delivered by Save the Children (SCUK) in UK primary schools
- Phase 1: partner training, family recruitment, Hub model
- Phase 2: 8-week programme (2.5 hours a week), graduation
- Phase 3: FASTworks (22 months)

FAST trial main features
- Effectiveness trial
- Cluster randomised (schools): between 60+60 and 80+80 intervention/control
- Primary outcome: attainment in English and maths at Key Stage 1 (EYFS as baseline)
- Secondary outcomes: measures from the Strengths and Difficulties Questionnaire (SDQ) at pre-, mid- and longer-term points
- QED: primary outcome, FAST sub-group and matched control sub-group
- Process evaluation

Integrating FAST and design
How to recruit schools?
- ‘Normal’ SCUK recruitment (School Agreements adapted for the trial)
- Sign-up consent from schools and all Y1 families (opt-out)
When to randomise?
- Before family recruitment (families’ participation is part of the programme)
- Before training (which involves considerable effort by schools and local partners)
Who to measure?
- FAST runs across EYFS and KS1; the trial measures Y1
- Invitations to families as normal, aiming for 1/3 of Y1 in each school
- SDQs are normally anonymous (opt-out consent required for this trial)
How to deliver the programme and trial to 80 schools?
- A blocked, termly design for delivery, randomisation and data collection

FAST programme delivery flow (flow diagram)
- Phase 1: school agreement, school recruitment, local partner training, family recruitment
- 8-week cycle: Week 0 (pre-questionnaires) through to Week 9 (post-questionnaires) and graduation
- Phase 3: FASTworks (22 months), feedback

FAST trial delivery flow, per block (flow diagram overlaying trial activities on the programme flow)
- School agreement, school consent, school recruitment
- Y1 parent consent, NFER pre-SDQs, pupil data, school pro-forma
- Randomise
- Local partner training, family recruitment (Phase 1)
- 8-week cycle: Week 0 (pre-questionnaires) through to Week 9 (post-questionnaires) and graduation
- NFER mid-point SDQs
- Phase 3: FASTworks (22 months), feedback
- KS1 and NFER end-point SDQs

Blocking
- Three overall blocks of between 40 and 60 schools each (with sub-blocks 1a, 1b, 2a, 2b, 3a, 3b for randomisation and delivery)
- 1st block: September 2015
- 2nd block: January 2016
- 3rd block: April 2016
- Summer 2017: attainment measures across all three blocks
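As a concrete illustration of randomising within sub-blocks, a minimal sketch follows. The sub-block, school identifiers and the simple 50:50 allocation are assumptions for illustration, not the trial's actual randomisation procedure.

```python
# Minimal sketch: allocating the schools in one sub-block to FAST or control.
# School IDs and the exact allocation mechanism are illustrative assumptions.
import random

def randomise_sub_block(school_ids, seed=None):
    """Allocate half of a sub-block's schools to FAST and half to control."""
    rng = random.Random(seed)
    shuffled = list(school_ids)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {school: ("FAST" if i < half else "control")
            for i, school in enumerate(shuffled)}

# Example: a hypothetical sub-block '1a' containing 20 schools
allocation = randomise_sub_block([f"school_{n:03d}" for n in range(20)], seed=1)
print(sum(arm == "FAST" for arm in allocation.values()), "of 20 schools allocated to FAST")
```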

Trial flow (per block)
- SCUK recruit between 120 and 160 schools (158 achieved, across the three blocks)
- Parental opt-out consent for data sharing (KS1 data)
- Upload of pupil UPNs and school baseline pro-forma to the NFER secure portal
- SDQ baseline administration by teachers
- RANDOMISATION
- FAST programme arm: observations of training and of the programme; mid-point SDQ; evaluation visits post-programme; FASTworks data collection and follow-up; KS1 testing, SDQ, pro-forma, feedback
- Control arm (business as usual): mid-point SDQ (£500); KS1 testing, SDQ, pro-forma (£1000), feedback

Process evaluation

Sample size
- FAST is considered a whole-school intervention
- Recruitment of families to attend the sessions is part of the intervention
- Spill-over to other families is intended
- The sample size calculation therefore takes into account an estimated ‘dilution’ of the effect
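One way to make the dilution point explicit is as a weighted average of effects across participating and non-participating families (a hedged sketch; the trial's own calculation may differ):

$$\mathrm{ES}_{\text{school}} = p \cdot \mathrm{ES}_{\text{FAST families}} + (1 - p) \cdot \mathrm{ES}_{\text{spill-over}}$$

where $p$ is the proportion of the year group whose families take part. With $p \approx 1/3$ and no spill-over, an effect of 0.45-0.50 among FAST families dilutes to roughly 0.15-0.17 at the school level, which is the scale of effect the trial must be powered to detect.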

Power curves for overall effect
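The power curves themselves are not reproduced in this transcript, but a minimal sketch of the kind of calculation behind them is shown below. The intra-cluster correlation, number of pupils measured per school and candidate effect size are illustrative assumptions, not the values used in the trial's power calculation.

```python
# Illustrative sketch: approximate power of a 1:1 cluster-randomised trial
# (schools randomised, no covariate adjustment) as the number of schools grows.
# ICC, pupils per school and the diluted effect size are assumed values.
from math import sqrt
from statistics import NormalDist

def cluster_power(n_schools, pupils_per_school, effect_size, icc, alpha=0.05):
    """Two-sided test of a standardised effect with equal allocation of schools."""
    z = NormalDist()
    variance = (icc + (1 - icc) / pupils_per_school) / (0.25 * n_schools)
    z_crit = z.inv_cdf(1 - alpha / 2)
    return z.cdf(effect_size / sqrt(variance) - z_crit)

for total_schools in (120, 140, 160):
    power = cluster_power(total_schools, pupils_per_school=30,
                          effect_size=0.17, icc=0.15)
    print(f"{total_schools} schools: power ≈ {power:.2f}")
```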

But…
- If a third of families volunteer and there is no spill-over, FAST families would need to experience an effect size of 0.45-0.50 for the diluted school-level effect to be detectable
- An effect of that size on the primary outcome is unlikely (although the EEF toolkit cites one study of parental involvement with ES = 0.6)
- Hence we include a slightly better quasi-experiment than normal

QED
- Comparison group: families in control schools that would have received FAST had their school been randomised to the intervention
- Using propensity score matching (PSM), select control Year 1 pupils on the basis of FSM, EYFS etc. (to ensure ‘common support’)
- Multi-level modelling using the same background factors, including the pre-test (EYFS)
- Post-test: KS1 (independently marked)
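The steps on this slide might look something like the sketch below: estimate a propensity score for FAST take-up, restrict control pupils to the region of common support, then fit a multi-level model of the KS1 outcome. The data frame, column names and the simplification of PSM to common-support trimming are all assumptions for illustration, not the trial's specified analysis.

```python
# Hedged sketch of the QED steps: propensity scores for FAST take-up, a
# common-support restriction on control pupils, then a multi-level model of
# KS1 scores with pupils nested in schools. All column names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
import statsmodels.formula.api as smf

def run_qed(pupils: pd.DataFrame):
    covariates = ["fsm", "eyfs_score"]  # background factors named on the slide

    # 1. Model take-up within intervention schools, then score all pupils
    intervention = pupils[pupils["trial_arm"] == "intervention"]
    take_up_model = LogisticRegression(max_iter=1000)
    take_up_model.fit(intervention[covariates], intervention["fast_participant"])

    fast = intervention[intervention["fast_participant"] == 1].copy()
    control = pupils[pupils["trial_arm"] == "control"].copy()
    fast["pscore"] = take_up_model.predict_proba(fast[covariates])[:, 1]
    control["pscore"] = take_up_model.predict_proba(control[covariates])[:, 1]

    # 2. Common support: keep control pupils whose scores overlap the FAST range
    lo, hi = fast["pscore"].min(), fast["pscore"].max()
    matched_controls = control[(control["pscore"] >= lo) & (control["pscore"] <= hi)]

    # 3. Multi-level model: treatment indicator plus the same background factors
    analysis = pd.concat([fast, matched_controls])
    model = smf.mixedlm("ks1_score ~ fast_participant + fsm + eyfs_score",
                        data=analysis, groups=analysis["school_id"])
    return model.fit()
```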

Sensitivity analysis for QED
- Targeting of families for FAST is not done using eligibility criteria
- We therefore miss the factor(s) that lead to a family being selected and agreeing to take part
- An unmeasured variable changing its correlation with the outcome by 0.1 can ‘remove’ an effect of 0.2 (Coe, 2009)
- Given the equivalence of the schools, will the QED results be more robust here?
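A toy simulation illustrates the underlying concern (this is not the Coe, 2009 calculation): if an unmeasured 'family engagement' variable drives both take-up and attainment, a comparison of participating families with non-participants shows an apparent effect even when the true effect is zero. All numbers below are invented for illustration.

```python
# Toy simulation of selection on an unobservable: an unmeasured 'engagement'
# variable raises both the chance of taking up FAST and the outcome, so a
# naive comparison shows an apparent effect despite a true effect of zero.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
engagement = rng.normal(size=n)                            # unmeasured confounder
takes_up = rng.random(n) < 1 / (1 + np.exp(-engagement))   # selection into FAST
true_effect = 0.0
outcome = 0.3 * engagement + true_effect * takes_up + rng.normal(size=n)

apparent = outcome[takes_up].mean() - outcome[~takes_up].mean()
print(f"Apparent effect with a true effect of zero: {apparent:.2f}")
```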

Conclusions
- We are at the more difficult end of the RCT ‘continuum of feasibility’
- Concessions were needed in design and delivery: blocking, and data security concerns
- The RCT will still allow robust conclusions about FAST effectiveness at the school level
- The ‘enhanced’ QED will allow conclusions about FAST effectiveness at the pupil level, with caveats

Acknowledgements
- Save the Children UK
- Education Endowment Foundation
- NFER Research Operations team: Dave Hereward and Asma Ullah
- Centre for Children and Families Research, Loughborough University

Contact us
Dr Ben Styles (Head of NFER’s Education Trials Unit): b.styles@nfer.ac.uk
Pippa Lord (FAST Trial Manager): p.lord@nfer.ac.uk