Conference for EEF evaluators: Building evidence in education
Hannah Ainsworth, York Trials Unit, University of York
Professor David Torgerson, York Trials Unit, University of York
Professor Carole Torgerson, School of Education, Durham University
Session 3: Implementation

Trial registration and CONSORT (HA – 10 mins)
Trial management (HA – 30 mins):
»Model/approach (HA – 5 mins) – Discussion (5 mins)
»Protocol and other tools (HA – 5 mins) – Discussion (5 mins)
»Relationship with delivery partner (HA – 5 mins) – Discussion (5 mins)
Recruitment and retention (CT – 20 mins)

Trial registration: Register the trial with Current Controlled Trials at the outset, before beginning recruitment. You will be allocated an ISRCTN (International Standard Randomised Controlled Trial Number).

Why is it important to register trials? »Public knowledge »Reduce duplication »Increase opportunities for collaboration »Reduce selective reporting and over-reporting »Reduce publication bias

CONSORT: Conduct and report the trial to the standards of the CONSORT (Consolidated Standards of Reporting Trials) statement (www.consort-statement.org). What is CONSORT? Why is it important? How can it help?

CONSORT checklist

CONSORT flow diagram

CONSORT flow diagram for cluster trials. From Campbell MK, Piaggio G, Elbourne DR, Altman DG; for the CONSORT Group. CONSORT 2010 statement: extension to cluster randomised trials. BMJ 2012;345:e5661.
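Behind a cluster-trial flow diagram sits some simple book-keeping: counts of clusters and of individuals at each stage, kept separately for each arm. Below is a minimal sketch of how an evaluator might track these counts; the stage names follow the spirit of the CONSORT cluster extension, and all numbers and field names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ArmFlow:
    """Per-arm counts needed to populate a cluster-trial CONSORT flow diagram."""
    clusters_randomised: int
    clusters_receiving_allocation: int
    pupils_at_baseline: int
    clusters_lost_to_follow_up: int
    pupils_analysed: int

# Hypothetical counts for a two-arm trial of 24 schools.
intervention = ArmFlow(12, 11, 144, 1, 129)
control = ArmFlow(12, 12, 144, 0, 140)

for label, arm in (("Intervention", intervention), ("Control", control)):
    print(f"{label}: {arm.clusters_randomised} schools randomised, "
          f"{arm.pupils_analysed}/{arm.pupils_at_baseline} pupils analysed")
```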

Trial management: trial management approach/model; developing a trial protocol and other trial management tools; relationship with delivery partner.

Trial management model/approach. Current EEF model: a light-touch approach to trial management, as the delivery partner often takes responsibility for many of the 'usual' trial management tasks. Think about everything you would normally do or take responsibility for as trial manager; document this and share it with the delivery partner. Offer advice and guide the process.

Discussion: What are YOUR experiences of trial management within EEF evaluations? What are the advantages and disadvantages of the current model/approach? Possible solutions?

Trial protocol and other tools: Develop a trial protocol as the evaluation team. Discuss and develop the trial protocol with the delivery partner. Produce clear timeframes/deadlines that both the evaluation team and the delivery partner can work to.

Produce evaluation diagrams. Example trial flow:
Primary schools n = 24; secondary schools n = 3; children in target group n = 288 (based on an average of 12 children per school)
Baseline data collection: information on all Year 6 pupils, including Key Stage 2 English Teacher Assessments, from Dec 2012
Primary school randomisation:
»Intervention group: primary schools N = 12 – intervention in Year 6, continued intervention in Year 7 in secondary schools
»Control group: primary schools N = 12 – no intervention
Follow-up data collection (Dec 2013): Progress in English 11 (long form), conducted in secondary school
Long-term follow-up: routine test results and pupil characteristics recorded in the National Pupil Database
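As an illustration of the school-level allocation shown in a diagram like this, here is a minimal sketch of simple 1:1 cluster randomisation of 24 schools. The school identifiers, fixed seed and function name are hypothetical; a real EEF trial would follow the randomisation procedure (often stratified or minimised) specified in its protocol.

```python
import random

def randomise_schools(school_ids, n_intervention, seed=2012):
    """Simple 1:1 cluster randomisation of schools to intervention or control.

    A fixed, documented seed keeps the allocation reproducible and auditable.
    """
    rng = random.Random(seed)
    shuffled = list(school_ids)   # copy so the input order is left untouched
    rng.shuffle(shuffled)
    intervention = sorted(shuffled[:n_intervention])
    control = sorted(shuffled[n_intervention:])
    return intervention, control

# Hypothetical identifiers for the 24 primary schools in the example diagram.
schools = [f"PS{i:02d}" for i in range(1, 25)]
intervention, control = randomise_schools(schools, n_intervention=12)
print("Intervention schools (n = 12):", intervention)
print("Control schools      (n = 12):", control)
```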

Clear timeframes

Provide clear information: Help the delivery partner develop information for schools, parents and children. Work with the delivery partner to ensure clear instructions are given to schools. Help the delivery partner develop school agreement documents. Help develop parent consent and opt-out forms.

Discussion: Has the trial protocol been a useful shared document? What other tools have YOU used to aid the process?

Relationship with delivery partner: The intervention developer has a lot invested in the intervention, while the evaluator must remain in equipoise, so this can be a challenging relationship. Try to explain that it is important that you remain impartial. Refrain from voicing your own opinions about the intervention – let the research speak for itself.

Relationship with delivery partner: Manage expectations. Be clear from the outset who is responsible for what. Be clear from the outset what data you will require, when you will require it, and in what format. Provide clear instructions for secure data transfer.
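One way to be clear about the data you require, when and in what format, is to agree a simple written data specification with the delivery partner. The sketch below is purely illustrative: the field names, deadlines and checking function are hypothetical, not an EEF or York Trials Unit template.

```python
# Hypothetical data specification agreed with the delivery partner up front.
PUPIL_DATA_SPEC = {
    "pupil_id":        "anonymised ID agreed with the evaluation team",
    "school_id":       "must match the randomisation list",
    "date_of_birth":   "ISO format, YYYY-MM-DD",
    "ks2_english_ta":  "baseline Key Stage 2 English teacher assessment",
    "post_test_score": "post-test score, integer",
}

TRANSFER_DEADLINES = {
    "baseline data": "December 2012",
    "post-test data": "December 2013",
}

def missing_fields(row: dict) -> list:
    """Return the specified fields absent from a submitted data row."""
    return [field for field in PUPIL_DATA_SPEC if field not in row]

print(missing_fields({"pupil_id": "P001", "school_id": "PS01"}))
# -> ['date_of_birth', 'ks2_english_ta', 'post_test_score']
```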

Discussion: What are YOUR experiences of the relationship with the delivery partner? How can challenges be overcome?

Recruitment and retention (CT): Randomisation ensures the absence of selection bias at the point of allocation, but selection bias can still be introduced during recruitment or through attrition.

Recruitment bias. Potential sources: »Developer-led recruitment »Timing »Randomisation of clusters before recruitment of individuals »Teacher not linked to class before randomisation. Possible solutions: »Evaluators fully involved in the recruitment process »Randomise only after recruitment of clusters and of individuals within clusters »Ensure teachers are linked to classes before randomisation

Attrition bias. Attrition after randomisation can introduce bias: »Those who leave a trial tend to be different from those who remain in it »Unequal (differential) attrition between groups is particularly worrying »All efforts must be made to retain participants for post-tests after randomisation, even if they do not receive the intervention
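A small simulation illustrates why differential attrition is worrying; all the numbers are invented for illustration only. Two arms are drawn from the same outcome distribution (the true effect is zero), but if the lowest-scoring intervention pupils are lost before the post-test, a naive comparison of completers shows a spurious positive effect.

```python
import random
import statistics

random.seed(1)

# 300 pupils per arm, identical outcome distributions: the true effect is zero.
control = [random.gauss(100, 15) for _ in range(300)]
intervention = [random.gauss(100, 15) for _ in range(300)]

# Differential attrition: the 20% lowest-scoring intervention pupils miss the post-test.
intervention_completers = sorted(intervention)[60:]

effect_full = statistics.mean(intervention) - statistics.mean(control)
effect_completers = statistics.mean(intervention_completers) - statistics.mean(control)

print(f"Estimated effect with full follow-up: {effect_full:+.2f}")        # close to 0
print(f"Estimated effect, completers only:    {effect_completers:+.2f}")  # spuriously positive
```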

Example of attrition bias:
Random allocation: 160 children (8 from each school)
»76 children allocated to the control group
»76 children allocated to the intervention group
»1 school (8 children) withdrew
»n = 17 children replaced following discussion with teachers

Discussion: Discuss any issues you have experienced with developer-led recruitment.