Evaluation design for Achieve Together
Ellen Greaves and Luke Sibieta
© Institute for Fiscal Studies

Achieve Together
Brings together three programmes in a school:
– Teach First
– Teaching Leaders
– Future Leaders
An intensive human capital investment.
The original motivation was also to encourage schools to work together and to engage the local community and organisations in school improvement.
→ Cluster design
→ Difficult to evaluate quantitatively
Evaluation and pilot funded by the Education Endowment Foundation (EEF).

Outline
– The original design of the evaluation
– What went wrong:
  – Design of the pilot
  – Recruitment (round 1)
  – Recruitment (round 2)
– Final design of the evaluation
– Lessons for evaluators

Achieve Together
Two pilots:
1. Area-based design
  – One cluster in Bournemouth: 4 primary schools and 6 secondary schools
  – Involvement of local community/organisations
  – Process evaluation
2. School-level human capital investment
  – School-level intervention
  – No co-ordination within clusters or involvement of external organisations
  – Quantitative evaluation and process evaluation

Original evaluation design
– Randomised controlled trial
– Number of schools fixed by the EEF: 24 treatment and 24 control (an illustrative assignment sketch follows below)
– Primary outcomes:
  – Attainment at KS4
  – Attainment at Year 7 (focus of the Achieve Together impact project)
– Secondary outcomes:
  – Number of persistent absentees
  – Overall absence rate
– Subgroups:
  – Pupils eligible for free school meals
  – Pupils with low prior attainment
– "Business as usual" in control schools: able to access one programme element of Achieve Together
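A minimal sketch of how a 1:1 school-level randomisation of this kind might be carried out. The school identifiers and the seed are hypothetical; the slides do not describe the actual assignment procedure:

```python
# Illustrative 1:1 randomisation of 48 recruited schools into 24 treatment
# and 24 control. Identifiers and seed are invented for the sketch.
import random

schools = [f"school_{i:02d}" for i in range(1, 49)]  # 48 recruited schools
rng = random.Random(2013)  # fixed seed so the allocation is reproducible
rng.shuffle(schools)

treatment, control = schools[:24], schools[24:]
print(len(treatment), "treatment schools;", len(control), "control schools")
```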

Power calculations
[The slide's table of minimum detectable effect sizes for Models 1–3 is not reproduced in the transcript.]
Note: these calculations give the effect size that it would be possible to detect using a two-sided hypothesis test at the 5% significance level with 80% power. Model 1 reports the minimum detectable effect size when 60% of the variance in the outcome is unexplained by pupil attributes (including prior attainment); Model 2 is a less optimistic scenario (70% unexplained), while Model 3 is more optimistic (50% unexplained). A worked example follows below. © Institute for Fiscal Studies
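For illustration, a minimal sketch of how minimum-detectable-effect-size (MDES) calculations of this kind can be run for a school-randomised trial. The intra-cluster correlation, the cohort size per school, and the simplification that covariates explain the same share of variance at both levels are assumptions for the sketch, not the evaluation's actual inputs:

```python
# Hedged sketch: normal-approximation MDES for a two-arm trial randomised
# across 48 schools (24 treatment, 24 control). The ICC and pupils-per-school
# values are illustrative assumptions.
from scipy.stats import norm

def mdes(n_schools, pupils_per_school, icc, unexplained,
         alpha=0.05, power=0.80, p_treat=0.5):
    """MDES in standard-deviation units for a school-randomised trial."""
    multiplier = norm.ppf(1 - alpha / 2) + norm.ppf(power)  # ~2.80 for 5%/80%
    # Variance of the treatment-effect estimate, assuming covariates leave
    # `unexplained` of the outcome variance at both school and pupil level.
    var = unexplained * (icc + (1 - icc) / pupils_per_school)
    var /= p_treat * (1 - p_treat) * n_schools
    return multiplier * var ** 0.5

for model, unexplained in [(1, 0.6), (2, 0.7), (3, 0.5)]:
    print(f"Model {model}: MDES ≈ {mdes(48, 150, icc=0.15, unexplained=unexplained):.2f} sd")
```

Under these invented inputs the MDES sits around a quarter of a standard deviation, which is consistent with the slides' remark that the required effect already looked ambitious.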

What went wrong: design of the pilot
The school-level RCT began to look clustered:
– Cluster-based recruitment
– Co-ordination between schools
This complicates and creates risks for the evaluation:
1. What can we learn from the evaluation?
  – Is a positive impact due to the human capital approach, or to better co-ordination between schools?
  – Our findings would be inconclusive.
2. How will the power calculations be affected? (See the sketch below.)
  – At the extreme, the unit of treatment becomes the cluster.
  – Uncertain risk for the minimum detectable effect size.
  – The required treatment effect from power calculations with clustering at the school level already looked ambitious.
  – Clustering may increase the intra-cluster correlation and make it harder to detect a significant effect.
© Institute for Fiscal Studies
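To see why treating the cluster as the unit of treatment threatens power, a rough comparison under the same kind of illustrative assumptions as the sketch above. The cluster size, cohort sizes, and correlations are all invented for the example:

```python
# Illustrative only: the price of effectively randomising 12 clusters of
# 4 schools instead of 48 individual schools. All values are assumptions.
from scipy.stats import norm

MULTIPLIER = norm.ppf(0.975) + norm.ppf(0.80)  # 5% two-sided test, 80% power

def mdes(n_units, icc, pupils_per_unit, unexplained=0.6):
    """Normal-approximation MDES when `n_units` units are randomised 1:1."""
    var = unexplained * (icc + (1 - icc) / pupils_per_unit) / (0.25 * n_units)
    return MULTIPLIER * var ** 0.5

# 48 schools of ~150 pupils (assumed school-level ICC 0.15) versus
# 12 clusters of ~600 pupils (assumed between-cluster ICC 0.10).
print("Randomise 48 schools:  MDES ≈", round(mdes(48, 0.15, 150), 2))
print("Randomise 12 clusters: MDES ≈", round(mdes(12, 0.10, 600), 2))
```

Under these invented numbers the minimum detectable effect rises by more than half, which is the uncertain risk the slide describes.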

What went wrong: recruitment (round 1)
Target: 48 schools. Recruited: 13.
Problems for recruitment:
– Time available
– Uncertainty about staff availability
– Uncertainty about school budgets (for a costly programme)
– Risk of being allocated to the control group
– Clarity about the pilot
The recruited schools began Achieve Together in September 2013.

What went wrong: recruitment (round 2)
Target: 48 schools. Recruited: 15.
Problems for recruitment:
– Time available
– Uncertainty about staff availability
– Uncertainty about school budgets (for a costly programme)
– Risk of being allocated to the control group
– Clarity about the pilot
The recruited schools will begin Achieve Together in September 2014.

Final evaluation design
Non-experimental: matching (a "well-matched comparison group") using schools that
1. are similar in terms of observable characteristics, and
2. expressed a strong interest in Achieve Together.
How credible are the non-experimental estimates? That depends on whether the factors determining take-up and growth in pupil attainment are observable or unobservable.
To assess the credibility of the non-experimental matching estimates, we compare the matching estimates for Achieve Together round 1 schools against a "gold standard" comparison group: Achieve Together round 2 schools, which are similar in both observable and unobservable characteristics. A matching sketch follows below. © Institute for Fiscal Studies
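A minimal sketch of nearest-neighbour matching on observables, in the spirit of this design. The covariate names, the distance metric, and matching with replacement are assumptions for illustration; the slides do not specify the actual matching procedure:

```python
# Illustrative nearest-neighbour matching on observable school characteristics.
# Column names and the matching rule are assumed for the sketch, not the
# evaluation's actual specification.
import pandas as pd

def match_schools(treated: pd.DataFrame, pool: pd.DataFrame, covariates):
    """For each treated school, pick the comparison school closest in
    standardised Euclidean distance on the chosen covariates."""
    # Standardise covariates on the pooled sample so each gets equal weight.
    combined = pd.concat([treated[covariates], pool[covariates]])
    mu, sd = combined.mean(), combined.std()
    t = (treated[covariates] - mu) / sd
    p = (pool[covariates] - mu) / sd
    matches = []
    for _, row in t.iterrows():
        dist = ((p - row) ** 2).sum(axis=1)  # squared distance to each pool school
        matches.append(dist.idxmin())       # matching with replacement, for simplicity
    return pool.loc[matches]

covs = ["prior_ks2_attainment", "fsm_share", "school_size"]  # assumed columns
# comparison = match_schools(round1_schools, interested_pool, covs)
```

In the actual design, credibility rests less on the algorithm than on restricting the comparison pool to schools that also expressed strong interest, which holds some unobservables (such as motivation) roughly constant.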

Final evaluation design
[This slide contrasted circumstances in which matching is likely to be credible with those in which it is unlikely to be credible; the detail is not reproduced in the transcript.]
© Institute for Fiscal Studies

Lessons for evaluators (1)
Evaluators must have good communication with the project team:
– How are plans for the pilot developing?
– What are the implications for the evaluation design?
– Why is the evaluation important?
Evaluators should be clear about the necessary requirements of the evaluation:
– What is expected of control schools? Restrictions on "business as usual".
– What is expected of treatment schools? Additional testing; involvement with the process evaluation.
– What are the non-negotiable elements of the evaluation?

Lessons for evaluators (2)
Recruitment can be difficult!
– What barriers does the evaluation impose, and can these be reduced?
Be creative:
– What evaluation design is feasible as circumstances change?
Be selective!
– What is the potential for a robust and informative evaluation?
– What are the risks to the evaluation?
© Institute for Fiscal Studies