Session 3: Analysis and reporting
Collecting data for cost estimates – Jack Worth (NFER)
Panel on EEF reporting and data archiving – Peter Henderson, Camilla Nevill, Steve Higgins and Andrew Bibby

Data collection for cost estimates
Jack Worth, NFER
EEF London Evaluators Workshop, 6th June 2014

Summary
– Reporting cost is an important part of evaluations
– Collecting the right information can be theoretically and practically challenging
– Evaluators should be sharing experiences and best practice

Effectiveness

Cost effectiveness

Costs to consider collecting
– Direct costs: how much would it cost a school or parent to buy the intervention?
  – how much it did cost may differ
– Indirect costs: do staff have to put in more hours than usual?
  – always think: what is the counterfactual?
– Needs to be quantitative information
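To make the counterfactual point concrete, here is a minimal sketch with invented figures (none of these numbers come from an EEF evaluation): the indirect cost counts only the staff time over and above what would have been spent anyway.

```python
# Hypothetical cost figures for one school; all numbers are invented for illustration.
direct_cost = 1200.0           # e.g. licence and materials the school has to buy
staff_hours_intervention = 40  # staff hours spent delivering the intervention
staff_hours_usual = 15         # hours those staff would have spent anyway (counterfactual)
hourly_staff_cost = 25.0       # assumed cost per staff hour

# Indirect cost is only the *additional* time over and above business as usual.
indirect_cost = (staff_hours_intervention - staff_hours_usual) * hourly_staff_cost

total_cost = direct_cost + indirect_cost
print(f"Total cost to the school: £{total_cost:,.2f}")  # £1,825.00
```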

Collecting cost information
– Impact or process evaluation?
  – cost effectiveness relates to impact...
  – ...but process methods, e.g. surveys and case studies, are often better suited to collecting cost data
– Planning and communication
  – across the evaluation project team
  – with the development partner

Reporting cost information
– Present the average cost to compare with average effectiveness
– Cost per pupil or per school?
  – may depend on the specific intervention
  – cost per pupil is comparable to other interventions
– Present the assumptions made and data sources used
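A similarly hedged sketch of the per-school versus per-pupil presentation, again with invented totals; dividing by pupils rather than schools is what makes the figure comparable across interventions of different sizes.

```python
# Hypothetical totals across all intervention schools in a trial; invented figures.
total_cost = 54_750.0   # summed direct + indirect costs
n_schools = 30
n_pupils = 1_825

cost_per_school = total_cost / n_schools   # £1,825.00
cost_per_pupil = total_cost / n_pupils     # £30.00 – comparable across interventions

print(f"Average cost per school: £{cost_per_school:,.2f}")
print(f"Average cost per pupil:  £{cost_per_pupil:,.2f}")
```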

Sharing best practice
– Recommend agreeing principles for a common approach among evaluators
Questions?

NFER provides evidence for excellence through its independence and insights, the breadth of its work, its connections, and a focus on outcomes.
National Foundation for Educational Research
The Mere, Upton Park, Slough, Berks SL1 2DQ

EEF reporting and data archiving
Peter Henderson (EEF)
Camilla Nevill (EEF)
Steve Higgins (Durham) – Chair
Andrew Bibby (FFT)

The reporting process and publication of results on EEF’s website
Peter Henderson (EEF)

Reporting process
– Evaluation team
– Dissemination team

Classifying the security of findings from EEF evaluations
Camilla Nevill (EEF)

Example Appendix: Chatterbooks

Combining the results of evaluations with the meta-analysis in the Teaching and Learning Toolkit
Steve Higgins (Durham)

Archiving EEF project data
Andrew Bibby (FFT)

Prior to archiving…
1. Include permission for linking and archiving in consent forms
2. Retain pupil identifiers
3. Label values and variables
4. Save syntax or do files
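As an illustration of points 2–4, a minimal pandas sketch of pre-archiving preparation; the file name, column names (e.g. `pupil_id`) and value codes are assumptions for illustration only, and the same steps could equally be written as SPSS syntax or a Stata do file.

```python
# Minimal pre-archiving preparation script (illustrative): keeping a script like
# this alongside the data is one way to meet the "save syntax or do files" point.
import pandas as pd

# File and column names below are assumptions for illustration only.
df = pd.read_csv("trial_outcomes_raw.csv")

# 2. Retain pupil identifiers so the archived data can later be linked.
assert "pupil_id" in df.columns, "keep the pupil identifier for linking"

# 3. Label values (replace bare codes) and variables (descriptive column names).
df["group"] = df["group"].map({0: "control", 1: "intervention"}).astype("category")
df = df.rename(columns={"pre_raw": "pretest_raw_score",
                        "post_raw": "posttest_raw_score"})

df.to_csv("trial_outcomes_for_archive.csv", index=False)
```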