How to give research results back to teachers?

Presentation transcript:

How to give research results back to teachers?
Professor Steve Higgins, School of Education, Durham University
s.e.higgins@durham.ac.uk | @profstig
3rd meeting of the Nordic Forum, November 10th, 2016
Danmarks Pædagogiske Universitet, Emdrup

Overview
- The Sutton Trust – Education Endowment Foundation Toolkit
- A model for research communication and use
- Some tensions and limitations
- Future developments

What does the Education Endowment Foundation do?
Two aims:
1. Break the link between family income and school attainment
2. Build the evidence base on the most promising ways of closing the attainment gap
The approach:
- Summarise existing evidence
- Test new ideas
- Share findings

A Model for Effective Research Communication and Use
Some necessary conditions for effective research communication and use:
- Accurate in terms of research findings and the probability of benefit (internal and external validity)
- Accessible in terms of getting hold of the evidence and understanding it (external and internal)
- Applicable to a specific context (age, phase, subject/content, etc.) and level of use (practitioner, manager, policy maker)
- Acceptable: fits with teachers' understanding and beliefs about what will bring about improvement
- Appropriate to context (a good solution to a real problem)
- Actionable: practical and realistic, with tools/scaffolding for implementation, retaining the causal pathway

Sutton Trust/EEF Teaching & Learning Toolkit
- Best 'buys' on average, from research
- Key messages for Pupil Premium spending
- Currently used by over 60% of schools
http://educationendowmentfoundation.org.uk/toolkit

What we tried to do
- Summarise the evidence from meta-analyses about the impact of different strategies on learning (tested attainment): a series of related 'umbrella' reviews
  - As found in research studies; these are averages
- Apply quality criteria to evaluations: rigorous designs only
- Estimate the size of the effect: the Standardised Mean Difference, expressed as 'months of gain', on tested attainment only (see the sketch below)
- Estimate the costs of adopting each approach (information not always available)
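For reference, the standardised mean difference in its common (Cohen's d) form — a sketch of the standard definition, not necessarily the exact formula used for every Toolkit strand:

```latex
% Standardised mean difference: difference in group means expressed in
% units of the pooled standard deviation.
d = \frac{\bar{X}_{\mathrm{int}} - \bar{X}_{\mathrm{ctl}}}{s_{\mathrm{pooled}}},
\qquad
s_{\mathrm{pooled}} = \sqrt{\frac{(n_{\mathrm{int}}-1)\,s_{\mathrm{int}}^{2}
                                + (n_{\mathrm{ctl}}-1)\,s_{\mathrm{ctl}}^{2}}
                               {n_{\mathrm{int}} + n_{\mathrm{ctl}} - 2}}
```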

Overview and aims
- Basic cost-benefit analysis of educational interventions and approaches, based on cost-effectiveness estimates for a range of approaches
- Average effects from meta-analyses (or other quantitative estimates), plus an estimate of the additional outlay to implement
- Evidence robustness estimates shown as 'padlocks'
- To inform professional decision-making about spending/resource allocation
- To create a framework for evidence synthesis and evidence transactions
- To provide a structure for refinement and improvement
(A rough illustration of the cost-effectiveness comparison follows.)
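A minimal sketch of the kind of comparison this enables (not EEF code; the strand names are real Toolkit strands, but the months and cost figures are placeholders of my own, not the Toolkit's published estimates):

```python
# Illustrative cost-effectiveness ranking: months of additional progress
# per pound spent, as in the Toolkit's value-for-money view.
strands = {
    # name: (months of additional progress, cost per pupil per year, GBP)
    "Feedback": (8, 80),
    "Peer tutoring": (6, 200),
    "Teaching assistants": (1, 700),
}

# Sort by months gained per pound, highest value for money first.
for name, (months, cost) in sorted(
    strands.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True
):
    print(f"{name}: {months} months gain at £{cost}/pupil "
          f"= {months / cost * 100:.2f} months per £100")
```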

EEF Project Best bets (on average)

Good bets (on average)

High risk (on average)

Early years version

Summaries
Each Toolkit strand summary covers:
- What is it?
- How effective is it?
- How secure is the evidence?
- What are the costs?
- What should I consider?
Plus: printable summary, technical appendix, further reading, case studies/video, related EEF projects.
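Read as a data structure, each strand summary might look like the record below (a sketch; the field names are mine and the values are illustrative, not the Toolkit's published figures):

```python
from dataclasses import dataclass, field

@dataclass
class StrandSummary:
    name: str
    what_is_it: str
    months_progress: int        # "How effective is it?"
    evidence_rating: str        # "How secure is the evidence?" (padlocks)
    cost_rating: str            # "What are the costs?"
    considerations: list[str] = field(default_factory=list)

# Illustrative example only.
feedback = StrandSummary(
    name="Feedback",
    what_is_it="Information given to learners about their performance",
    months_progress=8,
    evidence_rating="Moderate",
    cost_rating="Very low",
    considerations=["What should I consider? e.g. quality of feedback"],
)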

Technical Appendices
- Definition
- Search terms
- Evidence rating
- Additional cost information
- References
- Summary of effects
- Abstracts of meta-analyses

Impact as months' progress

Months' progress | Effect size from | Effect size to | Description
0  | -0.01 | 0.01 | Very low or no effect
1  | 0.02  | 0.09 | Low
2  | 0.10  | 0.18 | Low
3  | 0.19  | 0.26 | Moderate
4  | 0.27  | 0.35 | Moderate
5  | 0.36  | 0.44 | Moderate
6  | 0.45  | 0.52 | High
7  | 0.53  | 0.61 | High
8  | 0.62  | 0.69 | High
9  | 0.70  | 0.78 | Very high
10 | 0.79  | 0.87 | Very high
11 | 0.88  | 0.95 | Very high
12 | 0.96  | >1.0 | Very high

In the Toolkit we have equated school progress in months to effect size as a crude but meaningful equivalent. We have assumed that a year of progress is about equivalent to one standard deviation per year, which corresponds with Glass' observation that "the standard deviation of most achievement tests in elementary school is 1.0 grade equivalent units; hence the effect size of one year's instruction at the elementary school level is about +1" (Glass, 1981: 103). However, it is important to note that the correspondence of one standard deviation to one year's progress can vary considerably across ages and types of test. For example, differences in effect sizes tend to reduce with age, so we provide a more conservative estimate for young learners.

Where the data are available, a weighted mean is used, calculated by weighting each meta-analysis according to its variance, i.e. the reciprocal of the square of its standard error (Borenstein et al. 2010). Where the data needed for this are not available, an estimate is given based on the available evidence and a judgement about the most applicable estimate to use (such as the impact on disadvantaged pupils, or the most rigorous of the available meta-analyses). Where no meta-analyses of educational interventions in a given area could be found, an effect size is estimated from correlational studies or large-scale studies investigating the relationship under review.
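A minimal sketch of these two steps in Python (an assumed illustration, not EEF's actual code): an inverse-variance weighted mean across meta-analyses (weight = 1/SE², after Borenstein et al. 2010), then conversion of the pooled effect size to months' progress using the table above.

```python
# (upper ES bound, months' progress) pairs, taken from the table above.
MONTHS_BANDS = [
    (0.01, 0), (0.09, 1), (0.18, 2), (0.26, 3), (0.35, 4), (0.44, 5),
    (0.52, 6), (0.61, 7), (0.69, 8), (0.78, 9), (0.87, 10), (0.95, 11),
]

def weighted_mean_es(estimates):
    """Inverse-variance weighted mean of (effect size, standard error) pairs."""
    weights = [1 / se ** 2 for _, se in estimates]
    total = sum(w * es for (es, _), w in zip(estimates, weights))
    return total / sum(weights)

def months_progress(es):
    """Map an effect size onto the months-of-progress scale.

    Effects below -0.01 are clipped to 0 months here for simplicity;
    the Toolkit itself can report negative months for negative effects.
    """
    for upper_bound, months in MONTHS_BANDS:
        if es <= upper_bound:
            return months
    return 12  # effect sizes of 0.96 and above

# Three illustrative meta-analyses of one strand: (effect size, SE).
pooled = weighted_mean_es([(0.40, 0.10), (0.30, 0.08), (0.55, 0.15)])
print(f"pooled ES = {pooled:.2f}, about {months_progress(pooled)} months")
```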

Cost-effectiveness

Cost | Description
Very low | Up to about £2,000 per year per class of 25 pupils, or less than £80 per pupil per year
Low | £2,001 to £5,000 per year per class of 25 pupils, or up to about £200 per pupil per year
Moderate | £5,001 to £18,000 per year per class of 25 pupils, or up to about £700 per pupil per year
High | £18,001 to £30,000 per year per class of 25 pupils, or up to £1,200 per pupil per year
Very high | Over £30,000 per year per class of 25 pupils, or over £1,200 per pupil per year

Speaker notes: Cost information is very helpful for schools: an intervention that seems to have a large effect size may or may not be cost-effective and easy to apply in schools, and it is equally useful to know when an intervention with a small effect size is really cheap to administer. Estimates are based on a class of 25 pupils. Where an approach does not require an additional resource, estimates are based on the cost of any training or professional development required to establish the new practice.
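The per-pupil thresholds above amount to a simple lookup; a sketch (the function name is mine, the thresholds are from this slide):

```python
def cost_rating(per_pupil_cost_gbp):
    """Map an annual per-pupil cost (GBP) to the Toolkit's cost band."""
    if per_pupil_cost_gbp < 80:
        return "Very low"
    if per_pupil_cost_gbp <= 200:
        return "Low"
    if per_pupil_cost_gbp <= 700:
        return "Moderate"
    if per_pupil_cost_gbp <= 1200:
        return "High"
    return "Very high"

print(cost_rating(150))  # "Low"
```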

Evidence Assessment

Rating | Description
Very limited | Quantitative evidence of impact from single studies, with effect size data reported or calculable; no systematic reviews with quantitative data or meta-analyses located
Limited | At least one meta-analysis or systematic review with quantitative evidence of impact on attainment or on cognitive or curriculum outcome measures
Moderate | Two or more rigorous meta-analyses of experimental studies of school-age students with cognitive or curriculum outcome measures
Extensive | Three or more meta-analyses from well-controlled experiments, mainly undertaken in schools, using pupil attainment data, with some exploration of the causes of any identified heterogeneity
Very extensive | Consistent high-quality evidence from at least five robust and recent meta-analyses where the majority of the included studies have good ecological validity and the outcome measures include curriculum measures or standardised tests in school subject areas

Speaker notes: As mentioned earlier, alongside the months' progress and the costs we report the level of evidence quality for each strand, depending on the studies found. There are two Toolkits: the main Toolkit for ages 5 to 18, and the Early Years Toolkit for ages 3 to 5, presented on the next slides.
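As a heavily simplified sketch, the count dimension of these ratings could be encoded as below; in practice the padlock rating also rests on judgements about rigour, recency, ecological validity and consistency that a simple count cannot capture. The function and its inputs are illustrative assumptions, not EEF's procedure.

```python
def evidence_rating(n_meta_analyses, single_studies_only=False):
    """Very rough count-based approximation of the padlock scale above."""
    if single_studies_only or n_meta_analyses == 0:
        return "Very limited"
    if n_meta_analyses >= 5:
        return "Very extensive"  # also requires consistent, high-quality evidence
    if n_meta_analyses >= 3:
        return "Extensive"
    if n_meta_analyses == 2:
        return "Moderate"
    return "Limited"
```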

Accurate
Plus:
- Based on meta-analysis and aggregation of findings
- Identifying patterns of effects "on average"
- Communicates comparative benefit
Minus:
- Assumes even bias across fields
- Dependent on the scope and quality of the underlying meta-analyses
- Conversion to months' progress over-simplifies
- Pedagogic and analytic heterogeneity prevent more precise estimates

Key issues
- The Toolkit does not provide definitive claims of 'what works', BUT attempts to give a best estimate of what has worked (i.e. of what is likely to be beneficial based on existing evidence)
- Caution is needed, since an intervention applied in a new context may not be as effective
- RCTs depend on 'average treatment effects' on a theoretical population:
  - the causal mechanism may not be identified
  - the impact of researcher-led interventions may differ from school-led ones
  - an intervention needs to be a solution to a problem to increase the probability of benefit
- There is a lack of a clear causal link between general additional spending and learning

Accessible
Plus:
- Comparative, simplified layout with impact, cost and evidence indicators
- Website: 12k users per month
- Layers and links provide increasing detail and justification
Minus:
- May encourage simplistic interpretation
- Hard to develop deeper engagement
- Tension between accessibility and accuracy

http://educationendowmentfoundation.org.uk/toolkit

Overview of value for money
[Chart: Toolkit strands plotted by effect size (potential months' gain, up to about 1.0) against cost per pupil (£0 to £1,000), grouped into three zones: "Promising" (e.g. Feedback, Meta-cognition), "Could be worth it" (e.g. EY intervention, Peer tutoring, Homework (Secondary), 1-1 tutoring, Summer schools, Digital technology, Phonics, Smaller classes, Parental involvement, After school) and "Needs careful thought" (e.g. Individualised learning, Teaching assistants, Performance pay, Ability grouping).]

Use of strands
- Are practitioners using the Toolkit?
- Which strands are consulted most frequently?
- What drives engagement?
Sources: Google Analytics; online reports from schools.

Google Analytics: a traffic-analysis tool that tracks and reports website traffic.

Toolkit access over time
[Chart: page views over a 3-month period.]

Views per month per strand
[Chart: number of unique page views of each strand's homepage, by months on the website.]

Requirement to report Pupil Premium spending
[Examples of schools' Pupil Premium allocations: £133,320, £235,620, £175,770.]

Example of school reporting http://www.mornington.notts.sch.uk/index.php/pupil-premium-policy-4

Applicable
Plus:
- Patterns are similar for the Early Years and school Toolkits
- Meta-analyses tend to combine different ages and contexts
Minus:
- Averages of averages: a good general bet, but not age-specific
- Tends to focus on pedagogical solutions, not subject- or curriculum-specific ones

Acceptable
- Has to fit with teachers' beliefs about what they think will 'work'
- Has to challenge current practice to bring about successful change (a "zone of proximal professional development", ZPPD)
Plus:
- Range of options in the Toolkit 'menu'
Minus:
- Acceptable solutions may not be optimal

Appropriate
- To context (learners' needs) and to organisational and individual capability (school and teachers)
Plus:
- Set in a professional 'expertise' model
Minus:
- Identifying 'fit' is often problematic (external validity)
- Needs professional diagnosis and judgement

Actionable
- Has to be practical and manageable
- Has to retain (or improve) the causal pathway
Plus:
- The aim is to share both the what and the why
Minus:
- RCTs and other evidence often only provide a warrant for the what
- Meta-analyses confirm general approaches, not specifics

EEF approach & the Toolkit
[Diagram: the EEF approach and the Toolkit at the centre, linked to commissioning partnerships (DfE, Northern Rock Foundation, Wellcome Trust, ESRC), research commissioning, systematic searching and regular updates, research use trials, synthesis of findings, practitioner engagement, policy influence, international partnerships, and campaigns (website, evaluation guide, conferences).]

Australian version
- Global content and structure
- Local research & examples
- Local costs

Current developments
- Formalising the methodology (translating/adapting existing models):
  - Cochrane/Campbell/EPPI approaches
  - PRISMA for reviews
  - CONSORT for trials
  - GRADE guidelines for evidence
- New comparable and updatable meta-analyses for each strand
- Identifying factors affecting current effect size estimates:
  - design (sample size, randomisation, clustering)
  - measurement issues (outcome complexity, outcome alignment)
  - intervention (duration, intensity)
- International partnerships:
  - Australia: 3 RCTs commissioned
  - Chile

A Model for Effective Research Communication and Use
Some necessary conditions for effective research communication and use:
- Accurate in terms of research findings and the probability of benefit (internal and external validity)
- Accessible in terms of getting hold of the evidence and understanding it (external and internal)
- Applicable to a specific context (age, phase, subject/content, etc.) and level of use (practitioner, manager, policy maker)
- Acceptable: fits with teachers' understanding and beliefs about what will bring about improvement
- Appropriate to context (a good solution to a real problem)
- Actionable: practical and realistic, with tools/scaffolding for implementation, retaining the causal pathway
Too much alliteration? AND with Authentic Application it Augments current capability or supersedes less effective practice.

Toolkit tensions
Accurate · Accessible · Applicable · Acceptable · Appropriate · Actionable