How to give research results back to teachers?


1 How to give research results back to teachers?
Professor Steve Higgins, School of Education, Durham University (@profstig)
3rd meeting of the Nordic Forum, November 10th, 2016
Danmarks Pædagogiske Universitet, Emdrup

2 Overview
- The Sutton Trust – Education Endowment Foundation Toolkit
- A model for research communication and use
- Some tensions and limitations
- Future developments

3 What does The Education Endowment Foundation do?
Two aims:
1. Break the link between family income and school attainment
2. Build the evidence base on the most promising ways of closing the attainment gap
The approach: summarise existing evidence, test new ideas, share findings

4 A Model for Effective Research Communication and Use
Some necessary conditions for effective research communication and use:
- Accurate in terms of research findings and the probability of benefit (internal and external validity)
- Accessible in terms of getting hold of the evidence and understanding it (external and internal)
- Applicable to a specific context (age, phase, subject/content etc.) and level of use (practitioner, manager, policy maker)
- Acceptable: fits with teachers’ understanding and beliefs about what will bring about improvement
- Appropriate to context (a good solution to a real problem)
- Actionable: practical and realistic, with tools/scaffolding for implementation, retaining the causal pathway

5 Sutton Trust/EEF Teaching & Learning Toolkit
- Best ‘buys’ on average from research
- Key messages for Pupil Premium spending
- Currently used by over 60% of schools

6 What we tried to do
- Summarise the evidence from meta-analyses about the impact of different strategies on learning (tested attainment) – a series of related ‘umbrella’ reviews: these are averages, as found in research studies
- Apply quality criteria to evaluations: rigorous designs only
- Estimate the size of the effect: Standardised Mean Difference = ‘months of gain’, on tested attainment only
- Estimate the costs of adopting: information not always available

7 Overview and aims
- Basic cost-benefit analysis of educational interventions and approaches, based on cost-effectiveness estimates of a range of approaches: average effects from meta-analyses (or other quantitative estimates) together with an estimate of the additional outlay to implement
- Evidence robustness estimates shown as ‘padlocks’
- To inform professional decision-making about spending/resource allocation
- To create a framework for evidence synthesis and evidence transactions
- To provide a structure for refinement and improvement

8 EEF Project Best bets (on average)

9 Good bets (on average)

10 High risk (on average)

11 Early years version

12 Summaries
- What is it?
- How effective is it?
- How secure is the evidence?
- What are the costs?
- What should I consider?
- Printable summary
- Technical appendix
- Further reading
- Case studies/video
- Related EEF projects

13 Technical Appendices
- Definition
- Search terms
- Evidence rating
- Additional cost information
- References
- Summary of effects
- Abstracts of meta-analyses

14 Impact as months’ progress
Months' progress   Effect size (from … to)   Description
0                  -0.01 to 0.01             Very low or no effect
1                   0.02 to 0.09             Low
2                   0.10 to 0.18             Low
3                   0.19 to 0.26             Moderate
4                   0.27 to 0.35             Moderate
5                   0.36 to 0.44             Moderate
6                   0.45 to 0.52             High
7                   0.53 to 0.61             High
8                   0.62 to 0.69             High
9                   0.70 to 0.78             Very high
10                  0.79 to 0.87             Very high
11                  0.88 to 0.95             Very high
12                  0.96 to >1.0             Very high

In the Toolkit we have equated school progress in months to effect size as a crude but meaningful equivalent. We have assumed that a year of progress is about equivalent to one standard deviation per year, which corresponds with Glass’ observation that “the standard deviation of most achievement tests in elementary school is 1.0 grade equivalent units; hence the effect size of one year’s instruction at the elementary school level is about +1” (Glass, 1981: 103). However, it is important to note that the correspondence of one standard deviation to one year’s progress can vary considerably for different ages and types of test. For example, differences in effect sizes tend to reduce with age, so we provide a more conservative estimate for young learners. Where the data are available, a weighted mean is used, based on calculating a weight for each meta-analysis according to its variance, i.e. the reciprocal of the square of the standard error (Borenstein et al. 2010). Where the data for this are not available, an estimate is given based on the available evidence and a judgement about the most applicable estimate to use (such as the impact on disadvantaged pupils, or the most rigorous of the available meta-analyses). Where no meta-analyses of educational interventions in a given area could be found, an effect size is estimated from correlational studies or large-scale studies investigating the relationship under review.
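The two calculations described above fit in a few lines of Python. This is a minimal sketch, not the Toolkit's actual code: the band boundaries come from the conversion table, the weighting follows the reciprocal-of-squared-standard-error rule cited from Borenstein et al. (2010), and the function names and example figures are invented for illustration.

```python
# Minimal sketch of the Toolkit's two core calculations (illustrative only).

# Months-of-progress bands as (lower effect-size bound, months) pairs,
# taken from the conversion table above.
BANDS = [
    (-0.01, 0), (0.02, 1), (0.10, 2), (0.19, 3), (0.27, 4),
    (0.36, 5), (0.45, 6), (0.53, 7), (0.62, 8), (0.70, 9),
    (0.79, 10), (0.88, 11), (0.96, 12),
]

def months_progress(effect_size):
    """Map a standardised mean difference to months of progress."""
    months = 0
    for lower, m in BANDS:
        if effect_size >= lower:
            months = m
    return months

def weighted_mean_effect(effects, standard_errors):
    """Inverse-variance weighted mean across meta-analyses: each weight
    is the reciprocal of the squared standard error (Borenstein et al. 2010)."""
    weights = [1.0 / se ** 2 for se in standard_errors]
    total = sum(w * es for w, es in zip(weights, effects))
    return total / sum(weights)

# Hypothetical example: three meta-analyses of the same approach.
pooled = weighted_mean_effect([0.40, 0.55, 0.30], [0.05, 0.10, 0.08])
print(f"Pooled ES = {pooled:.2f}, about {months_progress(pooled)} months' progress")
```

Note how the weighting rewards precision: the first meta-analysis, with the smallest standard error, dominates the pooled estimate.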

15 Cost-effectiveness

Cost        Description
Very low    Up to about £2,000 per year per class of 25 pupils, or less than £80 per pupil per year
Low         £2,001 to £5,000 per year per class of 25 pupils, or up to about £200 per pupil per year
Moderate    £5,001 to £18,000 per year per class of 25 pupils, or up to about £700 per pupil per year
High        £18,001 to £30,000 per year per class of 25 pupils, or up to £1,200 per pupil per year
Very high   Over £30,000 per year per class of 25 pupils, or over £1,200 per pupil per year

Cost information helps schools judge whether an intervention with a seemingly large effect size is actually cost-effective and easy to apply; equally, an intervention with a small effect size that is cheap to administer is useful for schools to know about. Estimates are based on a class of 25 pupils. Where an approach does not require an additional resource, estimates are based on the cost of any training or professional development needed to establish the new practice.
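For completeness, the per-pupil thresholds above translate directly into a lookup, sketched here with the same caveats (illustrative function name, thresholds read off the table):

```python
def cost_band(cost_per_pupil_per_year):
    """Map an annual per-pupil cost in GBP to a Toolkit cost band,
    using the per-pupil thresholds from the table above."""
    if cost_per_pupil_per_year < 80:
        return "Very low"
    if cost_per_pupil_per_year <= 200:
        return "Low"
    if cost_per_pupil_per_year <= 700:
        return "Moderate"
    if cost_per_pupil_per_year <= 1200:
        return "High"
    return "Very high"

print(cost_band(150))  # -> "Low"
```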

16 Evidence Assessment

Rating          Description
Very limited    Quantitative evidence of impact from single studies, but with effect size data reported or calculable. No systematic reviews with quantitative data or meta-analyses located.
Limited         At least one meta-analysis or systematic review with quantitative evidence of impact on attainment or cognitive or curriculum outcome measures.
Moderate        Two or more rigorous meta-analyses of experimental studies of school-age students with cognitive or curriculum outcome measures.
Extensive       Three or more meta-analyses from well-controlled experiments, mainly undertaken in schools, using pupil attainment data, with some exploration of causes of any identified heterogeneity.
Very extensive  Consistent high-quality evidence from at least five robust and recent meta-analyses, where the majority of the included studies have good ecological validity and the outcome measures include curriculum measures or standardised tests in school subject areas.

As mentioned earlier, alongside months' progress and cost we report the level of evidence security for each strand, depending on the studies found. There are two Toolkits: the main Toolkit for ages 5 to 18, and an Early Years version for ages 3 to 5, presented on the following slides.
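Much of this rating rests on reviewer judgement ('rigorous', 'well-controlled', 'good ecological validity'), so it cannot be fully automated. The sketch below encodes only the countable criteria and reduces the judgement calls to flags; it illustrates the shape of the decision rule and is not EEF code.

```python
def evidence_rating(n_meta_analyses, n_single_studies=0,
                    rigorous=False, consistent_recent_high_quality=False):
    """Illustrative decision rule for the 'padlock' evidence rating.
    Only the countable criteria are encoded; qualities such as
    'rigorous' or 'good ecological validity' are reduced to flags
    that a reviewer must judge."""
    if n_meta_analyses >= 5 and consistent_recent_high_quality:
        return "Very extensive"
    if n_meta_analyses >= 3 and rigorous:
        return "Extensive"
    if n_meta_analyses >= 2 and rigorous:
        return "Moderate"
    if n_meta_analyses >= 1:
        return "Limited"
    if n_single_studies >= 1:
        return "Very limited"
    return "No quantitative evidence located"

print(evidence_rating(2, rigorous=True))  # -> "Moderate"
```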

17 Accurate
Plus:
- Based on meta-analysis and aggregation of findings
- Identifies patterns of effects “on average”
- Communicates comparative benefit
Minus:
- Assumes even bias across fields
- Dependent on scope and quality of underlying meta-analyses
- Conversion to months’ progress over-simplifies
- Pedagogic and analytic heterogeneity prevent more precise estimates

18 Key issues
- The Toolkit does not provide definitive claims of ‘what works’ BUT attempts to give a best estimate of what has worked (i.e. of what is likely to be beneficial based on existing evidence)
- Caution is needed, since an intervention may not be as effective when applied to a new context:
  - RCTs depend on ‘average treatment effects’ on a theoretical population
  - the causal mechanism may not be identified
  - the impact of researcher-led interventions may differ from school-led ones
  - an intervention needs to be a solution to a problem to increase the probability of benefit
- There is a lack of a clear causal link between general additional spending and learning

19 Accessible
Plus:
- Comparative, simplified layout with impact, cost and evidence indicators
- Website with 12k users per month
- Layers and links provide increasing detail and justification
Minus:
- May encourage simplistic interpretation
- Hard to develop deeper engagement
- Tension between accessibility and accuracy

20 [Image-only slide]

21 Overview of value for money
[Scatter chart plotting effect size (potential months' gain, up to 1.0) against cost per pupil (£0 to £1,000), with regions labelled ‘Promising’, ‘Could be worth it’ and ‘Needs careful thought’. Strands plotted include feedback, meta-cognition, early years intervention, peer tutoring, homework (secondary), 1-1 tutoring, summer schools, digital technology, phonics, smaller classes, parental involvement, after school, individualised learning, teaching assistants, performance pay and ability grouping.]
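Reusing the months_progress and cost_band sketches from the earlier slides, the information behind this chart is just an (effect size, cost) pair per strand. The figures below are placeholders, not the Toolkit's published estimates:

```python
# Placeholder (effect size, cost per pupil per year) pairs --
# NOT the Toolkit's published estimates.
strands = {
    "Feedback": (0.70, 90),
    "Peer tutoring": (0.45, 240),
    "Teaching assistants": (0.05, 700),
}

for name, (es, cost) in strands.items():
    print(f"{name}: ~{months_progress(es)} months' gain, "
          f"{cost_band(cost)} cost (about £{cost} per pupil)")
```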

22 Use of strands
- Are practitioners using the Toolkit?
- Which strands are more frequently consulted?
- What drives engagement?
Sources: Google Analytics; online reports from schools

23 Google Analytics
A traffic analysis tool that tracks and reports website traffic

24 Toolkit access over time
[Chart of Toolkit page views over a three-month period]

25 Views per month per strand
[Chart of unique page views per strand against months on the website]

26 Requirement to report Pupil Premium spending
[Example figures from school reports: £133,320, £235,620, £175,770]

27 Example of school reporting

28 Applicable
Plus:
- Patterns are similar for the Early Years and school Toolkits
- Meta-analyses tend to combine different ages and contexts
Minus:
- Averages of averages: a good general bet, but not age specific
- Tends to focus on pedagogical solutions, not subject- or curriculum-specific ones

29 Acceptable
- Has to fit with teachers’ beliefs about what they think will ‘work’
- Has to challenge current practice to bring about successful change (a “zone of proximal professional development”, ZPPD)
Plus:
- Range of options in the Toolkit ‘menu’
Minus:
- Acceptable solutions may not be optimal

30 Appropriate
- To context (learners’ needs) and to organisational and individual capability (school and teachers)
Plus:
- Set in a professional ‘expertise’ model
Minus:
- Identifying ‘fit’ is often problematic (external validity)
- Needs professional diagnosis and judgement

31 Actionable
- Has to be practical and manageable
- Has to retain (or improve) the causal pathway
Plus:
- Aim is to share the what and the why
Minus:
- RCTs and other evidence often only provide warrant for the what
- Meta-analyses confirm general approaches, not specific ones

32 EEF approach & the Toolkit
[Diagram: the EEF approach and the Toolkit at the centre, linking commissioning partnerships (DfE, Northern Rock Foundation, Wellcome Trust) and research commissioning (ESRC) through systematic searching, regular updates, research use trials and synthesis to findings, and onward to practitioner engagement, policy influence, international partnerships, campaigns, the website, the evaluation guide and conferences.]

33 Australian version
- Global content
- Global structure
- Local research & examples
- Local costs

34 Current developments
- Formalising methodology (translating/adapting existing models): Cochrane/Campbell/EPPI; PRISMA for reviews; CONSORT for trials; GRADE guidelines for evidence
- New comparable and updatable meta-analyses for each strand
- Identifying factors affecting current effect size estimates: design (sample size, randomisation, clustering); measurement issues (outcome complexity, outcome alignment); intervention (duration, intensity)
- International partnerships: Australia (3 RCTs commissioned); Chile

35 A Model for Effective Research Communication and Use
Some necessary conditions for effective research communication and use:
- Accurate in terms of research findings and the probability of benefit (internal and external validity)
- Accessible in terms of getting hold of the evidence and understanding it (external and internal)
- Applicable to a specific context (age, phase, subject/content etc.) and level of use (practitioner, manager, policy maker)
- Acceptable: fits with teachers’ understanding and beliefs about what will bring about improvement
- Appropriate to context (a good solution to a real problem)
- Actionable: practical and realistic, with tools/scaffolding for implementation, retaining the causal pathway
Too much alliteration? AND with Authentic Application it Augments current capability or supersedes less effective practice.

36 Toolkit tensions
[Diagram showing the six dimensions in tension: Accurate, Appropriate, Applicable, Actionable, Acceptable, Accessible]

