Evidence-based use of the pupil premium Robert Coe Durham Leadership Conference, 26 June

1 Evidence-based use of the pupil premium
Robert Coe, Durham Leadership Conference, 26 June 2014
@ProfCoe www.twitter.com/ProfCoe

2 Outline
• What can research tell us about the likely impacts and costs of different strategies?
• How do we implement these strategies to…
– Focus on what matters
– Change classroom practice
– Target areas of need
– Produce demonstrable benefits
Improving Education: A triumph of hope over experience http://www.cem.org/attachments/publications/ImprovingEducation2013.pdf

3 Evidence about the effectiveness of different strategies

4 Toolkit of Strategies to Improve Learning
The Sutton Trust-EEF Teaching and Learning Toolkit http://www.educationendowmentfoundation.org.uk/toolkit/

5 Impact vs cost
[Scatter chart plotting Toolkit strategies by effect size (months gain, 0–8) against cost per pupil (£0–£1000), with regions labelled ‘Most promising for raising attainment’, ‘May be worth it’ and ‘Small effects / high cost’. Strategies plotted: Feedback, Meta-cognitive, Peer tutoring, Early Years, 1-1 tuition, Phonics, Homework (Secondary), Homework (Primary), Collaborative, Small gp tuition, Parental involvement, Individualised learning, ICT, Behaviour, Social, Teaching assistants, Mentoring, Summer schools, After school, Aspirations, Performance pay, Smaller classes, Setting.]
www.educationendowmentfoundation.org.uk/toolkit

6 Key messages
• Some things that are popular or widely thought to be effective are probably not worth doing
– Ability grouping (setting); After-school clubs; Teaching assistants; Smaller classes; Performance pay; Raising aspirations
• Some things look ‘promising’
– Effective feedback; Meta-cognitive and self-regulation strategies; Peer tutoring/peer-assisted learning strategies; Homework

7 Clear, simple advice:
• Choose from the top left
• Go back to school and do it
‘For every complex problem there is an answer that is clear, simple, and wrong.’ (H.L. Mencken)

8 Why not?
• We have been doing some of these things for a long time, but have generally not seen improvement
• Research evidence is problematic
– Sometimes the existing evidence is thin
– Research studies may not reflect real life
– Context and ‘support factors’ may matter
• Implementation is problematic
– We may think we are doing it, but are we doing it right?
– We do not know how to get large groups of teachers and schools to implement these interventions in ways that are faithful, effective and sustainable

9 So what should we do?

10 Four steps to improvement
• Think hard about learning
• Invest in good professional development
• Evaluate teaching quality
• Evaluate impact of changes

11 1. Think hard about learning

12 Impact vs cost
[The Toolkit impact-vs-cost scatter chart from slide 5 again: effect size (months gain) against cost per pupil, shown for the exercise that follows.]
www.educationendowmentfoundation.org.uk/toolkit

13 1. Which strategies/interventions are very surprising (you really don’t believe it)?
2. For which strategies/interventions can you explain why they do (or don’t) improve attainment?
3. Which strategies/interventions do you want to know more about?

14 Poor Proxies for Learning
• Students are busy: lots of work is done (especially written work)
• Students are engaged, interested, motivated
• Students are getting attention: feedback, explanations
• Classroom is ordered, calm, under control
• Curriculum has been ‘covered’ (i.e. presented to students in some form)
• (At least some) students have supplied correct answers, even if they
– Have not really understood them
– Could not reproduce them independently
– Will have forgotten them by next week (tomorrow?)
– Already knew how to do this anyway

15 A better proxy for learning?
Learning happens when people have to think hard

16 Hard questions about your school
• How many minutes does an average pupil on an average day spend really thinking hard?
• Do you really want pupils to be ‘stuck’ in your lessons?
• If they knew the right answer but didn’t know why, how many pupils would care?

17 True or false?
1. Reducing class size is one of the most effective ways to increase learning [evidence]
2. Differentiation and ‘personalised learning’ resources maximise learning [evidence]
3. Praise encourages learners and helps them persist with hard tasks [evidence]
4. Technology supports learning by engaging and motivating learners [evidence]
5. The best way to raise attainment is to enhance motivation and interest [evidence]

18 2. Invest in effective CPD

19 How do we get students to learn hard things? Eg
• Place value
• Persuasive writing
• Music composition
• Balancing chemical equations
Explain what they should do → Demonstrate it → Get them to do it (with gradually reducing support) → Provide feedback → Get them to practise until it is secure → Assess their skill/understanding

20 How do we get teachers to learn hard things? Eg
• Using formative assessment
• Assertive discipline
• How to teach algebra
Explain what they should do

21 What CPD helps learners? Do you do this?
• Intense: at least 15 contact hours, preferably 50
• Sustained: over at least two terms
• Content focused: on teachers’ knowledge of subject content & how students learn it
• Active: opportunities to try it out & discuss
• Supported: external feedback and networks to improve and sustain
• Evidence based: promotes strategies supported by robust evaluation evidence

22 3. Evaluate teaching quality

23 Improving Teaching
• Teacher quality is what matters
• We need to focus on teacher learning
• Teachers learn just like other people
– Be clear what you want them to learn
– Get good information about where they are at
– Give good feedback

24 Why monitor teaching quality?
• Good evidence of (potential) benefit from
– Performance feedback (Coe, 2002)
– Target setting (Locke & Latham, 2006)
– Accountability (Coe & Sahlgren, 2014)
• Individual teachers matter most
• Teachers typically stop improving after 3-5 years
• Everyone can improve
• Judging real quality/effectiveness is very hard
– Multidimensional
– Not easily visible
– Confounded

25 Monitoring the quality of teaching
• Progress in assessments
– Quality of assessment matters (cem.org/blog)
– Regular, high quality assessment across curriculum (InCAS, INSIGHT)
• Classroom observation
– Much harder than you think! (cem.org/blog)
– Multiple observations and observers, trained and quality-assured
• Student ratings
– Extremely valuable, if done properly (http://www.cem.org/latest/student-evaluation-of-teaching-can-it-raise-attainment-in-secondary-schools)
• Other
– Parent ratings/feedback
– Student work scrutiny
– Colleague perceptions (360)
– Self assessment
– Pedagogical content knowledge

26 Teacher Assessment
• How do you know that it has captured understanding of key concepts?
– vs a ‘check-list’ (eg ‘;’ = L5, 3 tenses = L7)
• How do you know standards are comparable?
– Across teachers, schools, subjects
– Is progress good?
• How have you resolved tensions from teacher judgments being used to judge teachers?
– Summative assessment includes teacher feedback

27 Evidence-Based Lesson Observation
• Behaviour and organisation
– Maximise time on task, engagement, rules & consequences
• Classroom climate
– Respect, quality of interactions, failure OK, high expectations, growth mindset
• Learning
– What made students think hard?
– Quality of: exposition, demonstration, scaffolding, feedback, practice, assessment
– What provided evidence of students’ understanding?
– How was this responded to? (Feedback)

28 4. Evaluate impact of changes

29 School ‘improvement’ often isn’t
• School would have improved anyway
– Volunteers/enthusiasts improve: misattributed to intervention
– Chance variation (esp. if starting from a low base)
• Poor outcome measures
– Perceptions of those who worked hard at it
– No robust assessment of pupil learning
• Poor evaluation designs
– Weak evaluations more likely to show positive results
– Improved intake mistaken for impact of intervention
• Selective reporting
– Dredging for anything positive (within a study)
– Only success is publicised
(Coe, 2009, 2013)

30 Key elements of good evaluation
• A clear, well defined, replicable intervention
• Good assessment of appropriate outcomes
• A well-matched comparison group
See the EEF DIY Evaluation Guide. What could you evaluate?

31 RISE: Research-leads Improving Students’ Education
• With Alex Quigley, John Tomsett, Stuart Kime
• Based around York
• RCT: 20 school leaders trained in research, 20 controls
• Contact: aj.quigley@huntington-ed.org.uk

32 Summary…
1. Think hard about learning
2. Invest in good CPD
3. Evaluate teaching quality
4. Evaluate impact of changes
Robert.Coe@cem.dur.ac.uk www.cem.org @ProfCoe

