Information for parents on the effectiveness of English secondary schools for different types of pupil
Lorraine Dearden, John Micklewright and Anna Vignoles

Presentation transcript:

Information for parents on the effectiveness of English secondary schools for different types of pupil Lorraine Dearden, John Micklewright and Anna Vignoles Institute of Education, University of London

Motivation
DfE provides information on schools and pupil achievement in a number of ways, including raw scores.
DCSF also measures school performance with a contextualised value added (CVA) model, which takes account of the different pupil intakes of schools (Ray, 2006) – a better guide to school effectiveness than raw GCSE scores, which capture differences in school intake characteristics.
But there is evidence that parents look more at raw scores than CVA (Hansen and Machin, 2010).
Our first objective is to find a simple measure that is easy for parents to understand.

Motivation (continued)
Using CVA also assumes that a school's average CVA score is meaningful as a summary statistic of the performance of the school.
Yet the literature has shown schools to be differentially effective (Jesson and Gray, 1991; Teddlie and Reynolds, 2000; Thomas et al., 1997; Wilson and Piebalga, 2009).
Our second objective is to provide a simple measure which allows for differential effectiveness.

Research aims
If schools are differentially effective, then parents need to know the value added by a school for children with similar prior attainment to their own child.
We propose a measure that would do this.
It abstracts from issues of sorting into schools and pupil mobility.

Key research questions
To what extent do summary measures of school performance, such as CVA, hide differential performance of schools for different types of children?
Are simple descriptive measures of the differential effectiveness of a school good enough approximations?

Literature
We contribute to the following literatures:
– technical limitations of published school performance measures (Goldstein and Spiegelhalter, 1996; Leckie and Goldstein, 2009)
– measurement of differentially effective schools (Jesson and Gray, 1991; Teddlie and Reynolds, 2000; Thomas et al., 1997; Wilson and Piebalga, 2009)
– incentives for schools when performance measures are used to improve school accountability (Ladd and Walsh, 2000)

Methodology
Divide pupils into prior attainment groups on the basis of KS2 scores (parents are only given group information).
Calculate various measures of individual performance at GCSE for pupils in each of the KS2 prior attainment groups.
For each school, average across the values for its pupils in each prior attainment group, giving 8 summary statistics of pupil performance per school.
If these group averages vary significantly, the school is differentially effective.
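As a rough sketch of the per-school averaging step (a minimal Python/pandas sketch; the file name pupils.csv and the columns school_id, ks2_group and score are illustrative placeholders, not from the paper):

```python
import pandas as pd

# One row per pupil: the school attended, the pupil's KS2 prior attainment
# group (1-8) and a pupil-level GCSE performance measure. The file and
# column names here are illustrative placeholders, not from the paper.
pupils = pd.read_csv("pupils.csv")  # columns: school_id, ks2_group, score

# Average the pupil-level measure within each school and prior attainment
# group: up to eight summary statistics per school.
school_group_means = (
    pupils.groupby(["school_id", "ks2_group"])["score"]
          .mean()
          .unstack("ks2_group")  # one row per school, one column per group
)
print(school_group_means.head())
```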

Data
Integrated National Pupil Database (NPD) / Pupil Level Annual School Census (PLASC).
Two cohorts of pupils in year 11 (age 16) in 2006/7 and 2007/8.
State school pupils for whom we have KS2 test scores.

Prior attainment groups
Key Stage 2 (KS2) English and mathematics attainment (age 10/11, year 6).
The expected level of achievement is level 4.
The 5 x 5 combinations of mathematics and English levels are collapsed into 8 groups.
The eight groups are: below level 3; level 3-3; level 4-3; level 3-4; level 4-4; level 4-5; level 5-4; and level 5-5.
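A minimal sketch of such a grouping rule; the slide only names the eight groups, so the subject ordering (maths first) and the handling of level combinations not listed on the slide are our own assumptions:

```python
# Illustrative mapping from KS2 mathematics and English levels to the eight
# prior attainment groups named on the slide. The ordering (maths, English)
# and the treatment of unlisted combinations are assumptions for illustration.
NAMED_GROUPS = {
    (3, 3): "level 3-3",
    (4, 3): "level 4-3",
    (3, 4): "level 3-4",
    (4, 4): "level 4-4",
    (4, 5): "level 4-5",
    (5, 4): "level 5-4",
    (5, 5): "level 5-5",
}

def ks2_group(maths_level: int, english_level: int):
    """Return the prior attainment group label, or None for combinations
    the slide does not name explicitly."""
    if maths_level < 3 and english_level < 3:  # assumption: both below level 3
        return "below level 3"
    return NAMED_GROUPS.get((maths_level, english_level))

print(ks2_group(4, 5))  # -> "level 4-5"
```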

KS2 prior attainment groups for year 11 children in state secondary schools in 2006/7 and 2007/8
[Table: frequency and cumulative percentage of pupils in each of the eight KS2 groups, plus the total sample of just over 1.1 million pupils.]

Outcomes
Capped GCSE scores, based on a pupil's 8 best GCSE results.
Points achieved in English and mathematics GCSE are then added to the capped score.
This ensures that essential academic skills in mathematics and English are included.
– If already present in the capped score, this implies that maths and English enter our measure twice.
This augmented capped score has recently been adopted in the official CVA model.
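A small illustrative sketch of this outcome construction; the function name and the point values in the example are invented for illustration only:

```python
def augmented_capped_score(gcse_points, english_points, maths_points):
    """Capped score over the best 8 GCSE results, with English and maths
    points added on top (so both can enter the measure twice, as noted on
    the slide)."""
    capped_8 = sum(sorted(gcse_points, reverse=True)[:8])
    return capped_8 + english_points + maths_points

# A pupil with ten GCSEs; English worth 46 points and maths worth 40 points.
print(augmented_capped_score(
    [52, 46, 46, 40, 40, 40, 40, 40, 34, 34],
    english_points=46, maths_points=40))
```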

Adjusted raw score measure
An individual's KS4 score minus the mean of other individuals in the same KS2 prior attainment group.
Similar to the value-added (VA) measure used by DCSF, except that:
– we use the mean group score rather than the median
– we use prior attainment groups rather than a univariate score
– we do not include science
– our KS4 measure is the capped 8 score augmented by English and maths, rather than the straight capped 8 score.
DCSF summarised school performance by taking the average of these individual-level differences across all pupils in the school. We instead calculate 8 separate averages for each school, one for each prior attainment group.
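A sketch of this adjusted raw score and of its standardised counterpart used later (measures 1 and 2 in the summary table below); the plain group mean stands in for the mean of "other" pupils, and the column names are placeholders:

```python
import pandas as pd

# Placeholder columns: school_id, ks2_group, ks4 (the augmented capped score).
pupils = pd.read_csv("pupils.csv")

# Group adjusted raw score: each pupil's KS4 score minus the mean of their
# KS2 prior attainment group (the slide uses the mean of *other* pupils in
# the group; the plain group mean is used here for simplicity), plus a
# standardised version in units of group KS4 standard deviations.
grp = pupils.groupby("ks2_group")["ks4"]
pupils["diff_ks4"] = pupils["ks4"] - grp.transform("mean")   # KS4 points
pupils["z_ks4"] = pupils["diff_ks4"] / grp.transform("std")  # group KS4 SDs

# One average per school per prior attainment group.
school_measure = (pupils.groupby(["school_id", "ks2_group"])["diff_ks4"]
                        .mean().unstack("ks2_group"))
```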

VA and adjusted VA measures
The VA measure then allows fully for prior attainment by estimating the following equation separately by group to predict expected KS4:
KS4_ig = a_g + b_g * KS2_ig + u_ig,   g = 1, ..., 8 groups
The CVA measure then allows for contextual factors by adding controls: gender, month of birth, IDACI, FSM, EAL, SEN, ethnicity.
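A minimal sketch of estimating this VA regression group by group with statsmodels OLS; the CVA version would add the listed controls to the formula. Variable and column names are illustrative assumptions, not the authors' code:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Placeholder columns: ks2_group, ks2 (prior attainment score), ks4 (outcome).
pupils = pd.read_csv("pupils.csv")

# VA: regress KS4 on KS2 separately within each prior attainment group g and
# keep the residual u_ig as the pupil-level value-added measure.
pupils["va_residual"] = float("nan")
for g, df_g in pupils.groupby("ks2_group"):
    fit = smf.ols("ks4 ~ ks2", data=df_g).fit()
    pupils.loc[df_g.index, "va_residual"] = fit.resid

# CVA (sketch): add the contextual controls listed on the slide, e.g.
#   smf.ols("ks4 ~ ks2 + C(gender) + C(fsm) + C(eal) + C(sen) + idaci", data=df_g)
```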

Summary of the six measures (absolute and relative versions):
Group adjusted raw score (crudely allows for prior attainment group)
1. Absolute: diff_KS4 = KS4 - KS4_mean (metric: KS4 points)
2. Relative: Z_KS4 = [KS4 - KS4_mean] / KS4_SD (metric: group KS4 SDs)
VA (value added controlling for prior KS2 score)
3. Absolute: residual of regression of KS4 on KS2 (metric: KS4 points)
4. Relative: residual of regression of Z_KS4 on Z_KS2, where the latter is defined analogously [equivalent to measure 3 divided by KS4_SD] (metric: group KS4 SDs)
Adjusted VA (value added with covariates)
5. Absolute: as for measure 3 but with controls in the regression (metric: KS4 points)
6. Relative: as for measure 4 but with controls in the regression (metric: group KS4 SDs)

[Table: group adjusted raw score, VA and covariate adjusted VA for the whole school and for each of the eight prior attainment groups, with number of observations, % of total, and a p-value testing whether the groups are the same.]

[Table: group adjusted raw score, VA and covariate adjusted VA for the whole school, for each of the eight prior attainment groups and for the average across groups, with number of observations and a p-value testing whether the groups are the same.]

How common is differential effectiveness? This slide shows the % of schools that are differentially effective, as measured by a significant difference (at the 5% level) in the means of the measures across the prior attainment groups.
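One way such a per-school test could be implemented (a sketch assuming a one-way F-test of equal group means via scipy; the exact test used by the authors may differ, and the column names are placeholders):

```python
import pandas as pd
from scipy import stats

# Placeholder columns: school_id, ks2_group and a pupil-level measure
# (e.g. the VA residual or the group adjusted raw score).
pupils = pd.read_csv("pupils.csv")

def groups_differ(school_df, alpha=0.05):
    """One-way F-test that the measure has the same mean in every KS2 group
    within the school; True if equality is rejected at level alpha."""
    samples = [g["measure"].to_numpy()
               for _, g in school_df.groupby("ks2_group") if len(g) > 1]
    if len(samples) < 2:
        return False
    _, p_value = stats.f_oneway(*samples)
    return p_value < alpha

flags = pupils.groupby("school_id").apply(groups_differ)
print(f"{100 * flags.mean():.1f}% of schools differentially effective at the 5% level")
```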

Differential effectiveness and selective schools This slide shows the % of schools that are differentially effective including and excluding selective schools.

Robustness Test This slide shows the % of schools that are differentially effective as measured by a significant difference at both the 5% level and the 1% level in the means of the measures across the prior attainment groups.

Rank correlations within group

Value Added rank correlations excluding selective schools
[Table: pairwise rank correlations of schools' VA rankings across prior attainment groups, and the average across groups.]
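A sketch of how such between-group rank correlations can be computed: rank schools on their group-specific VA averages and take Spearman correlations between the rankings for each pair of groups (column names are placeholders):

```python
import pandas as pd

# Placeholder columns: school_id, ks2_group, va_residual (pupil-level VA).
pupils = pd.read_csv("pupils.csv")

# One VA average per school per prior attainment group, schools in rows.
school_va = (pupils.groupby(["school_id", "ks2_group"])["va_residual"]
                   .mean().unstack("ks2_group"))

# Spearman rank correlations of the school rankings between pairs of groups.
rank_corr = school_va.corr(method="spearman")
print(rank_corr.round(2))
```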

Robustness checks
Sample size issues: results re-estimated using only schools with n > 10 in each prior attainment group.
Robustness to missing data problems: using teacher predictions.
Things still to do:
– multiple comparisons with the best / comparison statistics
– noise in rank correlations

Conclusions
Schools are differentially effective, but estimates are sensitive to how this is measured:
– 30-40% of schools are differentially effective at the 5% level of significance
– 20% of schools are differentially effective at the 1% level of significance
– estimates vary somewhat across measures (raw scores, VA, adjusted VA), though there is high correlation between measures.
Even the most conservative estimate suggests one in six schools are differentially effective.

Conclusions (continued)
For school league tables (and hence parents) this differential effectiveness would seem to matter:
– the rank of schools varies substantially for different prior attainment groups (correlation across groups )
– this of course abstracts from the statistical significance of the differences.
The results suggest that, for a non-trivial proportion of schools, parents need information on the value added by the school for a particular prior attainment group.

Implications
Simple measures also suggest significant amounts of differential effectiveness, but as estimates do vary by measure we need to specify a preferred measure.
Results indicate different rankings of schools for different ability groups, but further work is needed on multiple comparisons and on identifying significant differences in rank correlations.
Implications for policy: a sizeable minority of schools add different value for pupils with different prior attainment, and there are simple measures that can communicate this to parents.

References
Goldstein, H. and Spiegelhalter, D. J. (1996) 'League tables and their limitations: statistical issues in comparisons of institutional performance', Journal of the Royal Statistical Society: Series A, 159.
Goldstein, H., Rasbash, J., Yang, M., Woodhouse, G., Pan, H., Nuttall, D. and Thomas, S. (1993) 'A multilevel analysis of school examination results', Oxford Review of Education, 19.
Gorard, S. (2010) 'All evidence is equal: the flaw in statistical reasoning', Oxford Review of Education, forthcoming.
Jesson, D. and Gray, J. (1991) 'Slants on slopes: using multi-level models to investigate differential school effectiveness and its impact on pupils' examination results', School Effectiveness and School Improvement: An International Journal of Research, Policy and Practice, 2(3).
Ladd, H. F. and Walsh, R. P. (2000) 'Implementing value-added measures of school effectiveness: getting the incentives right', Economics of Education Review, vol. 2, part 1, pp. 1–17.
Leckie, G. and Goldstein, H. (2009) 'The limitations of using school league tables to inform school choice', Journal of the Royal Statistical Society: Series A, 172(4).
Ray, A. (2006) School Value Added Measures in England. Paper for the OECD Project on the Development of Value-Added Models in Education Systems. London: Department for Education and Skills.
Teddlie, C. and Reynolds, D. (2000) The International Handbook of School Effectiveness Research. Falmer Press, London and New York.
Thomas, S., Sammons, P., Mortimore, P. and Smees, R. (1997) 'Differential secondary school effectiveness: examining the size, extent and consistency of school and departmental effects on GCSE outcomes for different groups of students over three years', British Educational Research Journal, 23(4).
Wilson, D. and Piebalga, A. (2008) 'Performance measures, ranking and parental choice: an analysis of the English school league tables', International Public Management Journal, 11.