Data for Monitoring, Target Setting and Reporting

Presentation transcript:

Data for Monitoring, Target Setting and Reporting. CEM Conference, Exeter. Day 2, Session 2, 28th February 2013. Geoff Davies, geoffdaviescem@yahoo.co.uk

Unintelligent target setting
- Health service examples
- Bankers' pay!
- Prison Service
- What you measure is what you get!
- Some Key Stage assessment targets
- Attendance targets
- Concentration on only certain borderlines
- Tick-box mentality
- Payment by results
- English Bac?
- Some targets are 'cancelled' before the event!

'Intelligent' target setting involves:
- Using reliable predictive data (points and/or grades)
- A nationally standardised baseline
- An independent sector standardised baseline (MidYIS only)
- Prior value-added (MidYIS, Yellis and Alis)
- Chances graphs
- Dialogue between the users: teachers, parents and students (empowering, ownership, and taking responsibility)
- The use of professional judgement…

There is wide-ranging practice in using CEM data to set student, department and institution targets, and increasingly sophisticated methods are used by schools and colleges. The simplest model is to use the student grade predictions; these then become the targets against which student progress and achievement can be monitored. In theory, if these targets were exactly met, all residuals would be zero, so overall progress would be average: the school/college would be at the 50th percentile.
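
To make that arithmetic concrete, here is a minimal sketch (with invented names and point scores) of why meeting prediction-based targets exactly produces zero residuals and therefore only average value-added:

```python
# Minimal sketch with invented data: if every student exactly meets a target
# equal to their prediction, every residual (actual - predicted) is zero,
# so overall progress is exactly average (~50th percentile).
predictions = {"Student A": 6.0, "Student B": 5.5, "Student C": 4.5}  # points
actuals     = {"Student A": 6.0, "Student B": 5.5, "Student C": 4.5}  # targets met

residuals = {s: actuals[s] - predictions[s] for s in predictions}
mean_residual = sum(residuals.values()) / len(residuals)

print(residuals)      # all 0.0
print(mean_residual)  # 0.0 -> average progress overall
```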

More challenging targets would be those based on history. For example: where is the school/college now? Where is your subject now? If your subject's value-added history shows that performance is in the upper quartile, it may be sensible to adjust targets upwards; this may have the effect of raising point predictions by between 0.2 and 0.5 of a grade. This would be a useful starting point. It would not, however, be advisable to adjust predictions downwards for below-average subjects, as this might lead to continuing underachievement.

Paris97.xls

YELLIS PREDICTIONS FOR MODELLING: four approaches
1. Yellis GCSE predictions
2. Yellis GCSE predictions + (say) 0.5 of a grade
3. Prior value-added analysis based on three years' VA per department
4. 75th percentile analysis

Setting targets: one suggested approach
1. Discuss previous value-added data with each HoD.
2. Start with an agreed, REALISTIC, representative figure based on previous years (ideally three) of value-added data; add it to each pupil prediction and convert to a grade (i.e. built-in value added).
3. Through discussion with students and using professional judgement, AND THE CHANCES GRAPHS, adjust the target grade.
4. Calculate the department's target grades from the sum of individual pupils' targets (see the sketch below).
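
A minimal sketch of this approach, assuming the standard GCSE points scale (A* = 8 down to G = 1) and invented pupil data; the value-added figure and predictions are illustrative only:

```python
# Illustrative sketch: build in an agreed value-added figure, convert the
# adjusted point predictions to grades, and aggregate for the department.
GRADES = {8: "A*", 7: "A", 6: "B", 5: "C", 4: "D", 3: "E", 2: "F", 1: "G"}

def points_to_grade(points: float) -> str:
    """Map a point score to the nearest grade on the A*-G scale."""
    return GRADES[max(1, min(8, round(points)))]

dept_va = 0.3  # agreed realistic figure from ~3 years of departmental VA data
predictions = {"Pupil 1": 5.4, "Pupil 2": 6.2, "Pupil 3": 4.8}  # invented

targets = {name: pred + dept_va for name, pred in predictions.items()}
for name, pts in targets.items():
    # Individual targets would then be adjusted further by discussion,
    # chances graphs and professional judgement.
    print(name, round(pts, 1), points_to_grade(pts))

# Departmental target: the aggregate of individual pupils' targets.
print("Department mean target:", round(sum(targets.values()) / len(targets), 2))
```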

SHARED DATA, e.g. a Year 10 French class

ALIS. You are the subject teacher and are discussing possible A2 target grades with individual students. You are about to talk to Jonathan, who achieved an average GCSE score of 6.22. Using the A2 regression formula for this subject, this gives a statistical prediction of 28.35 × 6.22 − 99.57 ≈ 77 UCAS points (grade C at A2). Assume that the computer adaptive baseline test confirms this prediction. Chances graphs for this subject are shown, giving the percentage of students with similar profiles achieving the various grades. [Individual chances graph for Jonathan]
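
A short sketch of the worked example, using the regression coefficients quoted on the slide. The grade bands in the comment are the pre-2017 UCAS tariff for A level, an assumption consistent with the slide's "grade C":

```python
# Worked example from the slide: ALIS regression prediction for this subject.
def predicted_ucas_points(avg_gcse: float) -> float:
    # Coefficients (28.35, -99.57) are the subject-specific values quoted.
    return 28.35 * avg_gcse - 99.57

points = predicted_ucas_points(6.22)
print(round(points))  # 77 UCAS points

# Pre-2017 A2 tariff (assumed, for illustration): A* 140, A 120, B 100,
# C 80, D 60, E 40. 77 points sits nearest grade C, matching the slide.
```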

(a) Why are these two chances graphs different?
(b) 'Most candidates with Jonathan's GCSE background score achieved a C in my subject last year, so Jonathan's target grade should be a C.' What are the weaknesses of this statement?
(c) What other factors should be taken into consideration, apart from chances graph data, when determining a target grade?

(a) The difference between the chances graphs is that one of them covers a range of GCSE scores, whilst the other is linked to Jonathan's individual average GCSE score of 6.22.

(b) The strength of a chances graph is that it shows more than a bald prediction. True, most students starting from an average GCSE score like Jonathan's did achieve a C grade at A2 in this subject. However, the probability of a B grade is also high, since his score was not at the bottom of the range; this would be reinforced if the department also has a history of high prior value added. The converse is also true, with the D grade probability warning against complacency. Students are not robots who will always fit the statistics, so it is dangerous to make sweeping statements based on one set of results.

(c) As well as looking at the prediction, you should use the chances graph as a starting point, with your professional judgement taking into account factors such as his and the department's previous performance in the subject, his attitude to work, and what he is likely to achieve based on your own experience. You might want to start with the most popular outcome, grade C, and use your judgement to decide how far up (or down!) to go. He may be a very committed student, and if the department has achieved high value added in the past, an A or B grade may be more appropriate, though A* looks unlikely. If you are using aspirational targets for psychological reasons, then A may be appropriate even though it is less probable than B or C.

Chances graphs: MidYIS and Yellis

Situation: You are a tutor to a Year 10 pupil and you wish to help him/her set target grades. Here is a chances graph based on the pupil's Year 7 MidYIS score (114) and one based on the Year 10 Yellis test (58%).
- MidYIS Chances Graph: based on the pupil's exact MidYIS score, adjusted to include the school's previous value-added performance.
- Yellis Chances Graph: based on one ability band, with no value-added adjustment.

(a) What do the graphs tell you about this pupil's GCSE chances in this subject (Maths)?
(b) What could account for the differences between the two graphs, and are these important?

IT IS IMPORTANT FOR STAFF AND STUDENTS TO UNDERSTAND THE DIFFERENCE.

Fixed mindset ('My intelligence is fixed and tests tell me how clever I am'): "This graph tells me I'm going to get a B, but I thought I was going to get an A. I'm obviously not as clever as I hoped I was, and so the A and A* grades I've got for my work so far can't really be true."

Growth mindset ('My intelligence can develop and tests tell me how far I have got'): "This tells me that most people with the same MidYIS score as me achieved a B last year, but I think I have a good chance of an A, and I know that my work has been about that level so far, so I must be doing well. What do I need to do to be one of the 10% who get an A*?"

How was this information produced? The MidYIS graphs are produced using the predictions spreadsheet: select the pupil(s) and subject(s) to display or print on the GCSE Pupil Summary 1 tab; adjustments for value-added can be made for individual subjects on the GCSE Preds tab. The Yellis graphs for all GCSE subjects (showing all four ability bands) can be downloaded from the Yellis+ website.

From MidYIS: the most likely grade is a B (35%), but remember there is a 65% (100 − 35) chance of getting a different grade, and also a 75% (35 + 30 + 10) chance of one of the top three grades.

From Yellis: the most likely grade appears to be a C, but remember that the band covers a range of scores, not the individual student, and this pupil's score is near the top of that range (58, compared with 60.8). It has also not been adjusted for this school's prior value added.

In an interview with the student, you have to use your professional judgement about that student, taking everything into account. Certainly the Yellis chart warns against complacency, but if the school has a strong value-added history it is better in this case to rely on the MidYIS chart for negotiating a target. Grade A is a fair aspirational target for the student, but a teacher cannot fairly be held accountable for this student not achieving that grade; even a very good teacher may only achieve a B or C with this student. Can the aspirational target set for the student be the same as that used for staff accountability purposes? There is a trap here.
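
The arithmetic above can be checked with a small sketch. The A*/A/B percentages are those quoted for the MidYIS graph; the C/D split is invented so the distribution sums to 100%:

```python
# Chances-graph arithmetic: the most likely grade is far from a certainty.
chances = {"A*": 0.10, "A": 0.30, "B": 0.35, "C": 0.15, "D": 0.10}  # C/D assumed

most_likely = max(chances, key=chances.get)
p_different = 1 - chances[most_likely]
p_top_three = chances["A*"] + chances["A"] + chances["B"]

print(most_likely, chances[most_likely])  # B 0.35 -> most likely grade
print(round(p_different, 2))              # 0.65 -> chance of a different grade
print(round(p_top_three, 2))              # 0.75 -> chance of A*, A or B
```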

Case study no. 1: setting targets
- Uses valid and reliable data, e.g. chances graphs
- Involves sharing data with the students
- Gives ownership of the learning to the student
- Enables a shared responsibility between student, parent(s)/guardian and teacher
- Encourages professional judgement
- Leads to teachers working smarter, not harder
- Leads to students being challenged and not 'over-supported', thus becoming independent learners…


[Chances graph: Student no. 1, GCSE Geography. Prediction/expected grade 5.4 (grade B/C); most likely grade marked.]

[Chances graph: Student no. 1, GCSE Geography. Prediction/expected grade 6.2 (grade B); most likely grade marked.]

COMMENTS?

Monitoring Student Progress

Monitoring students' work against target grades is established practice in schools and colleges, and there are many diverse monitoring systems in place. Simple monitoring systems can be very effective:
- Current student achievement compared to the target grade at predetermined regular intervals, to coincide with, for example, internal assessments/examinations
- Designated staff having an overview of each student's achievements across subjects
- All parents being informed of progress compared to targets
- Review of progress between parents and staff
- Subject progress being monitored by a member of the management team in conjunction with the head of subject/department
- A tracking system to show progress over time for subjects and students
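
A minimal tracking sketch along these lines, with invented students, targets and assessment points:

```python
# Hypothetical sketch: compare each student's latest assessment with their
# target points at regular checkpoints and flag anyone falling below target.
targets = {"Student 1": 6.0, "Student 2": 5.0}  # target points (B = 6, C = 5)
assessments = {                                  # points at each checkpoint
    "Student 1": [5.5, 5.8, 6.1],
    "Student 2": [5.0, 4.6, 4.4],
}

for name, history in assessments.items():
    current = history[-1]
    gap = round(current - targets[name], 1)
    status = "on/above target" if gap >= 0 else "below target - review"
    print(f"{name}: current {current}, target {targets[name]}, gap {gap} ({status})")
```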

Pupil Tracking

Tracking at departmental level for one student

Traditional mark book approach

Targets for learning… reporting to pupils (..\..\MidYis Proformanonames.xls)

SOME TRAPS TO AVOID

TRAP 1
- Y6 class of 10 pupils
- Each predicted a Level 4
- Each with a 90% chance of success
- Best estimate = 90% L4+
- Best estimate is therefore that one pupil will not make it (see the sketch below)
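
The arithmetic behind this trap, as a short sketch:

```python
# Ten pupils, each with a 90% chance of reaching Level 4. The best estimate
# is 90% of 10 = 9 pupils, i.e. one pupil missing; and (assuming the pupils'
# outcomes are independent) the chance that ALL ten reach Level 4 is small.
p, n = 0.9, 10

expected = p * n        # 9.0 -> best estimate: one pupil will not make it
p_all = p ** n          # 0.9 ** 10

print(expected)
print(round(p_all, 2))  # 0.35 -> only about a 1-in-3 chance all ten succeed
```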

TRAP 2: PSYCHOLOGICAL EFFECT ON PUPILS
- The C/D boundary problem at GCSE
- Pupils need high expectations
- Teachers who set high expectations should not be criticised for setting them slightly too high at times
- What are the implications for the performance management of teachers?

MONITORING PITFALLS
- Tracking developed ability measures over time
- Looking at average standardised residuals for teaching sets
- The effect of one result in a small group of students

Sample results from a spreadsheet comparing reading performance in Year 7 (London Reading Test) and Year 9 (SOSCA Reading) for the cohorts of 2007 and 2008; the correlation is 0.75. [Table residue: columns were SOSCA Reading Year 9 score, band, London Reading Year 7 score and difference, sorted into the largest negative differences (e.g. 104 vs 129, a difference of −25) and the largest positive differences (e.g. 127 vs 108, +19).] Note the regression towards the mean pattern; see the next two slides.

REGRESSION TOWARDS THE MEAN. Pupils with high MidYIS scores tend to have high SOSCA scores, but not quite as high. Similarly, pupils with low MidYIS scores tend to have low SOSCA scores, but not quite as low. It is a phenomenon seen in any matched dataset of correlated, normally-distributed scores; the classic example is a comparison of fathers' and sons' heights. Regression lines reflect this phenomenon: if you look at the predictions used in the SOSCA value-added, you can see that for pupils with high MidYIS scores the predicted SOSCA scores are lower than their MidYIS scores, whereas for pupils with low MidYIS scores the predicted SOSCA scores are higher than their MidYIS scores.
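
A small sketch of the effect, assuming standardised scores (mean 100, SD 15) and the 0.75 correlation quoted two slides earlier; the prediction formula is the standard regression-line form, not CEM's exact model:

```python
# Regression towards the mean: predicted Year 9 (SOSCA) scores are pulled
# towards the mean relative to Year 7 (MidYIS) scores when correlation < 1.
MEAN, SD, R = 100, 15, 0.75  # standardised scale; r = 0.75 as quoted

def predict_sosca(midyis: float) -> float:
    # Standard regression line for equally-scaled, correlated scores.
    return MEAN + R * (midyis - MEAN)

for midyis in (130, 115, 100, 85, 70):
    print(midyis, "->", predict_sosca(midyis))
# High scorers (130 -> 122.5) are predicted lower than their MidYIS score;
# low scorers (70 -> 77.5) are predicted higher.
```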

CLASS REVIEW: BEWARE PITFALLS OF INTERPRETATION
- The Critchlow Effect
- Teaching sets

SUBJECT M

MONITORING: MidYIS Year 7 to SOSCA Science score, Year 9

Pupil   Sex   MidYIS Test Score   Predicted SOSCA   Actual SOSCA   Raw Residual   Standardised Residual
A       F     99                  95                97             2              0.2
B             105                 98                97             -1             -0.1
C       M     102                 96                94             -2             -0.2
D             72                  80                76             -4             -0.4
E             152                 126               142            16             1.5

MONITORING: MidYIS Year 7 to SOSCA Reading score, Year 9

Pupil   Sex   MidYIS Test Score   Predicted SOSCA   Actual SOSCA   Raw Residual   Standardised Residual
A       F     99                  96                91             -5             -0.5
B             105                 100               115            15             1.7
C       M     102                 98                87             -11            -1.2
D             72                  80                87             7              0.8
E             152                 128               134            6              0.6

MONITORING: MidYIS Year 7 to SOSCA Maths score, Year 9

Pupil   Sex   MidYIS Test Score   Predicted SOSCA   Actual SOSCA   Raw Residual   Standardised Residual
A       F     99                  93                96             3              0.4
B             105                 97                96             -1             -0.1
C       M     102                 95                86             -10            -1.1
D             72                  73                74             1              0.1
E             152                 134               121            -13            -1.5
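
A sketch of the residual calculations behind these tables, using the science column above. The cohort residual SD of 10 is an assumption (the tables imply a value close to this); in practice it would come from the full cohort:

```python
# raw residual = actual - predicted; the standardised residual divides this
# by the spread (SD) of residuals across the whole cohort.
predicted = [95, 98, 96, 80, 126]  # science table, pupils A-E
actual    = [97, 97, 94, 76, 142]
COHORT_SD = 10.0                   # assumed; taken from the full cohort in practice

raw = [a - p for a, p in zip(actual, predicted)]
standardised = [round(r / COHORT_SD, 1) for r in raw]

print(raw)           # [2, -1, -2, -4, 16]
print(standardised)  # [0.2, -0.1, -0.2, -0.4, 1.6] ~ the table's final column
```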
