1
Data for Monitoring, Target Setting and Reporting
CEM CONFERENCE, EXETER. Data for Monitoring, Target Setting and Reporting. Day 2 Session, February 2013. Geoff Davies
2
Unintelligent target setting
- Health service examples
- Bankers' pay!
- Prison Service
What you measure is what you get!
- Some Key Stage assessment targets
- Attendance targets
- Concentration on only certain borderlines
- Tick-box mentality
- Payment by results
- English Bacc?
Some targets are 'cancelled' before the event!
3
‘Intelligent’ Target Setting involves:
- Using reliable predictive data: points and/or grades
- Nationally standardised baseline; independent-sector standardised baseline (MidYIS only)
- Prior value-added (MidYIS, Yellis and Alis)
- Chances graphs
- Dialogue between the users: teachers, parents and students? (empowering, ownership, and taking responsibility)
- The use of professional judgement...
4
There is wide-ranging practice in using CEM data to set student, department and institution targets, and increasingly sophisticated methods are used by schools and colleges. The simplest model is to use the student grade predictions. These then become the targets against which student progress and achievement can be monitored. Theoretically, if these targets were all met exactly, residuals would be zero and overall progress would be average: the school/college would be at the 50th percentile. A minimal sketch of this model follows.
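For concreteness, here is a minimal sketch of this simplest model, with invented pupil names and an illustrative points scale (not CEM's own software or data): predictions become the targets, and residuals measure progress against them.

```python
# A minimal sketch of the simplest target-setting model: each pupil's
# predicted points score becomes the target, and progress is monitored
# via the residual (achieved minus predicted). All values are invented.

predictions = {"Alice": 6.0, "Ben": 5.0, "Cara": 4.5}   # predicted points
results     = {"Alice": 6.5, "Ben": 5.0, "Cara": 4.0}   # achieved points

residuals = {name: results[name] - predictions[name] for name in predictions}
mean_residual = sum(residuals.values()) / len(residuals)

print(residuals)        # {'Alice': 0.5, 'Ben': 0.0, 'Cara': -0.5}
print(mean_residual)    # 0.0 -> if every target is exactly met, overall
                        # progress is average (the 50th-percentile point)
```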
5
More challenging targets are those based on history
More challenging targets are those based on history. For example: where is the school/college now? Where is your subject now? If your subject's value-added history shows that performance is in the upper quartile, it may be sensible to adjust targets accordingly. This may have the effect of raising point predictions by a fraction of a grade. This would be a useful starting point, but it would not be advisable to adjust predictions downwards for below-average subjects, which might lead to continuing under-achievement.
6
Paris97.xls
10
YELLIS PREDICTIONS FOR MODELLING
FOUR approaches:
- Yellis GCSE predictions
- Yellis GCSE predictions + say 0.5 of a grade
- Prior value-added analysis based on 3 years of VA per department
- 75th-percentile analysis
11
Setting targets: one suggested approach
- Discuss previous value-added data with each HoD
- Start with an agreed, REALISTIC, representative figure based on previous years (3 ideally) of value-added data
- Add this figure to each pupil prediction and convert to a grade (i.e. built-in value-added)
- By discussion with students, and using professional judgement AND THE CHANCES GRAPHS, adjust the target grade
- Calculate the department's target grades from the addition of individual pupils' targets (a minimal sketch follows this list)
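A minimal sketch of this approach, with invented numbers: add an agreed value-added figure to each pupil's prediction, convert to a grade, then aggregate into a department target. The points-to-grade boundaries are an illustrative assumption, not CEM's published scale.

```python
# Sketch of in-built value-added targets: prediction + agreed VA figure,
# converted to a grade, then aggregated for the department. Illustrative
# numbers only.

VA_ADJUSTMENT = 0.4   # agreed figure from ~3 years of departmental VA history

def points_to_grade(points: float) -> str:
    """Map GCSE-style points to a letter grade (hypothetical boundaries)."""
    bands = [(7.5, "A*"), (6.5, "A"), (5.5, "B"), (4.5, "C"), (3.5, "D")]
    for cutoff, grade in bands:
        if points >= cutoff:
            return grade
    return "E or below"

predictions = {"Pupil 1": 5.4, "Pupil 2": 6.2, "Pupil 3": 4.9}

targets = {name: pts + VA_ADJUSTMENT for name, pts in predictions.items()}
for name, pts in targets.items():
    print(name, round(pts, 1), points_to_grade(pts))

# Department target: the aggregate of the individual targets (here, the mean).
print("Department mean target:", round(sum(targets.values()) / len(targets), 2))
```

Individual target grades would then still be adjusted by discussion with students, professional judgement and the chances graphs, as the list above says.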
12
SHARED DATA, e.g. Year 10 French class
13
ALIS. You are the subject teacher and are discussing possible A2 target grades with individual students. You are about to talk to Jonathan, who achieved a given average GCSE score. The regression formula at A2 for this subject (a linear function of average GCSE score with slope 28.35) gives a statistical prediction of 77 UCAS points (grade C at A2). Assume that the computer-adaptive baseline test confirms this prediction. Chances graphs for this subject are shown, giving the percentage of students with similar profiles achieving the various grades, together with an individual chances graph for Jonathan.
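To make the arithmetic concrete, here is a minimal sketch of this kind of linear prediction. The slope 28.35 and the 77-point/grade-C outcome come from the slide; the intercept, the sample GCSE average and the grade bands (based on the pre-2017 UCAS tariff) are illustrative assumptions, not the actual Alis coefficients.

```python
# Sketch of a linear regression prediction from average GCSE score to
# A2 UCAS points. Slope 28.35 is from the slide; the intercept and the
# sample score are hypothetical, chosen so the output lands near 77.

SLOPE = 28.35
INTERCEPT = -95.0   # hypothetical, NOT a published Alis value

def predicted_ucas_points(avg_gcse_score: float) -> float:
    return SLOPE * avg_gcse_score + INTERCEPT

def nearest_a2_grade(points: float) -> str:
    # Pre-2017 UCAS tariff values for A-level grades (assumed here).
    tariff = {"A": 120, "B": 100, "C": 80, "D": 60, "E": 40}
    return min(tariff, key=lambda g: abs(tariff[g] - points))

pts = predicted_ucas_points(6.07)        # hypothetical average GCSE score
print(round(pts), nearest_a2_grade(pts)) # 77 C -> grade C at A2
```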
14
(a) Why are these two chances graphs different?
(b) 'Most candidates with Jonathan's GCSE background score achieved a C in my subject last year, so Jonathan's target grade should be a C.' What are the weaknesses of this statement? (c) What other factors, apart from chances-graph data, should be taken into consideration when determining a target grade?
15
The difference between the chances graphs is that one of them covers a range of GCSE scores, whilst the other is linked to Jonathan's individual average GCSE score. The strength of the chances graph is that it shows more than a bald prediction. True, most students starting from an average GCSE score like Jonathan's did achieve a C grade at A2 in examinations for this subject. However, the probability of a B grade is also high, since his score was not at the bottom of this range; this might be reflected too if the department has a history of high prior value-added. The converse is also true, with a D-grade probability warning against complacency. Students are not robots who will always fit the statistics, so it is dangerous to make sweeping statements based on one set of results.

As well as looking at the prediction, you should use the chances graph as a starting point, with your professional judgement taking into account factors such as his and the department's previous performance in the subject, his attitude to work, and what he is likely to achieve based on your own experience. You might want to start with the most popular outcome, grade C, and use your judgement to decide how far up (or down!) to go. He may be a very committed student, and if the department has achieved high value-added in the past, an A/B grade may be more appropriate, though A* looks unlikely. If you are using aspirational targets for psychological reasons with students, then A may be appropriate even though it is less probable than B/C.
16
Chances graphs MidYIS and YELLIS
Situation: you are a tutor to a Year 10 pupil and you wish to help him/her to set target grades. Here is a chances graph based on the pupil's Year 7 MidYIS score (114) and one based on the Year 10 Yellis test (58%). The MidYIS chances graph is based on the pupil's exact MidYIS score, adjusted to include the school's previous value-added performance. The Yellis chances graph is based on one ability band and has no value-added adjustment.
17
(a) What do the graphs tell you about this pupil's GCSE chances in this subject (Maths)?
(b) What could account for the differences between the two graphs, and are these important? IT IS IMPORTANT FOR STAFF AND STUDENTS TO UNDERSTAND THE DIFFERENCE.
Fixed mindset ['My intelligence is fixed and tests tell me how clever I am.']: 'This graph tells me I'm going to get a B, but I thought I was going to get an A. I'm obviously not as clever as I hoped I was, so the A and A* grades I've got for my work so far can't really be true.'
Growth mindset ['My intelligence can develop and tests tell me how far I have got.']: 'This tells me that most people with the same MidYIS score as me achieved a B last year, but I think I have a good chance of an A, and I know that my work has been about that level so far, so I must be doing well. What do I need to do to be one of the 10% who get an A*?'
How was this information produced? The MidYIS graphs are produced using the predictions spreadsheet: select the pupil(s) and subject(s) to display or print using the GCSE Pupil Summary 1 tab; adjustments for value-added can be made for individual subjects on the GCSE Preds tab. The Yellis graphs for all GCSE subjects (showing all four ability bands) can be downloaded from the Yellis+ website.
18
From MidYIS: the most likely grade is a B (35%), but remember there is a 65% (100 - 35) chance of getting a different grade, and also roughly a 75% chance of one of the top three grades (summing the A*, A and B probabilities). From Yellis: the most likely grade appears to be a C, but remember that the band covers a range of scores, not the individual student, and this pupil's score (58%) is near the top of that range. It has also not been adjusted for this school's prior value-added.

In an interview with the student you have to use your professional judgement about that student, taking everything into account. Certainly the Yellis chart warns against complacency, but if the school has a strong value-added history it is better in this case to rely on the MidYIS chart for negotiating a target. Grade A is a fair aspirational target for the student, but a teacher cannot fairly be held accountable for not achieving this grade with this student: even a very good teacher may only achieve a B or C. Can the aspirational target set for the student be the same as that used for staff accountability purposes? There is a trap here.
19
Case study no.1: setting targets.
- Uses valid and reliable data, e.g. chances graphs
- Involves sharing data with the students
- Gives ownership of the learning to the student
- Enables a shared responsibility between student, parent(s)/guardian and teacher
- Encourages professional judgement
- Leads to teachers working smarter, not harder
- Leads to students being challenged and not 'over-supported', thus becoming independent learners...
20
21
22
Student no.1: GCSE Geography
Student no.1, GCSE Geography. Prediction/expected grade: 5.4 (grade B/C). The chances graph marks the most likely grade.
23
Student no.1: GCSE Geography
Student no.1, GCSE Geography. Prediction/expected grade: 6.2 (grade B). The chances graph marks the most likely grade.
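These two slides distinguish the prediction/expected grade (a points score such as 5.4 or 6.2) from the most likely grade highlighted on the chances graph. A minimal sketch with an invented chances distribution and points scale (not this student's actual data) shows why the two can differ:

```python
# Why the expected grade (points-weighted mean) can differ from the most
# likely grade (the mode of the chances graph). Probabilities invented.

chances = {"A*": 0.05, "A": 0.10, "B": 0.30, "C": 0.35, "D": 0.15, "E": 0.05}
points  = {"A*": 8, "A": 7, "B": 6, "C": 5, "D": 4, "E": 3}  # illustrative scale

expected_points = sum(p * points[g] for g, p in chances.items())
most_likely = max(chances, key=chances.get)

print(round(expected_points, 1))  # 5.4 -> between grades B and C
print(most_likely)                # 'C' -> the single most likely grade
```

The expected grade averages over the whole distribution, so it can sit between two grades (B/C) even when a single grade (here C) is the most likely outcome.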
24
COMMENTS?
25
Monitoring Student Progress
Monitoring students' work against target grades is established practice in schools and colleges, and there are many diverse monitoring systems in place. Simple monitoring systems can be very effective:
- Current student achievement compared to the target grade at predetermined regular intervals, timed to coincide with, for example, internal assessments/examinations
- Designated staff having an overview of each student's achievements across subjects
- All parents being informed of progress compared to targets
- Review of progress between parents and staff
- Subject progress being monitored by a member of the management team in conjunction with the head of subject/department
- A tracking system to show progress over time for subjects and students (a minimal sketch follows this list)
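As a minimal sketch of such a tracking system (an invented data model, not a real MIS or CEM product): record each assessment against the target, and flag pupils below target at any checkpoint so a progress review can be triggered.

```python
# Sketch of a simple pupil-tracking record: assessments over time,
# compared with the target at each checkpoint. Invented data model.

from dataclasses import dataclass, field

@dataclass
class PupilTrack:
    name: str
    target: float                                     # e.g. 6.0 = grade B
    assessments: dict = field(default_factory=dict)   # checkpoint -> points

    def add(self, checkpoint: str, points: float) -> None:
        self.assessments[checkpoint] = points

    def below_target(self) -> list:
        return [c for c, p in self.assessments.items() if p < self.target]

pupil = PupilTrack("Student no.1", target=6.0)
pupil.add("Autumn exam", 5.5)
pupil.add("Spring exam", 6.0)
print(pupil.below_target())   # ['Autumn exam'] -> trigger a progress review
```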
26
Pupil Tracking
27
Tracking at departmental level for one student
28
Traditional mark book approach
29
Targets for learning…. reporting to pupils
..\..\MidYis Proformanonames.xls
30
SOME TRAPS TO AVOID
31
TRAP 1: a Y6 class of 10 pupils, each predicted a Level 4
Each pupil has a 90% chance of success, so the best estimate is that one will not make it: best estimate = 90% achieving L4+. A minimal sketch of the arithmetic follows.
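A short sketch of the arithmetic behind this trap: ten pupils each with a 90% chance does not mean all ten should be expected to succeed.

```python
# Trap 1 arithmetic: 10 pupils, each with probability 0.9 of reaching L4.

n, p = 10, 0.9

expected_passes = n * p
print(expected_passes)          # 9.0 -> best estimate: one pupil misses L4

prob_all_pass = p ** n
print(round(prob_all_pass, 2))  # 0.35 -> all ten passing is the exception,
                                # not the expectation
```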
32
PSYCHOLOGICAL EFFECT ON PUPILS
TRAP 2: PSYCHOLOGICAL EFFECT ON PUPILS
The C/D boundary problem at GCSE. PUPILS NEED HIGH EXPECTATIONS. Teachers who set high expectations should not be criticised for setting them slightly too high at times. What are the implications for the performance management of teachers?
33
MONITORING PITFALLS
- Tracking developed ability measures over time
- Looking at average standardised residuals for teaching sets
- The effect of one result in a small group of students
34
Note the regression towards the mean pattern. See next two slides.
[Table: sample results from a spreadsheet comparing reading performance on two different tests, London Reading in Year 7 and SOSCA Reading in Year 9, for two cohorts (2007 and one later year); correlation 0.75. One panel lists the pupils with the largest negative Year 9 minus Year 7 differences (down to -25), the other the largest positive differences (up to +35), with SOSCA reading bands A to D shown alongside.]
36
REGRESSION TOWARDS THE MEAN
Pupils with high MidYIS scores tend to have high SOSCA scores, but not quite as high. Similarly, pupils with low MidYIS scores tend to have low SOSCA scores, but not quite as low. It is a phenomenon seen in any matched dataset of correlated, normally distributed scores; the classic example is a comparison of fathers' and sons' heights. Regression lines reflect this phenomenon: if you look at the predictions used in the SOSCA value-added, you can see that pupils with high MidYIS scores have predicted SOSCA scores lower than their MidYIS scores, whereas pupils with low MidYIS scores have predicted SOSCA scores higher than their MidYIS scores.
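A minimal simulation sketch of the same phenomenon, using two standardised scores (mean 100, SD 15) correlated at r = 0.75, as in the reading comparison above; all numbers are illustrative, not CEM's actual model.

```python
# Regression towards the mean for two correlated standardised scores.

import random

MEAN, SD, R = 100.0, 15.0, 0.75

def simulate_pair() -> tuple:
    """One pupil: a Year 7 score and a correlated Year 9 score."""
    z1 = random.gauss(0, 1)
    z2 = R * z1 + (1 - R**2) ** 0.5 * random.gauss(0, 1)
    return MEAN + SD * z1, MEAN + SD * z2

def predicted_year9(year7: float) -> float:
    """Regression prediction: pulled towards the mean by the factor r."""
    return MEAN + R * (year7 - MEAN)

print(predicted_year9(130))  # 122.5 -> high scorers predicted lower
print(predicted_year9(70))   # 77.5  -> low scorers predicted higher

# Empirical check: pupils scoring above 125 in Year 7 average lower in Year 9.
high = [y9 for y7, y9 in (simulate_pair() for _ in range(100_000)) if y7 > 125]
print(round(sum(high) / len(high), 1))  # ~123, below this group's Year 7
                                        # average of ~131
```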
37
CLASS REVIEW: BEWARE PITFALLS OF INTERPRETATION (the Critchlow Effect)
Teaching Sets
38
SUBJECT M
39
MONITORING MIDYIS YEAR 7 TO SOSCA SCIENCE SCORE YEAR 9
Pupil | Sex | MidYIS Test Score | Predicted SOSCA | Actual SOSCA | Raw Residual | Standardised Residual
A | F | 99 | 95 | 97 | 2 | 0.2
B |   | 105 | 99 | 98 | -1 | -0.1
C | M | 102 | 98 | 96 | -2 | -0.2
D |   | 72 | 80 | 76 | -4 | -0.4
E |   | 152 | 126 | 142 | 16 | 1.5
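A minimal sketch of how the residual columns in these monitoring tables are computed: the raw residual is actual minus predicted, and the standardised residual divides by the spread of residuals across the cohort. The divisor of 10 used here is an illustrative assumption inferred from the table (raw 2 maps to standardised 0.2), not a published CEM parameter.

```python
# Raw and standardised residuals for MidYIS-to-SOSCA monitoring.

RESIDUAL_SD = 10.0   # assumed cohort SD of raw residuals, for illustration

def residuals(predicted: float, actual: float) -> tuple:
    raw = actual - predicted
    return raw, raw / RESIDUAL_SD

print(residuals(95, 97))    # (2, 0.2)  -> pupil A, science
print(residuals(126, 142))  # (16, 1.6) -> pupil E; the table shows 1.5, so
                            # the real cohort SD is slightly larger than 10
```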
40
MONITORING MIDYIS YEAR 7 TO SOSCA READING SCORE YEAR 9
Pupil | Sex | MidYIS Test Score | Predicted SOSCA | Actual SOSCA | Raw Residual | Standardised Residual
A | F | 99 | 96 | 91 | -5 | -0.5
B |   | 105 | 100 | 115 | 15 | 1.7
C | M | 102 | 98 | 87 | -11 | -1.2
D |   | 72 | 80 | 87 | 7 | 0.8
E |   | 152 | 128 | 134 | 6 | 0.6
41
MONITORING MIDYIS YEAR 7 TO SOSCA MATHS SCORE YEAR 9
Pupil | Sex | MidYIS Test Score | Predicted SOSCA | Actual SOSCA | Raw Residual | Standardised Residual
A | F | 99 | 93 | 96 | 3 | 0.4
B |   | 105 | 98 | 97 | -1 | -0.1
C | M | 102 | 95 | 86 | -10 | -1.1
D |   | 72 | 73 | 74 | 1 | 0.1
E |   | 152 | 134 | 121 | -13 | -1.5