Introduction to CEM Secondary Pre-16 Information Systems
Nicola Forster & Neil Defty Secondary Systems Programme Managers London, June 2011
Ensuring Fairness
Principles of Fair Analysis:
1. Use an Appropriate Baseline
2. Compare 'Like' with 'Like'
3. Reflect Statistical Uncertainty

Issues with prior-attainment baselines:
- Not all students have KS2 baselines
- Prior value-added: can you add value at every Key Stage?
- Under-achievement leading to under-expectation
- One teacher's output = another teacher's input
- Teaching to the test: does performance represent ability, effort or exam technique?
- Transfer of data
CEM Systems Principle of Fair Analysis No. 1: Use an Appropriate Baseline
The Projects
- Alis (Years 12-13, leading to A/AS/BTEC/IB etc.): baseline is GCSE or the Computer Adaptive Baseline Test (paper test also available)
- Yellis (Years 10-11, leading to GCSE/KS4): Computer Adaptive Baseline Test or paper-based test
- INSIGHT (Year 9, KS3): combines curriculum tests with developed ability
- MidYIS (Years 7-8, + additional): Computer Adaptive Baseline Test or paper-based test
The Assessments

| Assessment | Year Groups | When | Delivery | Includes |
|---|---|---|---|---|
| MidYIS | 7, 8, 9 | Term 1 + catch-ups | Paper or Computer Adaptive, 1 session | Developed Ability: Vocabulary, Maths, Non-Verbal, Skills |
| Yellis | 10, 11 | Term 1 + catch-ups | Paper or Computer Adaptive, 1 session | Developed Ability: Vocabulary, Maths, Non-Verbal, Skills |
| INSIGHT | 9 | 4-week testing window mid April – mid May + catch-ups | Computer Adaptive, 3 or 4 sessions | Curriculum-based: Maths, Reading, Science + Attitudes & Developed Ability |
The Assessments: INSIGHT – Curriculum-based Assessment
Maths: Number & Algebra; Handling Data; Space, Shapes & Measures
Science: Biology; Chemistry; Physics (+ Attitudes to Science)
Reading: Speed Reading; Text Comprehension; Passage Comprehension
Additionally for INSIGHT: Developed Ability (Vocabulary, Non-verbal, Skills) and attitudinal measures
The Assessments: What is Computer Adaptive Assessment?
- Questions adapt to the pupil
- Efficient: no time wasting
- Suits a wider ability range
- More enjoyable
Computer-adaptive vs Paper-based testing
The pros and cons of computer-adaptive and paper-based testing:

| | Computer Adaptive | Paper-based |
|---|---|---|
| Number of students per test session | Limited by the number of computers available – multiple testing sessions | Can test all students in a single session (in a hall or in form groups), or in more than one session |
| Cost | Roughly 30% cheaper than the paper-based test | Standard cost |
| Processing of baseline feedback | Baseline feedback available within a couple of hours of testing | Takes around 2-4 weeks for papers to be marked |
| Preparation | Must be able to install the software or access the internet version of the test | No pre-test set-up |
| Student experience | "Tailored" assessment | All students see all questions, irrespective of suitability |

In the adaptive test, pupils are directed to harder or easier items depending upon their response to the previous question.
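The adaptive mechanism described above can be sketched in a few lines. This is a simplified illustration only – the step rule, difficulty range and session length are invented for the sketch, not CEM's actual algorithm:

```python
# Simplified sketch of an adaptive test: the next item gets harder after a
# correct answer and easier after an incorrect one. All parameters here
# (step size, difficulty scale 1-10, 10-item session) are illustrative.

def run_adaptive_test(answers, start_difficulty=5, min_d=1, max_d=10):
    """answers: function mapping item difficulty -> True/False (pupil response)."""
    difficulty = start_difficulty
    trail = []                          # (difficulty presented, correct?)
    for _ in range(10):                 # fixed-length session for the sketch
        correct = answers(difficulty)
        trail.append((difficulty, correct))
        if correct:
            difficulty = min(max_d, difficulty + 1)   # harder item next
        else:
            difficulty = max(min_d, difficulty - 1)   # easier item next
    return trail

# A pupil who can answer items up to difficulty 7: the test quickly
# homes in on the 7-8 boundary instead of wasting items.
trail = run_adaptive_test(lambda d: d <= 7)
final_difficulty = trail[-1][0]
```

This illustrates the "no time wasting" point: most of the session is spent near the pupil's actual level rather than on items that are far too easy or too hard.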
The Analysis
Linear Least Squares Regression
Scatter plot for Subject X: outcome against baseline, with the regression line (trend line, line of best fit). Residuals above the line represent positive VA; residuals below it, negative VA.
Outcome = gradient × baseline + intercept
Correlation coefficient ~ 0.7
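The regression on this slide is straightforward to reproduce. A minimal sketch with invented baseline/outcome pairs; the gradient, intercept, residuals and correlation coefficient computed here correspond to the quantities named above:

```python
# Fit outcome = gradient * baseline + intercept by least squares, then
# compute each pupil's residual (actual - predicted). Data are made up.
import statistics

baseline = [85, 92, 100, 104, 110, 118, 125]    # e.g. standardised scores
outcome  = [3.5, 4.0, 4.8, 5.1, 5.4, 6.2, 6.6]  # e.g. GCSE points

mean_x, mean_y = statistics.mean(baseline), statistics.mean(outcome)
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(baseline, outcome))
sxx = sum((x - mean_x) ** 2 for x in baseline)
gradient = sxy / sxx
intercept = mean_y - gradient * mean_x

predicted = [gradient * x + intercept for x in baseline]
residuals = [y - p for y, p in zip(outcome, predicted)]  # +ve VA above the line

# Pearson correlation coefficient (the slide quotes ~0.7 for real cohorts;
# this tiny invented dataset is nearly perfectly linear, so r is higher)
syy = sum((y - mean_y) ** 2 for y in outcome)
r = sxy / (sxx * syy) ** 0.5
```

By construction the residuals sum to zero: the regression line passes through the cloud so that positive and negative VA balance out.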
Making Predictions: chart of exam outcome (e.g. GCSE grades U to A* for Subject X) against baseline (e.g. MidYIS, INSIGHT or Yellis standardised scores).
Some Subjects are More Equal than Others…
1.5 grades' difference!
Principle of Fair Analysis No. 2: Compare 'Like' with 'Like'
The Assessments
Developed Ability - Maths
Developed Ability - Vocabulary
Developed Ability - Non-verbal
Developed Ability - Skills
INSIGHT - Maths
INSIGHT - Science
INSIGHT - Reading
Baseline Assessment and Predictive Feedback
Baseline Feedback: Nationally-Standardised Feedback
- How did your pupils perform on the assessment? What strengths and weaknesses do they have? As a group, how able are they?
- Predictions: given their performance on the test, how well might they do at KS3 or GCSE?
Baseline Feedback
Feedback can be used at the pupil, class and cohort level:
- to guide individuals
- to monitor pupil progress
- to monitor subject-wide and department-level progress
It serves classroom teachers, Head Teachers or SMT as a quality-assurance tool. Data can also be aggregated at other levels; we support and provide software tools (e.g. the PARIS software) to help schools do this.
Baseline Feedback – Test Scores
- National mean = 100, standard deviation = 15
- 4 performance bands: A, B, C, D
- Averages & band profiles for the cohort
- 95% of scores lie between 70 and 130
- No ceiling at 130+ or floor at 70
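The standardisation above can be illustrated as follows. Two assumptions in this sketch: scores are standardised against the small made-up group itself rather than against a national sample, and the A-D band cut-offs are taken as quartiles of a normal distribution with mean 100 and SD 15 – CEM's exact band definitions may differ:

```python
# Convert raw test marks to standardised scores (mean 100, SD 15) and
# assign four performance bands. Raw marks are invented; standardising
# against the group itself is a simplification for illustration.
import statistics

raw = [12, 18, 22, 25, 27, 30, 34, 41]          # made-up raw marks
mu, sigma = statistics.mean(raw), statistics.pstdev(raw)

standardised = [100 + 15 * (x - mu) / sigma for x in raw]

def band(score):
    # Assumed cut-offs: quartiles of N(100, 15). A = top quarter, D = bottom.
    if score >= 110.1: return "A"   # ~75th percentile
    if score >= 100.0: return "B"
    if score >= 89.9:  return "C"   # ~25th percentile
    return "D"

bands = [band(s) for s in standardised]
```

Note there is no ceiling or floor in the formula itself: a sufficiently extreme raw mark maps above 130 or below 70, matching the bullet above.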
Baseline Feedback – Band Profile Graphs
Baseline Feedback – Gifted Pupils
Standardised Test Score
Baseline Feedback Individual Pupil Recordsheets (IPRs)
Predictive Feedback: Predictions…
What is a 'Prediction'?
- NOT a forecast of the grade the student will get
- An indication of the grade (points score) achieved on average by students of similar ability in past examinations, in the previous year

Targets?
- Minimum targets: round the CEM prediction down
- Realistic targets: use the nearest CEM grade
- Challenging targets: the 75th percentile, prior value added, or an arbitrary grade fraction

75th Percentile Predictions
- If all students attain their 75th-percentile predictions, school VA will be in the top 25%
- Provided by all projects in addition to the 50th percentile
- Alis: Excel spreadsheet – 'Predictions – Spreadsheet (75th Percentile)'
- Pre-16: Adjust button on the predictions spreadsheet

Prior Value Added
- Only where prior VA is positive
- 1 year or 3?
- Reasonable to use raw residual figures, as this is an approximate measure and raw residuals give grade fractions
- Alis: can be calculated using the PARIS software

Data are used to inform, not replace, professional judgement.
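A prediction in the sense above – the average result of previous pupils with a similar baseline – can be sketched like this. The history data and the ±2-point "similarity" window are invented; the 50th and 75th percentiles correspond to the realistic and challenging targets described above:

```python
# A prediction = the median (or a chosen percentile) of the grades achieved
# last year by pupils with a similar baseline score. All data are made up,
# and the +/- 2 point similarity window is an assumption of the sketch.
import statistics

# (baseline score, GCSE points achieved) for last year's pupils
history = [(98, 4.0), (99, 4.5), (100, 4.8), (100, 5.2), (101, 5.0),
           (102, 5.6), (103, 5.4), (110, 6.4), (112, 6.8)]

def predict(baseline, window=2.0):
    similar = sorted(pts for b, pts in history if abs(b - baseline) <= window)
    median = statistics.median(similar)            # 50th percentile prediction
    q75 = statistics.quantiles(similar, n=4)[2]    # 75th: challenging target
    return median, q75

median_pred, challenging_pred = predict(100)
```

The challenging target is always at least as high as the median prediction, which is why hitting 75th-percentile predictions across a cohort would put a school's VA in the top quarter.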
Predictive Feedback
Predictive Feedback
Predictive Feedback – Chances Graphs
Chances graphs for English Language, by band: bar charts of the percent chance of each grade from U to A*. For a band D pupil the chances are roughly 2, 10, 23, 31, 24, 9 and 1 percent across the grade range; for bands C, B and A the distribution shifts progressively towards the higher grades.
Predictive Feedback – Individual Chances Graphs
A 30% chance of grade D – the most likely single grade – but a 70% chance of a different grade. Point prediction = 3.8. Chances graphs are based on the pupil's actual test score, NOT the band.
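The point prediction is simply the probability-weighted average of the grade points in a chances graph. A sketch using the band D chances quoted for English Language above; the points scale (U=0 up to A*=8) is the legacy GCSE scale and is an assumption of the sketch:

```python
# Point prediction = sum over grades of (grade points * chance of that grade).
# Chances below are the band D English Language figures from the slide;
# the U=0 ... A*=8 points scale is assumed.
grade_points = {"U": 0, "G": 1, "F": 2, "E": 3, "D": 4,
                "C": 5, "B": 6, "A": 7, "A*": 8}

chances = {"G": 2, "F": 10, "E": 23, "D": 31, "C": 24, "B": 9, "A": 1}  # percent
assert sum(chances.values()) == 100

point_prediction = sum(grade_points[g] * p for g, p in chances.items()) / 100
most_likely = max(chances, key=chances.get)   # single most likely grade
```

Even though D is the most likely single grade, the point prediction lands between D and C, because the chances of C and above together outweigh the chances of E and below. This is why a chances graph is more informative than the single predicted grade.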
Chances Graphs
The chances graphs show that, from almost any baseline score, students can achieve almost any grade; there are just different probabilities for each grade depending on the baseline score. In working with students, these graphs are more useful than a single predicted or target grade. Chances graphs serve as a warning for top-scoring students and a motivator for low-scoring students.
Value Added Feedback
Value Added Feedback
For each subject, answer the questions:
- Given their abilities, have pupils done better or worse than expected?
- Can we draw any conclusions at the department level?
Value Added Feedback
For each pupil in each subject:
- Raw residual = achieved − predicted
- The standardised residual allows fair comparison between different subjects and years
At the subject level:
- Confidence bounds are narrower with more pupils
- If the average standardised residual lies within the bounds, you cannot draw any conclusions
- If it lies outside the bounds, you can be confident that something significant is happening in that subject.
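The subject-level logic above can be sketched as follows. The national residual SD used to standardise, and the ±2/√n form of the 95% bound on the average standardised residual, are assumptions of the sketch; the pupil data are invented:

```python
# Subject-level value-added significance check: standardise each pupil's
# residual, average them, and compare the average against 95% confidence
# bounds that shrink as 1/sqrt(n). All numbers are illustrative.
import statistics

achieved  = [5.0, 4.2, 6.1, 5.5, 4.9, 6.4, 5.8, 5.2]   # GCSE points
predicted = [4.6, 4.4, 5.5, 5.1, 4.8, 5.9, 5.3, 5.0]

raw_residuals = [a - p for a, p in zip(achieved, predicted)]

# Standardise by the national spread of residuals (assumed = 0.6 points here)
NATIONAL_RESIDUAL_SD = 0.6
std_residuals = [r / NATIONAL_RESIDUAL_SD for r in raw_residuals]

n = len(std_residuals)
avg = statistics.mean(std_residuals)
bound_95 = 2 / n ** 0.5      # bounds narrow as the number of pupils grows

significant = abs(avg) > bound_95
```

With only eight pupils the bound is wide, so even a clearly positive average residual may not clear it: the correct reading is "no conclusion", not "no effect", which is exactly the point of Principle No. 3.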
Principle of Fair Analysis No. 3: Reflect Statistical Uncertainty
Value Added Feedback
Burning question: What is my value-added score?
Better question: Is it important?
Value Added Feedback – Scatter Plot
Scatter plot of GCSE English (GCSE points equivalent) against baseline score. Look for patterns:
- General under- or over-achievement?
- Do any groups of students stand out? High ability vs low ability? Male vs female?
Value Added Feedback Year 7 Pupil Level Residuals to GCSE
Value Added Feedback – Standardised Residuals Graph
- Standardised residuals are shown with confidence limits at 2 (95%) and 3 (99.7%) standard deviations
- Standardised residuals can be compared fairly between subjects and over years
Value Added Feedback – Statistical Process Control (SPC) Chart
Subject: X
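An SPC chart of this kind can be computed directly: plot each year's average standardised residual for the subject against control limits at 2 and 3 standard deviations around the centre line. The yearly values below are invented:

```python
# SPC-style check on a subject's value added over time: control limits at
# 2 SD (warning) and 3 SD (action) around the mean; a point outside the
# limits signals a real change, not noise. Values are illustrative.
import statistics

yearly_avg_residual = [0.1, -0.2, 0.0, 0.3, -0.1, 1.1]   # one value per year

centre = statistics.mean(yearly_avg_residual)
sd = statistics.pstdev(yearly_avg_residual)

upper_2sd, lower_2sd = centre + 2 * sd, centre - 2 * sd   # ~95% limits
upper_3sd, lower_3sd = centre + 3 * sd, centre - 3 * sd   # ~99.7% limits

outside_warning = [y for y in yearly_avg_residual
                   if not (lower_2sd <= y <= upper_2sd)]
```

Here the final year breaches the 2-SD warning limit while the earlier years sit comfortably inside it, which is the pattern an SPC chart is designed to surface.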
Attitudinal Surveys
Attitudinal Feedback
- Your data is above the average
- Your data is below the average
- Your data is about the same as the average
Attitudinal Feedback
Secondary Pre-16 Contact Details
Tel:
Web:
Contact details for the Yellis Project.