Introduction to CEM Secondary Pre-16 Information Systems


Introduction to CEM Secondary Pre-16 Information Systems Nicola Forster & Neil Defty Secondary Systems Programme Managers London, June 2011

Ensuring Fairness

Principles of Fair Analysis:
- Use an appropriate baseline
- Compare 'like' with 'like'
- Reflect statistical uncertainty
Issues with relying on prior attainment (e.g. KS2) as the baseline:
- Not all students have KS2 baselines
- Prior value-added: can you add value at every Key Stage?
- Under-achievement leading to under-expectation
- One teacher's output = another teacher's input
- Teaching to the test: does performance represent ability, effort or exam technique?
- Transfer of data

CEM Systems – Principle of Fair Analysis No. 1: Use an Appropriate Baseline

The Projects
- Alis (Years 12-13; A / AS / BTEC / IB etc.): GCSE or computer adaptive baseline test (paper test also available)
- Yellis (KS4, Years 10-11; GCSE): computer adaptive baseline test or paper-based test
- INSIGHT (KS3, Year 9): combines curriculum tests with developed ability
- MidYIS (Years 7-8, + additional): computer adaptive baseline test or paper-based test

The Assessments (year groups / when / delivery / includes)
- MidYIS: Years 7, 8, 9; Term 1 + catch-ups; paper or computer adaptive, 1 session; Developed Ability – Vocabulary, Maths, Non-Verbal, Skills
- Yellis: Years 10, 11
- INSIGHT: 4-week testing window (mid April – mid May) + catch-ups; computer adaptive, 3 or 4 sessions; curriculum-based – Reading, Science, + Attitudes & Developed Ability

The Assessments: INSIGHT – curriculum-based assessment
- Maths: Number & Algebra; Handling Data; Space, Shapes & Measures
- Science: Biology; Chemistry; Physics; + Attitudes to Science
- Reading: Speed Reading; Text Comprehension; Passage Comprehension
Additionally for INSIGHT:
- Developed Ability: Vocabulary, Non-Verbal, Skills
- Attitudinal measures

The Assessments: What is Computer Adaptive Assessment?
- Questions adapt to the pupil
- Efficient – no time wasting
- Wider ability range
- More enjoyable

Computer-adaptive vs Paper-based Testing
The pros and cons of computer-adaptive and paper-based testing:
- Number of students per test session: computer adaptive – limited by the number of computers available, so multiple testing sessions; paper-based – can test all students in a single session (in a hall or in form groups) or across more than one session.
- Cost: computer adaptive – roughly 30% cheaper than the paper-based test; paper-based – standard cost.
- Processing of baseline feedback: computer adaptive – available within a couple of hours of testing; paper-based – takes around 2-4 weeks for papers to be marked.
- Preparation: computer adaptive – must be able to install the software or access the internet version of the test; paper-based – no pre-test set-up.
- Student experience: computer adaptive – a "tailored" assessment; paper-based – all students see all questions, irrespective of suitability.
In the adaptive test, pupils are directed to harder or easier items depending on their response to the previous question.
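
The note above describes the adaptive mechanism only at a high level. As a rough illustration of the idea (not CEM's actual algorithm), the sketch below steps up to a harder item after each correct answer and down to an easier one after each mistake; the item bank, step size and ability estimate are invented for the example.

```python
# Minimal sketch of one adaptive-testing idea: move to a harder item after a
# correct answer, an easier item after an error. Illustrative only -- the item
# bank, step size and stopping rule are invented, not CEM's algorithm.

def run_adaptive_test(items, answer_fn, num_questions=10):
    """items: list of (question, difficulty) sorted by difficulty.
    answer_fn(question) -> True if the pupil answers correctly."""
    index = len(items) // 2          # start at a middle-difficulty item
    responses = []
    for _ in range(num_questions):
        question, difficulty = items[index]
        correct = answer_fn(question)
        responses.append((difficulty, correct))
        # Harder item after a correct answer, easier after an incorrect one.
        index = min(index + 1, len(items) - 1) if correct else max(index - 1, 0)
    # Crude ability estimate: average difficulty of correctly answered items.
    answered_right = [d for d, ok in responses if ok]
    return sum(answered_right) / len(answered_right) if answered_right else None

bank = [(f"Q{i}", i) for i in range(1, 21)]                 # 20 items, difficulty 1..20
print(run_adaptive_test(bank, lambda q: int(q[1:]) <= 12))  # pupil secure up to difficulty 12
```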

The Analysis

Linear Least Squares Regression
[Scatter plot: Subject X outcomes against baseline scores (roughly 50-150), showing the residuals, the regression line (trend line, line of best fit), and regions of negative and positive VA.]
Outcome = gradient × baseline + intercept
Correlation coefficient ~ 0.7
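
As a worked sketch of the regression just described, the snippet below fits Outcome = gradient × baseline + intercept by least squares with NumPy and derives the residuals and the correlation coefficient; the baseline and outcome values are made-up example data, not CEM data.

```python
import numpy as np

# Invented example data: baseline standardised scores and exam outcomes.
baseline = np.array([85, 92, 100, 104, 110, 118, 125], dtype=float)
outcome  = np.array([34, 40, 46, 44, 52, 55, 60], dtype=float)

# Least-squares fit of: outcome = gradient * baseline + intercept
gradient, intercept = np.polyfit(baseline, outcome, 1)

predicted = gradient * baseline + intercept
residuals = outcome - predicted            # +ve VA above the line, -ve VA below

# Correlation coefficient (the slide quotes roughly 0.7 for real cohorts).
r = np.corrcoef(baseline, outcome)[0, 1]
print(gradient, intercept, residuals, r)
```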

Making Predictions
[Chart: GCSE grades (U to A*) in Subject X plotted against MidYIS, INSIGHT or Yellis standardised scores (50-150).]

Some Subjects are More Equal than Others… 1.5 grades' difference!
Principle of Fair Analysis No. 2: Compare 'Like' with 'Like'

The Assessments

Developed Ability - Maths

Developed Ability - Vocabulary

Developed Ability - Non-verbal

Developed Ability - Skills

INSIGHT - Maths

INSIGHT - Science

INSIGHT - Reading

Baseline Assessment and Predictive Feedback

Baseline Feedback
Nationally-standardised feedback:
- How did your pupils perform on the assessment?
- What strengths and weaknesses do they have?
- As a group, how able are they?
Predictions:
- Given their performances on the test, how well might they do at KS3 or GCSE?

Baseline Feedback
Feedback can be used at the pupil, class and cohort level:
- to guide individuals
- to monitor pupil progress
- to monitor subject-wide and department-level progress
- by classroom teachers, Head teachers or SMT as a quality-assurance tool.
Data can be aggregated at other levels; we support and provide software tools (e.g. the PARIS software) to help schools do this.

Baseline Feedback – Test Scores

Baseline Feedback – Test Scores
- National mean = 100, standard deviation = 15
- 4 performance bands: A, B, C, D
- Averages & band profiles for the cohort
- 95% of scores lie between 70 & 130
- No ceiling at 130+ or floor at 70
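
To make the scale concrete, here is a minimal sketch of nationally-standardised scoring: a raw score is rescaled so that the national mean maps to 100 and the national standard deviation to 15. The band cut-offs in the sketch assume the four bands are quartiles of a normal(100, 15) distribution; that is an illustrative assumption, not CEM's published definition.

```python
def standardise(raw_score, national_mean, national_sd):
    """Rescale a raw score onto the mean-100, SD-15 national scale."""
    return 100 + 15 * (raw_score - national_mean) / national_sd

def band(standardised_score):
    """Assign one of four performance bands.
    Cut-offs assume the bands are quartiles of a normal(100, 15) distribution --
    an assumption for illustration, not CEM's published definition."""
    if standardised_score >= 110.1:   # top quartile
        return "A"
    if standardised_score >= 100:
        return "B"
    if standardised_score >= 89.9:
        return "C"
    return "D"

score = standardise(62, national_mean=50, national_sd=12)   # -> 115.0
print(score, band(score))                                    # 115.0 A
```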

Baseline Feedback – Band Profile Graphs

Baseline Feedback – Gifted Pupils (standardised test score)

Baseline Feedback – Individual Pupil Record Sheets (IPRs)

Predictive Feedback
What is a 'prediction'?
- The average performance by similar pupils in past examinations
- NOT a forecast of the grade the student will get
- An indication of the grade (points score) achieved, on average, by students of similar ability in the previous year
Targets?
- Minimum targets – round the CEM prediction down?
- Realistic targets – use the nearest CEM grade?
- Challenging targets – 75th percentile? Prior value added? An arbitrary grade fraction?
75th percentile predictions:
- If all students attain their 75th percentile predictions, school VA will be in the top 25%
- Provided by all projects in addition to the 50th percentile
- Alis: Excel spreadsheet – 'Predictions – Spreadsheet (75th Percentile)'
- Pre-16: Adjust button on the predictions spreadsheet
Prior value added:
- Only where prior VA is positive? 1 year or 3?
- Reasonable to use raw residual figures, as this is an approximate measure and raw residuals give grade fractions
- Alis: can be calculated using the PARIS software
Data is used to inform, not replace, professional judgement.
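
A minimal sketch of the idea behind these point predictions and percentile targets, under simplifying assumptions: take last year's pupils with a similar baseline score and report a chosen percentile (50th for a 'realistic' prediction, 75th for a 'challenging' target) of the points they went on to achieve. The matching window, cohort data and function name are invented for illustration; CEM's actual predictions come from the regression approach described above.

```python
import statistics

def point_prediction(baseline_score, previous_cohort, window=5.0, percentile=50):
    """previous_cohort: list of (baseline_score, gcse_points) from last year.
    Returns the chosen percentile of points achieved by pupils whose baseline
    was within `window` of this pupil's score. Window size is an assumption."""
    similar = [pts for b, pts in previous_cohort if abs(b - baseline_score) <= window]
    if not similar:
        return None
    # statistics.quantiles with n=100 returns the 1st..99th percentiles.
    return statistics.quantiles(similar, n=100)[percentile - 1]

cohort = [(95, 4.0), (98, 4.5), (101, 5.0), (103, 5.5), (105, 6.0), (99, 5.0)]
print(point_prediction(100, cohort, percentile=50))   # 'realistic' prediction
print(point_prediction(100, cohort, percentile=75))   # 'challenging' target
```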

Predictive Feedback

Predictive Feedback

Predictive Feedback – Chances Graphs
[Four bar charts, one per baseline band (A, B, C, D), showing the percentage chance of each GCSE grade (U to A*) in English Language.]

Predictive Feedback – Individual Chances Graphs
- 30% chance of a grade D – the most likely single grade; a 70% chance of a different grade
- Point prediction = 3.8
- Individual chances graphs are based on the pupil's actual test score, NOT the band
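
The point prediction is consistent with reading a chances graph as a probability distribution over grades: the expected points score is the probability-weighted average of the grade points. The probabilities and grade-point values below are assumed for illustration, not taken from the slides.

```python
# Illustrative only: a chances graph read as P(grade) for one pupil, with the
# point prediction taken as the probability-weighted average of grade points.
# Both the probabilities and the grade-point scale are assumptions.
grade_points = {"U": 0, "G": 1, "F": 2, "E": 3, "D": 4, "C": 5, "B": 6, "A": 7, "A*": 8}
chances = {"F": 0.10, "E": 0.25, "D": 0.30, "C": 0.20, "B": 0.10, "A": 0.05}  # sums to 1

point_prediction = sum(p * grade_points[g] for g, p in chances.items())
most_likely = max(chances, key=chances.get)
print(point_prediction, most_likely)   # 4.1 points; modal grade D
```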

Chances Graphs
The chances graphs show that, from almost any baseline score, students can end up with almost any grade; there are just different probabilities for each grade depending on the baseline score. In working with students, these graphs are often more useful than a single predicted or target grade: they serve as a warning for top-scoring students and a motivator for low-scoring students.

Value Added Feedback

Value Added Feedback
For each subject, the feedback answers two questions:
- Given their abilities, have pupils done better or worse than expected?
- Can we draw any conclusions at the department level?

Value Added Feedback
For each pupil in each subject:
- Raw residual = achieved – predicted
- Standardised residual – allows fair comparison between different subjects and years
At subject level:
- Confidence bounds are narrower with more pupils
- If the average standardised residual lies within the bounds, you cannot draw any conclusions
- If the average standardised residual lies outside the bounds, you can be confident that something significant is happening in that subject
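
A sketch of the subject-level calculation described above, under simplifying assumptions: the raw residual is achieved minus predicted points, the standardised residual divides that by the national spread of residuals for the subject, and the 95% confidence band for the subject average is approximated as ±2/√n. The numbers and the exact standardisation are illustrative, not CEM's published formulae.

```python
import math

def subject_value_added(achieved, predicted, national_residual_sd):
    """achieved/predicted: points for each pupil in one subject.
    Simplified sketch: standardise residuals by the national residual SD and
    compare the subject mean with approximate 95% bounds of +/- 2/sqrt(n)."""
    raw = [a - p for a, p in zip(achieved, predicted)]
    standardised = [r / national_residual_sd for r in raw]
    n = len(standardised)
    mean_std_residual = sum(standardised) / n
    bound_95 = 2 / math.sqrt(n)          # bounds narrow as the cohort grows
    significant = abs(mean_std_residual) > bound_95
    return mean_std_residual, bound_95, significant

print(subject_value_added([5, 6, 4, 7], [4.6, 5.8, 4.4, 6.2], national_residual_sd=1.2))
```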

Value Added Feedback
Burning question: What is my value-added score?
Better question: Is it important?
Principle of Fair Analysis No. 3: Reflect Statistical Uncertainty

Value Added Feedback – Scatter Plot (GCSE English: GCSE points equivalent vs baseline score)
Look for patterns…
- General under- or over-achievement?
- Do any groups of students stand out? High ability vs low ability? Male vs female?
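
A minimal matplotlib sketch of the kind of scatter plot described above, with the trend line overlaid so that general under- or over-achievement and sub-groups (e.g. male vs female) can be spotted; all data values are invented.

```python
import numpy as np
import matplotlib.pyplot as plt

# Invented example data: baseline scores, GCSE points equivalents, gender flags.
baseline = np.array([82, 90, 95, 100, 104, 108, 115, 122, 128], dtype=float)
points   = np.array([3.1, 3.8, 4.6, 4.4, 5.2, 5.0, 6.1, 6.4, 7.2])
is_male  = np.array([1, 0, 1, 0, 1, 0, 1, 0, 1], dtype=bool)

gradient, intercept = np.polyfit(baseline, points, 1)   # trend line

plt.scatter(baseline[is_male], points[is_male], label="male", marker="o")
plt.scatter(baseline[~is_male], points[~is_male], label="female", marker="s")
xs = np.linspace(baseline.min(), baseline.max(), 50)
plt.plot(xs, gradient * xs + intercept, label="trend line")
plt.xlabel("Baseline score")
plt.ylabel("GCSE points equivalent")
plt.legend()
plt.show()
```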

Value Added Feedback – Year 7 Pupil-Level Residuals to GCSE

Value Added Feedback – Standardised Residuals Graph
- Standardised residuals are shown with confidence limits at 2 (95%) and 3 (99.7%) standard deviations
- Standardised residuals can be compared fairly between subjects and over years

Value Added Feedback – Statistical Process Control (SPC) Chart (Subject X)

Attitudinal Surveys

Attitudinal Feedback
- Your data is above the average
- Your data is below the average
- Your data is about the same as the average

Attitudinal Feedback

Secondary Pre-16 Contact Details
Tel: 0191 334 4255
Email: secondary.support@cem.dur.ac.uk
Web: www.cemcentre.org