Feedback from CEM Assessments: Individual Pupil Records & Predictions
Belfast, 6th March 2013
Neil Defty, Business & Development Manager, CEM

Baseline Assessment Data
To inform professional judgement
To start a conversation

USING BASELINE DATA FOR TEACHING AND LEARNING
If feedback data is to be trusted, then probably:
The students understood the purpose of the assessments, and each student did their best at the time.
As a year group, how able are they? What strengths and weaknesses does the group have? How did each student perform?

Bands, percentiles, standardised scores…
[Chart: the standardised score scale divided into four bands, D to A, at the 25th, 50th and 75th percentiles]

Alis: Year 12 students, two baseline profiles for the same school. Nationally, 25% fall in each band.

Band Profile Graph: all MidYIS cohort. Checking this graph each year will give you an immediate overview of your intake. A school with a ‘completely average’ intake would have 25% of pupils within each band. [Chart: bands D to A across the horizontal axis]
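
A minimal sketch of how such a band profile could be computed, assuming scores standardised to a national mean of 100 and SD of 15 (so the quartile boundaries fall at roughly 90, 100 and 110). The cut-offs, function name and data below are illustrative, not CEM's published values.

```python
import numpy as np

# Hypothetical band boundaries: for scores standardised to mean 100, SD 15,
# the national 25th/50th/75th percentiles fall at roughly 90, 100 and 110.
BAND_CUTS = [90.0, 100.0, 110.0]   # D | C | B | A
BAND_NAMES = ["D", "C", "B", "A"]

def band_profile(scores):
    """Return the percentage of the cohort falling in each band."""
    scores = np.asarray(scores, dtype=float)
    bands = np.digitize(scores, BAND_CUTS)   # 0..3 maps to D..A
    counts = np.bincount(bands, minlength=4)
    return dict(zip(BAND_NAMES, 100.0 * counts / len(scores)))

# A 'completely average' intake shows roughly 25% in each band.
cohort = np.random.default_rng(0).normal(100, 15, size=180)
print(band_profile(cohort))
```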

[Band Profile Graph detail: Band D highlighted]

On another day, with 95% certainty, the score of 100.6 (± 1.1 × 2) would be no higher than 100.6 + 2.2 = 102.8 and no lower than 100.6 − 2.2 = 98.4.
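
The same ± 2 × standard error calculation as a small sketch; the score of 100.6 and standard error of 1.1 are the slide's figures, while the function name is ours.

```python
def score_interval(score, sem, k=2):
    """95% confidence interval for a standardised score: score +/- k * SEM."""
    half_width = k * sem
    return score - half_width, score + half_width

low, high = score_interval(100.6, 1.1)
print(f"On another day: between {low:.1f} and {high:.1f}")  # 98.4 to 102.8
```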

Individual Pupil Records (IPRs): to start a conversation. (See the report “Using MidYIS Individual Pupil Records to Inform Teaching and Learning”.)

INSIGHT Pupil IPR

IPRs in KS3 levels [Chart: pupil profile plotted against the national average]

The code:
j: reduces size
k: infills
m: inverts

“Predictions”

How is a ‘prediction’ generated? [Scatter plot: pupils’ baseline scores (horizontal axis) against final grades, C up to A* (vertical axis)]

Three key points:
The higher the baseline score, the higher the final grade.
Any one grade is achievable from a range of baseline scores.
From any baseline score, a range of grades is possible.

How is a ‘prediction’ generated? [Scatter plot with the subject’s national trend line (regression line) drawn through it: 50% of pupils sit on or above the trend line and 50% on or below. The ‘prediction’ (expected grade) for a given baseline score is read off the trend line.]
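
A sketch of the idea: fit a national trend line (ordinary least-squares regression) of past pupils' grade points on their baseline scores, then read a new pupil's ‘prediction’ off the line. The data values and function name here are illustrative.

```python
import numpy as np

# Historical data: baseline scores and final grade points for past pupils
# (illustrative values only).
baseline = np.array([85, 90, 95, 100, 105, 110, 115, 120], dtype=float)
points   = np.array([3.1, 3.8, 4.4, 5.0, 5.6, 6.1, 6.8, 7.3])

# Fit the national trend (regression) line: points = slope * baseline + intercept.
slope, intercept = np.polyfit(baseline, points, deg=1)

def predicted_points(score):
    """Expected grade points for a pupil with the given baseline score."""
    return slope * score + intercept

print(round(predicted_points(101), 1))  # ~5.1, i.e. around grade C
```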

‘Predictions’ are based on the average performance of similar pupils in past examinations. The problem with the word ‘prediction’ is…? An alternative is ‘expected grade’.

Predictions [Chart: trend line converting baseline scores to points, e.g. 4 points = grade D, 6.6 points = grade A/B]
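
The figures appear to use the pre-2017 GCSE points scale (G = 1 up to A* = 8, so 4 points = D, and 6.6 points falls between B = 6 and A = 7, hence ‘A/B’). A sketch of rendering points as a grade under that assumption; the mapping function is ours.

```python
GRADES = ["G", "F", "E", "D", "C", "B", "A", "A*"]  # 1..8 on the assumed scale

def points_to_grade(points):
    """Render trend-line points as a grade, e.g. 4.0 -> 'D', 6.6 -> 'A/B'."""
    lower = int(points)          # index of the grade just below
    frac = points - lower
    if frac < 0.25:
        return GRADES[lower - 1]
    if frac > 0.75:
        return GRADES[lower]     # round up to the grade above
    return f"{GRADES[lower]}/{GRADES[lower - 1]}"  # borderline, e.g. 'A/B'

for p in (4.0, 5.1, 6.6):
    print(p, points_to_grade(p))
```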

FACTORS THAT WILL INFLUENCE THE RELIABILITY OF PREDICTIONS:
Knowledge of the student
Parental support / home life
Peer influences / social life
Student attitude, interest, language
Expectations of staff
Department/institution ethos
Resources
Quality of teaching and learning: pace of lessons
Understanding how children learn
…and the reliability of the predictions.

[Scatter plot: baseline score against result, correlation = 1]

[Scatter plots: correlation = 0 and correlation = 0.7]
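
The correlation between baseline score and final result can be checked directly; a small sketch with illustrative data:

```python
import numpy as np

# Illustrative baseline scores and final grade points for one subject.
baseline = np.array([85, 92, 98, 103, 107, 112, 118], dtype=float)
result   = np.array([3.0, 4.2, 4.1, 5.3, 5.0, 6.4, 6.9])

# Pearson correlation: 1 = perfect straight line, 0 = no linear relationship.
r = np.corrcoef(baseline, result)[0, 1]
print(f"correlation = {r:.2f}")
```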

The graph below shows the middle 2/3 of some subject trend lines

Prediction/expected grade: 5.1 points = grade C (the most likely grade)

Not a label for life… just another piece of information. The Chances graphs show that, from almost any baseline score, students can achieve almost any grade; there are just different probabilities for each grade, depending on the baseline score. In working with students, these graphs are more useful than a single predicted or target grade. Chances graphs show what can be achieved:
–By students of similar ability
–By students with lower baseline scores
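
A sketch of how a chances graph could be built: take past pupils whose baseline scores were close to the pupil in question and tabulate the distribution of grades they actually achieved. The data, window size and function name here are illustrative.

```python
from collections import Counter

# Historical (baseline score, final grade) pairs -- illustrative only.
history = [(98, "C"), (101, "C"), (99, "B"), (103, "B"), (97, "D"),
           (100, "C"), (102, "A"), (96, "C"), (104, "B"), (99, "C")]

def chances(score, window=5):
    """Percentage chance of each grade among pupils with similar baselines."""
    similar = [g for s, g in history if abs(s - score) <= window]
    counts = Counter(similar)
    return {g: round(100 * n / len(similar)) for g, n in counts.most_common()}

print(chances(100))  # e.g. {'C': 50, 'B': 30, 'D': 10, 'A': 10}
```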

…to place the school at the 75th percentile of value-added results.

Or insert your own values and click Adjust.
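
A sketch of the adjustment idea: a raw prediction sits on the national trend line (the 50th percentile of value added), so shifting every prediction by the 75th-percentile value-added residual gives targets which, if met, would place the school at the 75th percentile. The residual data and function name are illustrative, not CEM's implementation.

```python
import numpy as np

def adjust_predictions(predictions, residuals, percentile=75):
    """Shift trend-line predictions by the chosen percentile of the
    national value-added residuals (the trend line itself is the 50th)."""
    shift = np.percentile(residuals, percentile)
    return np.asarray(predictions) + shift

# Illustrative national residuals (achieved points minus predicted points).
residuals = np.random.default_rng(1).normal(0, 0.8, size=1000)
targets = adjust_predictions([5.1, 6.4], residuals)
print(np.round(targets, 1))
```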

Prediction/expected grade: 6.4 points = grade A/B (the most likely grade)