Making the most of Assessment Data in the Secondary Years Dr Robert Clark.

Presentation transcript:

Making the most of Assessment Data in the Secondary Years Dr Robert Clark

Baseline Assessment
Year 7 (8), Year 9, Year 10 (11)
Computer Adaptive Baseline Test: Vocab, Maths, Non-Verbal

What is an Adaptive Test?
- All questions are allocated to groups of different difficulty.
- All students in a year group get a common-difficulty starting question.
- If the answer is correct, the next question is harder; if the answer is wrong, the next question is easier.
- The test looks at all answers in a section and homes in on questions of a suitable difficulty for the student.
Try it yourself at
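The up/down selection rule above can be sketched in a few lines of Python. This is purely illustrative: the difficulty bands, the one-band step size and the fixed question count are assumptions, not CEM's actual algorithm.

```python
# Illustrative sketch of the adaptive selection rule: move up one difficulty
# band after a correct answer, down one band after a wrong answer.

def run_adaptive_section(questions_by_band, answer_fn, start_band, n_questions=10):
    """questions_by_band: dict of band number -> list of questions.
    answer_fn: callable taking a question, returning True if answered correctly."""
    band = start_band
    max_band = max(questions_by_band)
    history = []
    for _ in range(n_questions):
        question = questions_by_band[band].pop()
        correct = answer_fn(question)
        history.append((band, correct))
        # harder question if correct, easier if wrong, within band limits
        band = min(band + 1, max_band) if correct else max(band - 1, 0)
    return history
```

A student who answers everything correctly climbs steadily to the hardest band and stays there, which is the "homing in" behaviour the slide describes.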


Standardisation
Test scores are standardised: mean = 100, SD = 15.

Standardised Score   National Percentage   Comment
>130                 Top 2.5%              Traditional classification of 'mentally gifted'
>120                 Top 10%
>108                 Top 30%
>100                 Top 50%
<70                  Bottom 2.5%           Potential special educational needs?
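The mean-100, SD-15 scale in the table is a standard linear transformation of z-scores. A minimal sketch, with the caveat that real CEM scores are standardised against the national cohort, not against a single class:

```python
# Sketch of standardising raw scores to mean 100, SD 15.
from statistics import mean, pstdev

def standardise(raw_scores):
    """Map raw scores onto the mean = 100, SD = 15 scale."""
    m, sd = mean(raw_scores), pstdev(raw_scores)
    return [100 + 15 * (x - m) / sd for x in raw_scores]
```

On this scale a score of 130 is two standard deviations above the mean, which is where the "top 2.5%" figure in the table comes from.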

Feedback from Baseline Test
- Intake Profiles
- Individual Pupil Records (IPRs)
- Predictions (GCSE)

Individual Pupil Record Sheets (IPRs)
Look for sections that are inconsistent.

General IPR Patterns
- Pupils with high scores across all components
- Pupils with low scores across all components
- Pupils with significant differences between one or two components:
  - Vocab lower / higher than others
  - Maths higher / lower than others
  - Non-Verbal higher / lower than others
  - Low Skills / High Skills
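The patterns listed above can be flagged mechanically from a pupil's component scores. In this sketch a component counts as "significantly different" when it sits more than 15 points (one SD on the standardised scale) away from the mean of the other components; that threshold is an assumption for illustration, not CEM's definition.

```python
# Illustrative flagging of IPR patterns from standardised component scores.

def flag_profile(scores, threshold=15):
    """scores: dict of component name -> standardised score.
    Returns the components sitting more than `threshold` points
    above or below the mean of the remaining components."""
    flags = {}
    for name, score in scores.items():
        others = [s for n, s in scores.items() if n != name]
        diff = score - sum(others) / len(others)
        if diff > threshold:
            flags[name] = "higher than others"
        elif diff < -threshold:
            flags[name] = "lower than others"
    return flags
```

For example, a pupil scoring Vocab 80 with Maths 105 and Non-Verbal 107 would be flagged as "Vocab lower than others", prompting the questions on the slides that follow.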

Pupils with high scores across all components
Score > 130 – top 2.5%
- Gifted?
- Challenging work?
- Are they being stretched?

Pupils with low scores across all components
Score < 70 – bottom 2.5%
- Special educational needs?
- Specialist testing required?

Vocab significantly lower than other sections
- English as a second language?
- Understanding the language used in learning and assessment?
- Language enrichment?

Vocab significantly higher than other sections
- Good communicator?
- Work in class may not be to this standard:
  - weak Non-Verbal
  - weak Maths
  - weak Skills (speed of working?)
- May benefit from verbal descriptors?

Maths significantly higher than other sections
- Strong maths ability
- Not 100% curriculum-free: may depend on prior teaching effectiveness
- Far East influence?

Maths significantly lower than other sections
- Implications not just for maths but for other numerate or data-based subjects
- General poor numeracy?
- Remedial maths?

Non-Verbal significantly higher than other sections
- Good spatial and non-verbal ability
- May have high specific skills
- Low Vocab, Maths & Skills may indicate difficulty communicating
- Frustration?

Non-Verbal significantly lower than other sections
- Difficulty understanding diagrams or graphical instructions?
- Verbal explanation? Physical demonstration? Physical models?

Low Skills Scores
Skills = proof reading and perceptual speed & accuracy (speed of working)
- Work well in class/homework but underachieve in exams?
- Problems checking work or decoding questions?
- Low Skills + low Vocab:
  - poor written work in class (unable to work quickly)
  - dyslexia? Further specialist assessment required

High Skills Scores
Skills = proof reading and perceptual speed & accuracy
- Can work quickly & accurately
- Difficulty communicating and expressing ideas?
- May perform poorly in areas using numeracy skills and in subjects needing 3D visualisation and spatial concepts?
- May struggle in most areas of the curriculum

Predictions
- NOT a forecast of the grade the student will get: an indication of the grade (points score) achieved on average by students of similar ability in the previous year
- NOT a target

Targets?
- Minimum targets: round the CEM prediction down?
- Realistic targets: use the nearest CEM grade?
- Challenging targets:
  - 75th percentile?
  - prior value added?
  - arbitrary grade fraction?
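The "minimum" and "realistic" target styles above can be sketched as simple points-to-grade conversions. The A* = 58 points figure appears on the next slide; the remaining grade boundaries here follow the old six-point GCSE scale but should be treated as assumptions for illustration.

```python
# Hedged sketch of converting a CEM predicted points score into targets.
# Grade boundaries are assumed (A* = 58 from the slides; the rest illustrative).
GRADE_POINTS = {"A*": 58, "A": 52, "B": 46, "C": 40, "D": 34}

def realistic_target(points):
    """Realistic target: the grade nearest the predicted points score."""
    return min(GRADE_POINTS, key=lambda g: abs(GRADE_POINTS[g] - points))

def minimum_target(points):
    """Minimum target: round the prediction down to the grade at or below it."""
    eligible = [g for g, p in GRADE_POINTS.items() if p <= points]
    if not eligible:
        return "D"  # floor at the lowest grade listed
    return max(eligible, key=lambda g: GRADE_POINTS[g])
```

So a prediction of 50 points gives a realistic target of A (nearest grade) but a minimum target of B (rounded down), matching the distinction the slide draws.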

Adjust predictions to reflect school expectation
- Convert to points (A* = 58)
- Display predictions as grades

75th Percentile; Prior Value Added

Chances Graphs

Curriculum Assessment Y9 – INSIGHT
Combines curriculum-based assessment with baseline assessment:
- Maths
- Science
- Reading
- Developed Ability

Feedback includes:
- Standardised scores (& KS3 levels)
- IPRs
- Predictions (including chances graphs), from Y7 and to GCSE
- Value added from Y7
- Value added to GCSE

IPRs
Also available as KS3 equivalent (sub-)levels: Vocab, Maths, Science, Developed Ability

Thank You Dr Robert Clark