The MidYIS Test.


CEM at Durham University
The Centre for Evaluation and Monitoring (CEM) is the largest educational research unit based in a UK university. It works with schools, colleges, education authorities and government agencies to provide high-quality information through scientifically grounded research, using evidence rather than authority or opinion as a guide to educational practice. It is home to a widely used family of baseline testing systems, including ALIS (sixth form), Yellis (Key Stage 4), MidYIS (Key Stage 3) and PIPS (Key Stage 2).

The MidYIS test
MidYIS stands for Middle Years Information System and is currently used in over 3,000 secondary schools. The tests are designed to measure ability and aptitude for learning rather than achievement. MidYIS is not an IQ test: it provides a measure of 'typical' performance, which can be used to give an expected level of attainment at GCSE based on national average results.

Computer Adaptive Testing
The adaptive test draws on a growing bank of questions. Its adaptive nature means that all pupils are challenged and receive a bespoke test suited to their ability: the most able pupils do not waste time on items that are too easy for them, and lower-ability pupils are not discouraged by questions they cannot answer. Adaptive testing is considered the most efficient way of measuring pupils' abilities. Because each question is chosen on the basis of the answers to previous questions, the test can quickly home in on each pupil's ability without requiring reams of unnecessary questions that would be too easy for the most able or too hard for the least able.
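To make the selection idea concrete, here is a minimal sketch (in Python) of one possible adaptive loop. It illustrates the general technique only, not CEM's actual algorithm; the question bank, step sizes and ability scale are all invented for the example.

    # Minimal sketch of adaptive item selection (illustrative only, not CEM's algorithm).
    # Ability and difficulty sit on the same arbitrary scale; all numbers are invented.
    def run_adaptive_test(questions, answer_fn, n_items=10):
        """questions: list of dicts with an 'id' and a 'difficulty'.
        answer_fn(question) returns True for a correct answer, False otherwise."""
        ability = 0.0                 # start from an average ability estimate
        step = 1.0                    # how far the estimate moves after each answer
        unused = list(questions)
        for _ in range(min(n_items, len(unused))):
            # Offer the unused question whose difficulty is closest to the current
            # estimate, so the pupil is neither bored nor overwhelmed.
            question = min(unused, key=lambda q: abs(q["difficulty"] - ability))
            unused.remove(question)
            # Move the estimate up after a correct answer, down after an incorrect
            # one, taking smaller steps as the picture firms up.
            ability += step if answer_fn(question) else -step
            step = max(step * 0.7, 0.2)
        return ability

In a real adaptive system the ability estimate and the question bank are calibrated statistically (for example with item response theory); the simple step-halving rule above is only the plainest stand-in for that.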

What is tested?
Vocabulary: CEM's research has shown that the development of vocabulary is not fundamentally affected by teaching and is a vital component of success in all subjects.
Maths: pupils are presented with a variety of mathematical questions, ranging from basic arithmetic through to algebra.
Non-verbal: a battery of tests looking at the ability to handle shapes, in order to assess visual and spatial skills.
Skills: a test of proof-reading together with perceptual speed and accuracy. Pupils have to identify errors in short passages, such as spelling mistakes, incorrect punctuation and capitalisation.

Standard Score
A pupil's raw MidYIS scores are standardised against the results of everyone in the country taking part in the project, so that scores can be compared across year groups and between pupils in different schools. A score of 100 represents the national average, and only about 2% of pupils nationally will score more than 130. The scores can be summarised as follows:
76-87: well below the national average
88-95: below average
96-105: average
106-112: above average
113-124: well above average
≥125: far above average
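As a rough illustration of the standardisation step, the sketch below assumes the standardised scale has a standard deviation of 15 points, which is consistent with about 2% of pupils scoring above 130 but is an assumption rather than a figure quoted on the slide.

    # Illustrative sketch of score standardisation; the 15-point spread is an
    # assumption, not a figure quoted by CEM. Raw figures are invented.
    def standardise(raw_score, national_mean, national_sd):
        """Rescale a raw score so the national average maps to 100."""
        return 100 + 15 * (raw_score - national_mean) / national_sd

    # Example: a pupil one national standard deviation above the mean comes out
    # at 115, which the table above labels 'well above average'.
    print(standardise(raw_score=62, national_mean=50, national_sd=12))   # 115.0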

MidYIS Band
The score for each component, and for the overall test, is used to place pupils in a performance band. Each band represents 25% of the national ability range, with A the highest. The scores for each band are as follows:
<90: Band D (bottom 25%)
91-99: Band C
100-109: Band B
≥110: Band A (top 25%)

Stanine
The MidYIS Band places pupils into four ability bands with equal numbers of pupils in each band. The 'stanines' instead divide the results scale into nine roughly equally spaced divisions, with different numbers of pupils in each stanine: 9 is the highest and has the fewest pupils, and most pupils fall in the middle, stanines 4 to 6. When plotted on a graph, the results form a 'bell curve': most pupils are, inevitably, 'average', so the highest point of the curve sits at the average score, standardised to 100.

The diagram shows how the four ability bands and the nine stanines are mapped.
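In place of the diagram, here is a small sketch of that mapping. The band boundaries follow the table above (a score of exactly 90, which the table leaves unassigned, is treated as Band C here), and the stanine cut-points use the conventional definition of half-standard-deviation slices of a normal curve with mean 100 and SD 15; both of these reading choices are assumptions rather than CEM's published figures.

    import bisect

    def midyis_band(score):
        """Map a standardised score to the four MidYIS bands, per the table above.
        A score of exactly 90 is treated as Band C, a choice the table leaves open."""
        if score < 90:
            return "D"    # bottom 25%
        if score <= 99:
            return "C"
        if score <= 109:
            return "B"
        return "A"        # top 25%

    # Conventional stanine cut-points for a scale with mean 100 and SD 15
    # (an assumed definition, not CEM's published boundaries).
    STANINE_CUTS = [74, 81, 89, 96, 104, 111, 119, 126]

    def stanine(score):
        """Return a stanine from 1 (lowest, fewest pupils) to 9 (highest)."""
        return bisect.bisect_right(STANINE_CUTS, score) + 1

    print(midyis_band(104), stanine(104))   # B 6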

Percentile
The percentile measures relative performance by showing the percentage of pupils who achieved a lower score nationally in a particular component. For example, a percentile of 91 shows that 91% of pupils in the national sample scored lower in that component, and only 9% performed better. The percentile is often helpful in showing discrepancies in performance between different components.
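If the standardised scores are treated as following a normal curve with mean 100 and an assumed standard deviation of 15 (the same assumption as above), an approximate percentile can be read off from the score directly; this sketch illustrates the relationship and is not CEM's exact lookup.

    from statistics import NormalDist

    def percentile(standardised_score, mean=100, sd=15):
        """Approximate percentage of pupils nationally expected to score lower,
        assuming a normal curve (an assumption, not CEM's exact tables)."""
        return round(100 * NormalDist(mean, sd).cdf(standardised_score))

    print(percentile(120))   # about 91, i.e. roughly 9% of pupils scored higher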

Confidence Limits
It is important to be aware that there is a margin of error in the tests: a pupil sitting the same test on a different day could have got slightly different scores, both in individual components and overall. The graph therefore shows each score with an upper and lower limit for the potential variation, a 95% confidence limit. The Vocabulary score achieved was 104; the graph shows that we can be 95% confident the true score is no higher than about 112 and no lower than about 95. The confidence limit is generally much narrower for the overall score than for any individual component. As a rule, this means we can be very confident about the overall score, while bearing in mind a possible variation of about ±5.
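The width of such an interval comes from the standard error of measurement of each component. The figure used below (about 4.3 points for Vocabulary) is back-solved from the 104 and roughly 95-112 example on the slide and is an illustrative assumption, not a published CEM value.

    def confidence_limits(score, standard_error, z=1.96):
        """95% confidence limits for a component score; the standard error
        passed in is an assumed figure, not a published CEM value."""
        return score - z * standard_error, score + z * standard_error

    # Roughly reproduces the Vocabulary example above.
    low, high = confidence_limits(104, standard_error=4.3)
    print(low, high)   # about 95.6 and 112.4, i.e. roughly 95 to 112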