Introduction to CEM and the Baseline Tests


Introduction to CEM and the Baseline Tests Dr Robert Clark

Who / What is CEM ?

Centre for Evaluation & Monitoring
- Not-for-profit organisation
- Part of Durham University, with close links to the School of Education
- Established 1985
- Monitoring systems from Nursery / Reception through to Post-16
- Research and evaluation projects, UK & international
- Used by over a third of secondary & post-16 establishments
- Informed by evidence from research

CEM provides:
- Information systems from Nursery to Post-16
- Student guidance & chances
- Performance monitoring
- Evaluation & research projects
- Entrance testing
- Support: phone, email, documentation
- Training: INSET, roadshows, conferences (including exhibitions)

The Systems

The Systems
- Alis (Years 12–13; A / AS / BTEC / IB etc.): computer adaptive baseline test
- Yellis (Years 10–11; GCSE): computer adaptive baseline test
- INSIGHT (Year 9; KS3): combines curriculum tests with developed ability
- MidYIS (Years 7–8): computer adaptive baseline test

Early Testing & Predictions: Typical Timeline
- Testing is available from June at the end of the previous academic year (e.g. Y6 / Y9 / Y11)
- The baseline score is measured as early as possible and the information sent to CEM
- The institution then receives feedback from CEM: baseline test results and predictions
- The public examination results are sent to CEM when received by the institution
- The value-added data is available from September via the institution's website at CEM

Basic Principles
- Measure student ability: intake profiles; IPR (strengths and weaknesses)
- Project future performance from national trends: predictions
- Record the final grade
- Compare actual performance with projected performance: value-added reports
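The "compare actual with projected performance" step can be sketched in code. The sketch below is an illustration only, not CEM's actual statistical model: it fits a simple linear trend of outcome points against baseline score (standing in for the national trend) and reports each student's residual as value-added. All scores and points here are invented.

```python
# Sketch of a value-added calculation: predict an outcome from a baseline
# score using a linear trend fitted to historical data, then report the
# residual (actual minus predicted). Illustrative only -- CEM's real
# models are more sophisticated than a single least-squares line.

def fit_trend(baselines, outcomes):
    """Least-squares slope and intercept for outcome ~ baseline."""
    n = len(baselines)
    mean_b = sum(baselines) / n
    mean_o = sum(outcomes) / n
    cov = sum((b - mean_b) * (o - mean_o) for b, o in zip(baselines, outcomes))
    var = sum((b - mean_b) ** 2 for b in baselines)
    slope = cov / var
    return slope, mean_o - slope * mean_b

def value_added(slope, intercept, baseline, actual):
    """Actual points minus the points predicted from the baseline."""
    return actual - (slope * baseline + intercept)

# Hypothetical historical data: baseline scores vs exam points
hist_baseline = [85, 95, 100, 105, 115, 120]
hist_points   = [30, 38, 42, 46, 54, 58]
slope, intercept = fit_trend(hist_baseline, hist_points)

# A student with baseline 110 who achieved 55 points
va = value_added(slope, intercept, 110, 55)
print(f"predicted {slope * 110 + intercept:.1f}, value-added {va:+.1f}")
```

A positive residual means the student did better than the trend predicted from their baseline; aggregated over a class or department, such residuals are what a value-added report summarises.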

Starting Up

Pre-16
- Complete and return the registration form: paper for new schools; online through the secure website for renewals
- Organise baseline testing: available from 1st June (early testing) through to 30th April
  - Upload the pupil list via the secure website
  - Set up web links to the test (or download the LAN version)
  - Test students (can be done over an extended period)
- Download feedback: available within 2 hours of a student completing the test; feedback updated as more students complete the test

Secure Website - Secondary : Pre-16

Secondary : Pre-16 – Adaptive Test Instructions Download Instructions Upload Pupil Details Link to tests

Secondary : Pre-16 – INSIGHT Test Instructions Download Instructions Upload Pupil Details Link to tests

Post-16
- Complete and return the registration form: paper for new schools; online through the secure website for renewals
- Organise baseline testing: available from 1st June (early testing) through to 30th April
  - Student list NOT used – no need to upload one
  - Set up web links to the test (or download the LAN version)
  - Test students (can be done over an extended period)
- Upload the 'Registration Spreadsheet': names, gender, date of birth, average GCSE score, post-16 course choices etc.; upload via the secure website once students are confirmed on course (mid September ?)
- Download feedback: early predictions available from the adaptive test within 2 hours; formal feedback including IPRs, GCSE predictions and chances graphs available once the Registration Spreadsheet is processed and 'Testing Complete' is clicked

Secure Website - Post-16 Quick links to reports Date last generated

Post-16 : Adaptive Test Instructions Use this web link

Baseline Assessments of Developed Ability

Baseline Assessment – Year 7 (8), Year 9, Year 10 (11), Year 12 (13+)
Computer Adaptive Baseline Test: Vocab, Maths, Non-Verbal

What is an Adaptive Test?
- All questions are allocated to groups of different difficulty
- All students in a year group get a starting question of common difficulty
- If the answer is correct, the next question is harder; if wrong, the next question is easier
- The test looks at all the answers in a section and homes in on questions of a suitable difficulty for the student
- Try it yourself at www.intuproject.org/demos
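The stepping logic above can be sketched as a small simulation. This is a simplified illustration with made-up level counts and a made-up student model; the real CEM test's item selection is more refined than a one-step-up/one-step-down rule.

```python
# Minimal sketch of adaptive item selection: items sit in difficulty
# levels, every student starts at a common level, and the level steps up
# after a correct answer and down after a wrong one. Illustrative only.

def run_adaptive_section(answer_item, num_levels=5, num_items=8, start=2):
    """Return the sequence of difficulty levels presented to one student.

    answer_item(level) -> bool : whether the student answers an item
    at that difficulty level correctly.
    """
    level = start
    presented = []
    for _ in range(num_items):
        presented.append(level)
        if answer_item(level):
            level = min(num_levels - 1, level + 1)  # harder next question
        else:
            level = max(0, level - 1)               # easier next question
    return presented

# A hypothetical student who can answer items up to difficulty level 3
levels = run_adaptive_section(lambda lvl: lvl <= 3)
print(levels)  # oscillates around the student's ability level
```

Note how the sequence quickly settles into oscillating around the hardest level the student can manage – this is the "homing in" behaviour described above, and it is why an adaptive test can measure a wide ability range with relatively few questions.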

Standardisation
Test scores are standardised: mean = 100, SD = 15.

Standardised Score   National Percentage   Comment
>130                 Top 2.5%              Traditional classification of 'mentally gifted'
>120                 Top 10%
>100                 Top 50%
<80                  Bottom 10%
<70                  Bottom 2.5%           Potential special educational needs ??
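Assuming scores are normally distributed, the percentages in the table can be checked with Python's standard library. This is a sketch: `standardise` is a hypothetical helper showing the mean-100/SD-15 rescaling, not CEM's norming procedure, and the exact normal-curve tail areas (e.g. 2.3% above 130, about 9% above 120) are slightly finer-grained than the rounded figures on the slide.

```python
# Standardised scores (mean 100, SD 15) and the normal-curve tail areas
# behind the "national percentage" column. Standard library only.
from statistics import NormalDist, mean, stdev

def standardise(raw_scores):
    """Rescale raw scores to mean 100, SD 15 (illustrative helper)."""
    m, s = mean(raw_scores), stdev(raw_scores)
    return [100 + 15 * (x - m) / s for x in raw_scores]

norm = NormalDist(mu=100, sigma=15)
top_2_5 = (1 - norm.cdf(130)) * 100   # % of students scoring above 130
top_10  = (1 - norm.cdf(120)) * 100   # % of students scoring above 120
print(f"above 130: {top_2_5:.1f}%, above 120: {top_10:.1f}%")
```

Because the normal distribution is symmetric, the same figures apply to the bottom of the table: roughly 2.3% of students fall below 70 and about 9% below 80.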

The Computer Adaptive Baseline Test
Adaptive components common to MidYIS, Yellis and Alis: Vocabulary, Maths. Non-adaptive sections: Non-verbal (MidYIS, Yellis, Alis); Skills (MidYIS only).
The assessments are 'curriculum free': students are not taught for them beforehand, so they give a measure of developed ability. There is no preparation for the test – this is important to ensure that every pupil starts from the same fair point. There is no lesson of preparation: each pupil hears the same examples and instructions and receives the same amount of help. This standardised administration gives a measure of typical performance for each pupil, and an indication of how much work will be needed to take each pupil through to the next stage of education – through KS3 and on to GCSE.
The vocabulary and maths sections contribute most to the predictions for all subjects. Maths is the most difficult section to keep curriculum free. The non-verbal section is suitable for pupils for whom English is a second or additional language – it allows ability to be displayed without the confounds of language.
Developed ability is a set of acquired skills. You are born with a particular set of genes, but your environment is enormously important: you cannot read at birth – you have to learn how. You are born with non-verbal ability, but you cannot solve non-verbal puzzles until you learn how to answer them.

Difficult Vocabulary
Choose the word or phrase with the most similar meaning from those shown on the right.
Vocabulary is closely related to reading and English – a proxy measure: pupils need to be able to communicate ideas across all curriculum areas. Vocabulary is caught, not taught – absorbed from the world around us. On its own it is an excellent predictor of later academic achievement, and it contributes well to the predictions for English, History and Foreign Languages. How is vocabulary relevant to my subject?

Difficult Maths
x/4 + 2
A hard question – could you do this? It seems tricky, especially for 11-year-olds, yet 1 in 15 pupils get it right. Good maths skills help in any area of the curriculum that requires numeracy: thinking logically, performing calculations, reading data. Maths is a good predictor of Maths, Statistics, ICT, D&T and Economics.

Non-Verbal: Cross-Sections
A solid has been cut with a plane to form a cross-section.
The non-verbal section measures 3D visualisation, spatial awareness and pattern recognition, and consists of three parts: cross-sections, block counting and pictures. A low score here suggests a student will struggle with any element of the curriculum that requires thinking in three dimensions – for example Geography, Art, Drama, Maths, Science and D&T.

Non-Verbal: Blocks
Piles of blocks – count the number of small and large blocks.

Non-Verbal: Pictures
Addition, subtraction and sequences of pictures. Anecdotally, teachers on INSET courses have often reported that low non-verbal scores correspond to observed problems in D&T.

Skills: Proof Reading
The Skills section comprises Proof Reading and Perceptual Speed & Accuracy – skills for the modern world. Proof reading relates to essay writing.

Skills: Perceptual Speed and Accuracy
PSA measures skimming, scanning and decoding skills: can pupils read and understand the question? Low skills scores relate to underperformance – not checking work, not being able to work quickly (related to speed of work in class), not spotting errors. In one school, pupils with a D band in both Vocabulary and Skills often turned out to be dyslexic – either already diagnosed or diagnosed after further testing. A high-ability pupil who is a slow worker may have their ability doubted because they do not get work finished. It is also interesting to look at pupils whose overall Skills score is low with a particularly low PSA element: these pupils tend not to take care in their work – they rush, do not check it and do not spot mistakes. You may blame their incorrect work on rushing, but the low skills score means they would probably have got it wrong anyway.

Alis – Patterns

Alis – Logical Reasoning

INSIGHT
Curriculum-based assessments covering:
- Maths: Number & Algebra; Handling Data; Space, Shape & Measurement
- Science: Biology, Chemistry, Physics; Attitudes to Science
- Reading: Speed Reading; Text Comprehension; Passage Comprehension
Additionally for INSIGHT:
- Developed ability: Vocabulary, Non-verbal, Skills
- Attitudinal measures

INSIGHT - Maths

INSIGHT - Maths

INSIGHT - Science Plus Attitude to Science: Perception of Science, Relevance to Self, Environmental Actions, Enjoyment of Science

INSIGHT – Reading (Text Comprehension)

INSIGHT – Reading (Passage Comprehension)

INSIGHT – Reading (Speed Reading)

INSIGHT – Developed Ability

Measuring Ability: Key Stage Data or Baseline Test?

Key Stage Baselines (KS2; GCSE)
Strengths:
- Related to the curriculum
- Automatically available
- In depth
- Linked to the student's learning experience
Weaknesses:
- Dependent on teaching effectiveness (prior value-added) as well as student ability
- Open to manipulation
- Not all students have KS data
- Reliability of the KS measure
- Consistency of the KS score (i.e. GCSE & BTEC First etc.)
The alternative is an independent baseline test – the CEM adaptive test.

Thank You
Robert Clark – robert.clark@cem.dur.ac.uk