Interpreting Feedback from Baseline Tests – Whole School and Individual Student Data. CEM Conference, Exeter. Geoff Davies. Day 1, Session 2. 27th February 2013.

Using Baseline Data for Teaching and Learning
- Informing professional judgement
- Raising questions
- Supporting teachers
- Starting conversations
- Starting a diagnostic process in some cases
How able is this year group? What are the strengths and weaknesses of this cohort? How did individuals perform?

Bands, percentiles, standardised scores…
[Figure: the standardised score scale divided into national quartile bands D, C, B and A, with cut-points at the 25th, 50th and 75th percentiles]

National Quartile Ability Bands

Band Profile Graph: all MidYIS cohort
Checking this graph each year will give you an immediate overview of your intake. A school with a 'completely average' intake would have 25% of pupils within each band.
[Chart: percentage of the cohort in each band, Band D through Band A]

Independent schools only

A grammar school: present Year 11 (5th year) Yellis bands

Alis: Year 12 students, two baseline profiles for the same school
Nationally, 25% fall in each band. Comments and potential implications?

[Chart: band profile, Band A to Band D]
Comments?

What do the sections of the MidYIS test measure?
Vocabulary Score: The Vocabulary component of the test is generally an important element for most subjects. For English, History and some Foreign Languages it is the best predictor. However, the Vocabulary score is perhaps the most culturally linked of all the scores. Those who have not been exposed to vocabulary-rich talk or a wide variety of reading material, or whose first language is not English, are unlikely to have developed as high a vocabulary score as they would have in a different environment.
Maths Score: The Maths score correlates well with most subjects but is particularly important when predicting Maths, Statistics, ICT, Design Technology and Economics. The Maths section has been designed with the emphasis on speed and fluency rather than knowledge of Maths. Like the Vocabulary score, the Maths score is a good predictor of later academic performance.

Non-Verbal Score: The Non-Verbal score is composed of three sub-tests: Cross-Sections, Block Counting and Pictures. The Non-Verbal score is important when predicting Maths, Science, Design Technology, Geography, Art and Drama. It provides a measure of the pupil's ability in 3-D visualisation, spatial aptitude, pattern recognition and logical thinking. It can give an insight into the developed ability of pupils for whom English is a second language.

Skills Score: In the Proof Reading section pupils are asked to spot mistakes in the spelling, punctuation and grammar of a passage of text, e.g. misspellings of words like 'there' and 'their'. The PSA (Perceptual Speed and Accuracy) section asks pupils to look for matches between a sequence of symbols on the left and a number of possible choices on the right. Given enough time most pupils would probably get the answers correct, but we are measuring how quickly pupils can find a correct match. An interesting result from our work with the Deaf and Hearing Impaired community shows that, on average, Hearing Impaired pupils score well above the national average on the PSA section of the test. The PSA section allows speed to be demonstrated free from the demands of memory. The Proof Reading and PSA tests are tests for the modern world, designed to measure fluency and speed. They rely on a pupil's scanning and skimming skills, skills that are desirable in examination situations.

Individual Pupil Records:
- Show pupils' strengths and weaknesses on the different sections of the baseline test
- Bands A, B, C & D (quartiles)
- Scores are standardised to have a mean of 100 and a standard deviation of 15
- Tables and graphs which include error bars
- Stanines and percentiles
- Confidence limits
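These reporting scales all derive from the same normal model. As a hedged illustration (a sketch, not CEM's own code or cut-points), this Python snippet converts a standardised score (mean 100, SD 15) into a national percentile, quartile band and stanine:

```python
from statistics import NormalDist

# National distribution of standardised scores: mean 100, standard deviation 15.
NATIONAL = NormalDist(mu=100, sigma=15)

def percentile(score: float) -> float:
    """Percentage of the national cohort scoring below this score."""
    return 100 * NATIONAL.cdf(score)

def band(score: float) -> str:
    """Quartile band: A = top 25%, then B, C, and D = bottom 25%."""
    p = percentile(score)
    return "A" if p >= 75 else "B" if p >= 50 else "C" if p >= 25 else "D"

def stanine(score: float) -> int:
    """Stanines cut the distribution at cumulative 4/11/23/40/60/77/89/96%."""
    cuts = (4, 11, 23, 40, 60, 77, 89, 96)
    return 1 + sum(percentile(score) >= c for c in cuts)

print(round(percentile(112), 1), band(112), stanine(112))  # 78.8 A 7
```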

IPRs (Individual Pupil Record Sheets)
[Chart: section scores plotted with error bars marking the top and bottom of each confidence limit]
An exercise for you on this later!

On another day, with 95% certainty, the score of 100.6 (± 1.1 × 2):
- would not be higher than 100.6 + 2.2 = 102.8
- would not be lower than 100.6 − 2.2 = 98.4
How might this information be useful?
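A minimal sketch of the same arithmetic, assuming the slide's score of 100.6 and standard error of 1.1 (the ×2 multiplier approximates the 95% level):

```python
def confidence_limits(score: float, standard_error: float) -> tuple[float, float]:
    """95% confidence limits: the score plus or minus roughly two standard errors."""
    margin = 2 * standard_error
    return score - margin, score + margin

low, high = confidence_limits(100.6, 1.1)
print(f"{low:.1f} to {high:.1f}")  # 98.4 to 102.8
```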

Some pupil data: MidYIS (2009)

          Vocabulary    Maths         Non-Verbal    Skills        Overall
          Score  Band   Score  Band   Score  Band   Score  Band   Score  Band
Pupil 01   122    A      125    A      116    A      107    B      126    A
Pupil 02   105    B      110    A      127    A       95    C      108    B
Pupil 03   105    B       93    C      110    B       89    D       99    C
Pupil 04    91    C      116    A      130    A      115    A      103    B
Pupil 05   111    A      144    A      122    A      103    B      129    A
Pupil 06   107    B      112    A       85    D       97    C      109    B
Pupil 07   115    A      106    B      100    C       86    D      112    A
Pupil 08   141    A      137    A      132    A      135    A      143    A
Pupil 09   104    B       92    C      105    B      109    B       98    C
Pupil 10    99    C      119    A      114    A       99    C      109    B
Pupil 11   108    B      126    A      130    A      140    A      118    A
Pupil 12   106    B      123    A      120    A      105    B      116    A
Pupil 13   103    B       96    C      103    B      104    B       99    C
Pupil 14   108    B      110    B      112    A      108    B      110    A
Pupil 15    95    C      104    B      103    B      122    A       99    C

A relatively lower vocabulary score might indicate a difficulty which:
- could contribute to under-performance in most, if not all, subjects
- might lead to 'stressful situations'
- may lead to further investigation and subsequent pupil support

A relatively low maths score might indicate potential weaknesses in subject areas which require:
- numerical skills
- logical thinking skills such as sequencing
Conversely: a high maths score but a low vocab/reading score…

A relatively high non-verbal score might indicate potential strengths in subject areas which require:
- 3D, and 3D into 2D, visualisation
- spatial awareness
- understanding 2D images representing 3D objects
- extracting information from visual images
Science, D&T, Art, Geography… and vice versa.

A relatively low skills score might indicate potential weaknesses such as:
- speed of processing/working
- potential underperformance in test/examination conditions
- poor written work (spelling, punctuation and grammar, etc.)
…and vice versa.

Skills: Proof Reading

Skills: Perceptual Speed and Accuracy

SEN use of MidYIS test results
Analysis of individual baseline test skill profiles can indicate potential areas of learning difficulty. For example: if the two lowest scores are the Vocabulary and Skills sections, and the differences are (or are close to being) statistically significant (see the IPR), then the student might be dyslexic. Follow up with appropriate diagnostic tests…
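Purely as an illustrative screening sketch (the 2-SEM gap and the SEM value are assumptions, not a CEM diagnostic rule), a profile like the one described could be flagged for follow-up like this:

```python
# Illustrative only: flags the slide's profile for follow-up; it does not diagnose dyslexia.
SEM = 5.0  # assumed standard error of measurement for a section score

def flag_for_followup(vocab: float, maths: float, nonverbal: float,
                      skills: float, sem: float = SEM) -> bool:
    """True if Vocabulary and Skills are the two lowest section scores and sit
    a (near-)significant distance (~2 SEM) below the stronger sections."""
    weakest_high = max(vocab, skills)      # better of the two weak sections
    strongest_low = min(maths, nonverbal)  # worse of the two strong sections
    if weakest_high >= strongest_low:
        return False  # Vocabulary and Skills are not the two lowest scores
    return (strongest_low - weakest_high) >= 2 * sem

print(flag_for_followup(vocab=85, maths=112, nonverbal=108, skills=88))  # True
print(flag_for_followup(vocab=98, maths=104, nonverbal=102, skills=97))  # False
```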

Using national baseline test scores to identify gifted pupils:
- Scores over 130 – top 2% nationally
- Scores over 126 – top 5% nationally
- Scores over 120 – top 10% nationally
- Scores over 110 – top 25% nationally
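These cut-offs can be sanity-checked against the normal model behind the standardisation (mean 100, SD 15); the computed figures broadly reproduce the slide's rounded ones:

```python
from statistics import NormalDist

national = NormalDist(mu=100, sigma=15)
for cutoff in (130, 126, 120, 110):
    # Proportion of the national cohort scoring above the cut-off.
    top = 100 * (1 - national.cdf(cutoff))
    print(f"scores over {cutoff}: top {top:.1f}% nationally")
# scores over 130: top 2.3% nationally
# scores over 126: top 4.2% nationally
# scores over 120: top 9.1% nationally
# scores over 110: top 25.2% nationally
```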

Using MidYIS IPRs to Inform Teaching and Learning
The IPR on its own simply tells us about the relative performances of the pupil on the separate sections of the test: where the pupil is strong, where performance has been significantly above or below national averages, or where the pupil has significantly outperformed in one section or another. It is when the IPR is placed in the hands of a teacher who knows that pupil that it becomes a powerful tool. It is what teachers know about individual pupils (what has happened in the past, how they respond to given situations and how they work in the teacher's specific subject) that informs the interpretation of the IPR. If the IPR data from MidYIS can be shared alongside the teacher's personal and subject-specific knowledge of the pupil, the result is a much more powerful instrument for supporting pupils' learning needs.

Scenarios and anecdotal findings (Handout, page 3)
Some pupils will display an IPR pattern with significant differences between one or two components of the MidYIS test. These can be the most interesting, and possibly the most challenging, pupils for mainstream classroom teachers.

A selection of MidYIS scores for 'Waterloo Road'!
These are real, anonymised scores from a number of schools around the UK.

Surname  Sex   Vocabulary    Maths         Non-Verbal    Skills        MidYIS Score
A        F      81 (D)       110 (B)       108 (B)       112 (A)        94 (C)
B        F     128 (A)       107 (B)       105 (B)        94 (C)       120 (A)
C        M     106 (B)       121 (A)       103 (B)        90 (D)       114 (A)
D        F     107 (B)        84 (D)        96 (C)       107 (B)        96 (C)
E        M       – (C)        90 (D)       130 (A)        91 (C)        92 (C)
F        F      86 (D)         – (D)       120 (A)        74 (D)        84 (D)
G        F     100 (B)       115 (A)        80 (D)       103 (B)       108 (B)
H        F     121 (A)        96 (C)       114 (A)        86 (D)       111 (A)
I        M      92 (C)       100 (C)        96 (C)       123 (A)        95 (C)
J        M     100 (C)       105 (B)       100 (C)        99 (C)       102 (B)
K        M     128 (A)       132 (A)       114 (A)       131 (A)       133 (A)
L        M      76 (D)        70 (D)        74 (D)        73 (D)        71 (D)
(– marks a score missing from the source.)

What do I need to know/do to teach this (difficult) class of twelve pupils? Why would this be a very challenging class to teach?

Scenarios and notes:
- Vocabulary scores significantly lower than other component scores: Second language? Deprived area? Difficulty accessing the curriculum? Targeted help does work. Seen in nearly all schools; worth further diagnosis. Could potentially affect performance in all subjects.
- Vocabulary scores significantly higher than other component scores: Good communicators; they get on. Put maths problems into words?
- Mathematics significantly higher than other scores: From the Far East? Done entrance tests? Primary experience?
- Mathematics significantly lower than other scores: Primary experience. Use words and diagrams? Sometimes difficult to change attitudes. Difficulties with logical thinking and skills such as sequencing.
- Low Mathematics scores with high Non-Verbal scores: Use diagrams. Confidence building often needed.
- Non-Verbal scores different from the others (high Non-Verbal): Frustration? Behaviour problems? May not do as well as good communicators or numerate pupils. Good at 3D and 3D-to-2D visualisation and spatial awareness; good at extracting information from visual images.
- Non-Verbal scores different from the others (low Non-Verbal): Peak at GCSE? A level?
- Pupils with low Skills scores: Exams a difficulty after good coursework? Suggests slow speed of processing.
- High Skills scores: Do well in exams compared with classwork?
- The average pupil: They do exist!
- High scores throughout: Above a score of 130 puts the pupil in the top 2% nationally.
- Low scores throughout: Below a score of 70 puts the pupil in the bottom 2% nationally.

Sharing data with colleagues: e.g. baseline test data

INSIGHT Pupil IPR

Maths

IPRs in KS3 Levels
[Chart: pupil levels plotted against the national average]

The code:
- j: reduces size
- k: infills
- m: inverts

Sharing the baseline information within school, using your MIS systems
Baseline test data, made available to all teachers and support staff:
- can be useful to indicate reasons for student learning difficulties, and may go some way to explaining lack of progress and flagging causes of underachievement and even behaviour problems
- can help to support professional judgement and give a better understanding of the progress students make at school and of their potential later performance
- can be referred to for pupil reviews, writing reports, meeting parents, monitoring progress and interim assessments

HANDOUT 1 pages 1 and 2

Percentage of pupils falling into each MidYIS band over time (2013)
[Tables: Overall MidYIS and Vocabulary scores; one row per entry cohort, from last year's Year 13 up to the current year groups; columns Band A to Band D]

[Tables: Maths and Non-Verbal scores; one row per entry cohort, columns Band A to Band D]
Scores standardised on a nationally representative sample of schools.

[Tables: Skills, Proof Reading, and Perceptual Speed and Accuracy scores; one row per entry cohort, columns Band A to Band D]
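Tables like these can be rebuilt from any pupil-level export. A short pandas sketch (the column names are assumptions for illustration, not an actual CEM/MIS export format) computing the percentage of pupils in each band per entry year:

```python
import pandas as pd

# Assumed pupil-level export: one row per pupil, with entry year and a band per component.
df = pd.DataFrame({
    "entry_year": [2011, 2011, 2011, 2011, 2012, 2012, 2012, 2012],
    "vocab_band": ["A", "B", "D", "C", "B", "C", "A", "A"],
})

# Percentage of pupils falling into each band, by entry year;
# compare each row against the national 25% per band.
profile = (
    df.groupby("entry_year")["vocab_band"]
      .value_counts(normalize=True)
      .mul(100).round(1)
      .unstack(fill_value=0)
      .reindex(columns=list("ABCD"), fill_value=0)
)
print(profile)
```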

1) Some teachers feel that the intake into the school has changed in the past few years. If so, in which areas is the intake stronger or weaker?
2) Look at the breakdown of the individual test components for last year's Year 11 (2008 entry). What do you notice?
   a) Achievements in a core subject at GCSE in 2012 were below what had been achieved in previous years. Can you suggest which subject it was?
   b) Why would you expect better results at GCSE in this subject this year?
   c) This subject faces challenges in at least two further entry cohorts. Which are they? Suggest a strategy that they might follow.
3) What factors could explain the discrepancy between scores in the individual components, and also in those scores from year to year? Is there anything that can be done about it?
4) Why do you think that the discrepancies between the score components can be vastly different between schools? Which scores are particularly high compared to the whole national cohort for this school?

HANDOUT 1 Page 3 (slides 37/38)

Handout 1, pages 4 and 5

Exercise: What are the strengths and weaknesses of this A/AS level student?
Using the IPR (Individual Pupil Record), familiarise yourself with the terms standard score, band, stanine, percentile and confidence band.
a) Which AS/A level subjects might be avoided?
b) This student chose English, Film Studies, Music Technology and Psychology. Is this a good choice? Do you foresee any problems?

You are given data relating to an institution where students completed the Alis computer adaptive test. They were chosen because they show significant differences between the various parts of the test. Remember scores are standardised around 100.

Name  Overall    Vocab      Maths      Non-Verbal  Avg GCSE  A level subjects chosen
A      78 (D)     49 (D)     99 (B)     92 (C)     n/a       Biology, Maths, Business, Art
B      94 (C)    115 (A)     85 (D)    104 (B)     n/a       Biology, Business, Psychology, English
C      88 (D)     97 (C)     85 (D)    104 (B)     5.6       History, Psychology, English, Media
D     101 (B)    107 (B)     97 (C)     80 (D)     5.9       Business, History, English, Drama
E     104 (B)     87 (D)    112 (A)    116 (A)     n/a       Biology, Physics, Maths, Business
F      81 (D)     47 (D)    103 (B)    111 (B)     n/a       Maths, Further Maths, Business
G      93 (C)    113 (A)     84 (D)    113 (A)     n/a       Biology, Business, French, Geography
H      97 (C)    111 (A)     89 (D)     99 (C)     7         Art, English, Psychology, Religious Studies
I      87 (D)     68 (D)    100 (B)    109 (B)     5.4       Maths, Geography, French, Music
J     105 (B)     67 (D)    124 (A)     85 (D)     6.1       Maths, Further Maths, Psychology, Economics
K      96 (C)     71 (D)    110 (A)     97 (C)     n/a       Biology, Maths, Art, English
L      92 (C)     60 (D)    111 (A)     97 (C)     n/a       Maths, History, Religious Studies, English

a) Are there any apparent mismatches between the subjects being followed and this data?
b) What support can be given to those students who have particular weaknesses in Vocabulary or Mathematics?
c) How might predictions made for these students be tempered in the light of the inconsistencies in the test components and the missing average GCSE points scores?