USING CEM SECONDARY DATA PRACTICAL APPLICATIONS IN SCHOOL APRIL 2011 Bristol Conference Geoff Davies

CEM Secondary data includes:
- Baseline test data (developed ability)
- 'Predictive' data, including chances graphs
- Value-added data
- Attitudinal data
- Curriculum assessments (Insight/Sosca)
- PARIS software programmes

Using this data should help us do our best to ensure every pupil at least achieves, if not exceeds, their potential. It may challenge:
- the culture of 'my' school
- accountability policy
- expectations
- staff training in the use of data, and the ability to cope with data (data overload)
- integrating the data into school procedures: storage, retrieval, distribution and access
- roles and responsibilities

Carol Fitz-Gibbon, 2001, British Psychological Society:
"It gradually dawned on me that providing the data to schools was the most important outcome of the effort, far more important than writing research papers… The provision of data to practitioners meant that they participated in the research. Indeed, they were the only ones who knew the surrounding circumstances for their classrooms, their department, each pupil, each family, etc. They were the major players: the ones who could interpret and learn from the detailed data."

"…there is a need for teacher-researcher posts on the senior management team with a brief to develop research that is useful. Given time in the timetable, thousands of teachers could become active researchers… Educational research should be a practical reality connecting scientific enlightenment, not a mathematical weight-lifting exercise. The sense and usefulness of what we are doing induces creative thoughtfulness."

creative thoughtfulness
QUESTION: IS THIS STIFLED IN THE PRESENT EDUCATIONAL CLIMATE?

PRIORITISING

"I keep six honest serving-men
(They taught me all I knew);
Their names are What and Why and When
And How and Where and Who."
Rudyard Kipling, The Elephant's Child

Can school leaders find the time?
Important but not urgent: is it more like 65%-80% or 15%?
Urgent but not important: is it more like 15% or 60%?

QUESTIONS INSPIRED BY CEM CENTRE DATA
- How do we improve our professional judgement?
- Do MidYIS scores tell us more than we think?
- Which ability range of pupils gets the best deal from our school?
- Which ability range of pupils is served best by a particular subject department?
- Can we do action research in our school using historical data?
- Can standardised residuals tell us more about a department than just the + or -?
- Does learning support work?
- What do attitudinal surveys over time give us?
- Can we compare standardised scores with comments made by teachers on reports?
- Drilling down from SPC charts: which pupils are we succeeding with? Which pupils are we not succeeding with? What can be done about it? Is there a pattern?
- What are the gender issues? What are the prior-learning issues?
- What can we learn from plotting standardised scores over time?
- Can SOSCA help with boy/girl issues?
- Is Key Stage 3 teacher assessment sufficiently rigorous?
- How can ALIS be used to inform teaching and learning?

WHAT information do I want? WHY do I want it? WHEN do I need it? HOW do I collect it? WHERE can I find it? From WHOM do I get it?
The CEM Centre ticks many of these boxes for schools.

A small selection of questions we will look at:
1. Do MidYIS scores tell us more than we think?
2. Which ability range of students does best or worst in our school (using Yellis data)?
3. Can we review post-16 pedagogy (using ALIS data)?
4. What did SOSCA teach us about Key Stage 3 assessment?

Do MidYIS scores tell us more than we think?
Using MidYIS baseline data
Using MidYIS IPRs Booklet.pdf

What do the sections of the test measure?

Vocabulary Score
The Vocabulary component of the test is an important element for most subjects; for English, History and some Foreign Languages it is the best predictor. However, the Vocabulary score is perhaps the most culturally linked of all the scores. Those who have not been exposed to vocabulary-rich talk or a wide variety of reading material, or whose first language is not English, are unlikely to have developed as high a vocabulary score as they would have in a different environment.

Maths Score
The Maths score is well correlated with most subjects but is particularly important when predicting Maths, Statistics, ICT, Design Technology and Economics. The Maths section has been designed with the emphasis on speed and fluency rather than knowledge of Maths. Like the Vocabulary score, the Maths score is a good predictor of later academic performance.

Non-Verbal Score
The Non-Verbal score is composed of three sub-tests: Cross-Sections, Block Counting and Pictures. It is important when predicting Maths, Science, Design Technology, Geography, Art and Drama. It provides a measure of the pupil's ability in 3-D visualisation, spatial aptitude, pattern recognition and logical thinking, and it can give an insight into the developed ability of pupils for whom English is a second language.

Skills Score
In the Proof Reading section pupils are asked to spot mistakes in the spelling, punctuation and grammar of a passage of text, e.g. misspellings of words like 'there' and 'their'. The PSA (Perceptual Speed and Accuracy) section asks pupils to look for matches between a sequence of symbols on the left and a number of possible choices on the right. Given enough time most pupils would probably get the answers correct, so what is measured is how quickly pupils can find a correct match; the PSA section allows speed to be demonstrated free from the demands of memory. The Proof Reading and PSA tests are tests for the modern world, designed to measure fluency and speed. They rely on a pupil's scanning and skimming skills, which are desirable in examination situations.

Scenarios and anecdotal findings
Some pupils will display an IPR pattern with significant differences between one or two components of the MidYIS test. These can be the most interesting, and possibly the most challenging, pupils for mainstream classroom teachers. It is when the IPR is placed in the hands of a teacher who knows the pupil that it becomes a powerful tool.

Confidence Limits
The pupil scored 114 on the Vocabulary section. The error bars range from about 105 to 123, roughly 9 points either side of the pupil's score. If this pupil were to take the test afresh 100 times, we would expect the score to fall within the range denoted by the error bars on 95 of those occasions.
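The arithmetic behind the error bars can be sketched in a few lines of Python. This is illustrative only: the standard error of measurement of about 4.6 points is an assumption chosen so that 1.96 x SEM reproduces the roughly 9-point bars described above; it is not a published CEM figure.

```python
# A minimal sketch of the 95% confidence interval behind the IPR error bars.
# ASSUMPTION: SEM of ~4.6 standardised points (chosen to match the ~9-point
# bars in the example above; not a published CEM figure).
SEM = 4.6     # assumed standard error of measurement, in standardised points
Z_95 = 1.96   # z-value enclosing 95% of a normal distribution

def confidence_interval(score):
    """Return the 95% confidence interval around a standardised score."""
    half_width = Z_95 * SEM
    return score - half_width, score + half_width

lo, hi = confidence_interval(114)        # the Vocabulary example above
print(f"95% CI: {lo:.0f} to {hi:.0f}")   # roughly 105 to 123
```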

Are the scores significant?

Relative to the average performance: performance in Vocabulary is significantly better than average, and Maths performance is significantly below average. The error bars for the Non-Verbal, Skills and Overall MidYIS scores do cross the line at 100, so the pupil cannot be considered to have performed significantly differently from the average pupil on those measures.

Comparing Maths and Vocabulary scores: the error bars for Vocabulary and Maths do not cross the line at 100 (average performance), and they lie on opposite sides of it, so the two components differ significantly.
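Building on the sketch above, the same interval can drive two informal significance checks: whether a component's error bars cross the line at 100, and, as a common rule of thumb, whether two components' bars overlap each other. Again this is a sketch of the idea under the same assumed SEM, not CEM's exact procedure; the Maths score of 85 below is hypothetical.

```python
# Continues the confidence-interval sketch (same assumed SEM of 4.6).
SEM, Z_95 = 4.6, 1.96

def confidence_interval(score):
    half = Z_95 * SEM
    return score - half, score + half

def differs_from_average(score):
    """True if the score's error bars do not cross the line at 100."""
    lo, hi = confidence_interval(score)
    return lo > 100 or hi < 100

def components_differ(a, b):
    """Rule of thumb: True if the two components' error bars do not overlap."""
    lo_a, hi_a = confidence_interval(a)
    lo_b, hi_b = confidence_interval(b)
    return hi_a < lo_b or hi_b < lo_a

print(differs_from_average(114))   # Vocabulary: True, significantly above average
print(components_differ(114, 85))  # Vocabulary vs a hypothetical Maths score: True
```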

A SELECTION OF MIDYIS SCORES FOR 'WATERLOO ROAD'!!

Surname  Sex  Vocabulary  Maths    Non-Verbal  Skills   MidYIS Score
A        F     81 (D)     110 (B)  108 (B)     112 (A)   94 (C)
B        F    128 (A)     107 (B)  105 (B)      94 (C)  120 (A)
C        M    106 (B)     121 (A)  103 (B)      90 (D)  114 (A)
D        F    107 (B)      84 (D)   96 (C)     107 (B)   96 (C)
E        M      - (C)      90 (D)  130 (A)      91 (C)   92 (C)
F        F     86 (D)       - (D)  120 (A)      74 (D)   84 (D)
G        F    100 (B)     115 (A)   80 (D)     103 (B)  108 (B)
H        F    121 (A)      96 (C)  114 (A)      86 (D)  111 (A)
I        M     92 (C)     100 (C)   96 (C)     123 (A)   95 (C)
J        M    100 (C)     105 (B)  100 (C)      99 (C)  102 (B)
K        M    128 (A)     132 (A)  114 (A)     131 (A)  133 (A)
L        M     76 (D)      70 (D)   74 (D)      73 (D)   71 (D)

(Standardised score, with band A-D in brackets.)

What do I need to know/do to teach this (difficult) class of twelve pupils? Why would this be a very challenging class to teach? These are real anonymised scores from a number of schools around the UK.

- Vocabulary scores significantly lower than other component scores: second language? Deprived areas? Difficulty accessing the curriculum? Targeted help does work. Seen in nearly all schools; worth further diagnosis.
- Vocabulary scores significantly higher than other component scores: good communicators; they get on. Put Maths problems into words?
- Mathematics significantly higher than other scores: from the Far East? Done entrance tests? Primary experience?
- Mathematics significantly lower than other scores: primary experience. Use words and diagrams? Sometimes difficult to change attitude.
- Low Mathematics scores with high Non-Verbal scores: use diagrams. Confidence building often needed.
- Pupils with Non-Verbal scores different from the others, high: frustration? Behaviour problems? Don't do as well as good communicators or numerate pupils?
- Pupils with Non-Verbal scores different from the others, low: peak at GCSE? At A level?
- Pupils with low Skills scores: exams a difficulty after good coursework?
- High Skills scores: do well in exams compared with classwork?
- The average pupil: they do exist!
- High scores throughout: a score above 130 puts the pupil in the top 2% nationally.
- Low scores throughout: a score below 70 puts the pupil in the bottom 2% nationally.

Sharing the MidYIS Information within School
Once you have received your MidYIS feedback you need to decide who will be privy to which information. Some schools keep the data within the senior management team; others share it with Heads of Department and/or Heads of Year; some share it with all staff. And what about pupils and their parents? Use your MIS to put the data where it matters.

MidYIS data can be useful:
- to indicate reasons for student learning difficulties; it may go some way to explaining lack of progress and flag up causes of underachievement and even behaviour problems
- for all teachers and support staff, helping to support professional judgement and giving a better understanding of the progress students make at school and their potential later performance
- to refer to for pupil reviews, writing reports, meeting parents, monitoring progress and interim assessments

2. Which ability range of students does best or worst in our school (using Yellis data)?
- Which ability range of pupils gets the best deal from our school?
- Which ability range of pupils is served best by a particular subject department?
- Can standardised residuals tell us more about a department than just the + or -?
Using standardised residuals in a different way.

CONTRAST THIS

WITH THIS

yellis exercise.doc
As on the last two slides, CEM provides value-added charts using average standardised residuals for departments. We are going to show how standardised residuals can be used in a different way in your school.
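Before the exercise, it may help to see the arithmetic of a standardised residual in miniature: the raw residual (achieved minus predicted outcome) is divided by the spread of the residuals, and the standardised residuals are then averaged by baseline ability band rather than by department. The pupil records, point scores and bands below are invented purely to show the mechanics; they are not CEM data.

```python
# Hedged sketch: standardised residuals grouped by ability band, not department.
# ASSUMPTION: all pupil records below are hypothetical illustration data.
from statistics import mean, stdev

pupils = [
    # (baseline band, predicted points, achieved points)
    ("A", 52.0, 55.0), ("A", 50.0, 49.0),
    ("B", 44.0, 47.0), ("B", 43.0, 41.0),
    ("C", 38.0, 36.0), ("C", 36.0, 37.5),
    ("D", 30.0, 26.0), ("D", 29.0, 27.0),
]

# Raw residual = achieved - predicted; standardise by the residuals' spread.
raw = [achieved - predicted for _, predicted, achieved in pupils]
spread = stdev(raw)
standardised = [r / spread for r in raw]

# Average standardised residual per baseline ability band.
for band in "ABCD":
    band_vals = [s for (b, _, _), s in zip(pupils, standardised) if b == band]
    print(band, round(mean(band_vals), 2))
# In this invented data, band D comes out negative: the lowest-ability
# pupils would appear to be getting the worst deal.
```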

3. Review of post-16 pedagogy (using ALIS data)

SUBJECT A 2008 Why the improvement?

PEDAGOGY….ALIS surveys

SUBJECT A 2005 SUBJECT A 2007

SUBJECT A 2008 DO NOT GET TOO EXCITED!

We have compared the teaching methods perceived by students, as analysed by ALIS, across two survey years. Some subject areas appear to have changed their methods radically; others have not. Though the samples are small, it is an interesting exercise to try to correlate this with the departments' statistical process control charts over the same period. One would like to say that changes in the variety of teaching methods result in improvement, but the evidence is a little tenuous so far.

4. What have we learned from SOSCA?

GRAPH 1 AVERAGE STANDARDISED RESIDUALS BOYS/GIRLS MIDYIS TO KEY STAGE THREE TEACHER ASSESSMENT

GRAPH 2 AVERAGE STANDARDISED RESIDUALS BOYS/GIRLS MIDYIS TO SOSCA

GRAPH 1 AVERAGE STANDARDISED RESIDUALS BOYS/GIRLS MIDYIS TO KEY STAGE THREE TEACHER ASSESSMENT GRAPH 2 AVERAGE STANDARDISED RESIDUALS BOYS/GIRLS MIDYIS TO SOSCA

Using MidYIS and SOSCA puts the school in a strong position to improve its professional judgement of teacher assessments at Key Stage 3. Statutory testing disappeared in Wales some five years ago. Comparing the value added from MidYIS to SOSCA with the value added from MidYIS to KS3 teacher assessment shows up some interesting data. Schools that depend on teacher-assessment data alone to measure value added from Key Stage 3 to Key Stage 4 need to be aware of the pitfalls; the use of SOSCA data in this exercise highlights them. "Subjective views of pupils, as well as pressure from parents, make the model unreliable," warns Professor Tymms.

The differences appear to relate to the types of assessment used in the various subject areas. English and Welsh use extended writing for teacher assessment, which is more open to subjective judgement. What we have learnt from this is that, despite a moderation process built on portfolios of work, teacher assessment is not sufficient in isolation. Computer-adaptive tests such as SOSCA, and the resulting value-added information from MidYIS, are more informative in a diagnostic sense than the levels produced by teachers for statutory assessment. SOSCA has also been used to investigate changes in reading from baseline testing: a high correlation was found between the London Reading score given to pupils on entry, the MidYIS score and the SOSCA reading test.

PITFALLS
1. Tracking developed-ability measures over time.
2. Looking at average standardised residuals for teaching sets.
3. The effect of one result in a small group of students.

REGRESSION TOWARDS THE MEAN
Pupils with high MidYIS scores tend to have high SOSCA scores, but not quite as high. Similarly, pupils with low MidYIS scores tend to have low SOSCA scores, but not quite as low. This phenomenon is seen in any matched dataset of correlated, normally distributed scores; the classic example is a comparison of fathers' and sons' heights. Regression lines reflect this phenomenon: if you look at the predictions used in the SOSCA value-added analysis, you can see that pupils with high MidYIS scores have predicted SOSCA scores lower than their MidYIS scores, whereas pupils with low MidYIS scores have predicted SOSCA scores higher than their MidYIS scores.
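The phenomenon is easy to reproduce in a short simulation with two correlated, normally distributed score sets standing in for MidYIS and SOSCA. The correlation of 0.7 and the mean/SD of 100/15 are assumed values for illustration, not CEM parameters.

```python
# Sketch: regression towards the mean in two correlated score sets.
# ASSUMPTIONS: correlation r = 0.7, scores ~ N(100, 15); illustration only.
import random

random.seed(1)
r = 0.7        # assumed MidYIS-SOSCA correlation
N = 10_000

midyis, sosca = [], []
for _ in range(N):
    ability = random.gauss(0, 1)
    # Both tests measure the same underlying ability plus independent noise,
    # giving a correlation of r between the two score sets.
    midyis.append(100 + 15 * (r**0.5 * ability + (1 - r)**0.5 * random.gauss(0, 1)))
    sosca.append(100 + 15 * (r**0.5 * ability + (1 - r)**0.5 * random.gauss(0, 1)))

# Mean SOSCA score of pupils with high vs low MidYIS scores:
high = [s for m, s in zip(midyis, sosca) if m > 120]
low = [s for m, s in zip(midyis, sosca) if m < 80]
print(sum(high) / len(high))  # well above 100, but pulled back towards it
print(sum(low) / len(low))    # well below 100, but pulled up towards it
```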

CLASS REVIEW: BEWARE PITFALLS OF INTERPRETATION

SUBJECT M

creative thoughtfulness
PLEA: DON'T LET THE SYSTEM DESTROY THIS
