Secondary Information Systems


Introduction to CEM Secondary Information Systems
Dr Robert Clark, ALIS Project Manager

The Projects

The Projects
- Alis (Years 12–13): A / AS / BTEC / IB etc.; baseline is GCSE or a computer adaptive baseline test (paper test also available)
- Yellis (Years 10–11): GCSE; computer adaptive baseline test (paper test also available)
- INSIGHT (Year 9): combines curriculum tests with developed ability
- MidYIS (Year 7, + additional testing in Year 8): computer adaptive baseline test (paper test also available)

Early Testing & Predictions: Typical Timeline
- Testing is available from June at the end of the previous academic year, i.e. Y6 / Y9 / Y11
- The baseline score is measured as early as possible and the information is sent to the CEM Centre
- The institution then receives feedback from the CEM Centre: baseline test results and predictions
- The public examination results are sent to the CEM Centre when received by the institution
- The value-added data is available from September via the institution's website at the CEM Centre

Predictions

How CEM ‘Predictions’ are made…
[Scatter plot: outcome in Subject X against baseline score, with a regression line (trend line, line of best fit).]
Outcome = gradient × baseline + intercept
Correlation coefficient ≈ 0.7

Some Subjects are More Equal than Others…
[Chart: two subjects' regression lines sit more than one grade apart (around the D/E boundary) at the same baseline score.]

Some Subjects are More Equal than Others…
[Chart: GCSE grade (F to A*) against MidYIS or Yellis test score, with one regression line per subject: Art & Design, Biology, Chemistry, Economics, English, French, Geography, German, History, ICT, Mathematics, Media Studies, Music, Physical Education, Physics, Religious Studies, Science (Double), Spanish. The lines span roughly one grade at a given test score.]

Baseline Measurement

Problems with Key Stage Baselines
- Not all students have KS baselines: no KS3, foreign students, vocational students, adult learners
- KS exams do not always represent 'start of course' ability: post-16 year(s) out or intermediate years; prior value-added
- Can you add value at every Key Stage? Underachievement leads to underexpectation; one teacher's output is another teacher's input
- Teaching to the test: does performance represent ability, effort or exam technique? Aptitude & fluency vs achievement & knowledge
- Transfer of data

Although Key Stage baselines can be a very good indicator of potential attainment, by themselves they are not sufficient. Key Stage baselines are confounded by the effects of prior treatment.

The Computer Adaptive Test
- Test performed online; results automatically transmitted to CEM
- Minimal installation / setup required, if any
- Adaptive: the difficulty of questions changes in relation to the ability of the student
- Efficient: no time wasted answering questions that are far too easy or too difficult; covers a wider range of ability
- Less stressful for students: a more enjoyable experience than a paper test
- Less demanding invigilation
- Cheaper!
In 2010/2011 over 200,000 students across Years 7–13 sat this test. Try it yourself at www.intuproject.org/demos

Baseline Feedback Reports, Graphs & Predictions

IPRs (Individual Pupil Record Sheets)
Look for sections that are inconsistent.
Also available based on MidYIS, Alis, SOSCA & INSIGHT scores.

Intake Profiles Also available based on MidYIS, Yellis, SOSCA and INSIGHT scores

Intake Profiles (Historical)

Predictions – MidYIS example Similar spreadsheets available from Yellis, SOSCA, INSIGHT

Predictions - Alis example

Chances Graphs
[Bar charts for English Language, bands A to D: the percentage of students of similar ability achieving each grade from U to A*.]
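A chances graph is essentially the grade distribution of last year's students in the same baseline band. A minimal sketch of that tabulation, using invented records:

```python
from collections import Counter

# Invented historical records: (baseline band, final GCSE grade).
# A real chances graph would be built from a large national sample.
history = [
    ("C", "B"), ("C", "C"), ("C", "C"), ("C", "D"), ("C", "C"),
    ("B", "A"), ("B", "B"), ("B", "B"), ("B", "C"),
]

def chances(band):
    """Percentage of same-band students achieving each grade."""
    grades = [g for b, g in history if b == band]
    counts = Counter(grades)
    return {g: round(100 * n / len(grades)) for g, n in counts.items()}

print(chances("C"))  # -> {'B': 20, 'C': 60, 'D': 20}
```

The output is one bar chart's worth of data: a student in band C is not promised a C, but shown the spread of grades that similar students actually achieved.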

Predictions vs Targets

What is a ‘Prediction’?
- NOT a forecast of the grade the student will get
- An indication of the grade (points score) achieved on average by students of similar ability in the previous year
Targets?
- Minimum targets: round the CEM prediction down?
- Realistic targets: use the nearest CEM grade?
- Challenging targets: 75th percentile? Prior value-added? An arbitrary grade fraction?

75th Percentile Predictions
- If all students attain 75th percentile predictions, school VA will be in the top 25%
- Provided by all projects in addition to the 50th percentile
- Alis: Excel spreadsheet, 'Predictions – Spreadsheet (75th Percentile)'
- Pre-16: Adjust button on the predictions spreadsheet
Prior Value-Added
- Only where prior VA is positive? 1 year or 3?
- Reasonable to use raw residual figures, as this is an approximate measure and raw residuals give grade fractions
- Alis: can be calculated using the PARIS software
Data is used to inform, not replace, professional judgement.
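The difference between a 50th and a 75th percentile prediction is plain quantile arithmetic over the similar-ability group. A sketch with invented points scores:

```python
from statistics import median, quantiles

# Invented points achieved last year by students with a similar baseline
# score. The 'prediction' is the median of this group; a challenging
# target uses the 75th percentile of the same group instead.
similar_ability_points = [40, 42, 46, 46, 48, 50, 52, 52, 54, 58]

prediction_50th = median(similar_ability_points)
q1, q2, q3 = quantiles(similar_ability_points, n=4)  # quartile cut points
challenging_target = q3                               # 75th percentile

print(prediction_50th, challenging_target)  # -> 49.0 52.5
```

If every student in a school hit the 75th percentile figure, the school's intake-adjusted results would by construction sit in the top quarter nationally, which is the claim on the slide.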

75th Percentile & Prior Value-Added
[Spreadsheet screenshots: Step 1, applying prior value-added to 75th percentile predictions.]

Value-Added Feedback Reports & Graphs

Measuring Value-Added – Terminology
[Scatter plot: exam grade against baseline score, with the trend line (regression line). The raw residual is the vertical distance from a student's result to the line: above the line is positive VA, below it is negative VA.]

Measuring Value-Added – An Example
[Chart: results (G to A*) for Alf, Bob and Chris in Subjects A and B plotted against the national trend line for the 'average' student, across low, average and high baseline ability; residuals range from -2 grades to +2 grades.]
The position of the national trend line is of critical importance.

Standardisation of Residuals
- (Raw) residuals can be used to examine an individual's performance
- Standardised residuals are used to compare the performance of groups
- Standardised residuals are independent of year and qualification type
- For a class, subject, department or whole institution, the average standardised residual is the 'value-added score'
- Standardised residual = residual / standard deviation (national sample)
- For an individual subject, the standard error of the average standardised residual is 1 / √N, where N is the number of results in the group (for combinations of subjects, consult the relevant project)
- 95% confidence limit = 2.0 × standard error
- 99% confidence limit = 2.6 × standard error
- 99.7% confidence limit = 3.0 × standard error
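The slide's arithmetic can be checked directly. Since standardised residuals have standard deviation 1 in the national sample, the standard error of a group's mean is 1/√N. The residuals below are invented for illustration:

```python
from math import sqrt

# Invented standardised residuals, one per result in a subject group.
std_residuals = [0.4, -0.2, 0.9, 1.1, -0.5, 0.7, 0.3, 0.6]

n = len(std_residuals)
value_added = sum(std_residuals) / n   # average standardised residual
std_error = 1 / sqrt(n)                # SE of the mean when national SD = 1

limit_95 = 2.0 * std_error
print(f"VA score = {value_added:.2f}, 95% limit = +/-{limit_95:.2f}")
print("significant at 95%:", abs(value_added) > limit_95)
```

With only eight results the 95% limit (about ±0.71) comfortably exceeds this group's VA score of about +0.41, illustrating the "is it important?" question on the next slide: a positive score is not necessarily a statistically significant one.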

Burning Question: What is my Value-Added Score?
Better Question: Is it Important?

Value-Added Feedback: Statistical Process Control (SPC) Chart
[SPC chart: value-added score by year, 2000–2010, with control limits.]

Subject Summary: Standardised Residual Graph
[Bar chart: average standardised residual (scale -4.0 to +4.0) for each subject: Art, Biology, Chemistry, Design and Technology, Drama, English Language, English Literature, French, Geography, German, History, Home Economics, Information Technology, Maths, Media Studies, Music, Physics, Religious Studies, Spanish. Values range from -2.2 to +2.0.]

The Scatter Plot
[Scatter plot: grade points equivalent against baseline score.]
Look for patterns…
- General underachievement / overachievement?
- Do any groups of students stand out? High ability vs low ability? Male vs female?

Other things to look for…
- Why did these students do so badly?
- Why did this student do so well?
- How did they do in their other subjects?

PARIS Software

PARIS is…
- Software to install and use in school
- An interactive reporting tool
- Included free with Alis / Yellis / MidYIS
PARIS analyses…
- Potential performance, intermediate performance and actual performance
PARIS provides…
- Student-level, subject-level and institution-level reports

Attitudes

There is more to school / college than exams…
- Student attitudes
- Student welfare & safety
- Non-academic activities
- Support
- Social and personal development
Surveys: Parental Survey, Induction Survey, Self Evaluation (Every Child Matters), Attitudinal MidYIS, INSIGHT, Attitudinal Yellis, Full ALIS.
Try it yourself at www.intuproject.org/demos

Other Issues

Points to mull over…
- Independence: no agendas
- Transparency of analysis
- Self-evaluation
- Straightforward and standardised administration
- Prompt feedback
- Full working-hours phone / email support
- Student focus
- Replacement for KS3
- Innovative online adaptive testing available: student experience
- Longitudinal analysis with appropriate error backgrounds
- Non-curriculum-embedded baselines available
- Attitudinal surveys available: Every Child Matters…

Dr Robert Clark, Alis Project Manager
robert.clark@cem.dur.ac.uk
0191 33 44 193