1 The Development of a Computer-Assisted Design, Analysis and Testing System for Analysing Students' Performance
Qingping He and Peter Tymms (CEM Centre, University of Durham, UK)

2 The Context
Classical and Modern Test Theories
Limitations of Classical Test Theory
Advantages of Item Response Theory (IRT)
Progress in Computer-Assisted Assessment Incorporating Test Theories
Existing Computer Software Systems: Expensive, Demand Expertise, Vary in Functionality
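To illustrate the IRT approach referred to above, here is a minimal sketch (my illustration, not part of CADATS) of the Rasch model, which expresses the probability of a correct response as a function of the difference between student ability and item difficulty on a common logit scale — the property that lets items and students be placed on one metric:

```python
import math

def rasch_probability(ability: float, difficulty: float) -> float:
    """Probability of a correct response under the Rasch (one-parameter IRT) model.

    Ability and difficulty are on the same logit scale, which is what
    allows item banking, test equating and adaptive testing to share
    a common metric.
    """
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# A student whose ability equals the item's difficulty has a 50% chance
# of answering correctly; a higher ability raises the probability.
print(rasch_probability(0.0, 0.0))  # 0.5
```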

3 The Aim
Develop a system that is easy and economical for schools (and other test organisations) to use, and that provides a range of useful functionalities

4 CADATS – Computer Assisted Design, Analysis and Testing System
Build Items and Construct Item Banks
Design Tests Effectively (both Classical and IRT Model-based Tests)
Analyse Test Results – Detailed Diagnostic Analysis of Students' Performance at Individual Student and School/Class Levels
Analyse Test Items and Equate Different Tests (Rasch)
Conduct Classical and IRT-based Tests, Including Computer Adaptive Tests (CATs), on Computers
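The CAT capability listed above can be sketched briefly. Under the Rasch model, the most informative next item is the one whose difficulty lies closest to the current ability estimate; the item bank and selection rule below are illustrative assumptions, not CADATS code:

```python
def next_item(ability_estimate: float, bank: list[float]) -> float:
    """Pick the unadministered item whose difficulty best matches the
    current ability estimate (maximum information under the Rasch model)."""
    return min(bank, key=lambda difficulty: abs(difficulty - ability_estimate))

# Hypothetical item difficulties on the logit scale.
bank = [-1.5, -0.5, 0.0, 0.8, 1.6]
print(next_item(0.6, bank))   # 0.8
print(next_item(-1.2, bank))  # -1.5
```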

5 The Architecture
Item and Bank Building, Test Designing and Results Analysis Component
Testing Component
Test XML Files
Response XML Files
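The two components exchange data as XML files. As a sketch of what such a test file might contain — the element and attribute names here are assumptions for illustration, not the actual CADATS schema:

```python
import xml.etree.ElementTree as ET

# Build a minimal, hypothetical test description of the kind the two
# components might exchange (names are illustrative, not the real schema).
test = ET.Element("test", name="Year7BaselineMaths")
item = ET.SubElement(test, "item", id="M001", difficulty="-0.42")
ET.SubElement(item, "stem").text = "What is 3/4 of 20?"
ET.SubElement(item, "key").text = "15"

xml_text = ET.tostring(test, encoding="unicode")
print(xml_text)
```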

6 System Implementation
Both Components Have Been Developed as Microsoft Excel Visual Basic for Applications (VBA) Projects, Embedded with the Macromedia Flash Player for Ease of Use and Intuitive Graphical Presentation of Test Results

7 Item and Bank Building, Test Designing and Results Analysis Component – Excel Workbook: The Main Menu

8 Writing Items and Building an Item Bank

9 Designing Tests Effectively

10 Analysing Test Results: Report on Individual/School Overall Scores + More

11 Item Calibration and Equating Different Tests
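Test equating under the Rasch model, as used here, can be sketched simply: because Rasch calibration fixes the difficulty scale only up to an additive shift, two test forms that share anchor items can be linked by the mean difference of those items' calibrated difficulties. The difficulty values below are illustrative, not real calibrations:

```python
def equating_constant(anchor_a: list[float], anchor_b: list[float]) -> float:
    """Shift that maps form B's difficulty scale onto form A's scale,
    estimated as the mean difference over shared (anchor) items."""
    diffs = [a - b for a, b in zip(anchor_a, anchor_b)]
    return sum(diffs) / len(diffs)

# The same anchor items, as calibrated separately within each form:
form_a = [-0.8, 0.1, 0.9]
form_b = [-1.0, -0.1, 0.7]
shift = equating_constant(form_a, form_b)
print(round(shift, 3))  # 0.2 -> add 0.2 to form B difficulties to place them on A's scale
```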

12 The Test Delivery Component – Excel Workbook: The Test Interface

13 Analysis of Students' Performance – The Hong Kong Year 7 Baseline Maths Test Case Study
Hong Kong Schools Involved in the Middle Years Information System (MidYIS) Project
The Year 7 Baseline Test for Hong Kong Schools
Analysing Maths Test Results for Students from 5 Schools in Hong Kong Taking the Baseline Test in 2003

14 Maths Questions: Different Subject Areas

15 Item Difficulty and Student Ability Distributions: All 5 Schools

16 Individual Level: Individual Performance and Diagnostic Analysis

17 Individual Level: Individual Performance and Diagnostic Analysis

18 Individual Level: Individual Performance and Diagnostic Analysis

19 Individual Level: Individual Performance and Diagnostic Analysis – Comparison within School

20 Individual Level: Individual Performance and Diagnostic Analysis – Comparison with a Norm (e.g. National Average)

21 School Level: School/Class Performance and Diagnostic Analysis – Comparison with a Norm (e.g. National Average)

22 Conclusions
The system is easy to use and provides a variety of functions
The system can be used to generate information on the performance of students, at both individual and school/class levels, and on the performance of test items
The information generated by the system can be used to identify curriculum areas where students are underperforming
The use of the system can help schools improve students' performance