‘At the Coal Face’: Experiences of Computer-based Exams
John Winkley, BTL
Paul Humphries, Edexcel
Dorit Reppert, CCEA
8th July 2003

Evaluating Computer-based Exams
Key stakeholders:
- candidates
- test centres
- examiners
- Awarding Bodies
Key success factors:
- engagement
- accessibility
- technical usability
- system reliability
- security
- test validity and reliability
- data analysis

Project Background
PEP 2 Project
Schools:
- 19 schools
- GCSE-style exams
- over 1,000 candidates
- approx. 1,400 exams
learndirect & Army test centres:
- 13 test centres
- real National Tests
- over 300 candidates
- approx. 300 tests

The ExamBase System
[Architecture diagram: an Assessment Producer publishes exam and question data to the MicroBoard website; the ExamBase Server at each Test Centre downloads exams and serves them to ExamBase Clients; marked ‘papers’ are returned along the same route.]
Design requirements:
- runs on standard PCs
- does not limit use of those PCs at other times
- secure and reliable
- easy to use, engaging and accessible
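The data flow above can be pictured in a minimal Python sketch; every class and method name here is a hypothetical illustration of the described flow, not the actual ExamBase API.

from dataclasses import dataclass, field

@dataclass
class ExamPackage:
    exam_id: str
    questions: list[str]        # exam and question data from the Assessment Producer

@dataclass
class MarkedPaper:
    exam_id: str
    candidate_id: str
    marks: dict[str, int] = field(default_factory=dict)

class ExamBaseServer:
    """Test-centre server: stores downloaded exams, queues marked 'papers' for return."""
    def __init__(self) -> None:
        self.exams: dict[str, ExamPackage] = {}
        self.outbox: list[MarkedPaper] = []

    def download_from_microboard(self, package: ExamPackage) -> None:
        self.exams[package.exam_id] = package     # exam arrives via the MicroBoard website

    def serve_to_client(self, exam_id: str) -> ExamPackage:
        return self.exams[exam_id]                # an ExamBase Client requests the exam

    def collect(self, paper: MarkedPaper) -> None:
        self.outbox.append(paper)                 # marked 'papers' await return to MicroBoard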

On-screen marking
[Workflow diagram: completed exam ‘papers’ pass from the Test Centre via MicroBoard to the MarkerBase Server; examiners at the Marking Centre mark on screen, partially marked ‘papers’ circulating between examiners until fully marked.]

Exam Security
[Diagram: the MicroBoard web interface and web services connect to the ExamBase Server at the Test Centre, which in turn serves ExamBase Clients.]
- secure internet connections (HTTPS/SSL)
- encrypted file transfer (HTTPS/SSL/proprietary)
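To make “encrypted file transfer” concrete, here is a minimal Python sketch using the cryptography library’s Fernet recipe; the real system used HTTPS/SSL plus a proprietary scheme, so this library choice, the file name and the key handling are illustrative assumptions only.

from cryptography.fernet import Fernet

def encrypt_exam_file(path: str, key: bytes) -> bytes:
    """Encrypt an exam file so it is unreadable in transit and at rest."""
    with open(path, "rb") as f:
        plaintext = f.read()
    return Fernet(key).encrypt(plaintext)

def decrypt_exam_file(token: bytes, key: bytes) -> bytes:
    """Decrypt at the test centre once the exam is released."""
    return Fernet(key).decrypt(token)

key = Fernet.generate_key()                     # in practice, keys would be managed centrally
blob = encrypt_exam_file("exam_1234.xml", key)  # hypothetical exam file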

MicroBoard

ExamBase Server

ExamBase Server
Exam status indicators:
- (red) Downloaded but locked: the exam has been downloaded but has not been scheduled to run.
- (orange) Downloaded and scheduled: the exam has been scheduled but is not accessible by candidates until the scheduled start time.
- (green) Unlocked and ready to run: the four-hour scheduling window has begun and the exam is ready to be accessed by candidates.
- (yellow) Paused: the exam is currently paused by the administrator.
- (blue) Finished: the exam has been scheduled, should have been used, and the four-hour access window has now finished or the exam has been closed (see 3.5 Finish exam).
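The five colour-coded statuses behave like a small state machine driven by the schedule and the four-hour window; a minimal Python sketch, with hypothetical function and field names:

from datetime import datetime, timedelta
from enum import Enum

class ExamStatus(Enum):
    LOCKED = "red"         # downloaded but not scheduled
    SCHEDULED = "orange"   # scheduled, not yet accessible
    READY = "green"        # within the four-hour window
    PAUSED = "yellow"      # paused by the administrator
    FINISHED = "blue"      # window elapsed or exam closed

WINDOW = timedelta(hours=4)

def exam_status(scheduled_start: datetime | None, now: datetime,
                paused: bool = False, closed: bool = False) -> ExamStatus:
    """Derive the colour-coded status from the schedule and the clock."""
    if scheduled_start is None:
        return ExamStatus.LOCKED
    if closed or now >= scheduled_start + WINDOW:
        return ExamStatus.FINISHED
    if now < scheduled_start:
        return ExamStatus.SCHEDULED
    return ExamStatus.PAUSED if paused else ExamStatus.READY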

Example On-Screen Assessment Questions

The Test Centre Experience
Priority 1 – a reliable technical infrastructure:
- includes trained support staff
- unreliability may interrupt an exam
- funding and management processes are key factors
Priority 2 – an appropriate exam environment:
- setting up takes longer
- can be difficult to provide adequate privacy

The Test Centre Experience
Schools:
- computer-based exams entirely new to schools
- transfer of responsibility from administrators to technical staff
- 48% found software installation easy or very easy
- 54% had no problems with candidate registration
- all schools ran exams successfully

The Test Centre Experience
learndirect centres:
- technical infrastructure already exists for e-learning and formative assessment
- used to maintaining ‘business critical’ IT systems
- 95% of 29 Skills for Life project centres found the installation easy or very easy
- all centres found registration of candidates using MicroBoard easy

The Candidate Experience
School pupils:
- ‘digital natives’, at ease with IT
- over 94% use a computer at home at least once a week
- exams are mandatory
Adult learners:
- great disparity of IT experience; some with none at all
- have opted to take the National Tests

The Candidate Experience
School pupils:
- overwhelmingly positive
- 92% said they enjoyed the tests
- “It didn’t feel like an exam”; “more enjoyable than writing”
Adult learners:
- computer-based exams seen as new and positive
- “almost unanimously they reported preferring to take an online test rather than a paper test” – Ufi

The Candidate Experience
Common concerns:
- technology failures: rare but disconcerting; mainly due to PC system problems; candidates can continue on a spare PC without losing work
- the exam environment: keyboard noise in particular

MarkerBase

The Examiner Experience
General responses:
- 9 out of the 10 examiners found the marking either very enjoyable or enjoyable
- all found the software either very easy or easy to use
- most would prefer to work at home
- liked the choice of being able to mark ‘by candidate’, ‘by question’, or both

The Examiner Experience
Potential for more rapid marking:
- marking speed increased with experience
- marking ‘by question’ increases efficiency
More accurate tallying:
- automatic tallying of marks
- an error-prone manual task removed
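A minimal Python sketch of the ‘by question’ view and automatic tallying, using made-up data (none of these names come from the MarkerBase system):

from collections import defaultdict

# responses[candidate_id][question_id] -> mark awarded (illustrative data)
responses = {
    "C001": {"Q1": 2, "Q2": 4},
    "C002": {"Q1": 3, "Q2": 1},
}

# Group by question so an examiner can mark the same item across all candidates.
by_question: dict[str, list[tuple[str, int]]] = defaultdict(list)
for candidate, marks in responses.items():
    for question, mark in marks.items():
        by_question[question].append((candidate, mark))

# Automatic tallying replaces the error-prone manual totting-up of marks.
totals = {candidate: sum(marks.values()) for candidate, marks in responses.items()}
print(totals)   # {'C001': 6, 'C002': 4}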

The Awarding Body Experience
Engaging learners:
- closer to day-to-day experiences for many
- exam stress reduced
- positive impact on self-esteem
- more engaging than paper-based exams

The Awarding Body Experience
Accessibility:
- much greater flexibility: candidates can be registered right up to the exam start, and ‘on demand’ testing will be possible
- the ‘on screen’ approach allows tests to be taken in mobile situations (e.g. a touring bus) and in remote locations (e.g. using laptops)

The Awarding Body Experience
Quality assurance:
- less opportunity for human error
- quality checks have to be refocused
Security:
- less vulnerable to lost or mishandled papers
- encryption technologies increase security

The Awarding Body Experience
Exam reliability:
- 100% marking reliability for ‘closed’ question types
- for ‘open’ question types, marking remains subjective, but MarkerBase helps examiners to be more consistent

The Awarding Body Experience
Exam validity:
- good correlation with paper-based National Tests
- GCSEs, using a wider range of question types, will require additional validation effort

The Awarding Body Experience
Results generation:
- possibility of more rapid feedback to candidates
- results provided to the test centre rather than to candidates
Data analysis:
- richer feedback to candidates and test centres
- easier to detect anomalies in candidate responses

Conclusions
- very popular with candidates: more engaging and interactive, but additional validation needed
- greater security and reduced risk of human error
- improved accessibility and flexibility: highly desirable for Awarding Bodies
- quality of technical infrastructure is vital
- may initially place a burden on test centres, but with the promise of efficiency gains over time