
1 ‘At the Coal Face’ Experiences of Computer-based Exams
John Winkley, BTL; Paul Humphries, Edexcel; Dorit Reppert, CCEA
July 8th 2003

2 Evaluating Computer-based Exams

Key Stakeholders:
- candidates
- test centres
- examiners
- Awarding Bodies

Key Success Factors:
- engagement
- accessibility
- technical usability
- system reliability
- security
- test validity and reliability
- data analysis

3 Project Background (PEP 2 Project)

Schools:
- 19 schools
- GCSE-style exams
- over 1,000 candidates
- approx. 1,400 exams

learndirect & Army Test Centres:
- 13 test centres
- real National Tests
- over 300 candidates
- approx. 300 tests

4 The ExamBase System

Diagram: the Assessment Producer supplies exam question data to the MicroBoard website; the ExamBase Server at each Test Centre downloads exams and delivers them to ExamBase Clients; marked ‘papers’ are returned along the same route.

Requirements:
- standard PCs
- must not limit use of the PCs at other times
- secure, reliable
- easy to use, engaging, accessible
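A minimal Python model of the publish / download / return flow in the diagram above is sketched below. The class names echo the product names, but every method and field is an assumption made for illustration; the presentation does not document the real ExamBase interfaces.

    from dataclasses import dataclass
    from typing import Dict, List

    @dataclass
    class ExamPackage:
        exam_id: str
        questions: List[str]              # exam question data from the Assessment Producer

    @dataclass
    class MarkedPaper:
        exam_id: str
        candidate_id: str
        responses: Dict[str, str]         # question id -> candidate response

    class MicroBoardSite:
        """Central website: holds published exams and collects returned papers."""
        def __init__(self) -> None:
            self.published: Dict[str, ExamPackage] = {}
            self.returned: List[MarkedPaper] = []

        def publish(self, package: ExamPackage) -> None:
            self.published[package.exam_id] = package

        def download(self, exam_id: str) -> ExamPackage:
            return self.published[exam_id]

        def upload(self, paper: MarkedPaper) -> None:
            self.returned.append(paper)

    class ExamBaseServer:
        """Test-centre server: fetches exams for local ExamBase Clients, returns completed papers."""
        def __init__(self, site: MicroBoardSite) -> None:
            self.site = site
            self.local_exams: Dict[str, ExamPackage] = {}

        def fetch_exam(self, exam_id: str) -> None:
            self.local_exams[exam_id] = self.site.download(exam_id)

        def return_paper(self, paper: MarkedPaper) -> None:
            self.site.upload(paper)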

5 On-screen Marking

Diagram: partially marked ‘papers’ pass from the Test Centre, via MicroBoard, to the MarkerBase Server; Examiners at the Marking Centre complete the marking and fully marked ‘papers’ are returned.
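One plausible reading of "partially marked" versus "fully marked" papers is sketched below: closed questions are marked automatically, and examiners then add marks for the open questions. This is an illustration under that assumption; the function and field names are hypothetical, not the real MarkerBase API.

    from typing import Dict, Optional, Tuple

    # question id -> (candidate response, mark awarded so far, or None if an examiner must mark it)
    PartiallyMarked = Dict[str, Tuple[str, Optional[int]]]

    def auto_mark_closed(responses: Dict[str, str],
                         answer_key: Dict[str, str],
                         closed_marks: Dict[str, int]) -> PartiallyMarked:
        """Mark closed questions automatically; leave open questions for examiners."""
        paper: PartiallyMarked = {}
        for qid, response in responses.items():
            if qid in answer_key:                    # closed question: compare with the key
                awarded = closed_marks[qid] if response == answer_key[qid] else 0
                paper[qid] = (response, awarded)
            else:                                    # open question: an examiner marks it later
                paper[qid] = (response, None)
        return paper

    def apply_examiner_marks(paper: PartiallyMarked,
                             examiner_marks: Dict[str, int]) -> Dict[str, int]:
        """Merge examiner marks for the open questions to give a fully marked paper."""
        return {qid: (mark if mark is not None else examiner_marks[qid])
                for qid, (_, mark) in paper.items()}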

6 Exam Security

Diagram: MicroBoard exposes a web interface and web services; the ExamBase Server at the Test Centre acts as the web service consumer and serves the ExamBase Clients.

- secure internet connections (https/ssl)
- encrypted file transfer (https/ssl/proprietary)
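As an illustration of the "encrypted file transfer" idea, the sketch below encrypts an exam package with a symmetric key before it travels over the https/ssl link. The presentation does not describe the proprietary scheme actually used; this example simply uses the third-party Python cryptography package to show the principle.

    from cryptography.fernet import Fernet

    # In practice the key would be provisioned securely to the test centre,
    # not generated alongside the data as it is here.
    key = Fernet.generate_key()
    cipher = Fernet(key)

    exam_package = b'{"exam_id": "exam-001", "questions": ["..."]}'

    # Encrypt before the exam file leaves MicroBoard; the transfer itself also runs over https/ssl.
    ciphertext = cipher.encrypt(exam_package)

    # Decrypt at the ExamBase Server when the exam is due to run.
    plaintext = cipher.decrypt(ciphertext)
    assert plaintext == exam_package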

7 MicroBoard


10 ExamBase Server

11 ExamBase Server

- (red) Downloaded but locked: the exam has been downloaded but has not been scheduled to run.
- (orange) Downloaded and scheduled: the exam has been scheduled but is not accessible by candidates until the scheduled start time.
- (green) Unlocked and ready to run: the four-hour scheduling window has begun and the exam is ready to be accessed by candidates.
- (yellow) Paused: the exam is currently paused by the administrator.
- (blue) Finished: the exam has been scheduled, should have been used and the four-hour window for access has now finished, or the exam has been closed (see 3.5 Finish exam).
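The five states above amount to a small state machine driven by the schedule and the four-hour access window. A minimal sketch of that lifecycle follows; the class and method names are assumptions for illustration, not the real ExamBase implementation.

    from datetime import datetime, timedelta
    from enum import Enum
    from typing import Optional

    class ExamState(Enum):
        DOWNLOADED_LOCKED = "red"        # downloaded but not yet scheduled
        SCHEDULED = "orange"             # scheduled, not yet open to candidates
        READY = "green"                  # inside the four-hour window
        PAUSED = "yellow"                # paused by the administrator
        FINISHED = "blue"                # window elapsed or exam closed

    ACCESS_WINDOW = timedelta(hours=4)   # the four-hour scheduling window

    class DownloadedExam:
        def __init__(self) -> None:
            self.scheduled_start: Optional[datetime] = None
            self.paused = False
            self.closed = False

        def schedule(self, start: datetime) -> None:
            self.scheduled_start = start

        def state(self, now: datetime) -> ExamState:
            if self.scheduled_start is None:
                return ExamState.DOWNLOADED_LOCKED
            if self.closed or now >= self.scheduled_start + ACCESS_WINDOW:
                return ExamState.FINISHED
            if now < self.scheduled_start:
                return ExamState.SCHEDULED
            if self.paused:
                return ExamState.PAUSED
            return ExamState.READY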

12 Example On-Screen Assessment Questions

13 The Test Centre Experience

Priority 1 - a reliable technical infrastructure:
- includes trained support staff
- unreliability may interrupt an exam
- funding and management processes are key factors

Priority 2 - an appropriate exam environment:
- setting up takes longer
- it can be difficult to provide adequate privacy

14 The Test Centre Experience

Schools:
- computer-based exams entirely new to schools
- transfer of responsibility from administrators to technical staff
- 48% found software installation easy or very easy
- 54% had no problems with candidate registration
- all schools ran exams successfully

15 The Test Centre Experience

learndirect centres:
- technical infrastructure already exists for e-learning and formative assessment
- used to maintaining ‘business critical’ IT systems
- 95% of 29 Skills for Life project centres found the installation easy or very easy
- all centres found registration of candidates using MicroBoard easy

16 The Candidate Experience

School Pupils:
- ‘digital natives’, at ease with IT
- over 94% use a computer at home at least once a week
- exams are mandatory

Adult Learners:
- great disparity of IT experience; some with no IT experience
- have opted for National Tests

17 The Candidate Experience

School Pupils:
- overwhelmingly positive
- 92% said they enjoyed the tests

Adult Learners:
- computer-based exams seen as new and positive
- “almost unanimously they reported preferring to take an online test rather than a paper test” - Ufi

Quotes: “It didn’t feel like an exam”; “more enjoyable than writing”

18 The Candidate Experience

Common concerns:
- technology failures:
  - rare but disconcerting
  - mainly due to PC system problems
  - candidates can continue on a spare PC without losing work
- the exam environment:
  - keyboard noise in particular

19 MarkerBase

20 The Examiner Experience

General responses:
- 9 out of the 10 examiners found the marking either very enjoyable or enjoyable
- all found the software either very easy or easy to use
- most would prefer to work at home
- liked the choice of being able to mark ‘by candidate’, ‘by question’ or both

21 The Examiner Experience

Potential for more rapid marking:
- marking speed increased with experience
- marking ‘by question’ increases efficiency

More accurate tallying:
- automatic tallying of marks (sketched below)
- an error-prone task removed
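A minimal sketch of what marking ‘by question’ and automatic tallying can look like, under assumed data structures (the presentation does not show the MarkerBase schema):

    from typing import Dict

    # candidate id -> {question id -> mark awarded}
    MarkSheet = Dict[str, Dict[str, int]]

    def marks_by_question(marks: MarkSheet) -> Dict[str, Dict[str, int]]:
        """Re-index the same marks by question: the view used when marking 'by question'."""
        grouped: Dict[str, Dict[str, int]] = {}
        for candidate, per_question in marks.items():
            for qid, mark in per_question.items():
                grouped.setdefault(qid, {})[candidate] = mark
        return grouped

    def tally(marks: MarkSheet) -> Dict[str, int]:
        """Total each candidate's marks automatically, removing manual addition errors."""
        return {candidate: sum(per_question.values())
                for candidate, per_question in marks.items()}

    # Example: two candidates, three questions
    sheet = {
        "cand_001": {"Q1": 3, "Q2": 5, "Q3": 2},
        "cand_002": {"Q1": 4, "Q2": 2, "Q3": 6},
    }
    print(tally(sheet))   # {'cand_001': 10, 'cand_002': 12}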

22 The Awarding Body Experience

Engaging learners:
- closer to day-to-day experiences for many
- exam stress reduced
- positive impact on self-esteem
- more engaging than paper-based exams

23 The Awarding Body Experience

Accessibility:
- much greater flexibility:
  - candidates can be registered up to the exam start
  - ‘on demand’ testing will be possible
- the ‘on screen’ approach allows tests to be taken:
  - in mobile situations (e.g. a touring bus)
  - in remote locations (e.g. using laptops)

24 Quality assurance:
- less opportunity for human error
- quality checks have to be refocused

Security:
- less vulnerable to lost or mishandled papers
- encryption technologies increase security

25 The Awarding Body Experience

Exam reliability:
- 100% reliability for ‘closed’ question types
- ‘open’ question types:
  - marking remains subjective
  - MarkerBase helps examiners to be more consistent

26 The Awarding Body Experience

Exam validity:
- good correlation with paper-based National Tests
- GCSEs, using a wider range of question types, will require additional validation effort

27 The Awarding Body Experience

Results generation:
- possibility of more rapid feedback to candidates
- results provided to the test centre rather than to candidates

Data analysis:
- richer feedback to candidates and test centres
- easier to detect anomalies in candidate responses (an illustrative sketch follows below)
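As one illustration of the kind of anomaly detection that item-level response data makes possible, the sketch below flags pairs of candidates with unusually similar answer patterns. This is a hedged example only; the presentation does not describe the actual analyses performed.

    from itertools import combinations
    from typing import Dict, List, Tuple

    def similarity(a: List[str], b: List[str]) -> float:
        """Fraction of questions on which two candidates gave identical responses."""
        return sum(1 for x, y in zip(a, b) if x == y) / len(a)

    def flag_similar_pairs(responses: Dict[str, List[str]],
                           threshold: float = 0.95) -> List[Tuple[str, str, float]]:
        """Return candidate pairs whose response patterns agree at or above the threshold."""
        return [(c1, c2, similarity(r1, r2))
                for (c1, r1), (c2, r2) in combinations(responses.items(), 2)
                if similarity(r1, r2) >= threshold]

    # Example: three candidates answering five closed questions
    data = {
        "cand_001": ["A", "C", "B", "D", "A"],
        "cand_002": ["A", "C", "B", "D", "A"],   # identical pattern, so this pair is flagged
        "cand_003": ["B", "C", "A", "D", "C"],
    }
    print(flag_similar_pairs(data))   # [('cand_001', 'cand_002', 1.0)]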

28 Conclusions

- very popular with candidates - more engaging and interactive, but additional validation needed
- greater security and reduced risk of human error
- improved accessibility and flexibility - highly desirable for Awarding Bodies
- quality of the technical infrastructure is vital
- may initially place a burden on test centres, but with the promise of efficiency gains over time

