Evaluation of the Respondus assessment tool as a means of delivering summative assessments
A KU SADRAS Project (Student Academic Development Research Associate Scheme)
Laura Marshall, Mark Coombs, Peter Garside
Aims and Hypothesis
An evaluation of the Respondus software as a means of delivering online summative MCQ assessments. We investigated the student experience (preparation, expectations, impact and results) in relation to students' backgrounds and learning environments. Students critically reflected on the process and considered how satisfied they were that the MCQ and software offered a fair assessment, one which reflected their perception of their own understanding. Our core hypothesis was that students would prefer the Respondus software as a means of delivering a summative MCQ assessment over other forms of assessment, and that it would offer a more objective reflection of knowledge.
Respondus and MCQ
Respondus is a software package for institutional use, facilitating online examinations such as multiple choice tests. The test was delivered through Study Space with the aid of a lockdown browser. It was one assessment for a module on the geographical contours of capitalism; it lasted one hour and consisted of 50 questions: 47 multiple choice and 3 missing-word responses. Questions were presented one per screen, in randomised order, and responses were stored automatically. Students were given an exemplar test prior to the exam.
An evaluation of the process
A group of students undertook the evaluation under the SADRAS scheme, devising the methodology in conjunction with staff. Data were gathered through an online questionnaire and focus groups. All students who took the MCQ completed the questionnaire and took part in a focus group. The evaluation team were themselves part of the module.
Questionnaire: Contradictions!
Pros
- 62.5% agreed the MCQ encouraged attendance
- 77% felt the level of difficulty was appropriate
- 93.7% felt they were well guided for the exam
- 70% felt less stressed preparing for the MCQ than for another form of exam
- 87% felt the exam material was well covered in the module
- 75% felt more confident taking this form of exam than other types
Cons
- 56.2% felt results and attendance correlated
- 50% had trouble deciding what to revise
- 31% spent less than 4 hours revising
- Only 56% felt prepared
- 38% felt the pattern of answers made them question their responses
- Only 56% felt the questions were worded clearly
Exam Mark Correlations

Correlation between attendance and exam mark:
  Measure                 Value   Approx. Sig.
  Pearson's R             .787    .000
  Spearman Correlation    .741    .001

Correlation between exam mark and essay mark:
  Measure                 Value   Approx. Sig.
  Pearson's R             .277    .010
  Spearman Correlation    .092    .735
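For readers who want to reproduce this kind of analysis, the sketch below shows how both coefficients and their significance values could be computed in Python with scipy. This is an illustration only: the attendance and mark values are hypothetical placeholders, not the study's data, and the variable names are our own.

from scipy.stats import pearsonr, spearmanr

# Hypothetical per-student records (placeholder values, not study data):
# number of sessions attended and exam mark (%).
attendance = [10, 8, 9, 4, 7, 11, 5, 12, 6, 9]
exam_mark = [72, 60, 68, 41, 55, 78, 45, 81, 50, 65]

r, p = pearsonr(attendance, exam_mark)          # linear association
rho, p_rho = spearmanr(attendance, exam_mark)   # rank-based association

print(f"Pearson's R = {r:.3f} (approx. sig. = {p:.3f})")
print(f"Spearman correlation = {rho:.3f} (approx. sig. = {p_rho:.3f})")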
Focus group topics regarding the software
- A dummy run of the software meant people knew what to expect from the exam software.
- The software was straightforward and easy to navigate.
- Students were confident that their answers were being stored automatically.
- Some mentioned that they did not realise they could go back to questions and revise their answers.
- It would have been useful to be able to flag questions they were unsure of, to return to later.
- A short-answer question (typing a one-word answer into a box) would have been preferable to a missing-word question.
- The font and text size did not appear consistent throughout; a possible fault, and a distraction!
- Separate pages for each question prevented an awareness of a response pattern.
Focus group topics regarding the assessment

On the exam…
- Students felt more comfortable knowing the answer was there in front of them (through the multiple choice format).
- One participant suggested that males are better at making snap judgements, whereas females are more careful and review their answers.
- Essays focus in on one part of the course, whereas the multiple choice exam covered a wider range of topics.
- The exam tested memory of specific facts, concepts and theorists.
- It also tested comprehension.

On the outcomes of the exam…
- "Maybe I didn't know it to the depth that I should."
- "It was a wake-up call, as I thought I had a good grasp on the topics but realised I was uncertain on some aspects."
- "Not sure if it was a lack of studying or not knowing what topics to revise."
- "I thought it was a fair assessment of the material covered."
- Some did not achieve a grade they thought was representative of their comprehension of the subject.

On how the assessment fits in with the course overall…
- Some would have liked it to be formative rather than summative.
- Some would prefer a test like this to be run every two weeks.
- Some preferred presentations or essays as forms of assessment.
Findings & conclusions
- The relationship between attendance and result was significant.
- The relationship between exam and essay results was negligible.
- The software and environment worked well.
- The exam produced a wider distribution of marks (more objective?).
- The exam did not favour any particular group.
- The process was resource-intensive for staff up front, in developing questions and testing the software.
Recommendations
- Students should have the ability to flag questions to return to.
- Students should have the ability to change the text size.
- Staff should be able to check the final test from the student perspective (consistency of font, etc.).
- Run questions by students from the year above?
- Hold summative MCQ tests at regular intervals.
- Avoid factual questions based purely on recall.
- Students should have the ability to review their responses and the answers at a later date.