1
Computerised Peer-Assessment that Supports the Rewarding of Evaluative Skills in Essay Writing (C.A.P.) & Programming (CourseMarker)
Phil Davies & Stuart Lewis
School of Computing, University of Glamorgan
2
Need for Assessment?
As tutors we are trying to "separate" the sheep from the goats via the assessment process. This can often be difficult given the time constraints imposed on tutors, in what Chanock describes as "more goat-friendly times" (Chanock, 2000).
Problem: feedback versus time!
3
Defining Peer-Assessment
In describing the teacher:
"A tall b******, so he was. A tall thin, mean b******, with a baldy head like a lightbulb. He'd make us mark each other's work, then for every wrong mark we got, we'd get a thump. That way" – he paused – "we were implicated in each other's pain."
McCarthy's Bar (Pete McCarthy, 2000, page 68)
4
What functionality do we require in a computerised peer-assessment system?
- A method to peer-mark and COMMENT
- A method to allow students to view comments
- A method to permit anonymous conversation
- A method to take account of high/low markers, so the system is fair to all
- A method to qualitatively assess the marking and commenting processes (higher-order skills)
- A method to permit a student to make comments that are understandable (a framework) to them and to the essay's owner
- Security / recognise and avoid plagiarism / flexibility
7
Automatically email the marker … anonymous
9
Must be rewarded for doing the "mark for marking" process, based on quality. How to judge?
- Standard of expectation (self-assessment)
- Marking consistency
- Commenting quality, measured against the mark
- Discussion element: need for additional comments – a black mark? Reaction to requests for further clarification
11
Feedback Index
- Produce an index that reflects the quality of commenting
- Produce an average feedback index for an essay (also compensated?)
- Compare against the marker in a similar manner to the marks analysis
- Where does this feedback index come from, and is it valid?
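The comparison of a marker's feedback index against the essay's average, mirroring the marks analysis, can be sketched in a few lines. This is a minimal illustration only: the method names (essayAverage, feedbackDifference) are mine, not C.A.P.'s.

```java
import java.util.List;

// Sketch of the feedback-index comparison described above: one marker's
// index for an essay is compared against the essay's average index,
// analogously to a marking difference. Illustrative names, not the
// authors' actual code.
public class FeedbackIndex {

    // Average feedback index an essay received from all of its markers.
    static double essayAverage(List<Double> indices) {
        return indices.stream()
                      .mapToDouble(Double::doubleValue)
                      .average()
                      .orElse(0.0);
    }

    // Signed difference between one marker's index and the essay average.
    static double feedbackDifference(double markerIndex, List<Double> all) {
        return markerIndex - essayAverage(all);
    }

    public static void main(String[] args) {
        List<Double> indices = List.of(6.0, 7.5, 5.5, 7.0);
        System.out.printf("%.2f%n", feedbackDifference(7.5, indices)); // +1.00
    }
}
```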
14
[Flattened results table: essay marks grouped into marking-difference bands from -7 to +4; the bottom row of column averages runs 34, 37.8, 45.8, 59.6, 55.6, 58.8, 62.9, 62.8, 67.5, 68, 70.]
16
CAA Conference 2003 – Future Work
It should be noted that students marking work tend to use only a subset of these comments, and from their feedback they attach different weightings to each comment when judging the quality of an essay.
17
Exercise
1. I think you've missed out a big area of the research
2. You've included a "big chunk" that you haven't cited
3. There aren't any examples given to help me understand
4. Grammatically it is not what it should be like
5. Your spelling is atrocious
6. You haven't explained your acronyms to me
7. You've directly copied my notes as your answer to the question
8. 50% of what you've said isn't about the question
18
Each student is using a different set of comments … these new weightings MAY give a better feedback index? Currently being evaluated.
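As a rough illustration of how re-weighted comments could yield a feedback index, here is a hedged sketch. The weights and the class name are invented for illustration; the paper's actual re-weighting, derived from student feedback, is still being evaluated.

```java
import java.util.Map;

// Sketch of a weighted feedback index: each standard comment carries a
// weight, and a marker's index for an essay is the sum of the weights of
// the comments they attached. All weights below are assumptions.
public class WeightedFeedbackIndex {

    // Hypothetical weights for the eight exercise comments above.
    static final Map<Integer, Double> WEIGHTS = Map.of(
        1, 2.0,  // missed a big area of the research
        2, 1.5,  // uncited "big chunk"
        3, 1.0,  // no examples given
        4, 0.5,  // grammar
        5, 0.5,  // spelling
        6, 0.5,  // unexplained acronyms
        7, 2.5,  // directly copied notes
        8, 2.0   // 50% off-topic
    );

    static double index(int... commentIdsUsed) {
        double total = 0.0;
        for (int id : commentIdsUsed) {
            total += WEIGHTS.getOrDefault(id, 0.0);
        }
        return total;
    }

    public static void main(String[] args) {
        // A marker who flagged copying (7) and off-topic content (8)
        // scores higher than one who flagged only spelling (5).
        System.out.println(index(7, 8)); // 4.5
        System.out.println(index(5));    // 0.5
    }
}
```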
19
Is it my job to teach students how to write essays, etc.?
- Assessment MUST be directed at subject skills
- Why bother writing essays or doing exam questions if they don't relate to the needs or learning outcomes of the subject?
- Post HND … N-tier … assess the essays of the final year (last year)
- Preparation/Research: judge knowledge against last year's results, both marks and comments
- Mistake!!
20
For example, a group of marking differences +4, -22, +16, -30, +8, +12 would result in an average difference of -12 / 6 = -2 (taking 0 as the expected difference). The absolute differences from this value of -2 are 6, 20, 18, 28, 10, 14. This gives an average consistency valuation of 16 (96/6), which shows poor consistency by this marker. Compare this with a student whose marking produced differences of +4, -4, -10, -8, +6, 0. The average difference for this marker is again -2 (-12/6). The absolute differences from this value, however, are 6, 2, 8, 6, 8, 2, giving a consistency valuation of 5.33 (32/6). This student deserves much more credit for their marking, even though the average difference of the two sets of markings was the same. The fact that a student always marks high or low is now factored out, since it is the absolute deviation from their own average that is being compared.
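The consistency valuation described above is straightforward to express in code. A minimal sketch, with names of my own choosing rather than the C.A.P. implementation:

```java
import java.util.Arrays;

// Sketch of the marking-consistency valuation described above:
// the mean absolute deviation of each marking difference from the
// marker's own average difference. Low values mean a consistent
// marker, even one who habitually marks high or low.
public class ConsistencyValuation {

    // Average of the marker's differences from the reference mark.
    static double averageDifference(int[] diffs) {
        return Arrays.stream(diffs).average().orElse(0.0);
    }

    static double consistencyValuation(int[] diffs) {
        double avg = averageDifference(diffs);
        return Arrays.stream(diffs)
                     .mapToDouble(d -> Math.abs(d - avg))
                     .average()
                     .orElse(0.0);
    }

    public static void main(String[] args) {
        int[] erratic    = { +4, -22, +16, -30, +8, +12 };
        int[] consistent = { +4,  -4, -10,  -8, +6,   0 };
        // Both sets average -2, but the valuations are 16.00 and 5.33.
        System.out.printf("erratic:    %.2f%n", consistencyValuation(erratic));
        System.out.printf("consistent: %.2f%n", consistencyValuation(consistent));
    }
}
```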
22
Marks | Marking Difference | Feedback Difference | Mapping for MCQ & Essays
5     | < 4                | < 2                 | 90% - 100%
4     | < 8                | < 4                 | 80% - 89%
3     | < 12               | < 6                 | 60% - 79%
2     | < 16               | < 8                 | 40% - 59%
1     | < 20               | < 10                | 20% - 39%
0     | 20 or more         | 10 or more          | 0% - 19%

Who benefited the most by doing this exercise? Cured plagiarism?
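The banding in this table amounts to a simple threshold lookup. A sketch with the thresholds taken from the table and everything else (method and class names) assumed:

```java
// Sketch of the marks-for-marking bands in the table above. The band
// thresholds come from the slide; the names are illustrative only.
public class MarkForMarking {

    // Absolute marking difference -> 0..5 marks awarded for marking.
    static int marksForMarkingDifference(int absDiff) {
        if (absDiff < 4)  return 5;
        if (absDiff < 8)  return 4;
        if (absDiff < 12) return 3;
        if (absDiff < 16) return 2;
        if (absDiff < 20) return 1;
        return 0;  // 20 or more
    }

    // The feedback-difference column uses exactly halved thresholds,
    // so doubling the input reuses the same bands.
    static int marksForFeedbackDifference(int absDiff) {
        return marksForMarkingDifference(absDiff * 2);
    }

    public static void main(String[] args) {
        System.out.println(marksForMarkingDifference(6));   // 4
        System.out.println(marksForFeedbackDifference(6));  // 2
    }
}
```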
23
Can the same principles be applied in other subject areas?
Java programming with CourseMarker (Stuart Lewis' idea):
- Students create a solution to a programming assignment
- Submission(s)
- Peer-evaluate other solutions
- Comments … marks for marking (weightings)
24
CourseMarker: Computer Assisted Teaching and Assessment
[Architecture diagram: Core CM, File Storage System, Marking System, Evaluation System, Exercise Development System, Student Exercise Environment]
Teacher side: assignments, exercises, notes, questions, test methods, solution template, marking scheme, exercise setup; course statistics, flagging-up of "problem cases", immediate support
Student side: submission, edit, compile, link, run; feedback and mark, final mark, position in class; comments / questions
Features: marks Modula-2, Java and C; UNIX (Linux), Windows and Mac based – all platforms; remote student / teacher access (distance learning, open all hours); assessment of text I/O assignments only – no marking of graphical output
Advantages (teacher): re-usability; automated marking – fair, frees time; plagiarism check
Advantages (students): immediate feedback; fast support
Disadvantages (teacher): steep learning curve; difficult setup (but it's getting easier)
Disadvantages (students): additional overheads
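CourseMarker's own source is not shown here, but the "assessment of text I/O assignments" it automates amounts to running a submission and comparing its output with an expected transcript. A minimal sketch under assumed file names and a crude pass/fail rule, not CourseMarker's actual marking scheme:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Sketch of the kind of text-I/O check an automated marker performs:
// run the (already compiled) student program, capture its stdout, and
// compare it with the expected output. File names are assumptions.
public class TextIoCheck {

    public static void main(String[] args)
            throws IOException, InterruptedException {
        Process p = new ProcessBuilder("java", "StudentSolution")
                .redirectErrorStream(true)
                .start();
        String actual = new String(p.getInputStream().readAllBytes());
        p.waitFor();

        String expected = Files.readString(Path.of("expected_output.txt"));

        // Crude pass/fail on trimmed output; a real marker would award
        // partial credit, e.g. per matching line.
        boolean pass = actual.strip().equals(expected.strip());
        System.out.println(pass ? "PASS" : "FAIL");
    }
}
```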
25
PeerMarker Screen
26
Student while marking
- Exposure to different solutions
- Development of critical evaluative skills
- Useful experience of reading code, for future employment situations
- Plagiarism? … good solution / no understanding
27
Student while reviewing feedback from peers
- Range of subjective marking
- Confirmation of objective automated marking
- Anonymous discussion between marker and marked
28
Current position
- Test system working
- Changes following beta test in progress
- Plans to try the sample study again (at a more convenient time, and with added rewards!)
- Employed a 2nd placement student
- Graphical interface
29
Some Points Outstanding, or Outstanding Points
- What should students do if they identify plagiarism?
- Is it ethical to get students to mark the work of their peers?
- Is a computerised solution valid for all?
- At what age / level can we trust the use of peer assessment?
- How do we assess the time required to perform the marking task?
- What split of the marks between creation & marking?
- BEST STORY
30
Contact Information
Phil Davies / Stuart Lewis
School of Computing, University of Glamorgan
pdavies@glam.ac.uk
sflewis@glam.ac.uk
Innovations in Education & Teaching International / ALT-J