Review in Computerized Peer-Assessment
Dr Phil Davies, Department of Computing, Division of Computing & Mathematical Sciences, FAT, University of Glamorgan


What do we need to provide to have a fully automated peer-assessment system?
- Automatically create a mark that reflects the quality of an essay/product via peer marking
- Automatically create a mark that reflects the quality of the peer-marking process, i.e. a fair/reflective mark for marking and commenting (SLIGHT GAP)

THE FIRST CAP MARKING INTERFACE

Typical Assignment Process
- Students register to use the system (CAP)
- Create an essay in an area associated with the module, using an RTF template of headings
- Submit via the Blackboard Digital Drop-Box
- An anonymous code is given to the essay automatically by the system
- Use the CAP system to mark

Self/Peer Assessment
- Often a self-assessment stage is used: set personal criteria, get the opportunity to identify errors, get used to the system
- Normally peer-mark about 5-6 essays
- Raw peer MEDIAN mark produced
- Students need to receive comments as well as marks

Compensation for High and Low Markers
- Need to take this into account
- Each essay has a 'raw' peer-generated mark (the MEDIAN)
- Look at each student's marking and ascertain whether, on average, they are an under- or over-marker
- Offset the mark by this value to create a COMPENSATED PEER MARK
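The slides describe the compensation step but not its exact formula; below is a minimal Python sketch of one plausible reading (the function and variable names are mine, not part of CAP).

```python
from statistics import mean, median

# Sketch only: marks[marker][essay] = the mark that marker awarded to that essay.
def compensated_peer_marks(marks):
    essays = {essay for awarded in marks.values() for essay in awarded}

    # Raw peer mark per essay: the MEDIAN of the marks it received.
    raw = {essay: median(awarded[essay] for awarded in marks.values()
                         if essay in awarded)
           for essay in essays}

    # Marker bias: on average, how far above (+) or below (-) the raw
    # median does this marker mark?
    bias = {marker: mean(awarded[essay] - raw[essay] for essay in awarded)
            for marker, awarded in marks.items()}

    # Compensated peer mark: remove each marker's bias from their marks,
    # then take the median of the adjusted marks per essay.
    compensated = {essay: median(awarded[essay] - bias[marker]
                                 for marker, awarded in marks.items()
                                 if essay in awarded)
                   for essay in essays}
    return raw, bias, compensated
```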

Below are comments given to students. Select the 3 most important to YOU:
1. I think you've missed out a big area of the research
2. You've included a 'big chunk', word for word, that you haven't cited properly
3. There aren't any examples given to help me understand
4. Grammatically it is not what it should be like
5. Your spelling is atroceious
6. You haven't explained your acronyms to me
7. You've directly copied my notes as your answer to the question
8. 50% of what you've said isn't about the question
9. Your answer is not aimed at the correct level of audience
10. All the points you make in the essay lack any references for support

Each student is using a different set of weighted comments. The comments databases are sent to the tutor.

First stage => self-assess own work
Second stage (button on server) => peer-assess 6 essays
Comments, both positive and negative, in the various categories provide a subjective framework for commenting & marking

Feedback Index
- Produced an index that reflects the quality of commenting
- Produced a Weighted Feedback Index
- Compare how a marker has performed against these averages per essay, for both marking and commenting, looking for consistency
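The slide does not give the feedback-index formula; the sketch below is a hypothetical reading in which the index is the weighted sum of the menu comments attached to one marking, compared per essay in the same way as the marks.

```python
from statistics import mean

# Hypothetical sketch only: assume each attached menu comment carries a weight,
# and the weighted feedback index for one marking is the sum of those weights.
def weighted_feedback_index(comment_weights):
    return sum(comment_weights)

# Consistency check: compare each marker's index per essay against the average
# index that essay received from all of its markers (as with the marks).
def average_feedback_difference(indexes):   # indexes[essay][marker] = index
    diffs = {}
    for essay, per_marker in indexes.items():
        essay_avg = mean(per_marker.values())
        for marker, idx in per_marker.items():
            diffs.setdefault(marker, []).append(idx - essay_avg)
    return {marker: mean(d) for marker, d in diffs.items()}
```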

The Review Element
- Originally, in Communications within the CAP marking process, it required the owner of the file to 'ask' questions of the marker
- The emphasis 'should' be on the marker
- The marker does NOT see the comments of other markers who have marked the same essays
- The marker does not really get to reflect on their own marking, i.e. a reflective 2nd chance
- I've avoided this in the past: get it right first time

Click the button to retrieve an essay previously marked, together with its comments and marks

Click the button to view the comments of another marker

Markers can change any marks and/or comments they feel appropriate and submit by clicking the button

Trialled with a Post-Graduate Group
- 13 students
- 76 markings
- Average time per marking = 42 minutes (range 3-72)
- Average number of menu comments per marking = 15.7
- Peer average mark = 59.69% (before review 60.15%)
- Number of students who did replacements = 10 (out of 13)
- 41 'replaced' markings (54%)
- Of the 41 markings 'replaced', 26 changed the mark: 26/76 (34%)
- Only 33 out of 41 really changed anything
- 2 students 'replaced' ALL of their markings
- Mark changes: -8, -7, -7, -7, -6, -6, -6, -5, -5, -5, -4, -4, -3, -3, -2, -2, -2, -2, -1, -1; +1, +2, +6, +7, +8, +9

Mapping of Feedback Indexes to Compensated Peer Essay Marks

[Table per student: # / Self-Assessment / Reflective Self-Assessment / Raw Peer / Compensated Peer / Raw Peer Post-Review / Compensated Peer Post-Review; per-student values not recoverable from the transcript]
Initial average self-assessment: 68.33%. Reflective average self-assessment: 64.63%

Raw peer-generated mark pre-review: 60.38%. Compensated peer-generated mark pre-review: 60.15%

Raw peer-generated mark post-review: 59.69%. Compensated peer-generated mark post-review: 59.69%

[Chart: student mark changes during the review stage, negative and positive, with the average mark change; most values not recoverable from the transcript (recoverable entries: 67-60, 52-46)]

How to Work Out Mark (& Comment) Consistency
- A marker on average OVER-marks by 10%
- An essay is worth 60%; the marker gave it 75%, so the marker is 15% over
- We would expect 10% over, therefore the actual consistency index (difference) = 5
- If the marker on average had UNDER-marked by 10%, the difference would have been 25
- Summing and averaging these differences produces a Marking Consistency Index (low is good, high is poor)
- This can be done for all marks and comments
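A small sketch of the consistency calculation described above, reproducing the slide's worked example (names are mine, not from the CAP system).

```python
from statistics import mean

def marking_consistency(marker_marks, essay_marks, marker_bias):
    """marker_marks: {essay: mark this marker gave}
       essay_marks:  {essay: the agreed/peer mark for that essay}
       marker_bias:  how far this marker over-marks (+) or under-marks (-) on average"""
    differences = []
    for essay, given in marker_marks.items():
        over = given - essay_marks[essay]            # e.g. 75 - 60 = 15% over
        differences.append(abs(over - marker_bias))  # expected +10, so |15 - 10| = 5
    return mean(differences)                         # low is good, high is poor

# Worked example from the slide:
# marking_consistency({'essay': 75}, {'essay': 60}, marker_bias=10)  -> 5.0
# An under-marker (bias = -10) giving the same 75 would score |15 - (-10)| = 25
```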

Pre-Review vs Post-Review
[Table per student: average mark difference and mark consistency, pre- and post-review; per-student values not recoverable from the transcript]
Would hope for MARK CONSISTENCY to decrease following review. Would hope for AVERAGE MARK DIFFERENCE to decrease following review (-0.72).

Automatically Generate a Mark for Marking
- A linear scale mapped directly to consistency ... the way in HE?
- Expectation of normalised results within a particular cohort / subject / institution?
- Map to the essay grade scale achieved (better reflecting the ability of the group)?

Current 'Simple' Method
- Average mark for the essay, e.g. 55%
- Range of highest to lowest marks achieved for the essay, e.g. 45% to 70%
- Average marking consistency, e.g. 5.0
- Range of highest to lowest consistency indexes achieved
- Map the consistency range onto the essay mark range; in the example this gives 3.33 marks per consistency point worse than the average
- e.g. mark consistency = 6.0 is 1 point worse than average, so mark for marking = 55% - (1 × 3.33) = 51.66%
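A sketch of this 'simple' mapping. The slide gives the 3.33-marks-per-point figure and the 51.66% worked example, but the consistency-range endpoints are lost in the transcript: the worst index of 8.0 below follows from (55 - 45) / 3.33, the best index of 2.0 is a guess, and the better-than-average branch is an assumption that mirrors the worse-than-average case onto the upper half of the mark range.

```python
def mark_for_marking(consistency, avg_consistency=5.0, worst_consistency=8.0,
                     best_consistency=2.0, avg_mark=55.0, lowest_mark=45.0,
                     highest_mark=70.0):
    if consistency >= avg_consistency:       # worse than average (high index = poor)
        per_point = (avg_mark - lowest_mark) / (worst_consistency - avg_consistency)
        return avg_mark - (consistency - avg_consistency) * per_point
    per_point = (highest_mark - avg_mark) / (avg_consistency - best_consistency)
    return avg_mark + (avg_consistency - consistency) * per_point

# Slide example: mark_for_marking(6.0) -> 55 - 1 * 3.33... = 51.66...
```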

[Table per student: mark consistency, consistency difference from the average (5.37), mark for marking based on the range difference (average 59.69%), and the mark received for the essay; most per-student values not recoverable from the transcript. Essay marks listed: 72%, 68%, 53%, 51%, 52%, 62%, 81%, 61%, 73%, 43%, 66%]

What About the Commenting?
- The above does not take into account the quality of the commenting
- Should look at the average feedback differences per marker to get a commenting consistency grade
- Same approach as creating mark consistency
- Create a Commenting Consistency Index

[Table per student: average feedback difference, feedback consistency (average 2.80), consistency difference from the average, mark for comments based on the range difference (average 59.69%), and the mark received for the essay; per-student values not recoverable from the transcript]

Student Number | Mark for Marking | Mark for Commenting | Mark for Essay | Final Mark (60/20/20)
1  | —   | 59% | 72% | 67%
2  | 65% | 63% | 68% | 66%
3  | 81% | 59% | 53% | 60%
4  | 68% | 65% | 51% | 57%
5  | —   | —   | —   | —
6  | 53% | 75% | 52% | 57%
7  | 43% | 43% | 62% | 54%
8  | 65% | 62% | 81% | 74%
—  | —   | 57% | 61% | 60%
11 | 74% | 81% | 73% | 75%
12 | 56% | 64% | 43% | 50%
13 | 71% | 73% | 66% | 68%
(— = value not recoverable from the transcript)
Correlation between marking & commenting consistency: 0.49

[Same table as above.] Correlation between marking consistency and essay mark: 0.17

[Same table as above.] Correlation between commenting consistency and essay mark: 0.05
The final grade for the coursework takes into account the essay grade, the mark-for-marking percentage and the mark-for-commenting percentage.
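The correlation figures quoted on these slides (0.49, 0.17 and 0.05) could be reproduced with an ordinary Pearson correlation between two columns of the table; a small sketch follows (the values themselves come from the slides and are not recomputed here).

```python
from math import sqrt

# Pearson correlation between two equal-length columns, e.g.
# pearson(mark_for_marking_column, mark_for_commenting_column).
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```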

Split in Marks: 60 / 20 / 20
- Is it reasonable?
- Higher-order skills of marking: worth more?
- If we're judging the marking process on consistency, then students should be rewarded for showing consistency within marking AND commenting
- Revised split: 60 / 15 / 15 / 10
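A sketch of how the component grades combine under either split; the weightings come from the slides, the function itself is mine.

```python
def final_mark(essay, marking, commenting, consistency=0,
               weights=(0.60, 0.20, 0.20, 0.00)):
    # weights = (essay, marking, commenting, consistency); they should sum to 1
    w_essay, w_marking, w_commenting, w_consistency = weights
    return round(essay * w_essay + marking * w_marking +
                 commenting * w_commenting + consistency * w_consistency)

# Student 2 from the tables:
# final_mark(68, 65, 63)                                -> 66  (60/20/20 split)
# final_mark(68, 65, 63, 72, (0.60, 0.15, 0.15, 0.10))  -> 67  (revised 60/15/15/10 split)
```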

Student # | Mark for Marking | Mark for Commenting | Mark for Being Consistent in Marking & Commenting | Mark for Essay | Final Mark (15/15/10/60)
1  | —   | 59% | 80% | 72% | 69%
2  | 65% | 63% | 72% | 68% | 67%
3  | 81% | 59% | 43% | 53% | 57%
4  | 68% | 65% | 68% | 51% | 57%
5  | —   | —   | —   | —   | —
6  | 53% | 75% | 43% | 52% | 55%
7  | 43% | 43% | 81% | 62% | 58%
8  | 65% | 62% | 68% | 81% | 75%
—  | —   | 57% | 79% | 61% | 62%
11 | 74% | 81% | 58% | 73% | 73%
12 | 56% | 64% | 56% | 43% | 49%
13 | 71% | 73% | 74% | 66% | 69%
(— = value not recoverable from the transcript)
Correlation between final essay grade & mark for marking, commenting & consistency: 0.54
Correlation between final mark & essay grade: 0.85

[Same table as above.] IS IT WORTH THE HASSLE??

Student Comments
- Have you used peer-assessment in the past?
- How did you find self-assessment?
- Creating the comments database?
- How did they find using the CAP system and peer-assessment?
- Thoughts on the new review stage?
- Thoughts on the mark for marking?

Two Main Points to Consider
- How do we assess the time required to perform the marking task?
- What split of the marks between creation & marking?
- Definition: student or lecturer comments?

Contact Information
Dr Phil Davies, J317, Department of Computing & Mathematical Sciences, Faculty of Advanced Technology, University of Glamorgan
Phone:

THE END

General View of Peer Assessment: Lecturer or Student?
- 'Lecturers getting out of doing their jobs, i.e. marking'
- 'Good for developing student skills & employability'
- 'How can all students be expected to mark as well as experts?'
- 'Why should I mark properly and waste my time? I get a fixed mark for doing it'
- 'The feedback given by students is not of the same standard that I give'
- 'The best thing I've ever done to make me reflect'
- salary ... WANT BENEFITS NOW!! ... VERY PERSONAL

Defining Peer-Assessment
In describing the teacher: 'A tall b******, so he was. A tall thin, mean b******, with a baldy head like a light bulb. He'd make us mark each other's work, then for every wrong mark we got, we'd get a thump. That way' – he paused – 'we were implicated in each other's pain.'
McCarthy's Bar (Pete McCarthy, 2000, page 68)

Student Comments: Used peer-assessment in the past?
- None to any real degree
- A couple for staff-development-type activities

Student Comments: How did you feel about performing self-assessment?
- Very difficult
- Helped to promote critical thinking ready for the peer-assessment stage
- Made me think about how I was going to assess others

Student Comments: Creating the comments database?
- Very difficult, not knowing what comments they'd need
- Weighting really helped me create criteria ready for marking
- Could have helped to do dummy marking

Student Comments: How did they find using the CAP system and peer-assessment?
- Very positive & interesting
- Very time consuming
- Would do it better next time
- Important to maintain anonymity
- Interesting & complex: thought more about the assessment process
- Really helped student development

Student Comments: Thoughts on the new review stage?
- Liked the 2nd chance to review own marks
- Gained experience going through the process
- Didn't really take much note of peers' comments
- Liked to see that others felt the same about an essay

Student Comments: Thoughts on the mark for marking?
- Good, rewarded appropriately
- Difficult to fully understand
- Let the owner of the essay provide the mark for marking