Heriot-Watt University The Use of Computerized Peer-Assessment Dr Phil Davies Division of Computing & Mathematical Sciences Department of Computing FAT University of Glamorgan



General View of Peer Assessment: Lecturer or Student?
– "Lecturers getting out of doing their jobs, i.e. marking"
– "Good for developing student reflection – so what? Where are the marks?"
– "How can students be expected to mark as well as 'experts'?"
– "Why should I mark 'properly' and waste my time? I get a fixed mark for doing it."
– "The feedback given by students is not of the same standard that I give. I still have to do it again myself to make sure they marked 'properly'."

Defining Peer-Assessment In describing the teacher: "A tall b******, so he was. A tall, thin, mean b******, with a baldy head like a light bulb. He'd make us mark each other's work, then for every wrong mark we got, we'd get a thump. That way" – he paused – "we were implicated in each other's pain." McCarthy's Bar (Pete McCarthy, 2000, page 68)

What will make Computerized Peer-Assessment ACCEPTABLE to ALL? AUTOMATICALLY CREATE A MARK THAT REFLECTS THE QUALITY OF AN ESSAY/PRODUCT VIA PEER MARKING, PLUS A MARK THAT REFLECTS THE QUALITY OF THE PEER-MARKING PROCESS, i.e. A FAIR/REFLECTIVE MARK FOR MARKING AND COMMENTING

THE FIRST CAP MARKING INTERFACE

Typical Assignment Process
– Students register to use the system (CAP)
– Create an essay in an area associated with the module
– Provide an RTF template of headings
– Submit via the Blackboard Digital Drop-Box
– An anonymous code is given to the essay automatically by the system
– Use the CAP system to mark
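The anonymisation step in the process above can be sketched as follows. This is a hypothetical illustration, not CAP's actual code: the function name `assign_code` and the code format are invented for the example.

```python
# Hypothetical sketch of the submission step: each essay received via the
# digital drop-box is given a random anonymous code, so markers never see
# the author's identity. Names and code format are illustrative only.
import secrets

def assign_code(registry: dict, author: str) -> str:
    """Generate a unique anonymous code for one essay and record the
    code-to-author mapping (kept by the tutor, hidden from markers)."""
    while True:
        code = f"ESSAY-{secrets.token_hex(3).upper()}"
        if code not in registry:  # retry on the (unlikely) collision
            registry[code] = author
            return code

registry = {}
code = assign_code(registry, "student_42")
```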

Self/Peer Assessment
Often a Self-Assessment stage is used:
– Set personal criteria
– Opportunity to identify errors
– Get used to the system
Normally peer-mark about 5 or 6 essays
Raw peer MEDIAN mark produced
Need for student to receive Comments + Marks

Compensation: High and Low Markers
– Need to take this into account
– Each essay has a 'raw' peer-generated mark – the MEDIAN
– Look at each student's marking and ascertain whether, on average, they are an under- or over-marker
– Offset the mark they gave by this value
– Create a COMPENSATED PEER MARK
It's GOOD TO TALK – Tim Nice but Dim
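The compensation scheme above can be sketched in a few lines. This is a minimal illustration of the idea (a marker's bias is their average over- or under-marking, subtracted before taking the median); the function names and the choice of reference mark are assumptions, not CAP's actual implementation.

```python
from statistics import median

def marker_bias(marks_given, reference_marks):
    """Average amount by which one marker over- (+) or under- (-) marks,
    relative to a reference mark for each essay they marked."""
    diffs = [given - ref for given, ref in zip(marks_given, reference_marks)]
    return sum(diffs) / len(diffs)

def compensated_peer_mark(raw_peer_marks, marker_biases):
    """MEDIAN of the peers' marks after offsetting each peer's bias."""
    return median(m - b for m, b in zip(raw_peer_marks, marker_biases))

# A habitual over-marker's 70 and a habitual under-marker's 55
# both adjust to 60 before the median is taken.
mark = compensated_peer_mark([70, 55, 62], [10, -5, 0])
```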

THE MARKER IS ANONYMOUS

Below are comments given to students. Select the 3 most important to YOU:
1. I think you've missed out a big area of the research
2. You've included a 'big chunk' – word for word – that you haven't cited properly
3. There aren't any examples given to help me understand
4. Grammatically it is not what it should be like
5. Your spelling is atroceious
6. You haven't explained your acronyms to me
7. You've directly copied my notes as your answer to the question
8. 50% of what you've said isn't about the question
9. Your answer is not aimed at the correct level of audience
10. All the points you make in the essay lack any references for support

Order of Answers
– Were the results all in the 'CORRECT' order? Probably not -> Why not?
– Subject specific? Level specific (school, FE, HE)? Teacher/Lecturer specific?
– Peer-Assessment is no different – Objectivity through Subjectivity
– Remember: feedback comments are as important as marks!
– Students need to be rewarded for marking and commenting WELL -> QUANTIFY COMMENTS

Each student is using a different set of weighted comments
Comments databases sent to tutor

First Stage => Self-assess own work
Second Stage (button on server) => Peer-assess 6 essays
Comments – both positive and negative – in the various categories
Provides a Subjective Framework for Commenting & Marking

Feedback Index
– Produce an index that reflects the quality of commenting
– Produce a Weighted Feedback Index
– Compare how a marker has performed against these averages per essay, for both marking and commenting – looking for consistency
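One plausible reading of the Feedback Index slide, sketched in code: a marker's weighted feedback index is the summed weights of the menu comments they selected, and commenting consistency is measured as deviation from the per-essay average index. Both formulas are assumptions for illustration; the slide does not give CAP's exact definitions.

```python
def weighted_feedback_index(selected_comment_weights):
    """Index for one marking: the sum of the weights of the menu
    comments this marker selected (assumed definition)."""
    return sum(selected_comment_weights)

def commenting_consistency(marker_indices, essay_average_indices):
    """Mean absolute deviation of a marker's feedback index from the
    average index all markers produced for the same essay.
    Lower values = more consistent commenting."""
    diffs = [abs(m - a) for m, a in zip(marker_indices, essay_average_indices)]
    return sum(diffs) / len(diffs)
```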

The Review Element
– Requires the owner of the file to 'ask' questions of the marker
– Emphasis 'should' be on the marker
– The marker does NOT see the comments of other markers who've marked the same essays
– The marker does not really get to reflect on their own marking – does not get a reflective 2nd chance
– I've avoided this in the past -> get it right first time

Used on Final Year Degree + MSc
MSc EL&A: 13 students, 76 markings
– 41 replaced markings (54%)
– Average time per marking = 42 minutes
– Range of time taken to do markings: 3-72 minutes
– Average number of menu comments per marking = 15.7
– Raw average mark = 61%
– Of the 41 markings 'replaced', 26 changed the mark: 26/76 (34%)
– Number of students who did replacements = 8 (out of 13)
– 2 students 'replaced' ALL of their markings
– The 26 changed marks: -1, +9, -2, -2, +1, -8, -3, -5, +2, +8, -2, +6, +18 (71 to 89), -1, -4, -6, -5, -7, +7, -6, -3, +6, -7, -7, -2, -5 (Avge -0.2)

How to Work Out Mark (& Comment) Consistency
– Marker on average OVER-marks by 10%
– Essay worth 60%; marker gave it 75%
– Marker is 15% over on this essay
– Actual consistency index (difference) = 5
– This can be done for all marks and comments
– Creates a consistency factor for marking and commenting
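The worked example above is a direct piece of arithmetic (the marker usually over-marks by 10, gave 75 to an essay worth 60, so the consistency index is |15 - 10| = 5). A minimal transcription of that calculation; the function name is mine, not the slide's:

```python
def consistency_index(mark_given, true_mark, average_bias):
    """Distance between this marking's deviation and the marker's usual
    over/under-marking bias; 0 means perfectly consistent."""
    deviation = mark_given - true_mark       # e.g. 75 - 60 = +15
    return abs(deviation - average_bias)     # e.g. |15 - 10| = 5
```
Note that a habitual under-marker who under-marks as usual also scores as consistent: the index rewards predictability, not leniency or severity.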

Automatically Generate Mark for Marking
– Linear scale mapped directly to consistency... the way in HE?
– Map to the essay grade scale achieved (better reflecting the ability of the group)?
– Expectation of normalised results within a particular cohort / subject / institution?

Current 'Simple' Method
Average Marks:
– Essay Mark = 57%
– Marking Consistency = 5.37
Ranges:
– Essay: 79% to 31%
– Marking Consistency Range: Above Avge 22%, 3.25 (6.76=1); Below Avge 26%, 5.40 (4.81=1)

ALT-J journal article entitled 'Don't Write, Just Mark: The Validity of Assessing Student Ability via their Computerized Peer-Marking of an Essay rather than their Creation of an Essay', ALT-J (CALT) Vol. 12 No. 3, pp
– Took a risk: no necessity to write an essay
– Judged against previous essays from the past – mark and feedback index already known
– NO PLAGIARISM opportunity
– Worked really well

Some Points Outstanding, or Outstanding Points
– What should students do if they identify plagiarism?
– What about accessibility?
– Is a computerised solution valid for all subject areas?
– At what age / level can we trust the use of peer assessment?
– How do we assess the time required to perform the marking task?
– What split of the marks between creation & marking?

Summary
– Research / meeting pedagogical needs / improving the relationship between assessment & learning – keep asking yourself WHY & WHAT am I assessing?
– DON'T LET THE TECHNOLOGISTS DRIVE THE ASSESSMENT PROCESS!! i.e. lunatics taking over the asylum
– Don't let the 'artificial' need to adhere to standards & large-scale implementation of on-line assessment be detrimental to the assessment and learning needs of you and your students. 'Suck it and see.'
– By using composite e-assessment methods we are then able to assess the full range of learning + assessment
– Favourite story to finish

Personal Reference Sources
– CAA Conference Proceedings
– 'Computerized Peer-Assessment', Innovations in Education and Training International (IETI), 37, 4, pp , Nov 2000
– 'Using Student Reflective Self-Assessment for Awarding Degree Classifications', Innovations in Education and Training International (IETI), 39, 4, pp , Nov
– 'Closing the Communications Loop on the Computerized Peer Assessment of Essays', ALT-J, 11, 1, pp 41-54
– 'Peer-Assessment: No Marks Required, Just Feedback', ALT-C 2003 Research stream paper, Sheffield University, Sept
– 'Don't Write, Just Mark: The Validity of Assessing Student Ability via their Computerized Peer-Marking of an Essay rather than their Creation of an Essay', ALT-J (CALT), Vol. 12, No. 3, pp
– 'Peer-Assessment: Judging the Quality of Student Work by the Comments not the Marks?', Innovations in Education and Teaching International (IETI), 43, 1, pp 69-82, 2006

Contact Information Phil Davies University of Glamorgan