Peer-Review/Assessment: Aid to Learning & Assessment
Phil Davies
Division of Computing & Mathematical Sciences, Department of Computing
FAT, University of Glamorgan

Defining Peer-Assessment
In describing the teacher: "A tall b******, so he was. A tall thin, mean b******, with a baldy head like a light bulb. He'd make us mark each other's work, then for every wrong mark we got, we'd get a thump. That way – he paused – 'we were implicated in each other's pain'."
McCarthy's Bar (Pete McCarthy, 2000, page 68)

Automatically create a mark that reflects the quality of an essay/product via peer marking, and also a mark that reflects the quality of the peer-marking process, i.e. a fair/reflective mark for both marking and commenting.

Below are comments given to students. Place in Top FOUR Order of Importance to YOU:
1. I think you've missed out a big area of the research
2. You've included a 'big chunk' that you haven't cited
3. There aren't any examples given to help me understand
4. Grammatically it is not what it should be like
5. Your spelling is atroceious
6. You haven't explained your acronyms to me
7. You've directly copied my notes as your answer to the question
8. 50% of what you've said isn't about the question
9. Your answer is not aimed at the correct level of audience
10. All the points you make in the essay lack any references for support

Order of Answers
Were the results all in the 'CORRECT' order? Probably not. Why not?
- Subject specific?
- Level specific – school, FE, HE?
- Teacher/Lecturer specific?
Peer-assessment is no different – objectivity through subjectivity.

Typical Assignment Process
- Students register to use the system (CAP)
- Create an essay in an area associated with the module
- Provide an RTF template of headings
- Submit via the Blackboard Digital Drop-Box
- An anonymous code is given to the essay automatically by the system (one way to do this is sketched below)
- Create comments database / categories
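The anonymous-code step is simple to mechanise. Below is a minimal sketch of one way it could be done, assuming the tutor keeps the code-to-student lookup privately; the function and field names are illustrative and are not taken from the CAP system itself.

```python
import secrets

def anonymise_submissions(submissions):
    """Give each submitted essay a random anonymous code.

    submissions: {student_id: essay_filename}
    Returns (anonymised, lookup):
      anonymised -- {code: essay_filename}, the set handed to peer markers
      lookup     -- {code: student_id}, kept privately by the tutor
    """
    anonymised, lookup = {}, {}
    for student_id, essay_file in submissions.items():
        code = secrets.token_hex(4).upper()      # e.g. '9F3A1C77'
        anonymised[code] = essay_file
        lookup[code] = student_id
    return anonymised, lookup
```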

- Each student is using a different set of weighted comments
- Comments databases are sent to the tutor

First Stage => Self-assess own work
Second Stage (button on server) => Peer-assess six essays (one possible allocation is sketched below)
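Allocating six essays to each marker so that nobody ever receives their own work can be done with a simple shuffle-and-rotate scheme. The sketch below is purely illustrative (it is not necessarily how CAP allocates work) and assumes there are more students than essays per marker.

```python
import random

def allocate_peer_marking(codes, per_marker=6, seed=None):
    """Return {marker_code: [essay codes to mark]}.

    Shuffles the anonymous codes once, then gives each marker the next
    `per_marker` codes in the cycle, so no marker receives their own essay
    (provided len(codes) > per_marker).
    """
    rng = random.Random(seed)
    order = list(codes)
    rng.shuffle(order)
    n = len(order)
    return {
        order[i]: [order[(i + k) % n] for k in range(1, per_marker + 1)]
        for i in range(n)
    }
```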

Self/Peer Assessment
- Often a Self-Assessment stage is used:
  - Set personal criteria
  - Opportunity to identify errors
  - Get used to the system
- Normally peer-mark about 5–6 essays
- A raw peer MEDIAN mark is produced (see the sketch below)
- Need for the student to receive comments + marks
- Need for a communication element?
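The raw peer mark is simply the median of the marks a piece of work received, which limits the influence of a single rogue marker. A minimal sketch, assuming the marks are collected per anonymous essay code:

```python
from statistics import median

def raw_peer_marks(marks_by_essay):
    """marks_by_essay: {essay_code: [peer marks]} -> {essay_code: median mark}."""
    return {code: median(marks) for code, marks in marks_by_essay.items()}

# Example: six peer marks for one essay
# raw_peer_marks({'9F3A1C77': [55, 62, 58, 70, 57, 60]})  ->  {'9F3A1C77': 59.0}
```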

Feedback is passed back AUTOMATICALLY; the marker remains ANONYMOUS.

The communications element
- Requires the owner of the file to 'ask' questions of the marker
- Emphasis 'should' be on the marker
- The marker does NOT see the comments of other markers who've marked the same essays
- The marker does not really get to reflect on their own marking, i.e. does not get a reflective 2nd chance
- I've avoided this in the past -> get it right first time

Feedback Index
- Produce an index that reflects the quality of commenting
- Produce a Weighted Feedback Index
- Compare how a marker has performed against these averages
- Judge the quality of marking and commenting, i.e. provide a mark for marking AUTOMATICALLY
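The slides do not give the formula behind the feedback index, so the sketch below is an assumption: each menu comment carries a numeric value, the index is the average of the values a marker attached to an essay, and the weighted variant scales each comment by the weight of its category.

```python
def feedback_index(comment_values):
    """Unweighted index: mean value of the menu comments used in one marking.

    comment_values: list of numeric values for the chosen comments.
    """
    return sum(comment_values) / len(comment_values) if comment_values else 0.0

def weighted_feedback_index(comments, weights):
    """Weighted variant: comments is a list of (category, value) pairs and
    weights maps each category to its importance."""
    total_weight = sum(weights[category] for category, _ in comments)
    if total_weight == 0:
        return 0.0
    return sum(weights[category] * value for category, value in comments) / total_weight
```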

Compensation
- High and low markers: need to take this into account
- Each essay has a 'raw' peer-generated mark – the MEDIAN
- Look at each student's marking and ascertain if 'on average' they are an under- or over-marker
- Offset the marks they gave by this value
- Create a COMPENSATED PEER MARK (sketched below)
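Compensation can be expressed as: measure each marker's average offset from the raw medians of the essays they marked, subtract that offset from every mark they gave, then re-aggregate. The sketch below assumes the compensated marks are also combined with a median, which the slides do not state explicitly.

```python
from statistics import mean, median

def marker_bias(marks_given, raw_marks):
    """Average amount by which one marker sits above (+) or below (-) the raw
    peer median of the essays they marked.  marks_given: {essay_code: mark}."""
    return mean(marks_given[code] - raw_marks[code] for code in marks_given)

def compensated_peer_marks(all_markings, raw_marks):
    """all_markings: {marker: {essay_code: mark}} -> {essay_code: compensated mark}."""
    adjusted = {}
    for marker, marks_given in all_markings.items():
        bias = marker_bias(marks_given, raw_marks)
        for code, mark in marks_given.items():
            adjusted.setdefault(code, []).append(mark - bias)  # offset by this marker's bias
    return {code: median(values) for code, values in adjusted.items()}
```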

How to work out Mark (& Comment) Consistency
- The marker on average OVER-marks by 10%
- The essay is worth 60%; the marker gave it 75%, so the marker is 15% over
- Actual consistency index (difference) = 15 - 10 = 5
- This is done for all marks and comments
- Creates a consistency factor for marking and commenting
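The worked figures on this slide translate directly into a small calculation: remove the marker's average over-marking from the mark they gave, then take the absolute difference from what the essay is worth. A sketch using the slide's numbers:

```python
def consistency_difference(mark_given, essay_worth, marker_bias):
    """Absolute difference between the bias-corrected mark and the essay's worth."""
    return abs((mark_given - marker_bias) - essay_worth)

# Slide example: over-marks by 10% on average, essay worth 60%, mark given 75%
# |(75 - 10) - 60| = 5
assert consistency_difference(75, 60, 10) == 5
```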

Marks to Comments Correlation
- Jennifer Robinson – a third of comments not useful
- Liu – holistic comments, not specific
- Davies – really good correlation between marks and comments received

[Table: distribution across mark ranges of Frequency, Mark Difference, Mark Consistency, Mark Consistency Ranges, Feedback Difference, Feedback Consistency, Feedback Consistency Ranges, Weighted Feedback Difference, Weighted Feedback Consistency and Weighted Feedback Ranges – the figures are not recoverable from this transcript]

Automatically Generate a Mark for Marking
- A linear scale mapped directly to consistency … the way in HE? (sketched below)
- Map to the essay grade scale achieved (better reflecting the ability of the group)?
- Expectation of normalised results within a particular cohort / subject / institution?
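As one illustration of the first option in the list above, a linear scale can map the consistency difference straight onto a mark for marking. The cut-off and maximum below are invented for the example; they are not the scale CAP uses.

```python
def mark_for_marking(consistency_difference, cutoff=20.0, max_mark=100.0):
    """Linear mapping: a consistency difference of 0 earns max_mark and the
    mark falls linearly to 0 at the cut-off (and beyond)."""
    if consistency_difference >= cutoff:
        return 0.0
    return max_mark * (1.0 - consistency_difference / cutoff)

# e.g. a consistency difference of 5 maps to 75.0 under this illustrative scale
```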

Current 'Simple' Method
Average marks:
- Essay mark = 57%
- Marking consistency = 5.37
Ranges:
- Essay: 79% – 31%
- Marking consistency, range above average: 22%, 3.25 (6.76 = 1)
- Marking consistency, range below average: 26%, 5.40 (4.81 = 1)

Innovation Grant Proposal
- Put the emphasis on the marker to get it right
- Give the opportunity to 'reflect' on COMMENTS before they go back to the essay owner
- A 2nd chance – not sure if I want the results to have a major effect; I hope they get it right the 1st time – consistency
- Is there a need to have discussion between markers at this stage? NO, as it is dynamic
- Will the review stage remove the need for compensation?

Used on Final Year Degree + MSc
DEGREE DCS
- 36 students on the module; 192 markings
- 25 'replaced' markings out of 192 (13%)
- Average time per peer marking = 37 minutes
- Range of time taken to do markings
- Average number of menu comments per marking = 9.8
- Raw average mark for essays = 61%
- Of the 25 markings 'replaced' (1 student replaced a marking twice), only 6 changed the mark: 6/192 (3%)
- Number of students who did replacements = 11 (out of 36); 1 student 'replaced' ALL their markings
- The 6 markings that actually changed the mark: +7, -4, -9, +3, -6, +6 (average = -0.5)

Used on Final Year Degree + MSc
MSc EL&A
- 13 students; 76 markings
- 41 replaced markings (54%)
- Average time per marking = 42 minutes
- Range of time taken to do markings: 3–72 minutes
- Average number of menu comments per marking = 15.7
- Raw average mark = 61%
- Of the 41 markings 'replaced', 26 changed the mark: 26/76 (34%)
- Number of students who did replacements = 8 (out of 13); 2 students 'replaced' ALL their markings
- The 26 markings that actually changed the mark: -1, +9, -2, -2, +1, -8, -3, -5, +2, +8, -2, +6, +18 (71-89), -1, -4, -6, -5, -7, +7, -6, -3, +6, -7, -7, -2, -5 (average = -0.2)

Current Conclusions
The results of mapping the compensated peer-marks to the average feedback indexes are very positive. Although the weighted development of the average feedback index only produces a slight improvement to an already very positive correlation, it addresses a concern that comments derived from the menu-driven system were not totally subjective.
The main concern with this method of automatically developing a mark for marking & commenting is the mapping of the consistency factors to an absolute grade. It should be kept in mind how difficult it currently is to explain to a student why they have been awarded 69% while their colleague has been awarded 71% within a traditional assessment.
Review stage -> tangible or non-tangible -> MARKS OR REFLECTION

Some Points Outstanding or Outstanding Points
- What should students do if they identify plagiarism?
- What about accessibility? Is a computerised solution valid for all?
- At what age / level can we trust the use of peer assessment?
- How do we assess the time required to perform the marking task?
- What split of the marks between creation & marking?

Contact Information
Phil Davies
J316, x2247
University of Glamorgan