Computerised Peer-Assessment that supports the rewarding of evaluative skills in Essay Writing (C.A.P.) & Programming (Coursemarker) Phil Davies & Stuart Lewis School of Computing University of Glamorgan

Need for Assessment? As tutors we are trying to “separate” the sheep from the goats via the assessment process. This can often be difficult with the time constraints imposed on tutors, in what Chanock describes as “more goat-friendly times” (Chanock, 2000). Problem …. Feedback against time!

Defining Peer-Assessment In describing the teacher.. A tall b******, so he was. A tall thin, mean b******, with a baldy head like a lightbulb. He’d make us mark each other’s work, then for every wrong mark we got, we’d get a thump. That way – he paused – ‘we were implicated in each other’s pain’ McCarthy’s Bar (Pete McCarthy, 2000, page 68)

What functionality do we require in a computerised peer-assessment system?
- A method to peer-mark & COMMENT
- A method to allow students to view comments
- A method to permit anonymous conversation
- A method to take into account high/low markers … fair to all
- A method to qualitatively assess the marking & commenting processes (higher order skills)
- A method to permit a student to make comments that are understandable (framework) to them and to the owner
- Security / recognise and avoid plagiarism / flexibility

AUTOMATICALLY THE MARKER.. ANONYMOUS

Students must be rewarded for doing the ‘marks for marking’ process, based on quality. How to judge?
- Standard of expectation (self-assessment)
- Marking consistency
- Commenting quality, measured against the mark
- Discussion element: need for additional comments – a black mark? Reaction to requests / further clarification

Feedback Index
- Produce an index that reflects the quality of commenting
- Produce an average feedback index for an essay (also compensated?)
- Compare against the marker in a similar manner to the marks analysis
- Where does this feedback index come from, and is it valid?

CAA Conference 2003 Future Work It should be noted that students marking work tend to use only a subset of these comments, and their feedback suggests they give a different weighting to each comment when commenting on the quality of an essay.

Exercise
1. I think you’ve missed out a big area of the research
2. You’ve included a ‘big chunk’ that you haven’t cited
3. There aren’t any examples given to help me understand
4. Grammatically it is not what it should be like
5. Your spelling is atrocious
6. You haven’t explained your acronyms to me
7. You’ve directly copied my notes as your answer to the question
8. 50% of what you’ve said isn’t about the question

Each student uses a different subset of these comments … these new weightings MAY give a better feedback index? Currently being evaluated.
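A minimal sketch of how such a weighted feedback index could be computed, assuming the comment categories from the exercise above. The weights, function names and normalisation are hypothetical placeholders for illustration, not the weightings actually being evaluated:

```python
# Illustrative only: categories follow the exercise above; weights are placeholders.
COMMENT_WEIGHTS = {
    "missed major research area": 1.0,
    "uncited 'big chunk'": 0.9,
    "no examples given": 0.6,
    "poor grammar": 0.4,
    "poor spelling": 0.3,
    "acronyms not explained": 0.4,
    "copied directly from notes": 1.0,
    "half the answer off-topic": 0.8,
}

def feedback_index(comments_used):
    """Score a marker's commenting as the summed weight of the standard
    comments they selected, normalised to the 0..1 range."""
    total = sum(COMMENT_WEIGHTS[c] for c in comments_used)
    return total / sum(COMMENT_WEIGHTS.values())

def essay_feedback_index(indices_per_marker):
    """Average feedback index for one essay across all of its markers."""
    return sum(indices_per_marker) / len(indices_per_marker)

# Example: a marker who used three of the standard comments.
print(feedback_index(["missed major research area", "no examples given", "poor spelling"]))
```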

Is it my job to teach students how to write essays, etc.?
- Assessment MUST be directed at subject skills
- Why bother writing essays, doing exam questions, etc.? … doesn’t relate to the needs or learning outcomes of the subject
- Post HND … N-tier … assess the essays of the final year (last year)
- Preparation/Research: judge knowledge against last year’s results .. both marks & comments
- Mistake!!

e.g. a group of marking differences +4, -22, +16, -30, +8, +12 gives an average difference of -12 / 6 = -2 (taking 0 as the expected difference). The absolute differences from this value of -2 are 6, 20, 18, 28, 10, 14, giving an average consistency valuation of 16 (96/6). This shows poor consistency by this marker. Compare this with a student whose marking differences were +4, -4, -10, -8, +6, 0. The average difference for this marker is again -2 (-12/6), but the absolute differences from -2 are 6, 2, 8, 6, 8, 2, giving a consistency valuation of 5.33 (32/6). This student deserves much more credit for their marking, even though the average difference of the two sets of markings was the same. The effect of a student always marking high or low is removed, because it is the absolute deviation from their own average difference that is being compared.
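A minimal sketch of this consistency calculation, reproducing the two markers’ numbers above (the function name is illustrative, not taken from the C.A.P. system):

```python
def consistency_valuation(marking_differences):
    """Average the signed differences between a marker's marks and the
    reference marks, then take the mean absolute deviation from that
    average. Lower values indicate a more consistent marker."""
    avg_diff = sum(marking_differences) / len(marking_differences)
    deviations = [abs(d - avg_diff) for d in marking_differences]
    return avg_diff, sum(deviations) / len(deviations)

# The two markers from the example: both average -2, very different consistency.
print(consistency_valuation([+4, -22, +16, -30, +8, +12]))  # (-2.0, 16.0)
print(consistency_valuation([+4, -4, -10, -8, +6, 0]))      # (-2.0, 5.33...)
```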

Marks | Marking Difference | Feedback Difference | Mapping for MCQ & Essays
5     | < 4                | < 2                 | 90% - 100%
4     | < 8                | < 4                 | 80% - 89%
3     | < 12               | < 6                 | 60% - 79%
2     | < 16               | < 8                 | 40% - 59%
1     | < 20               | < 10                | 20% - 39%
0     | 20 or more         | 10 or more          | 0% - 19%

Who benefited the most by doing this exercise? Cured plagiarism?
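The banding above can be read as a simple lookup. The sketch below reproduces the band limits from the table; how the marking-based and feedback-based marks are combined (here, a simple average) is an assumption for illustration, not a rule stated on the slide:

```python
def band_mark(value, limits=(4, 8, 12, 16, 20)):
    """Convert an absolute difference into a 0-5 mark using band limits."""
    for mark, limit in zip((5, 4, 3, 2, 1), limits):
        if value < limit:
            return mark
    return 0

marking_mark = band_mark(5.3)                            # marking-difference band: 4
feedback_mark = band_mark(1.5, limits=(2, 4, 6, 8, 10))  # feedback-difference band: 5
combined = (marking_mark + feedback_mark) / 2            # assumption: simple average
print(marking_mark, feedback_mark, combined)
```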

Can the same principles be applied in other subject areas? Java Programming with CourseMarker (Stuart Lewis’ idea)
- Students create a solution to a programming assignment
- Submission(s)
- Peer-evaluate other solutions
- Comments … Marks for Marking (weightings)

[CourseMarker architecture diagram. The CourseMarker Core links a CM File Storage System, Marking System, Evaluation System, Exercise Development System and Student Exercise Environment. Teachers set up exercises (assignments, notes, questions, test methods, solution template, marking scheme); students edit, compile, link, run and submit, receiving feedback and a mark, their position in class and course statistics, with flagging-up of ‘problem cases’ and immediate support. For teachers the advantages are re-usability, fair automated marking that frees time and a plagiarism check, against a steep learning curve and difficult setup (but it’s getting easier); for students, immediate feedback and fast support, against additional overheads. Runs on UNIX (Linux), Windows and Mac; assesses text I/O assignments only (no marking of graphical output); remote student/teacher access supports distance learning, open all hours; marks Modula-2, Java and C.]

PeerMarker Screen

Student while marking
- Exposure to different solutions
- Development of critical evaluative skills
- Useful experience of reading code for future employment situations
- Plagiarism? … Good solution / no understanding

Student while reviewing feedback from peers
- Range of subjective marking
- Confirmation of objective automated marking
- Anonymous discussion between marker and marked

Current position
- Test system working
- Changes following beta test in progress
- Plans to try the sample study again (at a more convenient time, and with added rewards!)
- Employed a 2nd placement student for the graphical interface

Some Points Outstanding (or Outstanding Points)
- What should students do if they identify plagiarism?
- Is it ethical to get students to mark the work of their peers?
- Is a computerised solution valid for all?
- At what age / level can we trust the use of peer assessment?
- How do we assess the time required to perform the marking task?
- What split of the marks between creation & marking?
BEST STORY

Contact Information Phil Davies / Stuart Lewis School of Computing University of Glamorgan Innovations in Education & Teaching International ALT-J