1 Phil Davies School of Computing University of Glamorgan “Super U” The Automatic Generation of ‘Marks for Marking’ within the Computerised Peer-Assessment of Essays.


1 Phil Davies School of Computing University of Glamorgan “Super U” The Automatic Generation of ‘Marks for Marking’ within the Computerised Peer-Assessment of Essays

2 Computerised Peer-Assessment  CAP System  Permits students to mark and comment on the work of other students  Also self-assess … used as a standard of expectation  An Internet-based, not Web-based, system (developed in Visual Basic / Access)

3 (image-only slide)

4 Having done the marking, what next?  Students should receive feedback  What feedback? –Marks –Comments  Which is most important?

5 (image-only slide)

6 (image slide; callouts: feedback is returned AUTOMATICALLY to THE MARKER, and marking is ANONYMOUS)

7 What should the marker do? Reflect  Look at essay again  Take into account the essay owner’s comments  Further clarification (if it is needed, then is this a ‘black mark’ against the marker?)  Try to ‘appease’ the essay owner?  Modify mark based upon reflection?  Give more feedback

8 (image-only slide)

9 Markers must be rewarded for the ‘mark for marking’ process  How to judge?  Standard of expectation (self-assessment)  Marking consistency  Commenting quality, measured against the mark  Discussion element  Need for additional comments: a black mark?  Reaction to requests for further clarification

10 Does this student deserve a good mark?

   Essay   Compensated Median Mark   Student Mark   Difference
   W       65%                       67%            +2%
   X       48%                       54%            +6%
   Y       53%                       57%            +4%
   Z       61%                       59%            -2%

   +2% on average

11 Standard of Expectation: self-assessed = 68%; peer-assessed = 58%

   Essay   Compensated Median Mark   Student Mark   Difference
   W       65%                       67%            +2% (8)
   X       48%                       54%            +6% (4)
   Y       53%                       57%            +4% (6)
   Z       61%                       59%            -2% (12)

   Mean (x̄)                                         +2% (7.5)
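The compensation and difference figures can be sketched in Python. The slides do not give CAP’s actual formula, so the `compensated_median` helper and its `weight` parameter are illustrative assumptions, not the system’s method:

```python
from statistics import median

def compensated_median(peer_marks, self_mark, weight=0.25):
    """Median of the peer marks for an essay, nudged toward the owner's
    self-assessed 'standard of expectation'. The weight is a hypothetical
    tuning parameter, not a value taken from the CAP system."""
    return (1 - weight) * median(peer_marks) + weight * self_mark

# Slide 10's table: for each essay this marker assessed, the class's
# compensated median mark and the mark this marker actually gave.
table = {"W": (65, 67), "X": (48, 54), "Y": (53, 57), "Z": (61, 59)}
diffs = {essay: given - cm for essay, (cm, given) in table.items()}
mean_diff = sum(diffs.values()) / len(diffs)  # positive: marks above consensus
```

A small positive mean difference, as here, suggests a marker who is slightly generous but broadly in line with the class consensus.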

12 (image-only slide)

13 How easy is it to do?  Statistically, it is fairly easy to create a mark for marking based upon marks  Take into account high and low markers  Standard of expectation  Consistency … judged against the final mark awarded for an essay (the compensated median)  What about the comments?
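A minimal sketch of combining those ingredients into a mark for marking. The function and its penalty weights (0.5 and 0.2) are assumptions for illustration only, not the system’s actual formula:

```python
def mark_for_marking(given, medians, self_mark, peer_mark_of_own, max_mark=10.0):
    """Start from full marks and penalise (a) inconsistency with the
    compensated medians (mean absolute deviation) and (b) a self-assessment
    that departs from how peers rated the marker's own essay
    (the 'standard of expectation'). Weights are illustrative only."""
    mad = sum(abs(g - m) for g, m in zip(given, medians)) / len(given)
    expectation_gap = abs(self_mark - peer_mark_of_own)
    return max(0.0, max_mark - 0.5 * mad - 0.2 * expectation_gap)

# Marker from slide 11: self-assessed at 68%, but peers rated the essay 58%.
score = mark_for_marking([67, 54, 57, 59], [65, 48, 53, 61], 68, 58)
```

A perfectly consistent marker with an accurate self-assessment would keep the full mark; large deviations on either count drag the score down.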

14 Feedback Index  Produce an index that reflects the quality of commenting  Produce an average feedback index for each essay  Compare each marker against it, in a similar manner to the marks analysis  Where does this feedback index come from, and is it valid?
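Assuming each comment has already been given a numeric feedback index, the marks-style comparison can be mirrored for comments. A sketch under that assumption (the function name and scale are hypothetical):

```python
def marker_comment_score(marker_indices, essay_avg_indices):
    """Mean signed gap between the marker's feedback index on each essay
    and the average index all markers achieved on that essay.
    Positive means richer-than-average feedback."""
    gaps = [m - a for m, a in zip(marker_indices, essay_avg_indices)]
    return sum(gaps) / len(gaps)

# Marker scored 8 and 6 (out of 10) on essays whose class averages were 7 and 7.
balance = marker_comment_score([8, 6], [7, 7])  # averages out to 0.0
```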

15 How to get the feedback index?  Develop an application? (C-Rater?)  Detect spelling mistakes  Recognise similar meanings? –That was cool –Really choc –Really good essay  Or score the comments manually
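Recognising that “That was cool” and “Really good essay” carry no specific feedback is the hard part. A deliberately crude heuristic shows the idea; everything here, including the vague-phrase list and the 30-word cap, is an illustrative assumption, and C-Rater-style semantic matching would be needed to do this properly:

```python
def feedback_index(comment, vague=("cool", "really good", "nice")):
    """Score a comment from 0 to 1: vague praise scores low,
    longer and more specific comments score higher."""
    words = comment.lower().split()
    if not words:
        return 0.0
    if any(phrase in comment.lower() for phrase in vague):
        return 0.1  # content-free praise, regardless of length
    return min(len(words) / 30.0, 1.0)  # saturate at 30 words
```

Even this naive scorer separates “That was cool” from a comment that names a section and says what is wrong with it, which is the distinction the manual approach relies on.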

16–23 (image-only slides)

24 (image slide: worked example for student ‘DAI SMITH’, showing per-essay marks out of 10, differences from the actual F/Index and mark scores, and a final mark awarded of 5/10)

25 Time Consuming?  Can we formalise the marking process?  Take away the need for a quantification process when analysing comments  Is it still peer-assessment if the students are told what to say?

26 (image-only slide)

27 (image slide: structured feedback form for student ‘FRED’, with positive references 5, 3, 2, 1, negative references 3, 1, 2, and a personal valuation)

28 Some points outstanding  What should students do if they identify plagiarism?  Is it ethical to get students to mark the work of their peers?  Is a computerised solution valid for all?  At what age / level can we trust the use of peer assessment?  How do we assess the time required to perform the marking task?

29 The Automatic Generation of ‘Marks for Marking’ within the Computerised Peer-Assessment of Essays  Student Acceptance:  Anonymity / Discussion / High & Low Markers  Tangible Reward  I’m more concerned with rewarding ‘higher order skills’  Getting there