Phil Denton and David McIlroy, Faculty of Science


All by myself: Do students read summative feedback returned to them alone at a computer? Phil Denton and David McIlroy, Faculty of Science

Feedback for learning The facilitation of learning is one of the reasons why assessment is undertaken.1 This relies on the return of high-quality feedback. There have been a number of qualitative studies into students’ experiences of tutor feedback: task-focussed comments are valued,2 general and vague feedback is ineffectual,3 and feedback must show students how to address their deficiencies.4 Taking the above into account, feedback should be returned within 15 days…

e-Marking Electronic marking tools, which often include a ‘statement bank’ (e.g. Grademark), can expedite marking. Staff have reported a positive impact from the increased use of such tools,5 but their effectiveness merits further research.6 Of particular interest to this study are the experiences of students receiving tutor feedback on assessments via computer. Do they read, let alone act upon, e-feedback?

Paper versus Electronic Assessment: Submission → Marking → Feedback

Students’ response to e-feedback Our previous work suggested that a large minority of first years did not read feedback on a summative report.7 At the end of emailed feedback, students were invited to reply to confirm that they had read it. For Group B, the % mark was hidden and students were invited to guess it.

Quantitative studies of responses to feedback Our previous study offered no way to differentiate between students who did not read the request for a reply and those who read it but chose to ignore it. A solution is to place the invitation at different positions within the feedback and compare response rates. “As part of an educational research project, could I please ask you to immediately reply to this email with a blank message. Please do not inform your fellow students that you have done this.”

Outcomes of two quantitative studies Study 1: First year Pharmaceutical Science data analysis exercise; feedback (~400 words) emailed. For 25 students, the request for a reply was at the start, adjacent to the mark, and 56% of these students replied. 24 students had the same request in the final paragraph, and only 42% of these replied. The % mark awarded appears to influence response, averaging 76% for repliers and 57% for non-repliers. It is not clear whether ‘word of mouth’ affected replies.

Study 2: Formative FHEQ3 Natural Sciences Excel task with brief indicative feedback, followed by ~700 words of e-feedback on the associated summative task.

Table 2 Recipients of summative e-feedback.
Group | N  | Email request position | Mean % | Median %
T1    | 27 | At start, next to mark | 63     | 65
T2    |    | End of 1st paragraph   | 64     | 68
T3    | 26 | End of 2nd paragraph   | 69     |
T4    |    | End of 3rd paragraph   |        |
T5    |    | End of 4th paragraph   |        |
C     |    | None (control group)   |        |

Analysis of response rates by Group Figure 1 Natural Science students (N = 161) who replied to feedback and who did not.

No students in the control group emailed a reply. There was no significant difference in reply rates between T2–T5 (average 36%), suggesting that when feedback was read, it was read in full. The mean reply rate for students in T1 (M = 63, SD = 49, N = 27) was significantly higher than for T2–T5 (M = 36, SD = 48, N = 107) based on a two-sample t-test for equal variances, t(132) = 2.54, p = .012. The T1 rate indicates that 63% of students who read a request will reply. For T2–T5, only 36% replied. The difference between these two rates, 27%, estimates the proportion who did not read the request and, by extension, their feedback.
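The pooled two-sample t-test on reply rates can be recomputed from the summary statistics alone. A minimal sketch in Python (stdlib only; the result differs slightly from the reported t(132) = 2.54 because the published means and SDs are rounded):

```python
import math

def pooled_t_from_stats(m1, s1, n1, m2, s2, n2):
    """Two-sample t statistic assuming equal variances, from summary statistics."""
    df = n1 + n2 - 2
    sp2 = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / df  # pooled variance
    se = math.sqrt(sp2 * (1 / n1 + 1 / n2))           # standard error of the difference
    return (m1 - m2) / se, df

# T1 vs T2-T5 reply rates (reply coded 0/100, hence the large SDs)
t, df = pooled_t_from_stats(63, 49, 27, 36, 48, 107)
print(f"t({df}) = {t:.2f}")  # ~2.60 from rounded stats; the slide reports 2.54 from raw data
```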

This comparison of response rates gives: Read, 36%; Eithera, 37%; Not read, 27%. Figure 2 Responses of Foundation Natural Science students (N = 161) to feedback on an Excel assignment. aDid not reply to the email but may have read their feedback.
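The three-way partition in Figure 2 follows from simple arithmetic on the two reply rates; a minimal check:

```python
# Reply rates observed in Study 2 (percentages)
reply_rate_if_read = 63  # T1: request sits next to the mark, so every reader saw it
reply_rate_t2_5 = 36     # request placed later in the feedback

# Students who read the request reply at the T1 rate, so the shortfall
# estimates the share who never read that far (and hence their feedback).
not_read = reply_rate_if_read - reply_rate_t2_5  # did not read the request
either = 100 - reply_rate_t2_5 - not_read        # no reply, but may have read

print(f"Read: {reply_rate_t2_5}%  Either: {either}%  Not read: {not_read}%")
```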

Analysis of response rates by % mark Over each of the 5 Test Groups, marks of students who replied ranged from 69% to 81%, average 73%. The marks of students who did not reply in each Test group ranged from 53% to 60%, average 57%. The mean actual % mark for repliers (M=73 SD=20, N= 56) was significantly higher than the mean actual % mark for non-repliers (M=57, SD=22, N= 78) using the two-sample t-test for equal variances, t(132) = 4.48, p < .0001.
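The same pooled-variance t-test can be recomputed for the marks comparison; a sketch from the reported summary statistics (again slightly off the reported t(132) = 4.48 because the means and SDs are rounded):

```python
import math

def pooled_t_from_stats(m1, s1, n1, m2, s2, n2):
    """Two-sample t statistic assuming equal variances, from summary statistics."""
    df = n1 + n2 - 2
    sp2 = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / df  # pooled variance
    se = math.sqrt(sp2 * (1 / n1 + 1 / n2))           # standard error of the difference
    return (m1 - m2) / se, df

# Actual % marks: repliers vs non-repliers
t, df = pooled_t_from_stats(73, 20, 56, 57, 22, 78)
print(f"t({df}) = {t:.2f}")  # ~4.31 from rounded stats; the slide reports 4.48
```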

Figure 3 % marks of Natural Science students who replied to their feedback and who did not.

Conclusions A quarter of 161 FHEQ3 students did not read their summative e-feedback. Between three-eighths and three-quarters did read it, from beginning to end. Two studies found that lower-performing students are less likely to respond to an emailed request: either they are less likely to read their feedback and see the request, or they read their feedback as much as their higher-performing peers but are less likely to respond upon seeing such a request.

Conclusions Alone at a PC is not necessarily a fertile location for learning, and we should use strategies that condition students to routinely engage with feedback: returning formative feedback that is linked to a summative task with the same assessment criteria; providing in-class support after students have received summative feedback (with the mark hidden?) online. Around 20 times more time was spent marking the summative Excel task than the associated formative exercise; should this be reversed?

References
1. Orsmond, P., S. Merry, and K. Reiling. 2000. “The use of student derived marking criteria in peer and self-assessment.” Assessment and Evaluation in Higher Education 25(1): 21–38.
2. Black, P., and D. Wiliam. 1998. “Assessment and Classroom Learning.” Assessment in Education: Principles, Policy and Practice 5(1): 7–74.
3. Weaver, M. 2006. “Do students value feedback? Students’ perception of tutors’ written responses.” Assessment and Evaluation in Higher Education 31: 379–394.
4. Higgins, R., P. Hartley, and A. Skelton. 2001. “Getting the message across: The problem of communicating assessment feedback.” Teaching in Higher Education 6(2): 269–274.
5. Heinrich, E., J. Milne, and M. Moore. 2009. “An Investigation into e-Tool Use for Formative Assignment Assessment – Status and Recommendations.” Educational Technology and Society 12(4): 176–192.
6. Nicol, D. J., and C. Milligan. 2006. “Rethinking technology-supported assessment in terms of the seven principles of good feedback practice.” In Innovative Assessment in Higher Education, edited by C. Bryan and K. Clegg. London: Taylor and Francis.
7. Denton, P., and P. Rowe. 2014. “Using statement banks to return online feedback: limitations of the transmission approach in a credit-bearing assessment.” Assessment and Evaluation in Higher Education: 1–9.