TMA feedback: can we do better?
Mirabelle Walker
Centre for Open Learning of Mathematics, Science, Computing and Technology (COLMSCT)

FAST project findings (1)
Quantity and timing of feedback
– Sufficient feedback is provided, often enough and in enough detail
– The feedback is provided quickly enough to be useful to students
Quality of feedback
– Feedback focuses on students’ performance and learning, and on actions under their control
– Feedback is appropriate to the purpose of the assignment and to its criteria for success
– Feedback is appropriate in relation to students’ understanding of what they are supposed to be doing
Student response to feedback
– Feedback is received and attended to
– Feedback is acted upon by the student
(Gibbs & Simpson, 2004–5)

FAST project findings (2)
What students said about their TMA feedback:
– Received plenty of it
– Motivated by praise and encouragement
– Mainly received within three weeks
– Read the feedback but rarely acted on it

FAST project findings (3)

FAST project findings (4)

Idea of ‘depth’ of comment
– Depth 1 – acknowledges, e.g. ‘more needed here’; ‘good’
– Depth 2 – corrects / amplifies, e.g. ‘you needed to mention xxxx’; ‘a good introduction’
– Depth 3 – explains, e.g. ‘you needed to mention xxxx because …’; ‘a good introduction because …’

FAST project findings (5)

FAST project findings (6)
Significant weaknesses in current practice:
– Too much emphasis on justifying the grade
– Lack of shared understanding of assessment criteria (students & CT)
– ALs are good at articulating students’ weaknesses, but explaining strengths is problematic
– Lack of holistic assessment of students’ work

My project
Replicate some of the FAST investigations in Technology – are the results similar?
– Courses chosen: T173, T209 & T224
– Analysis of feedback on sample TMAs
– Telephone interviews with students
Follow-up action with ALs, monitors, CTs and STs as appropriate – and review

Progress so far
– Feedback analysed on all three courses
– Some results circulated to T209 & T224 ALs
– Some comparisons with Science made
– Telephone interviews conducted on T209 & T224; T173 next month
– Some analysis of telephone feedback done – awaiting T173 to finalise

Category percentages compared – Technology & Science

Category percentages compared – Technology courses & Science

Category percentages compared – Technology (not T209) & Science

Type of content comment compared – Technology & Science

Type of content comment compared – Technology courses & Science

Type of content comment compared within T173

Depth percentages compared – Technology & Science

Depth percentages compared – Technology courses & Science

Further comparison of depths

Number of comments per student
Mean, median, lowest and highest numbers of comments per student compared for T173, T209, T224 and Science.

Action taken on T209 and T224
Document ‘Using TMA comments to good effect’ prepared for each course
– Explains the ideas of ‘depth’ and ‘feed forward’
– Contains course-specific examples of depth 2 & depth 3 comments
Sent out in the first tutor mailing (2006) with a commendation from the Course Chair
Monitors briefed accordingly

Telephone surveys
T209 and T224 complete – carried out immediately after the end of the course
Examined:
– the student’s perception of the usefulness/helpfulness of the feedback
– whether (and how) the student had used the feedback in any later TMA, the ECA or the exam
– the student’s preferences regarding the placing of comments: on the PT3, on the script, or on a pro-forma

Preliminary findings (1)
– Students are eager to receive their marked TMAs and do read the feedback …
– … but they do not necessarily use the feedback again in the rest of the course (approx. 20% said they never used it)
– T209 students were likely to make more use of the feedback later in the course than T224 students
– T209 students particularly mentioned using skills development feedback

Preliminary findings (2)
– Not all students found that tackling the TMA, and the feedback they subsequently received, encouraged them to study the rest of the course
– A small number of students said they were disappointed with the quality of the feedback (but some were surprisingly accepting)

Preliminary findings (3)
The overwhelming majority of students value comments on the script the most:
– ‘where lost marks made clear’
– ‘tells me exactly where the mistake is’
– ‘very specific’
– ‘more evidenced against actual text’
– ‘easier seeing my work with comments relating to it’

Aside about feedback and eTMAs
– We can’t assume that students read the PT3 first – or even at all
– We can’t assume that students find and read a separate marking document sent back with the marked TMA
– Some (most?) students find juggling documents on the screen awkward or difficult
– It’s easy for tutors to place comments exactly where they apply
– Turn-around times are longer: a symptom of a problem?

Action taken on T209 and T224
– T209 and T224: students reminded to look for and read the PT3
– T209: separate pro-forma dropped for eTMAs; tutors asked to copy and paste the grids at the end of questions

FAST project findings in Technology?
Significant weaknesses in current practice:
– Too much emphasis on justifying the grade ✓
– Lack of shared understanding of assessment criteria (students & CT): varied
– ALs good at articulating students’ weaknesses; explaining strengths problematic ✓
– Lack of holistic assessment of students’ work ✓

Good feedback in Technology?
Quantity and timing of feedback
– Sufficient feedback is provided, often enough and in enough detail: in general, yes
– The feedback is provided quickly enough to be useful to students: in general, yes
Quality of feedback
– Feedback focuses on students’ performance and learning: too much is biased towards performance on this TMA
– Feedback is appropriate to the purpose of the assignment and to its criteria: purpose and criteria are often implicit
– Feedback is appropriate in relation to students’ understanding of what they are supposed to be doing: sometimes, but more explanations would be helpful
Student response to feedback
– Feedback is received by students, and attended to: yes
– Feedback is acted upon by the student: not sufficiently

Implications for Course Teams
Different sorts of questions and criteria elicit different types of feedback – or, to put it the other way round: to elicit particular feedback, write the question and criteria accordingly – maybe even write the course material accordingly

More implications for CTs
– Be explicit with ourselves, with ALs and with students about what an assignment’s purpose and criteria are
– Don’t assume that ALs will instinctively know what sort of feedback we’re hoping for – be explicit in the marking guide (or elsewhere)

Implications for Associate Lecturers
– Need for a shift towards supporting the student’s learning and progress through the course, rather than just explaining what was wrong in this particular TMA
– That implies more student-centred feedback – and more holistic feedback on PT3s
– It may also imply giving as much (or more?) emphasis to feedback as to marks

Implications for monitors
– Need to shift the emphasis from ‘Was the mark OK?’ towards ‘Was the feedback OK?’
– Need to encourage appropriate forms of feedback (and discourage inappropriate ones?)

References
FAST presentations given at the Open University, 10 February 2005.
Gibbs, G. & Simpson, C. (2004–5) ‘Conditions under which assessment supports students’ learning’, Learning and Teaching in Higher Education, 1(1), pp. 3–31; available via

Centre for Open Learning of Mathematics, Science, Computing and Technology (COLMSCT)
The Open University
Walton Hall
Milton Keynes MK7 6AA