Improving student learning through assessment and feedback Graham Gibbs University of Winchester

About TESTA
- NSS scores for assessment and feedback worse than for any other aspect of student experience, and improving slowest
- TESTA started four years ago at Winchester + three other universities
  – a way of thinking about assessment
  – a way of evaluating the impact of assessment
- Now 23 universities, and counting...
- TESTA can improve NSS scores

Law degree programme using TESTA to improve its assessment regime

NSS question            Before   After   Increase
Clear criteria           37%      96%      59%
Fair assessment          50%      87%      37%
Prompt feedback          73%      96%      23%
Comments detailed        56%      79%      23%
Feedback helped          42%      79%      37%
Overall satisfaction     48%      96%      37%

University of Winchester

NSS question 7: "Feedback on my work has been prompt"
Before: 47% (bottom quartile nationally)
After: 72% (top quartile nationally)
Increase: 25%

NSS assessment scores nationally
- Average increase annually: <1%
- Winchester increases: 20%-60%
- Edinburgh scores stuck at 39% despite a 2-year initiative... now part of a Russell Group consortium using TESTA
- First three degree programmes at Winchester to use TESTA now ranked 1st nationally in their discipline for student satisfaction (two of them at 100%)

Chapter 1... in which we encounter a mystery

Undergraduate degree programme
- Committed and innovative teachers
- Most marks from coursework, of very varied forms
- Few exams
- Lots of written feedback on assignments: 15,000 words
- Learning outcomes and criteria clearly specified
...looks like a 'model' assessment environment, but it does not support learning well.

Students:
- Don't study many hours, and they distribute their effort across few topics and weeks
- Don't think there is a lot of feedback or that it is very useful, and don't make use of it
- Don't think it is at all clear what the goals and standards are
- ...are unhappy!

Chapter 2... in which we discover that students respond to our assessment in ways that reduce their learning

“I just don’t bother doing the homework now. I approach the courses so I can get an ‘A’ in the easiest manner, and it’s amazing how little work you have to do if you really don’t like the course.” Snyder, B.R. (1971) The Hidden Curriculum, at MIT

“I am positive there is an examination game. You don’t learn certain facts, for instance, you don’t take the whole course, you go and look at the examination papers and you say ‘looks as though there have been four questions on a certain theme last year, and the professor said that the examination would be much the same as before’, so you excise a good bit of the course immediately…” Miller, C.M.I. & Parlett, M. (1974) ‘Up to the Mark - a study of the examination game’, at Edinburgh

“One course I tried to understand the material and failed the exam. When I took the resit I just concentrated on passing and got 98%. My tutor couldn’t understand how I failed the first time. I still don’t understand the subject so it defeated the object, in a way” Gibbs, G. (1992) Improving the quality of student learning.

“ The tutor likes to see the right answer circled in red at the bottom of the problem sheet. He likes to think you’ve got it right first time. You don’t include any workings or corrections – you make it look perfect. The trouble is when you go back to it later you can’t work out how you did it and you make the same mistakes all over again” Gibbs, G. (1992) Improving the quality of student learning.

Chapter 3... in which teachers use assessment methods to make students learn

The case of the Psychologist
The case of the Philosopher of Education
The case of the Doctor
The case of the Architect
The case of the Pharmacist
The case of the Engineer
The case of the Businessman
The case of the Geographer

The case of the engineer
- Weekly lectures, problem sheets and classes
- Marking impossible
- Problem classes large enough to hide in
- Students didn't tackle the problems
- Exam marks dropped from 55% to 45%

The case of the engineer
- Course requirement to complete 50 problems
- Peer assessed in six 'lecture' slots
- Marks do not count
- Lectures, problems, classes, exams unchanged

The case of the engineer
- Course requirement to complete 50 problems
- Peer assessed in six 'lecture' slots
- Marks do not count
- Lectures, problems, classes, exams unchanged
- Average exam marks increased from 45% to 85%

Why did it work?

The case of the engineer
- time on task
- social learning and peer pressure
- timely and influential feedback
- learning by assessing
  – error spotting
  – developing judgement
  – self-supervision
- 'meta-cognitive awareness and control'

Chapter 4... in which teachers give students feedback, but they don't read it

Why don't students read feedback?
- It is very poor quality feedback: little of it, random, insensitive, unreadable...
- It also has marks on it
- It does not make any sense
- It relates to goals and standards students don't understand
- It is too late to be useful
- It is about a topic they will never study again
- It is about how to tackle an assignment that is different from their next assignment... it does not feed forwards

Chapter 5 The story so far...

Assessment supports student learning when...
- It captures sufficient student time and effort, and distributes that effort evenly across topics and weeks – time on task
- It engages students in high quality learning effort, focussed towards the goals and standards of the course, which students understand – engagement, deep approach
- It provides enough high quality feedback, in time to be useful, that focuses on learning rather than on marks or on the student
- Students pay attention to the feedback and use it to guide future studying (feedforwards) and to learn to 'self-supervise'

Changing assessment at module level
- Micro-level tactics to address weaknesses in the support of learning (e.g. low student effort)
- Gibbs, G. (2009) Using Assessment to Support Student Learning. Online at Leeds Metropolitan University
- Teachers have difficulty making assessment very different on their module if no-one else changes...

Chapter 6... in which we discover differences between programmes in the way students experience assessment

AEQ: Assessment Experience Questionnaire
– Quantity and distribution of effort
– Quality, quantity and timeliness of feedback
– Use of feedback
– Impact of exams on quality of learning
– Quality of study effort: deep and surface approach
– Clarity of goals and standards
– Appropriateness of assessment

Chapter 7... in which we discover what kind of assessment regime leads to good (or poor) learning

Characteristics of programme-level assessment environments
- % marks from examinations (or coursework)
- Volume of summative assessment
- Volume of formative-only assessment
- Volume of (formal) oral feedback
- Volume of written feedback
- Timeliness: days after submission before feedback is provided

Range of characteristics of programme-level assessment environments
- % marks from exams: 0% – 100%
- number of times work is marked: 11 – 95
- variety of assessment:
- number of times of formative-only assessment: 0 – 134
- number of hours of oral feedback: 1 – 68
- number of words of written feedback: 2,700 – 15,412
- turn-round time for feedback: 1 day – 28 days

Patterns of assessment features within programmes
- Every programme that has much summative assessment has little formative assessment, and vice versa
- No examples of much summative assessment and much formative assessment
- There may be enough resources to mark student work many times, or to give feedback many times, but not enough resources to do both
- ...it is clear which leads to most learning

Relationships between assessment characteristics and student learning
A wide variety of assessment, much summative assessment, little formative-only assessment, little oral feedback and slow feedback are all associated with a negative learning experience:
- less of a deep approach
- less coverage of the syllabus
- less clarity about goals and standards
- less use of feedback

Relationships between assessment characteristics and student learning
Much formative-only assessment, much oral feedback and prompt feedback are all associated with a positive learning experience:
- more effort
- more of a deep approach
- more coverage of the syllabus
- greater clarity about goals
- more use of feedback
- more overall satisfaction...
...even when they are also associated with a lack of explicitness of criteria and standards, a lack of alignment of goals and assessment, and a narrow range of assessment.

Why?
- Being explicit does not result in students being clear... but discussing exemplars does
- Explicitness helps students to be 'selectively negligent'
- Students experience varied forms of assessment as confusing: ambiguity + anxiety = surface approach
- Variety means feedback can't feed forwards
- Variety means students have little opportunity to get better at anything
- Feedback improves learning most when there are no marks
- There is more time to give feedback when you don't have to mark
- It is possible to turn feedback round quickly when there are no QA worries about marks (or cheating)

Chapter 8... in which we solve the mystery

Assessment case study: what is going on?
- Lots of coursework, of very varied forms (lots of innovation)
- Very few exams
- Masses of written feedback on assignments
- Four-week turn-round of feedback
- Learning outcomes and criteria clearly specified
...looks like a 'model' assessment environment.

But students:
- Don't put in a lot of effort, and distribute their effort across few topics
- Don't think there is a lot of feedback or that it is very useful, and don't make use of it
- Don't think it is at all clear what the goals and standards are

Assessment case study: what is going on?
- All assignments are marked, and they are all that students spend any time on. It is not possible to mark enough assignments to keep students busy. There are no exams or other unpredictable demands to spread effort across topics, and almost no required formative assessment.
- Teachers are all assessing something interestingly different, but this utterly confuses students. There is far too much variety and no consistency between teachers about what criteria mean or what standards are being applied.
- Students never get better at anything because they don't get enough practice at each thing.
- Feedback is no use to students, as the next assignment is completely different.
- Four weeks is much too slow for feedback to be useful, and results in students focussing on marks.

Assessment case study: what to change?
- Reduce the variety of assignments and plan progression across three years for each type, with invariant criteria and many exemplars of different quality for each type of assignment
- Increase formative assessment: dry runs at what will later be marked, sampling for marking
- Reduce summative assessment: one per module is enough (24 in three years), or longer/bigger modules
- Separate formative assessment and feedback from summative assessment and marks... give feedback quickly and marks later, or feedback on drafts and marks on the final submission
- Teachers to accept that the whole is currently less than the sum of the parts (and current feedback effort is largely wasted) and give up some autonomy within modules for the sake of students' overall experience of the programme – so that teachers' effort is worthwhile

Chapter 9... in which we demonstrate improvements in student learning

Case Study 2
- Whole-programme shift from summative to formative assessment
- Linking a narrower variety of types of assignment across modules so feedback can feed in to future modules
- New 'core team' to handle increased face-to-face feedback
- Parallel changes by individual teachers: e.g. self-assessment
- Reduction in exams, abandoned most MCQs

Impact: AEQ scores moving in the right direction for:
- Quantity and quality of feedback
- Use of feedback
- Appropriate assessment
- Clear goals and standards
- Surface approach (less of it!)
- Overall satisfaction

EPILOGUE: Diseases, diagnoses and cures
1. Too much summative assessment, and too little formative assessment
2. Too wide a variety of types of assessment
3. Insufficient formal requirements: lack of time on task
4. Inconsistent marking standards
5. A compartmentalised focus on individual courses at the expense of programme coherence
6. Poor feedback: too little and too slow
7. Lack of oral feedback and discussion of assessment and the negotiation of meaning of goals and standards
8. Orientation of student effort towards reproducing course content in assignments or exams

TESTA: Transforming the Experience of Students Through Assessment
FASTECH – using technology to solve the same assessment problems