1
Students as partners in setting assessment criteria
Challenging the orthodoxy of assessment rubrics
David Watson, Student Experience & Engagement Lead, School of Design
@springfishblue
2
Assessment of learning …is not the same as… Assessment for learning
3
ABC form
The "ABC" form used at CPDA is a type of analytic rubric.
4
Benefits of analytic rubrics
Students like them: they feel more confident because they are being told what to do to achieve a specific grade.
Academic staff like them: they feel more confident in making assessments more consistent and in avoiding challenges from students.
External examiners like them: they feel more confident because they can see the link between assessment criteria and grades.
5
What’s not to like?
6
I do not like rubrics because…
They are convenient: staff can mark work more efficiently. In a time-pressured industry, it makes sense to be efficient. Is the need for efficiency compromising our pedagogies?
We believe they may improve grades: however, that doesn't necessarily mean that we are improving students' skill set as learners. Is our objective to produce students with good grades or with good learning?
They are a guide to passing the assessment: there is no onus on the student to discover the level of attainment required and therefore no incentive to develop these skills, the very skills they will need in the world of work.
7
The IKEA instruction sheet does not test a person's ability as a furniture maker; it only tests their ability to read an instruction sheet. Similarly, the rubric does not test a person's ability to understand the subject under investigation; it merely indicates what a student should do: a blueprint for passing the assessment.
8
Problems with analytic rubrics
Students not challenged to think for themselves: they do not need to develop a deep understanding of the subject material or of the standards required to achieve specific outcomes (Secondary Education Syndrome).
Do not allow for student experimentation: rubrics provide a fixed blueprint that students must follow to achieve the stated grade. We tell them what to do, they do it and we grade accordingly (Secondary Education Syndrome).
Designed primarily for formative assessment: analytic rubrics are not intended to be used for summative assessment. They are designed to provide feedback that should be used to improve the quality of the given assessment.
9
CAN conference
Next CAN conference: 29-30 May 2019, The Open University (Milton Keynes)
10
Where is the student in all of this?
11
Where is the student in all of this?
Rubrics are written well in advance of teaching: traditionally, students are not involved in setting assessment criteria or in determining the appropriate level of attainment for each grade.
Potential benefits of partnering are enormous: partnering with students to navigate our way through the assessment process could promote independent thinking and improve engagement and attainment.
Education should be messy: people are messy and non-standard. Education is a people-centred undertaking. Our pedagogies need to be messy and non-standard to best serve our students.
12
“How do you think your work should be assessed?”
Obviously, this process has to be mediated by the module tutor.
13
How it works
Students given choice: during the introduction to the module, students are given a choice between using the ABC form (with which they are familiar) or partnering in deciding the assessment criteria.
Week 1, process explained: in week one, students are told that good attendance is crucial for good attainment and that they need to understand all the subject material to be effective partners.
Week 10, in-class discussion: in week 10 (of the 12-week module), students take part in a guided discussion of all the material covered in the preceding nine weeks and agree on the assessment criteria.
14
Moodle
The discussion is written up by the module tutor and added to Moodle for week 11.
15
Feedback
The feedback form aligns with the criteria identified during the discussion.
16
Effect on attainment and engagement
17
Average grade: 57% (rubric) vs. 60% (partnership)
18
Average attendance: 60% (rubric) vs. 69% (partnership)
19
Datascapes: Term 2
21
Criticisms
Quality assurance? There is no reason why quality assurance cannot be maintained, as long as we assume that academic staff are professionals who use best-practice moderation.
Standardisation? Standardisation could be considered the enemy of good education. It discourages innovation and avoids serendipity. Education should be messy!
Fairness? Is it fair to exclude those students who have not engaged with the module and have not, therefore, been involved with the process of determining assessment criteria? The process is intended to promote partnership, inclusion and empowerment.
22
Next steps
Repeat in …
Report findings at CAN2019
23
Takeaway: Avoid Secondary Education Syndrome
Don’t create blueprints for students; focus instead on helping students develop the intellectual skills they will need after graduation.
24
Any questions? David Watson Student Experience & Engagement Lead
School of Design @springfishblue