Improving student learning and experience through changing assessment environments at programme level: a practical guide
Graham Gibbs

Introductions
Me
TESTA project
Carol
My assumptions about you...
You... and afterwards

Programme
Coffee
Presentation 1: The TESTA project and its focus
Activity 1: The assessment problem in participants’ own contexts
Coffee
Presentation 2: Theory and evidence about effects of assessment
Activity 2: Speculation about effects in participants’ own contexts
Lunch
Presentation 3: Interpreting case study data - examples
Activity 3: Interpreting case study data
Tea
Presentation 4: The quality assurance environment
Activity 4: Discussion of QA in participants’ own institutions
Presentation 5: Implementing changes and evaluating their impact
Closing discussion: Implementation of the TESTA approach

TESTA project - background
Conceptual framework: ‘Conditions under which assessment supports student learning’
Measuring the extent to which students experience these conditions: the Assessment Experience Questionnaire (AEQ)
Changing assessment methods at module level to meet the conditions (FAST project, Leeds Met booklet)
Discovering that programmes and whole institutions had ‘assessment patterns’ through wide use of the AEQ
Auditing programme-level assessment regimes
Discovering powerful links between features of assessment regimes and students’ learning responses (HEA, three institutions, three disciplines)
TESTA as an R&D project – using the research to identify issues to address and measure the impact of changes to assessment – and changing institutional QA rules
Four institutions and seven programmes (initially)

Assessment case study: what is going on?
Committed and innovative teachers
Lots of coursework, of very varied forms
Very few exams
Masses of written feedback on assignments
Four-week turn-round of feedback
Learning outcomes and criteria clearly specified
…looks like a ‘model’ assessment environment, but students:
Don’t put in a lot of effort, and they distribute their effort across few topics
Don’t think there is a lot of feedback or that it is very useful, and don’t make use of it
Don’t think it is at all clear what the goals and standards are

Changes in assessment in the UK
Formative to summative (often 1:10, Oxford 10:1)
More summative (up to 95 times in three years)

Summative assessment that is redundant
Most students can, in their first year, predict their final results with some accuracy
As few as 5% of assessments are necessary to produce the same overall degree classification
The high volume of summative assessment in UK universities has not resulted in students doing much work
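A toy simulation makes the redundancy claim concrete (an illustration only, not the method of the underlying study): when 95 marks have a typical spread, the mean of a random sample of just five of them almost always falls in the same degree classification band as the mean of all 95. All numbers below are invented.

```python
# Toy illustration of redundant summative assessment: a small sample of
# marks usually reproduces the overall degree classification.
import random

def classify(avg: float) -> str:
    # Simplified UK honours bands (assumed for illustration)
    if avg >= 70: return "First"
    if avg >= 60: return "2:1"
    if avg >= 50: return "2:2"
    return "Third"

random.seed(1)
marks = [random.gauss(63, 6) for _ in range(95)]  # 95 summative marks over 3 years
full_band = classify(sum(marks) / len(marks))

trials = 1000
agree = sum(
    classify(sum(random.sample(marks, 5)) / 5) == full_band
    for _ in range(trials)
)
print(f"5-mark sample matches the full classification in {100 * agree / trials:.0f}% of trials")
```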

Changes in assessment in the UK
Formative to summative (often 1:10, Oxford 10:1)
More summative (up to 95 times in three years)
Less feedback (worst NSS scores)
Exams to coursework (90%:10% to 10%:90%)
Diversity of assessment methods (1 to 18)
– innovation
– fragmentation of modular courses
– multiplicity of learning outcomes
Strategic behaviour by students
Resource constraints: large classes, no economies of scale in assessment
Slow increase in computer-based assessment
Wide differences between universities

What is the ‘assessment problem’ in your context that led you to come to this workshop?

Student experience of assessment

“I just don’t bother doing the homework now. I approach the courses so I can get an ‘A’ in the easiest manner, and it’s amazing how little work you have to do if you really don’t like the course.”

“I am positive there is an examination game. You don’t learn certain facts, for instance, you don’t take the whole course, you go and look at the examination papers and you say ‘looks as though there have been four questions on a certain theme this year, last year the professor said that the examination would be much the same as before’, so you excise a good bit of the course immediately…”

“The feedback on my assignments comes back so slowly that we are already on the topic after next and I’ve already submitted the next assignment. I just look at the mark and throw it in the bin.”

“The tutor likes to see the right answer circled in red at the bottom of the problem sheet. He likes to think you’ve got it right first time. You don’t include any workings or corrections – you make it look perfect. The trouble is when you go back to it later you can’t work out how you did it and you make the same mistakes all over again.”

“One course I tried to understand the material and failed the exam. When I took the resit I just concentrated on passing and got 98%. My tutor couldn’t understand how I failed the first time. I still don’t understand the subject so it defeated the object, in a way.”

Literature reporting assessment that improves learning
The case of the Engineer
The case of the Psychologist

The case of the engineer
Weekly lectures, problem sheets and classes
Marking impossible
Problem classes large enough to hide in
Students didn’t tackle the problems
Exam marks: 45%

The case of the engineer
Course requirement to complete 50 problems
Peer assessed in six ‘lecture’ slots
Marks do not count
Lectures, problems, classes, exams unchanged

The case of the engineer
Course requirement to complete 50 problems
Peer assessed in six ‘lecture’ slots
Marks do not count
Lectures, problems, classes, exams unchanged
Exam marks increased from 45% to 85%
Why did it work?

The case of the engineer
time on task
social learning and peer pressure
timely and influential feedback
learning by assessing
– error spotting
– developing judgement
– self-supervision
‘meta-cognitive awareness and control’

Literature reporting assessment that improves learning
The case of the Engineer
The case of the Psychologist

“Conditions under which assessment supports student learning”

Quantity and distribution of student effort
1. Assessed tasks capture sufficient student time and effort
2. These tasks distribute student effort evenly across topics and weeks

Quality and level of student effort
3. These tasks engage students in productive learning activity
4. Assessment communicates clear and high expectations to students

Quantity and timing of feedback
5. Sufficient feedback is provided, both often enough and in enough detail
6. The feedback is provided quickly enough to be useful to students

Quality of feedback
7. Feedback focuses on learning rather than on marks or students themselves
8. Feedback is understandable to students, given their sophistication

Student response to feedback
9. Feedback is received by students and attended to, and is acted upon by students to improve their work or their learning

Which of these conditions are well met, and poorly met, in the context you work in?
1. Assessed tasks capture sufficient student time and effort
2. These tasks distribute student effort evenly across topics and weeks
3. These tasks engage students in productive learning activity
4. Assessment communicates clear and high expectations to students
5. Sufficient feedback is provided, both often enough and in enough detail
6. The feedback is provided quickly enough to be useful to students
7. Feedback focuses on learning rather than on marks or students themselves
8. Feedback is understandable to students, given their sophistication
9. Feedback is received by students and attended to, and is acted upon by students to improve their work or their learning

Assessment Experience Questionnaire
Measures extent to which the ‘conditions’ are perceived to be met:
– Quantity and distribution of effort
– Quality, quantity and timeliness of feedback
– Use of feedback
– Impact of exams on quality of learning
– Deep approach
– Surface approach
– Clarity of goals and standards
– Appropriateness of assessment
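As a concrete sketch of how questionnaire responses become scale scores, the snippet below scores AEQ-style subscales from 5-point Likert responses. The item ids, the scale mapping and the reverse-keyed item are illustrative assumptions; the published AEQ has its own items and scoring key.

```python
# Minimal AEQ-style subscale scoring; items and mapping are hypothetical.
from statistics import mean

# Hypothetical mapping: scale -> (item ids, reverse-keyed item ids)
SCALES = {
    "quantity_of_effort": (["q1", "q2", "q3"], []),
    "use_of_feedback":    (["q4", "q5", "q6"], ["q6"]),
    "surface_approach":   (["q7", "q8"],       []),
}

def score_aeq(responses: dict[str, int], max_point: int = 5) -> dict[str, float]:
    """Mean item score per subscale for one student, reversing keyed items."""
    scores = {}
    for scale, (items, reversed_items) in SCALES.items():
        values = [
            (max_point + 1 - responses[i]) if i in reversed_items else responses[i]
            for i in items
        ]
        scores[scale] = mean(values)
    return scores

# One student's (invented) responses
student = {"q1": 4, "q2": 5, "q3": 3, "q4": 2, "q5": 2, "q6": 4, "q7": 1, "q8": 2}
print(score_aeq(student))
```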

Research question
What are the characteristics of programme level assessment environments that are associated with positive student learning responses?

Research design
Three contrasting universities
Three contrasting programmes in each (Humanities, Science, Applied Social Science)
Characterise assessment environments:
– read documentation (all modules)
– interview programme leader, lecturers and students
Administer AEQ
Explore relationships between characteristics of programme level assessment design and qualities of student learning
– with Harriet Dunbar-Goddet, Chris Rust and Sue Law
– funded by the Higher Education Academy
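The last step, relating audit characteristics to AEQ outcomes, could look like the sketch below: one record per programme pairing audit features with cohort-mean scale scores, then a correlation between a feature and an outcome. Every name and figure here is invented for illustration, not taken from the study.

```python
# Sketch of programme-level analysis: correlate an audited assessment
# characteristic with a cohort-mean AEQ scale score (invented data).
from statistics import mean, pstdev

programmes = [
    {"pct_marks_from_exams": 80, "formative_only_tasks": 10, "deep_approach": 3.2},
    {"pct_marks_from_exams": 25, "formative_only_tasks": 90, "deep_approach": 3.9},
    {"pct_marks_from_exams": 60, "formative_only_tasks": 30, "deep_approach": 3.4},
]

def pearson(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = mean((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (pstdev(xs) * pstdev(ys))

feature = [p["formative_only_tasks"] for p in programmes]
outcome = [p["deep_approach"] for p in programmes]
print(f"r(formative-only volume, deep approach) = {pearson(feature, outcome):.2f}")
```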

Characteristics of programme level assessment environments
% marks from examinations
Volume of summative assessment
Volume of formative-only assessment
Volume of (formal) oral feedback
Volume of written feedback
Timeliness: days after submission before feedback provided
Explicitness of criteria and standards
Alignment of goals and assessment

Range of characteristics of programme level assessment environments
% marks from exams: 3% – 100%
number of times work marked: 11 – 95
variety of assessment:
number of times formative-only assessment: 0 – 134
number of hours of oral feedback: 1 – 68
number of words of written feedback: 2,700 – 15,412
turn-round time for feedback: 1 day – 28 days

Patterns of assessment features within programmes
Every programme that is low on the volume of summative assessment is high on the volume of formative assessment
No examples of high volume of summative assessment and high volume of feedback
There may be enough resources to mark student work many times, or to give feedback many times, but not enough resources to do both

Relationships between assessment characteristics and student learning
Explicitness of criteria and standards, alignment of goals and assessment, and variety of assessment are all associated with a negative learning experience: less of a deep approach, less coverage of the syllabus, less clarity about goals and standards, less use of feedback…
…explicitness, alignment and variety are also associated with more summative assessment, less formative-only assessment, less oral feedback and less prompt feedback

Relationships between assessment characteristics and student learning
Formative-only assessment, oral feedback and prompt feedback are all associated with a positive learning experience: more effort, more deep approach, more coverage of the syllabus, greater clarity about goals, more use of feedback, more overall satisfaction…
…even when they are also associated with lack of explicitness of criteria and standards, lack of alignment of goals and assessment, and a narrow range of assessment.

Why?
being explicit does not result in students being clear… but ‘engagement in a community of practice’ does
explicitness helps students to be ‘selectively negligent’
students experience varied forms of assessment as confusing: ambiguity + anxiety = surface approach
feedback improves learning most when there are no marks
more time to give feedback when teachers don’t have to mark
possible to turn feedback round quickly when there are no QA worries about marks (or cheating)

Assessment case study: what is going on?
Lots of coursework, of very varied forms (lots of innovation)
Very few exams
Masses of written feedback on assignments
Four-week turn-round of feedback
Learning outcomes and criteria clearly specified
…looks like a ‘model’ assessment environment
But students:
Don’t put in a lot of effort, and distribute their effort across few topics
Don’t think there is a lot of feedback or that it is very useful, and don’t make use of it
Don’t think it is at all clear what the goals and standards are

Assessment case study: what is going on?
All assignments are marked, and they are all that students spend any time on. Not possible to mark enough assignments to keep students busy. No exams or other unpredictable demands to spread effort across topics. Almost no required formative assessment.
Teachers are all assessing something interestingly different, but this utterly confuses students. Far too much variety, and no consistency between teachers about what criteria mean or what standards are being applied.
Students never get better at anything because they don’t get enough practice. Feedback is no use to students as the next assignment is completely different.
Four weeks is much too slow for feedback to be useful, and results in students focussing on marks.

Assessment case study: what to change?
Reduce variety of assignments and plan progression across three years for each type, with invariant criteria and many exemplars of different quality for each type of assignment
Increase formative assessment: dry runs at what will later be marked, sampling for marking
Reduce summative assessment: one per module is enough (24 in three years), or longer/bigger modules
Separate formative assessment and feedback from summative assessment and marks… give feedback quickly, marks later, or feedback on drafts and marks on final submission
Teachers to accept that the whole is currently less than the sum of the parts (and current feedback effort is largely wasted) and give up some autonomy within modules for the sake of students’ overall experience of the programme – so teachers’ effort is worthwhile

Case Studies
1. Full data plus Programme Leader’s interpretation (Carol)
2. Full data for groups to make sense of and decide what to do about

Quality Assurance issues
QAA and Bologna requirements for specification of learning outcomes at module level
Institutional requirements for all modules to be ‘free standing’ in terms of assessment, regardless of issues of consistency, progression or alignment at programme level
QAA audits critical of:
– inconsistent standards (markers and modules)
– volume and timeliness of feedback

Quality Assurance issues
Documentation about:
– outcomes but not feedback
– no. of lectures but not volume of feedback or who provides it
– criteria, but not how students come to understand them
– module level tactics but not programme level strategy
Regulations about:
– course requirements and pass/fail assignments
– no feedback before return of approved marks
– providing feedback on drafts
Meetings with PVCs, separately and together:
– parallel changes in QA arrangements/QAA audit
– changes to documentation and regulations
– changes to evaluation data collected
– pilots with programmes coming up for revalidation

What are the framing QA issues in your institution?

Implementing and evaluating changes
Idiosyncratic nature of:
– Contexts
– Evidence
– Interpretation
– Parallel changes
– Proposals
– Time scales
Need for matching before and after sets of data:
– Same programme
– Equivalent cohort of students
– Audit
– AEQ
– Focus groups
– Same staff to interpret it?
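For the AEQ strand, ‘matching before and after sets of data’ might reduce to something as simple as the sketch below: compare one scale’s scores from equivalent cohorts before and after the change. The scores are invented; a real evaluation would read the audit and focus-group data alongside the numbers.

```python
# Before/after comparison of one AEQ scale for equivalent cohorts (invented data).
from statistics import mean, stdev

before = [3.1, 2.8, 3.4, 3.0, 2.9, 3.3]  # students' scale scores, pre-change cohort
after  = [3.6, 3.4, 3.9, 3.5, 3.2, 3.8]  # students' scale scores, post-change cohort

diff = mean(after) - mean(before)
# Pooled SD gives a rough effect size (Cohen's d) for the change
pooled_sd = ((stdev(before) ** 2 + stdev(after) ** 2) / 2) ** 0.5
print(f"mean change = {diff:+.2f}, effect size d = {diff / pooled_sd:+.2f}")
```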

Implementing TESTA in your institution
Methodology on TESTA web site – free and updated
Comparative data available – and updated. Send TESTA your data so it can be added to the database.
Case studies published on TESTA web site over time
Advice from TESTA staff available
One day of Graham’s time available to five institutions: write to me with proposals. Criteria: likely scale of implementation and impact; logistics
TESTA-arranged meeting between participating institutions (if felt useful)
Free to publish your own studies independently, and get TESTA help with drafting and data analysis...

Good luck!