Enhancing Assessment and Feedback: Principles and Practice
David Nicol, Professor of Higher Education, Centre for Academic Practice and Learning Enhancement

Presentation transcript:

Enhancing Assessment and Feedback: Principles and Practice
David Nicol, Professor of Higher Education
Centre for Academic Practice and Learning Enhancement (CAPLE)
Director of REAP and PEER projects
University of Strathclyde, Scotland
ATN Australia: 20th October 2011

National Student Survey (UK), Assessment and Feedback section (2008)
Results were reported separately for England, Scotland and Northern Ireland against the following survey statements:
5. The criteria used in marking have been clear in advance
6. Assessment arrangements and marking have been fair
7. Feedback on my work has been prompt
8. I received detailed comments on my work
9. Feedback on my work has helped clarify things I did not understand
Overall, I am satisfied with the quality of the course

Plan
- Background
- Re-engineering Assessment Practices (REAP) project [£1m]
- Concepts and example of practice
- An institutional viewpoint
- PEER project

Background
- Departments and faculties: supporting educational improvement projects
- REAP project: large-scale implementation across 3 universities
- Policy/strategy: led development of policy in assessment and feedback [based on REAP]
- Students: ‘Feedback as dialogue’ campaign
- PEER project – JISC funded [£50k]
- HE sector: project facilitator for QAA Scotland on assessment and feedback
- Research/publications: assessment, learning, change
See

REAP: Re-engineering Assessment Practices
- Scottish Funding Council for Universities (£1m project)
- 3 universities: Strathclyde, Glasgow & Glasgow Caledonian
- Large 1st year classes ( students)
- A range of disciplines (19 modules, ~6000 students)
- Many technologies: online tests, simulations, discussion boards, e-portfolios, e-voting, peer/feedback software, VLE, online-offline
- Learning quality and teaching efficiencies
- Assessment for learner self-regulation

Background (1)
Gibbs, G. & Simpson, C. (2004). Conditions under which assessment supports students’ learning. Learning and Teaching in Higher Education, 1.
See: Formative Assessment in Science Teaching (FAST) project at:

Gibbs and Simpson (2004): Assessment tasks [Conditions 1-4]
1. Capture sufficient study time (in and out of class)
2. Are spread out evenly across the timeline of study
3. Lead to productive activity (deep vs surface)
4. Communicate clear and high expectations
i.e. the concern here is with ‘time on task’: how much work students do and their active engagement in study

Background (2)
Literature review: Nicol, D. & Macfarlane-Dick, D. (2006). Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199-218.
Background: Student Enhanced Learning through Effective Feedback [SENLEF] project, funded by the HE Academy
REAP project:

Rethinking assessment and feedback
1. Consider self and peers as much as the teacher as sources of assessment and feedback
   - Taps into different qualities than the teacher can provide
   - Saves time
   - Provides considerable learning benefits (lifelong learning)
2. Focus on every step of the cycle: understanding the task criteria, applying what was learned in action
3. Not just written feedback: also verbal, computer, vicarious, formal and informal

Seven principles of good feedback
Good feedback:
1. Clarifies what good performance is (goals, criteria, standards)
2. Facilitates the development of reflection and self-assessment in learning
3. Delivers high quality information to students that enables them to self-correct
4. Encourages student-teacher and peer dialogue around learning
5. Encourages positive motivational beliefs and self-esteem
6. Provides opportunities to act on feedback
7. Provides information to teachers that can be used to help shape their teaching (making learning visible)
Source: Nicol and Macfarlane-Dick (2006)

Principle 1: Clarify what good performance is (the context of dialogue)
[Slide shows these activities on a continuum from ENGAGEMENT to EMPOWERMENT/SELF-REGULATION]
- Students create criteria
- Students add own criteria
- Students identify criteria from samples of work
- Exemplars of different performance levels provided
- Students rephrase criteria in own words
- Provide document with criteria

Two meta-principles
1. Meta-principle 1: time and effort on task (structured engagement), i.e. steers on how much work to do and when – Gibbs and Simpson’s four conditions
2. Meta-principle 2: developing learner self-regulation (empowerment/self-regulation), i.e. steers to encourage ownership of learning – the seven principles discussed above
The key task for the teacher is to balance 1 and 2.

Example: Psychology

- 560 first-year students
- 6 topic areas (e.g. personality, classical conditioning), 48 lectures, 4 tutorials, 12 practicals
- Assessment: 2 x MCQs (25%), tutorial attendance (4%), taking part in an experiment (5%), essay exam (66%)

Problems identified
- No practice in writing skills, but required in the exam
- More detail provided in lectures than mentioned in exams (not enough independent reading)
- No feedback except on multiple choice questions (percent correct)
- Didn’t want to increase staff workload

Psychology redesign
- Discussion board in the Learning Management System
- Students in 85 discussion groups of 7-8, same groups throughout the year
- Also an open discussion board for the whole class
- Friday lectures cancelled – students discover material themselves
- Series of online tasks

Structure of group tasks
6 cycles of 3 weeks (one cycle per major course topic):
- Week one: ‘light’ written task (e.g. define terms) – 7 short answers (all students answer)
- Week two: guided reading
- Week three: ‘deep’ written task – students collaborate in writing an 800-word essay on the same topic
Within each week:
- The Monday lecture introduces the material
- Immediately after the lecture, the task is posted online, for delivery the following Monday
- Model answers (selected from student work) posted for the previous week’s task

The teaching role
- Participation in the discussions was compulsory but not marked (in subsequent years there was a 2% mark for participation)
- The course leader provided general feedback to the whole class, often motivational
- He encouraged students to give each other feedback
- The group discussions were not moderated, but were monitored for participation

An example of a ‘deep’ task
The task – an 800-word essay: Assess the strengths and weaknesses of Freud’s and Eysenck’s theories of personality. Are the theories incompatible?
- Readings suggested
- Questions provided which all students should try

Relation to Gibbs and Simpson’s four assessment conditions
1. Tasks require significant study out of class (condition 1)
2. Tasks are distributed across topics and weeks (condition 2)
3. They move students progressively to deeper levels of understanding (condition 3)
4. There are explicit goals and a progressive increase in challenge (condition 4)

Relation to the seven feedback principles
1. Standard format and model answers provide progressive clarification of expectations (principle 1)
2. Students encouraged to self-assess against the model answer (principle 2)
3. Course leader provides motivational and meta-level feedback and selects model answers (principle 3)
4. Online peer discussion aimed at reaching consensus about the response is a core feature of the design (principle 4)
5. Focus on learning not just marks; sense of control/challenge enhanced motivation (principle 5)
6. Repeated cycle of topics and tasks provides opportunities to act on feedback (principle 6)
7. VLE captures all interactions, allowing the course leader to monitor progress and adapt teaching (principle 7)

Benefits
- Students worked exceptionally hard
- Written responses of exceedingly high standard
- Students took responsibility for learning
- High levels of motivation: atmosphere in class improved
- Online interactions showed powerful ‘scaffolding’ and community building
- Feedback made feasible with 560 students through peer and self-feedback (model answers)
- Easy for tutors to monitor participation
- Improved mean exam performance (up from 51% to 59%, p<0.01); weaker students benefited most
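The slide above reports the exam improvement only as a change in mean mark with p<0.01; it does not say which statistical test the REAP evaluation used. Purely as an illustration of how such a cohort comparison might be run, the sketch below applies an independent-samples t-test to two hypothetical sets of exam marks (the mark distributions, cohort sizes and the choice of test are all assumptions for illustration, not data from the project):

    # Illustrative sketch only: hypothetical exam marks for a cohort before
    # and after the course redesign. Means are chosen to echo the reported
    # 51% -> 59% shift; none of these numbers come from the REAP evaluation.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(seed=1)
    pre_redesign = rng.normal(loc=51, scale=12, size=560).clip(0, 100)
    post_redesign = rng.normal(loc=59, scale=12, size=560).clip(0, 100)

    # Independent-samples t-test on the two cohorts' marks
    t_stat, p_value = stats.ttest_ind(post_redesign, pre_redesign)

    print(f"Mean before: {pre_redesign.mean():.1f}%")
    print(f"Mean after:  {post_redesign.mean():.1f}%")
    print(f"t = {t_stat:.2f}, p = {p_value:.3g}")

With cohorts of this size, an eight-point difference in means comfortably clears the p<0.01 threshold, which is consistent with the result quoted on the slide.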

Online postings/interaction
- 24,362 messages posted by groups for the essay tasks
- Average number of postings per student
- Postings to the general open discussion forum
- Students set up online study groups for other subjects
- Structured online tasks triggered important socio-cognitive processes

Has it worked?

Questions and Discussion

Why use principles?
1. Provides a framework for operationalizing the big idea – the development of learner self-regulation
2. Helps translate the research into accessible guidelines for teaching practice
3. Ensures change is educationally driven
4. Enables connections to be made across innovations in different disciplines
5. Provides a common language to talk about innovation and for dissemination
6. Can add to the evaluation (process measures)
7. Helps identify where technology can leverage benefits

Guidelines for implementation
1. A single principle or many?
2. Tight-loose: maintain fidelity to the principles (tight) but encourage disciplines to develop their own techniques of implementation (loose)
3. Balance teacher feedback with peer and self-generated feedback
4. The more actively engaged students are, the better the course design

Developments since REAP
1. Principles of Assessment and Feedback approved by University Senate and embedded in policy (2008)
2. Use of the principles to inform curriculum renewal and Quality Assurance processes
3. ‘Feedback as Dialogue’ campaign to gain the commitment of students
4. PEER Project (Peer Evaluation in Education Review)

Peer Review in Education Evaluation [PEER]
The aims of the PEER project are to:
- Review the evidence base for peer review
- Develop educational designs for peer review (and self-review)
- Identify software support for peer review
- Pilot implementations of peer review with large student numbers
- Produce guidelines for higher education: why do it, how to do it, pitfalls and solutions, and software possibilities
See

Peer feedback: augmenting teacher feedback
- Increases the quantity and variety of feedback
- No extra workload on the teacher when supported by software (e.g. PeerMark, Aropa)
- More timely, e.g. in collaborative projects
- Simulates professional life: reconciling different feedback perspectives

The argument
Not enough attention has been paid to the potential of peer feedback: not just as a way of increasing the quantity and quality of the feedback students receive, but as a way of giving students practice in constructing feedback.

The focus
- Scenarios where students make evaluative judgements about the work of peers and provide a feedback commentary, usually written
- Not talking about scenarios involving:
  - informal feedback in collaborative tasks
  - students evaluating each other’s contribution to group working
  - students grading/marking each other’s work, although some rating might be part of the peer design

Benefits of feedback construction (1)
Constructivist rather than ‘telling’ paradigm
1. High-level cognitive activity: students cannot easily be passive
2. Students actively exercise assessment criteria from many perspectives
3. Writing commentaries develops deep disciplinary expertise
4. Students see many approaches and learn that quality can be produced in different ways
5. Shifts responsibility to the student – puts them in the role of assessor exercising critical judgement

Benefits of feedback construction (2)
6. Learn to assess one’s own work, as exactly the same skills are involved
Develops the capacity to make evaluative judgements: a fundamental requirement for life beyond university. This capacity also underpins all graduate attribute development (Nicol, 2010)
Nicol, D. (2011). Developing students’ ability to construct feedback. Published by QAA for Higher Education, UK.

Example 1: Peer feedback on laboratory work [Gibbs]
- Reason: poor quality of lab reports in science
- Strategy: students organised in groups produce a poster to represent their lab report
- Peer process: posters hung in class; all students individually walk round, analyse the posters and write feedback on them (e.g. on post-its): questions, suggestions, inaccuracies etc.
- Result: significant improvements in lab work and reporting, positive competition in class; students did not want to look bad

Example 2: Engineering Design – PEER project case study
- DM 100 Design 1: first-year class
- Dr Avril Thomson, Course Leader, Design Manufacturing and Engineering Management (DMEM), University of Strathclyde
- Caroline Breslin, Learning Technology Adviser, University of Strathclyde

Example 2: Design 1
- 82 first-year students
- Design a product – theme: ‘eating and resting in the city’
- Research in groups (in the city, in the library etc.)
- Individually produce a Product Design Specification (PDS): detailed requirements for and constraints on the design (rationale, performance, standards, manufacturing etc.)
- Given a PDS exemplar from another domain to show what is required (a stainless steel hot water cylinder)
- Online learning environment: Moodle and PeerMark (part of the Turnitin suite)

Product Design Specification: typical headings
Rationale, Performance, Environment, Packaging, Maintenance, Weight and size, Disposal, Shipping, Processes, Time scale, Patents, Product costs, Aesthetics, Ergonomics, Materials, Testing, Quality/reliability, Competition, Marketing, Life in service

DM 100: Design 1 peer review task
- Individually, each student peer-reviewed and provided feedback on the draft PDS of two other students
- Criteria: completeness, convincingness of rationale, specificity of values (performance), and one main suggestion for improvement with reasons
- Students used the experience of giving and receiving feedback to update the PDS, which comprises part of a Folio
- Students self-review the Folio and meet/discuss with a tutor
- Peer review not assessed directly, but 10% of marks were for professionalism, which included participation in the peer review

Peer review rubric: DM 100
1. Do you feel the PDS is complete in the range of headings covered? If not, can you suggest any headings that would contribute towards the completeness of the PDS and explain why they are important?
2. Is the PDS specific enough? Does it specify appropriate target values or ranges of values? Please suggest aspects that would benefit from further detail and explain.
3. To what extent do you think the rationale for the PDS is convincing? Can you make any suggestions as to how it might be more convincing? Please explain.
4. Can you identify one main improvement that could be made to the PDS? Provide reason(s) for your answer.

Evaluation
1. Online survey completed by 64 students
2. Coursework marks compared to previous years
3. Focus group interviews
4. Peer review comments recorded online

Results 1
Which aspects of the peer review did you learn from?
- Giving feedback: 10.9%
- Receiving feedback: 26.6%
- Giving and receiving feedback: 54.7%
- Neither giving nor receiving: 7.8%
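The evaluation slide gives n = 64 survey respondents, and the percentages above are consistent with whole-number response counts out of 64. As a quick arithmetic check, the sketch below recomputes the percentages from back-calculated counts (7, 17, 35 and 5 are inferred assumptions; the source reports only the percentages):

    # Assumed raw counts that reproduce the reported percentages for the
    # 64 survey respondents (back-calculated, not given in the source).
    responses = {
        "Giving feedback": 7,
        "Receiving feedback": 17,
        "Giving and receiving feedback": 35,
        "Neither giving nor receiving": 5,
    }

    n = sum(responses.values())  # 64, matching the survey sample size

    for option, count in responses.items():
        print(f"{option}: {100 * count / n:.1f}%")  # 10.9%, 26.6%, 54.7%, 7.8%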

Results 2
Did you modify your initial submission as a result of the peer review activity?
- Yes, as a result of the peer review given: 23.4%
- Yes, as a result of the peer review received: 25.0%
- Yes, as a result of the peer review given AND received: 28.1%
- No: 21.9%
- N/A: 1.6%

Results: student comments
If yes, please give specific examples of modifications (n=41):
- I added a couple of paragraphs and improved existing paragraphs, this added two full A4 pages to my work
- I included specific materials as changed the formatting of the document so it looked more professional
- I provided more specific numeric values and expanded my rationale after seeing someone else’s PDS and after receiving feedback

Results: RECEIVING feedback
Please give examples of what you learned from RECEIVING peer reviews from other students (n=54):
- Parts that I had previously missed were brought to eye such as market competition (noticing)
- Receiving peer reviews gave me insight into what others thought of my work and gave me a direction to improve (reader response)
- Where the PDS was confusing to understand (reader response)
- I found out how good mine was (motivational)
- The person who peer reviewed my PDS gave me positive feedback which helped me a lot (motivational)
- Not much, they weren’t very good

Results: PROVIDING feedback
Please give examples of what you learned from PROVIDING peer reviews of others’ work (n=47):
- When giving advice to people on theirs, it gave me greater perception when reviewing my own work by listening to my own advice for example [transfer]
- I had a chance to see other peoples work and aspects of their work that I felt were lacking in my work, this helped me to improve my work [transfer]
- I was given a greater understanding of the level of the work the course may be demanding [standards]
- Allowed me to see from an assessors perspective
- Thinking from a critical point of view [critical judgement]
- How to look at work critically that isn’t your own

Results: how you carried out the peer review
Could you make any comments about how you carried out the peer review? How did you evaluate the quality of the work to provide a response to the peer review questions? (n=37)
- I compared it to mine and the ideal PDS and said how I would improve it
- Partly by comparing my work to theirs
- I tried to think about what I wrote and whether this PDS was better or worse
- Looked through the headings first and noted any that should have been included then read more in detail about each individually and compared it with my own PDS

Focus groups
How did you go about reviewing?
‘I read it through and compared it with what I had done to see if they had put something I had not done and then I added it in if they hadn’t. The four questions were useful as they provided a framework for the review. If we hadn’t had the questions it would have been difficult. I did the reviews separately and then answered one then the other. The first was a better standard than the other – so I used the ideas from the better one to comment on the weaker one. I also read the guidelines in class when I did the peer review. There were ideas from the good one that I hadn’t even thought of in mine’

Focus groups
What do you think is best for learning – giving or receiving feedback?
‘For me it would probably be to give feedback because I think seeing what other people have done is more helpful than getting other people’s comments on what you have already done. By looking at other people’s work you can see for yourself what you have forgotten or not even thought about. When people give feedback on yours they generally just talk about what is there. They don’t say, well I did this on mine and you could put that in yours.’

Results
Would you choose to participate in a peer review exercise in the future?
- Yes: 76.6%
- No: 3.1%
- Maybe: 18.8%
- Don’t know: 1.6%

Some tentative conclusions
- Giving and receiving feedback are qualitatively different
- Feedback receipt helps bring deficiencies in their own work to students’ attention, can be motivational, and gives an experience of different audiences (readers)
- Feedback production appears to be better for developing judgement: critical thinking, using criteria and standards, being more objective about one’s own work
- Fortunately, receiving and giving usually occur in the same domain of assignment production, hence double duty
- More research required