The feedback conundrum: finding the resource for effective engagement
Prof. Margaret Price, Director, ASKe Pedagogy Research Centre, Faculty of Business

Dynamic Assessment Context
– Massification
– Assessment diversity
– Holistic approach to assessment
– Assessment drives learning
– Transparency and accountability
– Modularity
– Purpose of assessment – various stakeholders
– Fragmentation of communities
– Target culture
– Authentic assessment
– Uniform quality assurance and accreditation systems (Bologna, 1999)
– Employability outcomes
– Cultural diversity and internationalisation of the curriculum
– Academic literacies
– National Student Survey
– League tables
– Student engagement

Change in HE and effects on assessment
– Pedagogic developments (e.g. acknowledgement of assessment as a key driver of learning; dialogic feedback; discourse of assessment – fairness, cheating, grade inflation; complexity of assessment)
– Measurement in HE (e.g. league tables, quality processes and measures, fragmentation)
– Student voice (fees, influence of prior education, student engagement)
– Market (employability, graduate jobs, authenticity in assessment)

Current assessment climate
– More traditional forms of assessment tend to be taken for granted (Pryor and Crossouard, 2007)
– Assessment has been slow to catch up with pedagogic developments
– Impact of assessment cultures: teachers and students develop implicit and explicit expectations about learning and assessment (Ecclestone, 2006)

Assumption: Designing assessment at module level is sufficient
– Staff teams need a programme view. Where there is a greater sense of the holistic programme, students are more likely to achieve the learning outcomes than students on programmes with a more fragmented sense of the programme (Havnes, 2007).
– Assessment strategy – looking across a programme:
o Effects of conflating summative and formative assessment (Price et al., 2010; Black and Wiliam, 1998)
o Variety in assessment is not always a good thing: "It is clear how disconcerting students find a large range of assessment activities. It is far better to limit these so students get better at using the feedback to improve performance." (Gibbs, 2011)
o Feedback seen as a self-contained event (Carless, 2011)

Assumption: Constructive alignment is sufficient for coherent assessment and feedback
[Diagram: constructive alignment linking learning outcomes, learning activity and assessment, with feedback flowing to the learner]
– Priority given to assessment? Assessment is often only considered superficially in the programme design phase.
– A well-designed programme – assessment and feedback will look after themselves?
– Implementation – feedback flows to the learner

Assumption: Assessment standards are straightforward
Beliefs are strong that:
– assessment standards can be made explicit
– standards are understood by students
– standards are consistent between markers

Limitations of explicit articulation
– Meaningful understanding of standards requires both tacit and explicit knowledge (O'Donovan et al., 2004).
– "We can know more than we can tell" (Polanyi, reprinted 1998, p. 136).
– Verbal level descriptors are inevitably 'fuzzy' (Sadler, 1987).
– There is a cost (in terms of time and resources) to codifying knowledge, which increases the more diverse an audience's experience and language (Snowden, 2002).
– Tacit knowledge is experience-based and can only be revealed through the sharing of experience – socialisation processes involving observation, imitation and practice (Nonaka, 1991).

Assessment standards are difficult
– Assessment judgements rely on local, contextualised interpretations of quality, underpinned by tacit understanding of 'quality' shared by members of an assessment community (Knight, 2006).
– "A key issue in assessment is that students often do not understand what is a better piece of work and do not understand what is being asked of them, particularly in terms of standards and criteria." (O'Donovan et al., 2001)

Four models for sharing assessment standards, mapped on two axes – passive vs. active student engagement, and informal vs. formal activities and inputs (O'Donovan, Price & Rust, 2008):
1. The Traditional Model (passive, informal) – tacit standards absorbed over relatively long periods, informally and serendipitously. ('The past')
2. The 'Dominant Logic' Explicit Model (passive, formal) – standards explicitly articulated (with limitations) and passively presented to students.
3. The Social Constructivist Model (active, formal) – actively engaging students in formal processes to communicate tacit knowledge of standards.
4. The 'Cultivated' Community of Practice Model (active, informal) – tacit standards communicated through participation in informal knowledge exchange networks 'seeded' by specific activities. ('The future')

Assessment judgements are complicated
– Markers rank and measure standards simultaneously (Wolf, 1997)
– Holistic judgements are justified by criteria (Bloxham and Boyd, 2011)
– Markers apply different interpretations of key words and phrases within written 'standards' (Saunders and Davis, 1998; Ecclestone, 2001; Webster et al., 2000; Price & Rust, 1999)
– Markers subconsciously use other criteria (Price, 2005)
– Markers can use only a small number of criteria at any one time (Elander, 2002)
– Markers allocate their own weightings to criteria (Ecclestone, 2001)
– Markers may be harsher if they have more time and/or have to justify their judgements (Baume et al., 2004)

Assumption: There is a common view about feedback

Impact on feedback? There has been quite a bit of research on feedback…
– Students don't read it (Hounsell, 1987; Gibbs and Simpson, 2002)
– Too vague (Higgins, 2000; Walker, 2009)
– Not understood (Lea and Street, 1998; Weaver, 2006; Sadler, 2010)
– Subject to interpretation (Ridsdale, 2003; Orsmond and Merry, 2011)
– Unidirectional (Nicol, 2010)
– Damages self-efficacy (Wojtas, 1998)
– Importance of the 'relational' (Price et al., 2010)

What makes good feedback good? An ASKe Pedagogy Research Centre project in collaboration with Cardiff, funded by the HEA.
The aim of the project: to understand better what influences whether feedback is viewed as 'good' or 'bad' by students (and in NSS scores).

Domains pertaining to the feedback itself (the traditional focus of advice to improve feedback)
– Technical factors – presentation and content: legibility, interpretability (incomprehensible ticks/remarks), levels of explanation.
– Particularity of feedback – evidence of engagement with the particular piece of work; personalisation valued over 'standardised' criterion-based feedback.
– Recognition of student effort – evidence of time spent by markers, as well as supportive detail.

Domains relating to the context of feedback
– Assessment design – clarity of purpose, relevance, realistic demands (e.g. word count, time).
– Feedback pre-conditions – clear criteria, task, instructions… dialogue.
– Marker predictability – particularly if the marker is not the person who briefed the students or provided formative feedback.
Sites of silence:
– Timing
– Design of assessment patterns

Domains pertaining to the development and expectations of the student
– Student mark expectations – some influence, but not overwhelming. Effort is often equated with marks and influences the type of feedback seen as useful.
– Student epistemology, resilience and beliefs:
o dualistic students, model answers and specificity
o intrinsic (learning) and extrinsic (mark) motivations
o poor self-evaluation and reliance on feedback
o criticism, critique, transaction or conversation

Summary of findings
– You don't need to get it all right all the time.
– Domains overlap, compensate for each other, have strong interrelationships, and are not mutually exclusive.
– Domains of influence are not causal or prioritised; they are context-dependent, and influences outside the 'feedback' artefact are important.
– Student perceptions of feedback are shaped by:
o some aspects of the feedback itself – a necessary but insufficient condition for feedback being seen as good
o pre-feedback conditions
o qualities and perspectives of the student

Where do we go from here? "The domain that perhaps offers the greatest unexploited scope for improvement concerns student learning development. Successful students use feedback differently and more effectively (without the context or the feedback changing), and it is possible to change how students perceive feedback and what they do with it." (Research report available at

So… a long-term project
– Look beyond the feedback artefact itself. An overemphasis on technical factors, at the expense of contextual elements such as good teacher-student relationships, can be detrimental.
– Take a programme-level focus.
– Create resource through the development of student assessment literacy: independent learners will cope better with the imprecise nature of assessment and be able to engage with feedback.
– Develop a discourse around assessment and feedback.