3rd ISSOTL Conference, Washington, 9-12 November 2006
Challenging traditional forms of assessment: University teachers’ views on examinations
Lin Norton*, Katherine Harrington$, Bill Norton* & Lee Shannon*
*Liverpool Hope University, $London Metropolitan University

Background
– Assessment is widely acknowledged as having a profound influence on what students learn and how they learn
– Move away from traditional learning theory towards an increasing recognition that learning is a process of knowledge construction rather than knowledge reproduction (Maclellan, 2001)
– Concomitant move towards alternative methods of teaching and learning (e.g. PBL, work-based learning, experiential learning)
– But not necessarily so great a shift in assessment practices: in a study carried out at LHU across 5 disciplines, the most frequently used assessment tasks were exams (Steward et al., 2003)

Traditional exams: a definition
An unseen, closed-book, timed assessment, the form of which might be:
– an essay
– a short answer paper
– a multiple choice test

What do exams do? The pros…
– Represent the end product of the education process
– Validity of a degree: the external recognition of a qualification that is wanted by universities, employers and individuals (O’Donovan, 2005)
– An antidote to plagiarism
– The revision period acts as an opportunity for synthesis and a coherent understanding of the subject (Entwistle & Marton, 1994)
– Concentrate students’ minds and motivate them
– Aid standardisation
– Give tutors an idea of how much students have understood
– Test different skills and abilities to those demanded in coursework
– Maybe more appropriate at certain stages of development and for varying levels of self-direction in learning (Macdonald, 2002)

What do exams do? The cons…
– Unpopular with students, who have often had negative experiences
– Cause stress and anxiety
– Students feel they are an unfair assessment of their ability (Norton & Brunas-Wagstaff, 2000; Sambell et al., 1997)
– Tend to test learning that is linear and decontextualised
– Do not assess meaningful and authentic learning (Maclellan, 2001)
– Encourage strategic selection of the parts of the curriculum that students think will come up on the exam
– Encourage rote learning and memorisation
– What is learned for an exam is often quickly forgotten afterwards
– Represent a snapshot in time rather than a richer picture of student ability
– Marking of exam scripts is unreliable (Newstead & Dennis, 1994)

Rationale for our research
– Transformational agendas in assessment, such as switching from traditional exams, require individual lecturers, subject disciplines and institutions to be ‘adaptive’.
– Although there is much in the literature on students’ perceptions of exams as a form of assessment, there appears to be relatively little on university teachers’ beliefs about the pedagogical value of exams.

The research study
– A semi-structured interview study carried out at two universities in the UK to ascertain university teachers’ views on examinations
– Since exams are a ‘contested’ form of assessment, we used the concept of core assessment criteria (Elander et al., 2004) to approach our main research questions more obliquely:
  – what forms of exam were used
  – what pedagogical value exams were held to have
  – whether there were any problems affecting assessment procedures

The interview schedule
The question schedule was adapted from a study of essay assessment criteria (Norton et al., 2004) and included such questions as:
– What sorts of examinations are you typically involved in setting?
– Of the various methods of examination, which do you feel produces the best responses from your students?
– What do you think a student gains from this form of assessment?
– Do you think there are any problems with current assessment procedures?

Focus of the presentation
– This presentation focuses on a subset of 24 of the 29 interviews
– Examination experience ranged from 6 to 37 years at University A and from 3 to 25 years at University B
– The 24 interviews comprised 6 from each of four disciplines (Business, Computing, English and Psychology), drawn from across the two universities

Analysis
1. Transcripts were subjected to repeated readings, from which several themes emerged
2. Discussion of the emergent themes between the researchers produced a framework in which to present the findings:
   1. Variation in how exams were used
   2. Pedagogical justification
   3. Constraints

Forms of exam in use
Across the four disciplines, the forms of exam in use included: unseen essays, seen essays, problem scenarios/case studies, short answers, MCQs and in-class tests.

Pedagogical justification
– Most lecturers were alive to the need to align assessment and used, or wished to use, a range of appropriate methods
– There was less unanimity about the pedagogical value of exams
– Ambivalence about the worth of exams: “I’m not in favour of the pressure that (exams) create, but it shows you who can focus their mind on the problem.” (B/Business lecturer)
– Some staff were making the best of it: “I don’t really like exams but I don’t have a choice if I want to control cheating.” (A/Computing lecturer)

Pedagogical value: positive
– Allowed distinction between grades and readily identified ‘better’ students
– Redressed the gender bias
– Allowed students to show what they had really learnt
– Encouraged students to ‘cover’ the syllabus
– Required working under pressure and thinking on your feet, as you have to in real life
– Short answers were appropriate in some circumstances, e.g. close textual analysis in English (B), reviewing programs in Computing (A)
– In-class tests could provide good formative feedback
– MCQs, if properly designed, could test across the syllabus

Pedagogical value: negative
– Lack of opportunity for formative feedback from end-of-year/module exams
– Over-reliance on memory and regurgitation
– May be examining exam technique rather than what had been learned
– An unreal, inauthentic, time-limited situation
– Should not be the only method of assessment
– Adverse effect on students: anxiety-provoking, gender bias

Constraints: what held them back from alternative assessment methods
– Tension in the sector about the role of Higher Education: attaining qualifications rather than transformational learning
– Change in the nature of students in HE: widening participation, secondary experience a poor preparation
– What is educationally appropriate versus what is financially viable
– Institutional needs: well set up for exams; more effort spent on exams than on coursework, which are quicker and more reliable to mark; on the other hand, coursework seen as the easier option and can give better pass rates
– Lack of time and procedures to develop more varied, appropriate assessment methods; institutional restrictions on feedback
– Concern about plagiarism: a powerful incentive for exams
– Lecturers’ own experience as students, attitudes and beliefs: “they worked for me”

Conclusions
Most of the teachers in both institutions would abandon exams if:
– there were better ways of preventing plagiarism in coursework
– more time/support were available to develop an appropriate range of assessment methods (cf. the question of the slowness of institutional change and Bransford’s ‘adaptive organisations’)
If exams are used, then:
– more flexible formats and a user-friendly introduction for students should be adopted
– timing and procedures should permit the opportunity to give formative feedback
– students should get feedback on their exam scripts

Implications
– Most university teachers in this study did have an underlying pedagogical philosophy about assessment, which was not always put into practice because of the ‘non-adaptive’ discipline and/or institution.
– This confirms findings from a study of new lecturers’ views about assessment (Norton et al., 2010)
– This pedagogical research has produced an evidence base that we hope will enable us to encourage change at the discipline/institutional level

The last word
“When I taught in America it was different, I would take my prejudices for assessment that I would favour and design the courses to suit that. Here they are quite rigid. I think things have opened up a great deal in the last five to ten years I would say, but I think there’s still room to grow the types of assessment that we do include. I think there are certain prejudices; with the new universities at the forefront of redesigns of assessment, there is some prejudice in the more established universities I’ve taught at, you know, that things have to be proven. Maybe some of the new forms of assessment that we’ve attempted do need a little bedding down and some monitoring and evaluation themselves.” (B/English lecturer)

References
Elander, J., Harrington, K., Norton, L., Robinson, H., Reddy, P. & Stevens, D. (2004). Core assessment criteria for student writing and their implications for supporting student learning. In C. Rust (Ed.), Improving Student Learning 11: Theory, Research and Scholarship. Oxford: Oxford Centre for Staff and Learning Development.
Entwistle, N. & Marton, F. (1994). Knowledge objects: understandings constituted through intensive academic study. British Journal of Educational Psychology, 64.
Macdonald, M. (2002). Systematic assessment of learning outcomes: developing multiple choice exams. Jones & Bartlett Learning.
Maclellan, E. (2001). Assessment for learning: the differing perceptions of tutors and students. Assessment and Evaluation in Higher Education, 26(4).
Newstead, S.E. & Dennis, I. (1994). Examiners examined: the reliability of exam marking in psychology. The Psychologist, 7, 216–219.
Norton, L. & Brunas-Wagstaff, J. (2000). Students’ perceptions of the fairness of assessment. Paper presented at the first annual conference of the Institute for Learning and Teaching in Higher Education (ILTAC 2000), York, June 2000.
Norton, L.S., Ward-Robinson, H., Reddy, P., Elander, J. & Harrington, K. (2004). Exploring psychology lecturers’ views on assessment criteria. Psychology Learning and Teaching Conference (PLAT 2004), University of Strathclyde, 5-7 April 2004.
O’Donovan, N. (2005). There are no wrong answers: an investigation into the assessment of candidates’ responses to essay-based examinations. Oxford Review of Education, 31(3).
Sambell, K., Brown, S. & McDowell, L. (1997). “But is it fair?”: an exploratory study of student perceptions of the consequential validity of assessment. Studies in Educational Evaluation, 23.
Steward, S., Norton, L.S., Evans, I. & Norton, J.C.W. (2003). Lecturers! What are you assessing? Paper given at the Learning & Skills Research Network Annual Conference ‘Research for all’, GMB National College, Manchester, 6 June 2003.