Improving computer-marked assessment: How far can we go? How far should we go? Sally Jordan, Department of Physical Sciences, The Open University

Presentation transcript:

Improving computer-marked assessment: How far can we go? How far should we go? Sally Jordan, Department of Physical Sciences, The Open University. QAA Scotland: Using technology to manage assessment and feedback, Edinburgh, 18th March 2015.

Assessment can define a "hidden curriculum" (Snyder, 1971).
Whilst students may be able to escape the effects of poor teaching, they cannot escape the effects of poor assessment (Boud, 1995).
Students study "what they perceive the assessment system to require" (Gibbs, 2006).
"When we consider the introduction of e-assessment we should be aware that we are dealing with a very sharp sword" (Ridgway, 2004).

In this talk I will:
- Discuss potential advantages and disadvantages of computer-marked assessment
- Discuss ways in which we can improve our practice by:
  - better assignment design
  - the use of a variety of question types
  - writing better questions
  - the use of an iterative design process
- Discuss the limitations of computer-marked assessment and possibilities for the future

The UK Open University
- Founded in 1969
- Supported distance learning; students mostly studying part-time
- Undergraduate modules are completely open entry, so students have a wide range of previous qualifications
- Normal age range from 18 to ??
- ?? of our students have declared a disability of some sort
- ?? of our students live outside the UK
- iCMA = interactive computer-marked assignment
- TMA = tutor-marked assignment

Potential advantages of computer-marked assessment
- To save staff time
- To save money
- For constructive alignment with online teaching
- To make marking more consistent ('objective')
- To enable feedback to be given quickly to students
- To provide students with extra opportunities to practise
- To motivate students and to help them to pace their learning
- To diagnose student misunderstandings (learning analytics)

Potential disadvantages of computer-marked assessment
- May encourage a surface approach to learning
- May not be authentic
- There is no tutor to interpret the student's answer and to deliver personalised feedback

Comments from students
"I discovered, through finding an error in the question, that not everybody was given the same questions. I thought this was really unfair especially as they failed to mention it at any point throughout the course."
"I find them petty in what they want as an answer. For example, I had a question that I technically got numerically right with the correct units only I was putting the incorrect size of the letter. So I should have put a capital K instead of a lower case k or vice versa, whichever way round it was. Everything was correct except this issue."
Thankfully, these students were happy with computer-marked assessment in general, but particular questions had put them off.

Comments from students
"A brilliant tool in building confidence"
"It's more like having an online tutorial than taking a test"
"Fun"
"It felt as good as if I had won the lottery"
"Not walkovers, not like an American kind of multiple-choice where you just go in and you have a vague idea but you know from the context which is right"
And from a tutor:
"Even though each iCMA is worth very little towards the course grade my students take them just as seriously as the TMAs. This is a great example of how online assessment can aid learning."

Not all computer-marked assessment is the same. To improve quality:
- Think about your assessment design: why do you want to use computer-marked assessment, and how will you integrate it?
- Use appropriate question types
- Write better questions
- Use an iterative design process

Why have I used computer-marked assessment?
In my work, the focus has been on 'assessment for learning', so feedback and giving students a second and third attempt is important (Gibbs & Simpson, 2004-5). We aim to 'provide a tutor at the student's elbow' (Ross et al., 2006). However, a summative interactive computer-marked assignment that ran for the first time in 2002 is still in use, and has been used by around 15,000 students.

A module website, showing the links to a quiz

Overall feedback on a diagnostic quiz

Use appropriate question types
- Multiple-choice
- Multiple-response
- Drag and drop
- Matching
- True/false
- Hotspot
- Free text: for numbers, letters, words, sentences
Note: you need to think about what your e-assessment system supports (e.g. OpenMark, PMatch, STACK).

A basic OpenMark question

A variant of the same question

A STACK question in Moodle
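STACK marks mathematical answers by passing them to a computer algebra system (Maxima) and testing properties such as algebraic equivalence with the teacher's model answer. Below is a minimal sketch of that idea, written in Python with SymPy rather than the Maxima engine STACK actually uses; the function name and example expressions are illustrative only.

```python
# Illustrative sketch only: STACK itself uses the Maxima CAS inside Moodle,
# not SymPy. This just shows the core idea of CAS-based answer matching.
from sympy import simplify
from sympy.parsing.sympy_parser import parse_expr

def algebraically_equivalent(student_answer: str, model_answer: str) -> bool:
    """Return True if the two expressions simplify to the same thing."""
    student = parse_expr(student_answer)
    model = parse_expr(model_answer)
    # Equivalent expressions differ by something that simplifies to zero.
    return simplify(student - model) == 0

# Both forms of the first answer earn the mark, even though they look different.
print(algebraically_equivalent("(x + 1)**2", "x**2 + 2*x + 1"))  # True
print(algebraically_equivalent("2*x + 1", "x**2 + 2*x + 1"))     # False
```

The point of this style of marking is that the student can enter any correct form of the expression, not just the one the author happened to type.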

A short-answer question (PMatch)
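PMatch marks short free-text answers by matching them against patterns written by the question author, with allowance for synonyms, misspellings and word order. The sketch below shows the flavour of rule-based answer matching in plain Python; it is not PMatch's actual rule syntax, and the synonym lists and example responses are invented for illustration.

```python
# Illustrative sketch of rule-based short-answer matching. PMatch's real rule
# language is richer: it handles misspellings, word order, proximity and
# negation. The concepts and synonym lists here are invented examples.
import re

SYNONYMS = {
    "rises": {"rises", "increases", "goes up", "gets bigger"},
    "temperature": {"temperature", "temp"},
}

def mentions(response: str, concept: str) -> bool:
    """True if the response contains any accepted phrasing of the concept."""
    text = response.lower()
    return any(re.search(r"\b" + re.escape(phrase) + r"\b", text)
               for phrase in SYNONYMS[concept])

def mark_answer(response: str) -> bool:
    # Accept the answer only if it mentions both required concepts.
    return mentions(response, "temperature") and mentions(response, "rises")

print(mark_answer("The temperature goes up"))  # True
print(mark_answer("It gets hotter"))           # False: no accepted phrasing
```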

Different question types in use

Top ten Moodle question types worldwide (Hunt, 2012):

  Question type            Number
  Multiple choice          40,177,…
  True/false                6,462,…
  Short-answer              3,379,…
  Essay                     2,321,…
  Matching                    551,…
  Multi-answer                341,…
  Description                 149,…
  Numerical                   138,…
  Calculated                  103,…
  Drag-and-drop matching       26,…
  TOTAL                    53,675,…

Hunt, T. (2012). Computer-marked assessment in Moodle: Past, present and future. Paper presented at the International CAA Conference, Southampton, July 2012.

Constructed response or selected response?
The most serious problem with selected response questions is their lack of authenticity: "Patients do not present with five choices" (Mitchell et al., 2003, quoting Veloski, 1999). But even relatively simple selected response questions can lead to "moments of contingency" (Black & Wiliam, 2009), enabling "catalytic assessment": the use of simple questions to trigger deep learning (Draper, 2009).

A quiz for you
Q1. En mnoge est umpitter dan en bfeld because
A. it is red
B. it is blue
C. it is yellow
D. it is smaller so will fit through the gap between the house and the wall
E. it is green

A quiz for you
Q2. The bfeld links to the mnoge by means of a
A. elland
B. angaster
C. tanag
D. introdoll
E. ussop

A quiz for you
Q3. Which two of the following are correct?
1. A is bigger than B
2. B is bigger than C
3. A is bigger than C
4. A is smaller than B
5. B is smaller than C

Our advice to question authors
- Think about how you want your assessment to be embedded within the module
- Think about what question type to use (selected response or constructed response)
- Make sure that your question is carefully worded
- Think about your feedback
- Think about providing variants of the questions
- Check your questions
- Get someone else to check your questions
- Modify your questions in the light of student behaviour the first time they are used

Monitor question performance
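One routine way to monitor question performance is classical item analysis of the response data the system already records: an item's facility (the proportion of students answering it correctly) flags questions that are too easy or too hard, and its discrimination (the correlation between success on the item and overall score) flags questions that fail to separate stronger from weaker students. Below is a minimal sketch of those two statistics, assuming a simple 0/1 matrix of item scores; the data values are invented for illustration.

```python
# Minimal item-analysis sketch: facility and discrimination from a matrix of
# item scores (1 = correct, 0 = incorrect), one row per student. The data
# below is invented purely for illustration.
from statistics import mean, pstdev

responses = [  # columns: Q1, Q2, Q3
    [1, 1, 0],
    [1, 0, 0],
    [1, 1, 1],
    [0, 0, 0],
    [1, 1, 1],
]

def facility(item: int) -> float:
    """Proportion of students who answered this item correctly."""
    return mean(row[item] for row in responses)

def discrimination(item: int) -> float:
    """Point-biserial style correlation between the item and the total score."""
    totals = [sum(row) for row in responses]
    scores = [row[item] for row in responses]
    sx, st = pstdev(scores), pstdev(totals)
    if sx == 0 or st == 0:
        return 0.0
    cov = mean(s * t for s, t in zip(scores, totals)) - mean(scores) * mean(totals)
    return cov / (sx * st)

for q in range(3):
    print(f"Q{q + 1}: facility={facility(q):.2f}, discrimination={discrimination(q):.2f}")
```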

When do students do quizzes? (overall engagement)

When do students do quizzes? (impact of hard deadlines)

When do students do quizzes? (typical patterns of use)

So we have done quite well. But writing good questions takes a lot of time, and therefore money. Two possible solutions:
- Use machine learning to develop the answer matching, especially for short-answer free-text questions (see the sketch below)
- Share questions
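One way to reduce the authoring burden for short-answer free-text questions is to learn the answer matching from responses that human markers have already graded. The sketch below uses scikit-learn to train a simple bag-of-words classifier; the example answers are invented, and this is only an illustration of the idea rather than the matching approach actually used in the systems described in this talk.

```python
# Minimal sketch: learn to mark short free-text answers from a set of
# human-marked examples. The example answers below are invented; a real
# marking model would need far more data and careful evaluation.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

marked_answers = [
    ("the temperature rises so the gas expands", 1),
    ("heating makes the gas expand", 1),
    ("the gas gets bigger when it is heated", 1),
    ("the gas contracts", 0),
    ("nothing happens to the gas", 0),
    ("the gas gets colder", 0),
]
texts, marks = zip(*marked_answers)

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, marks)

# Predict a mark (and a confidence) for a new, unseen response.
new_response = ["warming the gas causes it to expand"]
print(model.predict(new_response))        # e.g. [1]
print(model.predict_proba(new_response))  # probability of each mark
```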

Why don't we collaborate more?
"Sharing questions is one of those things which is easy to say we'd like but turns out to be very difficult in practice."
- Some questions are system dependent (so we need interoperability: Question and Test Interoperability, QTI)
- Questions may be context dependent, e.g. they refer to other resources or assume particular previous knowledge
- Is a solution to share questions and allow others to edit them for their own use?
Note: questions may be confidential (especially if in high-stakes summative use).

How far should we go?
It is technically possible to get good answer matching for some quite sophisticated question types, e.g. essays. But Perelman (2008) trained students to obtain good marks for a computer-marked essay by using "tricks". Computer-marked assessment is not a panacea.
"If course tutors can be relieved of the drudgery associated with marking relatively short and simple responses, time is freed for them to spend more productively, perhaps in supporting students in the light of misunderstandings highlighted by the e-assessment or in marking questions where the sophistication of human judgement is more appropriate" (Jordan & Mitchell, 2009).

References
Black, P. & Wiliam, D. (2009). Developing the theory of formative assessment. Educational Assessment, Evaluation and Accountability, 21(1).
Boud, D. (1995). Assessment and learning: Contradictory or complementary? In P. T. Knight (Ed.), Assessment for learning in higher education. London: Kogan Page.
Draper, S. (2009). Catalytic assessment: Understanding how MCQs and EVS can foster deep learning. British Journal of Educational Technology, 40(2).
Gibbs, G. (2006). Why assessment is changing. In C. Bryan & K. Clegg (Eds.), Innovative Assessment in Higher Education. London: Routledge.
Gibbs, G. & Simpson, C. (2004-5). Conditions under which assessment supports students' learning. Learning and Teaching in Higher Education, 1.
Mitchell, T., Aldridge, N., Williamson, W., & Broomhead, P. (2003). Computer based testing of medical knowledge. In Proceedings of the 7th International Computer-Assisted Assessment (CAA) Conference, Loughborough, 8th-9th July 2003.
Perelman, L. (2008). Information illiteracy and mass market writing assessments. College Composition and Communication, 60(1).
Ridgway, J., McCusker, S., & Pead, D. (2004). Literature review of e-assessment. Bristol: Futurelab.
Snyder, B. R. (1971). The hidden curriculum. New York: Alfred A. Knopf.

References to systems discussed
Hunt, T. J. (2012). Computer-marked assessment in Moodle: Past, present and future. In Proceedings of the 2012 International Computer Assisted Assessment (CAA) Conference, Southampton, 10th-11th July 2012.
JISC (2010). Effective assessment in a digital age. Retrieved from edo/programmes/elearning/assessment/digiassess.aspx
Jordan, S. E. & Mitchell, T. (2009). E-assessment for learning? The potential of short-answer free-text questions with tailored feedback. British Journal of Educational Technology, 40(2).
Ross, S. M., Jordan, S. E. & Butcher, P. G. (2006). Online instantaneous and targeted feedback for remote learners. In C. Bryan & K. Clegg (Eds.), Innovative Assessment in Higher Education. London: Routledge.
Sangwin, C. J. (2013). Computer aided assessment of mathematics. Oxford: Oxford University Press.

Much of what I have said is discussed in more detail in:
Jordan, S. E. (2014). E-assessment for learning? Exploring the potential of computer-marked assessment and computer-generated feedback, from short-answer questions to assessment analytics. PhD thesis, The Open University.

Sally Jordan website: blog: