Designing interactive assessments to promote independent learning Sally Jordan, Phil Butcher & Richard Jordan The Open University Effective Assessment in a Digital Age Workshop 3 rd February 2011

The context: the Open University. Supported distance learning; 180,000 students, mostly studying part-time; undergraduate modules are completely open entry, so students have a wide range of previous qualifications; normal age range from 16 to ??; 10,000 of our students have declared a disability of some sort; 25,000 of our students live outside the UK.

Case study presentation: What we did; Why we did it; How we did it; Your chance to have a go; Discussion. Links: S104 website; iCMA47.

Why? We use interactive computer-marked assignments (iCMAs) alongside tutor-marked assignments (TMAs) to: provide instantaneous feedback – and an opportunity for students to act on that feedback; provide 'little and often' assessment opportunities and so help students to pace their studies; and act as 'a tutor at the student's elbow'. We also use iCMAs for diagnostic purposes. We use a range of question types, going beyond those where students select from pre-determined answers to those where they have to write in their own words.

How? Most of our questions are written in OpenMark and sit within the Moodle virtual learning environment. For short-answer free-text questions, we initially used answer-matching software provided by Intelligent Assessment Technologies (IAT), sitting within OpenMark. The software is based on the Natural Language Processing technique of 'Information Extraction'. Key point: real student responses were used in developing the answer matching.

The IAT software represents mark schemes as templates; synonyms can be added and verbs are usually lemmatised.
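The transcript doesn't show what such a template looks like, so here is a toy sketch of the idea only, with made-up synonym sets and a deliberately crude suffix-stripping lemmatiser; none of this is the actual IAT implementation.

```python
# Toy sketch of template matching with synonyms and lemmatisation.
# The synonym sets and the suffix-stripping rule are invented for
# illustration; this is NOT the IAT software's approach in detail.

SYNONYMS = {
    "increase": {"increase", "rise", "grow"},
    "temperature": {"temperature", "heat"},
}

def lemmatise(word):
    """Very crude suffix stripping, standing in for real lemmatisation."""
    for suffix in ("ing", "ed", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[:-len(suffix)]
    return word

def matches_template(response, required):
    """True if every required concept (or a synonym) appears in the
    response after lemmatisation."""
    words = {lemmatise(w) for w in response.lower().split()}
    return all(words & SYNONYMS.get(concept, {concept})
               for concept in required)

print(matches_template("the temperature increases", ["increase", "temperature"]))  # True
```

Real lemmatisation and synonym handling are, of course, far more sophisticated; the point is only that a mark-scheme template reduces to "these concepts, in some surface form, must be present".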

Human-computer marking comparison The computer marking was compared with that of 6 human markers; For most questions the computer’s marking was indistinguishable from that of the human markers; For all questions, the computer’s marking was closer to that of the question author than that of some of the human markers; The computer was not always ‘right’, but neither were the human markers.

[Table: for each of questions A to G – the number of responses in the analysis; the percentage of responses where the human markers agreed with the question author (range and mean across the 6 human markers); and the percentage of responses where the computer marking agreed with the question author. Figures not preserved in the transcript.]

Computer-computer marking comparison An undergraduate student (not of computer science) developed answer matching using two algorithmically based systems, Java regular expressions and OpenMark PMatch; These are not simple ‘bag of words’ systems; Student responses were used in the development of the answer matching, as had been the case for the linguistically based IAT system; The results were compared.
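The comparison used Java regular expressions; the same style of rule can be sketched with Python's re module. The pattern and the keyword alternatives below are invented for illustration and are not taken from the study.

```python
import re

# Hypothetical regex-based answer-matching rule: accept responses that
# mention an intensity word plus both "pressure" and "temperature", in
# any order. Keyword choices are illustrative assumptions only.
PATTERN = re.compile(
    r"(?=.*\b(?:high|great|intense)\b)"   # lookahead: an intensity word
    r"(?=.*\bpressures?\b)"               # lookahead: pressure(s)
    r"(?=.*\btemperatures?\b)",           # lookahead: temperature(s)
    re.IGNORECASE,
)

def accept(response):
    return bool(PATTERN.search(response))

print(accept("High temperatures and high pressure deep underground"))  # True
print(accept("because of erosion"))                                    # False
```

Lookaheads make the word order irrelevant, which is one reason regular expressions can cope with more than a naive 'bag of words' check.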

[Table: for each of questions A to G – the number of responses in the set, and the percentage of responses where the computer marking agreed with the question author, comparing the computational-linguistics system (IAT) against algorithmic manipulation of keywords (OpenMark PMatch and Java regular expressions). Figures not preserved in the transcript.]

Recent work: We repeated the computer-computer marking comparison – PMatch did even better. We have introduced a spellchecker into PMatch. We have now transferred the answer matching for our 'live' questions from IAT to PMatch. Similar software will be available as part of core Moodle from late 2011.
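The slides don't say how the PMatch spellchecker works. One minimal approach, sketched here purely as an assumption using Python's standard-library difflib, is to snap unrecognised words onto the nearest keyword by string similarity before matching.

```python
from difflib import get_close_matches

# Hypothetical keyword list for a single question; not taken from PMatch.
KEYWORDS = ["pressure", "temperature", "high", "heat"]

def correct(word, cutoff=0.8):
    """Snap a possibly misspelt word to the closest known keyword;
    leave it unchanged if nothing is similar enough."""
    matches = get_close_matches(word.lower(), KEYWORDS, n=1, cutoff=cutoff)
    return matches[0] if matches else word

print(correct("temprature"))  # temperature
print(correct("granite"))     # granite
```

The cutoff matters: set too low, it would 'correct' genuinely wrong answers into right ones, which is exactly what an assessment spellchecker must avoid.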

PMatch is an algorithmically based system, so a rule might be something like: accept answers that include the words 'high', 'pressure' and 'temperature' (or synonyms), separated by no more than three words. This is expressed as:

    else if (m.match("mowp3", "high|higher|extreme|inc&|immense_press&|compres&|[deep_burial]_temp&|heat&|[hundred|100_degrees]")) {
        matchMark = 1;
        whichMatch = 9;
    }

10 rules of this type match 99.9% of student responses.
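A rough sketch of the 'keywords or synonyms separated by no more than three words' idea, written in Python rather than PMatch's own pattern syntax. The synonym sets are invented, and this simplified version takes the first occurrence of each concept and checks them in order, which is cruder than what PMatch actually does.

```python
import re

# Invented synonym sets for illustration; not PMatch's vocabulary.
SYNONYMS = {
    "high": {"high", "higher", "extreme", "immense"},
    "pressure": {"pressure", "pressures", "compression"},
    "temperature": {"temperature", "temperatures", "heat"},
}

def words_within(response, concepts, max_gap=3):
    """True if a synonym of each concept occurs, in order, with at most
    max_gap other words between consecutive matches (greedy sketch:
    only the first occurrence of each concept is considered)."""
    words = re.findall(r"[a-z]+", response.lower())
    pos = -1
    for concept in concepts:
        for i in range(pos + 1, len(words)):
            if words[i] in SYNONYMS[concept]:
                if pos >= 0 and i - pos - 1 > max_gap:
                    return False  # found it, but too far from the last keyword
                pos = i
                break
        else:
            return False  # concept (and its synonyms) absent
    return True

print(words_within("high temperature and pressure",
                   ["high", "temperature", "pressure"]))  # True
```

The proximity constraint is what separates this from a plain 'bag of words' check: the keywords must occur close together, as they would in a genuine explanation.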

Have a go: Follow the instructions on Worksheet 1, using the student responses on Worksheet 2, to suggest rules for answer matching and feedback. Notes after the workshop: The rules suggested at the JISC workshop on 3rd Feb gave a 100% match (well done!). Our rules for all the student responses available are given in Worksheet 3. It was suggested that this question could be reframed as a multiple-choice question where students are given a picture of a slide and asked to indicate the position. This is a very good point, though you'd miss the students who thought the kinetic energy was greatest as the child climbed the steps.

Benefits: For students – instantaneous feedback on non-trivial e-assessment tasks. For associate lecturer staff – relief from the drudgery of marking routine responses, and more time to spend supporting students in other ways. For module team staff – knowledge that marking has been done consistently and quickly. For the institution – cost savings, and more information about student misunderstandings.

A proviso: Our answer-matching accuracy is based on the use of hundreds of student responses, and there is an overall financial saving because the modules are studied by thousands of students per year and have a lifetime of several years. Our approach may be a less practical solution for smaller student numbers. However, monitoring student responses to e-assessment questions of all types remains important.

Overall conclusions: Simple pattern-matching software has been shown to be very effective. Very small changes can make a huge difference to the effectiveness of an innovation. Student perception is important. It is really important to monitor actual student responses and to learn from them.

How far is it appropriate to go? I don't see online assessment as a panacea; some learning outcomes are easier than others to assess in this way. Free-text questions require students to construct a response, but there still need to be definite 'right' and 'wrong' answers (though not necessarily a single right answer). However, online assessment provides instantaneous feedback and has been shown to be more accurate than human markers. It can free up human markers for other tasks, and it also has huge potential for diagnostic use.

For further information: Jordan, S. & Mitchell, T. (2009) E-assessment for learning? The potential of short free-text questions with tailored feedback. British Journal of Educational Technology, 40(2). Butcher, P.G. & Jordan, S.E. (2010) A comparison of human and computer marking of short free-text student responses. Computers & Education, 55.

Useful links: PMatch demonstration; 'Are you ready for S104?' (diagnostic quiz, showing a range of question types); OpenMark examples site; Intelligent Assessment Technologies (IAT).

Sally Jordan, Staff Tutor in Science, The Open University in the East of England, Cintra House, 12 Hills Road, Cambridge CB2 1PF. Website and blog: profile.php?staff_id=Sally%26%26Jordan