E-assessment for learning? Short-answer free-text questions with tailored feedback Sally Jordan ENAC Potsdam August 2008 Centre for Open Learning of Mathematics, Science, Computing and Technology (COLMSCT)

The UK Open University Supported distance learning; 150,000 students, mostly studying part-time; Undergraduate courses are completely open entry, so students have a wide range of previous qualifications; Normal age range from 18 to ??; 10,000 of our students have declared a disability of some sort; 25,000 of our students live outside the UK.

Implications for assessment Within the Open University context, learners are geographically separated and we cannot assume that they will meet their tutor in order to receive feedback. We are seeking to provide students with feedback on assessment tasks which is personalised and received in time to be used in future learning. We are using small regular assessment tasks to help students to pace their study. We are also using assessment tasks to encourage students to reflect on their learning and to enter into informed discussion with their tutor. So perhaps e-assessment can offer benefits for learning…?

The OpenMark system Uses a range of question types, going far beyond what is possible with multiple choice; Question types include: numerical input, text input, drag and drop, hotspot; Students are allowed three attempts with an increasing amount of teaching guidance, wherever possible tailored to the student’s previous incorrect answer; Different students receive variants of each question so each has a unique assignment. OpenMark has been incorporated into Moodle, the open source virtual learning environment being used by the Open University.
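The three-attempt pattern described above can be sketched in a few lines of Python. This is an illustrative sketch only, not the real OpenMark API: the names `run_question`, `check_answer`, `hints` and `get_response` are all hypothetical.

```python
# Hypothetical sketch of OpenMark-style behaviour: up to three attempts,
# with more teaching guidance revealed after each incorrect attempt, and
# feedback tailored to the specific wrong answer preferred when available.

def run_question(check_answer, hints, get_response, max_attempts=3):
    """Offer up to max_attempts tries at a question.

    check_answer(response) -> (is_correct, tailored_feedback_or_None)
    hints: increasingly detailed generic hints, one per failed attempt.
    get_response(prompt) -> the student's answer as a string.
    Returns True if the student answered correctly within the allowance.
    """
    for attempt in range(max_attempts):
        response = get_response("Your answer: ")
        correct, tailored = check_answer(response)
        if correct:
            return True
        # Prefer feedback tailored to this particular incorrect answer;
        # otherwise fall back to the generic hint for this attempt number.
        print(tailored or hints[min(attempt, len(hints) - 1)])
    return False
```

In OpenMark itself the escalation typically ends with a worked answer after the third attempt; the sketch models that by making the last hint the fullest one.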

Pushing the boundaries… We wanted to be able to ask questions requiring free-text answers of a phrase or sentence in length; This requires us to mark many different answers as correct… and many different answers as incorrect… We are working with a commercially provided authoring tool (from Intelligent Assessment Technologies Ltd.); The system copes well with poor spelling and, usually, with poor grammar; It can handle answers in which word order is significant and it accurately marks negated forms of a correct answer.
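To make the requirements concrete, here is a deliberately crude toy matcher, assuming nothing about the real IAT engine: it tolerates minor misspellings via `difflib` similarity and applies a blunt negation check so that a negated form of a correct answer is not accepted. All names here (`matches_template`, `NEGATIONS`) are invented for illustration.

```python
# Toy sketch of template-based answer matching. The real IAT system is
# far more sophisticated (e.g. it handles word order); this only shows
# the two simplest requirements: spelling tolerance and negation.
import difflib

NEGATIONS = {"not", "no", "never", "doesn't", "isn't", "won't"}

def word_matches(word, target, cutoff=0.8):
    """Tolerate minor misspellings by string-similarity ratio."""
    return difflib.SequenceMatcher(None, word, target).ratio() >= cutoff

def matches_template(response, required_words):
    """Correct iff every required word appears (allowing misspellings)
    and the response is not a negated form of the answer."""
    words = response.lower().split()
    if any(w in NEGATIONS for w in words):
        return False
    return all(any(word_matches(w, req) for w in words)
               for req in required_words)
```

For example, "the curent increses" would still match the template ["current", "increases"], while "the current does not increase" would not.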

Novel features The IAT questions sit within OpenMark and students are offered three attempts with increasing feedback; We provide tailored feedback on both incorrect and incomplete responses; We write the answer matching ourselves using a template-based authoring tool; We use student responses to developmental versions of the questions, themselves delivered online, to improve the answer matching.

Evaluation 1: User lab observations Six students were observed in June 2007; They reacted to the questions in interesting ways; most gave their answers as phrases or in note form, even when it had been suggested that answers should be given as a complete sentence. Why? One student said ‘I’m going to give my answers in the same way as I would for a tutor-marked assignment’ – and he did exactly that, composing his answers carefully as grammatically correct sentences.

Evaluation 1: User lab observations (continued) Students’ use of feedback was also variable; Some students read the feedback carefully, scrolling across the text, nodding, making appropriate comments, referring to the course text; These students were able to successfully amend their previous answer, i.e. to learn from the feedback provided; But some students made no use of the feedback provided, especially when an incorrect answer had been marked as correct.

Evaluation 2: Human-computer marking comparison The computer’s marking was compared with that of six human markers; For most questions the computer’s marking was indistinguishable from that of the human markers; For all questions, the computer’s marking was closer to that of the question author than that of some of the human markers; The computer was not always ‘right’, but neither were the human markers.
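A comparison of this kind can be expressed, at its simplest, as percentage agreement between markers. The sketch below is illustrative only, assumes binary right/wrong marks, and is not the method actually used in the study; the function names are invented.

```python
# Illustrative sketch: per-marker percentage agreement with the computer.

def agreement(marks_a, marks_b):
    """Fraction of responses on which two markers give the same mark."""
    assert len(marks_a) == len(marks_b)
    same = sum(a == b for a, b in zip(marks_a, marks_b))
    return same / len(marks_a)

def compare_markers(computer, humans):
    """computer: list of 0/1 marks for a set of responses.
    humans: dict mapping marker name -> list of 0/1 marks.
    Returns each human marker's agreement with the computer."""
    return {name: agreement(computer, marks)
            for name, marks in humans.items()}
```

The same pairwise calculation applied between the human markers themselves shows why ‘the computer was not always right, but neither were the human markers’: human-human agreement is also below 100%.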

Almost final thoughts The computer’s answer matching is excellent, probably because we have used real student responses to develop it; This matters even in purely formative use – we want students to receive the correct message; A few of these questions have been incorporated into summative (low stakes) assessments. Do students trust the computer to mark their work? So far, we have no evidence that they don’t (they are aware that there is always a human arbitrator).

Final thoughts Most students are intrigued and impressed by the technology; A few would prefer multiple choice questions: ‘you know the answer is always there somewhere’; How useful are e-assessment questions of this type in supporting student learning? What sort of feedback is most useful to students? How far is it appropriate to go?

Acknowledgments The Centre for Open Learning of Mathematics, Science, Computing and Technology (COLMSCT), especially Barbara Brockbank, Phil Butcher and Laura Hills; Tom Mitchell of Intelligent Assessment Technologies Ltd.

Our demonstration questions are at:
Or if you want more…
For more about OpenMark:
For more about Intelligent Assessment Technologies Ltd.:

Sally Jordan Centre for Open Learning of Mathematics, Science, Computing and Technology (COLMSCT) The Open University Walton Hall Milton Keynes MK7 6AA