eAssessment
Colin Milligan, Heriot-Watt University

Randy Bennett, ETS: "Like many innovations in their early stages, today's computerised tests automate an existing process without reconceptualising it to realise the dramatic improvements that the innovation could allow."

Bennett's Three Stages
1. Resemble paper tests.
2. New formats: multimedia and constructed responses.
3. Simulations and virtual reality: embedding assessment within learning.

Summative and Formative
Summative:
– Efficient for large student numbers
– Consistent marking
– Integration with institutional systems
– Immediate feedback on performance
Formative:
– Immediate feedback, linked to remediation
– Individualisation
– Opportunity for multiple attempts
– Opportunity for innovation

It's all Multiple Choice
For: easy to author (!), easy to mark, supported by VLEs.
Against: assesses lower-order skills, guessability, not suited to some disciplines.

Beyond Multiple Choice
– Numerical assessment
– Authentic assessment
– Short-answer questions
– Discussion fora
– Essays
– Peer assessment

Numerical Assessment
– Randomisation
– Expression evaluation
– Partial credit
– Avoids pitfalls of closed 'objective-type' questions
– Matched to current assessment
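A minimal sketch of how randomisation, tolerance-based marking and partial credit might fit together. The Ohm's-law topic, the 1%/5% tolerances and the half-credit value are all invented for illustration; the slides do not specify them.

```python
import random

def make_question(seed):
    """Generate a randomised numerical question (illustrative Ohm's-law
    example): each student gets different numbers, same skill tested."""
    rng = random.Random(seed)
    current = rng.randint(2, 9)        # amps
    resistance = rng.randint(10, 99)   # ohms
    text = (f"A {resistance} ohm resistor carries {current} A. "
            "What is the voltage across it?")
    return text, float(current * resistance)

def mark(answer, correct, tolerance=0.01):
    """Full credit within 1% of the correct value, partial credit (0.5)
    within 5%, otherwise zero. Scheme values are illustrative."""
    error = abs(answer - correct) / abs(correct)
    if error <= tolerance:
        return 1.0
    if error <= 5 * tolerance:
        return 0.5
    return 0.0
```

Seeding per student makes each paper different but reproducible for moderation.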

Authentic Assessment
Instead of testing knowledge recall, tests interpretation and application:
– Utilise data
– Manipulate a simulation
– Perform a task

TRIADS (University of Derby)

Pushing the Boundaries of Computer Marking
Source: Tom Mitchell, Intelligent Assessment Technologies

Short Answer Questions: Intelligent Assessment
– Create model answers based on analysis of sample responses
– Acceptable accuracy from around 50 responses
– Optimum accuracy from around 200 samples
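Marking engines of the kind the slide describes use natural-language processing over model answers built from sample responses; the toy below only measures keyword overlap against hand-written model answers, purely to illustrate the idea of scoring free text against templates. The threshold and example answers are invented.

```python
def mark_short_answer(response, model_answers, threshold=0.6):
    """Toy free-text marker: accept a response if enough of any one
    model answer's keywords appear in it. Real systems parse syntax
    and synonyms; this sketch is illustration only."""
    words = set(response.lower().split())
    best = 0.0
    for model in model_answers:
        mwords = set(model.lower().split())
        if mwords:
            best = max(best, len(words & mwords) / len(mwords))
    return best >= threshold
```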

Mean Marking Accuracy
– Increased consistency
– Marking schemes can be varied to suit content
– Built-in moderation

Essays: e-Rater
– Claims to assess quality of argument, style, examples, etc.
– Requires thousands of sample essays to develop model answers
– 'Off the shelf' essay questions

Assessing Discussion Forum Activity
– Appropriate for learning environments
– Assessing increases forum activity
– Contributions can be assessed, but be careful with the marking scheme, e.g.:
  – Baseline marks for activity
  – Extra marks for worthwhile contributions, e.g. those followed up by others
– Explain the marking scheme clearly
– Human marking required: labour-intensive
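The scheme the slide outlines (baseline marks for activity, extra marks for contributions others follow up) could be expressed as below; the point values and cap are invented for illustration.

```python
def forum_mark(contributions, baseline=1, bonus=2, cap=10):
    """contributions: one dict per post with a 'replies' count.

    Baseline mark for each post; a bonus when the post attracted
    follow-up replies. Capped so volume alone cannot dominate.
    All values are illustrative, not from the talk."""
    mark = 0
    for post in contributions:
        mark += baseline
        if post["replies"] > 0:
            mark += bonus
    return min(mark, cap)
```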

Low-Stakes Peer Assessment
– Students submit a task, e.g. a critique of a paper
– Submissions are anonymised and sent out to others in the group
– The computer handles organisation, collation and distribution of content and feedback
– Encourages reflection
– Combines well with discussion forum activities
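Anonymised distribution to peers can be organised with a simple balanced ring: shuffle the group, then each student reviews the next k students round the ring, so everyone writes and receives exactly k reviews and nobody marks their own work. Function and parameter names are illustrative.

```python
import random

def assign_reviewers(students, k=2, seed=0):
    """Balanced peer-review assignment: each student reviews the next
    k students on a shuffled ring, never themselves. Requires k < n."""
    assert 0 < k < len(students)
    rng = random.Random(seed)
    order = students[:]
    rng.shuffle(order)
    n = len(order)
    assignments = {}
    for i, author in enumerate(order):
        assignments[author] = [order[(i + j) % n] for j in range(1, k + 1)]
    return assignments
```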

Item Banking and Analysis
– MCQs will predominate
– Question banks
– Analysis highlights good and bad questions
– Analysis of question quality via Item Response Theory (IRT):
  – Difficulty
  – Discrimination
  – Guessability

Item Characteristic Curve (ICC)

Difficulty: the displacement
A more difficult question is less likely to be answered correctly by students of a given ability; on the ICC, greater difficulty shifts the curve to the right.

Discrimination: the slope
High discrimination is good only if the question's difficulty is matched correctly to student ability.

Guessability: the baseline
Poor distractors can raise the effective guessability, lifting the lower asymptote of the curve.
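Difficulty (displacement), discrimination (slope) and guessability (baseline) are the b, a and c parameters of the three-parameter logistic (3PL) model from Item Response Theory, which generates the ICC:

```python
import math

def icc_3pl(theta, a, b, c):
    """Three-parameter logistic item characteristic curve:

        P(theta) = c + (1 - c) / (1 + exp(-a * (theta - b)))

    theta: student ability; a: discrimination (slope);
    b: difficulty (displacement along the ability axis);
    c: guessing parameter (lower asymptote)."""
    return c + (1 - c) / (1 + math.exp(-a * (theta - b)))
```

At theta = b the curve passes through the midpoint c + (1 - c) / 2, and as ability falls the probability of a correct answer approaches c rather than zero, which is why poor distractors (higher effective c) flatten the information the item gives.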

Question Design
– All assessment can suffer from poor question design
– Authoring systems make it easy to create (poor) questions
– Multimedia presentation provides more opportunity for poor design

Standards
Original source: Charles Duncan, Intrallect

Standards
Question and Test Interoperability (QTI):
– Reuse questions
– Use others' questions
– Change delivery systems
Other standards:
– Integrate with other systems
– Embed questions within content
– Identify questions by metadata
– Match questions to content
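To make interoperable questions concrete, here is a sketch that emits a QTI-flavoured multiple-choice item as XML. The element names echo IMS QTI, but the output is a simplified illustration, not a conformant QTI document.

```python
import xml.etree.ElementTree as ET

def choice_item(identifier, prompt, choices, correct_id):
    """Serialise a single-choice question as QTI-style XML.

    choices: list of (choice_id, choice_text) pairs.
    Simplified sketch: real QTI items carry namespaces, response
    processing templates and more metadata."""
    item = ET.Element("assessmentItem", identifier=identifier)
    resp = ET.SubElement(item, "responseDeclaration", identifier="RESPONSE")
    correct = ET.SubElement(resp, "correctResponse")
    ET.SubElement(correct, "value").text = correct_id
    body = ET.SubElement(item, "itemBody")
    interaction = ET.SubElement(body, "choiceInteraction",
                                responseIdentifier="RESPONSE")
    ET.SubElement(interaction, "prompt").text = prompt
    for cid, ctext in choices:
        ET.SubElement(interaction, "simpleChoice", identifier=cid).text = ctext
    return ET.tostring(item, encoding="unicode")
```

Because the question lives in a neutral format rather than inside one VLE's database, it can be banked, shared and moved between delivery systems.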

Student Attitudes
– Resistant to change, so manage it
– Sell the benefits:
  – Immediate access to results
  – Remediation/feedback
– Carefully integrate (don't bolt on) and explain the rationale
– Provide opportunity for practice

Staff Attitudes
– Resistant to change, so manage it
– Sell the benefits:
  – Automation
  – Access to others' questions
  – Tried and tested assessments
  – Authenticity of assessment
  – Long-term efficiency

Security, Plagiarism and Reliability
– Computer-based assessments are subjected to far more rigorous scrutiny
– See assessed work in the context of the whole course
– Adapt assessment to integrate understanding and bring in personal experience
– Get students to rate themselves
– Always have a backup plan

Implementation
– Cost is front-loaded:
  – Technology investment
  – Assessment design (pedagogical staff development)
  – Planning and piloting (systems, procedures and policies; convince the stakeholders)
– Use computers where appropriate, for what they are good at
– The important issues go beyond the technology

Where are we going?
– Assessment integrated with learning, blurring the distinction
– Adaptive and diagnostic assessment
– Rich feedback, tailored study
– Portfolios
– Assessment of teamwork
– Can the curriculum change? Will the curriculum change?

Further Information
– CAA Conference, Loughborough
– LTSN Generic Centre CAA CAA/index.phtml
– Slides and Notes for this Talk

Colin Milligan