How is evidence-based literature informing e-Assessment practice: Findings from an HEA project Denise Whitelock d.m.whitelock@open.ac.uk CALRG 2012 - dmw.

The Challenge: Assessment drives learning (Rowntree, 1987). How do e-assessment and feedback support student learning?

I hate marking, but I want the tasks and feedback to assist student learning.

HEA-funded project to:
- Consult the academic community on useful references (seminar series, survey, advisors, invited contributors)
- Prioritise evidence-based references
- Synthesise main points
For readers: academics using technology enhancement for assessment and feedback; learning technologists; managers of academic departments.

Evidence-based literature: 142 references, reviewed for:
- Technology-enhanced methods
- Use for assessment and feedback
- Type of evidence
- Ease of access (18 could not be retrieved)

Categories of evidence used:
Category 1a: Peer-reviewed generalizable study providing effect size estimates and which includes (i) some form of control group or treatment (may involve participants acting as their own control, such as before and after), and/or (ii) a blind or preferably double-blind protocol.
Category 1b: Peer-reviewed generalizable study providing effect size estimates, or sufficient information to allow estimates of effect size.
Category 2: Peer-reviewed 'generalizable' study providing quantified evidence (counts, percentages, etc.) short of allowing estimates of effect sizes.
Category 3: Peer-reviewed study.
Category 4: Other reputable study providing guidance.

Number of references recommended in each evidence category:
1a: 15 (cumulative 12.1%)
1b: 8 (cumulative 18.5%)
2: 12 (cumulative 28.2%)
3: 49 (cumulative 67.7%)
4: 40 (cumulative 100.0%)
Total: 124
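As a quick check, the cumulative percentages above can be recomputed from the raw counts; a minimal sketch in Python, using the counts from the slide:

```python
# Recompute the cumulative percentages for the evidence categories
# (counts taken from the table above; 124 references in total).
from itertools import accumulate

counts = {"1a": 15, "1b": 8, "2": 12, "3": 49, "4": 40}
total = sum(counts.values())  # 124

cumulative = [round(100 * c / total, 1) for c in accumulate(counts.values())]
print(cumulative)  # [12.1, 18.5, 28.2, 67.7, 100.0]
```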

MCQ and EVS use with large learning gains (Draper, 2009):
- Assertion-reason questions (e.g. Pauli, Heisenberg, Planck, de Broglie): learners generate reasons for and against each answer
- Confidence marking (Gardner-Medwin)
- Mazur's method of brain teasing: the role of peers and cognitive conflict
- Students create MCQs for EVS: reflection on counter-arguments before you hear them
- Students create MCQs for the final exam, which can increase exam performance

Mobile technologies and assessment:
- MCQs on PDAs (Valdivia & Nussbaum, 2009)
- Polls and instant surveys (Simpson & Oliver, 2007)
- EVS (Draper, 2009)

CAP peer assessment system, BSc Network Management & Security (Intl.), Glamorgan, Phil Davies.

Peer Assessment and the WebPA Tool, Loughborough (Loddington et al., 2009). Students self-assess and peer-assess against given criteria; the group mark is awarded by the tutor.
Students rated: more timely feedback; reflection; fair rewards for hard work.
Staff rated: time savings; administrative gains; automatic calculation. Students have faith in the administrative system.
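The peer-moderation idea behind WebPA can be sketched as follows: each member's share of the tutor-awarded group mark is scaled by the peer ratings they receive. The normalisation below is illustrative only, not the exact WebPA algorithm (which is documented by the WebPA project), and the names, ratings and group mark are invented for the example.

```python
# A minimal sketch of peer-moderated group marking in the WebPA style.
# Not the exact WebPA algorithm; the normalisation is illustrative.

def moderated_marks(group_mark, ratings):
    """ratings[assessor][assessee] -> score awarded against the criteria."""
    # Normalise each assessor's scores so every assessor carries equal weight.
    fractional = {}
    for assessor, scores in ratings.items():
        total = sum(scores.values())
        for assessee, s in scores.items():
            fractional[assessee] = fractional.get(assessee, 0) + s / total
    n_members = len({m for scores in ratings.values() for m in scores})
    n_assessors = len(ratings)
    # Scale so the average weighting is 1.0, then apply to the group mark.
    return {m: round(group_mark * f * n_members / n_assessors, 1)
            for m, f in fractional.items()}

marks = moderated_marks(60, {
    "ann": {"ann": 4, "bob": 4, "cat": 2},
    "bob": {"ann": 5, "bob": 3, "cat": 2},
    "cat": {"ann": 4, "bob": 4, "cat": 2},
})
print(marks)  # ann is rated above average, so her mark exceeds the group mark
```

In practice tools of this kind cap or damp the moderation so individual marks stay in range; the point here is only the automatic calculation that staff valued.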

Authentic assessments: e-portfolios

Candidate Assessment Records section, OCR IT Practitioner, EAIHFE, Robert Wilsdon

Building e-portfolios on a chef's course: evidence of food preparation for the e-portfolio. Modern Apprenticeship in Hospitality and Catering, West Suffolk College, Mike Mulvihill.

Sharing e-portfolios: the Netfolio concept (Barbera, 2009)
- Social constructivism: connecting e-portfolios so students share and build upon a joint body of evidence
- Trialled with 31 PhD students at a virtual university
- A control group was used; the Netfolio group obtained higher grades
- Greater visibility of the revision process and of peer assessment in the Netfolio system

MCQs: Variation on a theme (1)
- A COLA assessment used at Reid Kerr College, Paisley: a multiple response question used in one of their modules.
- A question developed using Questionmark Perception at the University of Dundee: part of a set of formative assessments for medical students.

MCQs: Variation on a theme (2)
- LAPT certainty-based marking: the UK cabinet ministers demo exercise showing feedback, University College London, Tony Gardner-Medwin
- Drug Chart Errors and Omissions, Medicines Administration Assessment, Chesterfield Royal Hospital
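Certainty-based marking of the kind Gardner-Medwin demonstrates can be sketched as below. The mark scheme used (correct/wrong marks of 1/0, 2/-2 and 3/-6 at certainty levels 1 to 3) is the one commonly cited for LAPT; treat it as illustrative rather than definitive.

```python
# Certainty-based marking sketch: students pick a certainty level (1-3)
# alongside each answer, and wrong answers at high certainty are penalised.
# Mark scheme is the commonly cited LAPT scheme; illustrative only.

CBM_MARKS = {1: (1, 0), 2: (2, -2), 3: (3, -6)}  # certainty -> (correct, wrong)

def cbm_score(responses):
    """responses: list of (correct: bool, certainty: 1-3) pairs."""
    return sum(CBM_MARKS[c][0] if ok else CBM_MARKS[c][1]
               for ok, c in responses)

# A confident wrong answer costs more than a tentative right one gains:
print(cbm_score([(True, 3), (True, 1), (False, 3)]))  # 3 + 1 - 6 = -2
```

The design choice is the point: the asymmetric penalties reward accurate self-assessment, which is why confidence marking appears under Draper's catalytic assessment methods above.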

Self-diagnosis: basic IT skills for first-year medical students (Sieber, 2009)
- Competency-based testing
- Repeating tests for revision
- Enables remedial intervention

Students want more support with assessment:
- More feedback
- Quicker feedback
- Full feedback
- User-friendly feedback
And... the National Students' Survey.

Gains from interactivity with feedback: formative assessment
- Mean effect size on standardised tests between 0.4 and 0.7 (Black & Wiliam, 1998)
- Particularly effective for students who have not done well at school (http://kn.open.ac.uk/document.cfm?docid=10817)
- Can keep students to a timescale and motivate them
How can we support our students to become more reflective learners and enter a digital discourse?
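To make the effect-size claim concrete: Cohen's d expresses a difference in group means in units of pooled standard deviation, so d between 0.4 and 0.7 means formative feedback shifted test scores by roughly half a standard deviation. A minimal sketch with invented scores (not Black and Wiliam's data):

```python
# Cohen's d: difference in means divided by the pooled standard deviation.
# Scores below are invented to illustrate a mid-range effect size.
import statistics

def cohens_d(treated, control):
    n1, n2 = len(treated), len(control)
    s1, s2 = statistics.stdev(treated), statistics.stdev(control)
    pooled = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(treated) - statistics.mean(control)) / pooled

# e.g. test scores with and without regular formative feedback
d = cohens_d([55, 60, 65, 70, 75], [51, 56, 61, 66, 71])
print(round(d, 2))  # 0.51, within the 0.4-0.7 range reported
```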

LISC, Kent University ab-initio Spanish module (Aily Fowler)
- Large student numbers on a skills-based course
- Providing sufficient formative assessment meant unmanageable marking loads
- Impossible to provide immediate feedback, leading to fossilisation of errors

The LISC solution, developed by Aily Fowler: a CALL system designed to enable students to:
- Independently practise sentence translation
- Receive immediate (and robust) feedback on all errors
- Attend immediately to the feedback (before fossilisation can occur)

How is the final mark arrived at in the LISC system? The two submissions are unequally weighted. It is best to give more weight to the first attempt: this ensures that students give careful consideration to the construction of their first answer, while still being able to improve their mark by refining it. The marks ratio can vary depending on the assessment/feedback type: the more information given in the feedback, the lower the weight the second mark should carry.

Heuristics for the final mark:
- If the ratio is skewed too far in favour of the first attempt, students are less inclined to try hard to correct non-perfect answers.
- If the ratio is skewed too far in favour of the second attempt, students exhibit less care over the construction of their initial answer.
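The weighting scheme described above amounts to a weighted average of the two attempts. The 70/30 split below is an assumption for illustration, not the ratio used at Kent: the slides only say that the first attempt should carry more weight, with richer feedback arguing for a lower second-attempt weight.

```python
# A minimal sketch of a LISC-style two-attempt mark. The 70/30 split
# is illustrative; the actual ratio varies with the feedback given.

def final_mark(first, second, first_weight=0.7):
    """Combine two attempt scores (0-100), weighting the first higher."""
    return round(first_weight * first + (1 - first_weight) * second, 1)

# Improving from 60 to 90 on the second attempt:
print(final_mark(60, 90))       # 69.0 with the illustrative 70/30 split
print(final_mark(60, 90, 0.5))  # 75.0 if the attempts were weighted equally
```

The heuristics then describe tuning `first_weight`: too high and corrections feel pointless; too low and first answers get careless.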

Free-text entry:
- McFeSPA (Kochakornjarupong & Brna, 2010)
- IAT (Jordan & Mitchell, 2009)
- Open Comment (Whitelock & Watt, 2008) http://kn.open.ac.uk/public/document.cfm?docid=11638

McFeSPA system
- Supports teaching assistants in marking and giving feedback on undergraduate computer programming assignments
- A support tool for semi-automated marking and scaffolding of feedback
- Findings suggest the feedback model would be helpful in training tutors, similar to the Open Comment findings

Audio feedback (Middleton & Nortcliffe, 2010) should be:
- Timely and meaningful
- Manageable for tutors to produce and for the learner to use
- Clear in purpose, adequately introduced and pedagogically embedded
- Technically reliable and not adversely determined by technical constraints or difficulties
- Targeted at specific students, groups or cohorts, addressing their needs with relevant points in a structured way
- Produced within the context of local assessment strategies and, if appropriate, in combination with other feedback methods, using each medium to good effect
- Brief, engaging and clearly presented, with emphasis on key points that demand a specified response from the learner
- Of adequate technical quality to avoid technical interference in the listener's experience
- Encouraging, promoting self-esteem
- Formative, challenging and motivational

Elliott's characteristics of Assessment 2.0 activities (advice for action):
- Authentic: involving real-world knowledge and skills
- Personalised: tailored to the knowledge, skills and interests of each student
- Negotiated: agreed between the learner and the teacher
- Engaging: involving the personal interests of the students
- Recognises existing skills: willing to accredit the student's existing work
- Deep: assessing deep knowledge, not memorisation
- Problem-oriented: original tasks requiring genuine problem-solving skills
- Collaboratively produced: produced in partnership with fellow students
- Peer and self assessed: involving self-reflection and peer review
- Tool supported: encouraging the use of ICT

Creating teaching and learning dialogues: towards guided learning supported by technology
- Learning to judge
- Providing reassurance
- Providing a variety of signposted routes to achieve learning goals

Key Messages
- Effective, regular online testing can encourage student learning and improve their performance in tests (JISC, 2008)
- Automated marking can be more reliable than human markers, and there is no medium effect between paper and computerised exams (Lee and Weerakoon, 2001)
- The success of assessment and feedback with technology enhancement lies with the pedagogy rather than the technology itself; technology is an enabler (Draper, 2009)

Key Messages 2
- Technology-enhanced assessment is not restricted to simple questions with clear-cut right and wrong answers; much more sophisticated questions are being used as well (Whitelock & Watt, 2008)
- The design of appropriate and constructive feedback plays a vital role in the success of assessment, especially assessment for learning (Beaumont, O'Doherty & Shannon, 2008)

Key Messages 3
- Staff development is essential to the process (Warburton, 2009)
- Prepare students to take technology-enhanced assessments by practising with similar assessments using the same equipment and methods (Shephard et al., 2006)
- The reports generated by many technology-enhanced assessment systems are very helpful in checking the reliability and validity of each test item and of the test as a whole (McKenna and Bull, 2000)

References
Beaumont, C., O'Doherty, M. and Shannon, L. (2008). Staff and student perceptions of feedback quality in the context of widening participation. Higher Education Academy. Retrieved May 2010 from http://www.heacademy.ac.uk/assets/York/documents/ourwork/research/Beaumont_Final_Report.pdf
Draper, S. (2009). Catalytic assessment: understanding how MCQs and EVS can foster deep learning. British Journal of Educational Technology, 40(2), 285-293.
JISC, HE Academy and ALT (2008). Exploring Tangible Benefits of e-Learning. Retrieved May 2010 from http://www.jiscinfonet.ac.uk/publications/info/tangible-benefits-publication
Lee, G. and Weerakoon, P. (2001). The role of computer-aided assessment in health professional education: a comparison of student performance in computer-based and paper-and-pen multiple-choice tests. Medical Teacher, 23(2), 152-157.
McKenna, C. and Bull, J. (2000). Quality assurance of computer-assisted assessment: practical and strategic issues. Quality Assurance in Education, 8(1), 24-31.

References 2
Middleton, A. and Nortcliffe, A. (2010). Audio feedback design: principles and emerging practice. In D. Whitelock and P. Brna (eds), Special Issue 'Focusing on electronic feedback: feasible progress or just unfulfilled promises?', International Journal of Continuing Engineering Education and Life-Long Learning, 20(2), 208-223.
Shephard, K., Warburton, B., Maier, P. and Warren, A. (2006). Development and evaluation of computer-assisted assessment in higher education in relation to BS7988. Assessment & Evaluation in Higher Education, 31(5), 583-595.
Strang, K.D. (2010). Measuring self-regulated e-feedback, study approach and academic outcome of multicultural university students. In D. Whitelock and P. Brna (eds), Special Issue 'Focusing on electronic feedback: feasible progress or just unfulfilled promises?', International Journal of Continuing Engineering Education and Life-Long Learning, 20(2), 239-255.
Warburton, B. (2009). Quick win or slow burn: modelling UK HE CAA uptake. Assessment & Evaluation in Higher Education, 34(3), 257-272.
Whitelock, D. and Watt, S. (2008). Reframing e-assessment: adopting new media and adapting old frameworks. Learning, Media and Technology, 33(3), 153-156.

Four Assessment Special Issues
Whitelock, D. and Warburton, W. (in press). Special Issue of International Journal of e-Assessment (IJEA), 'Computer Assisted Assessment: Supporting Student Learning'.
Brna, P. and Whitelock, D. (eds) (2010). Special Issue of International Journal of Continuing Engineering Education and Life-Long Learning, 'Focusing on electronic feedback: feasible progress or just unfulfilled promises?', 20(2).
Whitelock, D. (ed.) (2009). Special Issue on e-Assessment: developing new dialogues for the digital age. British Journal of Educational Technology, 40(2).
Whitelock, D. and Watt, S. (eds) (2008). Reframing e-assessment: adopting new media and adapting old frameworks. Learning, Media and Technology, 33(3).