Being brave with module design


Being brave with module design Session 5: Evaluation

Which aspects of teaching / learning have you evaluated?
Impact on understanding and meeting objectives; enjoyment and suggestions; delivery & design; usefulness of the session; skills development; formative, summative and presentational assessment.

How do you check ongoing performance throughout a module?
Face-to-face supervision; surveys and quiz tools; questioning in lectures; group activities; homework; formative critical analysis in discussion; presentations.

How have you evaluated your use of technology to enhance learning?
Standard feedback form; e-mail feedback; comparison with the sector; discussions with colleagues…

Challenges in gathering effective evaluation of a module / programme?
Cultural differences; learning styles; difficulties with summative assessment; asking the right questions; timing; response rates; honesty; convenience of data collection; time-consuming monitoring of students; validity; responding to issues raised.

What do you hope to gain from the session?
– Designing evaluation in: identifying key areas for quality enhancement (QE)
– What can technology / the VLE offer?
– Different ways to evaluate the student experience
– Question design
– Rapid data collection and response

Session Outline
1. 'Designing in' evaluation – building a continuous process of improvement
2. Defining your approach – principles & practical considerations
3. Data collection methods – identifying methods and developing a plan
4. Making sense of your evaluation data – actions & next steps in course development

[Diagram: course design model – programme aims feeding into learning aims & outcomes, resources, learning activities, assessment and evaluation, all set within context & constraints]

Defining your approach
Why evaluate? What is your purpose?
– Diagnostic; formative; summative
What should be evaluated? What's your focus?
– Engagement and activity
– Appropriateness of the technology
– Pedagogic effectiveness of the design: the interrelationship between online & class-based elements
How will the evaluation be conducted? What methods will you use?
– Explicit measures
– Indirect & embedded methods

Principles for course evaluation
Outcome-based: focusing on measurable & objective standards
– Were the course objectives met (e.g. levels of engagement & patterns of use of online resources)?
– Did learners reach the targeted learning outcomes (e.g. approaches to learning; levels of understanding)?
Interpretive: focusing on context (perceptions of the learning experience)
– What were the students' affective and attitudinal responses to the blended course experience?
– How were the e-learning tools used by students to support their learning in formal & informal study activities?
– How did the lecturer/tutors perceive students' learning relative to previous performance?
– What actions should be taken for future course development?

Data collection methods
– Entry & exit surveys
– Informal progress checks
– Contribution statistics
– Tools for reflection
– Course statistics
– Focus group interviews

Evaluation Pathway
[Diagram: course timeline from start through delivery to post-course, alternating online activities and class sessions, with feedback on performance after each stage]

Role | Start | Course Delivery | End | Post Course
Instructor | Entry survey | Feedback on performance | Exit survey |
Students | | Task performance and self-reflection | |
System | | Course statistics & contribution histories | |
Researcher | | Content analysis | | Focus group

Data collection methods – entry / exit survey

Data collection methods: informal progress checks
Clickers / discussion board / polls: highlight areas of greatest uncertainty
– "Models of writing" (Education): consolidation of learning outcomes
– "Britons at work" (English): engagement with a theme
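Where poll or clicker results can be exported, the "areas of greatest uncertainty" can be ranked automatically rather than eyeballed. A minimal Python sketch with hypothetical response data; using Shannon entropy as the spread measure is our assumption, not something prescribed in the session:

```python
from collections import Counter
from math import log2

def answer_entropy(responses):
    """Shannon entropy of one question's answer distribution:
    0 = full agreement; higher = answers spread across options."""
    counts = Counter(responses)
    total = len(responses)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# Hypothetical clicker results for three questions
polls = {
    "Q1: thesis statements":   ["A", "A", "A", "B", "A"],
    "Q2: paragraph structure": ["A", "B", "C", "D", "B"],
    "Q3: referencing":         ["C", "C", "B", "C", "C"],
}

# Rank questions by disagreement to decide what to revisit in class
for question in sorted(polls, key=lambda q: answer_entropy(polls[q]), reverse=True):
    print(f"{answer_entropy(polls[question]):.2f}  {question}")
```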

Data collection methods: Tools for reflection, contribution analysis

Categories of cognitive skills and examples from the weekly blogs

Characteristic of cognitive skill | Example from blog posts
Offering resources | This case relates to cases of master and servant; these principles apply equally to directors serving the company under express or implied contracts of service, and who are therefore also employees (Dranez Anstalt v. Zamir Hayek).
Making declarative statements | I cannot understand the reason, you mentioned, that the UCTA may not apply to this case. LC is not of course a consumer, but M is a relevant consumer.
Supporting positions on issues | Once Ackerman heard from the inside information from his father in law, he would be as insider under s. 118B(e) of FSMA because he has information "which he has obtained by other means which he could be reasonable expected to know is inside information". Therefore his action to sell his share of SAH would be dealt with as insider dealing.
Adding examples | The offence of insider dealing can be committed in 3 ways. If an insider: deals in price-affected securities, when in possession of inside information, s.52(1) CJA 1993; encourages another to deal in price-affected securities, when in possession of inside information, s.52(2)(a) CJA 1993; or discloses inside information other than in the proper performance of his employment or profession, s.52(2)(b) CJA 1993.

Framework based on Fox and MacKeogh's 16 categories of cognitive thinking: Fox, S. and MacKeogh, K. (2003) 'Can eLearning Promote Higher-order Learning Without Tutor Overload?', Open Learning: The Journal of Open and Distance Learning, 18(2), 121–134.

Data collection methods: Contribution stats, Wiki participation proxy indicator
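A participation proxy can be computed directly from a wiki's revision history: each student's edit count, share of edits and words added. A sketch under assumed data – the revision-log format below is hypothetical, not a specific VLE's export:

```python
from collections import defaultdict

# Hypothetical wiki revision log: (author, words_added) per saved edit
revisions = [
    ("alice", 120), ("bob", 45), ("alice", 60),
    ("carol", 200), ("bob", 15), ("alice", 30),
]

edits, words = defaultdict(int), defaultdict(int)
for author, words_added in revisions:
    edits[author] += 1
    words[author] += words_added

# Share of edits as a rough proxy for each student's participation
total = sum(edits.values())
print("student   edits  share  words added")
for author in sorted(edits, key=edits.get, reverse=True):
    print(f"{author:9} {edits[author]:5}  {edits[author]/total:5.0%}  {words[author]:6}")
```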

Data collection methods: course reports
Course reports on:
– Activity in content areas
– Activity in forums
– Activity in groups
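Where the VLE allows a raw activity-log export, the same breakdown can be reproduced with a few lines of pandas. A sketch assuming a hypothetical CSV export; the column names are our assumption:

```python
import pandas as pd

# Hypothetical VLE activity export: one row per access event,
# with columns user, area ("content", "forum", "group") and timestamp
log = pd.read_csv("activity_log.csv")
log["timestamp"] = pd.to_datetime(log["timestamp"])

# Weekly event counts per area: activity in content areas, forums, groups
report = (
    log.groupby([pd.Grouper(key="timestamp", freq="W"), "area"])
       .size()
       .unstack(fill_value=0)
)
print(report)
```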

Data collection methods: Early Warning System
Create rules based on:
– Last access
– Grade
– Due date
Flagged students are listed, with the option to send notifications.
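These rule types are straightforward to mimic on exported gradebook and access data if the built-in tool doesn't fit local needs. An illustrative sketch only – not the VLE's actual API, and the record layout and thresholds are assumed:

```python
from datetime import date, timedelta

# Hypothetical per-student records exported from the VLE
students = [
    {"name": "alice", "last_access": date(2018, 3, 10), "grade": 72, "submitted": True},
    {"name": "bob",   "last_access": date(2018, 2, 12), "grade": 38, "submitted": False},
]

TODAY = date(2018, 3, 12)   # date the rules are run
DUE   = date(2018, 3, 10)   # assignment due date

def check_rules(s):
    """Apply the three rule types: last access, grade, due date."""
    flags = []
    if TODAY - s["last_access"] > timedelta(days=14):
        flags.append("no access in 14+ days")
    if s["grade"] < 40:
        flags.append("grade below threshold")
    if not s["submitted"] and TODAY > DUE:
        flags.append("missed due date")
    return flags

# List flagged students, with the option to notify them
for s in students:
    flags = check_rules(s)
    if flags:
        print(s["name"], "->", "; ".join(flags))
```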

Data collection methods: Performance Dashboard

Data collection methods: Turnitin QuickMark breakdown
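A QuickMark breakdown is, at bottom, a frequency count of which comments markers attach most often. If the marks can be exported (the list-of-labels input below is a hypothetical export, not a documented Turnitin format), the tally is a one-liner:

```python
from collections import Counter

# Hypothetical export: one QuickMark label per comment left across submissions
quickmarks = [
    "Citation needed", "Awk.", "Citation needed",
    "Sp.", "Awk.", "Citation needed",
]

# The breakdown: marks that recur across the cohort point to
# themes worth addressing in class-wide feedback
for mark, count in Counter(quickmarks).most_common():
    print(f"{count:3}  {mark}")
```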

Developing your evaluation plan
– Plan before the course starts
– Embed in the overall design of the course (reflecting learning objectives)
– Inform students about the evaluation (if participation is required)
Your plan should consider:
– Aims & focus of the evaluation
– Key questions
– Stakeholders
– Timescales & dependencies
– Instruments & methods
Adapted from Jara et al. (2008), Evaluation of E-Learning Courses

Challenges in interpreting your data
– Student engagement
– Survey fatigue
– Reliability: halo/horns effect
– Validity
– Visibility of student learning
– Context of student learning

Reflection on action: defining next steps
Was the course design fit for purpose?
– Usefulness of / engagement patterns with the online components of the module
– Complementary nature of class-based & online activities
– Relevance of the assessment plan
– Sequencing of tasks
Were the course materials suited to the online tasks?
– Levels of learning / differentiation & accessibility
Was instructional support adequate, enabling & timely?
– Instructions, feedback and support

Summary
Course delivery as a development cycle:
– Design: pedagogic aims; design model; course testing; delivery & evaluation plans
– Deliver: socialise; support; sustain; sum up student learning – evidence collection as a feature of course delivery
– Evaluate: establish a holistic view of student learning, employing outcome-focused & interpretive research methods
– Review: reflection on action – defining next steps

References and recommended reading

Fox, S. and MacKeogh, K. (2003) 'Can eLearning Promote Higher-order Learning Without Tutor Overload?', Open Learning: The Journal of Open and Distance Learning, 18(2), 121–134.

Gunawardena, C., Lowe, C. & Carabajal, K. (2000) 'Evaluating Online Learning: models and methods', in D. Willis et al. (eds), Proceedings of Society for Information Technology & Teacher Education International Conference 2000, pp. 1677–1684. Chesapeake, VA: AACE.

Jara, M., Mohamad, F. & Cranmer, S. (2008) Evaluation of E-Learning Courses. WLE Centre Occasional Paper 4. Institute of Education, University of London. http://www.wlecentre.ac.uk/cms/files/occasionalpapers/evaluation_of_online_courses_25th.pdf