Being brave with module design
Session 5: Evaluation
Which aspects of teaching / learning have you evaluated?
Impact on understanding and meeting objectives; enjoyment and suggestions; delivery & design; usefulness of the session; skills development; formative, summative and presentational.
How do you check ongoing performance throughout a module?
Face-to-face supervision; surveys and quiz tools; questioning in lectures; group activities; homework; formative critical analysis in discussion; presentations.

How have you evaluated your use of technology to enhance learning?
Standard feedback form; feedback; comparison with the sector; discussions with colleagues…
Challenges in gathering effective evaluation of a module / programme?
Cultural differences; learning styles; difficulties with summative assessment; asking the right questions; timing; response rates; honesty; convenience of data; monitoring students is time-consuming; validity; responding to the issues raised.
What do you hope to gain from the session?
- Designing evaluation in – identifying key areas for QE
- What can technology / the VLE offer?
- Different ways to evaluate the student experience
- Question design
- Rapid data collection and response
Session Outline
1. ‘Designing in’ evaluation – building a continuous process of improvement
2. Defining your approach – principles & practical considerations
3. Data collection methods – identifying methods and developing a plan
4. Making sense of your evaluation data – actions & next steps in course development
[Diagram: programme aims, set within context & constraints, feeding into learning aims & outcomes, resources, learning activities, assessment and evaluation]
Defining your approach
Why evaluate? What is your purpose? Diagnostic; formative; summative.
What should be evaluated? What is your focus? Engagement and activity; appropriateness of the technology; pedagogic effectiveness of the design – the interrelationship between online & class-based elements.
How will the evaluation be conducted? What methods will you use? Explicit measures; indirect & embedded methods.
Principles for course evaluation
Outcome-based: focusing on measurable & objective standards.
- Were the course objectives met (e.g. levels of engagement & patterns of use of online resources)?
- Did learners reach the targeted learning outcomes (e.g. approaches to learning; levels of understanding)?
Interpretive: focusing on context (perceptions of the learning experience).
- What were the students’ affective and attitudinal responses to the blended course experience?
- How were the e-learning tools used by students to support their learning in formal & informal study activities?
- How did the lecturer/tutors perceive students’ learning relative to previous performance?
- (What actions should be taken for future course development?)
Data collection methods
- Entry & exit surveys
- (Informal progress checks)
- Contribution statistics
- Tools for reflection
- Course statistics
- Focus group interviews
Evaluation Pathway
The course timeline runs Start → Course delivery (class sessions interleaved with online activities, with feedback on performance throughout) → End → Post course.

Role       | Start        | Course delivery                             | End         | Post course
Instructor | Entry survey | Feedback on performance                     | Exit survey |
Students   |              | Task performance and self-reflection        |             |
System     |              | Course statistics & contribution histories  |             |
Researcher |              | Content analysis                            |             | Focus group
Data collection methods – entry / exit survey
Data collection methods: informal progress checks
Clickers / discussion board / polls highlight “areas of greatest uncertainty”.
Examples: “Models of writing” (Education) – consolidation of learning outcomes; “Britons at work” (English) – engagement with a theme.
Data collection methods: Tools for reflection, contribution analysis
Categories of cognitive skills and examples from the weekly blogs

Offering resources: “This case relates to cases of master and servant; these principles apply equally to directors serving the company under express or implied contracts of service, and who are therefore also employees (Dranez Anstalt v. Zamir Hayek).”

Making declarative statements: “I cannot understand the reason, you mentioned, that the UCTA may not apply to this case. LC is not of course a consumer, but M is a relevant consumer.”

Supporting positions on issues: “Once Ackerman heard the inside information from his father-in-law, he would be an insider under s. 118B(e) of FSMA because he has information ‘which he has obtained by other means which he could be reasonably expected to know is inside information’. Therefore his action to sell his shares of SAH would be dealt with as insider dealing.”

Adding examples: “The offence of insider dealing can be committed in 3 ways. If an insider: deals in price-affected securities, when in possession of inside information, s.52(1) CJA 1993; encourages another to deal in price-affected securities, when in possession of inside information, s.52(2)(a) CJA 1993; or discloses inside information other than in the proper performance of his employment or profession, s.52(2)(b) CJA 1993.”

Framework based on Fox and MacKeogh’s 16 categories of cognitive thinking: Fox, S. and MacKeogh, K. (2003) ‘Can eLearning Promote Higher-order Learning Without Tutor Overload?’, Open Learning: The Journal of Open and Distance Learning, 18(2), 121–134.
Data collection methods: contribution statistics – wiki participation as a proxy indicator
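A proxy indicator like this can be computed directly from a wiki’s revision history. The sketch below assumes a hypothetical list of (student, page) revision records rather than any real VLE export:

```python
from collections import Counter

# Hypothetical wiki revision log: one (student, page) pair per edit.
revisions = [
    ("amira", "week1"), ("ben", "week1"), ("amira", "week2"),
    ("amira", "week2"), ("chen", "week3"), ("ben", "week2"),
]

# Edit counts per student serve as a rough proxy for participation.
edits = Counter(student for student, _page in revisions)
print(edits)  # Counter({'amira': 3, 'ben': 2, 'chen': 1})
```

Edit counts say nothing about the quality of contributions, which is why the slides pair this indicator with content analysis and tools for reflection.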
Data collection methods: Course Reports
Course reports on:
- Activity in content areas
- Activity in forums
- Activity in groups
Data collection methods: Early Warning System
Create rules based on:
- Last access
- Grade
- Due date
Students who trigger a rule are listed, with the option to send them notifications.
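The rule-based logic behind such a system can be sketched in a few lines. This is an illustration only, with hypothetical student records and thresholds, not the API of any particular VLE:

```python
from datetime import date

# Hypothetical student records; field names are illustrative.
students = [
    {"name": "Dana", "last_access": date(2024, 3, 1), "grade": 72},
    {"name": "Eli",  "last_access": date(2024, 2, 1), "grade": 48},
]

today = date(2024, 3, 10)

def flag(student):
    """Apply example rules mirroring the slide: last access and grade."""
    reasons = []
    if (today - student["last_access"]).days > 14:
        reasons.append("no access in 14 days")
    if student["grade"] < 50:
        reasons.append("grade below 50%")
    return reasons

# List flagged students so the instructor can choose to send notifications.
for s in students:
    reasons = flag(s)
    if reasons:
        print(s["name"], "->", ", ".join(reasons))
# prints: Eli -> no access in 14 days, grade below 50%
```

A due-date rule would work the same way, comparing an assignment deadline against submission records.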
Data collection methods: Performance Dashboard
Data collection methods: Turnitin QuickMark breakdown
Developing your evaluation plan
- Plan before the course starts
- Embed evaluation in the overall design of the course (reflecting learning objectives)
- Inform students about the evaluation (if their participation is required)
Your plan should consider: aims & focus of the evaluation; key questions; stakeholders; timescales & dependencies; instruments & methods.
Adapted from Jara et al. (2008), Evaluation of E-Learning Courses.
Challenges in interpreting your data
- Student engagement
- Survey fatigue
- Reliability: halo/horns effect
- Validity
- Visibility of student learning
- Context of student learning
Reflection on action: defining next steps
Was the course design fit for purpose?
- Usefulness / engagement patterns for the online components of the module
- Complementary nature of class-based & online activities
- Relevance of the assessment plan
- Sequencing of tasks
Were the course materials suited to the online tasks?
- Levels of learning / differentiation & accessibility
Was instructional support adequate, enabling & timely?
- Instructions, feedback and support
Summary: course delivery as a development cycle
Design: pedagogic aims; design model; course testing; delivery & evaluation plans.
Deliver: socialise; support; sustain; sum up student learning. Evidence collection as a feature of course delivery.
Evaluate: establish a holistic view of student learning, employing outcome-focused & interpretive research methods.
Review: reflection on action – defining next steps.
References and recommended reading
- Fox, S. and MacKeogh, K. (2003) ‘Can eLearning Promote Higher-order Learning Without Tutor Overload?’, Open Learning: The Journal of Open and Distance Learning, 18(2), 121–134.
- Gunawardena, C., Lowe, C. & Carabajal, K. (2000) ‘Evaluating Online Learning: models and methods’, in D. Willis et al. (eds), Proceedings of Society for Information Technology & Teacher Education International Conference 2000 (pp. ). Chesapeake, VA: AACE.
- Jara, M., Mohamad, F. & Cranmer, S. (2008) Evaluation of E-Learning Courses. WLE Centre Occasional Paper 4. London: Institute of Education, University of London.