Online Course Design
Jennifer Freeman, Academic Impressions
Evaluation
This session will focus on the formative, summative, and confirmative evaluation of online instructional materials. Methods and tools for defining, quantifying, and measuring quality in online courses will be presented. Participants will use provided templates, tools, and rubrics to develop a sample evaluation plan and critique an online course.
Evaluation: Session Goals
Understand the difference between assessment and evaluation
Define formative, summative, and confirmative evaluation and understand the importance of each
Explore theories and methods of evaluation
Design a basic revision plan
Evaluation vs. Assessment
Evaluation: measuring the quality and effectiveness of learning materials and activities
Assessment: measuring students' learning and achievement of goals and objectives
Course Evaluation: Why?
Fix things that are broken
Ensure learning outcomes are being achieved
Discover causes for failures; proactively fix other problems
Discover potential usability/accessibility issues
What works in theory doesn't always work in practice
Constant maintenance and improvements to content and strategies
Dynamic nature of online learning
Building-a-house metaphor
What Do We Evaluate?
Instructional materials' alignment with objectives and the effectiveness of testing instruments
Quality of instructional materials
Quality of external resources
Effectiveness of instructional strategies
Usability of tools and technology
Effectiveness of teaching skills
Evaluation chart: Morrison, Ross and Kemp (2004)

Evaluate objectives: are we teaching and assessing what we said we would?
Evaluate content: everything from no typos present to factually correct content to aesthetically pleasing materials.
Evaluate instructional support: are students getting the support they need (library, help desk, tutorials)? Is external content reliable and of good quality?
Evaluate the teaching strategies: did that group project work out the way we thought it would?
Evaluate the tools: how easy were they for students to use? Did the chat tool work? Was the Flash exercise too complicated?
Evaluate communication: was the level of interaction and communication appropriate?
Formative Evaluation of Instructional Materials
Why?
Uncover problems early on; fix broken stuff (hopefully before students find it)
Identify potential usability/accessibility issues
Examine effectiveness and improve functionality
Dynamic nature of online learning

What? When?
An ongoing process, usually done both during development and while the course is being taught
Asks the question, "How are we doing?"

Find all of the typos, errors, broken links, and gaps in content. Test for problems that may arise for students with disabilities. What looked good on the storyboard may not work well in practice. New technologies and teaching theories surface every day in a field as young as online learning.
Formative Evaluation of Instructional Materials
Who will use this evaluation information?
Course development team
Instructor

How? What should be evaluated?
Instructional materials
Instructional strategies
Use of tools and technology

Keep the audience in mind when designing your evaluation plan and writing the questions you will ask: who will be interested in the evaluation feedback at this stage? You will probably have only limited student feedback on instructional support at this point.
Formative Evaluation: Questions to Ask
Do learning activities and assessments align with the learning objectives?
Do learning materials meet quality standards?
Are learning materials error-free?
Are learning materials accessible? (see the alt-text check sketched after this list)
Are learning materials usable?
Are the technology tools appropriate and working properly?

Are you able to draw a clear correlation to a learning goal for each course activity? Is each learning objective represented by content, activities, and assessment? Focus on the instructional materials. Focus on the tools: test them. Are they working, and will students be able to use them properly? Are further instructions or documentation needed for any tool being used (LMS messaging, chat, discussion areas, assignment tool, tests, etc.)?
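One slice of the accessibility question can be checked automatically: images with missing or empty alt text. Below is a minimal sketch, assuming the course pages are exported as HTML files into a local folder (the folder name course_export is a placeholder); it uses only the Python standard library. Automated checks like this catch only a fraction of accessibility problems, so manual review against standards such as WCAG is still needed.

```python
# Scan exported course HTML for <img> tags lacking non-empty alt text.
from html.parser import HTMLParser
from pathlib import Path

class AltTextAuditor(HTMLParser):
    """Collects the src of every <img> whose alt attribute is missing or empty."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr = dict(attrs)
            if not (attr.get("alt") or "").strip():
                self.missing.append(attr.get("src") or "(no src)")

def audit_file(path):
    auditor = AltTextAuditor()
    auditor.feed(Path(path).read_text(encoding="utf-8", errors="ignore"))
    return auditor.missing

# Walk an exported copy of the course and report offending images.
for page in Path("course_export").rglob("*.html"):   # folder name is an assumption
    for src in audit_file(page):
        print(f"{page}: image missing alt text -> {src}")
```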
Formative Evaluation: Gathering data
Course development rubrics
Checklists
Focus group feedback
Help desk error logs (see the tally sketch after this list)
Student FAQ discussion threads
"Extra credit for errors found" idea
Faculty notes jotted down during the semester

For consistency and repeatability, develop a testing process, protocol, list of questions, and evaluation forms, and determine a standardized, set time for testing so it occurs regularly as a normal part of the process. To reduce bias, evaluation should not be done by members of the course development team; you need a "fresh pair of eyes."

Facilitator note to self: focus group feedback (think eportfolio project) and Q/A reviews. Use rubrics and checkpoints assigned to an experienced ID/developer who didn't work on the course. A focus group of potential users should test every part of the course (freelance reviewers). During the first semester a course is being taught, keep a close watch on the error logs from the help desk; keep track of problem areas and note improvements to be made. Let your students help!
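As a concrete illustration of mining help desk logs for problem areas, here is a minimal sketch. It assumes the help desk can export tickets as a CSV with course, page, and category columns (all column names are assumptions) and tallies which pages generate the most tickets, so revision effort goes where students actually struggle.

```python
# Tally help desk tickets per (page, category) for one course.
import csv
from collections import Counter

def top_problem_areas(log_path, course_id, n=10):
    counts = Counter()
    with open(log_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row.get("course") == course_id:
                counts[(row.get("page", "?"), row.get("category", "?"))] += 1
    return counts.most_common(n)

# "helpdesk_log.csv" and "MUS101" are placeholder names.
for (page, category), count in top_problem_areas("helpdesk_log.csv", "MUS101"):
    print(f"{count:4d}  {page}  [{category}]")
```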
Sample Formative Evaluation Methods
During development:
Checkpoint #1: syllabus, outline, and first lesson
Checkpoint #2: half of the course, viewed on multiple platforms
Checkpoint #3: entire course proofread/edited
Checkpoint #4: entire course Q/A checked
Checkpoint #5: final check (previously found errors)

Once the course is live:
Student survey after the first three lessons
Instructor survey after the first three lessons
Examination of help desk error logs

Facilitator note: show sample checkpoint rubrics, checklists, and surveys; show the Firefox link checker (a minimal scripted equivalent is sketched below). NEED TO IDENTIFY THESE.
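For the link-checking part of these checkpoints, a browser add-on works interactively, but a small script can make a first pass over a whole page. A minimal sketch using only the Python standard library, with a placeholder start URL; note that some servers reject HEAD requests, so anything reported broken deserves a manual recheck.

```python
# Find broken links on a course page: collect hrefs, then HEAD each target.
from html.parser import HTMLParser
from urllib.request import Request, urlopen
from urllib.parse import urljoin
from urllib.error import URLError

class LinkCollector(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href and not href.startswith(("#", "mailto:")):
                self.links.append(href)

def check_page(url):
    html = urlopen(Request(url), timeout=10).read().decode("utf-8", errors="ignore")
    collector = LinkCollector()
    collector.feed(html)
    for href in collector.links:
        target = urljoin(url, href)          # resolve relative links
        try:
            urlopen(Request(target, method="HEAD"), timeout=10)
        except (URLError, ValueError) as err:
            print(f"BROKEN: {target} ({err})")

check_page("https://example.edu/course/lesson1.html")   # placeholder URL
```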
Summative Evaluation

Why?
Examine effectiveness
Improve functionality
Discover causes for failures; fix existing problems
What works in theory doesn't always work in practice
Constant maintenance and improvements to content and strategies: the dynamic nature of online learning

What? When?
Usually done after the completion of each semester
Asks the question, "How did we do?"

Once the first semester is complete, student data and feedback are available to identify problems not discovered earlier. "The first semester is like the first pancake." Given the dynamic nature of teaching with technology: what improvements have been made that we can take advantage of? What new research is available?
Summative Evaluation

Who will use this evaluation information?
Instructor
Course development team
Administration

How? What should be evaluated?
Effectiveness of instructional materials and strategies
The learning environment
The instructor's teaching skills
Availability and ease of use of tools and technology
Instructor satisfaction with the online teaching experience
Student satisfaction with the online learning experience

Keep the audience in mind: what questions are they interested in having answered? In addition to the areas evaluated during the formative stage, we can now take a look at the areas measured by student success and feedback: the "feel" of the learning environment, the instructor's skill in teaching online, and the usability of the content and teaching tools, as well as the instructor's opinions on the experience.
Summative Evaluation: Questions to Ask
Did the students succeed? (grades)
Did the learning activities and assessments align with the learning objectives?
Were assignments and assessments appropriate to the content?
Was time adequate to convey material and complete tasks?
What was the level of instructor and student satisfaction? (participation and opinion)
Were learning materials easy to use and accessible?
What content did students frequently have problems with?
What areas of the course are error-prone?
Were there any concerns about motivation?
What tools did the instructor or students frequently have problems with? Should we continue to use the chosen tools?
Are program/department needs being met? (accreditations, prerequisites for other courses, competencies)
Is the course scalable?

How were students' grades? How was the level of participation, according to student opinion? According to instructor opinion? Compared with other courses? How did students feel about the assignments and assessments? How did the instructor feel? What was the rate of success/completion of the assignments and assessments compared to other courses? What were the instructor's and students' opinions about time requirements? (This is a big one; almost every online course attempts to do too much the first time around.) Other student opinions: content? Environment? Level of communication? Tools/technology? Revisit the help desk logs: where were the problems?
Summative Evaluation: Gathering data
Student grades
Student surveys
Instructor satisfaction surveys
Learner self-assessments
Pretest/posttest comparisons
Assessment item analysis (a minimal sketch follows this list)
Focus group feedback
Help desk error logs
Discussion forum and chat archives
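For the assessment item analysis entry, classical test theory offers two quick statistics: item difficulty (the proportion of students answering an item correctly) and an upper-lower discrimination index (how much better the top-scoring group does on the item than the bottom group). A minimal sketch on toy 0/1 score data; the 27% group cut is the conventional choice.

```python
# Classical item analysis: difficulty and upper-lower discrimination per item.
def item_analysis(score_rows):
    """score_rows: list of lists of 0/1 item scores, one inner list per student."""
    students = sorted(score_rows, key=sum, reverse=True)
    cut = max(1, round(len(students) * 0.27))        # upper/lower 27% groups
    upper, lower = students[:cut], students[-cut:]
    results = []
    for item in range(len(score_rows[0])):
        difficulty = sum(row[item] for row in score_rows) / len(score_rows)
        p_upper = sum(row[item] for row in upper) / len(upper)
        p_lower = sum(row[item] for row in lower) / len(lower)
        results.append((item + 1, difficulty, p_upper - p_lower))
    return results

scores = [[1, 1, 0, 1], [1, 0, 0, 1], [1, 1, 1, 1], [0, 0, 0, 1], [1, 1, 0, 0]]  # toy data
for item, p, d in item_analysis(scores):
    print(f"Item {item}: difficulty={p:.2f}, discrimination={d:+.2f}")
```

Items with very low difficulty or near-zero (or negative) discrimination are natural candidates for the revision list.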
Confirmative Evaluation
Why?
Discover the long-term effectiveness of the course
Address large-scale changes necessary to the curriculum
Constant maintenance and improvements to technology, content, and strategies: the dynamic nature of online learning

What? When?
Usually done some time after the completion of each semester
Asks the question, "How are we doing now?"

Is the curriculum still appropriate within the department (prerequisites for other courses, fit, etc.)? Should we continue using the LMS? What are the budgetary considerations?
Confirmative Evaluation
Who will use the evaluation information?
Instructor
Administration

How? What is being evaluated?
Students' long-term retention of learning and its usefulness to their long-term goals
Long-term effectiveness of the course within the program
The LMS and other technology/tools
Confirmative Evaluation: Questions to Ask
Are program/department needs being met (accreditations, prerequisites for other courses, competencies)?
What are the trends in the level of student satisfaction?
Is the course valuable and meaningful to students' long-term goals (program/career)?
Is the course scalable? Is the course sustainable?
Are the learning environment, technology, and tools still meeting our needs?
Findings here may prompt an LMS evaluation or a major course redesign.
Confirmative Evaluation: Gathering data
Program student surveys
Departmental administrative opinions
Faculty peer review of learning materials
Employer surveys
Retention data
Help desk logs
LMS effectiveness study/survey
Revision Resources: Sample Formative Evaluation and Revision Plan
Analysis of problems found: how urgent is each issue? How long will it take to fix?
Assign each issue a priority score (a minimal scoring sketch follows this list)
Establish a threshold below which a course will be postponed
Prioritized list of change requests: when's the best time to revise?
Assign corrections and establish a deadline for each
Make note of unaddressed issues

URGENT: will immediately affect a student's grade (e.g., a broken test that must be taken by midnight)
MAJOR: important content is unavailable (e.g., a JavaScript mouseover function isn't working correctly)
MINOR: cosmetic (e.g., a typo in the instructor's bio)
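A minimal sketch of the scoring step, using the URGENT/MAJOR/MINOR severities above. The numeric weights, the effort estimates, and the release gate at the end are illustrative assumptions, not part of the original plan.

```python
# Score issues from severity and estimated fix time, then sort into a work list.
URGENCY = {"URGENT": 100, "MAJOR": 10, "MINOR": 1}

def priority(issue):
    # Higher severity and quicker fixes float to the top of the list.
    return URGENCY[issue["severity"]] / max(issue["hours_to_fix"], 0.5)

issues = [
    {"desc": "broken midterm quiz due tonight", "severity": "URGENT", "hours_to_fix": 2},
    {"desc": "mouseover content not loading",   "severity": "MAJOR",  "hours_to_fix": 4},
    {"desc": "typo in instructor bio",          "severity": "MINOR",  "hours_to_fix": 0.1},
]

for issue in sorted(issues, key=priority, reverse=True):
    print(f"{priority(issue):7.1f}  {issue['severity']:6}  {issue['desc']}")

# Release gate (an assumed policy): postpone while any URGENT issue remains open.
if any(i["severity"] == "URGENT" for i in issues):
    print("Threshold not met: postpone release until urgent issues are fixed.")
```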
Sample Summative Evaluation and Revision Plan
Analyze student and faculty surveys; identify themes or trends (a theme-tally sketch follows this list)
Analyze assessments
Analyze help desk logs
Examine course archives
Compile a list of issues (including issues noted during the formative phase that have yet to be addressed)
Research solutions
Determine the time needed to fix each issue
Assign priority ratings
Assign tasks and establish deadlines

Timing will vary depending on how often and how soon the course is to be offered again, and on the resources available (release time and extra staff are less likely to be forthcoming than during initial development). You may have to limit revisions to urgent issues until time and resources are available.
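For identifying themes in open-ended survey comments, even a crude word tally can point reviewers at recurring complaints worth a closer read. A minimal sketch; the stopword list and sample comments are illustrative.

```python
# Tally recurring words across survey comments to surface candidate themes.
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "and", "or", "to", "of", "in", "it", "was",
             "i", "is", "for", "on", "this", "that", "with", "my", "we", "but"}

def themes(comments, n=10):
    words = Counter()
    for comment in comments:
        for word in re.findall(r"[a-z']+", comment.lower()):
            if word not in STOPWORDS and len(word) > 2:
                words[word] += 1
    return words.most_common(n)

comments = [
    "The chat tool kept crashing during group work.",
    "Quizzes took longer than the time allowed.",
    "Loved the listening quizzes, but the chat tool was unreliable.",
]
for word, count in themes(comments):
    print(f"{count:3d}  {word}")
```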
Revision Activity: Music Appreciation

Ten modules, each including lecture notes, one or more PowerPoint presentations with audio, a graded listening quiz, and a synchronous chat requirement
Project: "My Favorite Composer" biography assignment, submitted as a Word document to the instructor
Midterm: multiple-choice exam
Final: multiple-choice exam
Common LMS Evaluation Criteria
Costs rising at a reasonable rate?
Are server space and maintenance needs being met?
Are the vendor and software in compliance with required standards?
How reliable has the system been? Have there been any security concerns?
Level of customization possible within the system?
Satisfied with the structure and presentation of courses?
Satisfied with the authoring tools provided?
Satisfied with the tracking capabilities of the system?
Satisfied with the testing engine and/or assessment tools available in the system?
(Handout #11)
Common LMS Evaluation Criteria
Satisfied with the collaboration tools (discussion areas, journaling, help desk, whiteboard) provided through the system?
Satisfied with the productivity tools (calendar, help files, search engine) provided through the system?
Is student/faculty/staff documentation or training sufficient?
How usable do students, faculty, and staff find the tools?
What is the vendor's reputation in the industry? What is the vendor's position in the industry?
(One way to combine ratings on criteria like these is sketched below.)
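A common way to act on criteria like these is a weighted decision matrix: rate each system on each criterion, weight the criteria, and compare totals. A minimal sketch; the criterion names, weights, and 1-5 ratings are illustrative assumptions, as are the survey sources they would come from.

```python
# Weighted decision matrix for comparing LMS options on the criteria above.
criteria = {                     # criterion: weight (weights sum to 1.0)
    "reliability": 0.25,
    "collaboration tools": 0.20,
    "authoring tools": 0.15,
    "assessment tools": 0.15,
    "cost trajectory": 0.15,
    "vendor standing": 0.10,
}

ratings = {                      # average survey rating per LMS, 1 (poor) to 5 (excellent)
    "Current LMS":   {"reliability": 4, "collaboration tools": 3, "authoring tools": 3,
                      "assessment tools": 4, "cost trajectory": 2, "vendor standing": 4},
    "Candidate LMS": {"reliability": 4, "collaboration tools": 4, "authoring tools": 4,
                      "assessment tools": 3, "cost trajectory": 4, "vendor standing": 3},
}

for lms, scores in ratings.items():
    total = sum(criteria[c] * scores[c] for c in criteria)
    print(f"{lms}: weighted score {total:.2f} / 5")
```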
What We've Learned
The difference between assessment and evaluation
Definitions of formative, summative, and confirmative evaluation and the importance of each
Methods of evaluation
Sample evaluation and revision plans (built during the presentation)
Jennifer Freeman jfreeman@utsystem.edu