Slide 1: Escape the Comparison Trap: An Alternative Approach to Evaluating Online Learning
John Sener, Sener Learning Services
Learning & Training Week Conference, Washington, DC, May 1, 2003
Slide 2: Comparisons: Moving Beyond “NSD” and What’s “Better”
Comparisons = e-learning/online learning/distance learning (E-L/OL/DL) vs. face-to-face (f2f)
No significant difference (NSD): DL ≈ f2f
+ Establishes legitimacy
+ OK as a basic generalization
Not very useful beyond that
Sometimes a necessary evil in practice, but…
Slide 3: Problems with Using Comparisons
(Quasi-)experimental design is impractical
– Too many variables
– “Apples vs. amaroks”
– Controls are sometimes detrimental
Based on faulty assumptions
– Assumes a uniformity of practice that does not exist
– Education and training are largely craft products and participatory experiences
– Classroom/f2f instruction is largely unproven too
Limited utility of results
– Significant difference (SD) = so what? Often due to external, uncontrollable factors
– NSD = mediocrity; why aim low?
Slide 4: Moving Beyond Comparisons
Evaluating e-learning in its own frame of reference
– F2f practice is not always applicable
– E-learning has its own set of rules
“Proving” e-learning by improving practice
Documenting results is an improvement
– E-L/OL/DL is held to closer scrutiny and higher standards, and is often more stringently evaluated
– “Trojan Horse” effect??
Slide 5: How to Develop Your Own Effective Evaluation Approach
Identify your needs
Find theoretical and practical support
Develop key evaluation questions
– based on your needs
– linked to your theoretical and practical foundations
Design and conduct your evaluation
– likewise based on your needs and linked to those foundations
Show the connections (one way to sketch such a plan is shown below)
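One way to make the “show the connections” step concrete is to record the plan itself as data, so every evaluation question traces back to a need and a foundation. The sketch below is illustrative only and is not from the presentation: the class and field names are hypothetical, and the sample content paraphrases the case study on the following slides.

```python
from dataclasses import dataclass, field

@dataclass
class EvaluationQuestion:
    """One key evaluation question, tied back to its origins."""
    question: str
    needs: list[str]         # identified needs this question addresses
    foundations: list[str]   # theory/practice sources it is linked to
    methods: list[str] = field(default_factory=list)  # how it will be measured

@dataclass
class EvaluationPlan:
    goal: str
    questions: list[EvaluationQuestion]

    def show_connections(self) -> None:
        """Print the need -> foundation -> question -> method chain."""
        for q in self.questions:
            print(f"Q: {q.question}")
            print(f"   needs:       {', '.join(q.needs)}")
            print(f"   foundations: {', '.join(q.foundations)}")
            print(f"   methods:     {', '.join(q.methods) or '(to be designed)'}")

# Illustrative content, paraphrasing the bioterrorism-course case study:
plan = EvaluationPlan(
    goal="Determine whether instructor-led online learning is effective for this audience",
    questions=[
        EvaluationQuestion(
            question="Is instructor-led online learning effective?",
            needs=["Enhance skills of non-traditional responders"],
            foundations=["Media attribute theory (Smith & Dillon 1999)"],
            methods=["end-of-course surveys", "content analysis of discussions"],
        ),
    ],
)
plan.show_connections()
```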
Slide 6: Case Study: Online Bioterrorism Courses
Project details
– Funders: CDC, NYS Dept. of Health
– Developer: Monroe County (NY) Health Alert Network (HAN)
– Provider: Rochester Institute of Technology (RIT)
Project courses
– Incident Command Response (ICS)
– Agents, Treatments, and Protection for Health Care Workers
– Management of Public Health Emergencies
– What Is Public Health?
Slide 7: Identified Needs and Goals
Enhance the skills and workforce competencies of local and state health department staff (“non-traditional responders”) and first responders.
Determine whether instructor-led online learning is effective in meeting the needs of this particular audience.
Slide 8: Theoretical Foundation
Characteristics of adult learners (various)
Media attribute theory => contribution of key course attributes to learning (Smith & Dillon 1999)
Focus on student characteristics that facilitate success (Diaz 2000)
Student perceptions of interaction (Fulford & Zhang 1993)
Social presence and learning community (Gunawardena 1995; Swan 2002; et al.)
Slide 9: Evaluation Approach
Centered on student success and learning effectiveness (not on delivery method)
Key evaluation questions:
– Is instructor-led online learning effective?
– What elements make it effective?
– What improvements would make it more effective?
Linking theory, design, and practice
Slide 10: Key Course Features
Delivery format
– Instructor-led online learning => instructor role
– Course management system (Prometheus)
Student characteristics
– Motivated, busy, time-stressed adults
– Voluntary learners
– Some students with a low comfort level with online learning
Learning outcomes
– “Just-in-case” job-related knowledge
– Satisfactory comfort level with online learning
– Satisfactory attitudes regarding preparedness level
Slide 11: Instructor Role
Leadership and guidance
Teaching and feedback
Presence
Overall interaction level
Slide 12: Course Interactivity and Learning Community
Overt interaction
Covert interaction
Digital interactivity
Perceptions of interaction
Social presence
Slide 13: Delivery Format and Media Attributes
Learner support
Accessibility of learning resources
Navigability
Cueing strategies
Slide 14: Student Characteristics and Learning Outcomes
Student characteristics:
– Motivation level of voluntary learners
– (Dis)comfort level with online learning
– Busy, time-constrained working adults
Learning outcomes:
– Knowledge acquisition
– Perceived preparedness
– Overall student satisfaction
Slide 15: Evaluation Methods
Pilot course survey
– Conducted by the course instructor within the course
– Findings and recommendations reviewed by the project evaluator
– Implemented in subsequent course offerings
End-of-course online surveys
– Available online for each course
– Findings and recommendations reviewed by the project team; implemented subsequently
In-course feedback
Content analysis of course discussions
Slide 16: Student Survey Design
Focused questions, each related to a specific course attribute
Open-ended questions to obtain more qualitative feedback
– Principal source of feedback for certain questions (prior motivation level, value of flexibility/convenience)
(A tabulation sketch follows.)
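As a minimal sketch of how results from such a survey might be tabulated: the attribute names, the sample responses, and the 1-to-5 rating scale below are all assumed for illustration; the presentation does not specify the project's actual instrument.

```python
from collections import Counter
from statistics import mean

# Hypothetical responses: one dict per student. Numeric answers use an
# assumed 1-5 agreement scale; "comments" holds the open-ended answer.
responses = [
    {"instructor_feedback": 5, "navigability": 4,
     "comments": "Flexibility made it possible to fit the course around shifts."},
    {"instructor_feedback": 4, "navigability": 3, "comments": ""},
    {"instructor_feedback": 5, "navigability": 5,
     "comments": "Wanted more interaction with other students."},
]

# Tabulate each focused question: mean rating and response distribution.
for attribute in ("instructor_feedback", "navigability"):
    ratings = [r[attribute] for r in responses]
    print(f"{attribute}: mean={mean(ratings):.2f}, "
          f"distribution={dict(sorted(Counter(ratings).items()))}")

# Collect non-empty open-ended answers for qualitative review.
comments = [r["comments"] for r in responses if r["comments"]]
print(f"{len(comments)} open-ended comments to review:")
for c in comments:
    print(" -", c)
```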
Slide 17: Effective Elements: Survey Results
Slide 18: Social Presence Content Analysis
Performed on the ICS course
Results: social presence was low, but not worrisomely so
– Possible reasons: course length, course content, target audience
Students were satisfied with other aspects of the course, plus its flexibility and convenience
Tension between maximizing interaction and maximizing flexibility
Planned improvements to optimize interaction opportunities (see the coding sketch below)
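For readers unfamiliar with this kind of analysis, the sketch below shows one common way to quantify social presence in the spirit of the literature cited on slide 8: count category indicators in discussion posts and normalize per 1,000 words. The categories, regex patterns, and sample posts are illustrative assumptions, not the project's actual coding scheme.

```python
import re

# Toy indicator patterns for three commonly used social presence
# categories (affective, interactive, cohesive). Real studies use
# trained human coders, not keyword matching.
INDICATORS = {
    "affective":   re.compile(r"\b(feel|glad|worried|appreciate)\b", re.IGNORECASE),
    "interactive": re.compile(r"\b(agree|you said|good point|thanks)\b", re.IGNORECASE),
    "cohesive":    re.compile(r"\b(we|our|everyone|hi all)\b", re.IGNORECASE),
}

def social_presence_density(posts: list[str]) -> dict[str, float]:
    """Indicator hits per category, normalized per 1,000 words of discussion."""
    total_words = sum(len(p.split()) for p in posts)
    return {
        category: 1000 * sum(len(pattern.findall(p)) for p in posts) / total_words
        for category, pattern in INDICATORS.items()
    }

# Hypothetical discussion posts for illustration:
posts = [
    "Hi all, glad to be in this course! I agree with the earlier point about triage.",
    "Good point. In our county we drill this scenario twice a year.",
]
for category, density in social_presence_density(posts).items():
    print(f"{category}: {density:.1f} indicators per 1,000 words")
```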
Slide 19: Summary
Identify your needs
Find theoretical and practical support
Develop key evaluation questions
– based on your needs
– linked to your theoretical and practical foundations
Design and conduct your evaluation
– likewise based on your needs and linked to those foundations
Show the connections
Slide 20: Contact Info and Discussion
John Sener, Sener Learning Services
jsener@senerlearning.com
301-754-0688
Discussion:
– What approach has worked for you? What hasn’t?
– What was missing here?
– What do you have a differing opinion about?