Development and Deployment of a Web-Based Course Evaluation System
Jesse Heines and David Martin
Dept. of Computer Science, Univ. of Massachusetts Lowell
WebIST, Miami, FL, May 26, 2005
The All-Important Subtitle
Trying to satisfy ...
- the Students
- the Administration
- the Faculty
- and the Union
(presented in a slightly different order from that listed in the paper)
Paper-Based System Reality
- Distributed and filled out in classrooms
  - Thus, virtually all students present that day fill them out
  - However, absentees never fill them out
Paper-Based System Reality
- Collected but not really analyzed
  - At best, Chairs "look them over" to get a "general feel" for students' reactions
  - Professors simply don't bother with them
    - lack of interest and/or perceived importance
    - simple inconvenience of having to go get them and wade through the raw forms
Paper-Based System Reality
- Lose valuable free-form student input, because those comments are often ...
  - downright illegible
  - so poorly written that it's simply too difficult to try to make sense of them
Paper-Based System Reality
- However, these comments have the greatest potential to provide real insight into the classroom experience
Paper-Based System Reality
Bottom Line #1: The paper-based system pays little more than lip service to the cry for accountability in college teaching.
Paper-Based System Reality
Bottom Line #2: We're all already being evaluated online whether we like it or not ...
Web-Based System Goals
- Collect data in electronic format
  - Easier and faster to tabulate
  - More accurate analysis
  - Possibility of generating summary reports
Web-Based System Goals
- Retrieve legible free-form responses
- Allow all students to complete evaluations anytime, anywhere, at their leisure, and even if they miss the class in which the evaluations are distributed
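The tabulation goal above can be sketched in a few lines. This is a minimal illustration only; the talk does not describe the system's actual schema, so the `(question_id, rating)` response shape and the `summarize` function are assumptions.

```python
from collections import defaultdict

def summarize(responses):
    """Tabulate Likert-style ratings per question.

    `responses` is a hypothetical list of (question_id, rating)
    tuples; returns {question_id: (count, mean)}.
    """
    totals = defaultdict(lambda: [0, 0])  # question_id -> [count, sum]
    for qid, rating in responses:
        totals[qid][0] += 1
        totals[qid][1] += rating
    return {qid: (n, s / n) for qid, (n, s) in totals.items()}

# Example: three students rated "q1", two rated "q2"
report = summarize([("q1", 5), ("q1", 4), ("q1", 3), ("q2", 2), ("q2", 4)])
print(report["q1"])  # (3, 4.0)
```

With responses already in electronic form, this kind of per-question summary report is trivial to generate, which is exactly what the raw paper forms made impractical.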
What We Thought
If we build it, they will come ...
... but we were very wrong!
Student Issues
- Maintain anonymity
- Ease of use
- Speed of use
Student Issues
We guessed wrong on the relative priorities of these issues.
Student Issues
Our main concern: preventing students from "stuffing the ballot box"
One Student = One Survey Submission
Student Issues
Major concern that appeared after the system was deployed: simply getting students to participate
There appeared to be a great deal of apathy, particularly in non-technical courses.
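The "one student = one survey submission" constraint has to coexist with anonymity. One common approach, sketched below, records a one-way hash of the student's identity separately from the responses, so duplicates can be rejected without linking any response to a student. The talk does not describe the actual mechanism, so the names and the salt scheme here are invented.

```python
import hashlib

SALT = "per-survey-secret"          # assumed per-survey secret

submitted_tokens = set()            # stands in for a DB table of tokens
anonymous_responses = []            # stored with no student identifier

def submit(student_id: str, answers: dict) -> bool:
    """Accept one submission per student; reject duplicates."""
    token = hashlib.sha256((SALT + student_id).encode()).hexdigest()
    if token in submitted_tokens:
        return False                # ballot-box stuffing attempt
    submitted_tokens.add(token)     # records *that* the student submitted...
    anonymous_responses.append(answers)  # ...but not *which* answers are theirs
    return True

print(submit("s123", {"q1": 5}))  # True
print(submit("s123", {"q1": 1}))  # False: second attempt rejected
```

The design choice is that the token table and the response table share no key, so even someone with database access can only tell who participated, not what they said.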
Student Login Evolution
Fall 2003
Administration Issues
- System quality and integrity
- "Buy in" from the deans
But the real issue was ... dealing with the faculty union
Faculty Issue #1
Control of which courses are evaluated
Contract wording: "The evaluation will be conducted in a single section of one course per semester. At the faculty member's option, student evaluations may be conducted in additional sections or courses."
Union Issue #1
- In 2004, all surveys were "turned on" by default; that is, they were all accessible to students on the Web
- This was a breach of the contract clause stating that "evaluation will be conducted in a single section of one course"
- In 2005, the default became inaccessible, so use of the system became voluntary
- As of May 20, 2005 (the end of final exams), 95 professors (25% of the faculty) in 40 departments had made 244 course surveys accessible to students
Faculty Menu
Faculty Issue #2
Control of what questions are asked
Contract wording: "Individual faculty members in conjunction with the Chairs/Heads and/or the personnel committees of academic departments will develop evaluation instruments which satisfy standards of reliability and validity."
Union Issue #2
- In 2004, deans could set questions to be asked on all surveys for their college
- This was a breach of the contract clause stating that faculty would develop questions "in conjunction with the Chairs/Heads and/or department personnel committees"
- In 2005, all college-level questions were moved to the department level, so that only Chairs can specify required questions
- Deans then had essentially no access to the system unless they were teaching themselves or were the acting chair of a department
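The policy change above amounts to a permissions rule: required questions are scoped to a department, and only that department's Chair may set them. The talk describes the policy but not the implementation, so the classes, roles, and field names in this sketch are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Question:
    text: str
    department: str
    required: bool = False

def add_required_question(user_role: str, user_department: str,
                          question: Question) -> bool:
    """Only a department Chair may mark a question required,
    and only within their own department."""
    if user_role != "chair" or user_department != question.department:
        return False
    question.required = True
    return True

q = Question("Rate the course overall.", department="CS")
print(add_required_question("dean", "CS", q))   # False: deans have no access
print(add_required_question("chair", "CS", q))  # True
```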
Faculty Menu
Faculty Question Editor
Faculty Add Question Form
Survey as Seen by Students
Faculty Issue #3
Control of who sees the results
Contract wording: "Student evaluations shall remain at the department level. At the faculty member's option, the faculty member may submit student evaluations or a summary of their results for consideration by various promotion and tenure review committees. The faculty member shall become the sole custodian of these student evaluations at the end of every three academic years and shall have the exclusive authority and responsibility to maintain or destroy them."
Results as Seen by Faculty
Union Issue #3
- Data were collected without faculty consent
- This was a breach of the contract clause stating that "student evaluations shall remain at the department level"
- All survey response data for the Fall 2004 semester were deleted on February 15, 2005, unless the faculty member explicitly asked that they be kept
- What will happen with this semester's data has not yet been determined
Faculty Menu
Lessons Learned/Confirmed
- No matter what you do, there will be those who object
- You must remain open-minded and flexible
- Practice good software engineering so that the software can be easily modified
- It's really worth it to work with the many power factions to garner support
- Every system needs a "champion"
- Be prepared to spend a huge amount of time on system support
Support, Support, Support
Thank You
Jesse M. Heines, Ed.D.
David M. Martin, Ph.D.
Dept. of Computer Science, Univ. of Massachusetts Lowell