Development and Deployment of a Web-Based Course Evaluation System


Development and Deployment of a Web-Based Course Evaluation System
WebIST, Miami, FL, May 26, 2005
Jesse Heines and David Martin, { heines, dm }@cs.uml.edu
Dept. of Computer Science, Univ. of Massachusetts Lowell

The All-Important Subtitle
Trying to satisfy ...
- the Students
- the Administration
- the Faculty
- and the Union
(presented in a slightly different order from that listed in the paper)

Paper-Based System Reality
- Distributed and filled out in classrooms
  - Thus, virtually all students present that day fill them out
  - However, absentees never fill them out
- Collected but not really analyzed
  - At best, Chairs "look them over" to get a "general feel" for students' reactions
  - Professors simply don't bother with them, due to lack of interest and/or perceived importance, or the simple inconvenience of having to go get the forms and wade through them
- Valuable free-form student input is lost, because those comments are often ...
  - downright illegible, or
  - so poorly written that it's simply too difficult to make sense of them
  - However, these comments have the greatest potential to provide real insight into the classroom experience

Bottom Line #1: The paper-based system pays little more than lip service to the cry for accountability in college teaching.

Bottom Line #2: We're all already being evaluated online whether we like it or not ...


Web-Based System Goals
- Collect data in electronic format
  - Easier and faster to tabulate
  - More accurate analysis
  - Possibility of generating summary reports
- Retrieve legible free-form responses
- Allow all students to complete evaluations anytime, anywhere, at their leisure, even if they miss the class in which the evaluations are distributed
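The tabulation goal above can be sketched in a few lines. This is a minimal sketch only; the record layout and question IDs ("question", "rating", "Q1") are illustrative assumptions, not the actual system's schema.

```python
# Sketch: tabulating electronic survey responses into a summary report.
# Field names and question IDs are illustrative, not the real schema.
from collections import defaultdict
from statistics import mean

responses = [
    {"question": "Q1", "rating": 5},
    {"question": "Q1", "rating": 4},
    {"question": "Q2", "rating": 3},
]

def summarize(responses):
    """Group ratings by question and report count and mean for each."""
    by_question = defaultdict(list)
    for r in responses:
        by_question[r["question"]].append(r["rating"])
    return {q: {"n": len(v), "mean": mean(v)} for q, v in by_question.items()}

summary = summarize(responses)
```

With responses already electronic, this kind of summary is a single pass over the data, versus hand-tallying stacks of paper forms.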

What We Thought
If we build it, they will come ...
... but we were very wrong!

Student Issues
- Maintain anonymity
- Ease of use
- Speed of use
We guessed wrong on the relative priorities of these issues.

Student Issues (cont.)
- Our main concern: preventing students from "stuffing the ballot box"
  One Student = One Survey Submission
- A major concern that appeared only after the system was deployed: simply getting students to participate
  There appeared to be a great deal of apathy, particularly in non-technical courses
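One way to enforce "one student = one survey submission" while keeping responses anonymous is a keyed one-way token. The sketch below is an assumption for illustration, not a description of the deployed system: the HMAC scheme, secret, and storage layout are all hypothetical.

```python
# Sketch: enforcing one submission per student per course without storing
# identities alongside answers. The keyed hash and storage shown here are
# illustrative assumptions, not the actual system's design.
import hashlib
import hmac

SERVER_SECRET = b"rotate-me-each-semester"  # hypothetical server-side key

def submission_token(student_id, course_id):
    """Deterministic, non-reversible token: the same student/course pair
    always yields the same token, so duplicates can be rejected, but the
    token cannot be traced back to the student without the secret."""
    msg = f"{student_id}:{course_id}".encode()
    return hmac.new(SERVER_SECRET, msg, hashlib.sha256).hexdigest()

used_tokens = set()   # stored separately from the answers
answers = []          # no identifying fields saved here

def submit(student_id, course_id, ratings):
    token = submission_token(student_id, course_id)
    if token in used_tokens:
        return False  # ballot-box stuffing rejected
    used_tokens.add(token)
    answers.append(ratings)
    return True
```

The design choice is that the token table and the answer table share no key, so seeing the answers never reveals who wrote them, yet a second submission from the same student collides on its token and is refused.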

Student Login Evolution
[series of student login-screen screenshots, beginning Fall 2003]

Administration Issues
- System quality and integrity
- "Buy in" from the deans
- But the real issue was ... dealing with the faculty union

Faculty Issue #1: Control of which courses are evaluated
Contract wording: "The evaluation will be conducted in a single section of one course per semester. ... At the faculty member's option, student evaluations may be conducted in additional sections or courses."

Union Issue #1
- In 2004, all surveys were "turned on" by default, that is, they were all accessible to students on the Web
- This was a breach of the contract clause stating that "evaluation will be conducted in a single section of one course"
- In 2005, the default is inaccessible; use of the system thus became voluntary
- As of May 20, 2005 (the end of final exams), 95 professors (25% of the faculty) in 40 departments had made 244 course surveys accessible to students

Faculty Menu [screenshot]

Faculty Issue #2: Control of what questions are asked
Contract wording: "Individual faculty members in conjunction with the Chairs/Heads and/or the personnel committees of academic departments will develop evaluation instruments which satisfy standards of reliability and validity."

Union Issue #2
- In 2004, deans could set questions to be asked on all surveys for their college
- This was a breach of the contract clause stating that faculty would develop questions "in conjunction with the Chairs/Heads and/or department personnel committees"
- In 2005, all college-level questions were moved to the department level, so that only Chairs can specify required questions
- Deans then had essentially no access to the system unless they were teaching themselves or were the acting chair of a department
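Under the 2005 arrangement, required questions live at the department level and individual faculty add their own. A minimal sketch of that two-tier assembly follows; all identifiers and question wording are hypothetical, invented for illustration.

```python
# Sketch: two-tier survey assembly after the 2005 change. Department Chairs
# control the required tier; faculty add course-specific questions.
# All names and question text below are hypothetical.
dept_required = {
    "CS": [
        "The course objectives were clearly stated.",
        "The instructor was well prepared for class.",
    ],
}

faculty_questions = {
    ("prof_a", "CS101"): ["The programming assignments were helpful."],
}

def build_survey(dept, instructor, course):
    """Department-mandated questions first, then the instructor's own."""
    return (dept_required.get(dept, [])
            + faculty_questions.get((instructor, course), []))

survey = build_survey("CS", "prof_a", "CS101")
```

Note there is no college-level tier at all in this model, which is how the system removed the deans' ability to impose questions.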


Faculty Question Editor [screenshot]

Faculty Add Question Form [screenshot]

Survey as Seen by Students [screenshot]

Faculty Issue #3: Control of who sees the results
Contract wording: "Student evaluations shall remain at the department level. At the faculty member's option, the faculty member may submit student evaluations or a summary of their results for consideration by various promotion and tenure review committees. The faculty member shall become the sole custodian of these student evaluations at the end of every three academic years and shall have the exclusive authority and responsibility to maintain or destroy them."

Results as Seen by Faculty [screenshot]

Union Issue #3
- Data were collected without faculty consent
- This was a breach of the contract clause stating that "student evaluations shall remain at the department level"
- All survey response data for the Fall 2004 semester were deleted on February 15, 2005, unless the faculty member explicitly asked that they be kept
- What will happen with this semester's data has not yet been determined
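The Fall 2004 deletion amounts to a purge pass that keeps only explicitly retained data. Below is a sketch under assumed record and opt-in structures; the names and layout are not the actual system's.

```python
# Sketch: deleting a semester's responses after a deadline unless the
# faculty member asked to keep them. Record layout and names are assumed.
from datetime import date

PURGE_DEADLINE = date(2005, 2, 15)
keep_requests = {"prof_a"}  # faculty who explicitly asked to retain data

records = [
    {"instructor": "prof_a", "semester": "Fall 2004", "ratings": [5, 4]},
    {"instructor": "prof_b", "semester": "Fall 2004", "ratings": [3]},
]

def purge(records, today):
    """After the deadline, drop every record whose instructor did not opt in."""
    if today < PURGE_DEADLINE:
        return records
    return [r for r in records if r["instructor"] in keep_requests]

remaining = purge(records, date(2005, 2, 16))
```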


Lessons Learned/Confirmed
- No matter what you do, there will be those who object, so you must remain open-minded and flexible
- Practice good software engineering so that the software can be easily modified
- It's really worth it to work with the many power factions to garner support
- Every system needs a "champion"
- Be prepared to spend a huge amount of time on system support

Support, Support, Support [screenshot]

Thank You
Jesse M. Heines, Ed.D.
David M. Martin, Ph.D.
Dept. of Computer Science, Univ. of Massachusetts Lowell
{heines,dm}@cs.uml.edu
http://www.cs.uml.edu/{~heines,~dm}