1 Course Evaluation Amitava ‘Babi’ Mitra, MIT Maneesha Aggarwal & Robert Cartolano, Columbia University William Plymale and Aaron Zeckoski, Virginia Tech

2 Agenda
Introduction
Review Common Ground
Features at Columbia and Virginia Tech
Next Steps
Q & A

3 Introduction
How did this begin?
–Late Feb 05: began exploring who was using, or planning to use, online course evaluation within SEPP
–March 05: SEPP Course Eval Working Group set up
What has the WG been doing since then?
–Decided to use Columbia's existing feature set as a baseline to start with
–CU and VT prepared a detailed functional layout and workflow and posted it in April to show clearly how the existing systems might operate; MIT, for example, has studied this and responded with its own needs
–CU and VT have made sandboxes available
Why are we doing this?
–Getting to the common ground: the baseline that most of us can agree on

4 What is Course Evaluation?
Administration of a survey to students across one or more courses
–May span a course, department, school, or university
–Hierarchy, permissions, and access levels are important
Provide results to the university, school, department, faculty, and students to evaluate:
–Instructors
–Curriculum
–Academic program (program evaluation)
–Department, school, university (accreditation)
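The hierarchy and permission levels this slide calls out could be modeled roughly as follows. This is a hypothetical sketch (all names are illustrative), not the actual CourseWorks or VT schema: the idea is simply that a grant at any node implies visibility over everything beneath it.

```java
import java.util.*;

// Sketch: University -> School -> Department -> Course hierarchy.
// A permission granted at a node is inherited by all descendants.
class EvalHierarchy {
    enum Level { UNIVERSITY, SCHOOL, DEPARTMENT, COURSE }

    static class Node {
        final String name;
        final Level level;
        final Node parent; // null for the university root
        Node(String name, Level level, Node parent) {
            this.name = name; this.level = level; this.parent = parent;
        }
    }

    // An administrator may view results for a node if they hold a grant on
    // that node or on any of its ancestors.
    static boolean mayViewResults(Set<Node> grants, Node target) {
        for (Node n = target; n != null; n = n.parent) {
            if (grants.contains(n)) return true;
        }
        return false;
    }
}
```

For example, a department-level grant covers every course in that department but not the school above it, which matches the "access levels" requirement without storing a permission row per course.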

5 Course Evaluation
Evaluation has historical aspects at each school
There are different motivations for evaluation
The system must be flexible enough to deal with different needs, but…
There are common features that we have identified among our working group members

6 Common Features --- from the WG’s discussions
Anonymity: student privacy must be preserved
Evaluation administrator: someone with appropriate authority must create and administer the evaluation
Course selection must be very flexible
–As little as 1 course…
–Or an entire department, program, school, etc.
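The anonymity requirement is commonly met by decoupling the fact of participation from the content of the response. A minimal sketch of that idea (a hypothetical design, not how CourseWorks or the VT system actually store data):

```java
import java.util.*;

// Sketch: who has responded is stored separately from what was said.
// The two stores share no key, so a response cannot be traced back to a
// student, yet duplicate submissions are still blocked.
class AnonymousBallotBox {
    private final Set<String> respondedStudentIds = new HashSet<>();
    private final List<Map<String, Integer>> responses = new ArrayList<>();

    /** Records one submission; returns false (storing nothing) on a duplicate. */
    synchronized boolean submit(String studentId, Map<String, Integer> answers) {
        if (!respondedStudentIds.add(studentId)) return false;
        responses.add(new HashMap<>(answers)); // no student id attached
        return true;
    }

    int responseCount() { return responses.size(); }
    boolean hasResponded(String studentId) { return respondedStudentIds.contains(studentId); }
}
```

In a real deployment the two stores would live in separate tables, and report output would be shuffled so insertion order cannot be correlated with the participation log.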

7 More Common Features --- from the WG’s discussions
Ability to combine core and specific questions
–Course-specific, instructor-specific, etc.
Requires extensive reporting capabilities
–Accessible to administrators, instructors, students, and the public
Requires workflow and automation to make evaluation easy and reduce the burden on administrators

8 Columbia - Current System
–In production since Fall 2003
–Built into CourseWorks, the campus CMS
–Flexible approach needed to support multiple schools, multiple departments, etc.
Carrot vs. stick
Open vs. closed reports
Many reporting types; built-in reports and data export

9 CU Current System (cont’d.)
–Email tool very important; both auto and manual
–Aggregate data across semesters to support accreditation review
–Very popular; well received by faculty, students, and administrators
87,825 evaluations administered to date
54,480 evaluations completed
157 evaluation templates created

10 Columbia Live Demo

Current System

Current System
Implemented different forms for different colleges
–Business, Architecture, Engineering
Statistics
–500 class sections per term
–230,000 sheets per year
Response rate: 78% during the past 3 years

13 SEPP Contribution – VT / CU
Columbia University Partnership
–SEPP Conference – Summer 2004 – Denver
–Robert Cartolano / Maneesha Aggarwal
Virginia Tech Development
–Aaron Zeckoski
–Kapil Ahuja / Justin Gawrilow
SEPP Course Eval Discussion Group
VT / CU Evaluation System Summary
VT Evaluation System – “sandbox” release

14 SEPP Contribution – VT / CU
Virginia Tech extensions
–Administrative hierarchy (Super, University, School, Dept, Instructor)
–Dynamic reporting (flexible formatting, trend analysis, sub-setting)
–Ability to incorporate existing data collected with VT’s current evaluation system
–Oracle database tuning
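The "trend analysis" extension listed above amounts to grouping responses by term and comparing averages. A minimal sketch, with illustrative names only (the actual VT reporting layer ran against a tuned Oracle schema, not in-memory lists):

```java
import java.util.*;
import java.util.stream.*;

// Sketch of longitudinal trend reporting: average one rating question per
// term so a department can compare results across semesters.
class TrendReport {
    record Response(String term, String questionId, int rating) {}

    static Map<String, Double> averageByTerm(List<Response> all, String questionId) {
        return all.stream()
                .filter(r -> r.questionId().equals(questionId))
                .collect(Collectors.groupingBy(
                        Response::term,   // one bucket per term
                        TreeMap::new,     // sorted by term code for trend display
                        Collectors.averagingInt(Response::rating)));
    }
}
```

The same grouping key swapped for department or school gives the "sub-setting" behavior; in SQL this is a plain GROUP BY.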

Pilot System
VT pilot release – May 2005
–Columbia system with VT extensions
–Very limited response to the VT pilot study due to the late release
–VT pilot study will resume Fall 2005

Pilot System
Faculty training will be incorporated into FDI
The change to an online evaluation system needs approval by deans and department heads
The online system will provide better access for Institutional Research and the administration
The online system offers faculty flexibility in adding questions to their evaluations

Demo

Current scenario
Institutional subject evaluation evaluates between subjects each term, about 2/3 of which are undergraduate subjects and 1/3 graduate. This is out of approximately 1,700 subjects with students enrolled and 3,000 total subjects offered for credit each term.
Evaluations are end-of-term only, on 2 paper forms, separately for Science/Engineering subjects and for Humanities, Arts, and Social Science subjects, as chosen by the department.
Quantitative data are captured and analyzed; written comments are available to each department from the originals.
Three departments (Electrical Engineering and Computer Science; Management; and Aeronautics and Astronautics, which is online) have their own forms and systems for subject evaluations.

19 Needs --- Must haves
Must-haves for a new system (to replicate the current system):
–Stand-alone system (with the capability of being tied to the course management system)
–Anonymity, confidentiality, security
–Overall institutional administrator, plus levels of school/departmental administrators
–At least 2 different forms (1 for Science/Engineering subjects and 1 for Humanities, Arts, and Social Sciences subjects), with a preference for many
–Some questions consistent across all schools, with flexibility within school/dept./subject; flexible questions would need to be administered at the local level but available to all
–Rating both subject and instructors on the same form
–Access to individualized reports for each person evaluated
–Assigning roles to instructors (lecturer, recitation instructor, etc.)
–Integration with other MIT systems for data downloads (student, subject, and instructor information) and appropriate access (e.g., only for students registered in the subject)
–A 7-point rating scale, with flexibility to change that if needed
–Blocking access to data/reports until after grades are turned in
–Analysis and reporting at a centralized point
–Student access to appropriate report information through a centralized site

20 Needs --- Necessary
Necessary in any new system:
–Searching data/reports by instructor name
–Assigning rank/status to instructors (prof./assoc. prof./assist. prof./lecturer/grad. student, etc.)
–Individual and other reports that include overall subject, department, school, and institutional data
–Longitudinal data/reports
–Analysis and reporting at local points in addition to the centralized one, with the capability to print subject/department/school level with 1 click
–Linking to evaluation results by department and by subject
–Linking to evaluation results from subject listings/selection (i.e., our online catalog)
–Integration with the course management system, with necessary security
–Having students go to a single site for all their subjects to be evaluated

21 Needs --- Nice to have
Nice-to-have capabilities:
–Evaluating sections of a subject (with the capacity for students to enter the instructor’s name)
–Instructors’ photos together with their names
–Mid-term evaluations
–Evaluations that only faculty would have access to
–Capturing, editing, and distributing students’ comments
–Having a single email sent to a student listing all subjects to be evaluated by that student

22 What’s Next
Create a functional specification for ‘Course Eval ver 1.0’ built upon the Columbia base and Virginia Tech extensions
Develop a Java-based version based on the Sakai framework
Run as standalone or within Sakai
Consider using SAMigo as the assessment engine, but probably not in Course Eval ver 1.0
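One way to satisfy "run as standalone or within Sakai" is to hide the host environment behind a small interface so the evaluation logic never calls Sakai directly. This is a hypothetical sketch of that design choice (the interface and class names are invented, and a real Sakai adapter would call Sakai's user and site services rather than an in-memory map):

```java
import java.util.*;

// Sketch: the evaluation engine depends only on this interface; a Sakai
// adapter and a standalone adapter would each implement it.
interface HostEnvironment {
    String currentUserId();
    List<String> rosterForCourse(String courseId);
}

// Standalone adapter backed by an in-memory roster.
class StandaloneHost implements HostEnvironment {
    private final String user;
    private final Map<String, List<String>> rosters;
    StandaloneHost(String user, Map<String, List<String>> rosters) {
        this.user = user; this.rosters = rosters;
    }
    public String currentUserId() { return user; }
    public List<String> rosterForCourse(String courseId) {
        return rosters.getOrDefault(courseId, List.of());
    }
}

class EligibilityCheck {
    // A student may evaluate a course only if enrolled in it, matching the
    // "only to students registered in subject" access rule on slide 19.
    static boolean mayEvaluate(HostEnvironment host, String courseId) {
        return host.rosterForCourse(courseId).contains(host.currentUserId());
    }
}
```

Swapping adapters changes the deployment mode without touching the evaluation or reporting code.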

23 Open Discussion
Questions?
See demonstrations at 5:30 today
Join the Evaluation Working Group
–Contacts: Robert Cartolano, Amitava ‘Babi’ Mitra