June 10-15, 2012: Growing Community; Growing Possibilities
Switching to online evaluations for courses at UC Berkeley
Daphne Ogle, Design Lead, UC Berkeley
Aaron Zeckoski, Development Lead, Unicon
Speakers
Daphne Ogle, Berkeley Design Lead: OAE since 2010, Fluid Project, Sakai since 2004
Aaron Zeckoski, Unicon Development Lead: original Evaluation System lead (2006), evaluation software since 2001, Sakai since 2004
Unicon Profile
Unicon technology & domain expertise
Unicon service capabilities / offerings
Today's topics
Intro to the UC Berkeley online course evaluation project
Demo: updates to the CLE Evaluation System for the Spring pilot
Future work
What we learned
Questions & answers
Introduction to the online evaluations project at UC Berkeley
Overall project initiative
Online evaluations for courses
New shared questions
◦ campus-wide
◦ based on instructional format (14 formats)
◦ piloted first
Online delivery
◦ pilot of the CLE Evaluation System
◦ development to meet Berkeley needs & improve the overall system
◦ a Berkeley & Unicon collaboration
Goals of the initiative
Rapid access to useful information to help faculty improve their courses and their teaching effectiveness
Improvements to the quality & integrity of the data the campus uses to understand and recognize teaching contributions
21st-century tools for staff to perform their jobs efficiently and free up time for direct service to students and faculty
High-quality information for students about courses and instructors at Berkeley
(Cynthia Schrager, Assistant Vice Provost)
Project timeline
Summer-Fall 2010: research new questions & instructional formats; comparative analysis of evaluation systems; campus buy-in
Spring-Fall 2011: pilot new questions & instructional formats; design, analysis & development of the CLE eval system (in the UCB branch)
Spring 2012: pilot the CLE eval system
Summer 2012: assess the pilot; continued design & development (intended to move to trunk)
Fall 2012: larger second pilot
Spring pilot
Introduction to the online evaluations project at UC Berkeley
Goals of the Spring pilot
Establish a baseline response rate
Assess the CLE evaluation system
◦ long-term technical sustainability
◦ user experience
Build knowledge of the CLE evaluation system
Build knowledge of infrastructure needs
Spring pilot success criteria
Establish a baseline response rate
Collect data on function & experience
User experience meets or exceeds the previous process
Transparency across campus
Elicit higher-quality responses
Gain understanding to inform the Fall design & development effort
Spring pilot makeup
8 departments
61 courses (primary & secondary) and their enrollments
Spring pilot
Demo
Spring pilot development
Significant time spent understanding the system
Overall model & process to follow
◦ "uber" template
◦ self-service
Focus on the student & instructor experience
System areas of focus
◦ dashboard
◦ evaluation presentation
◦ administration page
◦ notifications (see the sketch after this list)
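As context for the notifications focus area, here is a minimal sketch of the reminder behavior the pilot needed: a recurring job that nags non-respondents until the evaluation window closes. Everything here (EvalReminderJob, findNonRespondents, sendReminder) is a hypothetical illustration, not the actual EvalSys code or API.

```java
import java.time.Instant;
import java.time.temporal.ChronoUnit;
import java.util.List;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

/**
 * Hypothetical reminder job, sketched for illustration only; the real
 * EvalSys notification code is not shown in this talk.
 */
public class EvalReminderJob {

    /** Minimal stand-in for an open evaluation window. */
    record Evaluation(String id, Instant dueDate) {}

    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();

    /** Run once a day; remind only while the window is still open. */
    public void start(Evaluation eval) {
        scheduler.scheduleAtFixedRate(() -> {
            if (Instant.now().isAfter(eval.dueDate())) {
                scheduler.shutdown();   // window closed, stop reminding
                return;
            }
            for (String userId : findNonRespondents(eval.id())) {
                sendReminder(userId, eval);
            }
        }, 0, 1, TimeUnit.DAYS);
    }

    /** Placeholder: would query the evaluation store for non-respondents. */
    private List<String> findNonRespondents(String evalId) {
        return List.of();
    }

    /** Placeholder: would hand off to the campus mail service. */
    private void sendReminder(String userId, Evaluation eval) {
        long daysLeft = ChronoUnit.DAYS.between(Instant.now(), eval.dueDate());
        System.out.printf("Reminder to %s: %d day(s) left for %s%n",
                userId, daysLeft, eval.id());
    }
}
```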
Dashboard - student
Dashboard - administrator
Evaluation
Administration
Notifications
Bulk import / export of configurations
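The screenshots for the bulk import/export slide do not survive text extraction; as a stand-in, here is a minimal sketch of what bulk import/export of evaluation configurations could look like, assuming a simple CSV layout. The EvalConfigImporter class, the column order, and the method names are all assumptions for illustration, not the actual EvalSys file format.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

/**
 * Hypothetical bulk importer/exporter for evaluation configurations.
 * Assumed CSV layout: evalId,title,startDate,dueDate,templateId
 */
public class EvalConfigImporter {

    record EvalConfig(String evalId, String title,
                      String startDate, String dueDate, String templateId) {}

    /** Read configurations from a CSV, skipping the header row. */
    public static List<EvalConfig> importCsv(Path csv) throws IOException {
        List<EvalConfig> configs = new ArrayList<>();
        for (String line : Files.readAllLines(csv)) {
            if (line.isBlank() || line.startsWith("evalId")) continue;
            String[] f = line.split(",", -1);
            if (f.length != 5) {
                throw new IOException("Malformed row: " + line);
            }
            configs.add(new EvalConfig(f[0], f[1], f[2], f[3], f[4]));
        }
        return configs;
    }

    /** Round-trip: write configurations back out in the same column order. */
    public static void exportCsv(Path csv, List<EvalConfig> configs) throws IOException {
        List<String> lines = new ArrayList<>();
        lines.add("evalId,title,startDate,dueDate,templateId");
        for (EvalConfig c : configs) {
            lines.add(String.join(",", c.evalId(), c.title(),
                    c.startDate(), c.dueDate(), c.templateId()));
        }
        Files.write(csv, lines);
    }
}
```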
Live Demo
Lessons learned
Food for thought
Lessons learned
Configurability + flexibility = complexity
◦ documentation is not always up to date
◦ allow significant time to "play around"
Unfinished functionality
◦ notifications
◦ options in evaluation creation
Global tool
◦ acts differently than other tools
◦ where will it be exposed?
Lessons learned (cont'd)
Allow a lot of time for QA
◦ see the first bullet above: lots of variables
◦ we were QA'ing up to the last minute
Clean data required
◦ setup needed if not using a group provider
◦ site roles are NOT always evaluation roles (see the sketch after this list)
Communicate!
◦ be transparent
◦ remind users it's a pilot
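To make the role caveat concrete, here is a minimal sketch assuming a hypothetical mapping layer: a CLE site role such as "maintain" does not automatically mean "instructor being evaluated", so the mapping must be explicit and gaps must be detected. The EvalRole enum, the mapping table, and mapSiteRole are illustrative, not EvalSys code.

```java
import java.util.Map;
import java.util.Optional;

/**
 * Illustrative only: site roles in the CLE are not the same thing as
 * evaluation roles, so an explicit mapping (and a gap check) is needed.
 */
public class EvalRoleMapper {

    enum EvalRole { EVALUATOR, EVALUATEE, ADMIN }

    // Assumed campus policy, not a universal rule: "maintain" users are
    // the instructors being evaluated, "access" users are the evaluators.
    private static final Map<String, EvalRole> SITE_TO_EVAL = Map.of(
            "access",     EvalRole.EVALUATOR,
            "maintain",   EvalRole.EVALUATEE,
            "Instructor", EvalRole.EVALUATEE,
            "Student",    EvalRole.EVALUATOR,
            "admin",      EvalRole.ADMIN);

    /** Empty result means the site role has no evaluation meaning at all. */
    public static Optional<EvalRole> mapSiteRole(String siteRole) {
        return Optional.ofNullable(SITE_TO_EVAL.get(siteRole));
    }

    public static void main(String[] args) {
        // A custom site role falls through the mapping: this is exactly
        // the "clean data required" problem from the pilot.
        System.out.println(mapSiteRole("Teaching Assistant")); // Optional.empty
        System.out.println(mapSiteRole("Student"));            // Optional[EVALUATOR]
    }
}
```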
Lessons learned (cont'd)
Learn from other implementers
◦ document models, workflows & their implications
◦ community project?
Culture & legal requirements vary across campus
◦ viewing respondents
◦ roles, permissions, viewing each other's evaluations
Future plans
Evaluation system at Berkeley
What's next? (1 of 2)
Feedback sessions with users on the Spring pilot
Needs analysis around results
Move our work back into trunk
Configuration option to hide the respondents page
UX improvements
◦ results page
◦ multiple instructors (Berkeley-centric)
◦ add a respondents link to the dashboard
◦ make display options clear in context
◦ configuration page (information architecture)
What else is next? (2 of 2)
Investigation
◦ support for instructional formats (Berkeley-centric)
◦ give department admins access to results
◦ add the Evalsys tool to My Workspace automatically
◦ group provider & SIS integration (see the sketch after this list)
◦ partial OAE integration
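For the group provider & SIS item, here is a minimal sketch of the integration shape, assuming a hypothetical SIS client: the provider answers "what role does this user have in this provided section?" so the evaluation system does not need to maintain its own enrollment data. The SisClient interface and method names are assumptions for illustration; the real Sakai GroupProvider contract differs.

```java
import java.util.Map;

/**
 * Sketch of a group provider backed by the campus SIS.
 * Hypothetical interfaces throughout.
 */
public class SisGroupProvider {

    /** Assumed SIS lookup: section id -> (userId -> SIS role). */
    interface SisClient {
        Map<String, String> enrollmentsForSection(String sectionId);
    }

    private final SisClient sis;

    public SisGroupProvider(SisClient sis) {
        this.sis = sis;
    }

    /** Role of one user in one provided section, or null if not enrolled. */
    public String getRole(String sectionId, String userId) {
        return sis.enrollmentsForSection(sectionId).get(userId);
    }

    public static void main(String[] args) {
        // Stub SIS with one section for demonstration.
        SisClient stub = sectionId -> Map.of("student1", "Student",
                                             "prof1", "Instructor");
        SisGroupProvider provider = new SisGroupProvider(stub);
        System.out.println(provider.getRole("2012B-CHEM-1A", "student1")); // Student
        System.out.println(provider.getRole("2012B-CHEM-1A", "ta1"));      // null
    }
}
```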
Results (early idea)
Sneak peek: dashboard
Sneak peek: scale creation
Resources
◦ Berkeley Confluence
◦ Sakai Confluence
◦ Sakai Jira
Thank you! Questions, comments, discussion...