Developing and Implementing Online Test Processing Software
E’lise Flood, Testing Center Specialist
Jenna Anderson, Testing Center Specialist
Ryan Brainerd, Lead, Interactive Design Services
Agenda
About Franklin University
About Our Testing Office (SLC)
Previous Testing Process
The Problems
Our Solution
Software Overview
Results
Future Improvements
Will This Work At Your Institution?
About Franklin University
Private, not-for-profit institution
Approx. 11,000 students
–Central Ohio
–National
–International
Undergraduate and graduate programs
Five physical locations
Online, traditional, and hybrid formats
About Our Testing Office (SLC)
Currently two testing specialists with 3–4 internal test proctors
Out-of-class tests are the highest-volume tests we process
In 2009, we processed 13,683 exams
–56% of those were administered by distance proctors
Previous Testing Process
Receive request by email from instructor
–Proctor information for all students
–Test and related materials
Manual roster pull and verification
Create cover sheets
Individual emails to all proctors
Pre-paid envelopes for returning tests
Paperwork, sorting, and filing when tests were returned
Process took an average of 45 minutes per test
The Problems
Manual process was too time-consuming to scale with the University’s growth
Lack of enforceable deadlines
–5-day window
–Late submissions were problematic
Proctors were not verified in any meaningful way
Lack of a centralized system made record keeping difficult
–Manual process was email and spreadsheets
Our Solution
Centralized web application to handle all requests and automate processing as much as possible
Automatic emails, rosters, and mail merge
System enforces deadlines
System has a searchable proctor database
System enforces some proctor guidelines
–Email blacklist
–Standard survey
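As a rough illustration of the kind of checks such a system can enforce, here is a minimal sketch of deadline and blacklist validation. The class names, fields, and the five-day constant are assumptions drawn from the slides, not Franklin University's actual implementation.

```python
# Illustrative sketch only: names, fields, and the 5-day window are assumptions,
# not the actual Franklin University system.
from dataclasses import dataclass, field
from datetime import date, timedelta

SUBMISSION_WINDOW_DAYS = 5  # assumed minimum lead time before the exam date


@dataclass
class Proctor:
    email: str
    name: str
    blacklisted: bool = False  # set by testing staff, not by instructors


@dataclass
class TestRequest:
    course: str
    exam_date: date
    proctor: Proctor
    submitted_on: date = field(default_factory=date.today)

    def validate(self) -> list[str]:
        """Return the reasons, if any, that this request cannot be accepted."""
        problems = []
        if self.exam_date - self.submitted_on < timedelta(days=SUBMISSION_WINDOW_DAYS):
            problems.append("Request submitted inside the 5-day window.")
        if self.proctor.blacklisted:
            problems.append("Proctor is blacklisted and must be replaced.")
        return problems


if __name__ == "__main__":
    proctor = Proctor(email="proctor@example.com", name="Sample Proctor")
    request = TestRequest(course="MATH 215",
                          exam_date=date.today() + timedelta(days=2),
                          proctor=proctor)
    print(request.validate())  # -> ["Request submitted inside the 5-day window."]
```

Centralizing these checks in the application is what lets the testing office, rather than individual instructors, enforce the deadline and proctor guidelines described above.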
[Software overview diagram: workflow among Student, Proctor, Faculty, and SLC]
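The diagram's four roles suggest a simple request lifecycle. The sketch below shows one way such a workflow could be modeled; the state names and the role assigned to each transition are inferred from the process described in these slides, not taken from the actual system.

```python
# Hypothetical request lifecycle across the four roles in the diagram.
# States and responsible roles are assumptions inferred from the slides.
from enum import Enum


class Status(Enum):
    SUBMITTED = "Faculty submitted test request"
    PROCTOR_PENDING = "Student's proctor awaiting SLC approval"
    SCHEDULED = "SLC approved proctor and emailed materials"
    ADMINISTERED = "Proctor administered and returned the test"
    CLOSED = "SLC recorded results and notified faculty"


# Allowed transitions and the role that triggers each one.
TRANSITIONS = {
    Status.SUBMITTED: (Status.PROCTOR_PENDING, "Student"),
    Status.PROCTOR_PENDING: (Status.SCHEDULED, "SLC"),
    Status.SCHEDULED: (Status.ADMINISTERED, "Proctor"),
    Status.ADMINISTERED: (Status.CLOSED, "SLC"),
}


def advance(status: Status) -> Status:
    """Move a request to its next state, printing who is responsible."""
    next_status, role = TRANSITIONS[status]
    print(f"{role}: {next_status.value}")
    return next_status


if __name__ == "__main__":
    state = Status.SUBMITTED
    while state in TRANSITIONS:
        state = advance(state)
```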
Results
Manual Process | Online Test Administration
Proctor approval at instructor discretion | Proctor approval centralized by testing professionals
Individually send emails to each proctor with the test attached and specific student information | Tests emailed automatically, with specific student information automatically populated
Instructor had to email test request | Instructor submits request through software
Log into software to pull roster | Roster automatically populated
Manually created coversheet | Coversheet automatically populated
Dublin testing center had to have materials saved in a separate folder | Automated convergence at all locations
Manual emails to all other parties | Automated emails to all other parties
45 minutes average per test | 8 minutes average per test
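To put the per-test reduction in context, a back-of-the-envelope estimate, assuming the 2009 volume of 13,683 exams and that every exam saw the full 45-to-8-minute reduction (an assumption the slides do not make):

```python
# Rough, illustrative estimate only: assumes all 13,683 exams from 2009
# would have seen the full 45 -> 8 minute reduction.
exams_per_year = 13_683
minutes_saved_per_test = 45 - 8              # 37 minutes
total_minutes = exams_per_year * minutes_saved_per_test
print(round(total_minutes / 60))             # ~8,438 staff hours per year
```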
Future Improvements
Additional Transparency
–More status emails for student, faculty, proctor, and SLC
–ETAs
Integrated Online Test Environment
Automated Proctor Location
Support for additional physical locations
Will This Work At Your Institution?
Software design and execution was time-consuming
–12 months initial investment
–2 years ongoing support and upgrades
–Franklin has an internal software team (AIS/ID)
Business support
–Academic Standards Proctor Guidelines
–Faculty Approval
–Testing office (SLC)
–Information Technology
Get Something Started
Franklin University, Columbus, OH
E’lise Flood, Testing Center Specialist
Jenna Anderson, Testing Center Specialist
Ryan Brainerd, Lead, Interactive Design Services