Usability Testing
Chris North, CS 3724: HCI
Presentations
- Karen Molye, Steve Kovalak
- Vote: UI Hall of Fame or Shame?
Announcements
- Next Thurs: Proj 2 final implementation, McBryde 102; code to Purvi, bring report to demo
- Presentations: UI critique or HW2 results
- Next Tues: Adam Hahn, Hugh Hockett
- Next Thurs: Matthew Jaswa, Jason Bower
Review
- What are independent variables? What you vary.
- What are dependent variables? What you measure, e.g. performance time.
- How do you analyze the results? t-test, ANOVA.
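To make the analysis step concrete, here is a minimal sketch of a two-sample t-test (Welch's version, which does not assume equal variances) comparing task-completion times for two hypothetical UI designs. The data values and design names are made up for illustration.

```python
from statistics import mean, variance
from math import sqrt

def welch_t(a, b):
    """Welch's two-sample t statistic and degrees of freedom."""
    m1, m2 = mean(a), mean(b)
    v1, v2 = variance(a), variance(b)   # sample variances
    n1, n2 = len(a), len(b)
    se = sqrt(v1 / n1 + v2 / n2)        # standard error of the difference
    t = (m1 - m2) / se
    # Welch-Satterthwaite degrees of freedom
    df = (v1 / n1 + v2 / n2) ** 2 / (
        (v1 / n1) ** 2 / (n1 - 1) + (v2 / n2) ** 2 / (n2 - 1)
    )
    return t, df

# Hypothetical task-completion times (seconds) for two designs
design_a = [42.0, 38.5, 45.2, 40.1, 39.8]
design_b = [55.3, 49.7, 58.1, 52.4, 50.9]

t, df = welch_t(design_a, design_b)
print(f"t = {t:.2f}, df = {df:.1f}")   # large |t| suggests a real difference
```

The t statistic is then compared against the t distribution with the computed degrees of freedom to get a p-value (a statistics package such as SciPy can do this lookup).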
UI Evaluation
- Early evaluation: Wizard of Oz; role playing and scenarios
- Mid evaluation: expert reviews, heuristic evaluation; usability testing ("formative" design); controlled experiments ("summative" measuring)
- Late evaluation: data logging; online surveys
Usability Testing
Data:
- Objective: observe users
- Subjective: user opinion
- Qualitative: non-numeric data
- Quantitative: numeric data
Steps:
1. Design experiment
2. Run experiment
3. Analyze data
4. Back to UI design
1. Design Experiment
Users:
- Representative users from the User Analysis
- 3-5 users; quality, not quantity (80% rule)
Tasks:
- Benchmark tasks: have metrics, tied to the usability specification
- Informal tasks: no metrics, more exploratory
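The "80% rule" refers to the finding (Nielsen & Landauer) that a handful of users uncovers most usability problems: the fraction found by n users is modeled as 1 - (1 - L)^n, where L is the probability that one user hits a given problem (about 0.31 in their studies). A quick sketch of that curve:

```python
# Nielsen & Landauer's model of problem discovery:
# fraction of usability problems found by n test users, where L is
# the per-user probability of encountering a given problem (~0.31).
def problems_found(n, L=0.31):
    return 1 - (1 - L) ** n

for n in (1, 3, 5, 10):
    print(f"{n:2d} users: {problems_found(n):.0%}")
```

With five users the model predicts roughly 84% of problems found, which is where the "3-5 users" rule of thumb comes from.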
Setup
- User instructions: "We are evaluating the system, not you!"
- Legal consent forms
- Pilot test: a rehearsal of the whole procedure
2. Run Experiment
- Lab (McBryde 102)
- Mobile lab
Process
Roles:
- Subject/user
- Facilitator: instructs the user
- Observers: collect data
- Executor: runs the prototype (e.g., if faked)
Process:
- Give the user a task
- Observe
- Avoid interfering; hint only if the user is completely stuck
- ~1 hour per subject
Data Collection
- Video tape: user's screen; user's keyboard and mouse; user's face + audio
- Verbal protocol: think-aloud
- Note taking in real time: critical incidents, with time stamps
- Quantitative HCI metrics, as in a controlled experiment
- Post-session interviews
- Eye tracking, biometrics
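Real-time note taking of critical incidents can be as simple as a timestamped list. A minimal sketch of a hypothetical observer's logging helper (the class name, fields, and example notes are made up):

```python
import csv
import time

class IncidentLog:
    """Record critical incidents with seconds-since-start timestamps."""

    def __init__(self):
        self.start = time.monotonic()
        self.incidents = []

    def note(self, description, severity=3):
        elapsed = time.monotonic() - self.start
        self.incidents.append((round(elapsed, 1), severity, description))

    def save(self, path):
        # Dump to CSV so incidents can be sorted/filtered in a spreadsheet
        with open(path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["t_sec", "severity", "description"])
            writer.writerows(self.incidents)

log = IncidentLog()
log.note("User could not find the search box", severity=4)
log.note("User misread 'Submit' as 'Cancel'", severity=2)
```

The timestamps later let you jump straight to the matching spot in the video recording.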
3. Analyze Data
- Often funny to watch, but avoid reactions like "stupid user!", "that's developer X's fault!", "this sucks"
- Instead ask: "How can we redesign the UI to solve that usability problem?"
- Compare measures to the usability specifications
- Identify problems
- Solve problems in order of importance
Cost-Importance Analysis
Spreadsheet columns: Problem | Importance | Solutions | Cost | Ratio
Importance 1-5 (based on task effect and frequency):
- 5 = critical: major impact on the user, frequent occurrence
- 3 = user can complete the task, but with difficulty
- 1 = minor problem: small speed bump, infrequent
Ratio = importance / cost; sort by this
Three categories: must fix, next version, ignored
Solutions
- Design principles and guidelines
- Brainstorming
- Study other, similar designs
- Solutions suggested by users and experts
- NOT: more training or documentation
- Weigh small UI change vs. major redesign
The Big Picture
- Iterative: re-design, re-test, ...
- Goal: achieve the usability spec (as in software engineering)
- Cost effective: find problems early, before the architecture is finalized
Example: Tom's Hardware
- Compare and buy computer hardware
Sample tasks:
- Find a P4 board, max $150
- How to fix/tweak systems
- Improve system performance?
- How to overclock a Celeron?