Heuristic Evaluation (Yaser Ghanam)
Roadmap: Introduction, How it works, Advantages, Shortcomings, Conclusion, Exercise
Introduction. Introduced by Nielsen as a discount usability method; applied early in the design or during implementation. Given: a prototype or a working system, a set of usability heuristics, and a few evaluators. Come up with: a usability evaluation of the system.
Introduction. [Diagram: evaluators apply a set of usability heuristics to the system, producing a usability evaluation.]
How it works - Procedure: 1. Get the heuristics, 2. Get the system ready, 3. Get the evaluators, 4. Do the evaluation, 5. Compile the results, 6. Conduct severity rating, 7. Develop an action plan.
Get the heuristics. Heuristics are system-dependent. Nielsen's heuristics have proved reliable and representative. Feel free to add more heuristics, but not many, and to drop irrelevant ones.
Usability Heuristics Visibility of system status
Usability Heuristics Match between system and the real world
Usability Heuristics User control and freedom
Usability Heuristics Consistency and standards (Greenberg, S., Overview of Heuristic Evaluation, wiki/uploads/CPSC681/Heuristic.ppt, accessed October 10)
Usability Heuristics Error prevention
Usability Heuristics Help users recognize, diagnose, and recover from errors
Usability Heuristics Recognition rather than recall
Usability Heuristics Flexibility and efficiency of use. Shortcuts: normal mode vs. advanced mode; don't make it an alternative (e.g., Ctrl+C for Copy).
Usability Heuristics Aesthetic and minimalist design
Usability Heuristics Help and documentation. Good usability means users are able to use the system with no or minimal documentation.
Get the system ready. Prototype: for a novel application or interface; no redesign required and less maintenance later. Working system: for a replacement study or a comparison against the competition. Prepare typical scenarios based on task analysis.
Get the evaluators. Heuristic evaluation (HE) is a group effort. (Nielsen, J., and Mack, R.L. (Eds.), Usability Inspection Methods, John Wiley & Sons, New York, NY, 1994.)
Get the evaluators. The more the better? Not necessarily. Rule of thumb: 3 to 5 evaluators. (Nielsen, J., and Mack, R.L. (Eds.), Usability Inspection Methods, John Wiley & Sons, New York, NY, 1994.)
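As background (not stated on the slide itself), Nielsen and Landauer model the share of usability problems found by i independent evaluators as 1 - (1 - λ)^i, where λ is the share a single evaluator finds; λ ≈ 0.3 is a commonly quoted value. The minimal sketch below uses that assumed λ to show why returns diminish beyond 3 to 5 evaluators.

```python
# Sketch of the Nielsen & Landauer curve: expected share of usability
# problems found by i independent evaluators, assuming each evaluator
# finds a fraction lam of all problems (lam = 0.3 is an assumed value).
def share_found(i, lam=0.3):
    return 1 - (1 - lam) ** i

for i in range(1, 11):
    print(f"{i} evaluators -> ~{share_found(i):.0%} of problems found")
# With lam = 0.3: 3 evaluators find ~66%, 5 find ~83%, 10 find ~97%.
```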
Get the evaluators. Evaluators' expertise: novices should ideally be potential users of the system; usability experts are more effective; double experts (in both usability and the domain) are the best to get but very expensive. Session manager: facilitates the evaluation session and aggregates the reports. Observers: provide help to the evaluators.
Do the evaluation. Evaluators get the heuristics and scenarios, navigate through the system twice, and inspect the screens, dialogues, forms, messages and menus in the system. Each problem is categorized under one of the heuristics, with a specific explanation. Comments beyond the heuristics are also welcome. Findings are reported in writing or verbally to the observer.
Do the evaluation. Observers answer evaluators' questions, especially domain-specific ones, but without influencing judgments. IMPORTANT: inspection is done individually; evaluators are not allowed to communicate. A session takes 1 to 2 hours.
Compile the results. Aggregate the evaluators' reports, eliminate duplicate entries, and merge entries that describe the same problem. Output: one report of all usability problems found by the evaluators.
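Purely as an illustration (the slides prescribe no data format), compiling the reports amounts to pooling every evaluator's findings and keeping one entry per distinct problem. A minimal sketch with a hypothetical Finding record, treating "same heuristic at the same location" as a duplicate:

```python
# Minimal sketch of compiling evaluators' reports into one problem list.
# The Finding record and the duplicate criterion (same heuristic + same
# location) are illustrative assumptions, not part of the method itself.
from dataclasses import dataclass

@dataclass(frozen=True)
class Finding:
    heuristic: str    # which heuristic the problem violates
    location: str     # screen, dialogue, form, message or menu
    description: str  # evaluator's specific explanation

def compile_reports(reports):
    """Aggregate per-evaluator reports, eliminating duplicate entries."""
    merged = {}
    for report in reports:
        for f in report:
            key = (f.heuristic, f.location)  # same heuristic + location = one problem
            merged.setdefault(key, f)        # keep the first description seen
    return list(merged.values())
```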
Conduct severity rating. Evaluators are made aware of all usability problems found (not only their own). Severity is determined by frequency of occurrence, impact on the user, and persistence. Scale: 0 = not a problem at all, 1 = cosmetic problem, 2 = minor problem, 3 = major problem, 4 = catastrophic problem.
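The slides only give the 0-4 scale; a common follow-up (an assumption here, not stated on the slide) is to average each problem's ratings across evaluators and rank the list so the action-plan meeting starts from the worst problems:

```python
# Sketch: average each problem's 0-4 severity across evaluators and rank.
# The input format (problem description -> list of individual ratings) is assumed.
def rank_by_severity(ratings):
    averaged = {problem: sum(scores) / len(scores)
                for problem, scores in ratings.items()}
    return sorted(averaged.items(), key=lambda kv: kv[1], reverse=True)

ranked = rank_by_severity({
    "no undo on delete": [4, 3, 4],          # rated major/catastrophic
    "inconsistent button labels": [2, 1, 2],  # rated minor/cosmetic
})
for problem, severity in ranked:
    print(f"{severity:.1f}  {problem}")
```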
Develop an action plan. Evaluators, the facilitator and the design team meet to discuss the problems, suggest solutions and consider the organization's priorities. Decisions are made: fix major but not minor problems, delay the release, replace the interface, etc.
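A minimal sketch of one possible decision rule ("fix major, not minor problems"), building on the hypothetical ranked list from the severity sketch above; the cutoff value is an assumption, not prescribed by the method:

```python
# Sketch: split the ranked problem list into "fix now" and "deferred",
# using an assumed cutoff of 3 (major) on the averaged 0-4 severity scale.
SEVERITY_CUTOFF = 3

def plan(ranked_problems):
    fix_now = [(p, s) for p, s in ranked_problems if s >= SEVERITY_CUTOFF]
    deferred = [(p, s) for p, s in ranked_problems if s < SEVERITY_CUTOFF]
    return fix_now, deferred
```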
Advantages. A discount usability method: a small investment yields many findings. Easy to teach, fast to conduct, and cheap. Can be used early in design. High benefit-to-cost ratio (a ratio of about 48 to 1 has been reported).
Shortcomings. The reduced set of heuristics is very broad and general. Usually does not involve end users. Finds many minor problems, sometimes causing false alarms. Not suitable for in-depth usability testing or for critical systems.
Conclusion A discount usability engineering method. Best for time-constrained, budget-limited projects. 3 to 5 evaluators follow 10 heuristics. Finds many problems in a short time. Does not replace other usability methods.
Thanks for listening. Questions?
Exercise Nielsen, J. (1993). Usability Engineering, Academic Press.