
1 Heuristic Evaluation

2 Sources for today’s lecture:
Professor James Landay: http://bmrc.berkeley.edu/courseware/cs160/fall98/lectures/heuristic-evaluation/heuristic-evaluation.ppt
Jakob Nielsen’s web site: http://www.useit.com/papers/heuristic/heuristic_evaluation.html
Nielsen articles linked to course web site

3 Heuristic evaluation (what is it?)
Method for finding usability problems
Popularized by Jakob Nielsen
“Discount” usability engineering:
- Use with working interface or scenario
- Convenient
- Fast
- Easy to use

4 Heuristic evaluation (how?)
Small set of evaluators (3-5)
Each one works independently
Find problems with an interface using a small number of heuristics (principles)
Aggregate findings afterward

5 Use multiple evaluators
These people can be novices or experts:
- “novice evaluators”
- “regular specialists”
- “double specialists” (Nielsen)
Each evaluator finds different problems
The best evaluators find both hard and easy problems

6 Use multiple evaluators

7 Proportion of usability problems found by different numbers of evaluators (Nielsen)
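The shape of this curve comes from a simple probabilistic model (Nielsen & Landauer, 1993): if each evaluator independently finds a given problem with probability p, then i evaluators find a fraction 1 - (1 - p)^i of all problems. A minimal sketch in Python; the value p = 0.24 is an assumption chosen so that five evaluators find about 75% of the problems, roughly matching the figures Nielsen reports:

```python
# Nielsen & Landauer's model of problems found by i independent evaluators:
#   found(i) = 1 - (1 - p) ** i
# p is the probability a single evaluator finds a given problem
# (an assumed value here; Nielsen's studies report p roughly 0.2-0.4).

def proportion_found(i: int, p: float = 0.24) -> float:
    """Expected fraction of all usability problems found by i evaluators."""
    return 1 - (1 - p) ** i

for i in (1, 3, 5, 10):
    print(f"{i} evaluators: {proportion_found(i):.0%}")
# e.g. 5 evaluators: 75% (with the assumed p = 0.24)
```

The diminishing returns visible in the curve fall out of the model: each extra evaluator mostly rediscovers problems the earlier ones already found.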

8 Heuristic Evaluation - Advantages
Evaluators can be experts.
There need not be a working system.
Evaluators evaluate the same system or scenario.
Often, about 5 evaluators can discover around 75% of the problems.

9 Principles (Nielsen’s original set)
Simple & natural dialog
Speak the users’ language
Minimize users’ memory load
Be consistent
Provide feedback
Provide clearly marked exits
Provide shortcuts
Good error messages
Prevent errors

10 Sample Heuristics (we’ll be using these)
1. Visibility of system status
2. Match between system & real world
3. User control and freedom
4. Consistency & standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility & efficiency of use
8. Aesthetic & minimalist design
9. Help users recognize, diagnose, and recover from errors
10. Help & documentation
(PRS pp. 408-409)

11 Revised principles (PRS, 408-9)
1. Visibility of system status
Example: a status message such as “searching database for matches”

12 What is “reasonable time”?
0.1 sec: feels immediate to the user; no additional feedback needed.
1.0 sec: tolerable, but doesn’t feel immediate; some feedback needed.
10 sec: maximum duration for keeping the user’s focus on the action.
For longer delays, use percent-done progress bars.
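These thresholds can be turned into a simple dispatch rule. A sketch in Python; the function name and the feedback labels are illustrative assumptions, only the time thresholds come from the slide:

```python
# Hypothetical helper mapping an operation's expected delay to the kind
# of feedback the slide recommends. Thresholds follow the 0.1 / 1 / 10
# second guideline; the returned labels are invented for illustration.

def feedback_for(expected_seconds: float) -> str:
    if expected_seconds <= 0.1:
        return "none"                       # feels immediate to the user
    if expected_seconds <= 1.0:
        return "busy cursor"                # noticeable, give some feedback
    if expected_seconds <= 10.0:
        return "spinner"                    # user can still stay focused
    return "percent-done progress bar"      # long delay: show progress

print(feedback_for(0.05))   # none
print(feedback_for(30.0))   # percent-done progress bar
```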

13 2. Match between the system and the real world

14 Natural dialog?
Socrates: Please select command mode
Student: Please find an author named Octavia Butler.
Socrates: Invalid Folio command: please

15 Another example: Dragging a diskette into the trash (Stay tuned for lecture on metaphors!)

16 3. User control and freedom
- Provide exits for mistaken choices
- Enable undo, redo
- Don’t force users to take a particular path

17 4. Consistency and standards
See also: SPSS menus, where “OK” is inconsistently located.

18 5. Error prevention
People make errors. Yet we can try to prevent them.
How might you go about preventing errors?

19 5. Error prevention
People make errors. Yet we can try to prevent them.
How might you go about preventing errors?
(Try adding forcing functions.)
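A forcing function prevents an error by blocking the unsafe action until its precondition is met, rather than reporting the damage afterwards. A minimal sketch; the class and method names are hypothetical:

```python
# Illustrative forcing function: a document refuses to close while it
# has unsaved changes, so the "lost work" error cannot happen silently.
# All names here are invented for illustration.

class Document:
    def __init__(self) -> None:
        self.saved = True

    def edit(self) -> None:
        self.saved = False

    def save(self) -> None:
        self.saved = True

    def close(self, force: bool = False) -> None:
        # Forcing function: unsaved work blocks closing by default.
        if not self.saved and not force:
            raise ValueError("Unsaved changes: save first, or close(force=True).")

doc = Document()
doc.edit()
try:
    doc.close()            # blocked: the error is prevented, not reported
except ValueError as e:
    print(e)
doc.close(force=True)      # explicit, deliberate override
```

The same idea appears in physical design (a car that won’t shift out of park until the brake is pressed) and in dialogs that grey out “OK” until required fields are filled.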

20 6. Recognition rather than recall
Example: an interface where you can’t copy information from one window to another.
Violates: minimize the users’ memory load (see also Norman’s book).

21 7. Flexibility and efficiency of use
- Provide shortcuts
- Enable macros
- Provide multiple ways of accomplishing the same thing

22 8. Aesthetic and minimalist design NOT!

23 9. Help users recognize, diagnose, and recover from errors

24 Error messages (Unix)
SEGMENTATION VIOLATION!
Error #13: ATTEMPT TO WRITE INTO READ-ONLY MEMORY!
Error #4: NOT A TYPEWRITER
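By contrast, heuristic 9 asks for messages in plain language that say what went wrong and suggest a recovery. A sketch of the difference; the function, file name, and wording are all invented for illustration:

```python
# Instead of surfacing a raw, cryptic error, wrap it in a message that
# states the problem in the user's terms and suggests a way to recover.
# open_config and the settings-file scenario are hypothetical examples.

def open_config(path: str) -> str:
    try:
        with open(path, encoding="utf-8") as f:
            return f.read()
    except FileNotFoundError:
        raise FileNotFoundError(
            f"Could not find the settings file '{path}'. "
            "Check the spelling, or create it with default settings."
        ) from None

try:
    open_config("missing-settings.cfg")
except FileNotFoundError as e:
    print(e)
```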

25 10. Help and documentation

26 Heuristics adapted to web site evaluation: (PRS p. 415) Adapt the general heuristics provided by Nielsen to the particular domain!

27 Phases of heuristic evaluation
1. Pre-evaluation training: give evaluators needed domain knowledge and information on the scenario (readings, this lecture!)
2. Have them evaluate the interface independently
3. Classify each problem & rate it for severity
4. Aggregate results (Matt will do this)
5. Debrief: report the results to the interface designers

28 Severity ratings
Each evaluator rates individually:
0 - don’t agree that this is a usability problem
1 - cosmetic problem
2 - minor usability problem
3 - major usability problem; important to fix
4 - usability catastrophe; imperative to fix
In giving a rating, consider both the flaw’s impact and its frequency.
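The aggregation step often boils down to averaging each problem’s severity across evaluators and sorting worst-first. A minimal sketch; the problem names and ratings are invented sample data:

```python
# Sketch of aggregating severity ratings: average each problem's 0-4
# ratings across evaluators, then sort worst-first. Data is invented.
from statistics import mean

ratings = {  # problem -> one severity rating per evaluator
    "No exit from dialog": [4, 3, 4],
    "Inconsistent OK button": [2, 2, 1],
    "Jargon in error text": [3, 2, 3],
}

by_severity = sorted(ratings.items(), key=lambda kv: mean(kv[1]), reverse=True)
for problem, scores in by_severity:
    print(f"{mean(scores):.1f}  {problem}")
# prints problems worst-first, e.g. "3.7  No exit from dialog" at the top
```

Sorting by mean severity gives the designers a fix-first list; disagreement between evaluators (a problem rated both 0 and 4) is itself worth discussing in the debrief.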


30 Conclusion
Heuristic evaluation is a great “discount” method. (You will try out this method with Assignment #2.)
But it’s not perfect: some reported “problems” may not matter in practice, and some real problems will be missed.
For best results, use heuristic evaluation in combination with user testing!


