Actionable Information: Tools and Techniques to Help Design Effective Intranets
Frank Cervone, Assistant University Librarian for Information Technology, Northwestern University, Evanston, IL, USA
Darlene Fichter, Data Librarian, University of Saskatchewan, Saskatoon, SK, Canada
Overview
– Why heuristic testing?
– What is heuristic testing?
– Heuristics applied to the web
– Using heuristic testing for your intranet
Why?
– Evaluators who are experts both in software ergonomics and in the field in which the software is applied will find 81%-90% of usability problems [1]
– Evaluators who know nothing about usability find only 22% to 29% of the problems [1]
– A single evaluator finds only about 35 percent of the problems [2]

1) Jakob Nielsen, "Finding usability problems through heuristic evaluation," Proceedings of ACM CHI '92 (May 3-7, 1992), pp. 373-380.
2) Jakob Nielsen, http://www.useit.com/papers/heuristic/heuristic_evaluation.html
Heuristic evaluation?
What
– A usability inspection method
– One or more expert evaluators systematically inspect a user interface design
– They judge its compliance with recognized usability principles
When
– At any point in the design process
Who
– More evaluators are better
– Best results with at least 3-5 evaluators
– But 1 is better than none!
Yes, more is better (chart courtesy of useit.com: http://www.useit.com/papers/heuristic/heuristic_evaluation.html)
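The curve behind this chart comes from Nielsen and Landauer's model of problem discovery: the proportion of problems found by i evaluators is roughly 1 - (1 - λ)^i, where λ is the proportion a single evaluator finds (about 31% in their data). A minimal sketch in Python, assuming λ = 0.31:

```python
# Proportion of usability problems found by i evaluators, per the
# Nielsen & Landauer model: found(i) = 1 - (1 - lam)**i.
# lam = 0.31 is the typical single-evaluator rate they report;
# treat it as an assumption, not a constant of nature.
lam = 0.31

for i in range(1, 11):
    found = 1 - (1 - lam) ** i
    print(f"{i} evaluator(s): {found:.0%} of problems found")
```

With λ = 0.31 this gives about 31% for one evaluator, 67% for three, and 84% for five, which is why 3-5 evaluators are recommended: each additional evaluator adds less than the one before.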
How?
– Evaluators review the interface individually, report problems to a coordinator, and assign severity ratings
– The coordinator combines the problems and removes duplicates (sketched below)
– Evaluators review the combined list, optionally assigning severity ratings as a group
– The coordinator averages the ratings and ranks problems by severity
– The web team looks for patterns and finds solutions
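In code terms, the coordinator's job is a small aggregation: merge duplicate reports, average the severities, and sort. A sketch with invented data (the evaluator names and problem descriptions are illustrative, loosely echoing findings discussed later in the talk):

```python
from collections import defaultdict
from statistics import mean

# Each evaluator reports (problem description, severity 0-4).
# Identical descriptions are treated as duplicates of one problem;
# in practice the coordinator merges near-duplicates by hand.
reports = {
    "Evaluator A": [("No 'Home' button on subpages", 3),
                    ("Main links purple, smaller links blue", 2)],
    "Evaluator B": [("No 'Home' button on subpages", 4)],
    "Evaluator C": [("Jargon such as 'descriptor' in labels", 3),
                    ("Main links purple, smaller links blue", 1)],
}

# Combine duplicate reports of the same problem.
ratings = defaultdict(list)
for problems in reports.values():
    for problem, severity in problems:
        ratings[problem].append(severity)

# Average the ratings and rank problems by severity, highest first.
for problem, scores in sorted(ratings.items(), key=lambda kv: mean(kv[1]), reverse=True):
    print(f"{mean(scores):.1f}  {problem}  ({len(scores)} rating(s))")
```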
Why?
– A good method for finding both major and minor problems in a user interface
– Finds major problems quickly
– Results tend to be dominated numerically by minor problems, so it is important to rate and rank errors
– Complements user testing; it is not a replacement for it
– Finds different types of errors: the things an "expert" user would notice
Rating errors
Frequency
– Is the problem common or rare?
Impact
– Is it easy or difficult for users to overcome?
Persistence
– Is it a one-time problem that users can overcome once they know about it, or will they be bothered by it repeatedly?
Market impact
– Certain usability problems can have a devastating effect even if they are quite easy to overcome
Rating scale
0 = Not a problem: I don't agree that this is a usability problem at all
1 = Cosmetic problem only: need not be fixed unless extra time is available
2 = Minor usability problem: fix should be given low priority
3 = Major usability problem: important, so should be given high priority
4 = Usability catastrophe: imperative to fix before release
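As a data type the scale is just an ordered 0-4 code, which makes the averaging and ranking above straightforward. A minimal sketch (the Severity enum is a hypothetical helper, not part of any tool mentioned in the talk):

```python
from enum import IntEnum

class Severity(IntEnum):
    """Nielsen's 0-4 severity scale for usability problems."""
    NOT_A_PROBLEM = 0  # don't agree this is a usability problem at all
    COSMETIC = 1       # need not be fixed unless extra time is available
    MINOR = 2          # fix should be given low priority
    MAJOR = 3          # important, so should be given high priority
    CATASTROPHE = 4    # imperative to fix before release

print(int(Severity.MAJOR))  # 3: usable directly in averages and sorts
```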
Heuristics (1-5)
1) Visibility of system status
2) Match between system and the real world
3) User control and freedom
4) Consistency and standards
5) Error prevention
Heuristics (6-10)
6) Recognition rather than recall
7) Flexibility and efficiency of use
8) Aesthetic and minimalist design
9) Error recovery (help users recognize, diagnose, and recover from errors)
10) Help and documentation
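Taken together, the ten heuristics make a natural checklist for tallying violations during a review, which is how the Edmonton results below are summarized. A sketch (the tally scheme is illustrative, not prescribed by the talk):

```python
# Nielsen's ten heuristics as a simple tally sheet for a review.
HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognize, diagnose, and recover from errors",
    "Help and documentation",
]

violations = dict.fromkeys(HEURISTICS, 0)
violations["Consistency and standards"] += 1  # e.g. inconsistent link colors
print(max(violations, key=violations.get))    # most-violated heuristic
```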
That’s great, but…
So, how do you apply this in the real world? There are several possibilities:
– Unstructured evaluation
– Structured evaluation
Unstructured evaluation
– Let the experts find the problems as they occur
– Provides greater “free-form” discovery of problems
– More appropriate when working with usability experts
Edmonton Public Library site
– 3 evaluators reviewed the site
– 2 passes through the site
– 1½ to 2 hours
Report summary
– More than 100 unique violations
– Over 60 violations of “consistency and standards”
– Another frequently violated heuristic was the “match between the system and the real world,” due to poor labels, jargon, and ordering of items
Frequency by heuristic (chart)
Problems by area (chart)
Link colors
– Main links in purple
– Smaller links in blue
– Other areas had different link colors altogether
Different menus
– No ‘Home’ button
– ‘Borrower Services’ not found as a main page heading
– Menu options are the search links for the Song Index
– No side navigation menu offered
Labels, language, and ambiguity
– Overlapping categories
– Mismatch between headings and items
– Vague headings
– Audience-specific areas are scattered
Structured evaluation
– Develop a list of specific questions related to the issues at hand, tied back to the heuristic principles
– Provides greater direction of problem-solving energy
– More appropriate when relying on “subject” experts
Sample questions at Northwestern
1. Did you feel that you were able to tell what was going on with the system while you were working?
2. Did the language on the site make sense to you? Were you able to understand what the pages were trying to communicate?
3. Did you notice inconsistencies in the way things were referred to?
4. Were you able to navigate and use the site without having to refer back to other pages for needed information?
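Each question can be tied back to the heuristic it probes. Here is one possible mapping (the pairings are our reading of the questions, not a mapping published with the Northwestern study):

```python
# Structured-evaluation questions paired with the heuristic each probes.
QUESTIONS = [
    ("Were you able to tell what was going on with the system?",
     "Visibility of system status"),
    ("Did the language on the site make sense to you?",
     "Match between system and the real world"),
    ("Did you notice inconsistencies in how things were referred to?",
     "Consistency and standards"),
    ("Could you use the site without referring back to other pages?",
     "Recognition rather than recall"),
]

for question, heuristic in QUESTIONS:
    print(f"[{heuristic}] {question}")
```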
Feedback: what people said
Question: Did the language on the site make sense to you? Were you able to understand what the pages were trying to communicate?
– “No, sentences are too long. Use numbers to mark each choice.”
– “Some of the language seemed a bit like ‘library-ese,’ i.e. terms like ‘descriptor,’ etc.”
– “Most of the language makes sense, in the sense that it is not jargon (except for ‘NUcat’), but as I said the contents of the categories are not always clear.”
Long sentences and jargon (annotated screenshot)
Interesting observations
– “There are too many choices which are hard to distinguish between.”
– “It seems like the info organization probably reflects the internal structures of the library more than the user’s point of view.”
– “I generally felt lost on the site. It was unclear where I needed to go to actually find anything I needed.”
– “Too much information on one page.”
How is heuristic evaluation relevant to usability testing?
– Allows us to fix big problems before user testing
– Provides clues to problem areas, which can be the basis for determining usability-testing questions
How is this different from usability testing?
– Analyzing the user interface is the responsibility of the evaluator, not of a test participant
– The observer can answer questions from the evaluators during the session
– Evaluators can be given hints on using the interface
Other types of evaluation techniques
– Heuristic evaluation
– Heuristic estimation
– Cognitive walkthrough
– Pluralistic walkthrough
– Feature inspection
– Consistency inspection
– Standards inspection
– Formal usability inspection
Questions?
Frank Cervone, Assistant University Librarian for Information Technology, Northwestern University, Evanston, IL, USA
f-cervone@northwestern.edu
Darlene Fichter, Data Librarian, University of Saskatchewan, Saskatoon, SK, Canada
darlene.fichter@usask.ca