Presentation transcript: "Practical Results from Large-Scale Web Usability Testing", Rolf Molich, DialogDesign

1/43 Practical Results from Large-Scale Web Usability Testing
Rolf Molich, DialogDesign

2/43 Download Test Reports and Slides
http://www.dialogdesign.dk/cue2.htm
Slides in Microsoft PowerPoint 97 format

3/43 How It All Started...
A recent survey shows that 80% of all Danish drivers think that their driving skills are above average.
How about usability testers?

4/43 How It All Started...
- Too much emphasis on one-way mirrors and scan converters
- Little knowledge of REAL usability testing procedures; mainly beautified descriptions
- Too little emphasis on usability test procedures and quality control ("Who checks the checker?")

5/43 Who Checks the Checker?
- When did YOU last have an objective check of your usability testing skills?
- Who would you trust as an evaluator of your skills?

6/43 Comparative Evaluations

Test  End     Test object                 Student teams  Professional teams
1     Oct 97  9 Danish web-sites          50             0
2     Dec 97  CUE-1: Win Calendar Progr.  0              4
3     Oct 98  9 Danish web-sites          50             0
4     Dec 98  CUE-2: www.hotmail.com      2+3            7
5     Mar 99  Web Text - Encyclopedia     0              4

7/43 Student Tests
- Introductory course in Human-Computer Interaction at the Technical University of Copenhagen
- Two courses, in the fall of 1997 and 1998
- 120 students per course
- Fifty teams of one to three students
- 2 x 9 Danish web-sites, each tested by four to nine teams with at least four test participants
- Three weeks to complete the test and write the report

8/43 Can Students Do Usability Testing?
- Quality of usability tests and reports is acceptable, considering that most teams used 20-50 hours
- Some teams wrote quite professional reports after just one month of the course (surprise?)
- Few false problems and opinions
- Limited overlap between findings

9/43 [Screenshot slide; no transcript text]

10/43 www.bokus.com - Bookstore
Buttons in the lower right corner:
- Empty shopping basket
- Change order
- Continue shopping
- Go on with your purchase
Would a human bookseller act like this?

11/43 Conclusions -
- Inhuman treatment of users on many e-commerce web-sites
- On-site searching seldom works; users are better off without on-site searching
- Many web-sites focus on the company, not the user

12/43 Conclusions +
- Nice layout and graphics
- Good response time
- Correct results

13/43 Problem Example
User task: You want to take your business to BG Bank. Make an appointment with the bank.
- Hard to find in the menu structure
- Users entered "appointment" as the keyword for Search

14/43 [Screenshot slide; no transcript text]

15/43 How to Improve Search
- Tolerate user input errors
- Provide human, constructive error messages
- Recommend the index and site map
- Special handling of frequent keywords
- Show the user's search keywords in context
(A sketch of the first four ideas follows below.)
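A minimal Python sketch of four of these recommendations (my illustration; the page table, synonym list, and message wording are hypothetical, not taken from any tested site):

```python
import difflib

# Hypothetical site pages, keyed by the phrase a user might search for.
PAGES = {
    "open account": "/services/new-customer",
    "make appointment": "/contact/branch-finder",
    "interest rates": "/rates",
}
# Special handling of frequent keywords, e.g. "appointment" from the
# BG Bank task two slides earlier.
FREQUENT = {"appointment": "make appointment", "meeting": "make appointment"}

def search(query: str) -> str:
    q = query.strip().lower()
    q = FREQUENT.get(q, q)          # map frequent keywords to a known phrase
    if q in PAGES:
        return f"Match: {PAGES[q]}"
    # Tolerate user input errors: fuzzy-match against the known phrases.
    close = difflib.get_close_matches(q, PAGES, n=1, cutoff=0.6)
    if close:
        return f"No exact match. Did you mean '{close[0]}'? See {PAGES[close[0]]}"
    # Constructive, human error message that recommends the index/site map.
    return ("No results. Try the A-Z index or the site map, "
            "or rephrase your search in one or two words.")

print(search("appointment"))   # Match: /contact/branch-finder
print(search("apointment"))    # No exact match. Did you mean 'make appointment'? ...
```

Showing the user's keywords in context (the fifth recommendation) would additionally require returning matching page excerpts, which is omitted here.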

16/43 CUE-1: Comparative Usability Evaluation 1
- Four professional teams usability tested the same Windows calendar program
- Two US teams (Sun, Rockwell), one English (NPL), and one Irish (HFRG, Univ. Cork)
- Results published in a panel and a paper at UPA98
- Main conclusions similar to CUE-2

17/43 CUE-2: Comparative Usability Evaluation 2
- Nine teams have usability tested the same web-site:
  - Five professional teams
  - Two semi-professional teams
  - Two student teams
  - (plus three student teams from TUD)
- Test web-site: www.hotmail.com

18/43 Purposes of CUE-2
- Provide a survey of the state of the art within professional usability testing of web-sites
- Set a benchmark against which other usability labs can measure their usability testing skills
- Investigate the reproducibility of usability test results
- Give participating teams an idea of the strengths and weaknesses in their approach to usability testing

19/43 Non-Purposes of CUE-2
- To pick a winner
- To make a profit

20/43 Basis for the Usability Test
- Web-site address: www.hotmail.com
- Client scenario (written by Erika Kindlund and Meeta Arcuri)
- Access to the client through an intermediary (Erika Kindlund)
- One month to carry out the test; the web-site address was not disclosed until the start of the test period

21/43 What Each Team Did
- Become familiar with Hotmail
- Define test scenarios
- Define a user profile; recruit test participants
- Run a suitable number of tests, determined by the team
- Write a usability test report in standard company format and anonymize it

22/43 Problems Found

                          CUE-1      CUE-2
Total number of problems  141        300
Found by seven teams      -          1
Found by six teams        -          1
Found by five teams       -          4
Found by four teams       1          4
Found by three teams      1          15
Found by two teams        11         49
Found by only one team    128 (91%)  226 (75%)
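A quick consistency check on the table above (this snippet is mine, not from the talk): summing each overlap distribution reproduces the study totals, and the single-team share gives the quoted percentages.

```python
# Overlap distributions from the slide: how many teams found each problem
# -> number of distinct problems with that degree of overlap.
cue1 = {4: 1, 3: 1, 2: 11, 1: 128}
cue2 = {7: 1, 6: 1, 5: 4, 4: 4, 3: 15, 2: 49, 1: 226}

for name, dist in (("CUE-1", cue1), ("CUE-2", cue2)):
    total = sum(dist.values())    # distinct problems overall
    single = dist[1] / total      # share found by only one team
    print(f"{name}: {total} problems, {single:.0%} found by a single team")
# CUE-1: 141 problems, 91% found by a single team
# CUE-2: 300 problems, 75% found by a single team
```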

23/43 Comparison of Tests
- Based mainly on the test reports
- Focus on significant differences
- Selection of parameters for comparison based on two generally recognized textbooks:
  - Dumas and Redish, "A Practical Guide to Usability Testing"
  - Jeff Rubin, "Handbook of Usability Testing"

24/43 Resources

Team                        A    B    C   D     E    F   G    H    J
Person-hours used for test  136  123  84  (16)  130  50  107  452  18
# Usability professionals   2    1    1   1     3    1   1    3    6
Number of tests             7    6    6   50    9    5   11   4    6

25/43 Usability Test Reports

Team            A   B   C   D   E  F   G   H   J
# Pages         16  36  10  53  6  19  18  11  22
Exec summary    Y   Y   N   N   N  Y   N   Y   Y
# Screen shots  10  0   8   0   1  2   1   2   0
Severity scale  2   3   2   1   2  1   1   3   4

26/43 Usability Results

Team                         A    B    C   D   E    F   G    H    J
# Positive findings          0    8    4   7   24   25  14   4    6
# Problems                   26   150  17  10  58   75  30   18   20
% Exclusive                  42   71   24  10  57   51  33   56   60
% Core problems (100% = 26)  38   73   35  8   58   54  50   27   31
Person-hours used for test   136  123  84  NA  130  50  107  452  18
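As a cross-check on the figures above (my sketch, not from the talk): multiplying each team's problem count by its exclusive percentage should roughly reproduce the 226 single-team problems reported on slide 29.

```python
# Per-team values as read from the table above.
problems = {"A": 26, "B": 150, "C": 17, "D": 10, "E": 58,
            "F": 75, "G": 30, "H": 18, "J": 20}
pct_excl = {"A": 42, "B": 71, "C": 24, "D": 10, "E": 57,
            "F": 51, "G": 33, "H": 56, "J": 60}   # percent exclusive

exclusive = sum(round(problems[t] * pct_excl[t] / 100) for t in problems)
print(exclusive)  # 225 -- within rounding of the 226 problems found by one team
```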

27/43 Results
- There are overwhelmingly many usability problems
- There are many "serious" usability problems
- Limited overlap between team findings

28/43 Conclusions
- In most cases, no form of cost-effective testing will find all or most of the problems, or even most of the serious ones
- Claims like "Method X finds at least 80% of all serious usability problems" are not in accordance with the results of this study (see the note below)
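For context (my addition, not part of the slides): the "80%" claims under critique typically rest on the classic problem-discovery model, which assumes every problem has the same probability p of being detected in each independent evaluation:

```latex
% Expected share of all existing problems found after n independent
% evaluations, assuming a single uniform detection probability p:
\[
  \frac{\mathrm{Found}(n)}{N} = 1 - (1 - p)^{n}
\]
```

With the commonly quoted p = 0.3, this predicts roughly 96% discovery after nine evaluations; yet the nine CUE-2 teams left 226 of 300 problems found by only a single team, so no uniform p fits the observed data, which is what this slide's conclusion says.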

29/43 Problems Found in CUE-2
Total number of different usability problems found: 300
- Found by seven teams: 1
- Found by six teams: 1
- Found by five teams: 4
- Found by four teams: 4
- Found by three teams: 15
- Found by two teams: 49
- Found by only one team: 226

30/43 Problem Found by Seven Teams
During the registration process, Hotmail users are asked to provide a password hint question. The corresponding text box must be filled in. Most users did not understand the meaning of the password hint question; some entered their Hotmail password in the text box.
Clever but unusual mechanisms like the password hint question must be explained carefully to users.

31/43 [Screenshot slide; no transcript text]

32/43 Problem Example
Users consistently glanced briefly at this screen and then, without hesitation, clicked the "I Accept" button.
The "I Accept" button is very conveniently placed ("usable"), but the text is quite difficult to read. The text is written in legalese, not in webbish. Users want text that they can "skim, skim, and read".
Do unusable "Terms of Service" have any legal value?

33/43 [Screenshot slide; no transcript text]

34/43 Problems with Terms of Service
- Difficult to read: legalese, not English
- Does not answer important user questions about privacy and cost
- Not in the user's native language
- Signals "Don't waste your time on this":
  - The "I agree" button is too usable
- No information on how to return to the Terms of Service

35/43 [Screenshot slide; no transcript text]

36/43 [Screenshot slide; no transcript text]

37/43 Language-Related Problems
Examples of language-related problems that were detected by European teams:
- Send Mail: the term "Compose" is difficult to understand; use "Create new message" or "Write Mail" (5/9 teams)
- Create new account: the "State/Province" text box is required but does not make sense in many countries (2/9 teams)

38/43 Language-Related Problems
Some language-related problems suggested by US teams were not confirmed by the European test teams:
- Change "last name" to "family name"
- The meaning of "U.S. Residents Only" and "Non-U.S. Residents Only" is unclear

39/43 [Screenshot slide; no transcript text]

40/43 Advice for a Usable Usability Report
- List problems with their severity and the number of affected users
- Distinguish clearly between:
  - Personal opinions
  - Expert opinions
  - User opinions
  - User findings
(One way to encode this distinction is sketched below.)
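A minimal sketch of the advice above (mine; all names are hypothetical): carry severity, affected-user count, and the kind of evidence as explicit fields in every reported finding, so opinions can never masquerade as observations.

```python
from dataclasses import dataclass
from enum import Enum

class Evidence(Enum):
    PERSONAL_OPINION = "personal opinion"  # the evaluator's own view
    EXPERT_OPINION = "expert opinion"      # backed by guidelines or heuristics
    USER_OPINION = "user opinion"          # something a participant said
    USER_FINDING = "user finding"          # something a participant was observed doing

@dataclass
class Finding:
    description: str
    severity: int        # e.g. 1 (cosmetic) to 3 (critical), on the team's own scale
    users_affected: int  # number of test participants who hit the problem
    evidence: Evidence

f = Finding("Entered the account password as the password hint",
            severity=3, users_affected=5, evidence=Evidence.USER_FINDING)
print(f"[S{f.severity}] {f.description} "
      f"({f.users_affected} users, {f.evidence.value})")
```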

41/43 Some State-of-the-Art Boundaries
- No power-user tests, although four teams also recruited power users
- Few tests that require a complicated setup, e.g. attachments, or boundary testing such as a large number of e-mails in the inbox
- Teams completed their usability tests within schedule, but they never compared their results to those of the other teams

42/43 Conclusions
- The total number of usability problems for each tested web-site is huge, much larger than you can hope to find in one series of usability tests
- Usability testing techniques can be improved considerably
- We need more awareness of the usability of usability work

43/43 Download Test Reports and Slides
http://www.dialogdesign.dk/cue2.htm
Slides in Microsoft PowerPoint 97 format
CUE-2 Panel: Tuesday at 4.30 p.m.
