Presentation on theme: "Usability Evaluation."— Presentation transcript:

1 Usability Evaluation

2 Evaluation with Users
Big investment, big potential.
Remember: you can identify and solve many problems with cognitive walkthroughs and heuristic evaluation before you make the investment.
There are many issues and techniques, but they share a common foundation.

3 Testing with Users
Unlike some of the "userless" tests, there aren't a lot of fancy protocols to learn. Most of the challenge in running a user-based test lies in PREPARING for it.

4 A Test Plan Checklist, 1
Goal of the test?
Specific questions you want to answer?
Who will be the experimenter?
Who will the users be?
How many users are needed?
What instructions will the users be given?
What tasks will you ask the users to perform?
What criteria will determine the end of each task?

5 A Test Plan Checklist, 2
What aids will be made available to users?
To what extent will the experimenter be allowed to help the users?
What data will be collected, and how will it be analyzed?
What is the criterion for judging the interface a success?

6 A Test Plan Checklist, 3
Where and when will the evaluation be done?
How long will the evaluation take?
What computer support? What software?
What is the initial state of the system?
Are there any system/network load requirements?
(A sketch of the plan captured as structured data follows below.)
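The slides treat the plan as a checklist of questions; one way to keep the answers reviewable is to record them as structured data. The sketch below is a hypothetical Python representation, not from the slides, and all field names and example values are assumptions.

```python
# Hypothetical sketch: a usability test plan captured as structured data so the
# team can review it before recruiting users. Field names are illustrative.
from dataclasses import dataclass, field
from typing import List

@dataclass
class TestPlan:
    goal: str                      # goal of the test
    questions: List[str]           # specific questions to answer
    experimenter: str              # who runs the sessions
    user_profile: str              # who the users will be
    n_users: int                   # how many users
    tasks: List[str]               # tasks the users will perform
    end_criteria: str              # when a task is considered finished
    aids_available: List[str] = field(default_factory=list)
    success_criterion: str = ""    # what counts as a successful interface
    location: str = ""             # where/when the evaluation happens
    duration_minutes: int = 60     # how long each session takes

plan = TestPlan(
    goal="Check whether new users can place an order unaided",
    questions=["Where do users stall in checkout?"],
    experimenter="J. Doe",
    user_profile="First-time customers",
    n_users=5,
    tasks=["Find a product and complete checkout"],
    end_criteria="Order confirmation shown or 10 minutes elapsed",
)
```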

7 The Role of the Experimenter
Having the right experimenter makes a difference.
Selecting an appropriate methodology, and one the experimenter is familiar with, significantly influences the quality of the results.
Knowledge of the system's implementation can come in handy.
Participating in a usability study can have a profound impact on a designer, even in very simple, informal studies.

8 The Role of the Experimenter
Ensures that the room, computer, etc. are all ready.
During testing: should not interfere!
If a user is bogged down, can give a small hint.
If the user is hopelessly off track, can fake an equipment problem.

9 Ethical treatment of subjects
It is your responsibility to protect subjects from distress and embarrassment; remind them that you are not testing them.
Informed, voluntary consent: subjects must understand that they can quit at any time; explain the test in lay terms.
Privacy: anonymity, use of image/voice.

10 Which Users?
As close to real users as possible.
If real users are scarce, try surrogates.

11 How Many Users?
Huge individual differences between users: up to a factor of 10.
A single data point determines an infinite number of lines; one user tells you very little on its own.
Still, some data is better than none.
(A sketch of a common rule-of-thumb model follows below.)
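The slides give no formula for picking a sample size, but a widely cited problem-discovery model (Nielsen and Landauer) estimates the share of usability problems found by n users as 1 - (1 - p)^n, where p is the chance that one user exposes a given problem. A minimal sketch, assuming the often-quoted p ≈ 0.31; the value of p is an assumption, not from the slides.

```python
# Hypothetical sketch (not from the slides): the problem-discovery model
#   found(n) = 1 - (1 - p)^n
# where p is the probability that one user exposes a given problem.

def proportion_found(n_users: int, p: float = 0.31) -> float:
    """Expected fraction of problems uncovered by n_users (p is an assumption)."""
    return 1.0 - (1.0 - p) ** n_users

if __name__ == "__main__":
    for n in (1, 3, 5, 10):
        print(f"{n:2d} users -> ~{proportion_found(n):.0%} of problems found")
```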

12 Which Tasks?
Keep close to the real tasks.
You may need to shorten some for time reasons.
You may need to provide users with background information.

13 When in the Process?
Remember: early is better.
Formative vs. summative evaluation:
During design → design modifications.
After design → evaluation of the "finished" product, comparison to a baseline, rigorous statistics (a sketch of one such comparison follows below).
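For the summative case ("comparison to baseline, rigorous statistics"), the slides do not name a specific test. A minimal sketch of one common choice, an independent-samples t-test on task completion times; scipy is assumed to be available and the numbers are invented for illustration.

```python
# Hypothetical summative comparison: task completion times (seconds) for the
# baseline design vs. the new design. The data are invented for illustration.
from scipy import stats

baseline_times = [212, 198, 240, 225, 205, 231, 219, 244]
new_design_times = [181, 190, 172, 205, 168, 188, 176, 199]

# Welch's t-test (does not assume equal variances).
t_stat, p_value = stats.ttest_ind(new_design_times, baseline_times, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```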

14 What to Measure
Process data (qualitative): problems, questions, reactions; what users are thinking.
Bottom-line data (quantitative): mostly for later usability measurement; not as useful early in design.
Asking users questions is problematic: users will give you an answer either way.
(A minimal logging sketch follows below.)
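To make the process/bottom-line distinction concrete, here is a minimal, hypothetical logging sketch: free-form observer notes for the process data and per-task measurements for the bottom-line data. The field names and values are assumptions for illustration.

```python
# Hypothetical sketch of the two kinds of data a session might produce.
import csv
from dataclasses import dataclass

@dataclass
class TaskResult:            # bottom-line (quantitative) data
    task: str
    completed: bool
    time_seconds: float
    errors: int

process_notes = [            # process (qualitative) data: observations, quotes
    "P3 hesitated at the search box and asked what 'SKU' means.",
]

results = [TaskResult("Place an order", True, 312.0, 2)]

with open("bottom_line.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["task", "completed", "time_seconds", "errors"])
    for r in results:
        writer.writerow([r.task, r.completed, r.time_seconds, r.errors])
```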

15 Running the test
Four phases: preparation, introduction, test, debriefing.

16 Running the test - Prep
Room ready? Equipment ready? Interface in the start state?

17 Running the test - Intro
Cover the following with the user:
Evaluating the interface, not the user.
No personal stake.
The released version will differ.
Confidentiality reminder: the system and the results.
Participation is voluntary.
The user is welcome to ask questions.
Specific instructions.
Any questions?

18 Running the test
Refrain from interacting with the user; step in only if the user is clearly stuck.
If several observers are present, designate one as lead.

19 Running the test - Debriefing
Have the user fill out any questionnaires.
Ask follow-up questions.
Discussion: any other comments?

