
An Overview of Usability Evaluation #15 1

Outline
 What is usability evaluation?
 Why perform usability evaluation?
 Types of usability evaluations
 What can we learn from:
  Heuristic evaluation?
  Usability testing?
 How do we conduct:
  A heuristic evaluation?
  A usability test? 2

What Is Usability Evaluation?
 A systematic process that assesses the fit between a UI design and human capabilities within a task context
 A central element of UI design, performed throughout the development process 3

Why Perform Usability Evaluations?
 Find usability problems in an interface design
 Assess compliance with a style guide (e.g., MS Windows)
 Compare alternative UI components
  Icon designs
  Input/output technologies
 Assess the worth/usefulness of the software in the overall job context 4

Evaluation Methods
 Usability inspection methods
  Requirements: a static prototype; a UI design expert
  Techniques: heuristic evaluation; evaluation against guidelines; cognitive walkthrough
 User-based evaluations
  Requirements: a dynamic prototype; a usability analyst
  Techniques: questionnaires; observational usability studies; formal usability studies with quantitative data analysis; controlled experiments
 Analytic evaluations
  Requirements: a UI designer with expertise in analytic techniques
  Techniques: keystroke-level model; GOMS; grammars 5
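The analytic techniques in the last row predict expert performance without running a test. A minimal sketch of a keystroke-level model (KLM) estimate, using the standard published operator averages; the task sequence below is a hypothetical menu-based "save file" interaction, not one from this lecture:

```python
# Standard KLM operator times in seconds (Card, Moran & Newell).
KLM_OPERATORS = {
    "K": 0.28,  # keystroke or button press (average-skill typist)
    "P": 1.10,  # point with the mouse to a target on screen
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation for the next action
}

def klm_estimate(ops):
    """Sum operator times for a sequence of KLM operator codes."""
    return sum(KLM_OPERATORS[op] for op in ops)

# Hypothetical task: reach for the mouse, think, point to the File
# menu, click, point to the Save item, click.
task = ["H", "M", "P", "K", "P", "K"]
print(round(klm_estimate(task), 2))  # 4.51
```

Because the model only adds fixed operator times, it can compare alternative designs (e.g., a toolbar button vs. a menu path) before any prototype exists.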

Questions of a Usability Evaluation (1)
 Are the functions made available in a convenient, task-oriented way?
  We need task knowledge
 Does the system anticipate the skill and knowledge of the user?
  We need user knowledge
 Does the design meet general rules of good user interface design?
  We need UI knowledge 6

Questions of a Usability Evaluation (2)
 Is the system consistent with other applications running in the user's environment?
  We need style-guide knowledge
 How fast can users learn to use the system?
 At what speed can users perform various tasks?
 How likely are users to complete a given task? 7

Usability Inspection Methods
 Evaluation against guidelines
 Heuristic evaluation 8

Evaluation Against Guidelines and Rules
 A process in which each UI element (e.g., menu choice, icon, button, pointer, radio button) is examined against an existing set of general guidelines and a specific set of design rules (the style guide) applicable to a specific product
  MIL-STD-1472F
  Windows style guide
 Performed by one or more UI design experts with a thorough familiarity with general UI design guidelines and the product/corporate style guide 9

Guidelines and Rules
 Guidelines are accepted principles for interface design
 Rules specify the interface's appearance or actions 10

Examples of Guidelines
 Displays should be consistently formatted
 Displays should be uniquely identified
 Use short, simple sentences
 Employ units of measurement that are familiar to the user 11

Examples of Design Rules
 The character stroke width of a system font shall be at least 2 pixels
 F10 (and Shift+Menu) exits the menu bar and returns the location cursor to the previous object with focus 12

Pros and Cons of Evaluating Against Guidelines
 Pros
  Provides information on basic design issues
  Finds a broad range of usability problems
 Cons
  Does not assess whether the system meets user and task needs (a design can be compliant and still poor)
  Time consuming
  Guidelines/rules don't exist for all areas of UI design 13

Heuristic Evaluation
 A popular and widely used structured review of a UI
 The objective is to generate a list of potential usability problems
 The evaluator assumes the user's role and identifies problems from the user's perspective
 The criterion for "a problem" is a set of recognized usability principles called "heuristics" 14

Heuristics Identified by Nielsen (1993)
 Use simple and natural dialogue
 Speak the users' language
 Minimize the users' memory load
 Be consistent
 Provide feedback
 Provide clearly marked exits
 Provide shortcuts
 Provide good error messages
 Prevent errors 15

Conducting a Heuristic Evaluation
 Collect background information
  Identify typical users, scenarios, previous feedback, usability goals
 Inspect the flow of the interaction from screen to screen
 Inspect screens one at a time against the heuristics
 Generate an inspection problem report
  List and prioritize the usability issues, fixes, and/or redesigns 16
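The problem report in the last step can be as simple as a prioritized list of findings. A minimal sketch; the field names and example entries are illustrative, and the 0-4 severity scale follows Nielsen's commonly used severity ratings:

```python
from dataclasses import dataclass

@dataclass
class Problem:
    """One entry in a heuristic-inspection problem report (illustrative schema)."""
    screen: str        # where the problem was observed
    heuristic: str     # which heuristic is violated
    description: str
    severity: int      # 0 = cosmetic ... 4 = usability catastrophe

def prioritized(report):
    """Order the report with the most severe problems first."""
    return sorted(report, key=lambda p: p.severity, reverse=True)

report = [
    Problem("Home", "Be consistent", "Two different icons for Help", 1),
    Problem("Checkout", "Provide feedback", "No confirmation after submit", 3),
    Problem("Login", "Provide good error messages", "Error shows a raw code", 2),
]
for p in prioritized(report):
    print(p.severity, p.screen, "-", p.description)
```

Sorting by severity makes the hand-off to developers concrete: the top of the report is the redesign work, the bottom is polish.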

Who Should Inspect?
 Usability specialists often find more "problems" than evaluators with no usability experience (or computer experience only)
 Usability specialists with knowledge of the particular kind of interface being developed ("double specialists") find more usability problems than "regular" usability specialists 17

How Many Inspectors?
 A single evaluator finds only about 35% of the problems
 Increasing the number of evaluators from 2 to 5 raises the proportion of problems found to around 75% of all the problems
 [Figure: percent of problems found vs. number of inspectors (Nielsen)]
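These figures are often modeled with Nielsen and Landauer's problem-discovery curve, found(i) = 1 - (1 - L)^i, where L is the proportion of problems a single evaluator finds. A sketch using L = 0.35 to match the single-evaluator figure on this slide (the exact percentages depend on the estimate of L):

```python
def proportion_found(i, single_rate=0.35):
    """Expected proportion of problems found by i independent evaluators,
    assuming each finds a fixed fraction of the remaining problems."""
    return 1 - (1 - single_rate) ** i

for n in (1, 2, 3, 5):
    print(n, "evaluators ->", round(proportion_found(n), 2))
```

The curve flattens quickly, which is why adding evaluators beyond about five yields diminishing returns.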

Types of Problems Uncovered by Heuristic Evaluation
 Missing or difficult-to-access functionality
 Limited or inappropriate task flow
 Limited navigational cues
 Inappropriate feedback
 Cluttered screens 19

Pros and Cons of Heuristic Evaluation
 Pros
  Does not involve users; relatively inexpensive
  Finds a broad range of major and minor usability problems
  Maximized by using multiple evaluators
  Less intimidating to developers than usability testing
 Cons
  Subjective; dependent on the evaluators' HCI skills and their knowledge of the task and the users
  Depends on how realistically and to what degree the system is exercised
  Not exhaustive 20

Usability Testing 21

 A set of user-based evaluation methods
  Questionnaires
  Observational usability studies
  Formal usability studies with quantitative data analysis
  Controlled experiments
 Observe and measure how users interact with an application
 Focus on direct feedback from end users interacting with the system
 Should be the ultimate goal of every evaluation plan, because it involves real end users 22

The Nature of Usability Testing
 Merges several user-based evaluation methods into a single evaluation process
  Observation
  Interviews
  Testing
 Each method illuminates a different aspect of usability
 Performed after a design (or parts of a design) has been fine-tuned using usability inspection techniques
 Performed before a prototype is handed over to developers and the product is shipped 23

When Is Usability Testing Useful?
 Test early to:
  Evaluate an individual aspect of the design
  Significantly affect the design
  Provide quick answers to developers
  May involve fewer users and collect less data
 Test late to:
  Verify the entire application (stable design, full functionality)
  Assess the impact of the design on the user (controlled variables) 24

Where Is Usability Testing Performed?
 The user's office environment
  Users are in their natural surroundings
  Easier to recruit users
  But: uncontrolled environmental setting, interruptions, a variety of computer configurations, and no observation by the development team
 A usability lab
  Controlled setting
  Consistent computer configuration
  Data collection equipment
  Permits unobtrusive observation by the development team 25

Who Participates in Usability Testing?
 Evaluators: usability specialists
 Participants: potential users
 Observers: members of the design/development team 26

How Many Participants to Include?
 At least 2 from each distinct user group at earlier stages of the evaluation, when the focus is on gross usability issues
 6 or more (per user group) at later stages, when the focus is on performance assessment
 Remember: the objective of usability evaluation is not to uncover statistical differences, only design issues 27

Measures of Usability
 Time to complete a task
 Completion rate
 Number of errors
 Types of errors
 Severity of errors
 Number of requests for help
 Number of trials to become proficient in using the system
 Comparative ratings
 Subjective ratings 28
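Several of these measures fall out directly from test-session logs. A minimal sketch with hypothetical session data and field names:

```python
# Hypothetical logs: one record per participant attempting the same task.
sessions = [
    {"completed": True,  "time_s": 95,  "errors": 1},
    {"completed": True,  "time_s": 120, "errors": 0},
    {"completed": False, "time_s": 180, "errors": 4},
]

# Completion rate: fraction of participants who finished the task.
completion_rate = sum(s["completed"] for s in sessions) / len(sessions)

# Time on task: averaged over completed runs only, since abandoned
# runs would bias the mean.
finished = [s["time_s"] for s in sessions if s["completed"]]
mean_time = sum(finished) / len(finished)

total_errors = sum(s["errors"] for s in sessions)

print(f"completion rate: {completion_rate:.0%}")            # 67%
print(f"mean time on task (completed runs): {mean_time} s") # 107.5 s
print(f"errors observed: {total_errors}")                   # 5
```

Even with the small samples the previous slide recommends, reporting these numbers per user group makes it easy to see where a redesign helped between test rounds.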

What to Expect From Test Participants
 They do the unexpected
 Have preconceived ideas
 Do not always ask for help
 Fail to follow instructions
 Quickly develop habits
 Are afraid of breaking the system
 Are apologetic 29

Data Collection Techniques
 Videotaping
  The user's interactions with the application
  The user's facial expressions
 Audiotaping
  User comments
  Observer comments
 Data collection applications
  Keystroke capture
  Indexed videotape
 Questionnaires
 Interviews
  Open-ended
  Structured 30

Summary
 Start evaluation early in the design process and continue to evaluate throughout the development cycle
  This minimizes the likelihood of a major usability problem emerging during the later phases of development
 Incorporate a variety of evaluation methods
  No single method can predict or identify all the potential usability issues
 Include at least one user-based evaluation method in your evaluation plan 31