An Overview of Usability Evaluation


Gabriel Spitz

Outline
- What is usability evaluation?
- Why perform usability evaluation?
- Types of usability evaluations
- What can we learn from heuristic evaluation and from usability testing?
- How do we conduct a usability test?

User-Interface Design Thinking
- Articulate the problem
- Develop requirements
- Formulate a conceptual design
- Create a low-fidelity design
- Design a high-fidelity product
- Evaluate ideas, concepts, and designs

What Is Usability Evaluation?
A systematic process aimed at assessing the fit between a UI design and human capabilities within a task context. It is a central element of the UI design process and is performed throughout UI development. Why?

Why Perform Usability Evaluations?
- Find usability problems in an interface design
- Assess compliance with a style guide (e.g., Microsoft Windows, iOS, Google Material Design)
- Compare alternative UI components (e.g., icon designs, input/output technologies)
- Assess the usefulness of the software in the overall job context

Scope of Usability Evaluation
As in many scientific endeavors, there are many different methods of evaluation. Methods vary in terms of:
- The purpose of the evaluation (summative vs. formative)
- The stage of the design
- What is being evaluated

Summary - Evaluation Methods

Usability inspection methods
  Requirements: a static prototype; a UI design expert
  Techniques: heuristic evaluation; evaluation against guidelines; cognitive walkthrough

User-based evaluations
  Requirements: a dynamic prototype; a usability analyst
  Techniques: questionnaires; observational usability studies; formal usability studies with quantitative data analysis; controlled experiments

Analytic evaluations
  Requirements: a UI designer with expertise in analytic techniques
  Techniques: keystroke-level model (KLM; illustrated in the sketch below); GOMS; grammars
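To make the analytic row concrete: the keystroke-level model predicts expert task time by summing standard operator times over the action sequence a task requires. The following is a minimal sketch, assuming the commonly cited Card, Moran and Newell operator estimates; the example action sequence is hypothetical.

    # Minimal keystroke-level model (KLM) sketch.
    # Operator times are the commonly cited Card, Moran & Newell estimates;
    # the example task sequence below is hypothetical.
    KLM_OPERATORS = {
        "K": 0.28,  # press a key (average skilled typist)
        "P": 1.10,  # point with the mouse to a target on screen
        "B": 0.10,  # press or release a mouse button
        "H": 0.40,  # home hands between keyboard and mouse
        "M": 1.35,  # mental preparation before an action
    }

    def klm_estimate(sequence):
        """Sum operator times for an action sequence such as ['M', 'P', 'B', 'B']."""
        return sum(KLM_OPERATORS[op] for op in sequence)

    # Example: think, point at a menu, click it, point at an item, click it
    print(klm_estimate(["M", "P", "B", "B", "P", "B", "B"]))  # about 3.95 s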

Questions of a Usability Evaluation
- Is the system consistent with other applications running in the user's environment? (Answering this requires knowledge of the relevant style guide.)
- How fast can users learn to use the system?
- At what speed can users perform various tasks?
- How likely are users to complete a given task?

Usability Inspection Methods
- Evaluation against guidelines
- Heuristic evaluation

Evaluation Against Guidelines and Rules
- A systematic process in which each UI element (e.g., menu choice, icon, button, pointer, radio button) is examined against an existing set of general guidelines, such as MIL-STD-1472F or the Windows style guide
- Performed by one or more UI design experts who are thoroughly familiar with general UI design guidelines and the product/corporate style guide

Guidelines and Rules
- Guidelines are accepted principles for interface design.
- Rules specify the interface's appearance or actions.

Examples of Guidelines
- Displays should be consistently formatted
- Displays should be uniquely identified
- Use short, simple sentences
- Employ units of measurement that are familiar to the user

Examples of Design Rules
- The character stroke width of a system font shall be at least 2 pixels.
- F10 (and Shift+Menu) exits the menu bar and returns the location cursor to the previous object with focus.
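Rules this specific can sometimes be checked mechanically against a design specification. The sketch below is a hypothetical illustration of that idea: the stroke-width check mirrors the rule above and the uniqueness check mirrors the "uniquely identified" guideline, but the ScreenSpec structure and its field names are invented for the example.

    # Hypothetical sketch: encoding a design rule and a guideline as automated checks.
    # The ScreenSpec structure and its field names are illustrative only.
    from dataclasses import dataclass

    @dataclass
    class ScreenSpec:
        title: str
        font_stroke_width_px: int  # stroke width of the system font, in pixels

    def check_rules(screens):
        """Return a list of violations found in a set of screen specifications."""
        violations = []
        titles = [s.title for s in screens]
        for s in screens:
            if s.font_stroke_width_px < 2:
                violations.append(f"{s.title}: font stroke width is under 2 pixels")
            if titles.count(s.title) > 1:
                violations.append(f"{s.title}: display is not uniquely identified")
        return violations

    print(check_rules([ScreenSpec("Login", 2), ScreenSpec("Login", 1)]))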

Evaluating Against Guidelines

Pros:
- Provides information on basic design issues
- Forces a complete and exhaustive review against all the available guidelines
- Finds a broad range of usability problems

Cons:
- Does not assess whether the system meets users' needs (a design can be compliant and still be poor)
- Is highly time-consuming
- Guidelines/rules do not exist for all areas of UI design
- Some commercial guidelines/rules are poorly chosen and contain conflicts
- Is not task oriented

Heuristic Evaluation
- A popular and widely used structured review of a user interface
- The objective is to generate a list of potential usability problems
- The evaluator assumes the user's role and identifies problems from the user's perspective
- The criterion for "a problem" is a set of recognized usability principles called "heuristics"

Heuristics Identified by Nielsen (1993)
- Use simple and natural dialogue
- Speak the users' language
- Minimize the users' memory load
- Be consistent
- Provide feedback
- Provide clearly marked exits
- Provide shortcuts
- Provide good error messages
- Prevent errors

Conducting a Heuristic Evaluation
- Collect background information
  - Identify typical users
  - Build usage scenarios
  - Review user feedback about predecessor products and usability goals for the current product
- Inspect the flow of the interaction from screen to screen
- Inspect screens one at a time against the heuristics
- Generate an inspection problem report (see the sketch below)
  - List and prioritize the usability issues
  - Provide fixes and redesign suggestions
  - Provide estimates of the cost (time, labor) of implementing the suggested redesign
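A problem report is easier to prioritize and cost when every finding is recorded in the same structure. The sketch below shows one hypothetical way to capture a finding; the field names and the 0-4 severity scale are illustrative choices, not a standard report format.

    # Hypothetical record for one finding in a heuristic-evaluation problem report.
    from dataclasses import dataclass

    @dataclass
    class Finding:
        screen: str            # where the problem was observed
        heuristic: str         # which heuristic it violates
        description: str       # what the evaluator saw
        severity: int          # 0 = not a problem ... 4 = usability catastrophe
        suggested_fix: str     # proposed redesign
        est_fix_hours: float   # rough cost (labor) estimate for the fix

    report = [
        Finding("Checkout", "Provide feedback",
                "No confirmation appears after the order is submitted",
                severity=3,
                suggested_fix="Show a confirmation screen with the order number",
                est_fix_hours=8.0),
    ]

    # Prioritize the report by severity, most severe first
    report.sort(key=lambda f: f.severity, reverse=True)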

Who Should Inspect?
- Usability specialists tend to find more problems than evaluators with no usability experience (or with computer experience only)
- Usability specialists with knowledge of the particular kind of interface being developed ("double specialists") find more usability problems than "regular" usability specialists

How Many Inspectors?
- A single evaluator finds only about 35% of the problems
- Increasing the number of evaluators to between 2 and 5 raises the proportion of problems found to around 75%
[Chart: percent of problems found vs. number of inspectors (Nielsen, 1993)]
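These figures are consistent with the Nielsen and Landauer problem-discovery model, in which the proportion of problems found by i independent evaluators is 1 - (1 - L)^i, where L is the probability that a single evaluator finds a given problem (often quoted as roughly 0.3). A quick sketch of the curve:

    # Nielsen & Landauer problem-discovery curve: found(i) = 1 - (1 - lam)**i,
    # where lam is the probability that a single evaluator finds a given problem
    # (commonly quoted as roughly 0.3).
    def proportion_found(i, lam=0.3):
        return 1 - (1 - lam) ** i

    for i in range(1, 6):
        print(i, round(proportion_found(i), 2))
    # 1 evaluator ~0.30, 3 ~0.66, 5 ~0.83

With L around 0.3, three to five evaluators already uncover roughly two thirds to four fifths of the problems, which is why small inspection teams are cost-effective.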

Pros and Cons of Heuristic Evaluation

Pros:
- Does not involve users
- Finds a broad range of major and minor usability problems
- Relatively inexpensive
- Value is maximized by using multiple evaluators
- Less intimidating to developers than usability testing

Cons:
- Subjective and dependent on the evaluators' HCI skills and their knowledge of the task and the users
- Depends on how realistically and to what degree the system is exercised
- Not exhaustive

Usability Testing

Usability Testing
- A set of user-based evaluation methods:
  - Questionnaires
  - Observational usability studies
  - Formal usability studies with quantitative data analysis
  - Controlled experiments
- Observe and measure how users interact with an application
- Focuses on direct feedback from end users interacting with the system
- Should be the ultimate goal of every evaluation plan because it involves real end users

The Nature of Usability Testing
- Merges several user-based evaluation methods into a single evaluation process:
  - Observation
  - Interviews
  - Testing
- Each method illuminates a different aspect of usability
- Performed after a design (or parts of a design) has been fine-tuned based on usability inspection techniques
- Performed before a prototype is handed over to developers and the product is sent out

When Is Usability Testing Useful?

Test early (formative evaluation) to:
- Evaluate an individual aspect of the design
- Significantly affect the design
- Provide quick answers to developers
- (May involve fewer users and collect less data)

Test late (summative evaluation) to:
- Verify the entire application (stable design, full functionality)
- Assess the impact of the design on the user
- (With controlled variables)

Where Is Usability Testing Performed?

User's office environment:
- Users are in their natural surroundings
- Easier to recruit users
- But: an uncontrolled environmental setting, interruptions, a variety of computer configurations, and no observation by the development team is possible

Usability lab:
- Controlled setting
- Consistent computer configuration
- Data collection equipment
- Permits unobtrusive observation by the development team

Who Participates in Usability Testing?
- Evaluators: usability specialists
- Participants: potential users
- Observers: members of the design/development team

How Many Participants to Include?
- At least 2 from each distinct user group
- 2-3 at earlier stages of the evaluation, when the focus is on gross usability issues
- 6 and up (per user group) at later stages, when the focus is on performance assessment
- Remember: the objective in usability evaluation is not to uncover statistical differences, only design issues

Measures of Usability
- Time to complete a task
- Completion rate
- Number of errors
- Types of errors
- Severity of errors
- Number of requests for help
- Number of trials needed to become proficient with the system
- Comparative ratings
- Subjective ratings
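As a rough illustration of how the first few measures might be tabulated from test-session records, here is a minimal sketch; the session fields are hypothetical, not a standard log format.

    # Minimal sketch: summarizing a few usability measures for one task.
    # The 'sessions' records are hypothetical test data.
    sessions = [
        {"participant": "P1", "completed": True,  "time_s": 95,  "errors": 1, "help_requests": 0},
        {"participant": "P2", "completed": False, "time_s": 240, "errors": 4, "help_requests": 2},
        {"participant": "P3", "completed": True,  "time_s": 120, "errors": 0, "help_requests": 1},
    ]

    completion_rate = sum(s["completed"] for s in sessions) / len(sessions)
    completed_times = [s["time_s"] for s in sessions if s["completed"]]
    mean_time = sum(completed_times) / len(completed_times)
    total_errors = sum(s["errors"] for s in sessions)

    print(f"Completion rate: {completion_rate:.0%}")      # 67%
    print(f"Mean time to complete: {mean_time:.0f} s")    # 108 s
    print(f"Total errors observed: {total_errors}")       # 5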

What to Expect from Test Participants
- They do the unexpected
- They have preconceived ideas
- They do not always ask for help
- They fail to follow instructions
- They quickly develop habits
- They are afraid of breaking the system
- They are apologetic

Data Collection Techniques
- Video taping: the user's interactions with the application; the user's facial expressions
- Audio taping: user comments; observer comments
- Data collection applications: keystroke capture; indexed videotape
- Questionnaires
- Interviews: open-ended or structured
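Keystroke capture can be as simple as appending timestamped interaction events to a log for later analysis. The sketch below is a minimal, hypothetical illustration; in practice log_event would be wired to the application's real input hooks, and the CSV layout here is an arbitrary choice.

    # Minimal sketch of keystroke/event capture for later analysis.
    # In a real study, log_event would be called from the application's input hooks.
    import csv
    import time

    LOG_PATH = "session_events.csv"  # illustrative file name

    def log_event(participant, event_type, detail):
        """Append one timestamped UI event (keystroke, click, ...) to the log."""
        with open(LOG_PATH, "a", newline="") as f:
            csv.writer(f).writerow([time.time(), participant, event_type, detail])

    # Example usage during a test session
    log_event("P1", "keystroke", "Ctrl+S")
    log_event("P1", "click", "Save button")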

Summary
- Start evaluation early in the design process and continue to evaluate throughout the development cycle; this minimizes the likelihood of a major usability problem emerging during the later phases of development
- Incorporate a variety of evaluation methods; no single method can predict or identify all the potential usability issues
- Include at least one user-based evaluation method in your evaluation plan