L4 Usability Evaluations Introduction


Notes from Heim, Chapter 8, and http://www.usability.gov/

Learning objectives
- Be able to define usability
- Be able to describe aspects and measures of usability
- Appreciate that usability depends on context
- Be aware of the major types of usability evaluations

What is the point?
- To ensure that users can use the system as intended!
- There are many examples of bad usability.
- Often the problems can be solved very easily!

The first time you user test your own software
- You will be horrified at how bad it is!
- You will find most of the problems are easy to fix.
- You will become intolerant of poor usability!
- (Well, maybe that's not all true, but that's the general direction…)

What is usability?
- "Usability is the measure of the quality of a user's experience when interacting with a product or system" (www.usability.gov, 2006)
- "Usability is a quality attribute that assesses how easy user interfaces are to use" (Nielsen, 2003)

Usability factors
- Fit for use (or functionality): Can the system support the tasks that the user wants to perform?
- Ease of learning: How fast can a user who has never seen the user interface before learn it well enough to accomplish basic tasks?
- Efficiency of use: Once an experienced user has learned to use the system, how fast can he or she accomplish tasks?
- Memorability: If a user has used the system before, can he or she remember enough to use it effectively the next time, or does the user have to start over and learn everything again?
- Error frequency and severity: How often do users make errors while using the system, how serious are these errors, and how do users recover from them?
- Subjective satisfaction: How much does the user like using the system?

Fit for use
- Does the system function as expected? Do the users meet their goals in a timely fashion?
- Finding "bugs" (otherwise known as errors) is NOT the goal: usability testing ≠ system testing.
- But you are aiming to elicit any usability problems if they're there.
- Keep in mind: what are the system goals? E.g.:
  - To achieve a specific state: book a flight, pay someone the correct amount
  - To participate in a computer-mediated experience: play games

Setting usability goals
- Not necessarily a one-dimensional measure; there may be multiple threshold requirements, e.g. for user errors per hour using the system:
  - Average: 2 per hour
  - Over 50% of users: less than 1 per hour
  - Less than 5% of users: over 5 per hour
- Goals can have varying levels of success; e.g. minimum: not worse than the old way!
[Figure: errors-per-hour scale from 5 down to 1, marking the unacceptable, minimum, target and exceeds levels, with current, planned and optimal values.]
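A multi-threshold goal like the slide's example can be checked mechanically against measured data. A minimal sketch (the function name is made up, and the thresholds are the slide's illustrative values, not a standard):

```python
def meets_error_goals(hourly_error_rates):
    """Check the slide's example multi-threshold goal against a
    sample of per-user error rates (errors per hour):
      - average no more than 2 per hour
      - over 50% of users below 1 per hour
      - fewer than 5% of users above 5 per hour
    All thresholds are illustrative example values."""
    n = len(hourly_error_rates)
    mean_ok = sum(hourly_error_rates) / n <= 2
    majority_ok = sum(r < 1 for r in hourly_error_rates) / n > 0.5
    tail_ok = sum(r > 5 for r in hourly_error_rates) / n < 0.05
    return mean_ok and majority_ok and tail_ok

# Six light users and a few heavier ones still meet all three criteria:
ok = meets_error_goals([0.5] * 6 + [2, 2, 3, 4])
```

Note that no single average could capture this goal: the point of multiple thresholds is to constrain both the typical user and the worst-case tail.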

Ease of learning
- Do you expect to have to read a manual or the help? Or work through an online tutorial, take a training class, or use the help hotline?
- How much time are you prepared to invest in:
  - learning a new interface?
  - finding something on a web site?
- What are your usability expectations for a programming IDE? For a mobile phone?

Efficiency of use
- Do you use the self checkout at the supermarket? Do you use ATMs?
- Is it quicker to order a pizza over the phone or online? Which of these is cheaper for the retailer?
- What about business processes? If every academic member of staff at UoA (3000) wastes 10 hours a year because the performance review system has a poor interface:
  3000 × 10 = 30,000 hours; 30,000 / 1700 (work hours per year) ≈ 17.6 years of work
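The slide's back-of-the-envelope cost calculation generalises to any interface inefficiency. A tiny sketch (the function name and the 1700 work-hours-per-year figure are taken from the slide's example, not a general constant):

```python
def wasted_person_years(staff, hours_wasted_each, work_hours_per_year=1700):
    """Rough organisational cost of a poor interface, as on the slide:
    total wasted hours divided by one person's annual work hours."""
    return staff * hours_wasted_each / work_hours_per_year

# The slide's example: 3000 staff x 10 hours / 1700 hours per year
cost = wasted_person_years(3000, 10)  # about 17.6 person-years
```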

Memorability
- Are there some things you just can't remember how to do?
- A good interface is one where you remember, or can with prompts recall, what to do.
- "Good enough" is contextual:
  - If most users are expected to use a feature often, they shouldn't have much trouble retaining how to use it.
  - Catering for infrequent users puts on the pressure.
- For me (Beryl Plimmer), functions related to end-of-semester marks can be a problem:
  - probably used at most twice a year
  - I only get to use the feature a handful of times before the system changes!

Error frequency and severity
- How frequently do users make errors?
  - Touch-phone typing
  - Long and complicated forms
- Dropdown lists to choose from might reduce errors, but might be slower than entry with auto-complete.
- What is the cost of errors?
  - A patient suffers seizures because a drug-drug interaction was missed!
  - Booking a flight for 8pm when you wanted 8am. Midday? Midnight?

Human error: slips and mistakes
- Slip: "I knew how to do that, but I failed"
  - correct understanding of the system and goal
  - correct formulation of the action
  - incorrect action
- Mistake: "I didn't really know what I was doing"
  - may not even have the right goal!
- Fixing things?
  - For a slip: redesign the interface to make the task easier to execute.
  - For a mistake: help the user better understand the system.
(© 2004 Dix et al.)

Subjective satisfaction
- If users like the interface:
  - they will make fewer errors
  - they will persist longer when they are having problems
- Aesthetics (how nice it looks) is incredibly important in this respect.
- There can be a lot underpinning a perception of looking "nice":
  - good contrast and font size, clean layout, good grouping
  - appropriate language
  - good fit to the workflow
- "Ugly" may mean fatiguing, inefficient, or apt to produce errors.

Types of usability evaluations
- Heuristic evaluations
- Performance measurements
- Usability studies
- Comparative studies
See http://www.usability.gov/methods/

Heuristic evaluations
- Expert evaluation: an expert (the reviewer) looks at a system using common sense and/or guidelines (e.g. Nielsen's heuristics).
- Covered in the next lecture.
- First law of usability: heuristic evaluation has only a 50% hit rate.
[Figure: Venn diagram of actual problems vs. predicted problems, showing false problems and missed problems.]

Performance measurements
- I.e., analytical performance measurements that can be extracted directly from the interface, as opposed to empirical performance measurements observed in a usability study.
- Fitts' Law: the classic performance measure for the time to complete the task of pointing at an object.
- Hick–Hyman Law: the time taken to make a decision (e.g., "that is the object I want!").
- There are other, more comprehensive models we won't cover here:
  - KLM: keystroke-level model
  - GOMS: goals, operators, methods and selection rules

Fitts' Law
- Fitts' Law is the classic performance measure: time to target depends on the target width (W) and the distance the pointer must move (D) (see tutorial exercise).
- It is a very valuable measure for designing control size and location.
- It's also fun to play with!
[Figure: three numbered pointing targets of different sizes and distances.]
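The dependence on W and D can be made concrete with the commonly used Shannon formulation, MT = a + b · log2(D/W + 1). A minimal sketch; the constants a and b are device-specific and normally fitted by regression, so the defaults here are illustrative only:

```python
import math

def fitts_time(distance, width, a=0.1, b=0.15):
    """Predicted movement time in seconds using the Shannon
    formulation of Fitts' Law: MT = a + b * log2(D/W + 1).
    a and b are device-specific constants; these defaults are
    made-up illustrative values, not measured ones."""
    index_of_difficulty = math.log2(distance / width + 1)
    return a + b * index_of_difficulty

# A distant, small target takes longer to hit than a near, large one:
far_small = fitts_time(distance=800, width=20)
near_big = fitts_time(distance=100, width=100)
```

This is why frequently used controls are best made large and placed close to where the pointer already is (or at screen edges, which act as infinitely "deep" targets).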

Hick–Hyman Law
- The time it takes a person to make a decision, as a function of the number of possible choices.
- Particularly important for menus, although the log2 relationship only holds if the menu is sorted in a logical order (e.g. alphabetical); otherwise search time is linear!
- Other factors:
  - recognition time for an icon or word
  - consistency is good: spatial memory is very powerful (knowing it's at the left/right side, top/bottom)
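The logarithmic-vs-linear contrast for menus can be sketched numerically. In the usual form, decision time is T = b · log2(n + 1) for n equally likely choices; the constant b is empirical, so the values below are illustrative only:

```python
import math

def hick_hyman_time(n_choices, b=0.2):
    """Decision time T = b * log2(n + 1) for n equally likely choices
    in a logically ordered menu. b is an illustrative constant."""
    return b * math.log2(n_choices + 1)

def linear_search_time(n_choices, t_per_item=0.2):
    """If the menu is in no logical order, the user scans item by
    item: expected search visits about half the items."""
    return t_per_item * n_choices / 2

# For a 31-item menu, an ordered (log2) scan beats a linear scan:
ordered = hick_hyman_time(31)       # grows logarithmically
unordered = linear_search_time(31)  # grows linearly
```

The gap widens quickly with menu size, which is one argument for sorting long menus alphabetically.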

Usability studies (empirical user tests)
- Get real users to try to perform specified tasks.
- Observe and record.
- Ask their opinion.
- Analyse the results: What is causing problems? How can you fix them?

Usability studies: roles and set-up
- Specific tasks are observed, recorded and measured.
- Think-aloud: "I try this because ...", "The user doesn't notice ..."
- Facilitator: listens, asks as needed.
- Logkeeper: listens, records problems.
- User: performs tasks, thinks aloud.

Comparative studies
- Evaluate the efficacy of two (or more) systems, or, more often, very fine-grained interaction techniques.
- E.g. which of these magnification lenses is the most effective? (Schmieder 2014, Reducing Screen Occlusions on Small Displays, PhD thesis)
- This is covered in the advanced class (CS705/SE702).

Evaluation
- Evaluation is an integral part of the development process and can take the form of an informal walkthrough or a more structured heuristic evaluation.
- Formal usability testing can begin once a prototype has been developed.
- Evaluation of an existing system is often part of the analysis stage.
- Starting the evaluation process early in the design phase fosters better design, because decisions can be tested before it's too expensive to make changes.

The MacLeamy curve (MacLeamy 2004)
- From the construction industry, but fits the construction of software well too!
- IPD = integrated project delivery

Next: overview of usability testing
- What you might find out, and how you might use it.
- Planning and performing a usability test: you have to write a plan for one in Assignment 1.