Heuristic Evaluation: “Discount” Usability Testing. Adapted from material by Marti Hearst and Loren Terveen.

Evaluating UI Designs
Usability testing is a major technique
– formal techniques require users, controlled experiments, and statistical analysis
“Discount” methods don’t require users
– Heuristic Evaluation
– Cognitive Walkthrough

Heuristic Evaluation
Developed by Jakob Nielsen
Helps find usability problems in a UI design
A small set (3–5) of evaluators examines the UI
– each independently checks for compliance with usability principles (“heuristics”)
– different evaluators will find different problems
– evaluators communicate only afterwards; findings are then aggregated
Can be performed on a working UI or on sketches

Phases of Heuristic Evaluation (adapted from a slide by James Landay)
1) Pre-evaluation training
– give evaluators needed domain knowledge and information on the scenarios
2) Evaluation
– individuals evaluate and then aggregate results
3) Severity rating
– determine how severe each problem is (priority)
4) Debriefing
– discuss the outcome with the design team

Jakob Nielsen’s heuristics: version 1.0 (circa 1990) / revised version (circa 1994)
H1. Simple and natural dialog / Aesthetic and minimalist design
H2. Speak the user’s language / Match between system and the real world
H3. Minimize user memory load / Recognition rather than recall
H4. Be consistent / Consistency and standards
H5. Provide feedback / Visibility of system status
H6. Provide clearly marked exits / User control and freedom
H7. Provide shortcuts / Flexibility and efficiency of use
H8. Provide good error messages / Help users recognize, diagnose, and recover from errors
H9. Prevent errors / Error prevention
H10. Help and documentation / Help and documentation
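When issues are logged electronically, the heuristic set can be kept as a small lookup table so every finding is tagged consistently. A minimal Python sketch; the variable and function names are illustrative assumptions, not part of Nielsen’s material:

# Nielsen's ten usability heuristics (revised 1994 names), keyed by the
# short codes H1-H10 used in the list above.
HEURISTICS = {
    "H1": "Aesthetic and minimalist design",
    "H2": "Match between system and the real world",
    "H3": "Recognition rather than recall",
    "H4": "Consistency and standards",
    "H5": "Visibility of system status",
    "H6": "User control and freedom",
    "H7": "Flexibility and efficiency of use",
    "H8": "Help users recognize, diagnose, and recover from errors",
    "H9": "Error prevention",
    "H10": "Help and documentation",
}

def heuristic_name(code: str) -> str:
    """Look up the full heuristic name for a short code such as 'H4'."""
    return HEURISTICS[code]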

Pros / Cons
+ Cheap (no special lab or equipment)
+ Easy
+ Fast (about 1 day)
+ Cost-effective
+ Detects many problems without users
+ Complementary to task-centered approaches
+ Coverage
+ Catches cross-task interactions
- Requires subjective interpretation
- Does not specify how to fix problems
- Results depend on evaluator expertise (performance improves as evaluator knowledge increases)

Procedure
A set of evaluators (3–5 is about optimal) evaluates the UI (some training may be needed)
Each one independently checks for compliance with the heuristics
– different evaluators find different problems
Evaluators then get together and merge their findings (a sketch of this merge step follows below)
Collectively rate the severity of the problems
Debriefing / brainstorming: how to fix the problems (and point out what’s really good)
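A hedged sketch of the merge step: each evaluator’s independent findings are combined into one list and reports of the same problem are grouped together, keeping a count of how many evaluators reported each issue. The data layout and the string-matching key are assumptions for illustration; in practice the grouping is done by discussion, not by code.

from collections import defaultdict

def merge_findings(per_evaluator_findings):
    """Combine independent findings into one de-duplicated issue list.

    per_evaluator_findings: list of lists, one per evaluator, each item a dict
    like {"heuristic": "H4", "description": "..."}.
    """
    merged = defaultdict(lambda: {"reported_by": 0})
    for findings in per_evaluator_findings:
        for issue in findings:
            # Crude grouping key: same heuristic + same normalized wording.
            key = (issue["heuristic"], issue["description"].strip().lower())
            entry = merged[key]
            entry["heuristic"] = issue["heuristic"]
            entry["description"] = issue["description"]
            entry["reported_by"] += 1
    return list(merged.values())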

How to Perform H.E. (adapted from a slide by James Landay)
At least two passes for each evaluator
– first to get a feel for the flow and scope of the system
– second to focus on specific elements
Assistance from implementors / domain experts
– if the system is walk-up-and-use or the evaluators are domain experts, no assistance is needed
– otherwise, supply evaluators with scenarios and have implementors standing by

How to Perform Evaluation (adapted from a slide by James Landay)
Where problems may be found:
– a single location in the UI
– two or more locations that need to be compared
– a problem with the overall structure of the UI
– something that is missing

Example Problem Descriptions (adapted from a slide by James Landay)
Have to remember command codes
– violates “Minimize the users’ memory load” (H3)
– fix: add drop-down box with selectable codes
Typography uses mix of upper/lower case formats and fonts
– violates “Consistency and standards” (H4)
– slows users down
– probably wouldn’t be found by user testing
– fix: pick a single format for entire interface

Severity ratings
Used to allocate resources to fix problems
Should be determined after all evaluations are done
Should be rated independently by all evaluators
Based on:
– frequency with which the problem will occur
– impact of the problem (hard or easy to overcome)
– persistence (will users learn a workaround, or will they be bothered every time?)
Scale:
1 – cosmetic problem
2 – minor usability problem
3 – major usability problem; important to fix
4 – usability catastrophe; must fix
(A small aggregation sketch follows below.)
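Because every evaluator rates every problem independently, the individual ratings are usually combined, for example by averaging, to prioritize fixes. A minimal sketch under that assumption; the slide itself does not prescribe a particular aggregation rule:

from statistics import mean

SEVERITY_SCALE = {1: "cosmetic", 2: "minor", 3: "major", 4: "catastrophe"}

def combined_severity(ratings):
    """Average the 1-4 severity ratings given independently by each evaluator.

    ratings: list of ints, one per evaluator, e.g. [3, 4, 3].
    Sort the issue list by this value (descending) to decide what to fix first.
    """
    return mean(ratings)

# Example: three evaluators rate the "Save" vs. "Write file" inconsistency.
print(combined_severity([3, 3, 4]))  # ~3.33 -> treat as a major problem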

Severity Ratings Example (adapted from a slide by James Landay)
1. [H4 Consistency] [Severity 3] The interface used the string "Save" on the first screen for saving the user's file, but used the string "Write file" on the second screen. Users may be confused by this different terminology for the same function.

Debriefing (adapted from a slide by James Landay)
Conduct with evaluators, observers, and development team members
Discuss general characteristics of the UI
Suggest potential improvements to address major usability problems
Development team rates how hard things are to fix
Make it a brainstorming session

Results of Using HE (adapted from a slide by James Landay)
A single evaluator achieves poor results
– finds only about 35% of usability problems
– 5 evaluators find about 75% of usability problems
Why not more evaluators? 10? 20?
– adding evaluators costs more
– additional evaluators find few new unique problems
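The diminishing-returns pattern behind these numbers is often described with Nielsen and Landauer’s model, found(i) = N(1 - (1 - lambda)^i), where N is the total number of problems and lambda is the proportion a single evaluator finds. A short sketch; lambda = 0.31 is an illustrative value (the rate varies by project, which is why it does not exactly reproduce the 35%/75% figures from the specific study on this slide):

def proportion_found(num_evaluators, single_rate=0.31):
    """Nielsen & Landauer model: share of all problems found by i evaluators.

    single_rate (lambda) is the average proportion one evaluator finds;
    0.31 is an assumed illustrative value.
    """
    return 1 - (1 - single_rate) ** num_evaluators

for i in (1, 3, 5, 10, 20):
    print(i, round(proportion_found(i), 2))
# 1 -> 0.31, 3 -> 0.67, 5 -> 0.84, 10 -> 0.98, 20 -> 1.0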

Decreasing Returns (adapted from a slide by James Landay)
[Graphs from Nielsen: problems found and benefits/cost as a function of the number of evaluators]
Caveat: these curves are for a specific example; this is a controversial point.

Why Multiple Evaluators?
No single evaluator finds every problem
Good evaluators find both easy and hard problems

Exercise
Evaluate an application using heuristic evaluation
– bring your computer if you have one!
– refer back to the slide with the 10 heuristics
Fill out issues found on the next slide(s) and submit them when you are done or we run out of time, whichever comes first
We will discuss and debrief after you are done

Heuristic Evaluation Issue Log Example
Heuristic | Severity | Description
H4 Consistency | 3 | The interface used the string "Save" on the first screen for saving the user's file, but used the string "Write file" on the second screen. Users may be confused by this different terminology for the same function.
H8 Error messages | 4 | Entering invalid input into the dialog box on the first form results in "Error 103".
…
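When the log is kept as data rather than on slides, each row can be a small record. A hedged sketch of one possible structure; the class and field names are assumptions, not a prescribed format:

from dataclasses import dataclass

@dataclass
class Issue:
    heuristic: str    # e.g. "H4 Consistency"
    severity: int     # 1 (cosmetic) .. 4 (catastrophe), combined rating
    description: str  # where the problem occurs and why it violates the heuristic

log = [
    Issue("H4 Consistency", 3,
          "First screen says 'Save' but the second screen says 'Write file' "
          "for the same function; users may be confused."),
    Issue("H8 Error messages", 4,
          "Entering invalid input into the dialog box on the first form "
          "only reports 'Error 103'."),
]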

Heuristic Evaluation Issue Log (blank)
Heuristic | Severity | Description

