Heuristic Evaluation Professor: Tapan Parikh TA: Eun Kyoung Choe


Heuristic Evaluation
Professor: Tapan Parikh
TA: Eun Kyoung Choe
Lecture #7 - February 14th
User Interface Design and Development

Today’s Outline 1) Nielsen’s Heuristics 2) Heuristic Evaluation 3) Severity and Fixability 4) Example

Heuristic Evaluation A cheap and effective way to find usability problems A small set of expert evaluators “examine the interface and judge its compliance with recognized usability principles” Discount usability testing - find problems earlier and relatively cheaply, without involving real users

What Heuristics? Your readings provide a number of high-level and low-level design guidelines: –Donald Norman, Design of Everyday Things –Jeff Johnson, GUI Bloopers –Jakob Nielsen, Usability Engineering Other heuristics can be provided by your own intuition and common sense We will focus on Nielsen’s list from Usability Engineering

Nielsen’s Heuristics Simple and Natural Dialog Speak the User’s Language Minimize User Memory Load Consistency Feedback Clearly Marked Exits Shortcuts Good Error Messages Prevent Errors Help and Documentation

Simple and Natural Dialog Match the user’s task Minimize navigation Present exactly the information the user needs, when she needs it Use good graphic design Less is more

Adapted from Saul Greenberg

Speak the User’s Language Use the same terms the user would Avoid unusual word meanings Support synonyms and aliases Don’t impose naming conventions Understand users and how they view their domain

Adapted from Jake Wobbrock

Minimize User Memory Load Recognize rather than Recall Edit rather than Enter Choose rather than Input Provide a small number of basic commands

Adapted from Saul Greenberg

Consistency Ensure that the same action always has the same effect (or, avoid modes) Present the same information in the same location Follow established standards and conventions

Adapted from Saul Greenberg

Provide Feedback Continuously inform the user about what is going on Restate and rephrase user input Provide warnings for irreversible actions Give informative feedback even if the system fails

Adapted from Saul Greenberg What did I select? What mode am I in now? How is the system interpreting my actions?

Response Times 0.1 second - perceived as instantaneous 1 second - user’s flow of thought stays uninterrupted, but delay noticed 10 seconds - limit for keeping user’s attention focused on the dialog >10 seconds - user will want to perform other tasks while waiting
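These response-time limits suggest a simple rule for choosing feedback. Below is a minimal Python sketch; the function name and the returned feedback labels are hypothetical, but the thresholds are the ones on the slide.

```python
def feedback_for_delay(seconds):
    """Choose UI feedback based on the response-time limits above
    (0.1 s, 1 s, 10 s). Labels are illustrative, not a real API."""
    if seconds <= 0.1:
        return "none"            # perceived as instantaneous
    if seconds <= 1.0:
        return "subtle cue"      # delay noticed, flow of thought intact
    if seconds <= 10.0:
        return "busy indicator"  # keep attention focused on the dialog
    return "progress bar"        # user will switch tasks; show progress

print(feedback_for_delay(0.05))  # none
print(feedback_for_delay(5))     # busy indicator
```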

Waiting… Provide a progress indicator for any operation longer than ten seconds Reassure the user the system hasn’t crashed Indicate how long the user has to wait Provide something to look at If you can’t provide specific progress, use a generic “working” indicator –spinning ball in Mac OS X

Adapted from Saul Greenberg

Clearly Marked Exits Don’t “trap” the user Provide an easy way out of trouble Encourage exploratory learning Mechanisms: –Cancel –Undo, Revert, Back –Interrupt –Exit

Adapted from Saul Greenberg

Adapted from Jake Wobbrock

Shortcuts Allow expert users to go fast Avoid GUI operations Mechanisms: –Keyboard shortcuts –Macros, scripts –Type ahead –Bookmarks, History

Adapted from Saul Greenberg Keyboard accelerators for menus Customizable toolbars and palettes for frequent actions Split menu, with recently used fonts on top Scrolling controls for page-sized increments Double-click raises object- specific menu Double-click raises toolbar dialog box

Good Error Messages Phrased in clear language Avoid obscure codes Precisely indicate the problem Restate user input Do not blame the user Constructively suggest a solution Opportunity to help user in time of need

Adapted from Jake Wobbrock

Prevent Errors Bounds-checking Select rather than Enter Judicious use of confirmation screens Avoid modes, unless they are clearly visible or require action to maintain

Adapted from Saul Greenberg

Adapted from Jake Wobbrock

Help and Documentation Easy to search Task-oriented List concrete steps Provide context-specific help Shouldn’t be too large Is not a substitute for good design

Kinds of Help Tour / Demo Tutorials User Guide / Reference manual Searchable index Tooltips, Balloon Help Reference cards Keyboard templates Adapted from Saul Greenberg

Conducting a Heuristic Evaluation Can use hi-fi or lo-fi prototype Each session should last 1-2 hours Evaluator should go through the interface several times, with specific tasks in mind –First pass: overall feel and scope, identify obvious violations –Second pass: focus on specific elements

Conducting a Heuristic Evaluation 3-5 evaluators are enough to uncover most important problems Each evaluator should inspect the interface alone (to reduce bias) After the session, the evaluators aggregate observations Output is a list of usability problems

Conducting a Heuristic Evaluation If the system is intended to be “walk up and use”, then evaluators should be provided with minimal help If the system requires training, then evaluators should be trained and given an example scenario Evaluators can be helped after they have made an attempt and articulated their difficulties

Steps in Heuristic Evaluation Pre-evaluation training Evaluation Severity / Fixability rating Debriefing

Severity Ratings Provided by each evaluator Based on frequency, impact, persistence Combined into a single numeric index Average taken across evaluators Allows for prioritization of fixes

Severity Ratings 0: don’t agree that this is a problem 1: cosmetic problem 2: minor problem 3: major problem; important to fix 4: catastrophe; imperative to fix
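The averaging step described above can be sketched in a few lines of Python; the problem names and per-evaluator ratings here are hypothetical.

```python
# Hypothetical ratings: each evaluator scores the same problem 0-4.
ratings = {
    "no battery indicator": [3, 4, 3],
    "no exit from message screen": [4, 3, 4],
}

# Combine into a single index by averaging across evaluators,
# then list problems in priority order (highest severity first).
averaged = {p: sum(r) / len(r) for p, r in ratings.items()}
for problem, severity in sorted(averaged.items(), key=lambda kv: -kv[1]):
    print(f"{severity:.1f}  {problem}")
```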

Debriefing Conducted with evaluators, observers, and development team members Discuss general characteristics of UI Suggest improvements to address major usability problems Dev team provides fixability ratings Make it a brainstorming session Adapted from Saul Greenberg

Fixability Describes how easy each problem would be to fix Requires some technical knowledge of the system With severity, allows for estimating “cost-benefit” Evaluators also provide possible fix as guidance to development team

Fixability 0: Impossible to Fix 1: Nearly Impossible to Fix 2: Difficult to Fix 3: Easy to Fix 4: Trivial to Fix
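Combining severity with fixability gives the rough “cost-benefit” estimate mentioned above (it appears as the Sum column in the sample report on the next slide). A minimal Python sketch, with hypothetical data:

```python
# Each problem: averaged severity (0 = not a problem, 4 = catastrophe)
# and fixability (0 = impossible to fix, 4 = trivial to fix).
problems = [
    ("no battery indicator", 3.3, 4),
    ("no exit from message screen", 3.7, 3),
]

# A high sum means a severe problem that is also cheap to fix,
# so it should be addressed first.
ranked = sorted(problems, key=lambda p: p[1] + p[2], reverse=True)
for name, severity, fixability in ranked:
    print(f"{severity + fixability:.1f}  {name}")
```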

Output A list of problems with heuristics, severity, fixability and possible fixes

Evaluator: John T. Doe
Date: January 1, 2008
System: Nokia Mobile Phone Model #9999

Number: 1
Heuristic: Visibility of system status
Location: Home screen
Description: The home screen does not portray any information about battery power remaining, making it hard for users to tell how much power they have left
Possible Fix: Display a battery life indicator on the home screen.

Number: 2
Heuristic: User control and freedom
Location: Screen for writing a text message
Description: Once you are on the screen for writing a text message, you cannot leave without sending the message. Users need a way to get out if they decide not to send a message
Possible Fix: Allow the CLR button to always move the user back one screen no matter where they are.

Adapted from Jake Wobbrock

For Next Time Start working on Assignment 2! Reading about Usability Testing