Heuristic Evaluation November 3, 2016


Heuristic Evaluation [Speaker notes: change next year: make heuristic examples animate BEFORE showing the heuristic. Handed out paper for the exercise at 2:25 (55 minutes into lecture); let them work for 15 minutes; started talking about it at 2:40 and went until 2:58, leaving 22 minutes for teams. Today I'm going to tell you about a useful technique in the evaluation of the DESIGN of software products…]

Hall of Fame or Shame? LG F7100 "Qiblah phone" (courtesy of Genevieve Bell, Intel). Launched in 2004 in the UAE, Saudi Arabia, North Africa, India, and Malaysia.
dt+UX: Design Thinking for User Experience Design, Prototyping & Evaluation

Hall of Fame! Good: the LG F7100 "Mecca Phone" is targeted at a Muslim audience, who need to pray 5x/day pointing towards Mecca.

Hall of Fame or Shame? WhatsApp mobile messaging. It works on some non-smart phones… which allowed it to really take off in the developing world.

Hall of Fame! WhatsApp mobile messaging. Good: users do not have to pay for expensive SMS, can contact people on other phones, and it works on "feature" phones → led to rapid take-off in the developing world.

Heuristic Evaluation [Speaker notes: 75 minutes; was shooting for 60 minutes including a 25-minute exercise. Gave 15 minutes to do the exercise and discussed it for 10 minutes; cut a little more. Lecture ended at 12:45, giving them 35 minutes for team work (shooting for 45-50 minutes).]

Outline: Wizard of Oz; Heuristic Evaluation Overview; The Heuristics; Exercise; Team Break.

Wizard of Oz Technique: faking the interaction. Where does the name come from? The film "The Wizard of Oz": "the man behind the curtain." There is a long tradition in the computer industry, e.g., a prototype of a PC with a DEC VAX behind the curtain. [Notes: 1) "Wizard of Oz" is when you use a person or some other way to fake an interaction that has not been implemented yet. An early example in computing: when designing a speech-based UI, use a human to do the recognition (and even the output) to simulate a functioning system before it is built. Similar things have been done for handwriting recognition. Today, if your application needed location-sensing features that weren't accurate enough yet, or some more complicated machine-learning feature that didn't exist yet, you could have a person (not the end user) select the result from a menu and have that drive the experience of a user testing your application. To the end user it would appear as if your application worked as designed.]

Wizard of Oz Technique (cont.): much more important for hard-to-implement features, e.g., speech & handwriting recognition. How would we do it for VR? [Notes: 2) "Hard-coded features" are similar. Often an application needs to rely on a large data set (1000s of products, 100s of friends in a social network, 1000s of places with review information, etc.). None of this data is impossible to get in practice, but for testing your user interface you may want to "hard-code" some of it in your app to give the flavor of how it will work (e.g., all the places are on the Stanford campus, you type in 30 products, or you simulate the friends in the social network so all your users see the SAME list of friends), allowing you to test or demo it.]

Wizard of Oz Technique (cont.): example — Carbon Shopper.
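The "wizard" and "hard-coded data" ideas in the notes above can be sketched in code. This is a minimal hypothetical illustration (the names `WizardRecognizer`, `PLACES`, and `search_places` are invented here, not from the lecture or any real system):

```python
# Sketch of a Wizard-of-Oz "speech recognizer": the end user believes the
# system recognizes speech, but a hidden human operator supplies the text.
# The places database is hard-coded, as described in the lecture notes.

PLACES = [  # hard-coded stand-in for a real places data set
    {"name": "Green Library", "campus": "Stanford", "rating": 4.5},
    {"name": "Tresidder Union", "campus": "Stanford", "rating": 4.1},
]

class WizardRecognizer:
    def __init__(self, wizard_input=input):
        # wizard_input is how the hidden operator "types"; injectable for tests
        self.wizard_input = wizard_input

    def recognize(self, audio_clip):
        # Instead of real speech recognition, ask the hidden operator.
        return self.wizard_input(f"[wizard] transcribe {audio_clip!r}: ")

def search_places(query, places=PLACES):
    """UI-level search that runs against the hard-coded data set."""
    return [p for p in places if query.lower() in p["name"].lower()]

# Simulate the wizard answering "green library" for one audio clip.
recognizer = WizardRecognizer(wizard_input=lambda prompt: "green library")
query = recognizer.recognize("clip-001.wav")
print(search_places(query))
```

To the test participant, the app appears to do speech-driven search; only the team knows a person is behind the curtain.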

Evaluation: about figuring out how to improve the design. Issues with lo-fi tests? [Ask, and THEN next slide.]

Evaluation (cont.): issues with lo-fi tests: not realistic (visuals & performance); not on the actual interface (can't test alone); need participants (can be hard to find repeatedly).

Heuristic Evaluation: developed by Jakob Nielsen. Helps find usability problems in a UI design. A small set (3-5) of evaluators examines the UI independently, checking for compliance with usability principles ("heuristics"); evaluators only communicate afterwards; findings are then aggregated; the violations are used to redesign/fix problems. Can be performed on a working UI or on sketches.

Why multiple evaluators? Every evaluator doesn't find every problem, and good evaluators find both easy & hard ones.

Heuristics: H2-1 Visibility of system status; H2-2 Match between system & real world; H2-3 User control & freedom. [Example: Duolingo — when practicing, there is no way to ask to "test out" like in other parts of the UI; you need to quit & go select the function. (Courtesy Armando Mota.)]

Heuristics (cont.): H2-4 Consistency & standards; H2-5 Error prevention; H2-6 Recognition rather than recall; H2-7 Flexibility & efficiency of use; H2-8 Aesthetic & minimalist design; H2-9 Help users recognize, diagnose, & recover from errors. [These examples are all from the same UI… what's the problem? Why? (→ consistency)]

Heuristics (cont.): bad — Galaxy S4 error message.

Heuristics (cont.): good example.

Good Error Messages: clearly indicate what has gone wrong; human readable; polite; describe the problem; explain how to fix it; highly noticeable.

Heuristic Violation Examples: [H1-3 Minimize the users' memory load] Can't copy info from one window to another; fix: allow copying. [H2-4 Consistency and standards] Typography uses different fonts in 3 dialog boxes, which slows users down and probably wouldn't be found by user testing; fix: pick a single format for the entire interface.

Severity Ratings: 0 - don't agree that this is a usability problem; 1 - cosmetic problem; 2 - minor usability problem; 3 - major usability problem, important to fix; 4 - usability catastrophe, imperative to fix.

Severity Ratings Example: 1. [H1-4 Consistency] [Severity 3] The interface used the string "Save" on the first screen for saving the user's settings, but used the string "Store" on the second screen. Users may be confused by this different terminology for the same function.

Decreasing Returns [two graphs: problems found, and benefits/cost, as the number of evaluators grows]. * Caveat: graphs are for a specific example.
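The shape of this diminishing-returns curve is commonly modeled (following Nielsen & Landauer) as found(i) = N(1 − (1 − λ)^i), where λ is the probability that a single evaluator finds a given problem. A quick sketch, using the often-cited average λ ≈ 0.31 purely as an assumption:

```python
# Expected proportion of usability problems found by i independent
# evaluators under the Nielsen & Landauer model: 1 - (1 - lam)**i.
# lam = 0.31 is a commonly cited average, not a universal constant.

def proportion_found(i, lam=0.31):
    return 1 - (1 - lam) ** i

for i in (1, 3, 5, 10):
    print(f"{i:2d} evaluators -> {proportion_found(i):.0%} of problems")
```

This is why 3-5 evaluators is the sweet spot: the first few each add a lot, while the tenth adds very little.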

Heuristic Evaluation Summary: have evaluators go through the UI twice; ask them to check whether it complies with the heuristics, noting where it doesn't and saying why; combine the findings from 3 to 5 evaluators; have evaluators independently rate severity; alternate with user testing.
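The last two aggregation steps above (combine findings, then average independent severity ratings) can be sketched as follows. The data structures here are hypothetical sample data, not from the lecture:

```python
# Merge violations reported independently by several evaluators, keyed by
# (heuristic, problem description), then average the 0-4 severity ratings
# each evaluator assigned afterwards and rank worst-first.
from collections import defaultdict
from statistics import mean

reports = [  # one list of (heuristic, problem, severity) per evaluator
    [("H2-4 Consistency", "'Save' vs 'Store' for same function", 3)],
    [("H2-4 Consistency", "'Save' vs 'Store' for same function", 4),
     ("H2-5 Error prevention", "quantity field accepts letters", 2)],
    [("H2-5 Error prevention", "quantity field accepts letters", 3)],
]

merged = defaultdict(list)
for report in reports:
    for heuristic, problem, severity in report:
        merged[(heuristic, problem)].append(severity)

# Sort by mean severity, worst first, so catastrophes are fixed before
# cosmetic problems.
ranked = sorted(merged.items(), key=lambda kv: -mean(kv[1]))
for (heuristic, problem), sevs in ranked:
    print(f"[{heuristic}] sev {mean(sevs):.1f} "
          f"({len(sevs)}/{len(reports)} found): {problem}")
```

Keeping the ratings independent before merging matches the rule that evaluators only communicate after their passes.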

e x e r c i s e — let's take 15 minutes.

Find 12-15 Heuristic Violations: http://hci.stanford.edu/courses/cs147/2016/au/assignments/simple-heuristic-evaluation.pdf. Let's take 15 minutes. [Notes: handed out paper for the exercise at 2:25, 55 minutes into lecture.]

Problems Found
1. H2-5 Error prevention: allows non-numeric data in the quantity field. Fix: don't allow it. [90]
2. H2-5 Error prevention: quantity field doesn't multiply by the price to give a correct total. Fix: make it work. [55]
3. H2-10 Help & documentation: link for international visitors is hidden at the bottom & may not be readable by non-English speakers. Fix: move it up to a prominent location & include flags? [30]
4. H2-5 Error prevention: "Remove item bolded in red," but red is used for multiple purposes. Fix: get rid of ads in the checkout! A more direct way to remove an out-of-stock item, or don't even let me add an item that is out of stock. [100]
5. H2-5 Error prevention: no way to check out. [110]
[Collect and discuss problems for 10 minutes.]

Problems Found Last Year
H2-4 Consistency: remove column — the 4th item is different, with checkboxes. [150]
H2-9 Error prevention: non-numeric data in the quantity field. Do not allow it. [125]
H2-2 Match between system & real world: vehicle selection link is not the language I'd expect. [100]
H2-1 Visibility of system status: unclear which item to remove based on the error message ("red/bold"). [150]
[Collect and discuss problems for 10 minutes.]

[Cornell class results, with the number who found each problem in parentheses.]


Further Reading on Heuristic Evaluation. Longer lecture: https://drive.google.com/file/d/0BweiB6wu4sBaN2tfZGxKb2tuOTg/view. Books: Usability Engineering, Nielsen, 1994. Web site: http://www.nngroup.com/articles/

Next Time. Lecture: Visual Information Design; watch Scott Klemmer's HCI Online lectures: 6.1 Visual Design (7:37), 6.2 Typography (10:47), 6.3 Grids & Alignment (17:33). Studio: Medium-fi Prototype Presentations; start Heuristic Evaluation.

Team break. [Break at 2:58; 22 minutes for teams.]