
Heuristic Evaluation
Yaser Ghanam

Roadmap
- Introduction
- How it works
- Advantages
- Shortcomings
- Conclusion
- Exercise

Introduction
Introduced by Nielsen as a discount usability method, applied early in design or during implementation. Given:
- a prototype or a working system
- a set of usability heuristics
- a few evaluators
the method produces a usability evaluation of the system.

Introduction
[Diagram: the evaluators apply the usability heuristics to the system and produce the usability evaluation.]

How it works - Procedure
1. Get the heuristics
2. Get the system ready
3. Get the evaluators
4. Do the evaluation
5. Compile the results
6. Conduct severity rating
7. Develop an action plan

Get the heuristics
Heuristics are system-dependent. Nielsen's ten heuristics have proved reliable and representative. Feel free to add heuristics, but not many, and feel free to drop irrelevant ones.

Usability Heuristics Visibility of system status

Usability Heuristics Match between system and the real world

Usability Heuristics User control and freedom

Usability Heuristics Consistency and standards (Greenberg, S., Overview of Heuristic Evaluation, wiki/uploads/CPSC681/Heuristic.ppt, accessed October 10.)

Usability Heuristics Error prevention

Usability Heuristics Help users recognize, diagnose, and recover from errors

Usability Heuristics Recognition rather than recall

Usability Heuristics Flexibility and efficiency of use: provide shortcuts (e.g. Ctrl+C for Copy) and a normal mode vs. an advanced mode. Shortcuts should accelerate expert use, not be the only alternative.

Usability Heuristics Aesthetic and minimalist design

Usability Heuristics Help and documentation: good usability means users are able to use the system with no or minimal documentation, but help should still be available.


Get the system ready
- Prototype: for a novel application or interface; no redesign required and less maintenance later.
- Working system: for a replacement study or a comparison with competing products.
- Prepare typical scenarios through task analysis.


Get the evaluators
HE is a group effort (Nielsen, J., and Mack, R.L. (Eds.), Usability Inspection Methods, John Wiley & Sons, New York, NY, 1994).

Get the evaluators
The more the better? Not necessarily. Rule of thumb: 3 to 5 evaluators (Nielsen and Mack, 1994).
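The 3-to-5 rule comes from Nielsen's problem-discovery model: with i independent evaluators, the expected fraction of problems found is 1 - (1 - L)^i, where L is the probability that a single evaluator finds any given problem (about 0.31 on average in Nielsen's studies). A minimal sketch in Python; the function name and printed values are illustrative:

```python
def proportion_found(evaluators: int, discovery_rate: float = 0.31) -> float:
    """Expected fraction of usability problems found by a group of
    independent evaluators, using Nielsen's model 1 - (1 - L)**i.
    discovery_rate (L) is the average chance that one evaluator
    spots any given problem; 0.31 is Nielsen's reported average."""
    return 1 - (1 - discovery_rate) ** evaluators

# Diminishing returns: five evaluators already find about 84% of problems.
for i in (1, 3, 5, 10):
    print(f"{i} evaluators -> {proportion_found(i):.0%}")
```

The curve flattens quickly, which is why adding a sixth or seventh evaluator rarely pays for itself.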

Get the evaluators
Evaluators' expertise:
- Novices: ideally potential users of the system
- Usability experts: more effective
- Double experts (in both usability and the domain): the best to get, but very expensive
Session manager: facilitates the evaluation session and aggregates the reports.
Observers: provide help to evaluators.


Do the evaluation
- Evaluators get the heuristics and the scenarios.
- They navigate through the system twice.
- They inspect the screens, dialogues, forms, messages and menus in the system.
- They categorize each problem under one of the heuristics, giving a specific explanation.
- They may make comments beyond the heuristics.
- They report in writing or verbally to the observer.

Do the evaluation
Observers answer evaluators' questions, especially domain-specific ones, but without influencing their judgments. Important: the inspection is done individually, and evaluators are not allowed to communicate with each other. A session takes 1 to 2 hours.


Compile the results
Aggregate the evaluators' reports: eliminate duplicate entries and merge differently worded reports of the same problem. Output: one report of all usability problems found by the evaluators.
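The compilation step can be sketched as a small merge routine. The finding fields used here ('heuristic', 'location', 'description') are an assumed report format for illustration, not part of the method itself:

```python
from collections import defaultdict

def compile_reports(reports):
    """Merge per-evaluator reports into one deduplicated problem list.
    Each report is a list of findings; a finding is a dict with
    'heuristic', 'location' and 'description'. Findings that name the
    same heuristic at the same location are treated as one problem;
    their distinct descriptions are kept and merged."""
    merged = defaultdict(set)
    for report in reports:
        for finding in report:
            key = (finding["heuristic"], finding["location"])
            merged[key].add(finding["description"])
    return [{"heuristic": h, "location": loc, "descriptions": sorted(descs)}
            for (h, loc), descs in sorted(merged.items())]
```

In practice the session manager makes this merge judgment by hand; the code only shows the bookkeeping.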


Conduct severity rating
Now that the evaluators are aware of all the usability problems found, each rates every problem. Severity is determined by:
- frequency of occurrence
- impact on the user
- persistence
Scale:
0 - Not a problem at all
1 - Cosmetic problem
2 - Minor problem
3 - Major problem
4 - Catastrophic problem
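Assuming each evaluator assigns a 0-4 score to every compiled problem (as the scale above suggests), the group ratings can be averaged and ranked. This is an illustrative sketch, not a prescribed part of the method:

```python
from statistics import mean

SCALE = {0: "not a problem", 1: "cosmetic", 2: "minor",
         3: "major", 4: "catastrophic"}

def rank_by_severity(ratings):
    """ratings maps a problem id to the list of 0-4 scores it received,
    one score per evaluator. Returns (problem, mean score, label)
    tuples sorted worst-first; the label comes from the rounded mean."""
    ranked = sorted(ratings.items(), key=lambda kv: mean(kv[1]), reverse=True)
    return [(pid, round(mean(scores), 2), SCALE[round(mean(scores))])
            for pid, scores in ranked]
```

The worst-first ordering feeds directly into the action plan in the next step.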


Develop an action plan
The evaluators, the facilitator and the design team meet to discuss the problems and suggest solutions, taking the organization's priorities into account. Decisions may include: fix the major problems but not the minor ones, delay the release, replace the interface, etc.

Advantages
A discount usability method: a few evaluators find many problems. Easy to teach. Fast to conduct. Cheap. Can be used early in design. High benefit-to-cost ratio (estimated at 48 to 1).

Shortcomings
- The reduced set of heuristics is very broad and general.
- It usually does not involve end users.
- It finds many minor problems, sometimes causing false alarms.
- It is not suitable for in-depth usability testing or for critical systems.

Conclusion
A discount usability engineering method, best for time-constrained, budget-limited projects. Three to five evaluators apply ten heuristics and find many problems in a short time. It does not replace other usability methods.

Thanks for listening. Questions?

Exercise
Reference: Nielsen, J. (1993). Usability Engineering. Academic Press.