Heuristic Evaluation: Evaluating with experts


Discount Evaluation Techniques
 Basis: observing users can be time-consuming and expensive
 Try to predict usability rather than observing it directly
 Conserve resources (quick & low cost)

Approach: Inspections
 Expert reviewers are used: HCI experts interact with the system, try to find potential problems, and give prescriptive feedback
 Best if they:
    Haven’t used an earlier prototype
    Are familiar with the domain or task
    Understand user perspectives

Discount Evaluation Methods
1. Scenarios
2. Heuristic Evaluation
3. Cognitive Walkthrough (separate presentation)

Heuristic Evaluation
 Developed by Jakob Nielsen
 Several expert usability evaluators assess the system against simple, general heuristics (principles or rules of thumb)

Heuristic Evaluation
 Mainly qualitative
 Used with experts
 Predictive

Procedure
1. Gather inputs
2. Evaluate system
3. Debriefing and collection
4. Severity rating

1: Gather Inputs
 Who are the evaluators? They need to learn about the domain and its practices
 Get the prototype to be studied: it may vary from mock-ups and storyboards to a working system

How many experts?
 Nielsen found that about 5 evaluators uncover roughly 75% of the problems
 Adding more evaluators finds additional problems, but with diminishing returns
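Nielsen and Landauer modeled this diminishing-returns curve as found(i) = N(1 − (1 − λ)^i), where N is the total number of problems and λ is the chance that one evaluator spots a given problem. A minimal Python sketch of the curve; λ = 0.31 is the average Nielsen reported across projects and is an illustrative assumption, not a property of your project:

```python
# Sketch of the Nielsen & Landauer problem-discovery model:
#   found(i) = N * (1 - (1 - lam)**i)
# lam = probability that one evaluator spots a given problem.
# lam = 0.31 is an average Nielsen reported; real projects vary,
# so treat it as an illustrative assumption.

def proportion_found(evaluators: int, lam: float = 0.31) -> float:
    """Expected fraction of all usability problems found."""
    return 1.0 - (1.0 - lam) ** evaluators

if __name__ == "__main__":
    for i in (1, 3, 5, 10):
        print(f"{i:2d} evaluators -> {proportion_found(i):.0%}")
```

With λ = 0.31 five evaluators find about 84% of problems; the slide’s 75% figure corresponds to a lower λ of roughly 0.24, which shows how sensitive the headline number is to that parameter.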

2: Evaluate System
 Reviewers evaluate the system based on high-level heuristics
 Where to get heuristics? Several published sets follow

Heuristics
 Use simple and natural dialog
 Speak the user’s language
 Minimize memory load
 Be consistent
 Provide feedback
 Provide clearly marked exits
 Provide shortcuts
 Provide good error messages
 Prevent errors

Nielsen’s Heuristics
 Visibility of system status
 Aesthetic and minimalist design
 User control and freedom
 Consistency and standards
 Error prevention
 Recognition rather than recall
 Flexibility and efficiency of use
 Recognition, diagnosis, and recovery from errors
 Help and documentation
 Match between system and real world

Groupware heuristics
 Provide the means for intentional and appropriate verbal communication
 Provide the means for intentional and appropriate gestural communication
 Provide consequential communication of an individual’s embodiment
 Provide consequential communication of shared artifacts (i.e., artifact feedthrough)
 Provide protection
 Manage the transitions between tightly and loosely coupled collaboration
 Support people with the coordination of their actions
 Facilitate finding collaborators and establishing contact
(Baker, Greenberg, and Gutwin, CSCW 2002)

Ambient heuristics
 Useful and relevant information
 “Peripherality” of display
 Match between design of ambient display and environments
 Sufficient information design
 Consistent and intuitive mapping
 Easy transition to more in-depth information
 Visibility of state
 Aesthetic and pleasing design
(Mankoff et al., CHI 2003)

Process
 Perform two or more passes through the system, inspecting:
    Flow from screen to screen
    Each screen
 Evaluate against the heuristics
 Find “problems”
    Subjective (if you think it is a problem, it is)
    Don’t dwell on whether it is or isn’t

3: Debriefing
 Organize all problems found by the different reviewers
 At this point, decide what are and aren’t real problems
 Group and structure them
 Document and record them
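The grouping step above can be sketched as collapsing raw findings by screen and heuristic, so duplicate reports from different reviewers merge into one problem. All field names and sample findings here are hypothetical:

```python
from collections import defaultdict

# Hypothetical raw findings from three independent reviewers.
# Each finding: (reviewer, screen, heuristic violated, description).
findings = [
    ("A", "login", "visibility of system status", "no spinner while logging in"),
    ("B", "login", "visibility of system status", "no feedback after clicking Login"),
    ("B", "search", "error prevention", "empty query allowed"),
    ("C", "login", "visibility of system status", "page appears frozen"),
]

# Group duplicates: same screen + same heuristic -> one merged problem.
grouped: dict[tuple[str, str], list[str]] = defaultdict(list)
for reviewer, screen, heuristic, description in findings:
    grouped[(screen, heuristic)].append(f"{reviewer}: {description}")

for (screen, heuristic), notes in grouped.items():
    print(f"[{screen}] {heuristic} ({len(notes)} reports)")
    for note in notes:
        print(f"  - {note}")
```

Keeping each reviewer’s wording attached to the merged entry helps the group decide in debriefing whether the reports really describe the same problem.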

4: Severity Rating
 Based on:
    Frequency
    Impact
    Persistence
    Market impact
 Rating scale:
    0: not a problem
    1: cosmetic issue; fix only if there is extra time
    2: minor usability problem; low priority
    3: major usability problem; high priority
    4: usability catastrophe; must be fixed
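A common way to combine the 0–4 ratings that evaluators assign independently is to average them and order problems by that mean. A minimal sketch; the problem names and ratings are invented sample data:

```python
from statistics import mean

# 0-4 severity scale from the slide:
# 0 = not a problem ... 4 = usability catastrophe.
# One independent rating per evaluator for each problem (invented data).
ratings = {
    "no feedback after clicking Login": [4, 3, 4],
    "inconsistent button labels":       [2, 2, 1],
    "logo slightly off-center":         [1, 0, 1],
}

# Average across evaluators, then fix the worst problems first.
prioritized = sorted(ratings.items(), key=lambda kv: mean(kv[1]), reverse=True)

for problem, scores in prioritized:
    print(f"{mean(scores):.1f}  {problem}")
```

Averaging is only one convention; some teams instead take the maximum rating so that a single evaluator flagging a catastrophe is enough to escalate the problem.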

Advantages
 Few ethical issues to consider
 Inexpensive and quick
 An evaluator practiced in the method and knowledgeable about the domain is especially valuable

Challenges
 Problem assessment is very subjective; it depends on the expertise of the reviewers
 Why are these the right heuristics? Others have been suggested
 How do we determine what is a true usability problem? Some recent papers suggest that many identified “problems” really aren’t

Your turn
 Banner: looking at courses
 Use Nielsen’s heuristics (p. 408)
 List all problems
 Come up to the board and put up at least one new one
 We’ll rate them as a group

Nielsen’s Heuristics
 Visibility of system status
 Aesthetic and minimalist design
 User control and freedom
 Consistency and standards
 Error prevention
 Recognition rather than recall
 Flexibility and efficiency of use
 Recognition, diagnosis, and recovery from errors
 Help and documentation
 Match between system and real world

Next time
 Heuristic evaluation of your own prototypes
 Bring to class:
    Your materials: sketches, storyboards, working prototype, etc.
    The set of heuristics you want evaluators to use; print them out

Next time
 Each group will demo their prototype(s)
 Evaluation pairings:
    Tuxbo evaluates GWH2
    GWH2 evaluates Easy Finder
    Easy Finder evaluates Tuxbo
 Collect, organize, and rate the severity of problems; include them in your Part 3 writeup