Presentation transcript:

Discount Evaluation: Evaluating with experts

Agenda
- Online collaboration tools
- Heuristic Evaluation
  - Perform HE on each other's prototypes
- Cognitive Walkthrough
  - Perform CW on each other's prototypes

Project part 3
- See any comments in your writeups
- I'll send out grades tomorrow (I have to add the demo piece)
- Perform the evaluations
  - Clearly inform your users what you are doing and why
  - If you are audio or video recording, I prefer you use a consent form
  - Pilot at least once, so you know how long it's going to take
  - If you need equipment, ask me

Discount Evaluation Techniques
Basis:
- Observing users can be time-consuming and expensive
- Try to predict usability rather than observing it directly
- Conserve resources (quick & low cost)

Approach: inspections
- Expert reviewers are used: HCI experts interact with the system, try to find potential problems, and give prescriptive feedback
- Best if reviewers:
  - Haven't used an earlier prototype
  - Are familiar with the domain or task
  - Understand user perspectives
- Does not require a working system

Heuristic Evaluation
- Developed by Jakob Nielsen
- Several expert usability evaluators assess the system based on simple and general heuristics (principles or rules of thumb)

How many experts?
- Nielsen found that about 5 evaluators find roughly 75% of the problems
- Above that you find more, but with decreasing efficiency
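The diminishing returns come from Nielsen and Landauer's model, which estimates the share of problems found by i evaluators as 1 - (1 - λ)^i. A minimal sketch, assuming λ ≈ 0.31 (a literature-typical per-evaluator detection rate; real rates vary by system and evaluator skill):

```python
def proportion_found(i, lam=0.31):
    """Expected share of usability problems found by i evaluators,
    per the Nielsen-Landauer model; lam is the per-evaluator
    detection rate (0.31 is an assumed, literature-typical value)."""
    return 1 - (1 - lam) ** i

for i in (1, 3, 5, 10):
    print(f"{i:2d} evaluators -> {proportion_found(i):.0%}")
```

Under these assumptions, five evaluators already cover the bulk of the problems, and each additional evaluator adds progressively less.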

Evaluate System
- Reviewers need a prototype; it may vary from mock-ups and storyboards to a working system
- Reviewers evaluate the system against high-level heuristics
- Where to get heuristics?

Heuristics
- Use simple and natural dialog
- Speak the user's language
- Minimize memory load
- Be consistent
- Provide feedback
- Provide clearly marked exits
- Provide shortcuts
- Provide good error messages
- Prevent errors

Nielsen's Heuristics
- Visibility of system status
- Aesthetic and minimalist design
- User control and freedom
- Consistency and standards
- Error prevention
- Recognition rather than recall
- Flexibility and efficiency of use
- Help users recognize, diagnose, and recover from errors
- Help and documentation
- Match between system and the real world

Groupware heuristics (Baker, Greenberg, and Gutwin, CSCW 2002)
- Provide the means for intentional and appropriate verbal communication
- Provide the means for intentional and appropriate gestural communication
- Provide consequential communication of an individual's embodiment
- Provide consequential communication of shared artifacts (i.e., artifact feedthrough)
- Provide protection
- Manage the transitions between tightly and loosely coupled collaboration
- Support people with the coordination of their actions
- Facilitate finding collaborators and establishing contact

Ambient heuristics (Mankoff et al., CHI 2003)
- Useful and relevant information
- "Peripherality" of display
- Match between design of ambient display and environments
- Sufficient information design
- Consistent and intuitive mapping
- Easy transition to more in-depth information
- Visibility of state
- Aesthetic and pleasing design

Process
- Perform two or more passes through the system, inspecting:
  - Flow from screen to screen
  - Each screen
- Evaluate against the heuristics
- Find "problems"
  - Subjective (if you think it is, it is)
  - Don't dwell on whether it is or isn't
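A minimal way to capture findings during the passes, keyed by screen for the later debrief (the screen names, heuristics, and notes below are illustrative, not from any real evaluation):

```python
# Record each problem as it is found while stepping through the prototype.
problems = []

def log_problem(screen, heuristic, note):
    """Append one finding; severity is assigned later, at the debrief."""
    problems.append({"screen": screen, "heuristic": heuristic, "note": note})

log_problem("search results", "visibility of system status",
            "no indication that a query is running")
log_problem("checkout", "clearly marked exits",
            "no way to cancel without losing the cart")

# Second pass over the log: group findings by screen for the debriefing.
by_screen = {}
for p in problems:
    by_screen.setdefault(p["screen"], []).append(p)
print({screen: len(ps) for screen, ps in by_screen.items()})
```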

Debriefing
- Organize all problems found by the different reviewers
- At this point, decide what are and aren't problems
- Group and structure them
- Document and record them

Severity Rating
- Based on frequency, impact, persistence, market impact
- Rating scale:
  - 0: not a problem
  - 1: cosmetic issue, only fixed if there is extra time
  - 2: minor usability problem, low priority
  - 3: major usability problem, high priority
  - 4: usability catastrophe, must be fixed
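One common way to tally the debrief is to average each problem's 0-4 severity across evaluators and sort so the catastrophes surface first. A sketch with made-up problem names and ratings:

```python
from statistics import mean

# Each problem maps to the 0-4 severity each evaluator assigned it
# (hypothetical data; real ratings come from the debriefing session).
ratings = {
    "no exit from checkout dialog": [4, 3, 4],
    "inconsistent button labels":   [2, 2, 3],
    "low-contrast link color":      [1, 0, 1],
}

# Sort by mean severity, highest first, to build the fix-priority list.
priority = sorted(ratings.items(), key=lambda kv: mean(kv[1]), reverse=True)
for problem, scores in priority:
    print(f"{mean(scores):.1f}  {problem}")
```

Averaging is one reasonable aggregation choice; a team could equally take the maximum so that any evaluator's "catastrophe" vote keeps a problem at the top.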

Advantages
- Few ethical issues to consider
- Inexpensive and quick
- A reviewer practiced in the method and knowledgeable about the domain adds a lot of value

Challenges
- Very subjective assessment of problems; depends on the expertise of the reviewers
- Why are these the right heuristics? Others have been suggested
- How to determine what is a true usability problem: some recent papers suggest that many identified "problems" really aren't

Your turn: Library – main page, finding a book
- Use Nielsen's heuristics (p 686)
- List all problems
- Come up to the board and put up at least one new one
- We'll rate them as a group

Your turn
- Prepare: determine your heuristics if you want any different ones; demo your prototype to all of us
- Evaluate: evaluate someone other than yourself
- Debrief: on your own; include in Part 3

Cognitive Walkthrough: More evaluation without users

Cognitive Walkthrough
- Assess learnability and usability by simulating the way novice users explore and become familiar with an interactive system
- A usability "thought experiment"
- Like a code walkthrough (software engineering)
- From Polson, Lewis, et al. at UC Boulder

CW: Process
- Construct carefully designed tasks from the system spec or screen mock-up
- Walk through the (cognitive and operational) activities required to go from one screen to another
- Review the actions needed for the task; attempt to predict how users would behave and what problems they'll encounter

CW: Assumptions
- The user has a rough plan
- The user explores the system, looking for actions that will contribute to performing the task
- The user selects the action that seems best for the desired goal
- The user interprets the response and assesses whether progress has been made toward completing the task

CW: Requirements
- Description of the users and their backgrounds
- Description of the task the user is to perform
- Complete list of the actions required to complete the task
- Prototype or description of the system

CW: Methodology
- Step through the action sequence:
  - Action 1: Response A, B, ...
  - Action 2: Response A, ...
- For each one, ask the four questions and try to construct a believable success (or failure) story

CW: Questions
1. Will users be trying to produce whatever effect the action has?
2. Will users be able to notice that the correct action is available? (is it visible?)
3. Once found, will they know it's the right one for the desired effect? (is it correct?)
4. Will users understand the feedback after the action?
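The four questions can be recorded per action in a simple structure so that every "no" becomes a failure story to write up. A sketch (the class, field names, and the sample library-catalog step are my own illustration, not part of the method):

```python
from dataclasses import dataclass

QUESTIONS = (
    "Will users try to produce this effect?",
    "Will users notice the action is available?",
    "Will users know it's the right action?",
    "Will users understand the feedback?",
)

@dataclass
class ActionReview:
    action: str
    answers: list  # one (yes_or_no, evidence) pair per question, in order

    def failures(self):
        """Return the questions answered 'no': failure stories to document."""
        return [q for q, (ok, _) in zip(QUESTIONS, self.answers) if not ok]

# Hypothetical walkthrough step for a library-catalog task:
step = ActionReview(
    action="Click the 'Books, Catalog, Videos' link",
    answers=[(True, "part of the original task"),
             (False, "link label doesn't mention searching for a book"),
             (True, "all other links look wrong"),
             (True, "search page appears immediately")],
)
print(step.failures())
```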

CW: Answering the Questions
1. Will the user be trying to produce the effect?
- Typical supporting evidence:
  - It is part of their original task
  - They have experience using the system
  - The system tells them to do it
- No evidence? Construct a failure scenario; explain and back up your opinion

CW: Next Question
2. Will the user notice the action is available?
- Typical supporting evidence:
  - Experience
  - A visible device, such as a button
  - A perceivable representation of an action, such as a menu item

CW: Next Question
3. Will the user know it's the right one for the effect?
- Typical supporting evidence:
  - Experience
  - The interface provides a visual item (such as a prompt) to connect the action to the resulting effect
  - All other actions look wrong

CW: Next Question
4. Will the user understand the feedback?
- Typical supporting evidence:
  - Experience
  - Recognizing a connection between a system response and what the user was trying to do

Let's practice: Library web site
- Users: student users, familiar with computers and libraries, but new to this web site
- Task: find Don Norman books in the library
  1. Click on the Books, Catalog, Videos link
  2. Search by: choose Author
  3. Type "Norman, Don" and click Search
  4. Click on the Norman, Donald link
  5. Click on the design of everyday things link
  6. Write down the call number and what floor it's on

CW: Questions
1. Will users be trying to produce whatever effect the action has?
2. Will users be able to notice that the correct action is available? (is it visible?)
3. Once found, will they know it's the right one for the desired effect? (is it correct?)
4. Will users understand the feedback after the action?

CW Summary
Advantages:
- Explores an important characteristic: learnability
- Novice perspective
- Detailed, careful examination
- A working prototype is not necessary
Disadvantages:
- Can be time consuming
- May find problems that aren't really problems
- Narrow focus; may not evaluate the entire interface

Your turn
- Prepare: describe the users and their experience; determine task(s) and action lists
- Evaluation: choose the prototype you didn't evaluate last time; as a group, for each action, answer the 4 questions with evidence

CW: Questions
1. Will users be trying to produce whatever effect the action has?
2. Will users be able to notice that the correct action is available? (is it visible?)
3. Once found, will they know it's the right one for the desired effect? (is it correct?)
4. Will users understand the feedback after the action?