
1 Lecture 18: Chapter 9 Evaluation Techniques

2 Evaluation Techniques
Evaluation
– tests usability and functionality of the system
– occurs in the laboratory, in the field and/or in collaboration with users
– evaluates both design and implementation
– should be considered at all stages of the design life cycle

3 Goals of Evaluation
– assess the extent of system functionality
– assess the effect of the interface on the user
– identify specific problems

4 Evaluating Designs (expert-based)
– Cognitive Walkthrough
– Heuristic Evaluation
– Review-based evaluation

5 Cognitive Walkthrough
Proposed by Polson et al.
– evaluates how well the design supports the user in learning a task
– usually performed by an expert in cognitive psychology
– the expert 'walks through' the design, using psychological principles to identify potential problems
Based on the idea of a code walkthrough in conventional software testing
– forms are used to guide the analysis
– can be used to compare alternatives

6 Cognitive Walkthrough (ctd)
For each task, the walkthrough considers:
– what impact will the interaction have on the user?
– what cognitive processes are required?
– what learning problems may occur?
Analysis focuses on goals and knowledge: does the design lead the user to generate the correct goals?

7 Pen-based interface for LIDS
UA1: Press look up button → SD1: Scroll viewpoint up
UA2: Press steering wheel to drive forwards → SD2: Move viewpoint forwards
UA3: Press look down button → SD3: Scroll viewpoint down
(UA = user action, SD = system display)
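To make the walkthrough systematic, the action sequence above can be written down as data and stepped through one action at a time. A minimal sketch in Python (illustrative only; the variable names are invented, not part of the slides):

```python
# The LIDS pen-interface task as ordered (user action, expected system
# display) pairs; a cognitive walkthrough iterates over these in order.
lids_task = [
    ("UA1: Press look up button", "SD1: Scroll viewpoint up"),
    ("UA2: Press steering wheel to drive forwards", "SD2: Move viewpoint forwards"),
    ("UA3: Press look down button", "SD3: Scroll viewpoint down"),
]

for action, display in lids_task:
    print(f"{action}  ->  {display}")
```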

8 Pen interface walkthrough
UA1: Press look up button
1. Is the effect of the action the same as the user's goal at this point? The up button scrolls the viewpoint upwards.
2. Will users see that the action is available? The up button is visible in the UI panel.
3. Once users have found the correct action, will they know it is the one they need? There is a lever with up/down 'looking' symbols, as well as the shape above and below the word 'look'; the user will probably select the right action.
4. After the action is taken, will users understand the feedback they get? The scrolled viewpoint mimics the effect of looking up inside the game environment.

9 Cognitive walkthrough results
Fill out a form
– record the time/date of the walkthrough and who the evaluators were
– for each action, answer the four proforma questions (as per the previous slide; text pp. )
– any negative answer to any question should be documented on a separate Problem Sheet, indicating how severe the evaluators think the problem is and whether they think it will occur often
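As an illustration of what such a form might look like in code, here is a sketch (all class and field names are hypothetical; the four questions are the proforma from the previous slide):

```python
# Sketch of a cognitive walkthrough form: one answer set per user action,
# plus a Problem Sheet entry for every negative answer.
from dataclasses import dataclass, field
from datetime import date

QUESTIONS = (
    "Is the effect of the action the same as the user's goal at this point?",
    "Will users see that the action is available?",
    "Once users have found the correct action, will they know it is the one they need?",
    "After the action is taken, will users understand the feedback they get?",
)

@dataclass
class Problem:
    action: str
    question: str
    severity: int    # evaluators' judgement of how severe the problem is
    frequent: bool   # do the evaluators think it will occur often?
    notes: str = ""

@dataclass
class WalkthroughForm:
    when: date
    evaluators: list
    answers: dict = field(default_factory=dict)    # action -> (bool, bool, bool, bool)
    problems: list = field(default_factory=list)   # the Problem Sheet

    def record(self, action, yes_no, severity=3, frequent=False, notes=""):
        self.answers[action] = yes_no
        for question, ok in zip(QUESTIONS, yes_no):
            if not ok:  # any negative answer is documented separately
                self.problems.append(
                    Problem(action, question, severity, frequent, notes))

form = WalkthroughForm(date.today(), ["evaluator A", "evaluator B"])
form.record("UA1: Press look up button", (True, True, True, True))
```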

10 When to do a Cognitive Walkthrough
Can be done at any stage of the development process, once you have a 'prototype' or an actual system implementation to work on
– can be done with a paper prototype
– can be done with a shrink-wrapped product
Focus on key tasks
– things that are done by most users
– 'critical success factors' of the system
– consider something that matches the name of the product: if it's an email client, do a cognitive walkthrough of the task of writing and sending an email

11 Heuristic Evaluation
Proposed by Nielsen and Molich.
– usability criteria (heuristics) are identified
– the design is examined by experts to see whether these are violated

12 Heuristic Evaluation
Rank problems by severity
– 0 = no usability problem
– 1 = cosmetic: fix if you have extra time
– 2 = minor: fixing is low priority
– 3 = major: important to fix
– 4 = usability catastrophe: imperative to fix
Heuristics such as the 10 from Nielsen
– Visibility of system status
– Match between system and real world
– User control and freedom, etc. (p. )
[remember: these will be used to assess your assignment 1 prototype!]
Heuristic evaluation 'debugs' the design
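A minimal sketch of using this severity scale to prioritise findings so the worst problems get fixed first (the findings below are invented for illustration):

```python
# The 0-4 severity scale from the slide, applied to sort findings.
SEVERITY = {
    0: "no usability problem",
    1: "cosmetic",
    2: "minor",
    3: "major",
    4: "usability catastrophe",
}

findings = [  # (problem description, severity) - made-up examples
    ("Delete has no undo", 4),
    ("Button labels are inconsistent", 2),
    ("Logo slightly off-centre", 1),
]

for problem, sev in sorted(findings, key=lambda f: f[1], reverse=True):
    print(f"[{sev} - {SEVERITY[sev]}] {problem}")
```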

13 Nielsen’s 10
– Visibility of system status
– Match between system and real world
– User control and freedom
– Consistency and standards
– Error prevention
– Recognition rather than recall
– Flexibility and efficiency of use
– Aesthetic and minimalist design
– Help users recognize, diagnose and recover from errors
– Help and documentation
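Heuristic evaluation is normally carried out by several independent experts whose findings are then merged; a simple tally shows which heuristics are violated most often. A sketch with invented report data:

```python
# Tally heuristic violations across evaluators (report data is made up).
from collections import Counter

NIELSEN_10 = [
    "Visibility of system status",
    "Match between system and real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognize, diagnose and recover from errors",
    "Help and documentation",
]

reports = [  # each evaluator lists the heuristics they saw violated
    ["Error prevention", "Consistency and standards"],
    ["Error prevention", "Visibility of system status"],
    ["Consistency and standards", "Error prevention"],
]

tally = Counter(h for report in reports for h in report)
for heuristic in NIELSEN_10:
    if heuristic in tally:
        print(f"{tally[heuristic]} evaluator(s) flagged: {heuristic}")
```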

14 When to use Heuristic Evaluation
Particular advantage: it can be used very early
– from first sketches and outline descriptions
– may head off a mistake rather than having to fix it
Called a 'discount usability' method because it is relatively cheap (it doesn't require a lot of time and effort)

15 Review-based evaluation
Results from the literature are used to support or refute parts of the design
– care is needed to ensure the results are transferable to the new design
Model-based evaluation (e.g., GOMS, keystroke-level model)
– cognitive models are used to filter design options, e.g. a GOMS prediction of user performance (we look at these later in the semester)
Design rationale can also provide useful evaluation information
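To make the keystroke-level idea concrete: a KLM prediction just sums standard operator times over the actions an expert would take. A sketch using the classic Card, Moran and Newell estimates (roughly K = 0.2 s per keystroke, B = 0.1 s per mouse-button press, P = 1.1 s per pointing action, H = 0.4 s per hand move between keyboard and mouse, M = 1.35 s of mental preparation); the example task breakdown is invented:

```python
# Keystroke-level model (KLM): predicted expert time = sum of operator times.
OPERATOR_TIME = {
    "K": 0.20,  # press a key (skilled typist)
    "B": 0.10,  # press or release a mouse button
    "P": 1.10,  # point at a target with the mouse
    "H": 0.40,  # "home" hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def klm_estimate(sequence: str) -> float:
    """Sum operator times for an operator string such as 'MHPBHMKKKK'."""
    return sum(OPERATOR_TIME[op] for op in sequence)

# Think, reach for the mouse, point and click on a field,
# return to the keyboard, think, then type a 4-character code.
print(f"Predicted time: {klm_estimate('MHPB' + 'HM' + 'KKKK'):.2f} s")  # 5.50 s
```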

16 Evaluating through User Participation

17 Laboratory studies
The user is taken out of their normal work environment for a controlled test
Advantages:
– specialist equipment available
– uninterrupted environment
Disadvantages:
– lack of context
– difficult to observe several users cooperating
Appropriate:
– if the system location is dangerous or impractical
– for constrained single-user systems
– to allow controlled manipulation of use

18 Laboratory study with Morae
We've been analyzing a clinical decision support tool
– PREDICT CVD/Diabetes estimates the risk of a 'cardiovascular disease event' (e.g., stroke or heart attack) and gives recommendations for action, as well as information for patients
We're interested in just how GPs and nurses are using the system: how much time they spend on which tasks, and what they convey to the patient
It is difficult to coordinate a detailed study in a real General Practice (consent of the patient to be videotaped, placing instrumentation such as Morae in the practice)
– so we use 'medical actors' and bring the real GP to the Tamaki clinic

19 Field Studies
Advantages:
– natural environment
– context retained (though observation may alter it – see next slide)
– longitudinal studies possible (i.e., over longer periods of time than are feasible for bringing people into a lab)
Disadvantages:
– distractions
– noise
Appropriate:
– where context is crucial and for longitudinal studies

20 Unwanted biases in studies
You can't always take a study result at face value… you must be attentive to what subjects are feeling
Hawthorne effect
– the worker is more productive when observed
John Henry effect
– the worker is [stubbornly] more productive when using his old tools
Placebo effect
– the patient [usually] gets some benefit just because they expect a benefit
Pygmalion effect
– the student performs better simply because they are expected to do so