Evaluation in HCI Angela Kessell Oct. 13, 2005

Evaluation
– Heuristic Evaluation: a “discount usability engineering method”
– Measuring API Usability: usability applied to APIs
– Methodology Matters: Doing Research in the Behavioral and Social Sciences: designing, carrying out, and evaluating human-subjects studies

Heuristic Evaluation
Jakob Nielsen: Most usability engineering methods will contribute substantially to the usability of an interface… if they are actually used.

Heuristic Evaluation
What is it? A discount usability engineering method:
– Easy (can be taught in a ½-day seminar)
– Fast (about a day for most evaluations)
– Cheap (relative to full user testing)

Heuristic Evaluation
How does it work?
– Evaluators use a checklist of basic usability heuristics
– Evaluators go through the interface twice:
  1st pass: get a feel for the flow and general scope
  2nd pass: refer to the checklist of usability heuristics and focus on individual elements
– The findings of all evaluators are combined and assessed (see the sketch below)
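
The “combine and assess” step is mechanical enough to sketch. Here is a minimal Python illustration (all problem descriptions and reports are hypothetical) of merging independent evaluators’ findings and counting how many evaluators hit each problem:

    # Hypothetical sketch: merge findings from independent evaluators.
    # Each evaluator reports (problem description, violated heuristic).
    from collections import Counter

    evaluator_reports = [
        [("No undo after delete", "Clearly marked exits"),
         ("Raw error code shown to user", "Precise and constructive error messages")],
        [("No undo after delete", "Clearly marked exits")],
        [("Save gives no confirmation", "Feedback"),
         ("No undo after delete", "Clearly marked exits")],
    ]

    hits = Counter(problem for report in evaluator_reports
                   for problem, _heuristic in report)
    for problem, n in hits.most_common():
        print(f"found by {n}/{len(evaluator_reports)} evaluators: {problem}")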

Heuristic Evaluation
Usability Heuristics (original, unrevised list; captured as data in the sketch below):
– Simple and natural dialogue
– Speak the users’ language
– Minimize the users’ memory load
– Consistency
– Feedback
– Clearly marked exits
– Shortcuts
– Precise and constructive error messages
– Prevent errors
– Help and documentation
COMMENTS?
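
For the second pass, the checklist itself can live in code as a simple evaluation form. A sketch (the function name is my own, not part of any standard tool):

    # Nielsen's original ten heuristics as a reusable checklist.
    NIELSEN_HEURISTICS = (
        "Simple and natural dialogue",
        "Speak the users' language",
        "Minimize the users' memory load",
        "Consistency",
        "Feedback",
        "Clearly marked exits",
        "Shortcuts",
        "Precise and constructive error messages",
        "Prevent errors",
        "Help and documentation",
    )

    def new_evaluation_form() -> dict:
        """Empty notes list per heuristic, filled in during the 2nd pass."""
        return {heuristic: [] for heuristic in NIELSEN_HEURISTICS}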

Heuristic Evaluation
– One expert won’t do; you need several evaluators (Nielsen suggests 3–5)
– The exact number needed depends on a cost-benefit analysis (see the model below)
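
The cost-benefit reasoning behind “3–5 evaluators” comes from Nielsen and Landauer’s finding-rate model: the expected share of problems found by i evaluators is 1 − (1 − λ)^i, where λ is the probability that one evaluator finds an average problem (about 0.31 in their data). A quick computation:

    # Nielsen & Landauer (1993): expected fraction of all usability
    # problems found by i independent evaluators.
    DETECTION_RATE = 0.31  # average per-evaluator rate; varies by project

    def share_found(i, detection_rate=DETECTION_RATE):
        return 1 - (1 - detection_rate) ** i

    for i in (1, 3, 5, 10):
        print(f"{i:2d} evaluators -> {share_found(i):.0%} of problems")
    # -> 31%, 67%, 84%, 98%: past ~5 evaluators the marginal gain is small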

Heuristic Evaluation
Who are these evaluators?
– Typically not domain experts / real users
– No official “usability specialist” certification exists
– Optimal performance requires “double experts”: evaluators with expertise in both usability and the application domain

Heuristic Evaluation
Debriefing session
– Conducted in brainstorming mode
– Evaluators rate the severity of all problems identified (a rating sketch follows)
– Ratings use an absolute 0–4 scale:
  0 = I don’t agree that this is a problem at all
  1 = Cosmetic problem only
  2 = Minor problem; low priority
  3 = Major problem; high priority
  4 = Usability catastrophe; imperative to fix
COMMENTS?
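
Nielsen suggests taking the mean of the individual severity ratings, since single-evaluator ratings are unreliable. A small sketch (problems and scores are invented):

    # Aggregate 0-4 severity ratings from the debriefing session and
    # rank problems by mean severity. All data here is hypothetical.
    from statistics import mean

    ratings = {  # problem -> one rating per evaluator
        "No feedback after pressing Save": [3, 4, 3],
        "Jargon in the error dialog": [2, 1, 2],
        "No way to cancel a bulk delete": [4, 4, 3],
    }

    for problem, scores in sorted(ratings.items(),
                                  key=lambda kv: mean(kv[1]),
                                  reverse=True):
        print(f"severity {mean(scores):.1f}: {problem}")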

Heuristic Evaluation
How does H.E. differ from user testing?
– Evaluators have checklists
– Evaluators are not the target users
– Evaluators decide on their own how they want to proceed
– The observer can answer evaluators’ questions about the domain or give hints for using the interface
– Evaluators say what they didn’t like and why; the observer doesn’t have to interpret evaluators’ actions

Heuristic Evaluation
What are the shortcomings of H.E.?
– It identifies usability problems without indicating how they are to be fixed: “Ideas for appropriate redesigns have to appear magically in the heads of designers on the basis of their sheer creative powers.”
– It cannot be expected to address all usability issues, since evaluators are not domain experts / actual users

Measuring API Usability Steven Clarke

Measuring API Usability
– User-centered design approach: understand both your users and the way they work
– Scenario-based design approach: ensures the API reflects the tasks that users want to perform
– Use the Cognitive Dimensions framework

Measuring API Usability
The Cognitive Dimensions framework describes:
– What users expect
– What the API actually provides
It also provides:
– A common vocabulary for developers
– Attention drawn to important aspects of the design
The dimensions: abstraction level, learning style, working framework, work-step unit, progressive evaluation, premature commitment, penetrability, API elaboration, API viscosity, consistency, role expressiveness, domain correspondence
COMMENTS?

Measuring API Usability
Use personas:
– Profiles describing the stereotypical behavior of the three main developer groups (opportunistic, pragmatic, systematic)
– Compare the API evaluation against each profile’s requirements (sketched below)
COMMENTS?
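
Clarke’s comparison step can be pictured as matching the API’s measured position on each cognitive dimension against the position a persona prefers. A hypothetical sketch (the dimension scores and mismatch threshold below are invented for illustration):

    # Hypothetical sketch of comparing an API's cognitive-dimension
    # profile against a persona's preferred profile (scores 1-5).
    api_profile = {
        "abstraction level": 4,       # framework-heavy
        "progressive evaluation": 2,  # hard to run partial code
        "penetrability": 3,
    }

    opportunistic_persona = {         # explores, wants quick feedback
        "abstraction level": 2,
        "progressive evaluation": 5,
        "penetrability": 4,
    }

    for dim, wanted in opportunistic_persona.items():
        got = api_profile[dim]
        if abs(got - wanted) >= 2:    # arbitrary mismatch threshold
            print(f"mismatch on {dim}: API={got}, persona prefers ~{wanted}")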

Methodology Matters: Doing Research in the Behavioral and Social Sciences Joseph McGrath

Methodology Matters: Doing Research in the Behavioral and Social Sciences
Key points:
– All methods are valuable, but all have limitations and weaknesses
– Offset the weaknesses by using multiple methods

Methodology Matters: Doing Research in the Behavioral and Social Sciences
In conducting research, try to maximize:
– Generalizability
– Precision
– Realism
You cannot maximize all three simultaneously.

Methodology Matters: Doing Research in the Behavioral and Social Sciences
[Figure from McGrath; source credit truncated in the transcript]

So…
– The first two papers focus on computer programs / GUIs
– The third paper presents the whole gamut of methodologies available for studying any human behavior

But… what’s missing?

But…
– Where are the statistics?
– Are there objective “right” answers in HCI?
– How do we evaluate other kinds of interfaces?
– Other thoughts on what’s missing?

How do we evaluate…
“Embodied virtuality” / ubiquitous computing “interfaces” (Aura video)
Try to pick out one capability presented, and think about how you might evaluate it.

Evaluating Aura
– Do we evaluate the whole system at once, or bit by bit?
– Where / what is the interface?
– Is anyone not a target user?
