Evaluating User Interfaces: Walkthrough Analysis
Joseph A. Konstan

October 10
- Introduction to Evaluation
- Cognitive Walkthrough
- Other Evaluation Methods

Interface Development Methodology
- Prototype and Iterate
  - keep iterating until it is good enough
  - evaluate along the way to assess progress
- What is Good? What is Good Enough?
  - set usability goals
  - goals should relate to tasks

Casual Iteration
- Find major usability problems
  - missing features
  - user confusion
  - poor interaction
- Try interface with specific tasks
  - first use designers, then move towards users
  - observe overall usage

Casual Iteration
- Remember the goal
  - don't defend the interface
  - don't bias the tests towards the interface
- If possible, allow user exploration
  - may even lead to capturing new tasks
- Consider alternative ways to fix a problem

Limits of Casual Iteration
- Does not indicate when to stop
- Financial trade-offs
- Justification of delay

Usability Goals and Measures
- Concrete, quantitative measures of usability (see the sketch below)
  - learning time
  - use time for specific tasks and users
  - error rates
  - measures of user satisfaction
- Comparative usability goals
  - compare with prior versions or competitors
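
A hedged illustration of what "concrete, quantitative" can look like in practice: the Python sketch below compares measurements from a few hypothetical test sessions against goal thresholds. The task name, numbers, and field names are invented for this example, not taken from the course.

    # Minimal sketch: checking measured results against concrete usability goals.
    # All task names and numbers below are hypothetical.
    goals = {
        "checkout": {"max_time_s": 180, "max_errors": 1, "min_satisfaction": 4.0},
    }

    # Hypothetical observations, one row per participant.
    sessions = [
        {"task": "checkout", "time_s": 150, "errors": 0, "satisfaction": 4.5},
        {"task": "checkout", "time_s": 210, "errors": 2, "satisfaction": 3.5},
        {"task": "checkout", "time_s": 165, "errors": 1, "satisfaction": 4.0},
    ]

    for task, goal in goals.items():
        rows = [s for s in sessions if s["task"] == task]
        mean = lambda key: sum(r[key] for r in rows) / len(rows)
        print(f"{task}: mean time {mean('time_s'):.0f}s (goal <= {goal['max_time_s']}s), "
              f"mean errors {mean('errors'):.1f} (goal <= {goal['max_errors']}), "
              f"mean satisfaction {mean('satisfaction'):.1f} (goal >= {goal['min_satisfaction']})")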

Things to Watch
- Goals should be realistic
  - 100% is never realistic
- Many goals go beyond the application UI
  - training, manuals
- Testing goals should help improve the UI
  - detail, not just good/bad

Exercise: Setting Usability Goals
- In project groups, come up with 2 usability goals for your project
  - discuss the feasibility of testing these goals
    - what is needed for the test?
    - when in the process can they be tested?
    - how much effort, user preparation/training, etc.?
    - what would you learn from the test?

Interface Evaluation
- Goals of interface evaluation
  - find problems
  - find opportunities for improvement
  - determine if the interface is "good enough"

With or Without Users
- Users are expensive and inconsistent
  - usability studies require several users
  - some users provide great information, others little
- Users are users
  - cannot be simulated perfectly
- Best choice: both

Evaluation Without Users
- Quantitative Methods
  - GOMS/keystroke analysis
  - back-of-the-envelope action analysis
- Qualitative Methods
  - expert evaluation
  - cognitive walkthrough
  - heuristic evaluation

Walkthrough Analysis
- Economical interface evaluation
  - low-fidelity prototype
  - development team
    - users optional
- Effective if
  - goal is improvement, not defense
  - some team members skilled
  - proper motivation

Cognitive Walkthrough
- Goals
  - imagine the user's experience
  - evaluate choice-points in the interface
  - detect confusing labels or options
  - detect likely user navigation errors
- Start with a complete TCUID scenario
  - never try to "wing it" on a walkthrough

Tell a Believable Story
- How does the user accomplish the task?
- Action by action
- Based on user knowledge and the system interface

Best Approach
- Work as a group
  - don't partition the task
- Be highly skeptical
  - remember the goal!
- Every gap is an interface problem

Who Should Do the Walkthrough?
- Designers, as an early check
- Team of designers & users
  - remember: the goal is to find problems
  - avoid making it a show
- Skilled UI people may be valuable team members

How Far Along?
- Basic requirements
  - description or prototype of the interface
  - know who the users are (and their experience)
  - a task description
  - a list of actions to complete the task (scenario)
    - DO NOT try to create the action list on the fly!
- Viable once the scenario and interface sketch are completed

How to Proceed
- For each action in the sequence
  - tell the story of why the user will do it
  - ask critical questions (one way to record the answers is sketched below)
    - will the user be trying to produce the effect?
    - will the user see the correct control?
    - will the user see that the control produces the desired effect?
    - will the user select a different control instead?
    - will the user understand the feedback to proceed correctly?
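
Teams often keep a simple per-action record so that every action in the scenario gets every question. The Python sketch below is one possible way to structure such notes; it is not part of the cognitive walkthrough method itself, and the example action, answers, and field names are hypothetical.

    # Minimal sketch of a walkthrough record: one row per action, every
    # critical question answered for that action. Example data is invented.
    QUESTIONS = [
        "Will the user be trying to produce this effect?",
        "Will the user see the correct control?",
        "Will the user see that the control produces the desired effect?",
        "Might the user select a different control instead?",
        "Will the user understand the feedback well enough to proceed?",
    ]

    def blank_record(action):
        """One record per action in the scenario; every question starts unanswered."""
        return {"action": action, "answers": {q: None for q in QUESTIONS}, "problems": []}

    # Hypothetical first step of a "set an alarm" scenario.
    step = blank_record("Tap the '+' button on the clock screen")
    step["answers"][QUESTIONS[1]] = "No"  # control is hidden until the user scrolls
    step["problems"].append("'+' is not visible without scrolling; likely navigation error")

    for question, answer in step["answers"].items():
        print(f"{answer or '??':>3}  {question}")
    print("Problems:", step["problems"])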

Walkthroughs Are Not Perfect
- They won't find every problem
  - limited by nature
    - new users who know what task they need to accomplish
    - biased towards the correct action sequence
  - limited in implementation
    - hard to shed the expertise of evaluators
- A useful tool in conjunction with others

Exercise: Cognitive Walkthrough Analysis
- In non-project groups of 3-5
- Users and task to be announced
- Scenario developed jointly
- Perform the walkthrough
  - identify problems
  - estimate error probabilities (25% intervals)
- Remember who your users are!

GOMS/Keystroke Analysis
- Formal action analysis
  - accurately predict task completion time for skilled users
- Break task into tiny steps
  - keystroke, mouse movement, refocus gaze
  - retrieve item from long-term memory
- Look up average step times (see the estimate sketch below)
  - tables from large experiments
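
To make "look up average step times" concrete, here is a minimal keystroke-level-model style estimate in Python. The operator durations are approximate averages commonly quoted in the KLM literature (exact values vary by source and by user skill), and the task decomposition is a hypothetical example.

    # Rough keystroke-level-model (KLM) estimate. Operator times below are
    # approximate averages from the published KLM literature; they vary by
    # source and user skill. The task decomposition is hypothetical.
    OPERATOR_S = {
        "K": 0.28,  # press a key (average skilled typist)
        "P": 1.10,  # point with the mouse
        "B": 0.10,  # press or release a mouse button
        "H": 0.40,  # move hand between keyboard and mouse
        "M": 1.35,  # mental preparation / retrieve from memory
    }

    # Hypothetical task: pick "Save As..." from a menu, then type an 8-character name.
    sequence = (["M", "H", "P", "B", "B"]          # think, reach for mouse, point, click
                + ["M", "H"] + ["K"] * 8 + ["K"])  # think, back to keyboard, type, Enter

    total = sum(OPERATOR_S[op] for op in sequence)
    print(f"Predicted expert time: {total:.1f} s for {len(sequence)} operators")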

GOMS/Keystroke Analysis
- Primary utility: repetitive tasks
  - e.g., telephone operators
  - benefit: can be very accurate (within 20%)
  - may identify bottlenecks
- Difficulties
  - challenging to decompose accurately
  - long/laborious process
  - not useful with non-experts

Back-of-the-Envelope Action Analysis
- Coarse-grain
  - list basic actions (select menu item)
  - each action is at least 2-3 seconds (see the arithmetic below)
  - what must be learned/remembered?
  - what can be done easily?
  - documentation/training?
- Goal is to find major problems
  - Example: 1950s 35mm camera
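
As a quick illustration of the coarse-grain arithmetic, the Python sketch below just counts basic actions at the 2-3 seconds apiece suggested above; the action list is a hypothetical example.

    # Back-of-the-envelope estimate: count basic actions, allow 2-3 s each.
    # The action list is hypothetical, not from the slides.
    actions = [
        "open the File menu",
        "select 'Export...'",
        "choose PDF in the format list",
        "type a file name",
        "click Save",
    ]
    low, high = 2, 3  # seconds per basic action, per the rule of thumb above
    print(f"{len(actions)} actions: roughly {len(actions) * low}-{len(actions) * high} s "
          "for a practiced user, ignoring system delays")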

Expert Evaluation
- Usability specialists are very valuable
  - double-specialists are even better
- An inexpensive way to get a lot of feedback
- Be sure the expert is qualified in your area

Looking Ahead
- Next week: Heuristic Evaluation
- Walkthroughs due
  - "raw" notes
    - notes from each step of the walkthrough
    - copy of the prototype used, with markups
    - copy of the scenarios used (note changes or fixes)
  - processed results
    - 1-2 pages of issues identified; solutions not needed