Task-Centered User Interface Design
– who are the users?
– what are the tasks?
– plagiarize!
– iterative design: rough descriptions, mock-ups, prototypes
– test the design with and without users

Users & Tasks
– real people, not hypothetical users
– specific tasks
  – concrete, detailed examples
  – design independent
  – scrutinize the edges
– use the tasks in the design
– create scenarios (design dependent)

Creating the Initial Design
– intelligent borrowing
  – incorporate ideas from other applications
  – copy specific interaction techniques
– only then consider new solutions

Graphic Design Principles
– clustering
– visibility reflects usefulness
– intelligent consistency
– colour as a supplement
– reduced clutter

Clustering

Visibility Reflects Usefulness

Intelligent Consistency

Colour as a Supplement

Reduced Clutter

Evaluating the Design Without Users
– respect users' time
– good evaluation can catch problems that testing with only a few users would miss
– 3 approaches:
  – cognitive walkthrough
  – action analysis
  – heuristic evaluation

Cognitive Walkthrough
– tell a believable story about a user using your design
– choose one of the design tasks described earlier
– articulate each of the thoughts and actions needed to complete the task

Action Analysis
– formal, or keystroke-level, analysis
  – extreme detail allows task completion times to be predicted to within about 20 percent
  – isn't easy to do
  – works at the level of individual keystrokes (see the sketch below)
  – GOMS modelling (Card, Moran & Newell)
– back-of-the-envelope analysis
  – doesn't take a lot of effort
  – detects large-scale problems
  – works at the level of an 'action' (2-3 seconds)
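
A formal keystroke-level estimate is, at bottom, a sum of standard operator times. Below is a minimal Python sketch, assuming the commonly cited Card, Moran & Newell operator averages; exact values vary by source, so treat them as rough defaults rather than measurements of your own users.

    # Back-of-the-envelope keystroke-level model (KLM) estimate.
    # Operator times are commonly cited averages, not measured values.
    KLM_SECONDS = {
        "K": 0.28,  # press a key or button (average skilled typist)
        "P": 1.10,  # point with the mouse at an on-screen target
        "H": 0.40,  # "home" the hands between keyboard and mouse
        "M": 1.35,  # mental preparation before an action
    }

    def estimate_seconds(ops: str) -> float:
        """Sum the operator times for a sequence such as 'MHPK'."""
        return sum(KLM_SECONDS[op] for op in ops)

    # Example: decide (M), reach for the mouse (H), point (P), click (K),
    # return to the keyboard (H), type a two-letter command (KK).
    print(estimate_seconds("MHPKHKK"))  # about 4.1 seconds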

Heuristic Analysis
– general principles that can guide design
– Jakob Nielsen and Rolf Molich: nine general heuristics
– a panel of experienced evaluators can catch about 75% of the problems that violate a heuristic (an average of past results; see the sketch below)
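
The 75% figure describes several evaluators pooled together, not a single reviewer. Nielsen and Landauer later modelled the pooling as found(i) = N(1 - (1 - L)^i), where N is the total number of problems and L is the chance that one evaluator finds a given problem. A small Python sketch, assuming a typical hit rate of L = 0.3 (real projects vary widely):

    # Cumulative problem discovery as evaluators are added.
    # hit_rate = 0.3 is a commonly quoted average, assumed here for illustration.
    def problems_found(n_problems: int, n_evaluators: int, hit_rate: float = 0.3) -> float:
        return n_problems * (1 - (1 - hit_rate) ** n_evaluators)

    for i in (1, 3, 5):
        share = problems_found(100, i) / 100
        print(f"{i} evaluator(s): ~{share:.0%} of problems found")
    # 1 evaluator: ~30%; 3 evaluators: ~66%; 5 evaluators: ~83%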

The Nine Heuristics
– simple and natural dialog
– speak the user's language
– minimize the user's memory load
– be consistent
– provide feedback
– provide clearly marked exits
– provide shortcuts
– provide good error messages
– prevent errors

3 Forms of Analysis
– cognitive walkthrough and formal analysis are task-oriented
  – problems show up in the context of the job
  – BUT: coverage is limited to the chosen tasks
  – cross-task interactions are not identified
– heuristic analysis can compensate for this

Testing the Design With Users
– choose real users
– choose tasks that reflect real tasks (ideally the ones identified earlier)
– use mock-ups (low-fidelity prototypes) and prototypes
– Wizard of Oz techniques

Wizard of Oz Technique

Collecting the Data
– process data
  – observations of users (what they are doing and thinking)
  – qualitative
– bottom-line data
  – a summary of what happened (see the sketch below)
  – quantitative
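
Bottom-line data usually reduces to a few descriptive statistics over the measurements. A minimal sketch, with invented completion times purely for illustration:

    import statistics

    # Hypothetical task completion times (seconds) for eight test users.
    times = [42, 51, 38, 95, 47, 44, 60, 49]

    print("mean:  ", round(statistics.mean(times), 1))
    print("median:", statistics.median(times))  # robust to the one slow user
    print("stdev: ", round(statistics.stdev(times), 1))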

Thinking Aloud Method
– ask users to talk as they perform the task
– give users categories of potential comments
  – things they find confusing
  – decisions they are making
  – ...
– ensure users know that they are NOT being tested; the system is
– mind ethics and privacy

Role of the Observer
– prompt for comments
– help only when absolutely necessary
  – beware of shaping responses
  – make a note of any help given
– record the session
  – take notes
  – video/audio
  – the system itself (logging)

Summarizing the Data
– based on the data, update your analysis of the tasks
  – were users interacting with the system as expected?
  – was anything missed in the cognitive walkthrough?
– prioritize errors and difficulties (see the sketch below) by
  – importance
  – difficulty of fix
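
One simple way to prioritize is to rank problems by importance first and by cost of fixing second, so that high-value, cheap fixes rise to the top. A sketch with hypothetical entries:

    # Hypothetical problem log; importance and fix_cost are rough 1-3 ratings.
    problems = [
        {"issue": "users miss the Save button",    "importance": 3, "fix_cost": 1},
        {"issue": "undo is buried two menus deep", "importance": 2, "fix_cost": 2},
        {"issue": "jargon in the error dialog",    "importance": 3, "fix_cost": 3},
    ]

    # Highest importance first; among equals, cheapest fix first.
    for p in sorted(problems, key=lambda p: (-p["importance"], p["fix_cost"])):
        print(p["issue"])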

Choosing Between Design Alternatives
– bottom-line data is more relevant for time-critical systems and when comparing designs
– conduct a usability study
  – simplest approach: a between-groups experiment (see the sketch below)
  – pilot studies are necessary
  – there is a tension between realism and low variability
  – question users after the test
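
In a between-groups experiment each group of users sees one design, and the bottom-line comparison reduces to a two-sample test on the per-group measurements. A minimal sketch, assuming SciPy is available and using invented timing data:

    from scipy import stats

    design_a = [42, 51, 38, 47, 44, 60, 49, 55]  # seconds per task, group A
    design_b = [35, 40, 33, 46, 38, 41, 37, 44]  # seconds per task, group B

    # Welch's t-test; a small p-value suggests a real difference in means.
    t, p = stats.ttest_ind(design_a, design_b, equal_var=False)
    print(f"t = {t:.2f}, p = {p:.3f}")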

Conclusion: Task-Centered User Interface Design
– use real users and specific tasks
– at each iteration, first test the design without users, then with users
– feed the analysis of the test data into the next iteration of the design
– qualitative (process) data is more critical than quantitative (bottom-line) data, unless you are comparing designs or building a time-critical system