CS160 Discussion Section: Final Review. David Sun, May 8, 2007.



Design Patterns
Pattern style (presented in class):
1. Pattern Title
2. Context
3. Forces
4. Problem Statement
5. Solution (with solution sketch)
6. Other Patterns to Consider
Tips:
1. Know the pattern format.
2. We are not fussy about terminology, but make sure the description covers the major conceptual components.

Exercise: Design Pattern for…
Pick an object and come up with a design pattern in 15 minutes, e.g.:
– Bike
– Coffee mug
– Desk lamp

Object Action Model
An interaction/cognitive model for how users interact with a system. Elements:
– Task: the universe of objects the user works with and the actions they apply to those objects.
– Interface: metaphoric representations of those objects and actions.

OAI Example: the Calculator (Task level)
– Objects (universe): reals, addition op, first number, second number
– Actions (intention): operations, add two numbers, pick 2 numbers, perform the addition operation

OAI Example: the Calculator (Interface level)
– Objects (metaphor): calculator, buttons, display
– Actions (plan): operate the calculator, write out an equation, press 1, press +, press 2, press =

Infovis
Information tasks:
– Specific fact finding
– Extended fact finding
– Open-ended browsing
– Exploration of availability
Information search follows a 4-phase pattern:
1. Formulation
2. Action
3. Review of results
4. Refinement

Infovis
Tasks for a visualization system:
1. Overview: get an overview of the collection
2. Zoom: zoom in on items of interest
3. Filter: remove uninteresting items
4. Details on demand: select items and get details
5. Relate: view relationships between items
6. History: keep a history of actions for undo, replay, refinement
7. Extract: make subcollections

Infovis
Some key concepts:
– Query building: visual builders and QBE
– Multidimensional scaling
– Focus + context: distortion, fish-eye lenses, overview + details
– Network visualization
– Animation
– 3D

User Testing

Evaluation Methodologies
Expert analysis:
– Cognitive walkthrough
– Heuristic evaluation
– Model-based evaluation (GOMS)
User participation:
– Lab studies
– Field studies

Ethical Considerations
Sometimes tests can be distressing:
– users have left in tears (embarrassed by mistakes)
You have a responsibility to alleviate this:
– make participation voluntary, with informed consent
– avoid pressure to participate
– let participants know they can stop at any time [Gomoll]
– stress that you are testing the system, not them
– make collected data as anonymous as possible
You must often get human-subjects approval.

Measuring User Preference
How much users like or dislike the system:
– can ask them to rate it on a scale of 1 to 10
– or have them choose among statements: “best UI I’ve ever…”, “better than average”…
– hard to be sure what the data will mean: novelty of the UI, feelings, unrealistic setting, etc.
If many give you low ratings, you are in trouble.
You can get some useful data by asking:
– what they liked, disliked, where they had trouble, best part, worst part, etc. (redundant questions)

Comparing Two Alternatives
Between-groups experiment:
– two groups of test users
– each group uses only one of the systems
Within-groups experiment:
– one group of test users
– each person uses both systems
– can’t use the same tasks or order (learning effects)
– best for low-level interaction techniques
Between-groups experiments require many more participants than within-groups experiments.
Check whether differences are statistically significant:
– assumes a normal distribution & the same std. dev.
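The within-groups significance check above can be sketched as a paired t-test, since each participant uses both systems. This is a minimal sketch using only the standard library; the task-completion times are hypothetical.

```python
import math
import statistics

def paired_t(times_a, times_b):
    """Paired t-statistic for a within-groups experiment:
    each participant completed the task on both systems A and B."""
    diffs = [a - b for a, b in zip(times_a, times_b)]
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)          # sample std. dev. of the differences
    n = len(diffs)
    return mean_d / (sd_d / math.sqrt(n))   # compare against a t-table, df = n - 1

# Hypothetical task-completion times (seconds) for 5 participants
t = paired_t([48, 52, 60, 45, 58], [40, 47, 55, 42, 50])
```

A large |t| (relative to the t-distribution with n-1 degrees of freedom) suggests the difference between the systems is real rather than noise.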

Experimental Details
Order of tasks:
– choose one simple order (simple → complex), unless doing a within-groups experiment
Training:
– depends on how the real system will be used
What if someone doesn’t finish?
– assign a very large time & a large # of errors
Pilot study:
– helps you fix problems with the study
– do two: first with colleagues, then with real users
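The “didn’t finish” rule above can be captured in a tiny helper so every session still enters the analysis. A sketch; the cap values are hypothetical choices, not course-mandated numbers.

```python
# Hypothetical caps assigned when a participant gives up before finishing
TIME_CAP = 600    # seconds
ERROR_CAP = 20

def record_result(finished, seconds, errors):
    """Cap unfinished sessions with a very large time and error count,
    so they can be included in the aggregate statistics."""
    if not finished:
        return TIME_CAP, ERROR_CAP
    return seconds, errors
```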

Errors and Help

Types of Errors
Mistakes:
– The user intended to do what they did, and it led to an error. The user would probably do the same thing again.
Slips:
– The user did not mean to do what they did. They can recover by doing it differently next time.
– Slips are not just for beginners: experts often make them because they devote less conscious attention to the task.

Minimizing Errors
User errors:
– Use intuitive command names (from the user’s domain of knowledge).
– Include short explanations as tool tips.
– Put longer explanations in the help system.
Recognition over recall:
– It’s easier to select a file icon from a folder than to remember and type in the filename.
– Auto-completion can help.
Use appropriate representations:
– E.g., a graphical file selector is good for choosing individual files.
– Textual file names support automation and richer organization (using command-line options).
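The recognition-over-recall point above is what prefix auto-completion provides: the user types a fragment and recognizes the right name in the matches. A minimal sketch; the filenames are hypothetical.

```python
def complete(prefix, names):
    """Return the names matching a typed prefix, sorted for display.
    The user recognizes the target instead of recalling it exactly."""
    return sorted(n for n in names if n.startswith(prefix))

# Hypothetical directory listing
files = ["report.txt", "report_final.txt", "notes.md"]
matches = complete("rep", files)
```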

Description Errors
Description error:
– The action is insufficiently specified by the user.
– The user may not know all the command-line switches, or all the installation options for a program.
Solution:
– Warn the user that the command is ambiguous or “unusual”.
– Provide help about options in several standard ways.
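The “warn when ambiguous” solution above can be sketched as abbreviated-command resolution: accept a unique prefix, but refuse an under-specified one and show the candidates. The command names are hypothetical.

```python
def resolve(cmd, commands):
    """Match a possibly abbreviated command name; warn when ambiguous."""
    matches = [c for c in commands if c.startswith(cmd)]
    if len(matches) == 1:
        return matches[0]
    if not matches:
        raise ValueError(f"unknown command: {cmd!r}")
    # Under-specified: tell the user what it could have meant
    raise ValueError(f"ambiguous command {cmd!r}: could be {matches}")

# Hypothetical command set
cmds = ["install", "inspect", "init", "remove"]
```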

Capture Errors
Capture error (a.k.a. the tongue-twister error):
– Command sequences overlap, and one is more common.
– The user reflexively does the common one when trying to do the unusual one.
– E.g., try typing “soliton” very fast.
Solution:
– Be aware of, and test for, this error. Try different command names.

Mode Errors
Mode errors:
– The user forgets what mode they’re in and issues the command appropriate for another mode.
– Digital watches, VCRs, etc.
Contributing factors:
– There aren’t enough command keys for all the operations, so the mode determines what each button does.
– There isn’t enough display space to provide strong feedback about the mode.

Mode Errors
Solutions:
– Strive for consistent behavior of buttons across modes.
– Provide display feedback about the behavior of keys in the current mode.
– Provide an option for scrolling help tips if possible.
– Allow the device to be programmed externally (e.g., from a PC via Bluetooth).
– If you don’t have a tiny screen, make the context clear: use color, tabs, navigation graphics, etc. to show the user “where” they are in the interface.
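The first two solutions above, consistent per-mode button behavior plus visible feedback, can be sketched as an explicit mode-to-action table. The watch modes and button meanings here are hypothetical.

```python
# Hypothetical digital watch: each mode reinterprets the two buttons
KEYMAP = {
    "time":  {"A": "toggle 12/24h", "B": "enter set mode"},
    "set":   {"A": "increment field", "B": "next field"},
    "alarm": {"A": "toggle alarm", "B": "enter set mode"},
}

def press(mode, button):
    """Look up the button's meaning in the current mode and echo the
    mode on screen, so the user always sees 'where' they are."""
    action = KEYMAP[mode][button]
    return f"[{mode.upper()}] {action}"
```

Keeping the mapping in one table also makes it easy to audit for inconsistencies across modes (e.g., button B should mean “enter set mode” everywhere it can).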

Detecting Errors
The earlier the better:
– Check for consistency whenever possible (“asserts” for user input).
– If there’s a high risk of error, check for unusual input, or for common slips (spelling correction).
– E.g., Google’s “did you mean XX?” response.
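A “did you mean?” response of the kind mentioned above is commonly built on edit distance: suggest the known term closest to what the user typed, if it is close enough. A minimal sketch (not Google’s actual algorithm); the vocabulary is hypothetical.

```python
def edit_distance(a, b):
    """Levenshtein distance via dynamic programming over two rows."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,              # deletion
                           cur[j - 1] + 1,           # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def did_you_mean(word, vocabulary, max_dist=2):
    """Suggest the closest known term, or None if nothing is close."""
    best = min(vocabulary, key=lambda v: edit_distance(word, v))
    return best if edit_distance(word, best) <= max_dist else None
```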

Help
Types of help:
– Task-specific
– Quick reference
– Full explanation
– Tutorial
Key concepts:
– Sandboxing
– Context-sensitive help
– Adaptive help