Usability and Evaluation Dov Te’eni

Figure 7-2: Attitudes, use, performance and satisfaction. The diagram links attitudes (perceived usability and perceived usefulness), intentions and behavior (use), performance, and satisfaction.

Usability indicators based on performance

1. Goal achievement indicators (success rate, failure rate, accuracy, effectiveness).
2. Work rate indicators (speed, completion rate, efficiency, productivity, productivity gain).
3. Operability indicators of the user's ability to make use of the system's features (error rate, problem rate, function usage).
4. Knowledge acquisition indicators of the user's ability and effort in learning to use the system (learnability and learning period).
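
As a rough sketch of how these indicator families translate into numbers, the fragment below computes one illustrative indicator from each family for a single test session. The Session fields and the particular formulas chosen are assumptions for the example, not definitions from the chapter.

from dataclasses import dataclass

@dataclass
class Session:
    tasks_attempted: int
    tasks_succeeded: int
    minutes_worked: float
    actions: int              # user commands/clicks issued
    errors: int
    trials_to_criterion: int  # practice trials until error-free performance

def indicators(s: Session) -> dict:
    return {
        # 1. Goal achievement: success rate
        "success_rate": s.tasks_succeeded / s.tasks_attempted,
        # 2. Work rate: tasks completed per minute
        "speed": s.tasks_succeeded / s.minutes_worked,
        # 3. Operability: errors per action
        "error_rate": s.errors / s.actions,
        # 4. Knowledge acquisition: trials needed to learn the system
        "learnability": s.trials_to_criterion,
    }

print(indicators(Session(tasks_attempted=10, tasks_succeeded=8,
                         minutes_worked=25.0, actions=120, errors=6,
                         trials_to_criterion=3)))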

Usability measures based on performance and HCI

1. Time to complete a task
2. Number of user commands to complete the task
3. Fraction of task completed
4. Fraction of task completed in a given time
5. Number of errors
6. Time spent on errors
7. Frequency of online help use

Usability measures based on performance and HCI (cont.)

8. Number of available commands not used
9. Ratio of successes to failures when the task is repeated
10. Fraction of positive comments made by the user
11. Ratio of good to bad features recalled by the user
12. Number of expressions of frustration and satisfaction
13. Number of times the user loses control of the system
14. Number of times the user needs to devise a way of working around the problem/system
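
One way several of these measures might be extracted in practice is from a timestamped log of user events. The sketch below assumes a very simple log format (event time plus a kind tag); the Event structure and tag names are illustrative, not a format the chapter prescribes.

from dataclasses import dataclass

@dataclass
class Event:
    t: float     # seconds since task start
    kind: str    # "command", "error", "error_resolved", "help", "done"

def measures(log: list[Event]) -> dict:
    commands = sum(1 for e in log if e.kind == "command")
    errors = [e.t for e in log if e.kind == "error"]
    resolved = [e.t for e in log if e.kind == "error_resolved"]
    help_uses = sum(1 for e in log if e.kind == "help")
    task_time = next(e.t for e in log if e.kind == "done")
    # Time spent on errors: sum of (resolution - occurrence), paired in order.
    error_time = sum(r - e for e, r in zip(errors, resolved))
    return {
        "time_to_complete": task_time,           # measure 1
        "commands": commands,                    # measure 2
        "errors": len(errors),                   # measure 5
        "time_on_errors": error_time,            # measure 6
        "help_frequency": help_uses / task_time  # measure 7, uses per second
    }

log = [Event(2, "command"), Event(5, "error"), Event(9, "error_resolved"),
       Event(12, "help"), Event(20, "command"), Event(30, "done")]
print(measures(log))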

The usability engineering life cycle

- Know the user
- Analyze competing products
- Set usability goals
- Consider alternative designs
- Engage in participatory design
- Coordinate the total interface
- Check against heuristic guidelines
- Prototype
- Evaluate the interface
- Design in iterations
- Follow up with studies of installed systems

Attitude questionnaire

1. Using a microcomputer could provide me with information that would lead to better decisions.
2. I wouldn't use a microcomputer because programming it would take too much time.
3. I'd like to use a microcomputer because it is oriented to user needs.
4. I wouldn't use a microcomputer because it is too time consuming.
5. Using a microcomputer would take too much time away from my normal duties.
6. Using a microcomputer would involve too much time doing mechanical operations (e.g., programming, inputting data) to allow sufficient time for managerial analysis.
7. A microcomputer would be of no use to me because of its limited computing power.
8. I'd like to learn about ways that microcomputers can be used as aids in managerial tasks.
9. Using a microcomputer would result in a tendency to overdesign simple tasks.
10. I wouldn't want to have a microcomputer at work because it would distract me from my normal job duties.

Attitude questionnaire (cont.)

1. A microcomputer would give me more opportunities to obtain the information that I need.
2. I wouldn't favor using a microcomputer because there would be a tendency to use it even when it was more time consuming than manual methods.
3. I'd like to have a microcomputer because it is so easy to use.
4. I'd hesitate to acquire a microcomputer for my use at work because of the difficulty of integrating it with existing information systems.
5. I'd discourage my company from acquiring microcomputers because most application packages would need to be modified before they could be useful in our specific situation.
6. It is easy to access and store data in a microcomputer.
7. A microcomputer would be of no use to me because of the limited availability of application program packages.
8. A microcomputer would be of no use to me because of its small storage capacity.
9. It is easy to retrieve or store information from/to a microcomputer.
10. Using a microcomputer would give me much greater control over important information.
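
A note on scoring: questionnaires like these are typically answered on a Likert scale, with negatively worded items reverse-coded so that a higher total always means a more favorable attitude. The sketch below illustrates that convention for the first questionnaire; which items count as negative is inferred from their wording here, not specified in the source.

# Assume a 1-5 scale (1 = strongly disagree, 5 = strongly agree).
NEGATIVE = {2, 4, 5, 6, 7, 9, 10}  # negatively worded items (inferred)

def attitude_score(responses: dict[int, int], scale_max: int = 5) -> float:
    total = 0
    for item, answer in responses.items():
        # Reverse-score negative items: 1 <-> 5, 2 <-> 4, etc.
        total += (scale_max + 1 - answer) if item in NEGATIVE else answer
    return total / len(responses)  # mean item score, 1 (negative) to 5 (positive)

# One respondent's answers to items 1-10:
answers = {1: 4, 2: 2, 3: 4, 4: 3, 5: 2, 6: 2, 7: 1, 8: 5, 9: 3, 10: 1}
print(round(attitude_score(answers), 2))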

Heuristic guidelines

1. Create simple and natural dialog
2. Speak the user's language
3. Minimize the user's memory load
4. Be consistent
5. Provide feedback
6. Provide clearly marked exits
7. Provide shortcuts
8. Provide specific, corrective and positive error messages
9. Minimize the propensity for error
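
To show how findings against these guidelines might be recorded during a heuristic evaluation, the sketch below tags each finding with the violated guideline and a 0-4 severity rating (a common convention popularized by Nielsen) and summarizes them. The data structure and sample findings are invented for illustration.

from collections import Counter
from dataclasses import dataclass

@dataclass
class Finding:
    heuristic: str   # which guideline is violated
    location: str    # where in the UI
    severity: int    # 0 = not a problem .. 4 = usability catastrophe

findings = [
    Finding("Provide feedback", "Save dialog", 3),
    Finding("Be consistent", "Toolbar icons", 2),
    Finding("Provide clearly marked exits", "Wizard step 3", 4),
    Finding("Provide feedback", "Upload screen", 2),
]

# Count violations per guideline to see where the design is weakest.
by_heuristic = Counter(f.heuristic for f in findings)
for heuristic, n in by_heuristic.most_common():
    worst = max(f.severity for f in findings if f.heuristic == heuristic)
    print(f"{heuristic}: {n} finding(s), worst severity {worst}")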

Evaluation techniques classified

1) Exploratory vs. model-based
2) Design or implementation
3) Field study vs. laboratory testing
4) Design vs. use
5) Level of performance measures
6) Degree of designed manipulation and intrusion

Cognitive walkthrough start-up (from Polson)

Task description: Describe the task from the first-time user's viewpoint. Include any special assumptions about the state of the system when the user begins work.

Action sequence: Make a numbered list of the atomic actions that the user should perform to accomplish the task.

Anticipated users: Briefly describe the class of users who will use this system. Note what experience they are expected to have with similar or previous versions.

User's initial goals: List the goals the user is likely to form when starting the task. If there are other likely goals, list them and estimate for each the percentage of users likely to hold them.
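
These start-up materials lend themselves to a simple structured record. The sketch below mirrors Polson's four headings as fields; the example task and values are hypothetical.

from dataclasses import dataclass

@dataclass
class WalkthroughStartup:
    task_description: str        # first-time user's viewpoint plus system state
    action_sequence: list[str]   # numbered atomic actions
    anticipated_users: str       # class of users and their prior experience
    initial_goals: dict[str, int]  # goal -> estimated % of users holding it

walkthrough = WalkthroughStartup(
    task_description="Forward a voicemail; the mailbox is open and idle.",
    action_sequence=["Press 'menu'", "Select 'forward'", "Enter extension",
                     "Press 'send'"],
    anticipated_users="Office staff familiar with the previous phone system.",
    initial_goals={"forward the message": 80, "replay the message first": 20},
)
print(walkthrough.initial_goals)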

Measures for comparing displays

- Typing mistakes made before correction
- Mistakes remaining
- Pauses of 3 seconds or more immediately before a mode change
- Other pauses of 3 seconds or more
- Length of the pause immediately before a mode change
- Attempting to type over whilst in insert mode
- Attempting to insert whilst in type-over mode
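
Most of the pause measures can be derived from a keystroke log with timestamps. A minimal sketch, assuming each log entry is (timestamp in seconds, whether the keystroke triggers a mode change); the 3-second threshold comes from the list above, everything else is illustrative.

def pause_measures(log: list[tuple[float, bool]], threshold: float = 3.0):
    pauses_before_mode_change = 0
    other_pauses = 0
    lengths_before_mode_change = []
    # Walk consecutive keystroke pairs and measure the gap between them.
    for (t0, _), (t1, is_mode_change) in zip(log, log[1:]):
        gap = t1 - t0
        if gap >= threshold:
            if is_mode_change:
                pauses_before_mode_change += 1
            else:
                other_pauses += 1
        if is_mode_change:
            lengths_before_mode_change.append(gap)
    return {
        "pauses_>=3s_before_mode_change": pauses_before_mode_change,
        "other_pauses_>=3s": other_pauses,
        "pause_lengths_before_mode_change": lengths_before_mode_change,
    }

log = [(0.0, False), (0.4, False), (4.1, True), (4.5, False), (9.0, False)]
print(pause_measures(log))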

1) Effectiveness = (Quantity * Quality) %
2) Relative efficiency = (User effectiveness / Expert effectiveness) * (Expert task time / User task time) * 100%
3) Productive period = (Task time - Total problem time - Learning time) / Task time
4) Problem rate = Number of problems encountered / Task time
6) Complexity factor = (Number of calls for assistance / Number of actions undertaken) * ((Learning time + Total problem time) / Task time)
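
Written out as code, the formulas look as follows. This is a sketch: measures 1-4 follow directly from the text, while the complexity factor follows the reconstruction given above and is less certain, so it should be checked against the original source.

def effectiveness(quantity: float, quality: float) -> float:
    # 1) percentage of the task done, weighted by how well it was done
    return quantity * quality / 100.0

def relative_efficiency(user_eff, expert_eff, expert_time, user_time) -> float:
    # 2) user performance relative to an expert doing the same task
    return (user_eff / expert_eff) * (expert_time / user_time) * 100.0

def productive_period(task_time, problem_time, learning_time) -> float:
    # 3) share of the session spent on productive work
    return (task_time - problem_time - learning_time) / task_time

def problem_rate(problems: int, task_time: float) -> float:
    # 4) problems encountered per unit of task time
    return problems / task_time

def complexity_factor(calls, actions, learning_time, problem_time, task_time):
    # 6) assistance per action, weighted by the unproductive share of time
    #    (one plausible reading of the formula; see the caveat above)
    return (calls / actions) * ((learning_time + problem_time) / task_time)

print(productive_period(task_time=60, problem_time=10, learning_time=5))  # 0.75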