Chapter 8 Usability Specification Techniques Hix & Hartson.


Usability Specifications: Quantitative Usability Goals

Usability Attributes
Usability characteristics to be measured:
– Initial performance
– Long-term performance
– Learnability
– Retainability
– Advanced feature usage
– First impression
– Long-term user satisfaction

How Can Attributes Be Measured?
Objective tasks (called benchmark tasks):
– Tasks must be representative of what users would perform
– Measure performance on benchmark tasks
– Tasks must be specific
– Do not tell the user how to carry out tasks
– Should be simple, or small combinations of simple tasks
– Consider who your end users are!

Objective Tasks (cont.)
Example: for the Initial Performance attribute, we might measure how well users perform a specific function that is primary to the software. Time and error data can be collected.
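As a concrete sketch of collecting time and error data for a benchmark task (the function names and the stand-in task are illustrative, not from the chapter):

```python
import time

def run_benchmark(task_fn):
    """Time one attempt at a benchmark task and return (seconds, errors).

    task_fn stands in for a participant performing the task; here it
    simply returns the number of errors the observer recorded.
    """
    start = time.perf_counter()
    errors = task_fn()
    elapsed = time.perf_counter() - start
    return elapsed, errors

# Stand-in task: the observer recorded 2 errors on this attempt.
elapsed, errors = run_benchmark(lambda: 2)
print(errors)  # 2
```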

Subjective Questionnaires
Ask users for opinions on use. QUIS is an existing, validated questionnaire. Rating-scale questionnaires yield quantitative data even though the opinions themselves are subjective. Example: for the First Impression attribute, we would look for certain rankings on the questionnaire.
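A minimal sketch of summarizing QUIS-style rating-scale data (items rated on a 1–9 scale; the item names and responses here are made up for illustration):

```python
# Each key is a questionnaire item; each list holds one rating per participant.
responses = {
    "overall_reaction": [7, 8, 6, 9],
    "learning":         [5, 6, 7, 6],
}

# Summarize each item as a mean rating across participants.
means = {item: sum(vals) / len(vals) for item, vals in responses.items()}
print(means["overall_reaction"])  # 7.5
```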

Usability Specifications
For each benchmark task, specify the following information:
– Current level of task performance
– Worst acceptable level
– Planned target level
– Best possible level
– Observed results
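The levels above can be captured in a small record and checked against observed results. A sketch, assuming a time-based measure where lower is better; all names and numbers are illustrative:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class UsabilitySpec:
    """One row of a usability specification table."""
    attribute: str
    measure: str
    current_level: float       # performance on the existing system
    worst_acceptable: float
    planned_target: float
    best_possible: float
    observed: Optional[float] = None

    def acceptable(self) -> bool:
        # Lower is better for time-based measures.
        return self.observed is not None and self.observed <= self.worst_acceptable

    def meets_target(self) -> bool:
        return self.observed is not None and self.observed <= self.planned_target

spec = UsabilitySpec("Initial performance", "seconds to complete primary task",
                     current_level=30, worst_acceptable=25,
                     planned_target=20, best_possible=10, observed=22)
print(spec.acceptable(), spec.meets_target())  # True False
```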

Types of Measures
Objective measures:
– Time to complete a task
– Number or percentage of errors
– Percentage of task completed in a given time
– Ratio of successes to failures
– Time spent in errors and recovery
– Number of commands/actions performed
– Frequency of help and documentation use
– Number of repetitions of failed commands
– Number of available commands
– Number of times the user expresses frustration or satisfaction
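Several of these measures can be computed from a timestamped event log. A sketch using a made-up log for one participant's session:

```python
# Hypothetical event log: (timestamp in seconds, event type).
log = [
    (0.0, "start"), (4.2, "command"), (6.8, "error"), (9.1, "command"),
    (12.5, "help"), (15.0, "error"), (18.3, "command"), (21.0, "done"),
]

duration = log[-1][0] - log[0][0]                    # time to complete the task
errors = sum(1 for _, e in log if e == "error")      # number of errors
commands = sum(1 for _, e in log if e == "command")  # actions performed
help_uses = sum(1 for _, e in log if e == "help")    # help/documentation use

print(duration, errors, commands, help_uses)  # 21.0 2 3 1
```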

Set Current Levels Based On:
– A previous or existing system
– Similar competitive systems
– Performing the tasks by computer
– Performing the tasks manually
– Market input
– Previous prototypes

Considerations When Developing Specifications
– Is each attribute practically measurable?
– Are the user classes specified clearly?
– Are the values for the levels reasonable?
– How well do the attributes capture usability for the design?

Chapter 10 Formative Evaluation Hix & Hartson

What Is Meant by Formative Evaluation?
A formal evaluation plan applied during the design process. It should begin as early as possible in the design cycle; the first evaluation should take place by the time 10% of project resources have been expended.

Summative Evaluation
A human factors engineer's worst nightmare: evaluation performed only after the design is complete.

Types of Evaluation Data
– Objective: directly observed and measurable
– Subjective: opinions
– Quantitative: numerical data
– Qualitative: lists of user problems, suggestions, etc.

Steps of Formative Evaluation
Develop an evaluation plan (or experiment):
– Selecting participants
– Developing tasks and task orders
– Determining protocol and procedures
– Pilot testing
Then direct the evaluation.

Data Generation
– Benchmark tasks
– User preference questionnaires
– Concurrent verbal protocol
– Retrospective verbal protocol
– Critical incident taking
– Structured interviews

Direct the Evaluation (cont.)
Data collection:
– Real-time note taking
– Videotaping
– Audiotaping
– Internal instrumentation of the interface
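Internal instrumentation can be as simple as the interface appending a timestamped record for each user action, to be analyzed later. A minimal sketch (the class and action names are illustrative):

```python
import time

class InterfaceLogger:
    """Toy internal instrumentation: the interface calls log() on each
    user action, building a timestamped record for later analysis."""
    def __init__(self):
        self.events = []

    def log(self, action, detail=""):
        self.events.append((time.time(), action, detail))

logger = InterfaceLogger()
logger.log("menu_open", "File")
logger.log("command", "Save")
print(len(logger.events))  # 2
```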

Direct the Evaluation (cont.)
Analyzing the data:
– Compute averages for benchmark tasks
– Determine problems or user difficulty
– Determine effects on user performance (impact analysis)
– Determine importance: how important is this problem to the design?
– Generate solutions
– Consider costs to fix problems
Then redesign, implement, and retest.
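Impact analysis weighs each problem's importance against its cost to fix. A toy sketch that ranks made-up problems by importance per unit cost; the data and the ratio heuristic are illustrative, not from the chapter:

```python
# Hypothetical problems from a formative evaluation session:
# (description, importance rating 1-5, estimated cost to fix in person-days).
problems = [
    ("Label on Save button unclear", 4, 0.5),
    ("Undo not discoverable",        5, 3.0),
    ("Slow redraw on resize",        2, 5.0),
]

# Prioritize problems that are important relative to their fix cost.
ranked = sorted(problems, key=lambda p: p[1] / p[2], reverse=True)
print(ranked[0][0])  # Label on Save button unclear
```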

Formative Evaluation Pros & Cons
Pros:
– 4 to 5 subjects find 80% of problems
– Sensitive to major problems
– Can be a very thorough process
– Developers empathize with users
Cons:
– Time consuming
– Expensive
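The "4 to 5 subjects find 80% of problems" rule of thumb follows from a simple discovery model: if each participant independently finds a fraction p of the problems, n participants together find 1 - (1 - p)^n. The value p ≈ 0.3 below is a commonly cited estimate, not a figure from this chapter:

```python
def fraction_found(n, p=0.3):
    """Expected fraction of usability problems found by n participants,
    assuming each independently finds a fraction p of them."""
    return 1 - (1 - p) ** n

for n in (1, 3, 5):
    print(n, round(fraction_found(n), 2))  # 1 0.3 / 3 0.66 / 5 0.83
```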

Other Usability Testing Methods
– Heuristic evaluation
– Guidelines
– Computer evaluation

Heuristic Evaluation
A usability engineer reviews and evaluates the program with no standard procedure.
Pros:
– Quickly identifies problems
– Major problems are discovered
Cons:
– Must use more than one usability engineer
– Minor problems are not discovered

Guidelines
The evaluator examines the design to see if it meets published guidelines.
Pros:
– Finds general and recurring errors
– Easily applied
Cons:
– Major problems can be missed
– Guidelines are not exhaustive
– Not all programs are created equal
– Not all guidelines apply

Computer Evaluation
An automated computer program evaluates the software.
Pros:
– A potential tool for the future
Cons:
– Expensive
– Currently finds only primitive problems
– Will designers lose creativity by designing to pass the tests?
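As a toy illustration of why automated evaluation "currently finds only primitive problems": a script can catch mechanical violations such as an unlabeled button, but little beyond that. The widget descriptions and the check below are entirely hypothetical:

```python
# Hypothetical widget descriptions for a dialog under evaluation.
widgets = [
    {"type": "button", "label": "OK"},
    {"type": "button", "label": ""},          # violation: unlabeled button
    {"type": "textfield", "label": "Name"},
]

# Primitive automated check: flag any button that has no label.
violations = [w for w in widgets if w["type"] == "button" and not w["label"]]
print(len(violations))  # 1
```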

Conclusions
– Heuristic evaluation can be cost-effective
– Use more than one method
– Users determine the success of software and companies