User interface design: A software engineering perspective

Presentation transcript:

User interface design: A software engineering perspective
Soren Lauesen
Slides for Chapter 1, November 2004
© 2005, Pearson Education retains the copyright to the slides, but allows restricted copying for teaching purposes only. It is a condition that the source and copyright notice is preserved on all the material.

Slide 2, Fig 1.1A: System interfaces
[Diagram: an accounting system with user interfaces (towards its users) and technical interfaces (towards e.g. the factory); the surrounding elements include courses, a hotline, and a manual.]

Slide 3, Fig 1.1B: Quality factors
Easy to make a user interface: just give access to the database (see, edit, create, delete). Hard to make a good user interface.
Quality factors: correctness, availability, performance, security, ease of use, maintainability . . .
Functionality: the necessary features.
All factors are important. They are hard to measure, but it is possible.

Slide 4, Fig 1.2: What is usability?
Max three menu levels? On-line help? Windows standard? ??
Usability factors:
a. Fit for use (adequate functionality)
Ease of use:
b. Ease of learning
c. Task efficiency
d. Ease of remembering
e. Subjective satisfaction
f. Understandability
Responsibility? Programmers? Other developers? The user department?
The factors are measurable, and priorities vary (game programs: a. ??).

Slide 5, Fig 1.3: Usability problems
Examples: the system works as intended by the programmer, but the user:
P1. Cannot figure out how to start the search. Finally finds out to use F10.
P2. Believes he has completed the task, but forgot to push Update.
P3. Sees the discount code field, but cannot figure out which code to use.
P4. Says it is crazy to use six screens to fill in ten fields.
P5. Wants to print a list of discount codes, but the system cannot do it.
Severity classes:
1. Missing functionality
2. Task failure
3. Annoying
4. Medium problem (succeeds after a long time)
5. Minor problem (succeeds after a short time)
Critical problem = missing functionality, task failure, or annoying.
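To make the severity scale concrete, here is a minimal Python sketch of the classification above; the dictionary and function names are illustrative, not from the slides:

```python
# Severity classes from Fig 1.3 (1 = most severe).
SEVERITY = {
    1: "Missing functionality",
    2: "Task failure",
    3: "Annoying",
    4: "Medium problem (succeeds after long time)",
    5: "Minor problem (succeeds after short time)",
}

def is_critical(severity: int) -> bool:
    """Critical = missing functionality, task failure, or annoying (classes 1-3)."""
    return severity in (1, 2, 3)

# Example: problem P2 (forgot to push Update) is a task failure, hence critical.
assert is_critical(2)
```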

Slide 6, Fig 1.4: Usability test - think aloud
Purpose: find usability problems.
- User: performs the tasks and thinks aloud ("I try this because ...").
- Facilitator: listens, asks as needed.
- Log keeper: listens, records problems (e.g. "User doesn't notice ...").

Slide 7 (Fig 1.4 cont.)
Plan:
- Test users, test tasks
- Study the system yourself
Carry out:
- Explain the purpose: find problems when using the system - the system's fault, not yours
- Give the task - think aloud, please
- Observe, listen, note down
- Ask cautiously: what are you looking for? why . . . ?
- Help users out when they are surely lost
Reporting:
- List the usability problems - within 12 hours

Slide 8, Fig 1.5: Heuristic evaluation
Purpose: find usability problems.
A usability specialist (expert reviewer) looks at the system using common sense and/or guidelines, lists the problems, and may consult with other experts.
First law of usability: heuristic evaluation has only a 50% hit rate - the predicted problems only partly overlap the actual problems, leaving false problems and missed problems.
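The hit-rate idea can be written as a small calculation: predictions that are also observed in a usability test are hits, the rest are false (or unconfirmed) problems, and observed problems nobody predicted are missed. A minimal sketch with made-up problem identifiers:

```python
def hit_rate(predicted: set, observed: set) -> float:
    """Fraction of the problems observed in a usability test that the expert predicted."""
    return len(predicted & observed) / len(observed)

# Hypothetical problem ids: the expert predicts four, the test observes four.
predicted = {"P1", "P2", "P3", "P4"}
observed = {"P2", "P3", "P5", "P6"}

print(hit_rate(predicted, observed))   # 0.5 -> the "50% hit rate"
print(predicted - observed)            # predicted but not observed (false or unconfirmed)
print(observed - predicted)            # missed problems
```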

Slide 9, Fig 1.6A: Measuring usability - task time (performance)
Each measurement plan specifies how to measure (users), what to measure (tasks and measures), and the requirement (target).
ATM example:
- Users: 20 bank customers, random selection.
- Task 1: Withdraw $100 from the ATM. No instructions. Measure: how many succeed in 2 min?
- Task 2: Withdraw as much as possible ($174). Measure: how many succeed in 5 min?
- Requirements: Task 1: 18 succeed. Task 2: 12 succeed.
Internal ordering system example:
- Users: 5 secretaries in the company. Have tried the internal ordering system, but have not used it for a month.
- Task 1: Order two boxes of letter paper + . . . Measure: average time per user.
- Requirement: average time below 5 min. Risky!
Pros: classic approach; good when buying. Cons: not good for development; not possible early; little feedback.
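Checking such a requirement is mechanical: count how many test users complete the task within its time limit and compare against the target. A minimal sketch, where the measured times are invented for illustration:

```python
def successes(times_s, limit_s):
    """Count users who completed the task within the time limit (None = gave up)."""
    return sum(1 for t in times_s if t is not None and t <= limit_s)

# Hypothetical task-1 times (seconds) for a few of the 20 customers.
task1_times = [45, 80, None, 110, 95]

# Requirement from the slide: 18 of the 20 customers succeed within 2 minutes.
meets_requirement = successes(task1_times, limit_s=120) >= 18
print(meets_requirement)
```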

Slide 10, Fig 1.6B: Choosing the numbers
Users: 20 bank customers ... Measure: in 2 min? Reqs: Task 1: 18 succeed, Task 2: 12 succeed.
- Why 20 users? Cost versus reliability. During development: one, later two, later ...
- Why 2 minutes? Best practice, the ideal way ...
- Why 18? 90% of customers should succeed. Task 2 is harder.
Open target: 18 out of 20 must succeed within ____ min; we expect around 2 min. Specify how, what, and expectations - then wait and see what is possible.

Slide 11, Fig 1.6C: Measuring usability - problem counts
How to measure: 3 potential users; think-aloud test; record usability problems.
What to measure: Task 1: Order two boxes of letter paper + . . . Task 2: . . . Measure: number of critical problems per user, and number of medium problems on the list.
Requirements: at most one user encounters critical problems; at most 5 medium problems on the list.
Pros: possible early - a mockup is sufficient; good feedback to developers. Cons: best for ease of learning; only indications for the other factors.
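These two requirements can be evaluated directly from the think-aloud log; the log entries below are hypothetical:

```python
# Hypothetical think-aloud log: (user, problem id, severity class 1-5).
log = [("U1", "P2", 2), ("U1", "P7", 4), ("U2", "P7", 4), ("U3", "P9", 5)]

users_with_critical = {user for user, _pid, sev in log if sev <= 3}
medium_problems = {pid for _user, pid, sev in log if sev == 4}

# Requirements from the slide: max one user encounters critical problems,
# max 5 medium problems on the list.
ok = len(users_with_critical) <= 1 and len(medium_problems) <= 5
print(ok)
```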

Slide 12, Fig 1.6D: Measuring usability - keystroke counts
What to measure: Task 1: Withdraw a standard amount from the ATM. Task 2: . . . Measure: number of keystrokes and mouse clicks.
Requirements: max 6 keystrokes - incl. PIN code; total system response time max 8 s.
Total task time: 6 keystrokes @ 0.6 s = 3.6 s, plus total system response time of 8.0 s, gives 11.6 s - plus other user actions?
Pros: no users needed; possible early - a mockup is sufficient. Cons: not sure users find the fast way; measures only task efficiency.
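The total on the slide is simply the keystroke count times an assumed time per keystroke, plus the total system response time:

```python
keystrokes = 6            # incl. PIN code (requirement: max 6)
time_per_keystroke = 0.6  # seconds per keystroke, as assumed on the slide
system_response = 8.0     # seconds total system response (requirement: max 8 s)

task_time = keystrokes * time_per_keystroke + system_response
print(task_time)  # 11.6 s, as on the slide - plus any other user actions
```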

Slide 13, Fig 1.6E: Measuring usability - opinion poll
How to measure: ask 20 novice users to complete the questionnaire. What to measure: count the number of entries per box. Requirements: 80% find the system easy to learn; 50% will recommend it to others.
Questionnaire (agree / neutral / disagree):
- The system was easy to learn
- The system is easy to use
- The system helps me . . .
- It is fun to use
- I will recommend it to others
Pros: widely used; you may ask about any usability factor. Cons: doesn't match objective evidence; only indications during development; little feedback to developers.
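Evaluating the requirement amounts to computing the agreement percentage per statement; the tallies below are invented for illustration:

```python
# Hypothetical tallies for 20 novice users: (agree, neutral, disagree) per statement.
tallies = {
    "The system was easy to learn": (17, 2, 1),
    "I will recommend it to others": (11, 6, 3),
}

def agree_pct(agree, neutral, disagree):
    return 100 * agree / (agree + neutral + disagree)

# Requirements from the slide: 80% find it easy to learn, 50% will recommend it.
print(agree_pct(*tallies["The system was easy to learn"]) >= 80)
print(agree_pct(*tallies["I will recommend it to others"]) >= 50)
```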

Slide 14, Fig 1.6F: Measuring usability - score for understanding
How to measure: ask 5 potential ATM users what these error messages mean: "Amount too large", "PIN code invalid", . . . Ask them also: what would the system do if . . . ?
What to measure: assess the answers on a scale A-D. Requirement: 80% of answers marked A or B.
Pros: an easy way to test understandability; the best way to cover error messages; useful both early and late in development. Cons: only measures understandability.

Slide 15, Fig 1.6G: Measuring usability - guideline adherence
How to measure: ask an expert to review the user interface and identify deviations from guideline X (or ask two experts to come up with a joint list).
What to measure: number of deviations per screen. Requirement: at most one deviation per screen.
Pros: adherence helps users switch between systems; company-specific guidelines for internal systems can help even more. Cons: cannot guarantee high usability; developers find guidelines hard to follow - examples help best.

Slide 16, Fig 1.6H: Which usability measure?
[Table: rows are the measures (task time, problem counts, keystroke counts, opinion poll, score for understanding, guidelines); columns are the usability factors (fit for use, task efficiency, ease of learning, ease of remembering, subjective satisfaction, understandability) and the situations (early development, late development, buying a system). Each cell rates the measure as highly useful, some use, indications only, or ?.]

User interface design: A software engineering perspective
Soren Lauesen
Slides for Chapter 2, November 2004
© 2005, Pearson Education retains the copyright to the slides, but allows restricted copying for teaching purposes only. It is a condition that the source and copyright notice is preserved on all the material.

Slide 18, Fig 2.1: The development process
Traditional systems development: analysis, design (experts? guidelines?), program, test, operation. A usability test at that point gives scary results - too late to correct.
HCI classic, iterative design: analysis (study users and tasks), design prototype, usability test, and only then program - iterating as needed.

Fig 2.2: Hotel system
Task list: Book guest, Check in, Check out, Change room, Record services.
Breakfast list (Breakfasts 23/9), columns Room / Buffet / In room:
Room 11: 1
Room 12: 2
Room 13: 1, 1
Room 15: . . .

Fig 2.3A Hotel system prototype

(Fig 2.3A Cont.)

Fig 2.3B Defect list for hotel system mockup

Fig 2.3C: Hit rate of the hotel system evaluation
The heuristic evaluation predicted 21 problems: 6 hits (also observed in the usability test), 8 likely but not observed, and 7 false problems. The usability test observed 20 problems: the 6 hits plus 14 missed problems.
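With these numbers, the hit rate works out directly; a short sketch of the arithmetic:

```python
predicted = 21   # 6 hits + 8 likely-but-not-observed + 7 false
observed = 20    # 6 hits + 14 missed
hits = 6

hit_rate = hits / observed    # 6/20 = 0.30: only 30% of the observed problems were predicted
missed = 14 / observed        # 0.70 of the observed problems were missed entirely
false_share = 7 / predicted   # about a third of the predictions were false problems
print(hit_rate, missed, false_share)
```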

Fig 2.4: Various prototypes
Hand-drawn mockup: 15-30 min. Tool-drawn mockup: 30-60 min. Screen prototype: 1-4 hours. Functional prototype: 2-8 hours.
Which prototype is the best?

(Fig 2.4 cont.)
Full contents of a mockup:
- Empty screens for copying
- Screens with realistic data
- Screens to be filled in by the user
- Menus, lists, dialog boxes
- Error messages
- Help texts
- Notes about what the functions do
Handling a system with 100 screens? Accelerator effect: if the central screens are good, the rest are okay almost automatically.