Usability Assessment Methods beyond Testing (Chapter 7): Evaluating without Users

Observation
- The goal is to become virtually invisible to users, so that they use the system naturally while you observe how they use it.
- Observing users, one often finds that they use the software in ways that would not have been expected.

Questionnaires
- One cannot always take user comments at face value:
  - Little problems can be overblown.
  - Data about people's actual behavior should take precedence over their claims about what they think they do.
- E.g., in one study, 26% of users commented on a command even though they had previously stated that they did not know it.

Questionnaires
- The correlation between users' predicted ratings of new features and their ratings of the same features after actually trying them was only 0.28.
- In another study, users answered a questionnaire about the difficulty of a mobile phone's instructions; when tested on those instructions, only 50% got them right.
- Always pilot test the questions beforehand.
- Only ask a question if you want to know the answer, i.e., if the replies will make a difference to the system.

Interviews
- Interviews have the advantage of being more flexible:
  - The interviewer can rephrase a question
  - or ask follow-up questions.
- Interviewers should ask open-ended questions, not yes/no questions.
- Ask users to recall critical incidents in their use of the system.

Focus Groups
- Used to assess user needs and feelings both before and after system deployment.
- A group of at least six participants is run by a moderator who keeps the discussion focused.
- The moderator prepares a script of the issues that need to be brought up.

Focus Groups
- Focus groups often change from being skeptical to feeling that they would like the new features only after they have seen a prototype of the new system.

Logging Actual Use
- Logging means having the computer collect statistics about the detailed use of the system:
  - the frequency with which each user has used each feature,
  - the frequency of errors (in one study, 85% of errors came from 9 common error messages).

Logging Actual Use
- In another study, the 10% of help screens that were accessed the most accounted for 92% of the requests.
- Logging is usually done by instrumenting low-level system software, but try to log at a middle level of abstraction, e.g., record button presses as UI events rather than raw input.
- It is also possible to log complete transcripts of user sessions.

Logging Actual Use
- Another use of logging: study how users actually use an interface to find usability problems that were overlooked when observing users.
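The middle-level logging described above can be sketched in a few lines. This is a minimal illustration; the class name, event kinds, and API are assumptions for this sketch, not something from the chapter:

```python
from collections import Counter

class EventLogger:
    """Minimal sketch of middle-level usage logging: record meaningful
    UI events (button presses, commands, error messages) rather than
    raw keystrokes or coordinates."""

    def __init__(self):
        self.events = []  # list of (kind, name) tuples

    def log(self, kind, name):
        """Record one event, e.g. log('command', 'save')."""
        self.events.append((kind, name))

    def frequencies(self, kind):
        """How often each feature or error occurred -- the statistics
        the chapter suggests collecting."""
        return Counter(name for k, name in self.events if k == kind)

logger = EventLogger()
logger.log("command", "save")
logger.log("command", "save")
logger.log("error", "file-not-found")
print(logger.frequencies("command"))  # Counter({'save': 2})
```

Sorting `frequencies("error")` by count would surface the handful of error messages that, as in the study cited above, can account for the bulk of all errors.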

User Feedback
- One will tend to hear mostly from dissatisfied users, so the feedback may not be representative of the majority.
- You could give users a quick complaint or "gripe" function.
- Check with customer support.
- Use an online user forum for feedback.
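A "gripe" function can be as simple as capturing the user's complaint together with enough context to interpret it later. A minimal sketch, where the function name, fields, and file format are all illustrative assumptions:

```python
import json
import time

def gripe(message, screen, log_path="gripes.jsonl"):
    """Hypothetical 'gripe' command: append the user's complaint plus
    context (which screen they were on, when) to a local log so that
    sporadic feedback can be collected and analyzed later."""
    record = {"time": time.time(), "screen": screen, "message": message}
    with open(log_path, "a") as f:
        f.write(json.dumps(record) + "\n")
```

Recording the current screen alongside the complaint matters: "this is confusing" is only actionable if you know what the user was looking at.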

Combining Usability Methods
- Heuristic evaluation and user testing complement each other:
  - heuristic evaluation does not use up users' time,
  - and the two methods tend to find different usability problems.

Cognitive Walkthroughs
- A formalized way of imagining people's thoughts and actions when they use an interface for the first time.

Cognitive Walkthroughs
- Anticipate problems before testing with users.
- Envision the users' route as they complete a task.
- Walk through this route and identify problems they might encounter:
  - confusing labels, options, likely errors.
- Then fix the problems you find.
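The walkthrough steps above can be represented as simple data: each step of the user's route pairs an action with the evaluator's answers to a set of walkthrough questions. The questions and data shapes below are a sketch under common cognitive-walkthrough practice, not the chapter's exact procedure:

```python
# Typical questions the evaluator answers at each step of the route.
WALKTHROUGH_QUESTIONS = [
    "Will the user know what to do at this step?",
    "Will the user notice the correct control?",
    "Will the user connect the control with their goal?",
    "Will the user understand the feedback?",
]

def walkthrough(steps):
    """steps: list of (action, {question: problem_or_None}) tuples.
    Returns the list of problems to fix before testing with users."""
    problems = []
    for action, answers in steps:
        for question, problem in answers.items():
            if problem is not None:
                problems.append((action, question, problem))
    return problems

# Example: walking through one step of a hypothetical email task.
steps = [
    ("Press the New Message button",
     {"Will the user notice the correct control?": "Button icon is ambiguous",
      "Will the user understand the feedback?": None}),
]
print(walkthrough(steps))
```

Collecting the problems per action makes it easy to turn the walkthrough directly into a fix list: each entry names the step, the failed question, and the concrete issue.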