Copyright © 2008 Pearson Education, Inc. Publishing as Pearson Addison-Wesley
Lecturer – Prof Jim Warren
Lecture 7 - Usability Testing in Practice
Based on Heim, Chapter 8

Usability testing in practice
We seldom have the time and resources for a classic formal usability test
– And, most of the time, we shouldn’t wait for such resources anyway before seeking feedback
We can structure the procedure and choose the surrogate users cleverly to make quicker and easier tests that are still informative
– aka “discount usability testing”

Some test types in more detail
– Heuristic evaluation
– Talk aloud protocol
– Cognitive walkthrough (Wizard of Oz)

Heuristic Evaluation
Proposed by Nielsen and Molich
– Usability criteria (heuristics) are identified
– The design is examined by experts to see if these are violated

Heuristic Evaluation
Rank by severity:
– 0 = no usability problem
– 1 = cosmetic – fix if you have extra time
– 2 = minor – fixing is low priority
– 3 = major – important to fix
– 4 = usability catastrophe – imperative to fix
(a little counter-intuitive – low is good, like in golf)
Heuristics, particularly the 10 from Nielsen:
– Visibility of system status, match between system and real world, user control and freedom, etc.
– Nielsen’s 10 embody a wealth of applied cognitive science and experience
– But, procedurally, heuristic evaluation could be done with any set of principles or standards
Heuristic evaluation ‘debugs’ designs
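The severity scale above is just a triage device, so it is easy to mechanise. Below is a minimal sketch (all names and example findings are hypothetical, not from the lecture) of logging heuristic-evaluation findings with 0-4 severity ratings and sorting the fix list worst-first:

```python
# Sketch: record heuristic-evaluation findings with Nielsen-style
# severity ratings (0-4, higher = worse), then triage worst-first.
from dataclasses import dataclass

SEVERITY = {
    0: "no usability problem",
    1: "cosmetic",
    2: "minor",
    3: "major",
    4: "usability catastrophe",
}

@dataclass
class Finding:
    heuristic: str   # which heuristic was violated
    location: str    # where in the UI the problem was seen
    severity: int    # 0-4 (low is good, like in golf)

# Hypothetical findings for illustration
findings = [
    Finding("Visibility of system status", "upload page", 3),
    Finding("Aesthetic and minimalist design", "settings dialog", 1),
    Finding("Error prevention", "delete-account button", 4),
]

# Fix the highest-severity problems first
for f in sorted(findings, key=lambda f: f.severity, reverse=True):
    print(f"[{f.severity}: {SEVERITY[f.severity]}] {f.heuristic} @ {f.location}")
```

In practice each evaluator produces such a list independently, and the lists are merged before severity ratings are agreed.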

Nielsen’s 10
– Visibility of system status
– Match between system and real world
– User control and freedom
– Consistency and standards
– Error prevention
– Recognition rather than recall
– Flexibility and efficiency of use
– Aesthetic and minimalist design
– Help users recognize, diagnose and recover from errors
– Help and documentation

Nielsen’s 10 in depth
1. Visibility of system status: The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.
2. Match between system and the real world: The system should speak the users' language, with words, phrases and concepts familiar to the user, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order.
3. User control and freedom: Users often choose system functions by mistake and will need a clearly marked "emergency exit" to leave the unwanted state without having to go through an extended dialogue. Support undo and redo.

Nielsen’s 10 in depth
4. Consistency and standards: Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions.
5. Error prevention: Even better than good error messages is a careful design which prevents a problem from occurring in the first place. Either eliminate error-prone conditions or check for them and present users with a confirmation option before they commit to the action.
6. Recognition rather than recall: Minimize the user's memory load by making objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.

Nielsen’s 10 in depth
7. Flexibility and efficiency of use: Accelerators – unseen by the novice user – may often speed up the interaction for the expert user such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions.
8. Aesthetic and minimalist design: Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.

Nielsen’s 10 in depth
9. Help users recognize, diagnose, and recover from errors: Error messages should be expressed in plain language (no codes*), precisely indicate the problem, and constructively suggest a solution.
10. Help and documentation: Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too large.
* If you have a good relationship between end-user and help desk, and somewhat technical users, an error code in addition to your error message may have a useful role (“Hey Bill, I’m getting that error 319 again.”)
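Heuristic 9 (and the footnote about error codes) can be made concrete with a small sketch. The helper below is hypothetical, not a real API: it builds a message that states the problem in plain language, suggests a recovery, and only optionally appends a numeric code for help-desk conversations.

```python
# Sketch of heuristic 9: plain-language error message that states the
# problem and suggests a fix, with an optional help-desk error code.
from typing import Optional

def format_error(problem: str, suggestion: str, code: Optional[int] = None) -> str:
    msg = f"{problem} {suggestion}"
    if code is not None:  # useful when users talk to a help desk
        msg += f" (error {code})"
    return msg

print(format_error(
    "The file could not be saved because the disk is full.",
    "Free up some space or choose another location, then try again.",
    code=319,
))
# Contrast with the system-oriented alternative users can't act on: "ERR_IO_0x13F"
```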

When to use Heuristic Evaluation
Particular advantage that it can be used very early
– From first sketches and outline descriptions
– May head off a mistake rather than having to fix it
Called a ‘discount usability’ method, because it’s relatively cheap (doesn’t require a lot of time and effort)

Talk aloud protocol (aka ‘protocol analysis’)
Have a user try the system and speak out loud about what they’re doing
– Not entirely natural (except for really annoying people!), but tends to work OK
Note misconceptions
– Is what the user says they’re doing actually what they’re doing?
Note vocabulary
– What is the user calling objects and actions; should screen instructions be adjusted to align?
Note problems
– Most important to note when they go astray (fail to achieve their goal and/or get an error message)

Talk aloud (contd.)
Note aspirations
– The user might spontaneously say what they wish they could do. Is that feature there, but just not obvious to the user? Is it a feature that could reasonably be added?
Variations for two
– Have two users sit side-by-side at the computer and perform the task together
– Then it becomes much more natural for them to ‘talk aloud’
– Potentially get extra rich insights if the users disagree about how to do something
– More user input per unit time for the testers

Cognitive Walkthrough (aka Wizard of Oz)
Proposed by Polson et al
– Evaluates a design on how well it supports the user in learning a task
– Usually performed by an expert in cognitive psychology
– The expert ‘walks through’ the design to identify potential problems using psychological principles
Based on the idea of a code walkthrough in conventional code testing
– Forms are used to guide the analysis
– Can be used to compare alternatives
(A bit like a formalised version of the Talk Aloud Protocol, but generally an expert rather than a typical user is ‘driving’)

Cognitive Walkthrough (ctd)
For each task, the walkthrough considers
– what impact will the interaction have on the user?
– what cognitive processes are required?
– what learning problems may occur?
Analysis focuses on goals and knowledge: does the design lead the user to generate the correct goals?

Pen-based interface for LIDS
UA1: Press look up button → SD1: Scroll viewpoint up
UA2: Press steering wheel to drive forwards → SD2: Move viewpoint forwards
UA3: Press look down button → SD3: Scroll viewpoint down
(UA = User Action, SD = System Display)

Pen interface walkthrough
UA 1: Press look up button
1. Is the effect of the action the same as the user’s goal at this point? Up button scrolls viewpoint upwards. So, it’s immediately rewarding with respect to that goal.
2. Will users see that the action is available? Yes. The up button is visible in the UI panel.
3. Once users have found the correct action, will they know it is the one they need? There is a lever with up/down looking symbols as well as the shape above and below the word look. The user will probably select the right action.
4. After the action is taken, will users understand the feedback they get? The scrolled viewpoint mimics the effect of looking up inside the game environment.

Cognitive walkthrough results
Fill out a form
– Track the time/date of the walkthrough and who the evaluators were
– For each Action, answer the four pro forma questions (as per previous slide)
– Any negative answer to any question should be documented on a separate Problem Sheet, indicating how severe the evaluators think the problem is, and whether they think it’ll occur often
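The walkthrough form lends itself to a simple record structure. The sketch below (hypothetical field names; the four questions are the pro forma ones from the pen-interface slide) records the answers per user action and turns each negative answer into a Problem Sheet entry with severity and expected frequency:

```python
# Sketch: a cognitive-walkthrough form as a data structure. Each "no"
# answer to the four pro forma questions becomes a documented problem.
from dataclasses import dataclass, field

QUESTIONS = (
    "Is the effect of the action the same as the user's goal at this point?",
    "Will users see that the action is available?",
    "Once users have found the correct action, will they know it is the one they need?",
    "After the action is taken, will users understand the feedback they get?",
)

@dataclass
class ActionRecord:
    action: str
    answers: list                      # one (yes_or_no, note) pair per question
    problems: list = field(default_factory=list)

    def flag_problems(self, severity: str, frequency: str):
        """Any negative answer is documented on the Problem Sheet."""
        for q, (ok, note) in zip(QUESTIONS, self.answers):
            if not ok:
                self.problems.append({"question": q, "note": note,
                                      "severity": severity,
                                      "frequency": frequency})

# Hypothetical record for the pen-interface example
rec = ActionRecord(
    action="UA1: Press look up button",
    answers=[(True, "Up button scrolls viewpoint upwards"),
             (True, "Button visible in UI panel"),
             (False, "Up/down symbols may be ambiguous to first-time users"),
             (True, "Scrolled viewpoint mimics looking up")],
)
rec.flag_problems(severity="minor", frequency="occasional")
print(len(rec.problems), "problem(s) documented for", rec.action)
```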

When to do a Cognitive Walkthrough
Can be done at any stage in the development process once you have a ‘prototype’ or actual system implementation to work on
– Can be done with a paper prototype
The ‘Wizard of Oz’ name is a reference to ‘Pay no attention to that man behind the curtain’ – a person plays the role of the yet-to-be-written software system
– Can be done with a shrink-wrapped product
Focus on key tasks
– Things that are done by most users
– “Critical success factors” of the system
– Consider something that matches the name of the product: if it’s an email client, do a cognitive walkthrough of the task of writing and sending an email (not of the user login dialogue)

‘Real users’ and statistics
What’s a ‘real user’ anyway?
– In lecture 4 we said participants should be ‘real users’ – “actual users who are asked to perform realistic and representative tasks using a proposed design”
– The idea is to get the reaction of people most like your future users
– This provides the most valid feedback for most usability test designs
Discount usability methods explicitly contradict this representativeness
– Heuristic evaluation and cognitive walkthrough are intended to be done by ‘experts’ (hard to say who an ‘expert’ is, but it’s different from aiming for typical users)
– Talk-aloud could use representative users, but doesn’t have to

‘Real users’ and statistics (contd.)
For statistical validity you want a random sample of your future user base
– So, actually not all perfectly average users, but a sample proportionally representing the range of relevant user attributes, e.g. skills and experience
– The accepted way to do this is to sample randomly from a population: all potential participants have an equal probability of being selected to participate; but it’s confounding if some people decline to participate [maybe the busiest and most talented ones!]
– You actually need a really large sample to support statistical inferences (e.g. a 95% confidence interval of estimated mean task time)
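To make the sample-size point concrete, here is a sketch of the 95% confidence interval for mean task time, computed from hypothetical timing data with the normal approximation. With the handful of participants typical of discount testing, the interval comes out wide, which is exactly why quantitative claims need large samples:

```python
# Sketch: 95% confidence interval for mean task time (normal
# approximation; hypothetical data). A t-based interval would be
# wider still for a sample this small.
import math

task_times = [48.2, 55.1, 61.7, 44.9, 58.3, 52.0, 49.8, 63.5]  # seconds

n = len(task_times)
mean = sum(task_times) / n
sd = math.sqrt(sum((t - mean) ** 2 for t in task_times) / (n - 1))
sem = sd / math.sqrt(n)          # standard error of the mean
half_width = 1.96 * sem          # z value for a 95% interval

print(f"mean = {mean:.1f}s, 95% CI = "
      f"[{mean - half_width:.1f}, {mean + half_width:.1f}]")
```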

Real users and statistics (contd.)
In discount usability testing we don’t pretend to have a random sample or an adequate sample size for reliable quantitative estimates
But qualitative data is valid
– If one person is observed to make an error a certain way, then another could
– If one person thinks your word (or font, or colour) choice is bad or inconsistent, another could too

Relationship of Usability Testing and UI Design
General maxim: “Test early and often”
– It’s cheaper to learn about a problem sooner in the development process than later
Fits many parts of the development lifecycle
– Discovery: usability testing of the existing system(s) to understand opportunities / define requirements
– Design: evaluate the design (esp. lo-fi / paper prototypes) and iteratively refine based on feedback
– Pre and post deployment: make sure you’ve got it right; set the agenda for perfective maintenance

Relationship to design (contd.)
Increasing need for evaluation of matured, deployed products
– There’s a lot of software out there
– Almost always smarter to buy than to build if there’s something already written
– Evaluate potential products (possibly at a friend site that’s already deployed)
– The backers of matured products have big budgets to stay on top: imagine the usability testing budgets of Amazon, Google, Apple, Nokia

Usability testing and research
Usability testing can also be applied to more basic research
– Answer questions of relative performance of UI options in particular contexts: e.g. determine whether circular menus outperform vertical rectangular menus for a given selection task
– Test the usability of novel UI devices: how do people react to a force feedback control for a virtual world navigation task?
– Teach us about human beings per se: e.g. how do people react to different colours for highlighting of search targets?
At this point usability testing merges with psychology and human physiology research

Example
Your usability group has been approached to evaluate a new eBay/Trademe type service. It’s been big in Korea, and they have a prototype version that’s been converted for the New Zealand market. How do you propose to assess it?

Answer part 1
Phase 1. Try some discount usability assessment
– Devise some key tasks (e.g. 1. posting something for sale, and 2. bidding on something)
– Try a talk-aloud protocol with whoever you can grab (will probably uncover any gross problems)
– Try heuristic evaluation on Nielsen’s 10 with maybe 3-5 people
– Probably try a cognitive walkthrough
Iterate on this with the design/development team until it’s looking pretty good

Answer part 2
Phase 2. Decide the key success factors to be assessed in a usability test
– E.g. 1. people don’t make mistakes that impact their intent re purchase and sale, and 2. they like it
Recruit a group of users (offer them a movie voucher or such)
Design the protocol (maybe as bidder and then as seller), probably with no ‘training’ – just orient them to their role and sit them at the computer
Might be wise to randomise users to the new system or Trademe (the chief competitor – see if this system has an ‘edge’)
– Carryover effects (boredom, practice) would probably be too strong to allow a repeated measures design, alas
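Randomising participants between the two systems is straightforward to script. The sketch below (hypothetical participant IDs and group sizes) allocates a balanced between-subjects design, with a fixed seed so the allocation can be reproduced:

```python
# Sketch: balanced random assignment of participants to the new system
# vs the competitor, for a between-subjects comparison.
import random

participants = [f"P{i:02d}" for i in range(1, 13)]         # 12 recruited users
conditions = ["new system", "Trademe"] * (len(participants) // 2)

rng = random.Random(42)    # fixed seed -> reproducible allocation
rng.shuffle(conditions)

assignment = dict(zip(participants, conditions))
for p, c in assignment.items():
    print(p, "->", c)
```

Balancing the groups (rather than flipping a coin per participant) keeps equal numbers in each condition, which helps with a small sample.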

Answer part 3
Set your measures
– Formalise what you mean by a ‘serious error’
– Set up a satisfaction questionnaire (to get qualitative feedback, but with a couple of Likert-scale questions that define the satisfaction score)
– Measure task time, too
Might make a learnability/time composite objective like “Can 90% of users successfully place a bid within 10 minutes of effort after having spent 10 minutes browsing for-sale postings?”
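The composite objective above reduces to a simple proportion check. Here is a sketch against hypothetical per-user results (a success flag and minutes to the first successful bid):

```python
# Sketch: check the composite learnability objective -- did at least 90%
# of users successfully place a bid within 10 minutes? (hypothetical data)
results = [  # (succeeded, minutes_to_first_bid)
    (True, 6.5), (True, 8.0), (True, 4.2), (False, None), (True, 9.9),
    (True, 7.1), (True, 5.5), (True, 8.8), (True, 3.9), (True, 6.0),
]

met = sum(1 for ok, mins in results if ok and mins <= 10.0)
proportion = met / len(results)

print(f"{proportion:.0%} of users placed a bid within 10 minutes")
print("objective met" if proportion >= 0.90 else "objective NOT met")
```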

Answer part 4
After the study
– Explore the video/screen log of every serious error
– Present the profile of errors, satisfaction and task time (along with a summary of the qualitative feedback) to your client
Do they have a usability edge on TradeMe? (or at least come up similar, if they have some other advantage that will attract users)
If not, they’d better re-think their market entry, or go for a significant UI redesign