Planning an evaluation
Factors to be considered in planning an evaluation:
- purpose - who are the stakeholders?
- laboratory vs field studies
- qualitative vs quantitative measures
- information provided
- immediacy of response
- intrusiveness
- resources

UniS Department of Computing Dr Terry Hinton 1/2/06
CS285 Usability Engineering - Evaluation 1

Evaluation in general - two forms:
- Formative: as part of the design and implementation process
- Summative: after the system has been developed

Goals of an evaluation - to test the usability and functionality of an interactive system:
- assess the extent of the system's functionality
- assess its usability - see the 10 heuristics
- assess the effect of the interface on the user
- identify any specific problems with the system or with its use
Evaluation Methods for Interactive Systems

- Analytical methods
- Expert evaluation
- Experimental methods
- Observational methods
- Query methods
Evaluation Methods for Interactive Systems

Analytical methods
- predict performance based on a model, e.g. analysis of a cash dispenser based on the number of keystrokes required, the time needed to press a key, the time needed to think and the time needed to react

Expert evaluation
- use an expert to carry out a cognitive walkthrough - obtains a fast result

Experimental methods
- design experiments in the laboratory, e.g. speed of recognition of key words depending on font and colour
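The cash-dispenser analysis can be sketched as a simple keystroke-level calculation. The operator times and keystroke counts below are illustrative assumptions, not measured values:

```python
# Keystroke-level sketch of the cash-dispenser analysis.
# Operator times (seconds) are assumed for illustration.
T_KEYSTROKE = 0.28   # time to press one key
T_THINK     = 1.35   # mental preparation before a step
T_REACT     = 0.40   # time to react to a screen prompt

def predicted_time(keystrokes: int, think_steps: int, prompts: int) -> float:
    """Predict total task time from counts of user actions."""
    return (keystrokes * T_KEYSTROKE
            + think_steps * T_THINK
            + prompts * T_REACT)

# Withdraw cash: 4-digit PIN + 3 keys for the amount + 1 confirm key,
# with 3 thinking steps and 2 screen prompts to react to (assumed).
total = predicted_time(keystrokes=8, think_steps=3, prompts=2)
print(f"predicted task time: {total:.2f} s")
```

Changing the assumed operator times lets the analyst compare alternative key layouts before anything is built.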
Observational Methods - in the field

Users:
- expert users
- typical users
- novice users

Types of use:
- direct use
- discretionary use

Tasks:
- cognitive based
- interactive
- psychological attitude involved
Query Methods

Survey:
- opinions and attitudes - is the system easy and enjoyable to use?
- skills and experience
- contextual issues

Collection of data:
- interviews
- questionnaires
Usability

Usability defined: usability = efficiency + effectiveness + enjoyment
- a single usability parameter cannot be computed, so J. Nielsen proposed 10 usability heuristics (details later)
- heuristics - a set of rules for solving problems other than by an algorithm (Collins English Dictionary, 2nd Ed.)
Experimental methods - in the laboratory

Design an experiment for laboratory conditions:
- make a hypothesis - it must be testable
- select your subjects
- select the variables - change one at a time
- statistical measures - time, speed, number of events, sample size etc.
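Changing one variable at a time and summarising the measurements can be sketched as follows; the timing data are invented for illustration, and only the font is varied:

```python
import statistics

# Hypothetical recognition times (seconds) for key words in two fonts,
# with colour and all other variables held constant.
serif      = [2.1, 2.4, 1.9, 2.6, 2.2, 2.3]
sans_serif = [1.8, 1.7, 2.0, 1.6, 1.9, 1.8]

def summarise(sample):
    """Descriptive statistics for one experimental condition."""
    return {
        "n": len(sample),
        "mean": statistics.mean(sample),
        "stdev": statistics.stdev(sample),  # sample standard deviation
    }

for name, data in [("serif", serif), ("sans-serif", sans_serif)]:
    s = summarise(data)
    print(f"{name}: n={s['n']} mean={s['mean']:.2f}s sd={s['stdev']:.2f}s")
```

The sample size and spread reported here are exactly the "statistical measures" the slide asks the experimenter to plan for before testing the hypothesis.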
Observational techniques - in the field

Observe behaviour:
- arbitrary activity, and/or
- set tasks

Task analysis:
- specifying a set of tasks gives insight into usability
- specifying a goal gives insight into the cognitive strategy used

Record - actions, time, errors etc.
Observational techniques - in the field

Verbal protocol - think aloud

Protocol analysis:
- paper and pencil
- audio recording
- video recording
- computer logging
- user notebooks

Automatic protocol analysis tools

Post-event protocol - teach-back or post-task walkthroughs
Query techniques - Attitudinal Data

Interviews:
- design an interview schedule

Questionnaires:
- general
- open-ended
- scalar
- multi-choice
Usability Metrics - what you can measure

- time to complete a task (average, maximum)
- percentage of users completing the task
- number of errors (average, maximum)
- type of errors - fatal or non-fatal
- strategy used to recover from errors:
  - use the manual
  - "work around" cognitive strategy
  - ask a colleague
  - use help
  - give up!
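The metrics above can be computed directly from session logs. The records here are made up for illustration; each one pairs a task time, an error count and a completion flag:

```python
# Hypothetical session logs: (seconds taken, error count, completed?).
sessions = [
    (95, 0, True), (120, 2, True), (200, 5, False),
    (88, 1, True), (150, 3, True),
]

completed = [s for s in sessions if s[2]]
times = [s[0] for s in completed]      # time only counts for finished tasks
errors = [s[1] for s in sessions]      # errors count for every attempt

metrics = {
    "avg_time": sum(times) / len(times),
    "max_time": max(times),
    "completion_rate": 100 * len(completed) / len(sessions),
    "avg_errors": sum(errors) / len(sessions),
    "max_errors": max(errors),
}
print(metrics)
```

Whether abandoned attempts should contribute to the timing figures is a design choice; here they are excluded, since an unfinished task has no meaningful completion time.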
Usability Metrics - 5 minute exercise

You are paying for the design of a new mobile phone - what are the six most important usability metrics that you would insist on being met?
Ten usability heuristics by J. Nielsen

Visibility of system status
- the system should keep users informed about what is going on

Match between system and the real world
- the system should speak the users' language - words, phrases and concepts familiar to the user (rather than system-oriented terms)

User control and freedom
- users often choose system functions by mistake - support undo/redo

Consistency and standards
- follow platform conventions (users shouldn't have to wonder whether different words, situations or actions mean the same thing)
Ten usability heuristics

Error prevention
- better than error messages

Recognition rather than recall
- make objects, actions and options visible (users shouldn't have to remember information)

Flexibility and efficiency of use
- accelerators (unseen by novice users) may speed up interaction for expert users - the system allows users to tailor frequent actions

Aesthetic and minimalist design
- simplicity is beauty
Ten usability heuristics

Help users recognise, diagnose, and recover from errors
- express error messages in plain language (no codes), indicate the problem, and suggest a solution

Help and documentation
- ideally, it is better if the system can be used without documentation, but most often it is necessary to provide help and documentation
- such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too long; examples are always helpful
Shneiderman's 8 Golden Rules of Dialogue Design

Heuristics for dialogue design and evaluation (p189):
- Strive for consistency
- Enable frequent users to use short cuts
- Offer informative feedback
- Design dialogues to yield closure
  - actions need a beginning, a middle and an end; informative feedback at the end enables closure to take place - the user can move on
Shneiderman's 8 Golden Rules of Dialogue Design

- Offer simple error handling
- Permit easy reversal of actions
- Support internal locus of control
  - experienced operators like to know they are in charge; the user should initiate actions rather than being on the receiving end of them
- Reduce short-term memory load
  - George Miller's (1956) "The magical number seven, plus or minus two"
Questionnaire Design

A simple checklist - for each item (e.g. copy, paste) the user ticks yes, no or don't know.

Example of a 6-point rating scale (avoid a middle value), running from "Very useful" to "Of no use".
Questionnaire Design

An example of a Likert scale:
  strongly agree | agree | slightly agree | neutral | slightly disagree | disagree | strongly disagree

An example of a semantic differential scale:
  easy   extremely | slightly | neutral | slightly | extremely   difficult
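Scalar questions like the Likert scale above yield numeric data once each label is coded. A minimal scoring sketch, with made-up responses:

```python
# Numeric coding for the slide's 7-point Likert scale.
SCALE = {
    "strongly agree": 7, "agree": 6, "slightly agree": 5,
    "neutral": 4, "slightly disagree": 3, "disagree": 2,
    "strongly disagree": 1,
}

# Hypothetical responses to one questionnaire item.
responses = ["agree", "strongly agree", "neutral", "agree", "slightly disagree"]

scores = [SCALE[r] for r in responses]
mean_score = sum(scores) / len(scores)
print(f"mean attitude score: {mean_score:.1f} / 7")
```

Averaging Likert codes treats an ordinal scale as interval data - a common but debatable shortcut; reporting the full frequency distribution (as the Presentation of Results slide does) avoids that assumption.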
Questionnaire Design

An example of a ranked-order question - place the following commands in order of usefulness using the numbers 1 to 4, 1 being the most useful:
- copy
- paste
- group
- clear
Analysis of Results

Question - what is the correlation between these data?
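The slide's data did not survive extraction, but the correlation question can be answered with a plain Pearson coefficient. The paired data below are invented for illustration (e.g. hours of prior experience vs. errors made):

```python
# Made-up paired observations: prior experience (hours) vs. errors made.
experience = [1, 2, 4, 5, 8, 10]
errors     = [9, 8, 6, 5, 3, 1]

def pearson_r(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

print(f"r = {pearson_r(experience, errors):.3f}")
```

A value near -1 would indicate that errors fall steadily as experience rises; a value near 0 would indicate no linear relationship.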
Analysis of Results - Cross Tabulation

                            Number of Errors
  Method of Learning    Low   Medium   High   Sub Total
  Reading                 3        2      8
  Instruction             6        5             19
  Trial & Error                    9             16
  Friend                           7     12      21
  Total                  13       22     29      64

Useful website:
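A cross-tabulation like this can be built directly from raw observation records with a counter. The records below are made up for illustration and do not reproduce the slide's figures:

```python
from collections import Counter

# Hypothetical raw records: each observation pairs a learning
# method with the error band the user fell into.
observations = [
    ("Reading", "Low"), ("Reading", "Medium"), ("Instruction", "Low"),
    ("Trial & Error", "High"), ("Friend", "Medium"), ("Reading", "Low"),
    ("Instruction", "High"), ("Trial & Error", "Medium"),
]

table = Counter(observations)            # (method, band) -> count
methods = sorted({m for m, _ in observations})
bands = ["Low", "Medium", "High"]

for m in methods:
    row = [table[(m, b)] for b in bands]
    print(f"{m:>13}: {row} subtotal={sum(row)}")
```

Each row subtotal corresponds to the Sub Total column above, and the grand total is simply the number of observations.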
Presentation of Results

Provide a sample questionnaire with the data inserted into the cells, e.g.:

  Strongly agree   Agree   Slightly agree   Slightly disagree   Disagree   Strongly disagree
              25      15               13                   4          2                   3
Evaluation in the Design Phase

Participatory design - the user is involved in the whole design life cycle.

A number of methods help convey information between user and designer:
- brainstorming
- storyboarding
- workshops
- pencil & paper exercises
- role playing
Evaluating the design

Participatory design - a usability engineer on the design team:
- brainstorming
- storyboarding
- paper & pencil design
- cognitive walkthrough
- heuristic evaluation - rule of thumb
Evaluating the design

Review-based evaluation:
- compare the design with previous good practice

Model-based evaluation:
- theoretical assessment of usability metrics, e.g. an ATM dispensing cash - calculate the time for a typical user to withdraw cash

Implementation - design to be discussed later in the module.
Choosing an evaluation method

Ref: Dix, A., Finlay, J., Abowd, G. and Beale, R. (1994)