COMP 7620 Evaluation (Chapter 9)
©N. Hari Narayanan, Computer Science & Software Engineering, Auburn University


Slide 2: Evaluation
– A very significant aspect of HCI design that separates it from SE
– Tests usability and usefulness
– Can be done in the lab and/or in the field
– Evaluate the design (early) and the implementation (later)
– Formative evaluation takes place during design; summative evaluation takes place after the system has been deployed and used by customers

Slide 3: Goals
– Assess the functionality and usefulness of the interactive system:
  – Does it match the requirements specification?
  – Does it match the expectations of designers and users?
  – Does it meet the performance goals that were set out?
– Assess the effect of the interface on the user: usability
– Identify specific problems with the system and its interface, and develop remedies

Slide 4: Styles
Laboratory Studies
– Advantages
  – Specialized equipment can be used
  – The environment can be controlled
– Disadvantages
  – Cost
  – Unnatural/intimidating
  – Difficult to observe users in their "natural state"
– Use when
  – Collecting data in the actual environment is impractical
  – The environment needs to be controlled
  – Experimental psychology techniques are to be used

Slide 5: Styles
Field Studies
– Advantages
  – Users are observed in their "natural" setting
  – The effects of context on system use can be observed
  – Suited to longitudinal studies that run over several days/weeks/months
– Disadvantages
  – Cost
  – The environment cannot be controlled: noise, interruptions, etc.
– Use when
  – A longitudinal study is required
  – Data on actual conditions of use is desired
  – Strict control of the experimental conditions is unimportant

Slide 6: Techniques
– Cognitive Walkthrough
– Heuristic Evaluation
– Review-based Evaluation
– Model-based Evaluation
– Usability Evaluation
– Observational Methods
– Query Techniques
– Experimental Evaluation

Slide 7: Cognitive Walkthrough
– Like a code inspection/walkthrough in SE
– Evaluates the design, through a prototype, for how well it supports the user in learning how to do the task by exploration
– Usually performed by an expert who "walks through" the design to identify potential problems

Slide 8: Cognitive Walkthrough
– Starts with the expert being provided with:
  – A prototype of the system
  – The task, action and user descriptions developed during design
– For each task identified in the task analysis, the walkthrough considers:
  – What impact will the interaction have on the user?
  – What cognitive processes are required?
  – What learning/interaction problems may occur?

Slide 9: Cognitive Walkthrough
– For each action the user must carry out to accomplish a goal, the expert asks:
  – Can the user recognize that the correct action is required?
  – Is the action (i.e., how to carry it out) visible at the interface?
  – Is there a difference between its intended effect (what the user wants to happen) and its actual effect (what the system does)?
  – Can the user carry out the action successfully?
  – Can the user then successfully interpret the feedback provided by the system?
– A negative answer indicates a potential usability problem
– Understand the example in
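The five questions above are asked once per user action, and any "no" flags a potential problem. A minimal sketch of that bookkeeping in Python (the question wording is from the slide; the function and action names are hypothetical):

```python
# The walkthrough questions from the slide, encoded as a checklist.
CW_QUESTIONS = [
    "Can the user recognize that the correct action is required?",
    "Is the action visible at the interface?",
    "Does the intended effect match the actual effect?",
    "Can the user carry out the action successfully?",
    "Can the user interpret the system's feedback?",
]

def walkthrough(action, answers):
    """`answers` holds one yes/no boolean per question for this action.
    Returns the questions answered 'no', i.e. potential usability problems."""
    return [q for q, ok in zip(CW_QUESTIONS, answers) if not ok]

# Example: the expert judges that the 'Save' action is not visible.
problems = walkthrough("press Save", [True, False, True, True, True])
for p in problems:
    print("Potential problem:", p)
```

In practice the expert records such answers for every action in every task identified by the task analysis, so even a simple tally like this grows quickly.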

Slide 10: Heuristic Evaluation
– A set of usability criteria (called heuristics) is identified (see the list on p. 413), e.g.:
  – The system behaves in a predictable way in response to all user actions
  – The system behaves in a consistent way
  – The system provides feedback for all correct and incorrect actions
– The design and/or prototype is then examined by experts to see whether these heuristics are violated
– This is called a "usability inspection" technique

Slide 11: Heuristic Evaluation
– Select the heuristics from those proposed in the literature by experts such as Jakob Nielsen
– Develop system- and task-specific questions to verify whether the design/prototype satisfies each selected heuristic
– Have multiple experts independently evaluate the system/prototype using these questions
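Because the experts evaluate independently, their findings must be merged afterwards. A minimal sketch of that aggregation step in Python (the evaluator names, heuristic labels, and 0–4 severity scale are illustrative assumptions, not from the text):

```python
from collections import defaultdict

def aggregate_violations(reports):
    """Merge per-expert violation reports into a count and mean severity
    per heuristic. `reports` maps expert -> list of (heuristic, severity),
    where severity is an illustrative 0-4 rating."""
    merged = defaultdict(list)
    for expert, violations in reports.items():
        for heuristic, severity in violations:
            merged[heuristic].append(severity)
    # Rank heuristics by mean severity, worst first.
    return sorted(
        ((h, len(s), sum(s) / len(s)) for h, s in merged.items()),
        key=lambda row: row[2],
        reverse=True,
    )

reports = {
    "expert_a": [("consistency", 3), ("feedback", 2)],
    "expert_b": [("consistency", 4)],
}
for heuristic, count, mean_sev in aggregate_violations(reports):
    print(f"{heuristic}: {count} report(s), mean severity {mean_sev:.1f}")
```

A problem reported by several experts, or with high mean severity, is a stronger candidate for a remedy than a one-off, low-severity report.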

Slide 12: Other Evaluation Techniques
– Review-based Evaluation
  – Review the technical HCI literature to see whether similar designs/systems have been evaluated
  – Not commonly practiced
– Model-based Evaluation
  – An analytical approach in which models developed during the design process (ATN, GOMS, TDH, etc.) are analyzed to discover potential problems
– Usability Specification & Evaluation
  – Selecting, setting target levels for, and measuring specific usability attributes: the usability specification table
  – Already covered; read Chapter 8 of the reference if you haven't yet done so!

Slide 13: Observational Methods
– A class of techniques called "protocol analyses"
– Experimenter note-taking
  – Cheap but limited
– User notebooks
  – Subjective, coarse-grained data
  – But useful user insights
  – Good for longitudinal studies
  – Beepers/PDAs used as reminders
– Audio-taping
  – May miss actions, gestures, etc.
  – Transcription is difficult

Slide 14: Protocol Analyses
– Video-taping
  – A more complete record
  – But special equipment is needed
  – Obtrusive
  – Transcription is difficult
– Computer logging
  – Automatic and unobtrusive
  – But produces voluminous data
– Some combination of these is typically used
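Computer logging typically means the system itself timestamps every user action as it happens. A minimal sketch of such a logger in Python (the class, event names, and one-JSON-record-per-line format are illustrative assumptions):

```python
import json
import time

class InteractionLog:
    """Collects timestamped user-action records automatically and
    unobtrusively; the volume of data is the analyst's problem later."""

    def __init__(self):
        self.records = []

    def log(self, event, **details):
        # One record per user action, stamped with wall-clock time.
        self.records.append({"t": time.time(), "event": event, **details})

    def dump(self, path):
        # One JSON object per line keeps even voluminous logs easy to
        # parse incrementally during later analysis.
        with open(path, "w") as f:
            for rec in self.records:
                f.write(json.dumps(rec) + "\n")

log = InteractionLog()
log.log("click", target="save_button")
log.log("keypress", key="Escape")
print(len(log.records), "records")
```

The unobtrusiveness comes from instrumenting the interface once; unlike taping, no equipment or transcription is needed, but the raw log rarely explains *why* an action was taken, which is why logging is usually combined with the other methods.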

Slide 15: Concurrent Think-aloud Protocols
– Method
  – The user is observed while doing the task
  – The user is asked to say aloud what he/she is doing, why, and what he/she is thinking/expecting to happen, etc.
– Advantages
  – Simple technique
  – Can provide insight into the user's cognitive processes
  – Can reveal the causes of errors
– Disadvantages
  – Highly subjective
  – Voluminous raw data
  – Talking may alter performance

Slide 16: Cooperative Protocols
– Method
  – A variation of think-aloud in which the subject cooperates with the experimenter in asking and answering questions
– Advantages
  – The advantages of think-aloud
  – Less constrained than think-aloud
  – The user is encouraged to criticize the system and provide clarifications
– Disadvantages
  – The disadvantages of think-aloud

Slide 17: Retrospective Protocols
– Also called a post-task walkthrough
– Method
  – The user reflects on what happened after doing the task
  – The user is asked questions to fill in details
  – Sometimes combined with a during-task protocol collection
– Advantages
  – The experimenter can focus on relevant incidents
  – Task interruption due to talking is avoided
– Disadvantages
  – Memory limitations
  – Post-hoc interpretation of what happened is likely to be subjective

Slide 18: Query Techniques
– Advantages: informal, cheap, simple
– Main disadvantage: subjective
– Structured Interviews
  – The experimenter questions each user after they have worked with the system, using prepared questions; written or oral
  – Advantages
    – Different questions can be used for different users and tasks
    – Issues can be explored fully
    – Provides significant user input
  – Disadvantages
    – Time consuming

Slide 19: Query Techniques
– Questionnaires/Surveys
  – A fixed, typically multiple-choice, written questionnaire given to users to fill out
  – Careful design of the questions and the data analysis methods is needed
  – Advantages
    – Quick; useful for large numbers of users
    – Data can be quantified and statistical analyses are possible
    – Provides significant user input
  – Disadvantages
    – Less flexible than interviews
    – Less deeply probing

Slide 20: Query Techniques
– Questionnaires/Surveys
  – Careful design of the questions and the data analysis methods is needed
  – Question styles:
    – General: to characterize the subject
    – Open-ended: to elicit opinions/suggestions
    – Scalar: Likert scale
    – Multiple-choice
    – Ranked
– Examples
  – See the worked exercise on p. 433 of the text
  – See p. 228 of the reference
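Scalar (Likert) items are one place where the "quantify data and run statistical analyses" advantage applies directly. A minimal sketch in Python (the item names, the 1–5 scale, and the sample responses are illustrative assumptions); since Likert responses are ordinal rather than interval data, the median and mode are safer summaries than the mean:

```python
from statistics import median, mode

# Hypothetical responses on a 1-5 Likert scale
# (1 = strongly disagree ... 5 = strongly agree), one list per item.
responses = {
    "easy_to_learn": [4, 5, 4, 3, 4],
    "error_messages_clear": [2, 1, 3, 2, 2],
}

def summarize(items):
    """Per-item median and mode; ordinal-safe summaries for Likert data."""
    return {q: {"median": median(vals), "mode": mode(vals)}
            for q, vals in items.items()}

for item, stats in summarize(responses).items():
    print(item, stats)
```

With larger respondent counts the same per-item lists feed directly into ordinal statistical tests, which is what makes fixed-format questionnaires quick to analyze at scale compared with interviews.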