Usability Evaluation.

Evaluation with Users
- Big investment, big potential.
- Remember: you can identify and solve many problems with cognitive walkthroughs and heuristic evaluation before you make the investment.
- Many issues and techniques, but they share a common foundation.

Testing with Users
- Unlike the "userless" inspection methods, there are not many elaborate protocols to learn here.
- Most of the challenge in running a user-based test lies in PREPARING for it.

A Test Plan Checklist, 1
- What is the goal of the test?
- What specific questions do you want to answer?
- Who will be the experimenter?
- Who are the users going to be?
- How many users are needed?
- What kind of instruction will the users be given?
- What tasks will you ask the users to perform?
- What criteria will be used to determine the end of each task?

A Test Plan Checklist, 2
- What aids will be made available to users?
- To what extent will the experimenter be allowed to help the users?
- What data will be collected, and how will it be analyzed?
- What is the criterion for judging the interface a success?

A Test Plan Checklist, 3
- Where and when will the evaluation be done?
- How long will the evaluation take?
- What computer support and software are needed?
- What is the initial state of the system?
- Are there any system/network load requirements?
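
To keep the checklist actionable, one option (my own sketch, not part of the original slides) is to capture the plan as a structured record in Python; every field name below is illustrative rather than standard.

    # A minimal sketch of a usability test plan as a structured record.
    # Field names are illustrative, not a standard; adapt them to your checklist.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class TestPlan:
        goal: str                      # overall goal of the test
        questions: List[str]           # specific questions to answer
        experimenter: str              # who runs the sessions
        user_profile: str              # who the users will be
        num_users: int                 # how many users are needed
        instructions: str              # what users are told up front
        tasks: List[str]               # tasks users will be asked to perform
        end_criteria: str              # when a task counts as finished
        aids_available: List[str] = field(default_factory=list)
        help_policy: str = "small hints only if the user is bogged down"
        data_collected: List[str] = field(default_factory=list)
        success_criterion: str = ""
        location: str = ""
        duration_minutes: int = 60

    plan = TestPlan(
        goal="Can first-time users complete a purchase?",
        questions=["Where do users hesitate during checkout?"],
        experimenter="J. Doe",
        user_profile="first-time online shoppers",
        num_users=5,
        instructions="Think aloud; we are testing the interface, not you.",
        tasks=["Find a product under $20 and buy it"],
        end_criteria="Order confirmation reached, or 10 minutes elapsed",
        data_collected=["task time", "errors", "think-aloud notes"],
        success_criterion="80% of users complete the purchase unaided",
    )
    print(plan.goal)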

The Role of the Experimenter
- Having the right experimenter makes a difference.
- Selecting an appropriate methodology, and one the experimenter is familiar with, significantly influences the quality of the results.
- Knowledge of the system's implementation can come in handy.
- Participating in a usability study can have a profound impact on a designer, even for very simple, informal studies.

The Role of the Experimenter (continued)
- Ensures that the room, computer, etc. are all ready.
- During testing: should not interfere!
  - If a user is bogged down, can give a small hint.
  - If the user is hopelessly off track, can fake an equipment problem.

Ethical Treatment of Subjects
- It is your responsibility to protect subjects from distress and embarrassment; remind them that you are not testing them.
- Informed, voluntary consent: subjects understand that they can quit at any time; explain the test in lay terms.
- Privacy: anonymity, and consent for any use of their image or voice.
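
One practical way to support anonymity, offered here as my own illustration rather than anything prescribed by the slides, is to assign anonymous participant codes before any data is shared:

    # Sketch: assign anonymous participant codes so real names never appear
    # in session logs or reports. Keep the mapping itself stored securely
    # and separately from the test data.
    import secrets

    def anonymize(participants):
        """Return a {real_name: code} mapping; share only the codes."""
        return {name: f"P{i + 1:02d}-{secrets.token_hex(2)}"
                for i, name in enumerate(participants)}

    codes = anonymize(["Alice Example", "Bob Example"])
    print(list(codes.values()))   # e.g. ['P01-3f2a', 'P02-9c41']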

Which Users?
- As close to real users as possible.
- If real users are scarce, try surrogates.

How Many Users?
- There are huge individual differences between users, up to a factor of 10.
- A single point determines an infinite number of lines: one user alone tells you very little about the range of behaviour.
- Still, some data is better than none.
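
To make the factor-of-10 variability concrete, here is a small simulation (my own illustration with made-up numbers, not from the slides) showing how the estimate of mean task time settles down only as more users are added:

    # Sketch: how the estimated mean task time behaves as more users are tested,
    # assuming task times that vary by roughly a factor of 10 across users.
    import random
    import statistics

    random.seed(1)
    # Hypothetical per-user task times in seconds (fastest ~30 s, slowest ~300 s).
    task_times = [random.uniform(30, 300) for _ in range(20)]

    for n in (1, 3, 5, 10, 20):
        sample = task_times[:n]
        mean = statistics.mean(sample)
        spread = statistics.stdev(sample) if n > 1 else float("nan")
        print(f"n={n:2d}  mean={mean:6.1f} s  stdev={spread:6.1f} s")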

Which Tasks?
- Keep them close to the real tasks.
- You may need to shorten some for time reasons.
- You may need to provide users with background information.

When in the Process?
- Remember: early is better.
- Formative vs. summative evaluation:
  - During design → design modifications.
  - After design → evaluation of the "finished" product, comparison to a baseline, rigorous statistics.
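
As a hedged illustration of what "comparison to a baseline, rigorous statistics" can look like in a summative test (the numbers below are invented), a two-sample t-test on task completion times:

    # Sketch: summative comparison of a redesign against a baseline design,
    # using Welch's two-sample t-test on task completion times (made-up data).
    from scipy import stats

    baseline_times = [182, 210, 165, 240, 198, 175, 220, 205]   # seconds, old design
    redesign_times = [150, 140, 172, 131, 160, 145, 158, 149]   # seconds, new design

    t_stat, p_value = stats.ttest_ind(baseline_times, redesign_times, equal_var=False)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
    if p_value < 0.05:
        print("Task times under the redesign differ significantly from the baseline.")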

What to Measure
- Process data (qualitative): problems, questions, reactions; what users are thinking.
- Bottom-line data (quantitative): mostly for later usability measurement; not as useful early in design.
- Asking users direct questions is problematic: users will answer whether or not they actually know.
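
A short sketch (mine, with hypothetical field names) of turning per-session logs into bottom-line numbers such as success rate and mean task time:

    # Sketch: computing bottom-line (quantitative) measures from session records.
    # The record format is hypothetical; adapt it to whatever you actually log.
    from statistics import mean

    sessions = [
        {"user": "P01", "task": "checkout", "seconds": 182, "errors": 2, "completed": True},
        {"user": "P02", "task": "checkout", "seconds": 240, "errors": 5, "completed": False},
        {"user": "P03", "task": "checkout", "seconds": 150, "errors": 1, "completed": True},
    ]

    completed = [s for s in sessions if s["completed"]]
    success_rate = len(completed) / len(sessions)
    print(f"Success rate: {success_rate:.0%}")
    print(f"Mean time on successful attempts: {mean(s['seconds'] for s in completed):.0f} s")
    print(f"Mean errors per session: {mean(s['errors'] for s in sessions):.1f}")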

Running the test
- Preparation
- Introduction
- Test
- Debriefing

Running the test - Preparation
- Room ready?
- Equipment ready?
- Interface in the start state?
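
One way to make "interface in the start state" repeatable between sessions, offered as my own sketch with hypothetical paths, is a small reset script run before each participant arrives:

    # Sketch: reset the prototype to a known start state before each session.
    # The directory names are hypothetical; point them at your own prototype data.
    import shutil
    from pathlib import Path

    CLEAN_STATE = Path("fixtures/clean_profile")   # pristine copy, prepared once
    WORKING_DIR = Path("prototype/profile")        # state the prototype actually uses

    def reset_start_state():
        if WORKING_DIR.exists():
            shutil.rmtree(WORKING_DIR)             # discard the previous user's state
        shutil.copytree(CLEAN_STATE, WORKING_DIR)  # restore the pristine copy

    reset_start_state()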

Running the test - Introduction
Cover the following with the user:
- We are evaluating the interface, not the user.
- We have no personal stake in the outcome.
- The released version will differ from what they see today.
- Confidentiality reminder: both the system and the results.
- Participation is voluntary.
- They are welcome to ask questions.
- Specific instructions for the session.
- "Any questions?"

Running the test - During the Test
- Refrain from interacting with the user, except when the user is clearly stuck.
- If several observers are present, designate one as the lead.

Running the test - Debriefing
- Fill out any questionnaires.
- Ask follow-up questions.
- Discussion.
- "Any other comments?"
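
The slides do not prescribe a particular questionnaire; as one widely used option (my example, with made-up responses), a post-test System Usability Scale (SUS) form can be scored like this:

    # Sketch: scoring a post-test System Usability Scale (SUS) questionnaire.
    # SUS is not mandated by these slides; it is just one common choice.
    def sus_score(responses):
        """responses: ten ratings, each 1-5, in standard SUS item order."""
        assert len(responses) == 10
        total = 0
        for i, r in enumerate(responses):
            # Odd-numbered items are positively worded, even-numbered negatively worded.
            total += (r - 1) if i % 2 == 0 else (5 - r)
        return total * 2.5   # scale the raw 0-40 total to 0-100

    participant_responses = [4, 2, 5, 1, 4, 2, 5, 2, 4, 1]    # made-up example
    print(f"SUS score: {sus_score(participant_responses)}")   # -> 85.0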