Part 1: Intro; Part 2: Requirements; Part 3: Design
Chapter 20: Why evaluate the usability of user interface designs?
Chapter 21: Deciding on what you need to evaluate: the strategy
Chapter 22: Planning who, what, when, and where

Does the Interface Meet the Usability Requirements?
◦ Effective
◦ Efficient
◦ Engaging
◦ Error tolerant
◦ Easy to learn

Exploring Other Concerns in Evaluations
◦ Why users are unable to complete tasks easily
◦ Is the UI developed for all levels of users?
◦ Are all design features acceptable to users?

Jakob Nielsen's set of heuristics
• Visibility of system status
• Match between system and the real world
• User control and freedom
• Consistency and standards
• Error prevention
• Recognition rather than recall
• Flexibility and efficiency of use
• Aesthetic and minimalist design
• Help users recognize, diagnose, and recover from errors
• Help and documentation
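The slides list the heuristics only; as a minimal illustrative sketch (the record fields, the example finding, and the 0–4 severity scale are assumptions added here, not content from the slides), an evaluator's heuristic-evaluation findings could be logged and summarized like this:

```python
from collections import Counter
from dataclasses import dataclass

NIELSEN_HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognize, diagnose, and recover from errors",
    "Help and documentation",
]

@dataclass
class Finding:
    heuristic: str      # which heuristic is violated
    location: str       # screen or dialog where the problem was seen
    description: str    # what the evaluator observed
    severity: int       # 0 = not a problem ... 4 = usability catastrophe (assumed scale)

findings = [
    Finding("Visibility of system status", "CD-ROM menu",
            "No progress indicator while an animation loads", 3),
]

# Count problems per heuristic so the team can see where the design is weakest.
by_heuristic = Counter(f.heuristic for f in findings)
for heuristic in NIELSEN_HEURISTICS:
    print(f"{heuristic}: {by_heuristic.get(heuristic, 0)} problem(s)")
```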

• Expert Review
• Cognitive Walkthrough
• Usability evaluation
  ◦ Participant
  ◦ Observer (logger)
  ◦ Facilitator (briefer/debriefer)
  ◦ Evaluator
  ◦ Evaluation data
    - Raw data
    - Qualified data
    - Quantified data
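A rough sketch of the distinction drawn above between raw, qualified, and quantified data (the field names and example values are illustrative assumptions):

```python
from dataclasses import dataclass

@dataclass
class RawObservation:
    """Raw data: what the observer/logger writes down, verbatim."""
    timestamp_s: float
    note: str

@dataclass
class QualifiedObservation:
    """Qualified data: the raw note interpreted and categorized."""
    raw: RawObservation
    problem_type: str   # e.g. "navigation", "terminology", "feedback"

raw = [RawObservation(42.0, "Participant reread the instructions twice"),
       RawObservation(95.5, "Clicked Back instead of Next")]

qualified = [QualifiedObservation(raw[0], "terminology"),
             QualifiedObservation(raw[1], "navigation")]

# Quantified data: counts and timings derived from the observations.
quantified = {
    "navigation_problems": sum(q.problem_type == "navigation" for q in qualified),
    "terminology_problems": sum(q.problem_type == "terminology" for q in qualified),
}
print(quantified)
```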

◦ Our Running Example: Global Warming
  - The Users for S103
  - Users' Tasks and the Global Warming UI (CD-ROM based)
  - The Domain for the Global Warming UI (paper-based and CD-ROM materials)
  - The Environment for the Global Warming UI (home study or work)
◦ Description of the Global Warming User Interface


• Usability concerns
  ◦ Evidence of visibility, affordance, and feedback
• True users
• Task scenarios with setting


◦ The Process of Usability Evaluation Is Iterative
◦ Techniques for Usability Evaluations
  - User observations
  - Inspections of the user interface (does it conform to usability standards?)
  - Other evaluation techniques: variations of user observation or inspection


◦ Welcome the participant; ask a couple of questions to gauge the participant's attitude and current mood (glass half full or half empty)
◦ Explain the purpose: the system has problems that we have not been able to uncover
◦ Make the participant comfortable; assure the participant they can stop at any time
◦ Rehearse "think out loud" with the participant
◦ Give the participant the task scenarios with setting to complete the tasks; you observe and record from another location – you must give the participant privacy and room
◦ Following completion of the tasks, ask for the participant's views and ask them to complete a post-test questionnaire
◦ Thank the participant

◦ Deciding What to Test
◦ Do You Have to Work within Any Constraints?
◦ Writing Up Your Evaluation Strategy for the Global Warming Evaluation

◦ What Is the Purpose of This Evaluation?
  - Does the system meet the usability requirements and concerns?
  - Qualitative usability requirements: desired features
    "The users on an e-shopping site should be able to order an item easily and without assistance."
    "Railway clerks work in extremely noisy environments, so any warning messages to them should be visually distinct and highlighted on the screens."

  - Quantitative usability requirements / usability metrics
    Explicit measures are used: percentages, timings, or numbers.
    "It should be possible for the users to understand any page of a web site in 10 seconds using a 56K modem."
    "It should take no more than two minutes for an experienced user (one who has domain knowledge and has undergone the prescribed level of training when the new system is introduced) to know how to enter a customer's details in the hotel's database, and to do so with no more than 2 seconds' hesitation."
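A minimal sketch (the function name, sample timings, and pass criterion are illustrative assumptions) of how such a quantitative requirement could be checked against observed session data:

```python
# Observed times, in seconds, for six experienced users entering a
# customer's details (hypothetical sample data).
task_times_s = [95, 110, 130, 102, 141, 88]

REQUIREMENT_S = 120  # "no more than two minutes"

def meets_requirement(times, limit_s, required_pass_rate=1.0):
    """Return True if the required proportion of users meet the time limit."""
    passes = sum(t <= limit_s for t in times)
    return passes / len(times) >= required_pass_rate

print(meets_requirement(task_times_s, REQUIREMENT_S))        # all users must pass
print(meets_requirement(task_times_s, REQUIREMENT_S, 0.8))   # or, e.g., 80% of users
```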

• Prioritizing Usability Requirements and Concerns
  - The usability requirements most important to the success of the system are given priority.
  - Assign values to the five dimensions of usability, the Five Es.
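One way to "assign values to the Five Es" is a simple weighting exercise; this sketch is illustrative only (the weights and example requirements are assumptions, not values from the book):

```python
# Relative importance of each of the Five Es for this product,
# e.g. on a 1-5 scale agreed in the planning meeting (hypothetical weights).
five_e_weights = {
    "Effective": 5,
    "Efficient": 3,
    "Engaging": 2,
    "Error tolerant": 4,
    "Easy to learn": 5,
}

# Requirements tagged with the dimension they support; higher-weighted
# dimensions are evaluated first when time is limited.
requirements = [
    ("Order an item without assistance", "Easy to learn"),
    ("Warning messages visually distinct", "Error tolerant"),
]
for text, dimension in sorted(requirements, key=lambda r: -five_e_weights[r[1]]):
    print(five_e_weights[dimension], dimension, "-", text)
```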

• General Museum Site (example)

◦ What Type of Data Do I Want to Collect?
  - Quantitative data: numeric content
  - Qualitative data: non-numeric content

◦ What Am I Evaluating?
  - Never a finished product without opportunities to improve
◦ What Constraints Do I Have?
  - Money
  - Timescales
  - Availability of usability equipment
  - Availability of participants and the costs of recruiting them
  - Availability of evaluators
◦ Documenting the Evaluation Strategy


◦ Deciding What to Test
◦ Do You Have to Work within Any Constraints?
◦ Writing Up Your Evaluation Strategy for the Global Warming Evaluation

Chapter 22: Planning Who, What, When, and Where

WHO – Users & ULAB Team
• Who is a true user? Users who reflect the different skills, domain knowledge, and system experience of the target user group
• Key true-user questions – what characteristics must be present to ensure a true user
• Number of participants needed for valid results: N = 6
• Create an agreement before the evaluation (protects all; frees the user)
• ULAB team
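The slides state N = 6 without justification; the usual rule of thumb behind small participant counts is Nielsen and Landauer's problem-discovery model, found(n) = 1 − (1 − λ)^n, where λ (the chance that a single participant exposes a given problem) averages about 0.31 in their data. A quick illustrative calculation under that assumption:

```python
LAMBDA = 0.31  # average probability that one participant reveals a given problem
               # (Nielsen & Landauer's reported average; real values vary by study)

for n in range(1, 9):
    found = 1 - (1 - LAMBDA) ** n
    print(f"{n} participants -> ~{found:.0%} of problems found")
# With 5-6 participants the model predicts roughly 85-90% of problems,
# which is why small panels are often considered enough for one iteration.
```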

WHAT – Usability Evaluation
1. Conduct a planning meeting involving the ULAB team:
   - Definition of the usability goals and concerns for the evaluation
   - Establishment of the parts of the evaluation
   - Creation of a user profile
   - Development of a screening questionnaire
   - Creation of task scenarios (task scripts)
   - Determination of the quantitative and qualitative measures (evaluation metrics)
   - Assignment of team roles for the evaluation
   - Establishment of the method of analysis of the data and the baseline criteria
   - Establishment of the equipment list needed for the evaluation and configuration of the evaluator room
2. Complete independent tasks to be performed by team members, including participant selection criteria, development of test materials, etc.

WHAT – Usability Evaluation (continued)
3. Conduct a lab walk-through of the planned evaluation with all team members. Do a rehearsal with one of the team members as the user (this data is not valid). Edit all materials accordingly.
4. Conduct a pilot evaluation. Edit all materials accordingly.
5. Conduct the evaluation with 6 true users. Analyze the results; prepare findings and recommendations. Prepare a final project summary report and deliver the report, all test materials, and raw capture data. Prepare a PowerPoint presentation of key findings.

WHEN – Creating a Timetable
• Establish the project timeline
• Decide the duration of the evaluation sessions
• Create an evaluation timetable – sessions, evaluation, reporting
• Decide the metrics to capture and the baseline criteria (see Table 6.1, Possible Measurement Criteria)
• Prepare task descriptions – the tasks the participant will perform while interacting with the prototype during the evaluation
  ◦ Task cards
  ◦ Example: Task Descriptions for Global Warming
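A small sketch of how the planned metrics, baseline criteria, and task cards might be written down together (the task text, metric names, and baseline figures are illustrative assumptions, not values from Table 6.1):

```python
evaluation_plan = {
    "session_length_min": 60,
    "tasks": [
        {
            "card": "Task 1: Find the section on greenhouse gases and open it",
            "metrics": {
                "time_on_task_s": {"baseline": 180, "direction": "<="},
                "errors": {"baseline": 2, "direction": "<="},
                "assists_from_facilitator": {"baseline": 0, "direction": "<="},
            },
        },
    ],
}

def within_baseline(measured, spec):
    """Compare a measured value with its baseline criterion."""
    if spec["direction"] == "<=":
        return measured <= spec["baseline"]
    return measured >= spec["baseline"]

spec = evaluation_plan["tasks"][0]["metrics"]["time_on_task_s"]
print(within_baseline(150, spec))  # True: 150 s is within the 180 s baseline
```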

WHERE
• Where will you do the evaluation?
  ◦ Field studies – the user's own environment
  ◦ Controlled studies – other than the user's environment
• The Setting for the Global Warming Evaluation
• Arranging Usability Evaluation Sessions
• The Arrangements for the Global Warming Evaluation