Some Usability Engineering Methods


Some Usability Engineering Methods (Randolph Bias, Week 7)

First . . . Let’s talk about your projects.

Remember . . . Our approach, within usability engineering of web sites and other user interfaces, is:
- Empirical
- Iterative
- User-Centered Design

The Methods of Usability Engineering . . . Are employed in order to enable you to bring user data (empiricism) to bear on the emerging product design. You (the usability engineer) become an advocate for the user, in the product development process.

One big problem: Cost

Three Methods to address cost
- Remote end-user testing (lab testing)
- Heuristic evaluations
- Usability walkthroughs
Today let's talk about WHY and WHEN we employ one method or another, and HOW to carry them out.

End-user Testing: also called "lab testing." Can be done on a paper-and-pencil design, a prototype, early code, or an existing product.

EUT - Benefits
- Gather performance and satisfaction data
- Performance data: time on task, error rates, # of calls to the help desk, # of references to the documentation, . . .
- Satisfaction data: end-of-test questionnaire
- Can be find-and-fix or benchmarking
- Ensure coverage of certain parts of the UI -- good control
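The performance measures above lend themselves to simple summary statistics. A minimal sketch in Python, with invented session records (the task name and all the numbers are illustrative, not data from an actual test):

```python
# Summarize hypothetical end-user test sessions: time on task,
# error rate, and completion rate. All values are invented.
from statistics import mean

# One record per participant attempt at a task.
sessions = [
    {"task": "find-route", "seconds": 74, "errors": 1, "completed": True},
    {"task": "find-route", "seconds": 121, "errors": 3, "completed": False},
    {"task": "find-route", "seconds": 58, "errors": 0, "completed": True},
]

completed = [s for s in sessions if s["completed"]]
print("mean time on task (completed runs):", mean(s["seconds"] for s in completed))
print("mean errors per attempt:", round(mean(s["errors"] for s in sessions), 2))
print("completion rate:", round(len(completed) / len(sessions), 2))
```

A benchmarking test would compare these numbers against targets set in the test plan; a find-and-fix test cares more about where the errors happened than about the averages.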

EUT - Limitations
- Artificial situation
- A successful test doesn't "prove" the product works
- Need representative users!
- Ease of learning vs. ease of use
- Hard to test longitudinally

EUT -- What to test? Can rarely cover all the UI. I like to test:
- critical tasks
- frequent tasks
- nettlesome tasks

Rubin's 4 Types
- Exploratory test (working on the "skeleton")
- Assessment test (working on the "meat and flesh" of the UI)
- Validation test (does it meet the objectives?)
- Comparison test (compare two competing designs)
Rubin, J. Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests. New York, NY: Wiley, 1994.

Set up
- Create the environment (ambient setting, HW, SW)
- Identify participants
- Establish test roles: test monitor/administrator, data logger, timer, video operator, product experts (SMEs), other observers
- Create the test plan
- Prepare observers
- Prepare test materials

What materials?
- Instructions
- NDA
- Permission to videotape
- Test scenarios
- Questionnaire(s)

Conduct the Test
- Welcome the test participant.
- Communicate that this is not a test of THEM, and that they can leave at any time.
- The scenarios should match the real-world setting.
- Ask the test participants to "think aloud," to better understand their intent.
- Offer a post-test questionnaire, and debrief.

After the Test
- Get quick data to the product team
- Assign severities and build recommendations
- Build an archival report
- Serve as change agents!

Ah, but REMOTE . . .
- Saves tons of travel money.
- Allows you to get otherwise hard-to-get test participants.
- Allows them to be in their own environments.
- Might allow product designers/developers to watch from their own offices.
- But . . . you lose some fidelity of the test environment (video?).
- Some added set-up cost (time).

What is a Heuristic Evaluation? Evaluators systematically inspect the application interface to check for compliance with recognized usability guidelines (heuristics). (Thus, an INSPECTION method.)
- Identifies major and minor usability problems
- Conducted by three to five experienced usability engineers (or one!)
- Each problem is reported along with the heuristic it violates

Problems Identified
The probability of one evaluator finding . . .
- a major usability problem: 42%*
- a minor usability problem: 32%
More evaluators = more problems identified. (*From www.useit.com)
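The "more evaluators = more problems identified" claim is usually modeled with Nielsen and Landauer's curve, found(i) = N(1 - (1 - L)^i), where N is the total number of problems in the interface and L is the probability that a single evaluator finds any given problem. A sketch in Python, using the slide's 42% single-evaluator rate for major problems; the total of 100 problems is an assumption for illustration:

```python
# Nielsen-Landauer model of aggregate problem discovery.
# n_total = 100 is an assumed total for illustration; p_single = 0.42
# echoes the slide's single-evaluator rate for major problems.
def problems_found(n_total, p_single, evaluators):
    """Expected number of distinct problems found by a panel of
    independent evaluators, each finding any given problem with
    probability p_single."""
    return n_total * (1 - (1 - p_single) ** evaluators)

for i in (1, 2, 3, 5):
    print(f"{i} evaluator(s): ~{problems_found(100, 0.42, i):.0f} problems")
```

Diminishing returns set in quickly, which is one reason three to five evaluators is the common recommendation.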

Strengths of an HE
- Done by people experienced in usability, not just "dumb" users
- Can identify both major and minor usability problems
- Can be done relatively quickly and inexpensively
- UNlike EUT, can sometimes cover every corner of a UI or web site

Weaknesses of an HE
- If done at the end of design, designers may be resistant to changes
- Some designers/developers may be unmoved by "just opinions"
- Experienced usability evaluators may miss content problems that actual users would find (can HELP address this issue by using SMEs)

Typical Methodology
- The interface is exercised: a 1st pass to develop the big picture, a 2nd pass to accomplish typical tasks
- Each problem is reported along with the heuristic it violates
- Comments are consolidated
- Severity levels: Critical, Major, Moderate, Minor

Heuristics Used
- Architecture
- Task Design
- Navigation and User Control
- Consistency and Standards
- Information is Meaningful
- Flexibility and Efficiency of Use
- User Support
- Recognition rather than Recall
- Aesthetic and Minimalist Design

Severity Levels Used
- Critical usability issue: loss of user data, system shutdown, abandoned task
- Major usability issue: task completed, but with considerable frustration or extra steps
- Moderate usability issue: moderate workaround or multiple attempts
- Usability suggestion
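The "report each problem with the heuristic it violates, then consolidate" step is easy to mechanize. A minimal sketch, with invented findings (the areas and heuristic assignments are hypothetical, not from a real evaluation):

```python
# Consolidate heuristic-evaluation findings by severity level.
# The severity names come from the slide; the findings are invented.
from collections import Counter

SEVERITY_LEVELS = ["Critical", "Major", "Moderate", "Minor"]

findings = [
    {"area": "Checkout", "heuristic": "Error prevention", "severity": "Critical"},
    {"area": "Navigation", "heuristic": "User control and freedom", "severity": "Major"},
    {"area": "Search", "heuristic": "Recognition rather than recall", "severity": "Minor"},
    {"area": "Search", "heuristic": "Consistency and standards", "severity": "Minor"},
]

counts = Counter(f["severity"] for f in findings)
for level in SEVERITY_LEVELS:
    print(f"{level}: {counts[level]}")
```

Grouping the same records by "area" instead of "severity" yields the kind of per-component summary table shown on the next slide.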

Sample Summary of Results [original table flattened in transcription]: counts of usability issues by product area (Installation, Project Creation, Visual Layout, Code Editor, Run Project, Debug, Main IDE, Icons, Help) and severity (Critical, Major, Minor), with totals of 4 major and 38 minor issues.

Nielsen's Usability Heuristics
- Visibility of system status: The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.
- Match between system and the real world: The system should speak the users' language, with words, phrases, and concepts familiar to the user, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order.
- User control and freedom: Users often choose system functions by mistake and will need a clearly marked "emergency exit" to leave the unwanted state without having to go through an extended dialogue. Support undo and redo.
- Consistency and standards: Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions.
- Error prevention: Even better than good error messages is a careful design which prevents a problem from occurring in the first place.

Nielsen's Heuristics (cont'd.)
- Recognition rather than recall: Make objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.
- Flexibility and efficiency of use: Accelerators -- unseen by the novice user -- may often speed up the interaction for the expert user, such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions.
- Aesthetic and minimalist design: Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.
- Help users recognize, diagnose, and recover from errors: Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.
- Help and documentation: Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too large.

How to . . . http://www.useit.com/papers/heuristic/heuristic_evaluation.html

Other Heuristics http://www.stcsig.org/usability/topics/articles/he-checklist.html

Welcome . . . . . . to today’s Usability Walkthrough. You are here to evaluate the Zara Tours web site. This is a test of the site, not of you.

Usability Walkthroughs -- Purpose: to collect user data -- from multiple users at one time -- to help drive the design of a user interface.

Context This is just one step in a series of “User-Centered Design” methods we’re employing: Heuristic evaluation (professional judgment) Usability walkthrough (YOU ARE HERE) End-user testing in the lab

Key Characteristics
- Three types of people in the room.
- We're going to go through some task flows, as a group.
- You'll have hard-copy packets, where you will write the actions you would take, if you were online, carrying out a certain, prescribed task.
- No discussion until all have written a response.
- We'll announce the "correct" action (according to the current design).
- We'll discuss the page -- representative users first.

Please note . . . Design is HARD. SW development teams budget time to debug the code; we're debugging the design!

This is a test of the DESIGN! Not a test of YOU. If you have trouble finding the “right” answers, then WE have a problem with the UI. The site design team is being very bold, to expose their design to users early like this -- but they’re doing it because they realize the benefit, and they want to get it right!

And so . . . Given the input we hear today, the site design team (assisted by a usability professional) will redesign the interface.

The Flow of the Day We’ll hand you a piece of paper with a scenario description on it. We’ll hand you a packet of screen shots, in order. DO NOT LOOK AHEAD, please. We’ll ask you to write down on the page what action you’d take, to accomplish the task in the scenario.

The Flow (cont’d.) We’ll announce the “right” answer. I’ll ask you to indicate if you got it right. When you did not, I’ll welcome discussion. I’ll welcome the designers and developers to jump in, as discussion winds down. I’ll let SOME redesign go on, real-time. Write more comments under the screen.

Yet more flow . . . Then you’ll be asked to turn the page. Now, you’ll have to “reset,” and assume you got to this new page (somehow). Then, what action would you take on THIS page, while still trying to accomplish the task? After a scenario, we’ll give you a questionnaire to complete.

Benefits
- Lots of data early in the design cycle
- Usability of individual screens, terminology, SOME task flow
- Collaborative redesign on the fly

Limitations
- Can't get some data (e.g., time on task).
- You can't browse as you might online -- a tendency to "lose your place."
- Feel free to turn BACK in your packet, but not ahead.

Any questions? We'll try to finish up by 2:30 or so, and then have some more informal time to discuss the interface. We may get more informal at points. THANK YOU very much for being here. Absolutely NO ONE should be embarrassed -- not the users, not the developers.

OK, first task: Task 1: Find how many days it takes to make the trek up Mt. Kilimanjaro via the Machame Route.

Which, Why, When?

Next Week: Meet at BMC Software (map to follow, online). Turn in your test plan the week after Spring Break.