Usability testing for the WWW
Emma Tonkin, UKOLN

Introduction
UKOLN, the University of Bath
Why this session?

Why do projects fail?
Project impaired factors (% of responses):
1. Incomplete Requirements: 13.1%
2. Lack of User Involvement: 12.4%
3. Lack of Resources: 10.6%
4. Unrealistic Expectations: 9.9%
5. Lack of Executive Support: 9.3%
6. Changing Requirements & Specifications: 8.7%
7. Lack of Planning: 8.1%
8. Didn't Need It Any Longer: 7.5%
9. Lack of IT Management: 6.2%
10. Technology Illiteracy: 4.3%
11. Other: 9.9%

Introducing usability
Definition: the measure of a product's potential to accomplish the goals of a user
– How easy a user interface is to understand and use
– The ability of a system to be used [easily? efficiently? quickly?]
– The people who use the product can accomplish their tasks quickly and easily

Assumptions
There are several dimensions to usability:
– Focus on users
– People use products to be productive
– Users are busy people trying to accomplish tasks quickly
– Users decide when a product is easy to use
(Adapted from Dumas & Redish, A Practical Guide to Usability Testing)

However…
Are users always busy? Does this imply that usability only exists in the workplace?!
Effectiveness vs. efficiency vs. satisfaction
Do users know when a product is ready? Do all users agree about usability?
Is usability actually measurable? Is there a single statistic that equals '% usability'?

Elements of usability
Nielsen refers to five elements or components of usability:
– Learnability
– Efficiency
– Memorability
– Errors
– Satisfaction
(Usability Engineering, 1993, p. 26)
These may not be of equal importance in all cases.

In other words…
Usability depends on context:
– What does the user want to do?
– Who is the user?
Related to:
– Internationalisation; cultural and social issues
– Task analysis: working out what the user wants to do (what the goal is) and how he/she would expect to accomplish it

Science vs craft
Formal approaches:
– Research-driven, 'hard science'
– Laboratory-based
Informal approaches:
– Naturalistic, qualitative observations
– Informal setting

User model vs user testing
Either we apply our understanding of the way users act and test the interface that way, or we simply observe users...


A note about rule-based testing/validation
'Should be' vs 'is' – model vs reality.
Great handwriting does not guarantee a compellingly readable result.

Scenario-based user testing
Based around tasks. Simple scenarios (hypothetical stories/abstract-level test cases):
– For a company web page, locating and using contact details
– Registration and login to a wiki
Process: provide a task and ask the user to complete it. It is important to test the right tasks! (One possible way of writing a scenario down is sketched below.)
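By way of illustration only: a scenario need be little more than a task plus some success criteria. The Python sketch below is our own; the Scenario class and its field names are hypothetical, not part of any testing framework.

```python
# A minimal sketch of recording scenario test cases.
# The Scenario class and its fields are hypothetical, for illustration only.
from dataclasses import dataclass, field

@dataclass
class Scenario:
    title: str                                        # short name for the scenario
    task: str                                         # what the user is asked to do
    success_criteria: list[str] = field(default_factory=list)

scenarios = [
    Scenario(
        title="Contact details",
        task="For a company web page, locate and use the contact details",
        success_criteria=["Contact page found", "Email address identified"],
    ),
    Scenario(
        title="Wiki registration",
        task="Register and log in to the wiki",
        success_criteria=["Account created", "Login succeeds"],
    ),
]
```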

Cognitive walkthrough
Works something like this:
Task: climb the mountain and find the highest peak.

Required for CW
– A description of the interface
– A task scenario
– Assumptions: what knowledge does the user already have?
– Functionality: what actions will accomplish the task with this interface?

Method
Look at each step that is required to accomplish the task:
– Will the user try this step?
– Will the user notice that this action (control, button, switch) is available?
– Will the user associate this action with the effect that they are hoping for?
– If this action is performed, does it appear that progress is being made?
Can you 'tell a success story' for each step? If not, there is a usability problem. (A sketch of a per-step record is given below.)
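For record-keeping, the four questions and the 'success story' check can be captured per step. The sketch below is our own invention, assuming nothing beyond the questions on the slide; the WalkthroughStep class and its fields are hypothetical.

```python
# A minimal sketch of a per-step cognitive walkthrough record.
from dataclasses import dataclass

QUESTIONS = [
    "Will the user try this step?",
    "Will the user notice that this action is available?",
    "Will the user associate this action with the intended effect?",
    "If performed, does it appear that progress is being made?",
]

@dataclass
class WalkthroughStep:
    action: str            # e.g. "Click the 'Register' button"
    answers: list[bool]    # one yes/no per question in QUESTIONS
    notes: str = ""

    def success_story(self) -> bool:
        """A step passes only if every question can be answered 'yes'."""
        return all(self.answers)

step = WalkthroughStep(
    action="Click 'Register'",
    answers=[True, False, True, True],
    notes="Button is below the fold; users may not notice it.",
)
if not step.success_story():
    print(f"Usability problem at step: {step.action} - {step.notes}")
```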

Recording your test
Create a diary format:
– Trying to achieve whatever: looking for something that does whatever
– Found a button marked foo
– But clicking on foo took me to an unrelated-looking screen, blah
Like the mountain-climbing line, you can go back and try another trajectory – document this in a similar way.

Developing appropriate task scenarios
Probably the hardest thing about any usability testing.
On the one hand, you are not required to support very improbable scenarios. On the other hand, developing and supporting probable scenarios is key to a user-centred development process.

Trying out a CW
Who's got a mobile phone? In groups:
– Work out a couple of tasks.
– Working from the perspective of a user with an appropriate level of knowledge (you will have to define what that means!), try the tasks.
– Document the result.

User testing (with real users!)
A popular example is heuristic evaluation. Heuristics are rules of thumb.
Heuristic evaluation requires about six people and a large amount of coffee. Provide them with a list of the ten (or twelve, or...) heuristics, and ask them to examine each page ('screen') for problems, according to the heuristics.

Ten heuristics
1. Visibility of system status: Does the system give timely & appropriate feedback?
2. Match between system and the real world: Is it speaking the user's language?
3. User control and freedom: How hard is it to undo unwanted actions?
4. Consistency and standards: Does it follow conventions and expectations?
5. Error prevention: Are potential errors recognised before becoming a problem?
6. Recognition rather than recall: Does the system rely on the user's memory?
7. Flexibility and efficiency of use: Are there shortcuts for experienced users?
8. Aesthetic & minimalist design: Are dialogs cluttered with information?
9. Help users recognise, diagnose & recover from errors: Are error messages useful?
10. Help and documentation: Is there online help? Is it useful?

Evaluating the results
Again, a diary form can be helpful: 'Screen 1 violates heuristic 10 because...'
Merge these notes:
– List by frequency order to see the most obvious bugs
– List by heuristic to see severity for your purposes
(A sketch of the merging step is given below.)
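As a rough illustration of the merging step, the snippet below (our own; the note format is hypothetical) counts findings by frequency and groups them by heuristic:

```python
# A minimal sketch of merging evaluators' diary notes.
from collections import Counter, defaultdict

# Each note: (screen, heuristic number, description) - a hypothetical format.
notes = [
    ("Screen 1", 10, "No link to help from the error page"),
    ("Screen 1", 10, "Help is missing"),
    ("Screen 2", 4, "Button styles differ from the rest of the site"),
]

# List by frequency: which (screen, heuristic) pairs come up most often?
by_frequency = Counter((screen, h) for screen, h, _ in notes)
for (screen, h), count in by_frequency.most_common():
    print(f"{screen} / heuristic {h}: reported {count} time(s)")

# List by heuristic: group all descriptions under each heuristic.
by_heuristic = defaultdict(list)
for screen, h, description in notes:
    by_heuristic[h].append(f"{screen}: {description}")
```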

Applying the results
Findings tend to fall into categories:
– Bug fixes: a misnamed element, confusing colours
– Feature requests: "It would be much easier if this textbox autocompleted", "...if the system remembered my preferences"
– Major objections: "I don't like [type of application]", "I prefer [totally different type of application]", strange interaction flow
Which of these are the low-hanging fruit?

Testing layouts via greeked text
Wasn't going to talk about this, but it's turned out to be useful.
Early stages of web site design often involve developing layouts/templates. Because no real content exists yet, these may be hard to test using the above methods. However, a layout should communicate something about page function. Does it?

Preparing a template
– Get greeked text from a Lorem Ipsum generator
– Place it into the template. Do not leave a single readable word!
– Make yourself a list of elements that should be visible on the page
– Find/bribe about six test subjects
(A small sketch of the greeking step is given below.)
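For illustration, greeking can even be automated: the function below (our own sketch, not from the slides) replaces every word in draft content with lorem ipsum words while preserving the layout.

```python
# A minimal sketch of 'greeking' template text: every readable word is
# replaced with a lorem ipsum word so only the layout can be judged.
import itertools
import re

LOREM = ("lorem ipsum dolor sit amet consectetur adipiscing elit "
         "sed do eiusmod tempor incididunt ut labore et dolore").split()

def greek(text: str) -> str:
    """Replace each word with the next lorem ipsum word, keeping layout."""
    words = itertools.cycle(LOREM)
    return re.sub(r"[A-Za-z]+", lambda m: next(words), text)

print(greek("Contact us: our office is open Monday to Friday."))
# -> "lorem ipsum: dolor sit amet consectetur adipiscing elit sed."
```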

Example list:
– Main page content
– Page title
– Person responsible for page
– Navigation elements
– Last updated date
– Logo
– How to get back to the front page?
– News items

Testing process
One user at a time, on each 'greeked' layout:
– Ask the user to identify each element or group of elements.
– If they can't find it, invite them to mark where they think it ought to be.
– Asking the user to 'think aloud' can be helpful.
– Also, ask the user to give a mark (out of ten, or from -3 to +3, or whatever...) on 'subjective appeal'.
Note: randomising the order reduces systematic error (see the sketch below).
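One cheap way to randomise the presentation order, sketched below under the assumption that layouts are identified by name (the names here are invented): seed a random generator per participant so each order is reproducible.

```python
# A minimal sketch of randomising the order in which each participant sees
# the candidate layouts, so learning effects do not systematically favour
# whichever layout happens to come last.
import random

layouts = ["layout_a", "layout_b", "layout_c"]  # hypothetical layout names

def presentation_order(participant_id: int) -> list[str]:
    rng = random.Random(participant_id)  # reproducible per participant
    order = layouts[:]
    rng.shuffle(order)
    return order

for participant in range(6):
    print(participant, presentation_order(participant))
```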

Coming up with a preamble
This is a strange thing to ask someone to do. Do not be surprised if you get some funny looks. Come up with a short, reassuring introduction to the test. Useful items to include:
– Introducing the software (purpose, not detail)
– "Your participation will help us..."
– "Remember, we are testing the software, not your performance..."
– "Please think out loud..."
– "This style of test helps us to..."

Examining the results
Build a table: for each layout, record how many users misidentified or failed to find each element. As the layout is improved, the number of misidentified elements should fall. (A minimal tallying sketch is given below.)
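A minimal tallying sketch; the element names and counts below are invented purely for illustration.

```python
# A minimal sketch of tallying misidentified elements per layout iteration.
results = {
    # layout version -> {element: number of users who misidentified it}
    "v1": {"logo": 0, "news items": 4, "last updated": 5},
    "v2": {"logo": 0, "news items": 1, "last updated": 2},
}

for version, misses in results.items():
    total = sum(misses.values())
    print(f"{version}: {total} misidentifications "
          f"(worst: {max(misses, key=misses.get)})")
```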

Creating scenarios
Must be:
– Motivating
– Credible
– Complex
– Provide easy-to-evaluate results
(An Introduction to Scenario Testing, Cem Kaner, Florida Tech, June 2003)
Can they be gleaned from documented requirements?

The test process
A facilitator with detailed knowledge about the site/software is chosen to oversee the test.
– They must take care not to influence the user's behaviour!
The tester (user) is briefed about the site/software. They then go through each scenario:
– Think-aloud method: describing and explaining actions
– Talk-aloud method: describing without explanation (considered more accurate)
The facilitator keeps notes and prompts the user where necessary. Alternatively/additionally, the session can be videoed.