1 Usability evaluation and testing
User interfaces
Jaana Holvikivi, Metropolia

2 Methods for usability evaluation
Wide variety of methods, often ad-hoc testing (not too systematic)
More than one approach may be needed:
– one test cannot find all problems
– one evaluator cannot see all aspects
Consult standards for basic requirements!

3 Methods, categories
Cognitive Modeling: 17
Conceptual Design: 22
Empirical Methods: 34
Inquiry: 16
Needs Analysis: 8
Project Management: 27
Prototyping: 20
Usability Evaluation: 46

4 Usability evaluation methods used in Finland

5 Usability standards
The ISO reference model for web design:
– The process domain. This domain describes the design process used by the organisation, such as the one described in ISO 13407:1999 Human-centred design processes for interactive systems.
– The evaluation domain. This domain contains the tools and techniques used to assess the final design, such as usability testing.
– The design domain. This is the domain within which the designer develops the web site.

6 Usability in ISO standards
ISO 14915:2002 Software ergonomics for multimedia user interfaces (in Tuubi)
ISO 9241 Ergonomics of human-system interaction provides requirements and recommendations that contribute to usability, and the ergonomic principles underlying them.
User interface evaluation: ISO 9241 Part 110, Dialogue principles.

7 ISO dialogue principles
Is the dialogue suitable for the user's task and skill level? (Suitability for the task)
A dialogue is suitable for a task when it supports the user in the effective and efficient completion of the task. In a dialogue which is suitable for the task, the user is enabled to focus on the task itself rather than on the technology chosen to perform that task.
Does the dialogue make it clear what the user should do next? (Self-descriptiveness)
A dialogue is self-descriptive to the extent that, at any time, it is obvious to the users which dialogue they are in, where they are within the dialogue, which actions can be taken and how they can be performed.

8 Dialogue principles 2
Is the dialogue consistent? (Conformity with user expectations)
A dialogue conforms with user expectations if it corresponds to predictable contextual needs of the user and to commonly accepted conventions.
Does the dialogue support learning? (Suitability for learning)
A dialogue is suitable for learning when it supports and guides the user in learning to use the system.
Can the user control the pace and sequence of the interaction? (Controllability)
A dialogue is controllable when the user is able to initiate and control the direction and pace of the interaction until the point at which the goal has been met.

9 Dialogue principles 3
Is the dialogue forgiving? (Error tolerance)
A dialogue is error-tolerant if, despite evident errors in input, the intended result may be achieved with either no or minimal corrective action by the user. Error tolerance is achieved by means of damage control, error correction, or error management to cope with errors that occur.
Can the dialogue be customised to suit the user? (Suitability for individualisation)
A dialogue is capable of individualisation when users can modify the interaction and the presentation of information to suit their individual capabilities and needs.
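One possible way to use the seven dialogue-principle questions above in a review is as a simple checklist that is answered item by item and then reduced to a list of violated principles. This is only a minimal sketch; the function and variable names are assumptions, not part of the slides or of the ISO standard.

```python
DIALOGUE_PRINCIPLES = {
    "Suitability for the task": "Is the dialogue suitable for the user's task and skill level?",
    "Self-descriptiveness": "Does the dialogue make it clear what the user should do next?",
    "Conformity with user expectations": "Is the dialogue consistent?",
    "Suitability for learning": "Does the dialogue support learning?",
    "Controllability": "Can the user control the pace and sequence of the interaction?",
    "Error tolerance": "Is the dialogue forgiving?",
    "Suitability for individualisation": "Can the dialogue be customised to suit the user?",
}

def violated_principles(answers: dict[str, bool]) -> list[str]:
    """Return the principles the reviewer judged to be violated (answered 'no')."""
    return [name for name, ok in answers.items() if not ok]

# Hypothetical review of a web form: everything passes except error tolerance.
answers = {name: name != "Error tolerance" for name in DIALOGUE_PRINCIPLES}
print(violated_principles(answers))  # ['Error tolerance']
```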

10 Methods
End-user testing
Upgraded Nielsen heuristics
Cognitive walkthrough
Situation analysis – use cases

11 End-user testing
Test subjects belong to the target group
Genuine tasks
Videotaping of tests (talk-aloud)
Eye tracking
EEG measurements
Result analysis
– classification of errors according to seriousness
Test report

12 Classes of design errors
1. Local mistake
2. Systematic error
3. Fixing would need redesign
4. Fixing needs use case analysis

13 Strengths and weaknesses
Strengths
– Gives understanding of real use situations
– Finds critical usability problems
– Genuine user views
– Credibility of results
Weaknesses
– Demanding and expensive: planning, implementation and analysis
– Real users needed

14 A usability problem exists if the test person
– proposed an improvement to the system
– got confused
– tried to find a solution more than three times
– took over 3 minutes to reach the goal
– gave up
or if a system failure occurred.
Jacobsen: The evaluator effect in usability tests
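A minimal sketch of how the criteria above could be encoded when analysing recorded test sessions. The record fields and the thresholds (more than three attempts, over three minutes) come directly from the slide; the type and function names are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class TaskObservation:
    """One test participant's attempt at one task (hypothetical record format)."""
    proposed_improvement: bool   # participant suggested a change to the system
    got_confused: bool           # evaluator noted visible confusion
    attempts: int                # how many times the participant tried to find a solution
    minutes_to_goal: float       # time taken to reach the goal
    gave_up: bool                # participant abandoned the task
    system_failure: bool         # the system itself failed during the task

def indicates_usability_problem(obs: TaskObservation) -> bool:
    """Any single criterion from the slide is enough to flag a usability problem."""
    return (
        obs.proposed_improvement
        or obs.got_confused
        or obs.attempts > 3
        or obs.minutes_to_goal > 3
        or obs.gave_up
        or obs.system_failure
    )

# Example: a participant who tried four times and took five minutes.
obs = TaskObservation(False, False, attempts=4, minutes_to_goal=5.0,
                      gave_up=False, system_failure=False)
print(indicates_usability_problem(obs))  # True
```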

15 Nielsen heuristics
1. Visibility of system status
2. Match between system and the real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose, and recover from errors
10. Help and documentation

16 Additions to the Nielsen heuristics
11. Respect for the user and user needs
12. Pleasant product use experience
13. Support for quality standards
14. User privacy protection
Muller et al. (1995): Validating an extension to Participatory…

17 Heuristic assessment 1/2
3–5 evaluators
– experts of the application area and of usability
Independent learning of the application
Discussion of the findings, 1–2 hours

18 Heuristic assessment 2/2
Note each problem separately:
– problem
– use case
– which heuristic category is violated
– fixing proposal
– classification of seriousness (regularity, effects, permanence)
Also the good sides of the design

19 Classification of problem seriousness
1. Not a problem
2. Cosmetic problem
3. Minor usability problem
4. Major usability problem
5. Catastrophic usability problem
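To make the recording format of slides 18 and 19 concrete, here is a small, hypothetical sketch of a single heuristic-evaluation finding: the fields follow slide 18, the heuristic names follow slides 15–16, and the seriousness values follow the 1–5 scale above. All identifiers are illustrative assumptions, not part of the slides.

```python
from dataclasses import dataclass
from enum import IntEnum

HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognize, diagnose, and recover from errors",
    "Help and documentation",
    # Extensions from Muller et al. (1995)
    "Respect for the user and user needs",
    "Pleasant product use experience",
    "Support for quality standards",
    "User privacy protection",
]

class Seriousness(IntEnum):
    NOT_A_PROBLEM = 1
    COSMETIC = 2
    MINOR = 3
    MAJOR = 4
    CATASTROPHIC = 5

@dataclass
class Finding:
    problem: str             # what is wrong
    use_case: str            # where in the interaction it occurs
    heuristic: str           # which heuristic is violated (one of HEURISTICS)
    fix_proposal: str        # suggested correction
    seriousness: Seriousness

example = Finding(
    problem="Error message shows only an internal code",
    use_case="Submitting the registration form with an empty email field",
    heuristic=HEURISTICS[8],  # help users recognize, diagnose, and recover from errors
    fix_proposal="Explain the problem in plain language and point to the field",
    seriousness=Seriousness.MAJOR,
)
print(example)
```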

20 Strengths and weaknesses
Strengths
– Cheap, fast, intuitive and can be applied in many situations
– Good for fixing easy problems
Weaknesses
– Are the fixes really improvements?
– Actual user interaction is not observed
– Hard to find the really fatal problems

21 Cognitive walkthrough
Walkthrough of the interface from the end user's point of view
Questions:
1. Is the user goal right for the user interface?
2. Can the user find the right function?
3. Can the user connect her goal to the function?
4. When the task is completed, does the feedback indicate that the user has proceeded in the right direction?
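One way to keep the four questions in front of the evaluator is to ask them for every action in the walked-through task and note where the answer is "no". A rough sketch; the task and action names are hypothetical.

```python
WALKTHROUGH_QUESTIONS = [
    "Is the user goal right for the user interface?",
    "Can the user find the right function?",
    "Can the user connect her goal to the function?",
    "Does the feedback indicate that the user has proceeded in the right direction?",
]

# Hypothetical action sequence for the task "send an email".
actions = ["Open the compose view", "Enter the recipient", "Press Send"]

# The evaluator answers every question for every action and records failures.
for action in actions:
    for question in WALKTHROUGH_QUESTIONS:
        print(f"{action} -> {question}")
```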

22 Situation: use case
A familiar and genuine workplace
Interviews with users and observations
A well-defined subject
The user is the expert – the researcher is an apprentice

23 Polite applications
People tend to humanize computers (Reeves & Nass 1996)
People attribute intentions and attitudes to computer systems
Computer messages display programmer attitudes!

24

25 Polite applications
A polite application
– is easy to approach
– is ready to fulfil user needs
– gives good information
– is not confused and sticks to the task
– does not bother users with its own problems
– is reliable