SWMLAC Workshop: Accessibility and Usability
Marieke Guy
UKOLN, a centre of expertise in digital information management (www.ukoln.ac.uk)

SWMLAC Workshop: Accessibility and Usability
Marieke Guy, Interoperability Focus

What is Usability?
Definitions:
– the measure of a product's potential to accomplish the goals of the user
– how easy an interface design is to understand and use
– the ability of a system to be used easily or efficiently
– whether the people who use the product can do so quickly and easily to accomplish their own tasks

Assumptions
Usability has multiple dimensions; above all, usability means focusing on users:
– people use products to be productive
– users are busy people trying to accomplish tasks
– users decide when a product is easy to use
See Janice (Ginny) Redish and Joseph Dumas, A Practical Guide to Usability Testing, 1999, p. 4.

Critique
– Are users always busy? Does this definition imply that usability only matters in the workplace?
– What about effectiveness, efficiency and satisfaction?
– Do users always know when a product is easy to use?
– Do all users agree about usability?
– Is usability even measurable? Is it a single characteristic?

Elements of Usability
Nielsen identifies five elements or components of usability:
– learnability
– efficiency
– memorability
– errors
– satisfaction
(Jakob Nielsen, Usability Engineering, 1993, p. 26)
These may not have equal importance in all cases.

Intentions and Goals
Usability depends on context:
– What does the user want to do?
– Who is the user?
– What is the user's perspective on life?
Related to:
– Internationalization: cultural and social factors
– Task analysis: working out what the user wants to do (what the goal is), and how he or she would expect to be able to do it

Goal: Reading your Email
Some favourite subject lines:
– "Meeting time changed" (which one?)
– "New version" (...of?)
– "____________" (the 'blank' mail)
– "Hello" (...hi!)
These all lack clarity.
A related example: 'Mystery Meat Navigation', interfaces which make you do the work...

Goal: Studying Music Therapy
...at Augsburg:
"Hmm... what's that nice bar on the bottom doing? What happens if I click on... argh! It's moving! Umm... hey, there are words underneath. Maybee it's over there... moving too fast! Argh! I'll move the mouse off so I can read it! Oh... the words have disappeared..."
And so forth.

Goal: Renew your Membership
...in the Charleston Metro Chamber:
"Hmm... what's the point of those pictures? Oh... none... let's see, it says menu up there, but it's just a bunch of buttons! Err... well, if I shove the mouse over them, some text appears on the left. And all the pictures change. Ooh... nausea..."
And so forth.

What's up with these Sites?

Assessing Usability
– Questionnaires
– Heuristic analysis
– Cognitive walkthrough
– Measuring the success of a Web site
– User-centred design
– Personas

Questionnaires
Advantages:
– feedback from the user perspective
– largely independent of context
– can be used as a base for comparison
– quick and cost-effective; generates a lot of data
Disadvantages:
– users express their reactions from their own perspective, so subjective topics are difficult
– questionnaires usually don't go into detail
– looks like quantitative data, but provides only a superficial understanding
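The "looks like quantitative data" point above can be made concrete. As a minimal sketch (the questions and responses here are invented, and a 5-point Likert scale is assumed), aggregating questionnaire answers quickly yields comparable numbers per question, even though the numbers say little about *why* users answered as they did:

```python
# Aggregate invented 5-point Likert questionnaire responses per question.
# Lowest-scoring questions surface first as candidates for investigation.
from statistics import mean

responses = {
    "The site was easy to navigate": [4, 5, 3, 4, 2],
    "I found what I was looking for": [2, 3, 2, 4, 3],
    "The terminology matched my own": [5, 4, 4, 5, 4],
}

def summarise(responses):
    """Return (question, mean score) pairs, lowest-scoring first."""
    scored = [(question, mean(scores)) for question, scores in responses.items()]
    return sorted(scored, key=lambda pair: pair[1])

for question, score in summarise(responses):
    print(f"{score:.1f}  {question}")
```

The ranking tells you *where* to look, but not what the underlying problem is; that needs the more detailed methods on the following slides.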

Heuristic Analysis
Nielsen's ten usability heuristics:
– Visibility of system status
– Match between the system and the real world
– User control and freedom
– Consistency and standards
– Error prevention
– Recognition rather than recall
– Flexibility and efficiency of use
– Aesthetic and minimalist design
– Help users recognise, diagnose and recover from errors
– Help and documentation

Heuristic Assessment
To do a heuristic assessment:
– take three to five evaluators and explain these heuristics to them
– ask each evaluator to examine the interface on their own and write down each place where the interface violates a heuristic
– to make this easier, give each evaluator a typical usage scenario to follow
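After the evaluators have worked independently, their individual problem lists are merged into one report. A minimal sketch of that aggregation step follows; the findings, locations and 0–4 severity scale are illustrative assumptions, not data from the workshop:

```python
# Merge independent evaluators' findings into one report, grouped by the
# heuristic violated. All data below is invented for illustration; the
# severity scale (0 = not a problem ... 4 = catastrophe) is an assumption.
from collections import defaultdict

# (evaluator, heuristic violated, location, severity)
findings = [
    ("A", "Visibility of system status", "upload page", 3),
    ("B", "Visibility of system status", "upload page", 4),
    ("B", "Error prevention", "delete button", 4),
    ("C", "Consistency and standards", "menu labels", 2),
]

def group_report(findings):
    """Group findings by heuristic, keeping the worst severity seen."""
    report = defaultdict(lambda: {"worst": 0, "locations": set()})
    for _evaluator, heuristic, location, severity in findings:
        entry = report[heuristic]
        entry["worst"] = max(entry["worst"], severity)
        entry["locations"].add(location)
    return dict(report)

for heuristic, entry in group_report(findings).items():
    print(f"{heuristic}: severity {entry['worst']}, at {sorted(entry['locations'])}")
```

Grouping by heuristic shows where several evaluators independently hit the same problem, which is exactly why a few evaluators together catch far more than any one alone.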

Heuristic Analysis
– doesn't take long
– is simple to do
– finds most of the more obvious interface flaws
– requires only a few (3-5) evaluators, who need not be usability specialists, to find around 90% of problems
– can also be used on mock-up interfaces, such as paper prototypes or screenshots

Cognitive Walkthrough
...is a practical evaluation technique that:
– can be performed at very early stages of prototyping, e.g. on paper
– gives quick and easy evaluation of designs, right from the start
– does not involve users; it helps designers to see the system from the user's perspective
– helps to identify procedural problems in interacting with the system
– sessions can be videotaped for later analysis

Cognitive Walkthrough
Procedure:
– set a goal to be accomplished with the system
– search the interface for the currently available actions
– select the action that seems most likely to make progress towards the goal
– perform the selected action, and evaluate the system's feedback for evidence that progress has been made towards the current goal
The key question: how well does the interface support exploratory learning?
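Each step of the walkthrough can be recorded as the action taken plus the answers to a few standard questions about visibility, fit to the goal, and feedback. The record structure below is a plausible sketch, not a prescribed format; the example step is based on the Charleston Chamber site described earlier:

```python
# One recorded step of a cognitive walkthrough: the action attempted and
# yes/no answers to the walkthrough questions. Structure is illustrative.
from dataclasses import dataclass

@dataclass
class WalkthroughStep:
    goal: str
    action: str
    action_visible: bool        # will users see that the correct action is available?
    action_matches_goal: bool   # will users connect the action with their goal?
    feedback_clear: bool        # will the feedback show progress towards the goal?

    def problems(self):
        """List the procedural problems this step reveals."""
        issues = []
        if not self.action_visible:
            issues.append("correct action not visible")
        if not self.action_matches_goal:
            issues.append("action does not match the user's goal")
        if not self.feedback_clear:
            issues.append("feedback does not show progress")
        return issues

step = WalkthroughStep(
    goal="renew membership",
    action="hover over the unlabelled picture menu",
    action_visible=False,
    action_matches_goal=False,
    feedback_clear=True,
)
print(step.problems())
```

Walking the whole 'correct' action sequence this way produces a problem list per step, which is the output the walkthrough is after.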

Cognitive Walkthrough
Prerequisites:
– information about the users' knowledge and experience
– a description of the expected uses of the system
– a list of the expected 'correct' actions used to achieve each goal

What makes a Good Web site?
It depends on the site! But...
– How many users visit the site?
– How long do users spend on the site?
– How many pages do they visit?
– Do they achieve their aims?

Answers...
Users' movements on your website are recorded in log files. From these we can see:
– how many unique users visit
– where they came from, i.e. who referred them to our site
– how long they spend on the site
– what pages they visit
– when (and perhaps why) they leave; maybe they all get stuck on the same page?
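A minimal sketch of mining such a log follows, assuming the Common Log Format that most web servers can emit (the log lines themselves are invented). Referrer and session-length analysis would additionally need the combined format's referrer field and timestamp comparison, omitted here for brevity:

```python
# Count unique visitors and per-page hits from invented web server log
# lines in Common Log Format: host, identd, user, [timestamp], request...
import re
from collections import Counter

LOG_LINES = """\
10.0.0.1 - - [10/Oct/2004:13:55:36 +0000] "GET /index.html HTTP/1.0" 200 2326
10.0.0.2 - - [10/Oct/2004:13:56:01 +0000] "GET /join.html HTTP/1.0" 200 1204
10.0.0.1 - - [10/Oct/2004:13:57:12 +0000] "GET /join.html HTTP/1.0" 200 1204
""".splitlines()

LINE_RE = re.compile(r'^(\S+) \S+ \S+ \[.*?\] "GET (\S+)')

def summarise_log(lines):
    """Return the set of visiting hosts and a hit count per page."""
    visitors, pages = set(), Counter()
    for line in lines:
        match = LINE_RE.match(line)
        if match:
            host, page = match.groups()
            visitors.add(host)
            pages[page] += 1
    return visitors, pages

visitors, pages = summarise_log(LOG_LINES)
print(len(visitors), "unique visitors")
print("most requested:", pages.most_common(1))
```

In practice a log-analysis package does this at scale, but the principle is the same: each line is one request, and aggregation answers the questions on the slide.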

User-centred Design
– designing for the audience
– an iterative process
– run the results past your users at each stage
– encourage user participation
– test early and often, even before any code exists; use paper prototypes
– add and test details in each cycle

User Requirements
Usability testing reveals which parts of the user interface need to be fixed; requirements analysis reveals the functionality that the program fails to provide. User requirements can be used to inform design, helping the product approach users' actual needs.
– Who are the users? What are their goals, and how can we help?
– Expensive method: task analysis
– Discount method: making use of user personas

User Personas
– designing for real users is hard
– instead, choose several in-depth user personas for whom to design
– based on real people, derived from qualitative research
– not based on a single individual
– helps us model software to likely user needs

Creating User Personas
– identify the target user groups
– research real people from those groups: ethnography, interviews, diaries
– develop personas from the information gathered: organise the research data into descriptive text

Using Personas
– get to know the personas: write down their goals and their intentions
– write the story of how they would achieve their goal: the scenario
– determine the individual tasks or steps involved in this story
– together, these represent a required feature set
– additionally, don't be afraid to run ideas past them!
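The steps above can be sketched as a small data exercise: a persona carries a goal and a scenario broken into tasks, and the union of tasks across personas yields the candidate feature set. The persona, goal and tasks below are invented examples, not from the workshop:

```python
# Derive a candidate feature list from the tasks in personas' scenarios.
# The persona and scenario here are invented for illustration.
from dataclasses import dataclass

@dataclass
class Persona:
    name: str
    goal: str
    scenario_tasks: list

ginny = Persona(
    name="Ginny, part-time librarian",
    goal="renew the library's chamber membership online",
    scenario_tasks=[
        "find the renewal page from the home page",
        "check the current membership status",
        "pay the renewal fee",
        "receive a confirmation receipt",
    ],
)

def required_features(personas):
    """Union of every scenario task across personas = minimum feature set."""
    features = []
    for persona in personas:
        for task in persona.scenario_tasks:
            if task not in features:
                features.append(task)
    return features

print(required_features([ginny]))
```

Running new design ideas "past" a persona then amounts to asking whether each idea helps or hinders one of these concrete tasks.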

Acknowledgements
Thanks to my colleague Emma Tonkin for the content of these slides!

Questions
Any questions?