Got Data, Now What? Analyzing Usability Study Results
Lynn Silipigni Connaway
June 26, 2005
Presented at the ALA 2005 Annual Conference, Chicago, IL
LAMA/MAES Using Measurement Data for Library Planning and Assessment Committee

Usability Testing: Why?
"Probably the best reason to test for usability is to eliminate those interminable arguments about the right way to do something. With human-factors input and testing, however, you can replace opinion with data. Real data tend to make arguments evaporate and meeting schedules shrink." (Fowler, 1998, Appendix, p. 283)

Usability Testing: Definition
Degree to which a user can successfully learn and use a product to achieve a goal
Research methodology
  Evaluation
  Experimental design
Observation and analysis of user behavior while users use a product or product prototype to achieve a goal (Dumas and Redish, 1993, p. 22)
User-centered design process that involves the user from initial design through product upgrade (Norlin and Winters, 2002)
The approach is to be a servant to the users of a system, NOT subservient to the technology (Gluck, 1998)
The goal is to identify usability problems and make recommendations for fixing and improving the design (Rubin, 1994)

Usability Testing: Background
Relatively new methodology (Norlin and Winters, 2002)
Origins in aircraft design
Traced back to marketing and the development of a product
Became popular in the 1980s with widespread access to computers
  Initiation of human-computer interface usability studies
Evolved from ethnographic observation, ergonomics, and cognitive psychology
Yields qualitative and quantitative data

Usability Testing: Purpose
Evaluation tool
  Identify problem areas
  Determine the fit of the design to the intended users (Norlin and Winters, 2002, p. 5)

Usability Testing: Suitable Questions
What is the best layout for a web page?
How can you optimize reading from PDAs and small-screen interfaces?
Which online fonts are the best?
What makes an e-commerce site difficult to use?
Can individual personality or cognitive skills predict Internet use behavior?
How can library collection holdings and library data be represented geographically?

Usability Testing: Principles
Keep the end user in mind
Achieve superiority through simplicity
Improve performance through design
Refine and iterate (Norlin and Winters, 2002, p. 10)

Usability Testing: Web Design Criteria
Links must be consistent and predictable
Group like things on the same page
Be consistent with language
Put the most important information on the first screen
Provide keywords for quick reading/scanning
Do not use animation or sounds
Make links look like links
Distinguish text from graphics
Avoid jargon (Spool, 1999)

Usability Testing: Web Design Criteria
Ten Usability Heuristics (Nielsen)
  Visibility of system status
  Match between system and the real world
  User control and freedom
  Consistency and standards
  Error prevention
  Recognition rather than recall
  Flexibility and efficiency of use
  Aesthetic and minimalist design
  Help users recognize, diagnose, and recover from errors
  Help and documentation

Usability Testing: Web Design Criteria
Goals for user-centered design
  Enable users to
    Achieve their particular goals and meet their needs
    Move quickly and with few errors
  Create a site that users like
    Users are more likely to perform well on a product that provides satisfaction

Usability Testing: Methodology
Artificial environment (laboratory)
  Maintains more control
  May provide more specific data on a particular feature
Natural environment
  Better holistic representation of real people doing real work

Usability Testing: Methodology
Four types of usability tests (Rubin, 1994)
  Exploratory test – early in product development
  Assessment test – most typical; conducted either early or midway in product development
  Validation test – verification of the product's usability
  Comparison test – compares two or more designs; can be used with the other three types of tests

Usability Testing: Methodology
Develop problem statements, objectives, and/or hypotheses
Identify and select participants who represent the target population
  May or may not be randomly selected
Select a test monitor/administrator who is
  Empathetic
  Impartial
  A good communicator
  Able to remember details
  Able to follow the test structure
  Able to react spontaneously to situations that cannot be anticipated
  Willing to allow the user time for the task
  Careful not to rescue the user
  Able to continue with the plan if mistakes occur

Usability Testing: Methodology
Design test materials
  Screening questionnaire
    Provides a user profile
    Ascertains pretest attitudes and background information
    Provides information about participants' previous knowledge and experience
  Orientation script
    Describes the test to participants
    Aids in understanding the participants' performance
  Data logger materials
    Data collection instrument for categorizing participants' actions (a minimal logging sketch follows below)
    Can note the time to match with the videotape recording
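The slides leave the form of the data logger open; the sketch below assumes a plain CSV session log with hypothetical field names (participant_id, task_id, action_code) so that each observed action can later be matched to the videotape timestamp. It is one possible instrument, not the one used in the study.

```python
# Minimal data-logger sketch (hypothetical fields; not part of the original slides).
import csv
import os
from datetime import datetime

LOG_FIELDS = ["participant_id", "task_id", "timestamp", "action_code", "note"]

def log_action(path, participant_id, task_id, action_code, note=""):
    """Append one observed action to the session log (CSV)."""
    new_file = not os.path.exists(path)
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=LOG_FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow({
            "participant_id": participant_id,
            "task_id": task_id,
            # Wall-clock time, so the entry can be matched to the video later.
            "timestamp": datetime.now().isoformat(timespec="seconds"),
            "action_code": action_code,  # e.g. "wrong_link", "help_used", "task_done"
            "note": note,
        })

# Example:
# log_action("session01.csv", "P03", "T2", "wrong_link", "clicked site map instead of search")
```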

Usability Testing: Methodology
Design test materials
  Non-disclosure and tape consent forms for legal protection
  Task list
    List of actions participants will execute
    Desired end results
    Motives for performing the task
    Actual observations the monitor will record
    State of the system

Usability Testing: Methodology
Design test materials
  Posttest questionnaire
    All participants are asked the same questions
    Gathers qualitative information and precise measurements
  Debriefing guide
    Structure and protocols for ending the session
    Participants explain things not apparent in their actions
      Motive
      Rationale
      Points of confusion

Usability Testing: Methodology
Test the materials and equipment
Conduct the test
  Represent the actual work environment
  Users are asked to think aloud
  Observe users while they use or review the product
  Probe with controlled and extensive questioning
  Collect quantitative and qualitative data and measures
  Record comments or questions about the product
  Observe and document users' behaviors

Usability Testing: Methodology
Debrief
Analyze the data
  Diagnose and recommend corrections
    Categorize and identify problems with the product
    Identify solutions
  Qualitative analysis (a small coding sketch follows below)
    Textual notes from the debriefing
    Read responses
    Summarize findings
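As one illustration of the qualitative step, the hypothetical sketch below tallies hand-coded problem categories from debriefing notes. The codes and notes are invented; the slides do not prescribe a coding scheme.

```python
# Tally hand-coded problem categories from debriefing notes (illustrative data only).
from collections import Counter

coded_notes = [
    {"participant": "P01", "codes": ["navigation", "terminology"]},
    {"participant": "P02", "codes": ["navigation"]},
    {"participant": "P03", "codes": ["terminology", "layout"]},
]

code_counts = Counter(code for note in coded_notes for code in note["codes"])

for code, count in code_counts.most_common():
    print(f"{code}: mentioned by {count} participant(s)")
```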

Usability Testing: Methodology
Analyze the data
  Quantitative analysis (see the sketch below)
    Questionnaires
      Screening
      Posttest
  Triangulation to validate findings
    Data from questionnaires, observations, screen-tracking software, comments, and open-ended questions
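A minimal sketch of the quantitative side, assuming an invented per-task record with completion, time on task, and a 1-5 posttest satisfaction rating; the presentation itself does not specify these measures or their format.

```python
# Summarize invented task results: completion rate, mean time on task, mean satisfaction.
from statistics import mean

task_results = [  # one record per participant per task (illustrative values)
    {"participant": "P01", "task": "T1", "completed": True,  "seconds": 95,  "satisfaction": 4},
    {"participant": "P02", "task": "T1", "completed": False, "seconds": 240, "satisfaction": 2},
    {"participant": "P03", "task": "T1", "completed": True,  "seconds": 130, "satisfaction": 5},
]

completion_rate = sum(r["completed"] for r in task_results) / len(task_results)
mean_time = mean(r["seconds"] for r in task_results)
mean_satisfaction = mean(r["satisfaction"] for r in task_results)  # 1-5 posttest scale

print(f"Completion rate: {completion_rate:.0%}")
print(f"Mean time on task: {mean_time:.0f} s")
print(f"Mean satisfaction: {mean_satisfaction:.1f} / 5")
```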

Usability Testing: Interpret Data
Interpret the data
Five factors for benchmarking the usability of an interface (Shneiderman and Plaisant, 2004); two are computed in the sketch below
  Time to learn
  Speed of performance
  Rate of errors
  Retention over time
  Subjective satisfaction
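As a rough illustration only, the sketch below computes two of these measures (rate of errors, and a retention-over-time comparison of speed) from invented data for two sessions with the same users; the presentation does not define how the five factors are operationalized.

```python
# Illustrative benchmark calculations for two sessions with the same users (invented data).
session_1 = {"errors": 7, "tasks": 5, "mean_seconds": 140}   # initial test
session_2 = {"errors": 3, "tasks": 5, "mean_seconds": 90}    # same users, two weeks later

error_rate_1 = session_1["errors"] / session_1["tasks"]
error_rate_2 = session_2["errors"] / session_2["tasks"]

# One way to look at retention over time: how much faster users remain at the second session.
speed_change = (session_1["mean_seconds"] - session_2["mean_seconds"]) / session_1["mean_seconds"]

print(f"Errors per task: {error_rate_1:.1f} -> {error_rate_2:.1f}")
print(f"Speed improvement after two weeks: {speed_change:.0%}")
```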

Usability Testing: Interpret Data
Interpret the data
  Prioritize the severity of problems
  Severity ratings (Zimmerman and Akerelrea, 2004) consider
    Time required to complete the task
    Number of users who encountered the problem
    Negative impact on users' perception of the product
    For example, a task is rated difficult if 70% of users cannot perform it
  Error criticality = Severity + Probability of Occurrence (Rubin, 1994); see the worked sketch below
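A small worked sketch of Rubin's error criticality formula. The 1-4 severity and probability scales and the problem list are illustrative assumptions, not values from the presentation.

```python
# Error criticality = severity + probability of occurrence (Rubin, 1994).
# Ratings below use assumed 1-4 scales; the problems are invented examples.
problems = [
    {"problem": "Search box hidden below the fold", "severity": 3, "probability": 4},
    {"problem": "Date filter resets on back button", "severity": 4, "probability": 2},
    {"problem": "Jargon in navigation labels",       "severity": 2, "probability": 3},
]

for p in problems:
    p["criticality"] = p["severity"] + p["probability"]

# Address the highest-criticality problems first.
for p in sorted(problems, key=lambda p: p["criticality"], reverse=True):
    print(f'{p["criticality"]:>2}  {p["problem"]}')
```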

Usability Testing: Interpret Data
A usable Web site demonstrates (Rubin, 1994)
  Usefulness
    Whether it does what the user needs it to do
  Effectiveness
    Ease of use in achieving the desired task
  Learnability
    Ease of learning the application and of moving from novice to skilled user
  User satisfaction
    Users' attitude about the site and how enjoyable it is to use

Usability Testing: Report Results
Executive summary
Report
  Describe the methodology: who, what, when, where, and how
    Describe how the tests were conducted
    Profile users and describe the sampling
    Detail the data collection methods
  Succinctly explain the analysis
    Provide screen captures
    Include tables and graphs (a small tabulation sketch follows below)
    Provide examples
  Identify strengths and weaknesses
  Recommend improvements
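If the report's tables are produced programmatically, a tiny sketch like the following (with invented task names and figures) is enough for a plain-text summary table; it is only one possible way to tabulate the results.

```python
# Print a simple aligned summary table for the report (invented figures).
rows = [
    ("Find a known book",    "9/10", "1:45", "4.2"),
    ("Limit search by date", "5/10", "3:10", "2.8"),
    ("Export citation",      "7/10", "2:05", "3.6"),
]
header = ("Task", "Completed", "Median time", "Satisfaction (1-5)")

widths = [max(len(str(r[i])) for r in rows + [header]) for i in range(len(header))]
for row in [header] + rows:
    print("  ".join(str(cell).ljust(w) for cell, w in zip(row, widths)))
```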

Usability Testing: Making the Data Work
Read the report
Determine what worked and what did not work
Redesign the product/system based upon the findings
It may be necessary to conduct another usability test

Usability Testing: Limitations
Two major limitations (Wheat)
  Reliability
    Tested users may not be typical users
    Individual variation within the test population
  Validity
    Test tasks, scenarios of the search processes, and the testing environment may not be accurate
    Results are not generalizable to the entire user population
"Testing is always artificial" (Rubin, 1994, p. 27)

OCLC WorldMap™
Research prototype
Tests geographical representation of WorldCat holdings
  By country and date of publication
  For library collection assessment and comparison
  Complements the AAU/ARL Global Resources Network project
Geographically represents library statistical data from UNESCO, ARL, Bowker, and others
  Number of libraries by type
  Expenditures by library type
  Number of volumes and titles
  Number of librarians
  Number of users

Usability Testing: OCLC WorldMap™
Review sample handouts
  Screening questionnaire
  Task list
  Posttest questionnaire
  Executive summary

Usability Testing: OCLC WorldMap™
Conducted informal usability tests
Currently redesigning the interface
Will conduct a second group of formal usability tests
Will make revisions prior to making the prototype publicly available

Questions and Discussion