Toolkit Support for Usability Evaluation
05-830, Spring 2013 – Karin Tsai

Overview
– Motivation
– Definitions
– Background from Literature
– Examples of Modern Tools

Motivation
– To improve or validate usability
– To compare products (A/B tests, etc.)
– To measure progress
– To verify adherence to guidelines or standards
– To discover features of human cognition

Usability Attributes
– Learnability: easy to learn
– Efficiency: efficient to use
– Memorability: easy to remember how to use
– Errors: low error rate; easy to recover
– Satisfaction: pleasant to use and likable
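
In practice each attribute has to be operationalized as something measurable before an evaluation can report on it. Below is a minimal sketch of scoring one hypothetical test session; the task times, error counts, and questionnaire answers are made up, and only the SUS satisfaction-scoring rule is standard (learnability and memorability would need repeated sessions, which this sketch omits).

# Sketch: turning usability attributes into numbers for one test session.
# All input data here is hypothetical; only the SUS scoring rule is standard.
from statistics import mean

task_times_s = [42.1, 38.5, 55.0, 40.2]       # efficiency: time per task (seconds)
errors_per_task = [0, 2, 1, 0]                # errors: slips/mistakes per task
sus_answers = [4, 2, 5, 1, 4, 2, 5, 1, 4, 2]  # satisfaction: 10 SUS items, 1-5 scale

efficiency = mean(task_times_s)               # lower is better
error_rate = sum(errors_per_task) / len(errors_per_task)

# Standard SUS scoring: odd-numbered items contribute (answer - 1),
# even-numbered items contribute (5 - answer); the sum is scaled by 2.5 to 0-100.
sus_score = 2.5 * sum(
    (a - 1) if i % 2 == 0 else (5 - a)        # i is 0-based, so even i = odd item
    for i, a in enumerate(sus_answers)
)

print(f"mean task time: {efficiency:.1f}s, "
      f"errors/task: {error_rate:.2f}, SUS: {sus_score:.1f}/100")

With the numbers above the session scores a SUS of 85, a mean task time of about 44 s, and 0.75 errors per task; a real evaluation would of course aggregate over many participants.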

Evaluation Categories
Predictive
– psychological modeling techniques
– design reviews
Observational
– observations of users interacting with the system
Participative
– questionnaires
– interviews
– "think aloud" user testing

Challenges and Tradeoffs
Quality vs. Quantity
– "quality" here meaning abstraction, interpretability, etc.
– user testing: high quality, low quantity
– counting mouse clicks: low quality, high quantity
Observing Context
Abstraction (see the sketch below)
– event reporting in applications places a burden on developers
– it also complicates software evolution
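
The abstraction point is easiest to see in code: every interaction the team wants to analyze has to be reported explicitly, and those calls must then be kept in sync as the UI changes. A rough, tool-agnostic sketch follows; the log_event helper and the event names are hypothetical, not any particular toolkit's API.

# Sketch of developer-defined event reporting; everything here is hypothetical.
import json, time

def log_event(name: str, **properties) -> None:
    """Append one interaction event to a local log (stand-in for an analytics call)."""
    record = {"event": name, "ts": time.time(), **properties}
    with open("events.log", "a") as f:
        f.write(json.dumps(record) + "\n")

# Every handler the team wants data about must be instrumented by hand...
def on_search_submitted(query: str, results: int) -> None:
    log_event("search_submitted", query_length=len(query), result_count=results)

def on_export_clicked(file_format: str) -> None:
    log_event("export_clicked", file_format=file_format)

# ...and if "export" is later split into separate PDF/CSV buttons, the event
# names and any analysis built on them have to change too.
on_search_submitted("usability toolkits", results=12)
on_export_clicked("pdf")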

CogTool
Evaluation Type: Predictive
Description: Uses a predictive human performance model (a "cognitive crash dummy") to evaluate designs.
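
CogTool's own predictions come from detailed cognitive modeling (it builds on ACT-R), which cannot be reproduced in a few lines, but the general idea of predictive evaluation can be illustrated with the much simpler Keystroke-Level Model. The operator times below are the commonly cited KLM estimates; the task breakdown is a made-up example, not anything CogTool would produce.

# Keystroke-Level Model sketch: predict expert task time from primitive operators.
# Operator times are the commonly cited KLM estimates (Card, Moran & Newell);
# the task sequence is a hypothetical "rename a file" example.
KLM_SECONDS = {
    "K": 0.20,  # keystroke (roughly, for a good typist; slower typists take longer)
    "P": 1.10,  # point at a target with the mouse
    "B": 0.10,  # press or release a mouse button
    "H": 0.40,  # move hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def predict_seconds(operators: str) -> float:
    """Sum the standard operator times for a sequence like 'MPBB...'."""
    return sum(KLM_SECONDS[op] for op in operators)

# Hypothetical task: think, point at a file, double-click it, think again,
# move hands to the keyboard, type an 8-character name, press Enter.
rename_file = "M" + "P" + "BBBB" + "M" + "H" + "K" * 8 + "K"
print(f"predicted expert time: {predict_seconds(rename_file):.2f} s")

Running the sketch yields a single estimate (about 6.4 s here), which is the kind of baseline prediction the pros/cons below refer to, available before any participant ever touches the design.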

CogTool
Pros:
– Free
– Good for getting a baseline evaluation of prototypes
– Instantly accessible (not limited by participant availability or by completion of the system's functionality)
– Neat concept, with insight into human cognition
Cons:
– Limited realism
– Quite confusing at first (very steep learning curve)
– Documentation is "daunting"
– Limited usefulness
Overall Score: 6.5/10

Mixpanel
Evaluation Type: Observational
Description: Aggregates developer-defined event data in useful ways.
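
As a concrete point of reference, reporting one such developer-defined event from a backend with Mixpanel's Python client library looks roughly like the sketch below. The token, user id, event name, and properties are placeholders, and the current Mixpanel documentation should be treated as authoritative over this snippet.

# Rough sketch of server-side event tracking with the official `mixpanel`
# Python package (pip install mixpanel). Token, ids, and event names are
# placeholders; consult Mixpanel's documentation for the current API.
from mixpanel import Mixpanel

mp = Mixpanel("YOUR_PROJECT_TOKEN")   # project token from the Mixpanel dashboard

# One call per application-defined event -- this is the "developer burden"
# from the Challenges slide: every event of interest must be named,
# instrumented, and maintained by the team.
mp.track(
    distinct_id="user-1234",          # stable id for the user or session
    event_name="Report Exported",
    properties={"format": "pdf", "page": "dashboard"},
)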

Mixpanel
Pros:
– Very powerful built-in analysis tools
– Good API for automated scripting
– Scalable
– Flexible enough to fit developers' needs
Cons:
– High learning curve
– Expensive
– Application events mean developer burden and maintainability issues
– Rate-limited (one request at a time)
Overall Score: 9.5/10

Chartbeat
Evaluation Type: Observational
Description: Real-time data visualization.

Chartbeat
Pros:
– Data is real-time
– Captures data that is hard to obtain via events: reading, writing, idling, active time, referrals, social integration, etc. (sketched below)
– Great for site monitoring
– Really impressive visualization
– Easy to use
Cons:
– Does not scale well (financially) with huge sites
– Limited in the data it captures (you have to "hack" it if you want event-like data)
– Only records "page-level" interactions
– Limited historical data access
– Not built for usability evaluation
Overall Score: 7/10
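
The "hard to obtain via events" data in the pros column comes from a heartbeat-style approach: the page periodically pings the service with its current state, and engaged time is reconstructed from those pings rather than from discrete events. A rough, tool-agnostic sketch of that aggregation step follows; the ping format and the 15-second interval are assumptions for illustration, not Chartbeat's actual protocol.

# Tool-agnostic sketch of heartbeat-style engagement measurement.
# The ping format and the 15 s interval are assumptions for illustration,
# not Chartbeat's actual protocol.
PING_INTERVAL_S = 15  # assumed client ping interval

# Hypothetical pings from one page view: state reported every interval.
pings = [
    {"t": 0,  "state": "reading"},
    {"t": 15, "state": "reading"},
    {"t": 30, "state": "idle"},
    {"t": 45, "state": "writing"},
    {"t": 60, "state": "idle"},
    {"t": 75, "state": "reading"},
]

# Each non-idle ping is credited with one interval of engaged time.
engaged_s = sum(PING_INTERVAL_S for p in pings if p["state"] != "idle")
total_s = PING_INTERVAL_S * len(pings)

print(f"engaged {engaged_s}s of {total_s}s "
      f"({100 * engaged_s / total_s:.0f}% active)")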

User Testing
Evaluation Type: Participative
Description: Watch a user complete a task on your system while thinking aloud.

User Testing
Pros:
– Probably the best method for catching usability issues
– Most thorough recording of user interaction with the system
– "Think aloud" yields insights not otherwise attainable from user interactions alone
– Can observe certain demographics without requesting personal information in the system itself
Cons:
– Small sample size (hit or miss)
– Not easily scalable (expensive)
– Limited user availability
– Sometimes, it's painful to watch…
Overall Score: 8.5/10

Questions?