The next two weeks (Oct 21)
- Oct 21 & 23: Lectures on user interface evaluation
- Oct 28: Lecture by Dr. Maurice Masliah (no office hours; out of town)
- Oct 30: Midterm in class (no office hours; out of town)
Midterm material
- Everything up to exactly this point (including DemoCustomDialog)
- Things to study: slides, programs, Javadoc
- No need to memorize all methods of the Swing classes, but familiarity with the most common ones will be tested
Evaluating User Interfaces
Material taken mostly from "Interaction Design" (Preece, Rogers, Sharp 2002)
User Interface Humor
User Interface Evaluation
- Users want systems that are easy to learn and use
- Systems also have to be effective, efficient, safe, and satisfying
- It is important to know:
  - What to evaluate
  - Why it is important
  - When to evaluate
What to evaluate
- All evaluation studies must have specific goals and must attempt to address specific questions
- There is a vast array of features to evaluate:
  - Some are best evaluated in a lab, e.g. the sequence of links needed to find a website
  - Others are better evaluated in natural settings, e.g. whether children enjoy a particular game
Why it is important to evaluate
- Problems are fixed before the product is shipped, not after
- One can concentrate on real problems, not imaginary ones
- Developers code instead of debating
- Time to market is sharply reduced
- The finished product is immediately usable
When to evaluate
- Ideally, as early as possible (from the prototyping stage) and then repeatedly throughout the development process: "Test early and often."
Evaluation Paradigms
- "Quick and dirty" evaluation
- Usability testing
- Field studies
- Predictive evaluation
"Quick and Dirty" Evaluation
- User-centered, highly practical approach
- Used when quick feedback about a design is needed
- Can be conducted in a lab or in the user's natural environment
- Users are expected to behave naturally; evaluators take minimum control
- Sketches, quotes, and descriptive reports are fed back into the design process
Usability Testing
- Applied approach based on experimentation
- Used when a prototype or a product is available
- Takes place in a lab
- Users carry out set tasks; evaluators are strongly in control
- Users' opinions are collected by questionnaire or interview
- Reports of performance measures, errors, etc. are fed back into the design process
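As a concrete illustration of the performance measures mentioned above, here is a minimal sketch of how a test session's data might be recorded. The class and method names (`UsabilityLog`, `record`, `meanTimeMillis`) are hypothetical, not part of any course code; the idea is simply to log one completion time and error count per set task.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: logging per-task performance measures
// (completion time in milliseconds, error count) during a
// usability test session.
public class UsabilityLog {
    public record TaskResult(String task, long millis, int errors) {}

    private final List<TaskResult> results = new ArrayList<>();

    public void record(String task, long millis, int errors) {
        results.add(new TaskResult(task, millis, errors));
    }

    // Mean completion time across all recorded tasks, in milliseconds.
    public double meanTimeMillis() {
        return results.stream().mapToLong(TaskResult::millis).average().orElse(0.0);
    }

    // Total number of errors observed across the whole session.
    public int totalErrors() {
        return results.stream().mapToInt(TaskResult::errors).sum();
    }
}
```

A report of these measures, aggregated over all participants, is what gets fed back into the design process.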
Field Studies
- Often used early in design to check that users' needs are being met, or to assess problems and design opportunities
- Conducted in the user's natural environment
- Evaluators try to develop relationships with users
- Qualitative descriptions that include quotes, sketches, and anecdotes are produced
Predictive Evaluation
- Does not involve users
- Expert evaluators use practical heuristics and practitioner expertise to predict usability problems
- Usually conducted in a lab
- Reviewers provide a list of problems, often with suggested solutions
Evaluation Techniques
- Observing users
- Asking users their opinions
- Asking experts their opinions
- Testing users' performance
- Modeling users' task performance to predict the efficacy of a user interface
The DECIDE Framework
- Determine the overall goals that the evaluation addresses
- Explore the specific questions to be answered
- Choose the evaluation paradigm and techniques
- Identify practical issues
- Decide how to deal with the ethical issues
- Evaluate, interpret, and present the data
Determine the Overall Goals
- What are the high-level goals of the evaluation? Examples:
  - Check that the evaluators have understood the users' needs
  - Ensure that the final interface is consistent
  - Determine how to improve the usability of a user interface
Explore Specific Questions
- Break down the overall goals into relevant questions
- Overall goal: Why do customers prefer paper tickets to e-tickets?
- Specific questions:
  - What is the customer's attitude?
  - Do they have adequate access to computers?
  - Are they concerned about security?
  - Does the electronic system have a bad reputation?
  - Is its user interface poor?
Choose Paradigm and Techniques
- Practical and ethical issues need to be considered as well
- Factors: cost, timeframe, available equipment or expertise
- Compromises may have to be made
Identify Practical Issues
- Important to do this before starting:
  - Find appropriate users
  - Decide on the facilities and equipment to be used
  - Work out schedule and budget constraints
  - Prepare testing conditions
  - Plan how to run the tests
Decide on Ethical Issues
- Studies involving human subjects must uphold an ethical code:
  - The privacy of subjects must be protected
  - Personal records must be kept confidential
  - An exact description of the experiment must be submitted for approval
Evaluate the Data
- Should quantitative data be treated statistically?
- How should qualitative data be analyzed?
- Issues to consider: reliability (consistency), validity, biases, scope, ecological validity
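To make "treating quantitative data statistically" concrete, here is a minimal sketch of the two descriptive statistics most commonly reported for measures such as task completion times: the mean and the sample standard deviation. The class name `EvalStats` is hypothetical, chosen only for this illustration.

```java
// Hypothetical sketch: basic descriptive statistics for quantitative
// usability data, e.g. task completion times in seconds.
public class EvalStats {
    // Arithmetic mean of the observations.
    public static double mean(double[] xs) {
        double sum = 0;
        for (double x : xs) sum += x;
        return sum / xs.length;
    }

    // Sample standard deviation (n - 1 denominator), a measure of
    // how consistent the observations are across participants.
    public static double stdDev(double[] xs) {
        double m = mean(xs);
        double ss = 0;
        for (double x : xs) ss += (x - m) * (x - m);
        return Math.sqrt(ss / (xs.length - 1));
    }
}
```

A large standard deviation relative to the mean is itself a finding: it suggests the interface works well for some users and poorly for others.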
We'll take a closer look at…
- Two predictive evaluation techniques:
  - Heuristic evaluation
  - Cognitive walkthroughs
- A usability testing technique:
  - User testing
Heuristic Evaluation
- A technique in which experts, guided by a set of usability principles known as heuristics, evaluate whether user interface elements conform to those principles
- Developed by Jakob Nielsen
- Heuristics bear a close resemblance to design principles and guidelines
- Interesting article on heuristic evaluation: http://www.useit.com/papers/heuristic/heuristic_evaluation.html
List of Heuristics
- Visibility of system status
- Match between system and the real world
- User control and freedom
- Consistency and standards
- Help users recognize, diagnose, and recover from errors
List of Heuristics (cont.)
- Error prevention
- Recognition rather than recall
- Flexibility and efficiency of use
- Aesthetic and minimalist design
- Help and documentation
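Since the output of a heuristic evaluation is a list of problems (often with suggested solutions), one way to organize the findings is to tag each problem with the violated heuristic and a severity rating; Nielsen's scale runs from 0 (not a usability problem) to 4 (usability catastrophe). The sketch below uses hypothetical names (`HeuristicReport`, `atLeast`) purely for illustration.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: collecting problems found during a heuristic
// evaluation. Each problem records the violated heuristic, a short
// description, and a severity rating (0 = not a problem .. 4 =
// usability catastrophe).
public class HeuristicReport {
    public record Problem(String heuristic, String description, int severity) {}

    private final List<Problem> problems = new ArrayList<>();

    public void add(String heuristic, String description, int severity) {
        problems.add(new Problem(heuristic, description, severity));
    }

    // Problems at or above the given severity, useful for deciding
    // which fixes to prioritize before release.
    public List<Problem> atLeast(int severity) {
        return problems.stream().filter(p -> p.severity() >= severity).toList();
    }
}
```

Sorting the report by severity gives developers a ranked to-do list rather than a debate about which problems are "real".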