1 CS 430: Information Retrieval
Lecture 13: Usability 1
Guest Lecture: Gilly Leshed
2 Course Administration
Midterm Examination
When: Wednesday, October 13, 7:30 to 9:00 p.m.
Where: Upson B17
About: All material covered before the midterm break
For a sample paper and further information see the Examinations page on the web site. Note the instructions about laptop computers.
3 Course Administration
Assignment 2
To help with understanding how to calculate the similarity between a query and a document, Lecture 11, Slide 24 has been expanded to three slides. The paper describes how to calculate the inner product between two vectors. Lecture 11, Slides 24-26 now describe how to scale the inner product to calculate the cosine of the angle between two vectors.
With the test data, the documents are all similar in length, so a ranking based on inner product will be close to a ranking based on cosine.
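To make the calculation concrete, here is a minimal sketch (an addition, not part of the original slides) of the inner product of a query vector and a document vector, and of scaling that inner product to the cosine of the angle between them; the term weights are invented for illustration.

```java
public class CosineSimilarity {

    // Inner product of two term-weight vectors over the same vocabulary.
    static double innerProduct(double[] q, double[] d) {
        double sum = 0.0;
        for (int i = 0; i < q.length; i++) {
            sum += q[i] * d[i];
        }
        return sum;
    }

    // Cosine: inner product scaled by the Euclidean lengths of both vectors.
    static double cosine(double[] q, double[] d) {
        double norm = Math.sqrt(innerProduct(q, q)) * Math.sqrt(innerProduct(d, d));
        return norm == 0.0 ? 0.0 : innerProduct(q, d) / norm;
    }

    public static void main(String[] args) {
        // Hypothetical term weights for a query and a document.
        double[] query = {1.0, 0.0, 1.0};
        double[] doc   = {0.5, 0.8, 0.3};
        System.out.println("inner product = " + innerProduct(query, doc));
        System.out.println("cosine        = " + cosine(query, doc));
    }
}
```

When documents differ little in length, the two rankings are close, which is the point made above about the test data.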
4 Lecture Outline
N-Tier Model
User-Centered Approach
Requirements
Design
Implementation
Evaluation
Summary
5 N-Tier Model
User Interface – e.g. a web page (the focus of this lecture)
Business Logic – e.g. a search engine
Data – e.g. index file, data files
6 User-Centered Approach
Requirements
–Who are the users?
–What do they want?
–What do they need?
Designing the user interface
Implementing the user interface
Evaluating the user interface
7 Requirements: Who are the Users?
Understanding the users via ethnographic research
Descriptions of users:
–Demographic characteristics
–Computer usage background
–Job description and work environment
–Disabilities: color blindness, language issues, typing issues
Personas – archetypes of users, describing behavior patterns, goals, skills, attitudes, and environment (Cooper, 2003)
Ethnographic research: the study of people in their natural settings; a descriptive account of social life and culture, based on qualitative methods (e.g. detailed observations, unstructured interviews, analysis of documents)
8 Requirements: What do Users Need?
Defining user interface requirements based on task analysis:
–Task definition
–Context definition
Several ways to accomplish task analysis:
–Ethnographic research
–Scenario-based analysis
–Discussion with users and subject-matter experts (SMEs)
9 Requirements: What do Users Want?
When asked, users often:
–Provide their attitudes, not their needs
–Bend the truth to be closer to what they think you want to hear
–Rationalize their behavior ("I would have seen the button if it had been bigger")
Instead of asking users what they want:
–Watch what they actually do
–Do not believe what they say they do
–Definitely don't believe what they predict they may do in the future
10 Non-functional Requirements
Performance, reliability, scalability, security…
Example: response time (Nielsen, 1994)
–0.1 sec – the user feels that the system is reacting instantaneously
–1 sec – the user will notice the delay, but his/her flow of thought stays uninterrupted
–10 sec – the limit for keeping the user's attention focused on the dialogue
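As an illustration of how these limits might be applied in practice (an addition, not from the original slides), the sketch below times an operation and reports which of Nielsen's three limits the measured response time falls under; the surrounding search call is hypothetical.

```java
public class ResponseTime {

    // Classify a measured response time against Nielsen's (1994) limits.
    static String classify(double seconds) {
        if (seconds <= 0.1) {
            return "feels instantaneous";
        } else if (seconds <= 1.0) {
            return "delay noticed, but flow of thought stays uninterrupted";
        } else if (seconds <= 10.0) {
            return "attention still on the dialogue; show feedback";
        } else {
            return "attention lost; show progress and let the user do other work";
        }
    }

    public static void main(String[] args) {
        long start = System.nanoTime();
        // ... run the search or page load being measured here ...
        double elapsed = (System.nanoTime() - start) / 1e9;   // seconds
        System.out.println(classify(elapsed));
    }
}
```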
11 Design: Mental Models
Also called a conceptual model
What a person thinks is true about a system, not necessarily what is actually true
Similar in structure to the system that is represented
Simpler than the represented system: a mental model includes only enough information to allow accurate predictions (i.e. no data structures)
Allows a person to predict the results of his or her actions
12 Design: Mental Models – Example 1
System model of an article: the article's meta-data is available in one database and its data is available in a separate database.
User model of an article: the article's body and meta-data are conceived as a whole.
13 Design: Mental Models – Example 2
System model of a search engine: the search action does not involve accessing the documents' source.
User model of a search engine: the search engine retrieves return hits directly from their source.
14 Design: User Interface Design Guidelines
Consistency
–Appearance, controls, and function
–Both within the system and with similar systems
Feedback
User in control
Recognition rather than recall
Easy reversal of actions
–Error handling
Consider different levels of expertise:
–Novice, intermediate, and expert users
15 Designing the Search Page: Making Decisions
Overall organization:
–Spacious or cramped
–Division of functionality across different pages
–Positioning components in the interface
–Emphasizing parts of the interface
Query insertion: insert a text string or fill in text boxes
Interactivity of search results:
–Retrieve the information from the results
–Narrow the search
Performance requirements
16 Google: spacious organization [screenshot]
17 Yahoo!: cramped organization [screenshot]
18 AltaVista: division of functionality across different pages [screenshot]
19 ACM Digital Library: emphasized components [screenshot]
20 ACM Digital Library advanced search: different ways to insert a query [screenshot]
21 Design: Return Hits
A snippet is a short record that a search system returns to describe and link to a hit.
Example: web search for "Nielsen evaluation heuristics"
Heuristic Evaluation... Jakob Nielsen's Online Writings on Heuristic Evaluation. How to conduct a heuristic evaluation; A list of ten recommended heuristics for usable interface design... www.useit.com/papers/heuristic/ - 5k - Cached - Similar pages
22 Designing the Return Hits: Making Decisions
Dynamic (generated from query + document) or precomputed (generated from the document only)
Content only, or with related information
Highlighting of search terms
Length vs. number on page
The user must understand why the hit was returned
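One way to realize the "dynamic" option above is sketched here (an illustrative addition, not the method of any particular engine): take a window of document text around the first occurrence of a query term and highlight the query terms inside it. The window size and the <b> highlighting markup are assumptions made for the example.

```java
import java.util.List;

public class DynamicSnippet {

    // Build a snippet centered on the first query term found in the document,
    // with query-term occurrences wrapped in <b>...</b> for highlighting.
    static String snippet(String document, List<String> queryTerms, int windowSize) {
        String lower = document.toLowerCase();
        int center = -1;
        for (String term : queryTerms) {
            int pos = lower.indexOf(term.toLowerCase());
            if (pos >= 0 && (center < 0 || pos < center)) {
                center = pos;
            }
        }
        if (center < 0) {
            center = 0;   // no query term found: fall back to the document start
        }
        int start = Math.max(0, center - windowSize / 2);
        int end = Math.min(document.length(), start + windowSize);
        String excerpt = document.substring(start, end);
        for (String term : queryTerms) {
            excerpt = excerpt.replaceAll(
                "(?i)" + java.util.regex.Pattern.quote(term), "<b>$0</b>");
        }
        return (start > 0 ? "... " : "") + excerpt
             + (end < document.length() ? " ..." : "");
    }

    public static void main(String[] args) {
        String doc = "Jakob Nielsen's online writings on heuristic evaluation describe "
                   + "how to conduct a heuristic evaluation of a user interface.";
        System.out.println(snippet(doc, List.of("heuristic", "evaluation"), 80));
    }
}
```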
23 Dynamic return hits: dynamic snippets [screenshot]
24 Precomputed return hits: precomputed snippets [screenshot]
25 Implementation
Use HTML, Java, Visual Basic, WinForms…
Make sure the user interface is loosely coupled with the search engine underneath.
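As a minimal sketch of such loose coupling (not from the original slides; all names are hypothetical), the user-interface code below depends only on a small search interface, so the engine behind it can be replaced without changing the interface tier.

```java
import java.util.List;

// Hypothetical contract between the user-interface tier and the business-logic tier.
interface SearchService {
    List<String> search(String query, int maxHits);
}

// A user-interface class that depends only on the SearchService interface,
// not on any particular search-engine implementation.
class SearchPage {
    private final SearchService engine;

    SearchPage(SearchService engine) {
        this.engine = engine;
    }

    void showResults(String query) {
        for (String hit : engine.search(query, 10)) {
            System.out.println(hit);   // render one return hit (snippet) for the user
        }
    }
}
```

Any implementation of SearchService – an index-backed engine, or a stub returning canned hits for usability sessions – can be passed to SearchPage without touching the interface code.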
26 Evaluation
The process of determining the worth of, or assigning a value to, the user interface on the basis of careful examination and judgment.
Make sure your system is usable before launching it!
Two categories of evaluation methods:
–Empirical evaluation: with users
–Analytical evaluation: without users
27 Evaluation: But what is usability?
From ISO 9241-11, usability comprises the following aspects:
Effectiveness – the accuracy and completeness with which users achieve certain goals
–Measures: quality of solution, error rates
Efficiency – the relation between effectiveness and the resources expended in achieving it
–Measures: task completion time, learning time, number of clicks
Satisfaction – the users' comfort with and positive attitudes towards the use of the system
–Measures: attitude rating scales
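To show how these measures might be summarized after a round of testing (an illustrative addition; the session records and field names are invented), the sketch below computes a completion rate, a mean task time, and a mean satisfaction rating.

```java
import java.util.List;

public class UsabilityMeasures {

    // One participant's result for a single task (hypothetical record).
    record Session(boolean completed, double taskSeconds, int satisfaction1to5) {}

    public static void main(String[] args) {
        List<Session> sessions = List.of(
            new Session(true, 95.0, 4),
            new Session(false, 180.0, 2),
            new Session(true, 120.0, 5));

        // Effectiveness: proportion of tasks completed successfully.
        double completionRate = sessions.stream().filter(Session::completed).count()
                                / (double) sessions.size();
        // Efficiency: mean task completion time.
        double meanTime = sessions.stream().mapToDouble(Session::taskSeconds).average().orElse(0);
        // Satisfaction: mean attitude rating.
        double meanRating = sessions.stream().mapToInt(Session::satisfaction1to5).average().orElse(0);

        System.out.printf("completion rate %.2f, mean time %.1f s, mean rating %.1f%n",
                          completionRate, meanTime, meanRating);
    }
}
```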
28 Evaluation with Users
Testing the system, not the users!
Stages of evaluation with users:
–Preparation
–Conducting the sessions
–Analysis of results
29 Evaluation with Users: Preparation
Determine the goals of the usability testing
–"The user can find the required information in no more than 2 minutes"
Write the user tasks
–"Answer the question: how hot is the sun?"
Get participants
–Use the descriptions of users from the requirements phase to identify potential users
30 Evaluation with Users: Conducting the Sessions
Conduct the session
–Usability lab
–Simulated working environment
Observe the user
–Human observer(s)
–Video camera
–Audio recording
Collect satisfaction data
31 Evaluation with Users: Results Analysis
If possible, use statistical summaries
Pay close attention to areas where users
–were frustrated
–took a long time
–couldn't complete tasks
Respect the data and users' responses; don't make excuses for designs that failed
Note designs that worked and make sure they're incorporated in the final product
32 Evaluation without Users
Assessing systems using established theories and methods. Evaluation techniques:
Heuristic Evaluation (Nielsen, 1994)
–Evaluate the design using "rules of thumb"
Cognitive Walkthrough (Wharton et al., 1994)
–A formalized way of imagining people's thoughts and actions when they use the interface for the first time
Claims Analysis – based on scenario-based analysis
–Generating positive and negative claims about the effects of features on the user
33 Summary
N-Tier model: User Interface – Business Logic – Data
User-centered approach: Requirements – Design – Implementation – Evaluation