1
DL Interfaces & Users LIS 5472 Digital Libraries Week 12
Instructor: Dr. Sanghee Oh College of Communication & Information, Florida State University
2
Featured DL Presentations
Jonathan Klepper, Matthew Hunter, Robin Martin
3
DL Project Questions?
4
Peer-Group Evaluation
The level of EFFORT by this group member [e.g., attended all group meetings, was on time and prepared for group meetings, promptly responded to group emails, group discussion boards, etc.]
How much INPUT did this group member contribute to: organizing and/or scheduling group work? Completing assigned tasks on schedule? Creating effective communication?
The QUALITY of that effort [e.g., quality of research, quality of products, etc.]
How would you rate his or her level of COOPERATION? [flexibility with scheduling, etc.]
What is the level of this group member's POSITIVE IMPACT on this project? [leadership, patience, helpfulness, consideration, compassion, etc.]
5
DL Architecture Principles
6
Interaction Design in Digital Libraries
Visualizing what appears on the screen of a digital library
Identifying how users manipulate, search, browse, and use objects in a digital library
Enhancing effective interaction among the components of digital libraries
7
Key Principles for User-Centered Systems Design (Source)
8
Evaluating the Design: Jakob Nielsen's five usability attributes
Learnability (easy to learn)
Efficiency (efficient to use)
Memorability (easy to remember)
Errors (a low error rate)
Satisfaction (pleasant to use)
Kling and Elliott's usability for digital libraries (Kling & Elliott, 1994)
User acceptance of digital libraries (Thong, Hong, & Tam, 2002)
9
Usability Evaluation by Analysts
Heuristic Evaluation (Blandford et al., 2004)
Quick, cheap, easy, and the most popular evaluation method
Working through every page or screen of a system
3-5 analysts are recommended to examine the system (Nielsen, 1994)
Revising prototypes based on problems identified and prioritized during the evaluation (see the sketch below for one way to pool and prioritize findings)
(For more detailed information about how to conduct a heuristic evaluation, see useit.com)
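A minimal sketch, in Python, of how several analysts' findings might be pooled and prioritized before revising a prototype. The analysts, problems, and severity ratings are invented for illustration; the 0-4 severity scale follows Nielsen's common convention and is an assumption, not part of the lecture.

from collections import defaultdict

# Hypothetical findings: (analyst, heuristic, problem, severity 0-4).
# All values below are invented examples.
findings = [
    ("analyst_1", "Visibility of system status", "No breadcrumb on result pages", 3),
    ("analyst_2", "Visibility of system status", "No breadcrumb on result pages", 2),
    ("analyst_2", "Error prevention", "Empty search returns a blank page", 4),
    ("analyst_3", "Help and documentation", "No contact link in the footer", 1),
]

# Group duplicate problems reported by different analysts and average
# their severity ratings, then list the worst problems first so the
# prototype revisions can be prioritized.
grouped = defaultdict(list)
for _, heuristic, problem, severity in findings:
    grouped[(heuristic, problem)].append(severity)

prioritized = sorted(
    ((sum(ratings) / len(ratings), heuristic, problem)
     for (heuristic, problem), ratings in grouped.items()),
    reverse=True,
)

for avg, heuristic, problem in prioritized:
    print(f"{avg:.1f}  [{heuristic}] {problem}")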
10
10 Usability Heuristics by Jakob Nielsen
1. Visibility of system status
2. Match between system and the real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose, and recover from errors
10. Help and documentation
11
Usability Test Proposal Development
A heuristic evaluation can be conducted with a list of questions that you develop.
Ten recommended heuristics by Jakob Nielsen
Example set of questions
Develop at least 2-3 core questions per heuristic and describe a brief rationale for the questions (one way to organize the question bank is sketched below).
For the usability test report, each team member needs to conduct the heuristic evaluation with the questions in the proposal.
Summarize and briefly analyze the evaluation results, with suggestions for improvement.
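A minimal sketch, in Python, of how the proposal's question bank could be kept in one structured place so every team member evaluates with the same questions. The two heuristics shown reuse shortened versions of the sample questions on the following slides; the variable names and the completeness check are assumptions for illustration.

# Questions keyed by heuristic; only two heuristics are filled in here.
question_bank = {
    "Visibility of system status": [
        "When searching, is it clear where you are in the digital library?",
        "How clearly does the web site describe the contents of the collection?",
    ],
    "Match between system and the real world": [
        "Are the terms used throughout the digital library easy to understand?",
        "Is it clear when icons are links?",
    ],
    # ...the remaining eight heuristics follow the same pattern
}

# Simple check that every heuristic meets the 2-3 core question minimum.
for heuristic, questions in question_bank.items():
    assert len(questions) >= 2, f"Add more questions for: {heuristic}"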
12
Heuristic Evaluation Question Sample
1: Visibility of system status.
13
Heuristic Evaluation Question Sample
1: Visibility of system status
Rationale: To make sure that users understand what information exists in the collection and how to find the information.
Question examples:
When searching, is it clear where you are in the digital library?
How clearly does the web site describe the contents of the collection?
Do the formatting and organization of the web site help you understand where you are in the collection?
14
Heuristic Evaluation Question Sample
2: Match between the system and the real world
15
Heuristic Evaluation Question Sample
2: Match between the system and the real world
Rationale: The digital library should have clear language and follow web site conventions.
Question examples:
Are the terms used throughout the digital library easy to understand?
Is it clear when icons are links?
Is it clear what the search and browse options are?
16
Heuristic Evaluation Question Sample
3: User Control and Freedom
17
Heuristic Evaluation Question Sample
3: User Control and Freedom
Rationale: The digital library web site should make it easy for users to return to the home page and to find pages within the site.
Question examples:
Can users easily move through search results?
The navigation was clearly visible.
Users could easily find the ‘back’ and ‘home’ buttons.
18
Heuristic Evaluation Question Sample
4. Consistency and Standards
19
Heuristic Evaluation Question Sample
4. Consistency and Standards
Rationale: Users should be able to clearly understand the format of the digital library, and metadata terminology should be understandable and consistent throughout.
Question examples:
I felt the format was consistent throughout the digital library.
I felt the cataloging terms made sense and were consistent.
Does the site comply with conventions (such as how links look)?
20
Heuristic Evaluation Question Sample
5. Error prevention
21
Heuristic Evaluation Question Sample
5. Error prevention
Rationale: To prevent users from encountering errors or problems.
Question examples:
Does the web site adequately prevent problems?
When no search results are found, the interface presents the user with different browsing options.
The search engine automatically searches across all metadata fields for results without the user needing to specify a field.
22
Heuristic Evaluation Question Sample
6. Recognition rather than recall
23
Heuristic Evaluation Question Sample
6. Recognition rather than recall
Rationale: Providing users with additional browsing techniques ensures successful searching. Automatically searching all metadata fields and tags makes it easier for users to find what they are looking for. The web site should not require users to remember information from previous pages to perform tasks.
Question examples:
The digital library title is displayed at the top of every page within the collection.
A menu or list is provided on each page highlighting the possible options available to users.
24
Heuristic Evaluation Question Sample
7. Flexibility and Efficiency of Use
25
Heuristic Evaluation Question Sample
7. Flexibility and Efficiency of Use
Rationale: Advanced search options allow users to search specific fields for the results they need. Ease of searching and browsing methods ensures users will visit more than once.
Question examples:
The digital library allows a number of advanced search options.
Users can easily browse the collections.
Images can be enlarged for viewing.
26
Heuristic Evaluation Question Sample
8. Aesthetic and Minimalist Design
27
Heuristic Evaluation Question Sample
8. Aesthetic and Minimalist Design
Rationale: The user interface design is one of the most important aspects of a digital library. Users should be able to view readable text and relevant information so that their research experience is faster and easier.
Question examples:
The digital library has a simple and clean design.
The digital library uses readable fonts in colors that are easy to read for all users.
28
Heuristic Evaluation Question Sample
9. Help Users Recognize, Diagnose, and Recover from Errors
29
Heuristic Evaluation Question Sample
9. Help Users Recognize, Diagnose, and Recover from Errors
Rationale: It is important to keep users informed when no items can be located so that they can take advantage of other search options.
Question examples:
The search results indicate when no results are found.
A message is displayed when a collection contains no items.
30
Heuristic Evaluation Question Sample
10. Help and Documentation
31
Heuristic Evaluation Question Sample
10. Help and Documentation
Rationale: Providing users with contact information allows opportunities for feedback from viewers. This feedback can provide excellent suggestions for increasing viewer numbers. Asking users for feedback is an important way to update our digital library and to repair or fix any errors that are reported.
Question examples:
A contact address is displayed in the footer of each page for users to send feedback to staff.
The digital library provides some search suggestions for users.
32
Heuristic Evaluation Measures
Likert Scales (see the scoring sketch below)
Agreement (5 points): Strongly Agree 5, Agree 4, Neutral 3, Disagree 2, Strongly Disagree 1
Frequency (5 points): Very frequently 5, Occasionally 4, Rarely 3, Very rarely 2, Never 1
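A minimal sketch, in Python, of how team members' Likert ratings could be summarized for the usability test report, using the 5-point agreement codes above. The heuristic names come from Nielsen's list earlier in the deck; the ratings themselves are invented examples.

from statistics import mean

# One 1-5 agreement rating per team member for each heuristic evaluated.
# All numbers below are invented for illustration.
responses = {
    "Consistency and standards": [5, 4, 4],
    "Error prevention": [2, 3, 2],
    "Help and documentation": [4, 4, 5],
}

# Report the lowest-scoring heuristics first, since those point to the
# problems most in need of suggestions for improvement.
for heuristic, scores in sorted(responses.items(), key=lambda kv: mean(kv[1])):
    print(f"{mean(scores):.2f}  {heuristic}")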
33
User Needs
User analysis
Scope/definition of user group
User characteristics
Demographic information
Knowledge and experience
Computer/IT experience or knowledge
Level of experience with the task
Psychological characteristics
34
Developing User Personas
Categorizing user types (e.g., Primary, Secondary, and Others)
Annotating user characteristics according to the categories
Interviewing real users
Specifying User Tasks
Understanding typical DL tasks
35
Group Activity
36
Developing User Personas
Identify who the primary users of your DL are (and secondary users, if appropriate).
Develop one persona per user type (e.g., a novice user and an advanced user).
A persona may include (see the sketch after this list):
Name, age
Personal information, including family and home life
Computer proficiency and comfort level with using the Web
Pet peeves and technical frustrations
Personal and professional goals
Information-seeking habits and favorite resources
Motivation or “trigger” for using the digital library
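A minimal sketch, in Python, of one way a team might record a persona using the fields listed above. The Persona class and every example value are invented for illustration; they are not part of the assignment.

from dataclasses import dataclass

@dataclass
class Persona:
    # Fields mirror the checklist on the slide above.
    name: str
    age: int
    personal_background: str      # family and home life
    computer_proficiency: str     # comfort level with using the Web
    pet_peeves: str               # technical frustrations
    goals: str                    # personal and professional goals
    seeking_habits: str           # information-seeking habits and favorite resources
    trigger: str                  # motivation for using the digital library

# An invented novice-user persona.
novice = Persona(
    name="Alex Rivera",
    age=58,
    personal_background="Retired teacher; researches family history from home",
    computer_proficiency="Comfortable with email and basic web searches",
    pet_peeves="Small fonts, cluttered pages, jargon-heavy menus",
    goals="Trace local ancestors through digitized newspapers",
    seeking_habits="Starts from Google; bookmarks library research guides",
    trigger="A relative mentioned scanned county records in the collection",
)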
37
Persona Development References
Cooper, A. (1999). The Inmates Are Running the Asylum: Why High-Tech Products Drive Us Crazy and How to Restore the Sanity. Indianapolis: Sams.
Head, A. J. (2003). Personas: Setting the stage for building usable information sites. Online, 27(4).