Usability Evaluation of Digital Libraries. Stacey Greenaway. Submitted to University of Wolverhampton module, Dec 15th 2006.

Presentation transcript:

Usability Evaluation of Digital Libraries. Stacey Greenaway. Submitted to University of Wolverhampton module, Dec 15th 2006.

Digital Libraries – General Research Areas
- Web based
- Metadata
- Automated obtaining of information
- Preservation
- Quality of Service
- Intellectual property rights
- HCI, Usability, Accessibility

Digital Libraries – HCI, Usability, Accessibility
- Accessible regardless of disability, language or cultural differences
- Keyword searching
- Ability to browse topics
- Intuitive interface
- Optimised content
- Quick information retrieval
- Good indexing (metadata)

Usability Evaluation – Usability Problems
- No standards for digital library usability evaluation
- Need to highlight common usability problems
- How to model user behaviour?
- Users' information needs
- Computer scientist vs. information scientist perspectives
- Comparing multiple digital libraries

Usability Evaluation – Standard Techniques
General Research Question: Which techniques are most appropriate for evaluating the usability of digital libraries?
- Usability inspection
- Heuristic evaluation
- Cognitive walkthrough
- Claims analysis
None of these techniques was found by Blandford et al. (2004) or Hartson et al. (2004) to be especially successful at highlighting problems.

Usability Evaluation – Attribute by Attribute
Specific Research Question: Will evaluating digital libraries attribute by attribute allow a comparison of usability problems across multiple digital libraries?
Three conceptual frameworks have been proposed for evaluating digital libraries attribute by attribute:
- CASSM (Concept-based Analysis of Surface and Structural Misfits), Blandford et al. (2004)
- Saracevic and Covi framework (2000)
- Fuhr framework (2001)

Usability Evaluation – Attribute by Attribute
Basic Method:
- The system is split into dimensions:
  - CASSM: 2 dimensions – System and User
  - Saracevic and Covi, Fuhr: 3 dimensions – System, User and Content
- Attributes are elicited and assigned to attribute groups within each dimension.
- Each attribute is evaluated.
- Attributes can be analysed and compared with the results of other evaluations, as sketched below.
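
To make the basic method concrete, here is a minimal Python sketch of the dimension → attribute group → attribute structure, and of comparing one attribute across several evaluated libraries. All class names, field names and example values are hypothetical illustrations, not taken from any of the cited frameworks.

```python
from dataclasses import dataclass, field

@dataclass
class Attribute:
    """One evaluable property elicited during an evaluation."""
    name: str        # e.g. "keyword search"
    group: str       # attribute group within a dimension
    dimension: str   # e.g. "System", "User" or "Content"
    rating: int = 0  # score assigned by the evaluator
    notes: str = ""

@dataclass
class Evaluation:
    """An attribute-by-attribute evaluation of a single digital library."""
    library: str
    attributes: list[Attribute] = field(default_factory=list)

def compare(evals: list[Evaluation], attribute: str) -> dict[str, int]:
    """Compare one attribute's rating across several evaluated libraries."""
    return {
        ev.library: attr.rating
        for ev in evals
        for attr in ev.attributes
        if attr.name == attribute
    }

# Toy usage: two hypothetical evaluations compared on one attribute.
a = Evaluation("Library A", [Attribute("keyword search", "retrieval", "System", rating=4)])
b = Evaluation("Library B", [Attribute("keyword search", "retrieval", "System", rating=2)])
print(compare([a, b], "keyword search"))  # {'Library A': 4, 'Library B': 2}
```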

Usability Evaluation – Attribute by Attribute
Does the Tsakonas framework highlight more usability problems than the Sandusky framework when evaluating the usability of multiple digital libraries?
Tsakonas framework – methods (additional to the basic method proposed by Saracevic and Covi, and Fuhr):
- 3 dimensions: System, User and Content
- 3 subsystems: Interface, Information Retrieval and Advanced Functionality
- Evaluate 3 conditions: Performance, Usability and Usefulness of the system
- Define criteria (attributes) and methods (tools to evaluate the attributes)
- Relationships defined between the interaction concepts, the system dimensions and the criteria (see the sketch below)
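
The criterion–method relationships above can be pictured as a small lookup structure. The sketch below is a hypothetical Python encoding; the specific criteria, conditions and method names are invented for illustration and are not taken from Tsakonas et al. (2004).

```python
# Each row links a criterion to an interaction condition, a system
# dimension and the method used to evaluate it (all rows illustrative).
CRITERIA = [
    # (criterion,            condition,     dimension, method)
    ("response time",        "performance", "System",  "transaction log analysis"),
    ("learnability",         "usability",   "User",    "think-aloud session"),
    ("relevance of results", "usefulness",  "Content", "user questionnaire"),
]

def methods_for(condition: str) -> list[str]:
    """Evaluation methods attached to one interaction condition."""
    return [method for _, cond, _, method in CRITERIA if cond == condition]

print(methods_for("usability"))  # ['think-aloud session']
```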

Usability Evaluation – Attribute by Attribute
Does the Tsakonas framework highlight more usability problems than the Sandusky framework when evaluating the usability of multiple digital libraries?
Sandusky framework – methods (additional to the basic method proposed by Saracevic and Covi, and Fuhr):
- 6 dimensions (referred to as attribute groups): Audience, Institution, Access, Content, Services, and Design and Development
- No separation or distinction between system, user and content tasks, so individual attributes can be compared with each other more easily
- Evaluates cause and effect between attributes (see the sketch below)
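
Because Sandusky's attribute space is flat, cause-and-effect links can run directly between any two attributes. A minimal sketch, assuming a simple pair-list representation; the attribute names and links are invented purely for illustration.

```python
# The six attribute groups named by Sandusky (2002); the attributes and
# the cause-effect pairs below are invented examples.
GROUPS = ["Audience", "Institution", "Access", "Content",
          "Services", "Design and Development"]

effects = [
    ("Access: login required", "Audience: casual users excluded"),
    ("Design and Development: shallow topic hierarchy", "Services: browsing slows down"),
]

def causes_of(effect: str) -> list[str]:
    """All recorded causes of one observed effect."""
    return [cause for cause, eff in effects if eff == effect]

print(causes_of("Audience: casual users excluded"))  # ['Access: login required']
```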

Usability Evaluation – Attribute by Attribute
Does the Tsakonas framework highlight more usability problems than the Sandusky framework when evaluating the usability of multiple digital libraries?
Comparison:
- Both frameworks are conceptual.
- Neither framework provides experiments that would allow their efficiency to be compared.
- Both state that further development is required.
- The two frameworks share the goal of evaluating the usability of multiple digital libraries, but in different contexts, for different purposes, and with differing subsets of research goals.
- The results are inconclusive; experiments need to be conducted analysing the effectiveness of both frameworks at highlighting usability problems across a selection of digital libraries.

Usability Evaluation – Attribute by Attribute
Benefits:
- Comparing multiple digital libraries
- Finding common usability problems – and fixing them
- Producing standards for evaluation
- Creating test suites for digital library research
- Comparing cause and effect between attributes in a single digital library
- A tool for classification

Usability Evaluation – Attribute by Attribute
Negatives:
- Difficult to predict user behaviour
- Hard to model user satisfaction
- No benchmarks for digital library research
- Difficult to produce a definitive set of attributes
- No comprehensive checklist is possible, as all libraries differ; some attributes must be elicited afresh at each evaluation
- Requires the system developers' knowledge

Usability Evaluation – Attribute by Attribute
Conclusion:
- Evaluating digital libraries attribute by attribute is a promising alternative to standard usability evaluation tools.
- More research needs to be done to test the strengths of these frameworks.
- More research is needed on creating test suites and benchmark data.
- There is a requirement for standards in digital library development and evaluation.
- Through literature review alone it is impossible to determine whether any conceptual framework is more efficient than another.

Usability Evaluation – Attribute by Attribute
References:
BLANDFORD, A., CONNELL, I., EDWARDS, H. (2004) Analytical usability evaluation for digital libraries: a case study. In Proceedings of the 4th ACM/IEEE-CS Joint Conference on Digital Libraries, Tucson, AZ. New York, NY, USA: ACM Press, pp. 27–36.
FUHR, N., HANSEN, P., MABE, M., MICSIK, A., SØLVBERG, I. (2001) Digital libraries: a generic classification and evaluation scheme. In Proceedings of the 5th European Conference on Research and Advanced Technology for Digital Libraries (LNCS 2163). London, UK: Springer-Verlag.
SANDUSKY, R. J. (2002) Digital library attributes: framing usability research. In Proceedings of the Workshop on Usability of Digital Libraries at the Joint Conference on Digital Libraries, p. 35.
SARACEVIC, T., COVI, L. (2000) Challenges for digital library evaluation. In D. H. Kraft (ed.), Knowledge Innovations: Celebrating Our Heritage, Designing Our Future. Proceedings of the 63rd Annual Meeting of the American Society for Information Science, November 11–16, 2000, Chicago, IL. Washington, D.C.: American Society for Information Science.
TSAKONAS, G., KAPIDAKIS, S., PAPATHEODOROU, C. (2004) Evaluation of user interaction in digital libraries. In Proceedings of the DELOS Workshop on the Evaluation of Digital Libraries, 2004.