Measuring the quality of academic library electronic services and resources Jillian R Griffiths Research Associate CERLIM – Centre for Research in Library.

Similar presentations
Victorian Curriculum and Assessment Authority

Copyright © 2014 ICT Inspires Ltd. All Rights Reserved. ICT (Computing) Subject Leader Course Session 2: Broader.
Mosaic Evaluation Undertaken by CERLIM, Manchester Metropolitan University November 2009 Jenny Craven, Jill Griffiths.
Web Search Results Visualization: Evaluation of Two Semantic Search Engines Kalliopi Kontiza, Antonis Bikakis,
Introduction There are various theoretical concepts and skills that bioscience students need to develop in order to become effective at solving problems.
New technologies and disaster information resources Part 2. The right information, at the right time, the right way.
Service Quality This is the regularity with which a service provider can provide efficient services to the customers. It is imperative for every organization.
Evaluating the Mixed Economy Model in Central Scotland Police Kenneth Scott Director, Centre for Criminal Justice and Police Studies University of the.
Information Literacy and Inquiry-based learning Pamela McKinney Learning Development and Research Associate (Information Literacy) at CILASS CILASS identifies.
© Tefko Saracevic, Rutgers University. Digital libraries and human information behavior. Tefko Saracevic, Ph.D. School of Communication, Information and.
CISB444 - Strategic Information Systems Planning
CAP 252 Lecture Topic: Requirement Analysis Class Exercise: Use Cases.
EPIC Online Publishing Use and Costs Evaluation Program.
Usability 2004 J T Burns Usability & Usability Engineering.
Business research methods: data sources
Lecture 7 Evaluation. Purpose Assessment of the result Against requirements Qualitative Quantitative User trials Etc Assessment of and Reflection on process.
Needs Analysis Instructor: Dr. Mavis Shang
Proposal 13 HUMAN CENTRIC COMPUTING (COMP106) ASSIGNMENT 2.
Hartley, Project Management: Integrating Strategy, Operations and Change, 3e Tilde Publishing Chapter 7 Quality Management Achieving technical performance.
socio-organizational issues and stakeholder requirements
Evaluation of Information Sources and Reference Services Dr. Dania Bilal IS 530 Fall 2006.
Non-functional requirements
Customer Focus Module Preview
Towards Harmonized R & D Quality Dimensions (Indexes) The Kinneret College on the Sea of Galilee By: Raymond Austin under the guidance of Dr. Avigdor Zonnenshain.
Software Development Unit 2 Databases What is a database? A collection of data organised in a manner that allows access, retrieval and use of that data.
Dimensions Of Product Quality (Garvin) 1. Core Performance basic operating characteristics 2. Features “extra” items added to basic features 3. Reliability.
Website evaluation models and acceptability factors K.Vipartienė, E. Valavičius.
Evaluating Educational Technology and Integration Strategies By: Dakota Tucker, Derrick Haney, Demi Ford, and Kent Elmore Chapter 7.
SESSION ONE PERFORMANCE MANAGEMENT & APPRAISALS.
Striving for Quality Using continuous improvement strategies to increase program quality, implementation fidelity and durability Steve Goodman Director.
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.
Classroom Assessment A Practical Guide for Educators by Craig A
SiTEL LMS Focus Group Executive Summary Prepared: January 25, 2012.
Outcome Based Evaluation for Digital Library Projects and Services
An Online Knowledge Base for Sustainable Military Facilities & Infrastructure Dr. Annie R. Pearce, Branch Head Sustainable Facilities & Infrastructure.
Dimensions Of Product Quality (Garvin)
Implementation and process evaluation: developing our approach Ann Lendrum University of Manchester Neil Humphrey University of Manchester Gemma Moss Institute.
Unit 1 Information for management. Introduction: Decision-making is the primary role of the management function. The manager's decision will depend.
COMP106 Assignment 2 Proposal 1. Interface Tasks My new interface design for the University library catalogue will incorporate all of the existing features,
Guide: DR. R. BALASUBRAMANI Assistant Professor Department of LIS Bharathidasan University Tiruchirappalli.
Human Computer Interaction
Subject Guides and Subject Librarianship Ndhlovu Phillip Assistant Librarian NUST Library SubjectsPlus ZULC National Workshop December 13, 2011.
How to invest in Information for Development: An Introduction. This question is the focus of our examination of the information management.
Information commitments, evaluative standards and information searching strategies in web-based learning environments Ying-Tien Wu & Chin-Chung Tsai Institute.
Programme Objectives Analyze the main components of a competency-based qualification system (e.g., Singapore Workforce Skills) Analyze the process and.
ISO DOCUMENTATION. ISO Environmental Management Systems2 Lesson Learning Goals At the end of this lesson you should be able to:  Name.
March E-Learning or E-Teaching? What’s the Difference in Practice? Linda Price and Adrian Kirkwood Programme on Learner Use of Media The Open University.
An Introduction to Formative Assessment as a useful support for teaching and learning.
Quality Function Deployment. Example Needs Hierarchy.
QUALITY DEFINITIONS AND CONCEPTS
IFLA FRBR & MIC Metadata Evaluation Ying Zhang Yuelin Li October 14, 2003.
Creating & Building the Web Site Week 8. Objectives Planning web site development Initiation of the project Analysis for web site development Designing.
Evaluating the User Experience in CAA Environments: What affects User Satisfaction? Gavin Sim Janet C Read Phil Holifield.
Object-Oriented Software Engineering Practical Software Development using UML and Java Chapter 1: Software and Software Engineering.
Week 2: Interviews. Definition and Types  What is an interview? Conversation with a purpose  Types of interviews 1. Unstructured 2. Structured 3. Focus.
Systems Analysis Lecture 5 Requirements Investigation and Analysis 1 BTEC HNC Systems Support Castle College 2007/8.
School of Engineering and Information and Communication Technology KIT305/607 Mobile Application Development Week 7: Usability (think-alouds) Dr. Rainer.
Day 8 Usability testing.
Monitoring and Evaluation Systems for NARS organizations in Papua New Guinea Day 4. Session 10. Evaluation.
Evaluation of an Information System in an Information Seeking Process Lena Blomgren, Helena Vallo and Katriina Byström The Swedish School of Library and.
Software Quality Control and Quality Assurance: Introduction
Monitoring and Evaluation Systems for NARS Organisations in Papua New Guinea Day 3. Session 9. Periodic data collection methods.
Evaluation of Reference Services
Lisa Dawson – Head of Student Systems Operations
Monitoring and Evaluation Systems for NARS Organisations in Papua New Guinea Day 3. Session 7. Managers’ and stakeholders’ information needs.
LEAN PRODUCTION AND QUALITY MANAGEMENT
Presentation transcript:

Measuring the quality of academic library electronic services and resources
Jillian R Griffiths, Research Associate
CERLIM – Centre for Research in Library and Information Management
Department of Information and Communications, Manchester Metropolitan University, UK

Introduction
- Quality Attributes and development
- Quality Attribute measurement in practice - the process of assessment
- Benefits of this approach
- Conclusions

Development of the Quality Attributes
Garvin (1987) identified eight attributes that can be used to evaluate a variety of services. These have been adapted and extended (to ten attributes) by a number of authors to apply to information and library services. The Quality Attributes provide:
- a holistic assessment of the quality of services or resources
- coverage of usability
- a user-centred view of performance effectiveness based on the user's own perception of relevance and satisfaction with both the items retrieved and the resource as a whole

Quality Attributes
GARVIN: Performance, the primary purpose of the product or service and how well it is achieving that primary purpose.
BROPHY and GRIFFITHS: Performance, concerned with establishing confirmation that a library service meets its most basic purpose, such as making key information sources available on demand.

GARVIN: Features, secondary characteristics which add to the service or product without being of its essence.
BROPHY and GRIFFITHS: Features, aspects of the service which appeal to users but are beyond the essential core performance attributes, such as alerting services.

GARVIN: Reliability, the consistency of the product or service's performance in use.
BROPHY and GRIFFITHS: Reliability, which for information services would include availability of the service. Problems such as broken Web links and slowness in speed of response would be measured as part of this attribute.

GARVIN: Conformance, whether or not the product or service meets the agreed standard, which may be internally or externally generated.
BROPHY and GRIFFITHS: Conformance, whether the service meets the agreed standard, including conformance questions around the utilisation of standards and protocols such as XML, RDF, Dublin Core, OAI and Z39.50.

GARVIN: Durability, the amount of use the product or service can provide before it deteriorates to a point where it needs replacement.
BROPHY and GRIFFITHS: Durability, related to the sustainability of the information or library service over a period of time.
BROPHY and GRIFFITHS: Currency of information, that is, how up to date the information provided is when it is retrieved.

GARVIN: Serviceability, how easy it is to repair a product or correct a service when it goes wrong, including the level of inconvenience experienced by the customer.
BROPHY and GRIFFITHS: Serviceability, which could be the level of help available to users during specific actions or otherwise at the point of need; the availability of instructions and prompts throughout an online service; and the usefulness of help.

GARVIN: Aesthetics, the appearance of the product or service.
BROPHY and GRIFFITHS: Aesthetics and Image, both of the physical library and of web-based services based upon it.

GARVIN: Perceived quality, in essence the reputation of the product or service among the population, especially those with whom the potential customer comes into contact.
BROPHY and GRIFFITHS: Perceived Quality, the user's view of the service as a whole and the information retrieved from it. It may be useful to measure perceptions both before and after a service is used.

BROPHY and GRIFFITHS: Usability, which is particularly relevant to electronic services and includes issues of accessibility.
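Some of the Brophy and Griffiths measures lend themselves to automated, expert-side checking alongside user questionnaires. Below is a minimal sketch, not from the original presentation, of how the Reliability measures (dead links, speed of response) might be captured for a web-based service; the URL list is hypothetical.

```python
# A minimal sketch of capturing the Reliability measures: dead links and
# speed of response. The URL list is hypothetical.
import requests

RESOURCE_LINKS = [
    "https://example.org/catalogue",
    "https://example.org/e-journals",
]

def check_reliability(urls, timeout=10):
    """Return (url, status_code, elapsed_seconds); a None status marks a dead link."""
    results = []
    for url in urls:
        try:
            response = requests.get(url, timeout=timeout)
            results.append((url, response.status_code,
                            response.elapsed.total_seconds()))
        except requests.RequestException:
            results.append((url, None, None))  # broken or unreachable link
    return results

if __name__ == "__main__":
    for url, status, elapsed in check_reliability(RESOURCE_LINKS):
        if status is None:
            print(f"DEAD LINK: {url}")
        else:
            print(f"{url}: HTTP {status} in {elapsed:.2f}s")
```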

Quality Attribute measurement in practice - the process of assessment
Deciding if a single resource or several resources are the focus will impact on:
- why you are assessing
- how you assess
- who will assess: end users (public, students, academic staff) or expert users (colleagues, usability/accessibility experts, you!)
- how you handle the resultant data

Quality Attribute measurement in practice - the process of assessment
Design of tasks/test searches:
- if assessment is being made to gain an understanding of users' behaviour, then participants should be allowed to use their own tasks or queries
- if the evaluation is to assess the service, then it will be necessary to design tasks or test searches
A task-based approach can be:
- very directed, as in McGillis and Toms (2001)
- looser simulations of real-world situations, such as those proposed by Borlund (2003) and developed from work by Ingwersen (1992, 1996) and Byström and Järvelin (1995)
Questionnaires are used for post-searching quantitative data collection, as sketched below.
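A rough illustration of the post-search questionnaire stage follows. This is a sketch assuming a 5-point Likert scale; the statement wording paraphrases measures tabulated on the next slides, and the structure and names are hypothetical.

```python
# A sketch of post-search quantitative data collection, assuming a 5-point
# Likert scale. Statement wording paraphrases the slides' measures;
# structure and names are hypothetical.
POST_SEARCH_QUESTIONNAIRE = [
    ("Performance", "I am satisfied that the information I required was found."),
    ("Performance", "I am satisfied with the ranking order of retrieved items."),
    ("Reliability", "I am satisfied with the speed of response."),
    ("Aesthetics", "I am satisfied with the interface and presentation of features."),
    ("Usability", "The service was user friendly."),
]

def collect_responses(questionnaire):
    """Prompt one participant for a 1-5 rating against each statement."""
    responses = {}
    for attribute, statement in questionnaire:
        score = int(input(f"{statement} (1=strongly disagree, 5=strongly agree): "))
        responses.setdefault(attribute, []).append(score)
    return responses
```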

Measuring the Quality Attributes

Performance (basic requirements, primary operating features):
- satisfaction that required information was found
- satisfaction with ranking order of retrieved items

Conformance (agreed standard):
- not evaluated by end users; could be assessed by expert user/service provider

Features (secondary operating attributes, added value, subjective):
- search option(s) used
- features particularly liked

Reliability (high user value):
- any dead links found
- impact of dead links on judgement of service
- satisfaction with speed of response

Durability (sustainability of the service):
- not evaluated by users

Currency (how up to date the information is):
- information retrieved by the service is up to date

Serviceability (how easy it will be to put things right):
- instructions and prompts helpful
- use of Help
- helpfulness of Help

Aesthetics (highly subjective area of prime importance):
- satisfaction with interface and presentation of features
- familiarity with interface/elements of the interface
- ease of understanding of retrieved item list

Perceived quality (users' judgements):
- rated quality of service and information retrieved

Usability (important in any user-centred evaluation):
- user friendliness of service
- how easy it is to remember which features to use
- satisfaction with facility to input query
- satisfaction with facility to modify query
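The table above maps directly onto a simple data structure. A sketch follows: the attribute-to-measure mapping is taken from the table, while the field names and representation are mine.

```python
# The measures table as a data structure; the mapping is taken from the
# slides. Empty lists mark attributes not evaluated by end users.
QUALITY_ATTRIBUTE_MEASURES = {
    "Performance": ["required information was found",
                    "ranking order of retrieved items"],
    "Conformance": [],   # expert user / service provider, not end users
    "Features": ["search option(s) used", "features particularly liked"],
    "Reliability": ["dead links found", "impact of dead links",
                    "speed of response"],
    "Durability": [],    # not evaluated by users
    "Currency": ["information retrieved is up to date"],
    "Serviceability": ["instructions and prompts helpful", "use of Help",
                       "helpfulness of Help"],
    "Aesthetics": ["interface and presentation of features",
                   "familiarity with the interface",
                   "ease of understanding of retrieved item list"],
    "Perceived quality": ["rated quality of service and information retrieved"],
    "Usability": ["user friendliness", "ease of remembering features",
                  "facility to input query", "facility to modify query"],
}
```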

Benefits of the approach
Allows for:
- assessment of a single service
- assessment of multiple services

Single service assessment

Comparative assessment of services: user satisfaction with services by Quality Attribute.
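A sketch of how such a comparison might be computed from questionnaire data follows; the service names and ratings are invented for illustration.

```python
# A sketch of the comparative analysis: mean satisfaction per Quality
# Attribute for each service. Service names and ratings are invented.
from statistics import mean

# scores[service][attribute] -> list of 1-5 Likert ratings from participants
scores = {
    "Service A": {"Performance": [4, 5, 3], "Aesthetics": [2, 3, 2],
                  "Usability": [4, 4, 5]},
    "Service B": {"Performance": [3, 3, 4], "Aesthetics": [4, 5, 4],
                  "Usability": [3, 2, 3]},
}

for service, by_attribute in scores.items():
    print(service)
    for attribute, ratings in sorted(by_attribute.items()):
        print(f"  {attribute:<12} mean satisfaction = {mean(ratings):.2f}")
```

Attributes with low means for one service but not another point to specific areas for improvement rather than a single overall score.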

Conclusions
- Use of Quality Attributes as evaluation criteria allows investigation of user perception of services before and after use.
- Allows service providers and developers to identify specific areas for improvement by targeting those areas which have been assessed lower by users.
- These results are early indicators which seem to demonstrate that measures other than just Performance, e.g. Aesthetics and Usability, play an important role in users' evaluation.

Conclusions
- Students are often confused as to the meaning of quality.
- Further work is needed to explore the meaning of Perceived Quality.
- Brophy (2004) has ruminated that, as a profession, we may be moving beyond individual techniques in an attempt to synthesise the different approaches towards measurements of impact, to get back to the essential question of 'do libraries and their services do any good?'

Contact details and further information
Thank you! Any questions?
Jill Griffiths on