Evaluation of library and information services (LIS): an overview
Contexts, approaches, levels, requirements, measures
© Tefko Saracevic, Rutgers University


Slide 1: Evaluation of library and information services (LIS): an overview
Contexts, approaches, levels, requirements, measures
Tefko Saracevic, Rutgers University

Slide 2: Why evaluate?
The importance of evaluating LIS is increasing because:
- the social importance of information is changing
- services are shifting from a "just-in-case" to a "just-in-time" model, with stress on access
- competition is increasing, with many new players competing for resources
- electronic information resources and networks are growing
- funders in both practice and research demand more justification

Slide 3: Broad context
The roles that LIS play relate to:
- SOCIETY: community, culture, discipline...
- INSTITUTIONS: universities, organizations, companies...
- INDIVIDUALS: users and potential users (nonusers)
These roles lead to broad but hard questions about which context to choose for evaluation.
Each context demands different criteria, measures, and methodologies.

Slide 4: Context questions
- Social: how well do LIS support the information demands, needs, and roles of a society or community? (hardest to evaluate)
- Institutional: how well do LIS support the institutional or organizational mission and objectives? (tied to the objectives of the institution; also hard to evaluate)
- Individual: how well do LIS support the information needs and activities of people? (most evaluations are done in this context)

Slide 5: Approaches to evaluation
- Many approaches exist (quantitative, qualitative, effectiveness, efficiency...); each has strong and weak points.
- The systems approach is prevalent. Effectiveness: how well does a system perform that for which it was designed? Evaluation is related to objective(s) and requires choices: which objective or function to evaluate?

Slide 6: Approaches (cont.)
- Economic approach. Efficiency: at what cost? Cost-effectiveness: cost for a given level of effectiveness.
- Ethnographic approach: practices and effects within an organization or community; learning and using practices, and comparisons.

Slide 7: Approaches (cont.)
Distinctions among three related notions (a numeric sketch follows below):
- Effectiveness: how well does a LIS achieve that for which it was designed? (relates to objectives)
- Efficiency: what are the costs of operating a LIS? (relates to money, time, effort...)
- Cost-effectiveness: what are the costs for a given level of effectiveness? (relates effectiveness and efficiency to each other)
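To make the three notions concrete, here is a minimal sketch in Python; the service names, request counts, and costs are invented for illustration and do not come from the slides.

    # Hypothetical figures for two services; all names and numbers are illustrative.
    services = {
        "Service A": {"requests": 1000, "satisfied": 820, "annual_cost": 50_000},
        "Service B": {"requests": 1000, "satisfied": 760, "annual_cost": 35_000},
    }

    for name, s in services.items():
        effectiveness = s["satisfied"] / s["requests"]          # how well the objective is met
        efficiency = s["annual_cost"] / s["requests"]           # cost per request handled
        cost_effectiveness = s["annual_cost"] / s["satisfied"]  # cost per satisfied request
        print(f"{name}: effectiveness {effectiveness:.0%}, "
              f"cost/request ${efficiency:.2f}, cost/satisfied request ${cost_effectiveness:.2f}")

In this made-up comparison Service B is cheaper per request (more efficient) but less effective; cost per satisfied request is one simple way of relating the two.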

Slide 8: Levels of evaluation
System-centered:
1. Engineering: hardware and software; reliability, errors
2. Input: contents, coverage
3. Processing: procedures, techniques, algorithms
User-centered:
4. Output: search, interaction
5. Use and user: application to tasks; market; fitness of use
6. Social: effect on research, productivity, organization...
Danger: isolation of levels.

Slide 9: Requirements for evaluation
Once a context is selected, all five of the following need to be specified:
1. Construct: a system, process, or source, e.g. a given IR function or system, a Web site, a digital library source
2. Criteria, reflecting the objective(s): e.g. relevance, utility, satisfaction, accuracy, completeness, time, costs
3. Measure(s), reflecting the criteria: precision, recall, various Likert scales, costs...

Slide 10: Requirements (cont.)
4. Measuring instrument: judgments by users on relevance or on a scale; cost per function (see the sketch below)
5. Methodology: procedures for collecting and analyzing data
- No evaluation can proceed unless ALL of these are specified!
- Sometimes the specification of some of them is informal and implied, but they are always there.
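As a purely illustrative sketch of requirement 4, a measuring instrument such as a 5-point Likert satisfaction scale could be tallied as follows in Python; the judgments are invented, not data from any study.

    # Hypothetical user judgments on a 5-point Likert scale
    # (1 = very dissatisfied ... 5 = very satisfied).
    judgments = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]

    mean_score = sum(judgments) / len(judgments)
    satisfied_share = sum(1 for j in judgments if j >= 4) / len(judgments)

    print(f"Mean satisfaction: {mean_score:.2f} on a 1-5 scale")  # 3.90
    print(f"Users rating 4 or 5: {satisfied_share:.0%}")          # 70%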

Slide 11: LIS functions
- When evaluating, we have to consider processes/functions; each function calls for different evaluation approaches.
- Major LIS functions:
  AVAILABILITY: acquisition of information materials and resources; holdings
  ORGANIZATION: intellectual and physical
  ACCESS: physical and intellectual; searching, retrieval
  OUTPUTS: dissemination, use

Slide 12: Availability
- Social: how good is coverage of a field, problem area, or community?
  Criteria: representativeness, depth, breadth, currency...
  Measures: degree, duplication
  Methods: comparison, survey
- Institutional: how well do information resources satisfy the mission, needs, and plans (education, research, work...)?
  Criteria: matching, attributes
  Methods: survey, functional comparison (e.g. against a curriculum)

Slide 13: Availability (cont.)
- Individual: how well are users served and satisfied?
  Criteria: awareness, expectations, satisfaction, success and failure rates
  Measures: scales, branching diagrams (success or failure at each point of user action)
  Methods: surveys, counting and statistical analyses, probability of success (e.g. requests fulfilled relative to requests made)

Slide 14: Organization
- Processing level: how well is a collection or database represented and organized?
  Criteria: depth, breadth, type, relevance, quality, errors, time, effort, costs...
  Measures: degree, precision, recall, quality benchmarks (standards), error rate, time per process, costs...
  Methods: comparative processing, user or expert evaluation, quality analyses, economic analyses

Slide 15: Access
- Individual: how well did users interact with a service? This concerns users' reactions to interaction with the system.
  Criteria: accessibility, effort, convenience, facilities (ease, adequacy), staff (helpfulness, efficiency), frustration, errors, difficulties...
  Measures: scales, indicators
  Methods: surveys, interviews, observations, experiments, transaction log analysis

Slide 16: Access: searching and retrieval
- Individual: how well did users retrieve relevant answers?
- Related to user needs and tasks, but often concentrated on system algorithms, human-computer interaction, etc.
- Criterion: relevance (a few others have been proposed, e.g. satisfaction)
- Measures: recall and precision; others include overlap, consistency, Likert scales (see the sketch below)
- Methods: laboratory experiments (e.g. TREC), observation...
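Since recall and precision are the central measures named here, this is a minimal Python sketch of how they are computed from a retrieved set and a set of relevance judgments; the document identifiers are invented.

    # Precision = relevant items retrieved / all items retrieved
    # Recall    = relevant items retrieved / all relevant items in the collection
    retrieved = {"d1", "d3", "d5", "d7", "d9"}        # what the search returned
    relevant  = {"d1", "d2", "d3", "d5", "d8", "d9"}  # what the judges deemed relevant

    hits = retrieved & relevant
    precision = len(hits) / len(retrieved)
    recall = len(hits) / len(relevant)

    print(f"Precision: {precision:.2f}")  # 4/5 = 0.80
    print(f"Recall:    {recall:.2f}")     # 4/6 = 0.67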

Slide 17: Dissemination and use
- Individual: how did users perceive the results of use? Related to users' tasks.
  Criteria: cognitive (learning...), affective (satisfaction...), accomplishment (of the task), expectations (getting...), time (saving, worth...), money (cost, value...)
  Measures: scales, numbers
  Methods: surveys, interviews, critical incident technique, impact estimates

Slide 18: Operational and quality criteria (Say, Seaman & Cohen)
- Reliability: delivering LIS accurately and dependably; correct, relevant answers; consistency
- Responsiveness: readiness to provide service; minimizing turnaround time; callbacks
- Assurance: knowledge, ability, and courtesy of staff; understanding of the collection and technology; providing individual attention

Slide 19: Quality criteria (cont.)
- Access: sufficiency of staff, equipment, and hours of operation; waiting time; access policies; location
- Communication: informing and listening; adjusting language; question negotiation; teaching and instructing users
- Security: freedom from danger, risk, or doubt; safety; confidentiality
- Tangibles: physical facilities; condition of the building, layouts, and equipment

Slide 20: Branching method
Reasons for satisfying (or not satisfying) a known-item request: success and failure analysis.
A request passes through a chain of checks, each with a failure branch:
  Total requests (T) -> failure: not acquired
  Circulation (C) -> failure: in circulation
  Library function (L) -> failure: library malfunction
  User function (U) -> failure: user malfunction
  Satisfied requests (S)
Satisfaction rate (as a percentage) = S/T

Slide 21: Branching (cont.)
Example from a study of requests for specific books in an academic library:
  T = 437 total requests
  C = 399 (not acquired: 38)
  L = 347 (in circulation: 52)
  U = 299 (library malfunction: 48)
  S = 245 (user malfunction: 54)

Slide 22: Branching (cont.)
Calculation of performance rates (a short computational check follows below):
  Satisfaction rate = 245/437 = 0.56 = 56%
  Acquisition performance = 399/437 = 91%, i.e. the library held 91% of the requested books
  Circulation performance = 347/399 = 87%; 13% of acquired books were out in circulation
  Library performance = 299/347 = 86%; 14% of books not in circulation were not found because of some library malfunction
  User performance = 245/299 = 82%; 18% of books that were on the shelf were not found by users because of their own error
  Satisfaction rate (by probabilities) = 0.91 (A) x 0.87 (C) x 0.86 (L) x 0.82 (U) = 0.56, or 56%
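The branching arithmetic above can be reproduced directly from the counts on slide 21; a short Python sketch, using only the figures given on the slides:

    # Requests surviving each stage of the branching diagram (slide 21).
    T, C, L, U, S = 437, 399, 347, 299, 245

    acquisition_perf = C / T  # library held the requested book
    circulation_perf = L / C  # held book was not out in circulation
    library_perf     = U / L  # no library malfunction
    user_perf        = S / U  # user located the book on the shelf

    satisfaction_direct = S / T
    satisfaction_by_prob = acquisition_perf * circulation_perf * library_perf * user_perf

    print(f"Acquisition {acquisition_perf:.0%}, circulation {circulation_perf:.0%}, "
          f"library {library_perf:.0%}, user {user_perf:.0%}")
    print(f"Satisfaction rate: {satisfaction_direct:.0%} direct, "
          f"{satisfaction_by_prob:.0%} as a product of stage rates")

The product of the stage rates necessarily equals S/T, since the intermediate counts cancel; the value of the decomposition is that it shows which stage loses the most requests.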

Slide 23: Conclusions
- In practice, the need for and importance of evaluation are increasing.
- In research, there is an ever-present need to evaluate new systems and approaches.
- Evaluation is essential for improvements, decisions, and resource allocation.
- But evaluation requires: commitment by management and staff; hard work; financial and human resources; knowledge of how to do it; and a continuous rather than one-shot effort.
If we do not evaluate, others will.