Library Performance Indicators Does it really make sense to measure them? Błażej Feret Library, Technical University of Lodz, Poland

2 Library Performance Indicators Does it really make sense to measure them? Błażej Feret, Library, Technical University of Lodz, Poland, Blazej.Feret@bg.p.lodz.pl; Marzena Marcinek, Main Library, Cracow University of Technology, Poland, marcinek@biblos.pk.edu.pl

3 CASLIN 2006, Český Ráj, 11–15.6.2006 Agenda  Introduction  LPI-s for traditional libraries (ISO)  LPI-s for digital/electronic libraries (ISO)  Measuring service quality: LibQUAL+/Rodski  Performance Analysis of Polish Research Libraries Project (Marzena)  New context of the library/Googlization  Place of the future library  LPI-s for libraries of the 21st century?  Conclusions

4 Introduction How good are we (the libraries)?

5 High quality of library performance is crucial for each research library to survive. Wide on-line access to information makes researchers and students demand the highest-quality library services. It is the quality of library services that determines how the library is perceived within its parent institution and in society. Derfert-Wolf, Górski, Marcinek @IFLA 2005 Introduction. Quality – is it important?

6 Glossary: quality = fitness for purpose, fitness for use, conformity to requirements and absence of defects. ISO Standard 11620 (Performance Indicators for Libraries) defines "quality" as: the totality of features and characteristics of a product or service that bear on the library's ability to satisfy stated or implied needs. In the TQM context: "The quality of service is defined by the customer's perception of both the quality of the product and the service providing it" (Barnard, 1994) Introduction. What is quality?

7 Quality assessment depends not only on the product or service as it is, but also on a person or institution involved in the assessment process. Introduction. Quality.

8 Who decides what quality is, and who evaluates the library's quality ("fitness for purpose")? "Many librarians maintain that only they, the professionals, have the expertise to assess the quality of library service. They assert that users cannot judge quality, users do not know what they want or need, and professional hegemony will be undermined if they kowtow to users. Such opinions about services, in fact, are irrelevant. The only thing that matters is the customer opinions, because without users there is no need for libraries except to serve as warehouses… After all, customers (present, potential, and former ones) believe that the library's reason for being open is to meet their needs. Each customer evaluates the quality of service received and decides when (or if) there will be further interaction with that organization" (Hernon, Altman 1998) Introduction. Quality.

9  The quality of a library is defined and assessed from the perspectives of different groups of people  A basic element of quality is user satisfaction  Users in different countries, or even different user groups, may have different needs and expectations, and therefore different levels of satisfaction with the same service  User satisfaction is NOT an objective value (though it is measurable)  User satisfaction "is the emotional reaction to a specific transaction or service encounter" (Hernon, Altman)  User satisfaction with a single transaction is determined by many different factors, including service quality, the user's past experience with the service provider, the user's current emotional state, etc. (Hernon, Altman) Introduction. Quality.

10 The better the service quality, the higher the user satisfaction, but: the "perceived quality" is different from the "objective quality". "Service quality" depends on customers' perception of what they can expect from a service and what they believe they have received, rather than on any "objective" standard determined by a professional group or by conventional performance measurement. R. Cullen, Library Trends, vol. 49, no. 4, 2001 Introduction. Service quality.

11 Introduction. User satisfaction. A user is satisfied when provision of the service meets his/her expectations. If there is a gap between the service delivered and the user's expectations, the user is not satisfied with the service.

12 1. The discrepancy between users' expectations and management's perception of these expectations 2. The discrepancy between management's perception of users' expectations and service quality specifications 3. The discrepancy between service quality specifications and actual service delivery 4. The discrepancy between actual service delivery and what is communicated to users about it 5. The discrepancy between users' expected service and their perception of the service delivered. The gaps between users' expectations and perceptions (SERVQUAL model): Introduction. Gap analysis.
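The fifth gap is the one user surveys typically quantify: perceived service minus expected service, per survey statement. A minimal sketch of that calculation; the statement names and scores below are invented for illustration, not taken from any survey cited here:

```python
# SERVQUAL Gap 5: perception minus expectation, per statement.
# A negative gap means the service fell short of what the user expected.
# All statement names and scores are made up for illustration.

def gap5(expected: dict, perceived: dict) -> dict:
    """Per-statement gap score: perceived rating minus expected rating."""
    return {s: perceived[s] - expected[s] for s in expected}

expected  = {"opening hours": 8, "staff helpfulness": 7, "e-journal access": 9}
perceived = {"opening hours": 6, "staff helpfulness": 8, "e-journal access": 7}

scores = gap5(expected, perceived)
print(scores)
print(sum(scores.values()) / len(scores))  # mean gap across statements
```

Averaging the per-statement gaps gives one number per library (or per service area), which is what makes survey results comparable over time.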

13 Introduction. Quality = user satisfaction + other parameters. Measuring user satisfaction: quality surveys (LibQUAL+, Rodski). Measuring other parameters: library performance indicators (ISO norms).

14 Introduction  Why measure library performance? Library management and the decision-making process Monitoring implementation of strategic plans Optimization of library activities; service enhancements Acquiring and rationally allocating financial resources Library marketing Accreditation Benchmarking, rankings, comparisons… …  academic vs. public libraries context

15 "How's your wife?" – "Compared to what?" M. Lynch: Compared to What? Or, Where to Find the Stats. American Libraries, September 1999 …to compare

16  Standard ISO 11620:1998, ISO 11620:1998/AD1:2003 Information and Documentation. Library performance indicators  Technical Report ISO/TR 20983:2003 Information and documentation. Performance indicators for electronic library services  Poll R., te Boekhorst P. "Measuring Quality: International Guidelines for Performance Measurement in Academic Libraries". IFLA 1996.  ICOLC Project: Guidelines for Statistical Measures of Usage of Web-Based Information Resources. ICOLC, 2001 (http://www.library.yale.edu/consortia/2001webstats.htm)  Project COUNTER (www.projectcounter.org) Introduction How to measure library performance?

17 ISO 11620 (1998)  This International Standard is applicable to all types of libraries in all countries.  Indicators may be used for comparison over time within the same library. Comparison between libraries may also be made, but only with extreme caution.  This International Standard does not include indicators for the evaluation of the impact of libraries either on individuals or on society. Information and documentation – Library Performance Indicators

18  User perception General USER SATISFACTION  Public Services General Percent of Target Population Reached Cost Per User Library Visits per Capita Cost per Library Visit Providing Documents Titles Availability Required Titles Availability Percentage of Required Titles in the Collection Required Titles Extended Availability In-library Use per Capita Document Use Rate ISO 11620 (1998)

19 ISO 11620 (1998)  Public Services (cont.) Retrieving Documents Median Time of Document Retrieval from Closed Stacks Median Time of Document Retrieval from Open Access Areas Lending Documents Collection Turnover Loans per Capita Documents on Loan per Capita Cost per Loan Loans per Employee Document delivery from external sources Speed of Interlibrary Lending Enquiry and reference services Correct Answer Fill Rate

20 ISO 11620 (1998)  Public Services (cont.) Information searching Title Catalogue Search Success Rate Subject Catalogue Search Success Rate User education NO INDICATOR Facilities Facilities Availability Facilities Use Rate Seat Occupancy Rate Automated Systems Availability  Technical Services Acquiring documents Median Time of Document Acquisition Processing documents Median Time of Document Processing

21 ISO 11620 (1998)  Technical Services (cont.) Cataloguing Cost per Title Catalogued  Promotion of services NO INDICATOR  Availability and use of human resources NO INDICATOR

22 ISO 11620 Amendment 1 (2003)  Public services Providing documents Proportion of Stock Not Used Shelving Accuracy Lending Documents Proportion of Stock on Loan  User services User Services Staff per Capita User Services Staff as Percentage of Total Staff Additional indicators:

23 ISO/TR 20983 (2003) Information and documentation – Performance indicators for electronic libraries The performance indicators described in this Technical Report are used as tools to compare the effectiveness, efficiency and quality of the library's services and products against the library's mission and goals. They can be used for evaluation purposes in the following areas:  Comparing a single library's performance over the years  Support for management decisions  Demonstrating the library's performance and its cost to funders, the population and the public  Comparing performance between libraries of similar structure  Whether the library's performance or the use of its services has changed over the years  How far performance or use in one library differs from that in other libraries

24 ISO/TR 20983 (2003)  Public Services General Percentage of Population Reached by Electronic Services Providing electronic library services Percentage of Expenditure on Information Provision Spent on the Electronic Collection Retrieving documents Number of Documents Downloaded Per Session Cost Per Database Session Cost Per Document Downloaded Percentage of Rejected Sessions Percentage of Remote OPAC Sessions Virtual Visits as Percentage of Total Visits Enquiry and reference services Percentage of Information Requests Submitted Electronically
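Several of these indicators are simple ratios over usage logs and expenditure. A sketch, with invented figures, of how "Number of Documents Downloaded Per Session", "Cost Per Database Session", "Cost Per Document Downloaded" and "Percentage of Rejected Sessions" might be computed; the exact denominators in ISO/TR 20983 differ per indicator, so treat this as an illustration, not the standard's formulas:

```python
# Ratio indicators in the spirit of ISO/TR 20983, from invented sample figures.
sessions  = 12_500    # successful database sessions in the reporting period
downloads = 48_000    # documents downloaded in the period
rejected  = 350       # sessions rejected (e.g. simultaneous-user limit reached)
db_cost   = 25_000.0  # database subscription cost for the period, in EUR

docs_per_session  = downloads / sessions
cost_per_session  = db_cost / sessions
cost_per_download = db_cost / downloads
pct_rejected      = 100 * rejected / (sessions + rejected)  # share of attempted sessions

print(f"documents per session: {docs_per_session:.2f}")
print(f"cost per session:      {cost_per_session:.2f} EUR")
print(f"cost per download:     {cost_per_download:.2f} EUR")
print(f"rejected sessions:     {pct_rejected:.1f} %")
```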

25 ISO/TR 20983 (2003)  Public Services (cont.) User education Number of User Attendances at Electronic Service Training Lessons Per Capita Facilities Workstation Hours Available Per Capita Population Per Public Access Workstation Workstation Use Rate  Availability and use of human resources Staff training Number of Attendances at Formal IT and Related Training Lessons Per Staff Member Deployment of Staff Percentage of Library Staff Providing and Developing Electronic Services

26 Measuring library service quality  LibQUAL+  RODSKI  Performance Analysis for Polish Research Libraries

27 LibQUAL+ (ARL)  LibQUAL+ has 22 standard statements and the option to select five local service quality assessment statements. For each statement, the client is asked to give three ratings: the minimum, desired and perceived levels of service quality. These are all scaled 1–9, with 9 being the most favourable. There is an open-ended comments box about library services in general.
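From the three ratings per statement, LibQUAL+ derives two gap scores: the service adequacy gap (perceived minus minimum) and the service superiority gap (perceived minus desired); the range between minimum and desired is often called the zone of tolerance. A minimal sketch with invented ratings for a single statement:

```python
# LibQUAL+ gap scores for one statement, rated on the 1-9 scale.
# Ratings below are invented; in the real survey each respondent rates
# minimum, desired and perceived levels for each of the 22 statements.

def gaps(minimum: float, desired: float, perceived: float) -> tuple[float, float]:
    adequacy = perceived - minimum      # positive: above the minimum acceptable level
    superiority = perceived - desired   # positive: exceeds the desired level
    return adequacy, superiority

adequacy, superiority = gaps(minimum=5, desired=8, perceived=6)
print(adequacy, superiority)  # 1 -2: within the zone of tolerance
```

A positive adequacy gap with a negative superiority gap, as here, means the perceived service sits inside the zone of tolerance: acceptable, but short of what users desire.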

29  Rodski is an Australian behavioural research company which develops its own surveys.  The Rodski survey has 38 statements and the option to include up to 15 local service quality assessment statements, which clients are asked to rate twice – first to measure the importance of each statement to them, and second to measure their impression of the library's performance on each statement. These are scaled 1–7, with 7 being the most favourable. There are two comment boxes at the end of the survey – one for general comments and one for "the one area we could improve on to assist you". RODSKI (rodski.com.au)

31 Performance Analysis for Polish Research Libraries Marzena Marcinek

32 Research Libraries in Poland
Total: 1225
National Library: 1
Academic Libraries: 989
Libraries of the Polish Academy of Sciences (PAS): 94
Libraries of branch R&D units: 99
Public libraries: 11
Other: 31

33 Collection of Polish research libraries (excl. e-collection); total, books and serials in thousand vols., special collections in thousand physical units
                               Total    Books    Serials  Special collections
Total                          73.503   57.546   15.957   25.967
National Library                2.865    2.129      736    2.977
Academic Libraries             52.804   42.238   10.566   19.308
Libraries of the PAS            4.729    2.960    1.769      583
Libraries of branch R&D units   2.867    2.062      805    1.093
Public libraries                6.386    5.391      995    1.463
Other                           3.852    2.766    1.086      543

34 Readers, loans and staff of research libraries (readers in thousands; loans for individual users in thousand physical units)
                               Readers  Loans    Staff
Total                           2.102   17.297   9.461
National Library                   35       25     590
Academic Libraries              1.657   14.712   6.680
Libraries of the PAS               42      270     380
Libraries of branch R&D units      37      138     234
Public libraries                  250    1.904     975
Other                              81      247     602

35 Academic libraries. Regulations: the Library Act of 1997; the Higher Education Act of 2005. Funds: budgets of parent institutions from the resources of the appropriate ministries, e.g. the Ministry of Science and Higher Education (usually covering only current expenditure); various grants and projects

36 Official library statistics in Poland: Central Statistical Office (CSO) – data collected every second year; The Higher Education, published by the Ministry of Science and Higher Education – data collected every year

37 Assessment of higher education institutions (and libraries): State Accreditation Commission; journals; university / parent institution bodies; libraries

38  lack of a national library statistics system  data on libraries are gathered every second year by the Central Statistical Office – insufficient for comparative analyses and not consistent with ISO 2789  lack of unified criteria to evaluate and compare library performance  lack of tools for systematic data gathering  lack of a body/institution responsible for developing methods and tools for library evaluation  the State Accreditation Commission deals with library issues in a very general manner Characteristics of library statistics and performance measurement in Poland

39 Quality initiatives and user surveys in Polish academic libraries. Development of Library Management as Part of the University Total Quality Management (EU Tempus grant, 1998–2000): "Analysis of the current state of libraries with selected performance indicators"; user survey (LIBRA package). Comparative studies of Polish research libraries (national conference, Krakow 2001). Many separate studies and surveys; the need for common patterns and results.

40 The Group for Standardisation for Polish Research Libraries  formed in 2001, initially as an informal team  activities incorporated into the overall plan of tasks of the Standing Conference of the Directors of Higher Education Libraries  "Performance Analysis for Polish Research Libraries” – a project based on the agreement on cooperation signed by 8 institutions employing members of the Group (2004)  Project co-financed by the Ministry of National Education and Sport, 2004

41 A Common Project of Polish Research Libraries on Comparable Measures Objectives  to define methods for the assessment of Polish research libraries  to select a set of performance indicators and standards for library performance (quantity, quality and effectiveness)

42  to collect libraries' statistical data in a computer database  to conduct comparative research  to prepare and publish yearly reports Goals

43 Tasks  identification of publications on library performance and of national solutions in different countries  preparation and further modification of a questionnaire for the survey of library performance  preparation and further modification of dedicated software for the acquisition and analysis of data collected in the surveys  data collection  promotion  detailed analysis of data

44 Staff Collection Budget Infrastructure Circulation Information services Didactics Publications and data bases created by the library Library cooperation, organisation of library events, professional activity of library staff Questionnaire

45 EU TEMPUS PHARE JEP 13242-98 "Development of Library Management as part of the University TQM"  ISO 11620:1998, AD1:2003 Information and Documentation. Library performance indicators  ISO 2789:2003 Information and Documentation. International Library Statistics  R. Poll, P. te Boekhorst "Measuring Quality: International Guidelines for Performance Measurement in Academic Libraries". IFLA 1996 Patterns for the Polish Questionnaire

46 Changes and modifications to the questionnaire (2004)  more indicators and formulas based mainly on the ISO 11620 and ISO 2789 standards (information services, electronic sources and usage)  problems reported by librarians or observed by the administrator of the database  more notes and comments (financial and staff issues)

47 Questionnaire  48 questions of various types  refer to easily accessible or computable data (e.g. size of collection, number of users etc.)  closed questions about the services offered (e.g. on-line reservation: Yes/No)  88 performance indicators  19 calculated by librarians  69 calculated automatically

48  the need for a comprehensive analysis of the current state of Polish research libraries  the need to cover all aspects of library activities included in the questionnaires  the need to develop standards for library evaluation in the future, on the basis of current performance indicators  usefulness for different purposes, both for libraries and for other institutions and authorities  selected indicators calculated three times (owing to the lack of an FTE student equivalent) Why so many indicators?

49  library expenditure per student/user,  expenditures for library materials/books per student/user  ratio of library budget to the budget of its parent university  time required for the technical processing of a document  collection on the computer system as a % of the whole collection of the library  percent of catalogue descriptions acquired from outside resources Examples of performance indicators required to complete the questionnaire

50  Registered users as % of potential users  Total books per student/user  Books added per student/user  Number of students/users per one library staff member  Total library area per student/user  Number of students/users per one study place in reading rooms  Loans per registered user  Loans per library staff member  User services staff as % of total staff  Staff with higher LIS education as % of total staff  Open access printed books as % of total printed books Examples of performance indicators calculated automatically
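Indicators of this kind are straightforward ratios over the raw questionnaire fields, which is why the software can derive them automatically once the data are entered. A minimal sketch with invented raw figures (none of these numbers come from the Project's database):

```python
# A few of the automatically calculated indicators, from invented raw data.
raw = {
    "potential_users":   20_000,   # e.g. all enrolled students and staff
    "registered_users":  14_500,
    "printed_books":    600_000,
    "open_access_books": 150_000,
    "loans":            210_000,
    "library_staff":         80,
}

indicators = {
    "registered users as % of potential users":
        100 * raw["registered_users"] / raw["potential_users"],
    "loans per registered user":
        raw["loans"] / raw["registered_users"],
    "loans per library staff member":
        raw["loans"] / raw["library_staff"],
    "open access printed books as % of total printed books":
        100 * raw["open_access_books"] / raw["printed_books"],
}

for name, value in indicators.items():
    print(f"{name}: {value:.1f}")
```

Because every indicator is a pure function of the questionnaire fields, validation reduces to checking the raw inputs; the ratios never have to be re-keyed by librarians.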

51 Software for the acquisition and analysis of data – requirements  on-line access to the questionnaire (submission, modification)  selected performance indicators automatically calculated and presented  automatic control and verification of the accuracy of data in the fields  multi-aspect comparative analysis of selected data and performance indicators  access to analysing functions for individual libraries  Internet website - information about the Project, a set of instructions, questionnaires, useful links, results of research  module for librarians - an on-line questionnaire, multi-aspect analysis of data concerning one’s own library  administrator’s module - registration of libraries and direct contacts  database - incorporate and register data from the questionnaires (dynamic form)  module for the Group - statistical analyses on data and performance indicators

52 Elements of Software Application: an Internet web-site with direct links to information about the Project, a set of instructions, questionnaires, results of research, and useful links to sites dealing with performance indicators and library statistics; a module for librarians – an on-line questionnaire with tools for automatic control and verification of the accuracy of data entered in each field, plus adequate formulae to calculate selected performance indicators. There are two versions of the questionnaire: for academic libraries and for public libraries. The module for librarians also enables multi-aspect analysis of data concerning one's own library according to various criteria;

53 Software (2): an administrator's module enables registration of libraries and of individual persons entitled to transmit data and work out analyses. It is also used for direct contacts with library staff responsible for filling in the questionnaires; the database designed to incorporate and register data from the questionnaires has been given a dynamic form, i.e. the administrator can change, add or delete any fields corresponding to the questions from the questionnaire; a module for the Task Group for Standardisation is designed as a tool to carry out statistical analyses on data and performance indicators.

54 Data collection. Since autumn 2003 the programme for statistical data collection has been accessible to each library registered in the system. By 15 May 2005, 57 libraries were registered in the Project database:  52 academic libraries (41 state-owned and 11 non-state-owned)  3 public libraries  2 special libraries. Questionnaires for 2003 were completed by 29 libraries:  23 state-owned academic libraries  2 non-state-owned academic libraries  2 special libraries  2 public libraries. Questionnaires for 2002 were completed by 17 libraries:  16 academic libraries  1 public library

55 Problems with data collection: lack of some statistical data required to complete the questionnaire, or difficulties in obtaining them; lack of comparable data on the use of electronic resources (incl. differences in usage statistics generated by various providers); differences in library structure and budgeting within universities; difficulties with validation – mistakes (e.g. a wrong ratio) need correction, misunderstanding of data requirements, wrong interpretation of questions; participation in the Project is not compulsory

56  performance indicators calculated by librarians  performance indicators calculated automatically  ratio of expenditures in library budgets  groups of the analysed libraries:  state-owned academic libraries of different types (university libraries, technical university libraries, agricultural university libraries and other),  non state-owned academic libraries,  public libraries  special libraries  average values, medians, maximal and minimal values The analysis of data for 2002-2003 - examples
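The comparative analysis amounts to descriptive statistics per library group. A sketch of that step with invented values for one indicator across a few libraries (the group names follow the slide; the figures are made up):

```python
# Average, median, min and max of one indicator within each library group.
# Group names follow the slide; the per-library figures are invented.
from statistics import mean, median

loans_per_user = {
    "technical university libraries": [12.3, 15.1, 9.8, 11.0],
    "public libraries":               [18.4, 21.0, 16.2],
}

for group, values in loans_per_user.items():
    print(group, round(mean(values), 2), median(values), min(values), max(values))
```

Reporting the median alongside the mean matters here: with so few libraries per group, a single outlier can move the average substantially.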

59 User satisfaction. IFLA guidelines recommend two indicators to examine user opinion: user satisfaction measured at two levels – general user satisfaction, and user satisfaction with individual services or components of those services; and user satisfaction with services offered for remote use

60 So far and in the future. Now: libraries conduct their own user surveys; libraries involved in the Tempus Project, within a common user survey project, analysed user needs with a professional computer programme – the LIBRA software package. Future: a unified, nation-wide user survey conducted with common tools, with methodology based on ISO standard 11620 and IFLA guidelines. The results obtained will be quantified and presented as numerical scores.

61 Conclusions. The Project "Performance Analysis for Polish Research Libraries" is focused on the development of methods and standards for the evaluation of the quality of research libraries, including academic ones. The Task Group for Standardisation is convinced that such development of methods and standards ought to be preceded by a several-year examination of performance indicators based on library statistics and user satisfaction research. In the next stage, the results of such research will be used to assess the degree to which libraries comply with the required standards. The evaluation of the current performance of research libraries is the first stage of that task. The methodology and tools used in the Project need to be improved, completed and developed.

62 Plans for the future (1)  to continue the process of standardisation of statistical data  to prepare guidelines for the interpretation of the indicators used  to improve the database – more possibilities for comparative studies  to develop more performance indicators based on ISO 11620 that can be calculated from the data already collected  to select more performance indicators concerning the electronic environment

63 Plans for the future (2)  to prepare a comment form for questions from respondents  to calculate more performance indicators from ISO 11620 on the basis of data already existing  to develop standard user surveys and computer software for data analysis, determining quantified user satisfaction as qualified indicators  to develop a nation-wide set of standards and clearly determine a set of performance indicator formulas and interpretations for each standard  to promote, to promote, to promote...

64 The Internet in the library

65 Internet as a competitor to the library. Here is how Americans line up when probed about specific topics and whether they think the Internet will satisfy their information needs. Pew Internet Project 2002: www.pewinternet.org

66 Internet as a competitor to the library (chart: 71%, 85%, 87%, 76%)

67  Easy access to the Internet and simple-to-use search mechanisms  Independence in conducting searches  Apparent proficiency in using internet browsers/search engines  Abundance of information available  Availability of the same or similar sources of information  Variety of information types available with the same tool Internet as a competitor to the library – why?

68 Internet context: workplace, entertainment, learning, neighbourhood, research = library context

70 New context  Changes in users' needs and patterns of behaviour: Cellular phones Multimedia online (music, movies) News and other information online Podcasts (what's this?) RSS (what's this?) Trade online Personalization of services  Communities online Blogs (what's this?) Wikis (what's this?) Chat rooms Tagging, commenting, opinions Virtual realities, lives…

77 150,000-250,000 visits A DAY!

80  Technologies behind

82 PL = 73%, W. Europe → 100%

83 Where can I find a podcast of the latest lecture by…?

84 Palmtops

85 Nano Phone, Cardphones,...

86 E-books

87 E-books

88 E-books

89 E-paper

90 E-paper

91 E-paper. Do you know how to catalogue THIS? Can it be catalogued at all? Do we NEED to catalogue this?

92 Google invests in wired … A $189,000,000 pilot

93 Bidirectional wireless module

94  Googlization

95 Google as the dominant information provider: Google “Classic”, Google News, Google Print, Google Earth, Google Video, Google Alerts, Google SMS, Google Answers, Google Groups, Google Labs, Google College, Google Scholar, …???

109 Books

111 Next Massive Wave (2006/7): Broadband Expands; Secure Broadband Wireless (3G); Low-Power-Consumption Mobile/Display Devices; Real-Time Infrastructure; Transition to Service-Oriented Architecture

112 WEB 2.0  RSS – Really Simple Syndication  Wikis  New Programming Tools: AJAX, API  Blogs and blogging  Recommender Functionality  Personalized Alerts  Web Services  Folksonomies, Tagging and Tag Clouds  Social Networking  Open Access, Open Source, Open Content  Commentary and comments  Personalization and My Profiles  Podcasting and MP3 files  Streaming Media – audio and video  User-driven Reviews  Rankings & User-driven Ratings  Instant Messaging and Virtual Reference  Photos (e.g. Flickr, Picasa)  Socially Driven Content  Social Bookmarking
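For readers wondering "what's this?" about the first item on the slide: an RSS feed is just structured XML that any client can poll and parse. A minimal Python sketch (the feed content below is invented for the example, e.g. as a library's news feed):

```python
import xml.etree.ElementTree as ET

# A tiny, hypothetical RSS 2.0 feed, such as a library might publish
# to announce news (all titles and links are invented).
FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Library News</title>
    <item><title>New e-book collection online</title><link>http://example.org/1</link></item>
    <item><title>Opening hours extended</title><link>http://example.org/2</link></item>
  </channel>
</rss>"""

def list_items(feed_xml):
    """Return (title, link) pairs for every <item> in an RSS 2.0 feed."""
    root = ET.fromstring(feed_xml)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

for title, link in list_items(FEED):
    print(title, "->", link)
```

The point of the format is exactly this simplicity: users subscribe once and new items are delivered to their reader instead of being hunted down on a website.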

113 Use your imagination:  Mobile devices  Electronic paper  Global digitization of resources  Open access to knowledge  Wireless networks

114  Place of the library

116 Results of round II/III (Feret, Marcinek 2005). In 10 years from now: what percentage of information will be consumed by people via electronic, rather than printed, media? [chart: % for book reading, book distribution, journal reading, journal distribution, el. info reading, el. info distribution]

117 Results of round II/III (Feret, Marcinek 2005). What percentage of queries asked by academic library users will, in the year 2015, be directed to the Internet instead of their university library? [chart: %, reference vs. research queries] What percentage of library users will visit the library in person at least once a year, in the university of 2015?

118 The power of libraries

119 The Long Tail of QUESTIONS [chart label: libraries]
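The "long tail" point is quantitative: when question topics follow a power-law (Zipf-like) distribution, the many rare queries collectively rival the few popular ones. A small sketch, assuming an illustrative Zipf exponent of 1 and 10,000 topics (invented parameters, not measured data):

```python
def zipf_frequencies(n, s=1.0):
    """Unnormalised Zipf frequencies 1/k^s for ranks k = 1..n."""
    return [1.0 / k ** s for k in range(1, n + 1)]

freqs = zipf_frequencies(10_000)
total = sum(freqs)
head = sum(freqs[:100])   # the 100 most popular topics (top 1%)
tail = sum(freqs[100:])   # everything else: the long tail
print(f"head share: {head / total:.0%}, tail share: {tail / total:.0%}")
# → head share: 53%, tail share: 47%
```

Under these assumptions the rarest 99% of topics together account for nearly half of all queries, which is why a service that can answer obscure questions well (as libraries can) still matters in a search-engine world.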

120 Place of the library. Library – a social place? Library – an information sorter? Library – a warehouse of obsolete resources? Library – a consultation centre?  …we need to move on from the mindset of the local 'library' as the core supplemented by digital resources from external providers and the wider internet – to a different mindset where the 'library' is a value-added overlay on the wider canvas of readily available digital information content, which provides value-added presentation and personalised delivery of information resources to match the specific needs of researchers, students and staff in the University, integrated with their other working/study materials. (Di Martin)

121 Library performance indicators in 2020. What indicators?  Some of the "classic" indicators will survive  The basic indicators will be: Demand for library services (percentage of the target population that uses the library) (what for? how often?) User satisfaction (definitely!) Impact of the library on the quality of scientific research (VERY difficult to measure) Ranking in user-driven ratings … any other suggestions?
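The "demand" indicator proposed on this slide corresponds to the market-penetration style of indicator in ISO 11620: the percentage of the target population that used the library at least once in the period. A minimal sketch of how the two basic indicators named above might be computed (all figures are invented for illustration):

```python
def market_penetration(active_users, target_population):
    """Percentage of the target population that used the library
    at least once in the reporting period (ISO 11620-style
    market-penetration indicator)."""
    return 100.0 * active_users / target_population

def mean_satisfaction(scores):
    """Average user-satisfaction score, e.g. on a 1-5 survey scale."""
    return sum(scores) / len(scores)

# Hypothetical figures for a university of 20,000 students and staff:
print(market_penetration(active_users=13_000, target_population=20_000))  # 65.0
print(round(mean_satisfaction([5, 4, 4, 3, 5, 4]), 2))                    # 4.17
```

Note that, as the slide warns, the third proposed indicator (impact on research quality) has no such simple formula; it is the hard one.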

122 Conclusions  Library performance indicators will not die, though formal measurement of different aspects of library activity will be less important than it is now  The basic factor driving change will be user satisfaction; this will also be important to university management as proof of the library's importance

123 It’s an “Exploration Space”, not a collection space

124 It’s an Information Ocean, not a Highway

125 The future is already here, it’s just not evenly distributed yet (William Gibson)

126 You don’t have to agree with us, but be warned…

127 We tend to overestimate the changes that will happen in the coming year, but underestimate the changes of the coming decade… Andrew Odlyzko

128 Many thanks to Stephen Abram (SirsiDynix) for sharing slides, some of which were used in this presentation.

