Vendor Usage Data for Electronic Resources: A Survey of Libraries

Vendor Usage Data for Electronic Resources: A Survey of Libraries ER&L 2007 Conference February 23, 2007 Gayle Baker & Eleanor Read

MaxData “Maximizing Library Investments in Digital Collections Through Better Data Gathering and Analysis” Funded by Institute of Museum and Library Services (IMLS) 2005-2007

MaxData Project: UT Libraries Team Gayle Baker, UT Libraries Eleanor Read, UT Libraries Maribeth Manoff, UT Libraries Ron Gilmour, UT Libraries Carol Tenopir, Project Director, UT SIS and Center for Information Studies http://web.utk.edu/~tenopir/maxdata/index.htm

Talk Outline Survey (why, who, when) Responses General Conclusions Suggested areas for more research

Survey: Purpose How much effort is involved in working with vendor-supplied use data? How are the data used? What data are most useful in managing electronic resources? Mention the Yovits-Ernst definition of information as data of value used in decision-making. At the UT Libraries, before the project, we downloaded individual usage reports every 6 months. For ARL, we put together totals of July-June downloads, searches, and sessions into a spreadsheet for the ARL supplemental statistics (approximately 120 hours, or 3 weeks, every 6 months). We put the reports up on an intranet by vendor/publisher and told the subject librarians that they were available, with no detailed analysis by title. In a few instances, mainly “big deals,” we added price data and identified subscribed titles. We looked at the use of subscribed vs. non-subscribed titles and determined that some non-subscribed titles had very high use, while some subscribed titles had little use. We were able to use this information to drop unused subscriptions and add heavily used unsubscribed titles as subscriptions. Vendor definition: an organization that delivers content to the library and with which the library has a contractual agreement for content delivery.
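A minimal Python sketch of the kind of subscribed vs. non-subscribed comparison described above. All titles, prices, download counts, and thresholds are invented for illustration; they are not the project's actual data or method.

```python
# Sketch of a subscribed vs. non-subscribed comparison like the one described
# above. All titles, prices, counts, and thresholds are invented.

subscriptions = {
    # title: (annual price in dollars, full-text downloads for the year)
    "Journal of Example Studies": (1200.00, 14),
    "Annals of Hypothetical Research": (950.00, 620),
}

unsubscribed_use = {
    # titles available only through a "big deal" package, with their downloads
    "Review of Illustrative Methods": 480,
    "Quarterly of Sample Data": 3,
}

# Flag subscribed titles whose cost per use looks high enough to reconsider,
# and unsubscribed titles used heavily enough to consider adding.
for title, (price, downloads) in subscriptions.items():
    cost_per_use = price / downloads if downloads else float("inf")
    if cost_per_use > 25:  # arbitrary threshold, for illustration only
        print(f"Candidate to cancel: {title} (${cost_per_use:.2f} per download)")

for title, downloads in unsubscribed_use.items():
    if downloads > 200:  # again, an arbitrary illustrative threshold
        print(f"Candidate to subscribe: {title} ({downloads} downloads)")
```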

Survey: Subjects Sent to Library Directors at Carnegie I and II research institutions (360+) in April 2006; 99 replies, 92 respondents. A copy of the survey is on the conference web site for this talk. Library directors were asked to forward the survey to the person in their library who worked with vendor-supplied usage statistics. I would like to thank those of you who responded to the survey. We received 99 replies, 92 of which answered the survey. Replies not counted as respondents: some duplicates; indications that they did not receive vendor-supplied usage data; some that were not academic institutions with an FTE (government library? to be checked); and a very thoughtful letter about the difficulty of providing answers to the questions.

FTE of Respondents

Number of Vendors Providing Usage Reports 1. Please provide the following information about usage statistics for electronic resources that you receive from vendors. Number of vendors that provide reports ____ There may have been some confusion between the number of different vendors and the number of individual e-resources; I was after the number of individual sources/sites of vendor usage data. On the histogram, you can see… We are not looking at the mean to make decisions; the median may be better. Why the median, not the mean: in most cases the responses were not normally distributed, so we use the median. No statistical tests were done; this was descriptive analysis.
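To illustrate why the median was reported rather than the mean, here is a small Python sketch with invented response counts: a few very large values inflate the mean, while the median stays closer to a typical respondent.

```python
# Why report the median rather than the mean when responses are skewed:
# a few very large values pull the mean up, while the median remains
# representative of a typical respondent. The counts below are invented.
from statistics import mean, median

vendors_reported = [3, 5, 6, 8, 10, 12, 15, 20, 25, 150, 300]

print(f"mean   = {mean(vendors_reported):.1f}")  # inflated by the two large values
print(f"median = {median(vendors_reported)}")    # a more typical response
```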

Number of Vendors Providing Usage Reports Here’s a graphical representation to show you how the data is distributed with a few high values.

Reports for Different Types of Resources We asked what percentage of the reports relate to various types of electronic resources: 1.b. full-text journals; abstract & index databases (no full-text journals); electronic books; other electronic resources. I probably should have had a category for what I call hybrids – resources with components of all of these (like LION, which has full text of literature and criticism, reference sources, etc.). Median responses are in the table.

Reports for Different Types of Resources This is a graphical representation of the chart that I just showed you. Note that the data from some respondents did not add up to 100%. Note that the bars for the ALL group and the groups with more respondents are very similar. Reports for A&I and full-text resources account for most of the vendor reports.

Purpose for Reviewing and/or Analyzing Vendor Data 2. We then asked: for what purpose or purposes are you reviewing and/or analyzing vendor-supplied usage data? We offered 4 options for responding: meeting reporting requirements (like ARL and ACRL), making subscription decisions, justifying expenditures, and other.

Other Purposes Collection management Cost/use Cancellation decisions Change to electronic-only Promotion / marketing / training for lower use e-resources Administrative Strategic planning / budget Curiosity 2. Other reasons to work with this data. >>>Look for some quotes

Number of Hours Processing Usage Reports in 2005 3. In 2005, approximately how many hours did you and/or your staff spend processing usage reports from vendors? Processing may include: downloading; reformatting (csv to xls, normalizing data (ISSN, titles), adding/removing information from data fields); manipulating (sorting, merging with other types of data, merging totals by title, …); reviewing and/or analyzing; other. We were asking for an estimate.

Number of Hours Processing Usage Reports in 2005

Percentage of Time 3.a. As part of the same question, I asked for the percentage of time spent on the various stages of working with the use data in 2005: downloading; reformatting (csv to xls, normalizing data (ISSN, titles), adding/removing information from data fields); manipulating (sorting, merging with other types of data, merging totals by title, …); reviewing and/or analyzing; other. The table shows median percentages from respondents. Several respondents stated that this was difficult to estimate; these are estimates. There can be varying levels of personnel working on this, from student workers to library paraprofessionals and librarians.

Percentage of Time Processing Vendor Data Look at the ALL bar: 20/20/20/20. 3.a. Then I asked for an estimate of the percentage of time spent on the various stages of working with the use data: downloading; reformatting (csv to xls, normalizing data (ISSN, titles), adding/removing information from data fields); manipulating (sorting, merging with other types of data, merging totals by title, …); reviewing and/or analyzing (this was especially difficult for some because it may be split among several people who do it at varying levels of intensity); other. Several respondents stated that this was difficult to estimate. The bar chart shows median percentages from respondents. These are estimates; there can be varying levels of personnel working on this, from student workers to library paraprofessionals and librarians.

Other Work with Vendor Data Interacting with the vendor about problems Making data available on staff intranet Using data in reports 3. Other: if they answered “yes,” what “other” work did they do? 18 responded “Yes” to doing other work with vendor data, and I analyzed the written answers. By far the most frequent type of “other” work with vendor data was interacting with the vendor about problems: how to get the data (especially with new e-resource titles); changes in username/password; changes in the URL and format used to get to the data (hunting around the page for the admin area, logging in to the e-resource site with the admin username/password, or logging in to a special admin URL); data not being emailed. This past year, many vendors were converting to Release 2 of COUNTER, so formats changed for some and not others. Other work included making data available online (intranet) for internal use and using the use data in reports. Among the other reasons (with single answers): input to an ERM.

Combine Data from Different Sources to Look at Use Combine vendor stats (36) Combine / compare with other use data gathered electronically (SFX, web logs, consortia reports) (17) Cost (12) Fund code/subject (5) Other (12) 4. Did you combine the data from different sources in order to look at usage of e-journals across vendors? If yes, please describe what you do. We were looking for other sources of data – fund codes, cost, etc. 53 responded “yes” to this question. My interpretation of the written answers, self-coded: combine vendor stats – 36, the most common answer (database vs. journal; we should have anticipated this answer and asked about totals or detail by title); combine/compare with other use data – 17; cost – 12; fund code / subject – 5 (not easy to do at the individual journal level); other – 12. Identifying the “other” data: combine data by title from aggregator and publisher sources. Add more…
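A hedged pandas sketch of combining usage data from different sources by ISSN, along the lines respondents described (vendor statistics, link-resolver click-throughs, cost, and fund code). The tables, column names, and figures are invented; real vendor and SFX exports would need reshaping before a merge like this is possible.

```python
# Minimal sketch of combining usage data from different sources by ISSN.
# All values and column names below are invented for illustration.
import pandas as pd

vendor = pd.DataFrame({
    "issn": ["1234-5678", "2345-6789"],
    "ft_downloads": [620, 14],
})
link_resolver = pd.DataFrame({      # e.g. SFX click-through counts
    "issn": ["1234-5678", "2345-6789"],
    "clickthroughs": [310, 9],
})
acquisitions = pd.DataFrame({       # cost and fund code from acquisitions records
    "issn": ["1234-5678", "2345-6789"],
    "cost": [950.00, 1200.00],
    "fund_code": ["CHEM", "HIST"],
})

combined = (
    vendor
    .merge(link_resolver, on="issn", how="outer")
    .merge(acquisitions, on="issn", how="left")
)
combined["cost_per_download"] = combined["cost"] / combined["ft_downloads"]
print(combined)
```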

Biggest Challenges Lack of consistency / standards (61) Takes too much time (27) COUNTER standards help but… (14) 5. What are the biggest challenges that you face in making effective use of vendor usage statistics? 61 stated that inconsistency / lack of standards is still a problem. 27 stated that dealing with the statistics took too much time: “I’ve heard of software that ‘grabs’ stats for a library. This has to be the future since otherwise, this business is way too staff intensive.” 14 mentioned that the COUNTER standards have been helpful, yet most of them still indicated lack of consistency/standards as the biggest challenge: definitions (especially trying to merge COUNTER and non-COUNTER data); inconsistencies in so-called COUNTER-compliant reports (ISSN, title (all caps, leading THE, other info in the title field), non-journals listed in JR-1, listing of non-subscribed titles in the report, …). “Before discussing challenges, I would have to say that COUNTER has helped a lot. The biggest challenge is apples vs. oranges and being sure that we correctly understand whatever denomination the vendor is using. … The next biggest problem is the sheer range of practices – push vs. pull, different denominations, changing passwords, revised data after an error has been detected, people who leave the company without passing along the responsibility to send you data, etc….”
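A small Python sketch of the kind of normalization those inconsistencies force on libraries before titles and ISSNs from different reports can be matched (ISSNs with or without hyphens, all-caps titles, a leading “The”). The rules shown are illustrative only, not a complete matching routine.

```python
# Sketch of the normalization needed before matching titles and ISSNs across
# so-called COUNTER-compliant reports (all-caps titles, leading "The",
# ISSNs with or without hyphens). The rules here are illustrative only.
import re

def normalize_issn(raw: str) -> str:
    """Strip punctuation and reinsert the hyphen: '00280836' -> '0028-0836'."""
    digits = re.sub(r"[^0-9Xx]", "", raw).upper()
    return f"{digits[:4]}-{digits[4:]}" if len(digits) == 8 else digits

def normalize_title(raw: str) -> str:
    """Lower-case, collapse whitespace, and drop a leading article."""
    title = re.sub(r"\s+", " ", raw).strip().lower()
    return re.sub(r"^(the|a|an)\s+", "", title)

# These pairs should match after normalization.
assert normalize_issn("0028-0836") == normalize_issn("00280836")
assert normalize_title("THE JOURNAL OF EXAMPLE STUDIES") == \
       normalize_title("Journal of Example  Studies")
```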

Most Useful Statistic(s) Number of full-text downloads (67) Number of searches (41) Number of sessions (27) COUNTER statistics (26) Number of turnaways (17) Other (17) The next-to-last question, before asking about FTEs for demographic purposes, was: 6. What usage statistic or report from vendors do you find most useful and why? Answers were categorized as: number of full-text downloads; number of searches for databases; number of sessions / logins for databases (mention the Tim Bucknell, UNC-Greensboro, talk at CC last November); number of turnaways where access is limited; mention of COUNTER; other. “…our most important ejournal usage data is also generated internally through our link-resolver / knowledgebase.”

General Conclusions COUNTER helps, but does not go far enough to ensure consistency. ISSN and Title formats Lack of information about source for journals in aggregator products Inclusion of non-journals in JR-1 report “Not everyone does COUNTER the same way.”

General Conclusions COUNTER (continued) Zero-use titles not included Inclusion of the full title list, not just subscribed titles Inclusion of data about trials COUNTER does not take into account US reporting of data for the July-June fiscal year (ARL/ACRL)
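A brief sketch of what that July-June mismatch means in practice: monthly counts from two calendar-year COUNTER reports have to be rolled into a single fiscal-year total by hand. The monthly figures below are invented for illustration.

```python
# Rolling monthly COUNTER counts into a July-June fiscal-year total (the
# ARL/ACRL reporting period), since COUNTER reports cover calendar years.
# The monthly figures are invented.
monthly_downloads = {
    # "YYYY-MM": full-text downloads, spanning two calendar-year reports
    "2005-07": 410, "2005-08": 300, "2005-09": 520, "2005-10": 610,
    "2005-11": 590, "2005-12": 280, "2006-01": 640, "2006-02": 600,
    "2006-03": 700, "2006-04": 650, "2006-05": 480, "2006-06": 350,
}

fiscal_total = sum(
    count for month, count in monthly_downloads.items()
    if "2005-07" <= month <= "2006-06"   # lexicographic compare works for YYYY-MM
)
print(f"FY2005/06 full-text downloads: {fiscal_total}")
```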

General Conclusions Lack of resources (time, staff, technical expertise, etc.) to spend on processing and analyzing vendor use data. We plan to write an article with more detail on this work. “Absence of a standardized access and retrieval interface is requiring time expenditure that could be better used in analysis. We are eagerly awaiting approval and implementation of the SUSHI standard.”

Suggestions for Further Research How are use data utilized in decision-making? What data are we not getting from the vendor, or how can the vendor enhance the use data to be more useful? More data from the vendor to increase the granularity: year of publication ranges; subject headings; a subscribed/non-subscribed indicator in “big deals”; cost (list price).

Suggestions for Further Research Cost per use (T. Koppel talk) How can one utilize use data for e-books and reference works (new COUNTER standard)? How can a library use SUSHI (XML) if it does not have an ERMS to deal with usage statistics? Cost per use (go to the T. Koppel talk – when / title?).
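On the SUSHI question, a minimal sketch of pulling per-title totals out of a COUNTER-style XML report with the Python standard library, for a library without an ERMS. The XML fragment is a simplified, namespace-free stand-in of my own invention; a real SUSHI response is a SOAP envelope with namespaced COUNTER elements and would need namespace-aware parsing.

```python
# Sketch of extracting per-title totals from a COUNTER-style XML usage report
# without an ERMS. The fragment below is a simplified, invented stand-in,
# not the actual SUSHI/COUNTER schema.
import xml.etree.ElementTree as ET

sample_report = """
<Report>
  <ReportItems>
    <ItemName>Journal of Example Studies</ItemName>
    <ItemPerformance>
      <Instance><MetricType>ft_total</MetricType><Count>620</Count></Instance>
    </ItemPerformance>
  </ReportItems>
  <ReportItems>
    <ItemName>Annals of Hypothetical Research</ItemName>
    <ItemPerformance>
      <Instance><MetricType>ft_total</MetricType><Count>14</Count></Instance>
    </ItemPerformance>
  </ReportItems>
</Report>
"""

root = ET.fromstring(sample_report)
for item in root.findall("ReportItems"):
    title = item.findtext("ItemName")
    count = item.findtext("ItemPerformance/Instance/Count")
    print(f"{title}: {count} full-text requests")
```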

QUESTIONS?