Vendor Usage Data for Electronic Resources: A Survey of Libraries


1 Vendor Usage Data for Electronic Resources: A Survey of Libraries
ER&L 2007 Conference February 23, 2007 Gayle Baker & Eleanor Read

2 MaxData “Maximizing Library Investments in Digital Collections Through Better Data Gathering and Analysis” Funded by Institute of Museum and Library Services (IMLS)

3 MaxData Project: UT Libraries Team
Gayle Baker, UT Libraries Eleanor Read, UT Libraries Maribeth Manoff, UT Libraries Ron Gilmour, UT Libraries Carol Tenopir, Project Director, UT SIS and Center for Information Studies

4 Talk Outline Survey (why, who, when) Responses General Conclusions
Suggested areas for more research

5 Survey: Purpose How much effort is involved in working with vendor-supplied use data? How are the data used? What data are most useful in managing electronic resources? Mention the Yovits-Ernst definition of information as data of value used in decision-making. At the UT Libraries, before the project, we downloaded individual usage reports every 6 months; for ARL, we put together totals of July-June downloads, searches, and sessions into a spreadsheet for the ARL supplemental stats (approx. 120 hours, or 3 weeks, every 6 months). We put the reports up on an intranet by vendor/publisher and told the subject librarians that they were available; there was no detailed analysis by title. In a few instances, mainly “big deals,” we added price data and identified subscribed titles. We looked at the use of subscribed vs. non-subscribed titles and determined that some non-subscribed titles had very high use, while some subscribed titles had little use. We were able to use this information to drop unused subscriptions and add high-use unsubscribed titles as subscriptions. Vendor definition: one that delivers content to the library and with whom the library has a contractual agreement for content delivery.

6 Survey: Subjects Sent to Library Directors at Carnegie I and II research institutions (360+) in April 2006. 99 replies, 92 respondents. A copy of the survey is on the conference web site for this talk. Library directors were asked to forward the survey to the person in their library who worked with vendor-supplied usage statistics. I would like to thank those of you who responded to the survey. We received 99 replies, 92 of which answered the survey. Replies not counted as respondents: some duplicates; indications that the library did not receive vendor-supplied usage data; some non-academic institutions (possibly a government library); and a very thoughtful letter about the difficulty of providing answers to the questions.

7 FTE of Respondents

8 Number of Vendors Providing Usage Reports
1. Please provide the following information about usage statistics for electronic resources that you receive from vendors. Number of vendors that provide reports ____ There may have been some confusion between the number of different vendors and the number of individual e-resources; I was after the number of individual sources/sites of vendor usage data. On the histogram, you can see… We are not looking at the mean to make decisions; the median may be better. Why the median, not the mean: in most cases the responses were not normally distributed, so we use the median. No statistical tests were done; this was descriptive analysis.

9 Number of Vendors Providing Usage Reports
Here’s a graphical representation to show you how the data are distributed, with a few high values.

10 Reports for Different Types of Resources
We asked what percentage of the reports relate to various types of electronic resources: 1.b. full-text journals; abstract & index databases (no full-text journals); electronic books; other electronic resources. I probably should have had a category for what I call hybrids: resources with components of all of these (like LION, which has full text of literature and criticism, reference sources, etc.). Median responses are in the table.

11 Reports for Different Types of Resources
This is a graphical representation of the chart that I just showed you. Note that some respondents’ data did not add up to 100%. Note that the bar for the ALL group and the groups with more respondents are very similar. Reports for A&I and full-text resources account for most of the vendor reports.

12 Purpose for Reviewing and/or Analyzing Vendor Data
2. We then asked: for what purpose or purposes are you reviewing and/or analyzing vendor-supplied usage data? We offered 4 options for responding: meeting reporting requirements (like ARL and ACRL), making subscription decisions, justifying expenditures, and other.

13 Other Purposes Collection management
Cost/use; cancellation decisions; change to electronic-only; promotion / marketing / training for lower-use e-resources; administrative; strategic planning / budget; curiosity. 2. Other reasons to work with these data. >>>Look for some quotes

14 Number of Hours Processing Usage Reports in 2005
3. In 2005, approximately how many hours did you and/or your staff spend processing usage reports from vendors? Processing may include: downloading; reformatting (CSV to XLS, normalizing data (ISSN, titles), adding/removing information from data fields); manipulating (sorting, merging with other types of data, merging totals by title, …); reviewing and/or analyzing; other. We asked for an estimate.
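The “reformatting” step described above (normalizing ISSNs and titles so reports from different vendors can be matched) can be sketched in a few lines of Python. The column names and report layout here are hypothetical, since real vendor reports vary widely:

```python
import csv
import io

def normalize_issn(raw):
    """Strip hyphens/spaces and re-insert the standard hyphen, e.g. '01406736' -> '0140-6736'."""
    digits = raw.replace("-", "").replace(" ", "").upper()
    return f"{digits[:4]}-{digits[4:]}" if len(digits) == 8 else raw.strip()

def normalize_title(raw):
    """Case-fold and drop a leading article so 'THE LANCET' and 'Lancet' match."""
    title = raw.strip().lower()
    return title[4:] if title.startswith("the ") else title

# A toy vendor report; real reports differ in column names, layout, and format.
report_csv = "Title,ISSN,Full-Text Downloads\nTHE LANCET,01406736,120\nNature,0028-0836,300\n"
cleaned = {
    (normalize_title(row["Title"]), normalize_issn(row["ISSN"])): int(row["Full-Text Downloads"])
    for row in csv.DictReader(io.StringIO(report_csv))
}
```

Once every report is keyed on a normalized (title, ISSN) pair, merging totals by title across reports becomes a simple dictionary operation rather than a manual spreadsheet chore.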

15 Number of Hours Processing Usage Reports in 2005

16 Percentage of Time 3.a. As part of the same question, I asked for the percentage of time spent in 2005 on the various stages of working with the use data: downloading; reformatting (CSV to XLS, normalizing data (ISSN, titles), adding/removing information from data fields); manipulating (sorting, merging with other types of data, merging totals by title, …); reviewing and/or analyzing; other. The table shows median percentages from respondents. Several respondents stated that this was difficult to estimate, so these are estimates only. Varying levels of personnel may work on this, from student workers and library paraprofessionals to librarians.

17 Percentage of Time Processing Vendor Data
Look at the ALL bar: 20/20/20/20. 3.a. Then I asked for an estimate of the percentage of time spent on the various stages of working with the use data: downloading; reformatting (CSV to XLS, normalizing data (ISSN, titles), adding/removing information from data fields); manipulating (sorting, merging with other types of data, merging totals by title, …); reviewing and/or analyzing (this was especially difficult for some because the work may be split among several people who do it at varying levels of intensity); other. Several respondents stated that this was difficult to estimate. The bar chart shows median percentages from respondents. These are estimates; varying levels of personnel may work on this, from student workers and library paraprofessionals to librarians.

18 Other Work with Vendor Data
Interacting with the vendor about problems Making data available on staff intranet Using data in reports 3. Other … If they answered yes, what “other” work did they do? 18 responded “Yes” to doing other work with vendor data. I analyzed the written answers. By far the most frequent type of “other” work with vendor data was interacting with the vendor about problems: how to get the data (especially with new e-resource titles); changes in username/password; changes in the URL and the steps to get to the data (hunting around the page for the admin area, logging in to the e-resource site with an admin username/password, or logging in to a special admin URL); data not being provided. This past year, many vendors were converting to Release 2 of COUNTER, so formats changed for some and not others. Also mentioned: making data available online (on an intranet) for internal use, and using the use data in reports. Among the other single answers: input to an ERM.

19 Combine Data from Different Sources to Look at Use
Combine vendor stats (36) Combine / compare with other use data gathered electronically (SFX, web logs, consortia reports) (17) Cost (12) Fund code/subject (5) Other (12) 4. Did you combine the data from different sources in order to look at usage of e-journals across vendors? If yes, please describe what you do. We were looking for other sources of data: fund codes, cost, etc. 53 responded “yes” to this question. My interpretation of the written answers, self-coded: combining vendor stats (36) was by far the most common (database vs. journal; we should have anticipated this answer and asked about totals vs. detail by title); combining/comparing with other use data (17); cost (12); fund code / subject (5; not easy at the individual journal level); other (12). Identify the “other” data: combine data by title from aggregator and publisher sources. Add more…
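The most common practice reported here (combining per-title counts across vendor sources, then attaching cost data to get cost per use) can be sketched as follows. The ISSNs, counts, and price are invented for illustration:

```python
from collections import defaultdict

# Hypothetical per-title download counts from two sources, keyed by ISSN.
publisher_counts = {"0028-0836": 300, "0140-6736": 120}
aggregator_counts = {"0140-6736": 45, "1095-9203": 80}

# Sum downloads for each title across all sources.
combined = defaultdict(int)
for source in (publisher_counts, aggregator_counts):
    for issn, downloads in source.items():
        combined[issn] += downloads

# Attach an (invented) subscription cost to compute cost per use for one title.
annual_cost = {"0140-6736": 4950.00}
cost_per_use = {issn: annual_cost[issn] / combined[issn] for issn in annual_cost}
```

Keying on ISSN sidesteps the title-formatting inconsistencies between vendors; fund-code or subject rollups would require an extra mapping table, which is why respondents found those harder at the individual journal level.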

20 Biggest Challenges Lack of consistency / standards (61)
Takes too much time (27) COUNTER standards help but… (14) 5. What are the biggest challenges that you face in making effective use of vendor usage statistics? 61 stated that inconsistency / lack of standards is still a problem. 27 stated that dealing with the statistics takes too much time: “I’ve heard of software that ‘grabs’ stats for a library. This has to be the future since otherwise, this business is way too staff intensive.” 14 mentioned that the COUNTER standards have been helpful, and most of them still indicated lack of consistency/standards as the biggest challenge: definitions (especially when trying to merge COUNTER and non-COUNTER data); inconsistencies in so-called COUNTER-compliant reports (ISSN; title formatting (all caps, a leading THE, other info in the title field); non-journals listed in JR-1; listing of non-subscribed titles in the report; …). “Before discussing challenges, I would have to say that COUNTER has helped a lot. The biggest challenge is apples vs. oranges and being sure that we correctly understand whatever denomination the vendor is using. … The next biggest problem is the sheer range of practices – push vs. pull, different denominations, changing passwords, revised data after an error has been detected, people who leave the company without passing along the responsibility to send you data, etc.…”

21 Most Useful Statistic(s)
Number of full-text downloads (67) Number of searches (41) Number of sessions (27) COUNTER statistics (26) Number of turnaways (17) Other (17) The next-to-last question, before asking about FTEs for demographic purposes, was: 6. What usage statistic or report from vendors do you find most useful and why? Answers categorized: # full-text downloads; # searches for databases; # sessions / logins for databases (mention Tim Bucknell’s (UNC-Greensboro) talk at CC last November); # turnaways where access is limited; mention COUNTER; other. “…our most important ejournal usage data is also generated internally through our link-resolver / knowledgebase.”

22 General Conclusions COUNTER helps, but does not go far enough to ensure consistency. ISSN and Title formats Lack of information about source for journals in aggregator products Inclusion of non-journals in JR-1 report “Not everyone does COUNTER the same way.”

23 General Conclusions COUNTER - continued Zero-use titles not included
Inclusion of full title list, not just subscribed titles Inclusion of data about trials COUNTER does not take into account US reporting of data for July-June fiscal years (ARL/ACRL)

24 General Conclusions Lack of resources (time, staff, technical expertise, etc.) to spend on processing and analyzing vendor use data. We plan to write an article with more detail on this work. “Absence of a standardized access and retrieval interface is requiring time expenditure that could be better used in analysis. We are eagerly awaiting approval and implementation of the SUSHI standard.”
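The SUSHI standard awaited in the quote (NISO Z39.93) defines an XML request/response protocol for harvesting COUNTER reports automatically instead of downloading them by hand. As a rough sketch of the kind of request payload involved: the element names below follow the general shape of the SUSHI schema, but the namespace URI, IDs, and report parameters are illustrative placeholders, not a working client.

```python
import xml.etree.ElementTree as ET

# Sketch of a SUSHI ReportRequest body. A real client would wrap this in a
# SOAP envelope and POST it to the vendor's SUSHI service endpoint.
NS = "http://www.niso.org/schemas/sushi"

req = ET.Element(f"{{{NS}}}ReportRequest")
requestor = ET.SubElement(req, f"{{{NS}}}Requestor")
ET.SubElement(requestor, f"{{{NS}}}ID").text = "my-library-id"   # hypothetical ID
customer = ET.SubElement(req, f"{{{NS}}}CustomerReference")
ET.SubElement(customer, f"{{{NS}}}ID").text = "customer-42"      # hypothetical ID
report = ET.SubElement(req, f"{{{NS}}}ReportDefinition",
                       {"Name": "JR1", "Release": "2"})
filters = ET.SubElement(report, f"{{{NS}}}Filters")
date_range = ET.SubElement(filters, f"{{{NS}}}UsageDateRange")
ET.SubElement(date_range, f"{{{NS}}}Begin").text = "2005-01-01"
ET.SubElement(date_range, f"{{{NS}}}End").text = "2005-12-31"

payload = ET.tostring(req, encoding="unicode")
```

The appeal for libraries is clear from the survey responses: a scheduled request like this replaces the per-vendor hunt for admin pages, passwords, and download links that respondents described as the most time-consuming part of the work.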

25 Suggestions for Further Research
How are use data utilized in decision making? What data are we not getting from the vendor, or how can the vendor enhance the use data to be more useful? More data from the vendor to increase the granularity: year-of-publication ranges; subject headings; a subscribed/non-subscribed indicator in “big deals”; cost (list price).

26 Suggestions for Further Research
Cost per use (T. Koppel talk) How can one utilize use data for e-books and reference works (new COUNTER standard)? How can a library use SUSHI (XML) if it does not have an ERMS to deal with usage statistics?

27 QUESTIONS?

