MINES for Libraries™ Presented by Martha Kyrillidou, Director of the ARL Statistics and Service Quality Programs.

Similar presentations
Usage statistics in context - panel discussion on understanding usage, measuring success Peter Shepherd Project Director COUNTER AAP/PSP 9 February 2005.

Under New Management : Developing a Library Assessment Program at a Small Public University Library Assessment Conference: Building Effective, Sustainable,
Marketing Library Poster Services via a USB Flash Drive Catherine Soehner, Associate Dean for Research and Learning Services, J. Willard Marriott Library,
Measuring the Impact of Networked Electronic Resources: Developing an Assessment Infrastructure for Libraries, State, and Other Types of Consortia Presented.
CMU has been said to be the “most wired campus in the US” for two years in a row. What kind of impact does such an infrastructure have on the daily academic.
The COUNTER Code of Practice for Books and Reference Works Peter Shepherd Project Director COUNTER UKSG E-Books Seminar, 9 November 2005.
Brinley Franklin Vice Provost, University of Connecticut Libraries MINES for Libraries Measuring the Impact of Networked Electronic Services (MINES): The.
Learning Community II Survey Spring 2007 Analysis by Intisar Hibschweiler (Core Director) and Mimi Steadman (Director of Institutional Assessment)
Ontario University Library Consortia Activity Ontario University Library Consortia Activity Gwendolyn Ebbett Dean of the Library University of Windsor.
DigiQUAL™ Regrounding LibQUAL+® for the Digital Library Environment: An Analysis of the DigiQUAL® Data Presented at 9 th Northumbria by.
How Assessment Will Inform Our Future 1. Administration of on-going user surveys and focus groups to enhance reference services 2. Analysis of LibStats.
What do they think of us? Rutgers University Libraries Brand Assessment Study VALE Assessment Fair 2014 Rose Barbalace Christine.
Copyright Shanna Smith & Tom Bohman (2003). This work is the intellectual property of the authors. Permission is granted for this material to be shared.
Presented by Beverly Choltco-Devlin Reference and Electronic Resources Consultant Mid-York Library System September 25, 2009 REVVED UP FOR REFERENCE CONFERENCE.
LibQUAL+ Process Overview Introduction to LibQUAL+ Workshop University of Westminster, London 21st January 2008 Selena Killick Association of Research.
Health Center Library Overview Evelyn Morgen, Director March 2007.
Academic Research Library Support of Sponsored Research in the United States Brinley Franklin Vice Provost University of Connecticut Libraries Qualitative.
Measuring the Impact of Networked Electronic Services: Developing Assessment Infrastructure for Libraries, State, and Other Types of Consortia Presented.
Reaffirmation of Accreditation: Institutional Effectiveness Southern Association of Colleges and Schools February 2008 Stephen F. Austin State University.
E-book Resources for HINARI Users (Advanced Course Module 7)
Business and Management Research
Evaluating and Purchasing Electronic Resources- The University of Pittsburgh Experience Sarah Aerni Special Projects Librarian University of Pittsburgh.
Technical Implementation of the MINES Survey Methodology ACRL 2005 Minneapolis, MN April 7, 2005 Terry Plum Assistant Dean, Simmons GSLIS Association of.
Mary Beth Schell Adam Dodd NC AHEC Digital Library National AHEC: Wednesday June 23, 2010 Using Social Networking Tools to Support Graduate Medical Education:
Alice’s Adventures in LibQual-Land Kitty Tynan Assistant Director for Public Services CUA Libraries All illustrations from The Victorian Web: A Tenniel.
of Research Libraries Assessing Library Performance: New Measures, Methods, and Models 24 th IATUL Conference 2-5 June 2003 Ankara,
University of Kentucky Proxy Service Presentation By Kelly Vickery
New Ways of Listening To Our Users: LibQUAL+ Queen’s.
Getting Staff Involved in Assessment at the University of Connecticut Libraries Brinley Franklin 17 August 2009.
Martha Kyrillidou Brinley Franklin Terry Plum MINES for Libraries TM ASSOCIATION OF RESEARCH LIBRARIES Assessing the Value of Networked Electronic Services:
ACRL: Outcome Assessment Tools for the Library of the Future: MINES at OCUL Association of College and Research Libraries ACRL Conference 2005 Minneapolis.
Demonstrating Value and Creating Value: Evidence-Based Library Management through MINES for Libraries™
HELPING YOUR LIBRARY BE THE BEST PARTNER FOR RESEARCH.
Using LibQUAL+™ Results Observations from ARL Program “Making Library Assessment Work” Steve Hiller University of Washington Libraries ARL Visiting Program.
Web-Based Usage Surveys MINES for Libraries TM Brinley Franklin University of Connecticut Terry Plum Simmons GSLIS
Dana Thomas, Ontario Council of University Libraries Alan Darnell, Ontario Council of University Libraries Understanding Use of Networked Information Content:
LibQUAL+™, Libraries, and Google™ CNI Spring 2005 Task Force Meeting Washington, DC 4/4/2005 Martha Kyrillidou Fred Heath Jonathan D. Sousa old.libqual.org.
Assessing the Scholars Portal Melody Burton, Queen’s University Toni Olshen, York University Ontario Library Association Superconference February 3, 2005.
LibQUAL+ ® Survey Administration American Library Association Midwinter Meeting Denver, CO January 26, 2009 Presented by: MaShana Davis Technical Communications.
LibQUAL+™ Process Management: Using the Web as a Management Tool Amy Hoseth Massachusetts LSTA Orientation Meeting Boston, MA October 21, 2005 old.libqual.org.
Brinley Franklin Vice Provost University of Connecticut Libraries.
Re-Visioning the Future of University Libraries and Archives through LIBQUAL+ Cynthia Akers Associate Professor and Assessment Coordinator ESU Libraries.
Use Measures for Electronic Resources: Theory and Practice A Librarian’s Perspective ALA / ALCTS June 27, 2005 Chicago, Illinois Brinley Franklin Vice.
Project web site: old.libqual.org LibQUAL+™ Process Management: Using the Web as a Management Tool ALA Midwinter Conference San Diego, California January.
® LibQUAL+ ® Implementation Procedures The Third Lodz [Poland] Library Conference Technical University of Lodz June, 2008 Presented by: Bruce Thompson.
LibQUAL+ ® Survey Administration LibQUAL+® Exchange Northumbria Florence, Italy August 17, 2009 Presented by: Martha Kyrillidou Senior Director, Statistics.
LibQUAL 2005 at London South Bank and a Lincolnshire man in Chicago.
LibQual+ Spring 2008 results and recommendations Library Assessment Working Group 11/19/2008 Library Faculty Meeting.
Brinley Franklin Vice Provost University of Connecticut Libraries MINES for Libraries.
Project URL – TM QUANTITATIVE EVIDENCE Auckland, NZ April 5, 2005 Presented by: Colleen Cook Bruce Thompson.
Brinley Franklin Vice Provost University of Connecticut Libraries.
Chapter 29 Conducting Market Research. Objectives  Explain the steps in designing and conducting market research  Compare primary and secondary data.
Our 2005 Survey Results. “….only customers judge quality; all other judgments are essentially irrelevant” Delivering Quality Service : Balancing Customer.
Greater Western Library Alliance 2006 Spring Membership Meeting Arizona State University, Tempe AZ February 27, 2006 Brinley Franklin Vice Provost, University.
Webmetrics Workshop American Library Association Chicago, IL June 24, 2004 Martha Kyrillidou Director, ARL Statistics and Measurement Program Association.
Assessing the Value and Impact of Digital Content Brinley Franklin Vice Provost for University Libraries University of Connecticut March 1, 2007.
Making MINES for Libraries © Work for Your Library Karen R. Harker, Collection Assessment Librarian Priya B. Parwani, Collection Development Graduate Library.
Brinley Franklin and Terry Plum August 18, 2005
ARL New Measures MINES for Libraries Measuring the Impact of Networked Electronic Services Brinley Franklin Vice Provost University of Connecticut Libraries.
E-book Resources for HINARI Users (Advanced Course Module 7)
Evaluating the Portal: The Story Behind the Numbers
Assessing Library Performance:
E-book Resources for HINARI Users (Advanced Course Module 7)
Brinley Franklin Vice Provost University of Connecticut Libraries.
E-book Resources for HINARI Users (Advanced Course Module 7 Part A)
Standards for Use Measures for Electronic Resources
MINES for Libraries Measuring the Impact of Networked Electronic Services (MINES): The North American Experience Brinley Franklin Vice Provost, University.
The Story Behind the Numbers: Measuring the Impact of Networked Electronic Services (MINES) and the Assessment of the Ontario Council of University Libraries’
E-book Resources for HINARI Users (Advanced Course: Module 7 Part A)
Presentation transcript:

MINES for Libraries™ Presented by Martha Kyrillidou, Director of the ARL Statistics and Service Quality Programs, Association of Research Libraries, at Rutgers University Library, June 1, 2007, New Brunswick, NJ

ARL Overall

Libraries Remain a Credible Resource in the 21st Century 98% agree with the statement, “My … library contains information from credible and known sources.” Note. Digital Library Federation and Council on Library and Information Resources. (2002). Dimensions and Use of the Scholarly Information Environment.

Changing Behaviors Only 15.7% agreed with the statement “The Internet has not changed the way I use the library.” Note. Digital Library Federation and Council on Library and Information Resources. (2002). Dimensions and Use of the Scholarly Information Environment.

ARL Toolkit… StatsQUAL+™
– ARL Statistics
– LibQUAL+®
– E-Metrics
– DigiQUAL+™
– MINES for Libraries™

What is MINES? Action research
– Historically rooted in indirect cost studies
– Set of recommendations for research design
– Set of recommendations for web survey presentation
– Set of recommendations for information architecture in libraries
– Plan for continual assessment of networked electronic resources
– An opportunity to benchmark across libraries

MINES for Libraries™
MINES is a transaction-based research methodology consisting of a web-based survey form and a random moments sampling plan.
MINES typically measures who is using electronic resources, where users are located at the time of use, and their purpose of use, in the least obtrusive way.
MINES was adopted by the Association of Research Libraries (ARL) as part of the “New Measures” toolkit in May.
MINES is different from other electronic resource usage measures that quantify total usage (e.g., Project COUNTER, E-Metrics) or measure how well a library makes electronic resources accessible (LibQUAL+™).

Questions Addressed By MINES for Libraries™ for the OCUL Scholars Portal
– How extensively do sponsored researchers use OCUL’s Scholars Portal?
– How much usage is for non-funded research, instruction/education, student research papers, and course work?
– Are researchers more likely to use the Scholars Portal from inside or outside the library? What about other classifications of users?
– Are there differences in Scholars Portal use based on the user’s location (e.g., in the library; on-campus, but not in the library; or off-campus)?
– Could MINES, combined with usage counts, provide an infrastructure to make Scholars Portal usage studies routine, robust, and easily integrated into OCUL’s administrative decision-making process for assessing networked electronic resources?

MINES for Libraries™ Survey Form: Five Questions and a Comment Box

Methodological Considerations: Experience with the MINES Survey
Terry Plum, Assistant Dean, Simmons GSLIS
Rutgers University, June 1, 2007

Issues with web surveys
Research design
– Coverage error
  – Unequal access to the Internet
  – Internet users are different than non-users
– Response rate
Response representativeness
– Random sampling and inference
– Non-respondents
Data security

MINES strategy
– A representative sampling plan, including sample size, is determined at the outset. Typically, there are 48 hours of surveying over 12 months at a medical library and 24 hours a year at a main library.
– Random moment/web-based surveys are employed at each site.
– Participation is usually mandatory, negating non-respondent bias, and is based on actual use in real time.
  – IRB waiver or approval
– Libraries with database-to-web gateways or proxy rewriters offer a comprehensive networking solution for surveying all networked services users during survey periods.
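The random moments sampling plan described above can be scripted. The sketch below is not the MINES instrument itself; it simply shows one way a site might draw 48 survey hours over 12 months, four randomly chosen hours per month. The session length, the 8:00–22:00 service window, and the function names are illustrative assumptions.

```python
# Illustrative sketch only: one way to draw a MINES-style random moments schedule.
# The 4 hours/month (48 hours/year) and the 8:00-22:00 service window are assumptions.
import random
from datetime import datetime, timedelta

def draw_survey_hours(year, hours_per_month=4, seed=None):
    """Return randomly chosen one-hour survey slots for each month of `year`."""
    rng = random.Random(seed)
    schedule = []
    for month in range(1, 13):
        start = datetime(year, month, 1)
        next_month = datetime(year + (month == 12), month % 12 + 1, 1)
        days_in_month = (next_month - start).days
        for _ in range(hours_per_month):
            day = rng.randrange(days_in_month)          # random day of the month
            hour = rng.randrange(8, 22)                 # random hour within service hours
            schedule.append(start + timedelta(days=day, hours=hour))
    return sorted(schedule)

if __name__ == "__main__":
    for slot in draw_survey_hours(2007, seed=1)[:6]:    # preview the first few slots
        print(slot.strftime("%Y-%m-%d %H:00"))
```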

MINES strategy
Placement
– Point of use
– Not remembered, predicted, or critical incident
Usage rather than user
– What about multiple usages?
– Time out?
– Cookie or other mechanism with auto-population, or more recently counting invisibly with a time-out.
Distinguish patron association with libraries.
– For example, medical library vs. main library.
– But what if the resources are purchased across campus for all? Then how to get patron affiliation?
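The point-of-use placement and the cookie/time-out idea above are typically handled at the library's gateway or proxy rewriter. The following is a rough sketch, not the production MINES code: it assumes a Flask front end, a hypothetical /connect route, and a 30-minute time-out, all of which are illustrative choices.

```python
# Rough sketch of a gateway hook that shows the survey at the point of use and then
# "counts invisibly" via a cookie time-out; route names and the 30-minute window are assumptions.
import time
from urllib.parse import quote
from flask import Flask, request, redirect, make_response

app = Flask(__name__)
TIMEOUT_SECONDS = 30 * 60      # assumed time-out between surveys for the same user
SURVEY_ACTIVE = True           # in practice, true only during the scheduled random moments

@app.route("/connect")
def connect():
    target = request.args.get("url", "/")                 # the networked resource requested
    last = request.cookies.get("mines_last_survey")
    recently_surveyed = last and time.time() - float(last) < TIMEOUT_SECONDS
    if SURVEY_ACTIVE and not recently_surveyed:
        # Intercept at the point of use and remember where the user was headed.
        return redirect("/survey?next=" + quote(target, safe=""))
    return redirect(target)                               # otherwise pass straight through

@app.route("/survey/done")
def survey_done():
    # After the form is submitted, stamp the cookie so repeat uses within the
    # time-out are passed through (and can be counted) without re-surveying.
    resp = make_response(redirect(request.args.get("next", "/")))
    resp.set_cookie("mines_last_survey", str(time.time()))
    return resp
```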

Web Survey Design Guidelines
Web survey design guidelines that MINES followed:
Presentation
– Simple text for different browsers – no graphics (different browsers render web pages differently)
– Few questions per screen, or simply few questions
– Easy to navigate
– Short and plain
– No scrolling
– Clear and encouraging error or warning messages
– Every question answered in a similar way – consistent (radio buttons, drop-downs)
– ADA compliant
– Introduction page or paragraph
– Easy to read (must see definitions of sponsored research)
– Can present questions in response to answers – for example, if sponsored research was chosen, could present another survey
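The last point, presenting questions in response to answers, is simple branching logic on the server side. A minimal sketch follows; the option label and follow-up wording are assumptions, not the exact MINES question text.

```python
# Minimal branching sketch: show a follow-up question only for sponsored (funded) research.
# The option string and follow-up wording below are illustrative, not the MINES form text.
def follow_up_questions(answers: dict) -> list:
    follow_ups = []
    if answers.get("purpose") == "Sponsored (funded) research":
        follow_ups.append("Which grant or sponsored project supports this use?")
    return follow_ups

# Example: follow_up_questions({"purpose": "Sponsored (funded) research"})
# -> ["Which grant or sponsored project supports this use?"]
```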

Quality Checks
– Target population is the population frame – surveyed the patrons who were supposed to be surveyed, except in libraries with outstanding open digital collections.
– Check usage against IP. In this case, big numbers may not be good – users may be seeing the survey too often.
– Alter the order of questions and answers, particularly sponsored and instruction.
– Spot check IP against self-identified location.
– Spot check undergraduates choosing sponsored research – measurement error.
– Check self-identified grant information against actual grants.
– Content validity – discussed with librarians and pre-tested.
– Turn-aways – number who elected not to fill out the survey.
– Library information architecture – gateway vs. HTML pages; there is a substantial difference in results.
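Several of these checks lend themselves to a quick script once the survey export is available. Below is a sketch of the IP-versus-self-reported-location spot check; the network ranges and location labels are hypothetical placeholders, not any institution's actual configuration.

```python
# Sketch of one quality check: flag records whose self-reported location disagrees
# with the IP address. The network ranges below are hypothetical placeholders.
import ipaddress

CAMPUS_NETS  = [ipaddress.ip_network("192.0.2.0/24")]    # pretend campus range
LIBRARY_NETS = [ipaddress.ip_network("192.0.2.128/25")]  # pretend library subnet

def location_mismatch(record):
    """Return True if the reported location looks inconsistent with the client IP."""
    ip = ipaddress.ip_address(record["ip"])
    on_campus = any(ip in net for net in CAMPUS_NETS)
    in_library = any(ip in net for net in LIBRARY_NETS)
    reported = record["location"]          # e.g., "In the Library", "On Campus", "Off Campus"
    if reported == "Off Campus":
        return on_campus
    if reported == "In the Library":
        return not in_library
    return not on_campus                   # reported on campus, but IP is outside campus

# Example: location_mismatch({"ip": "192.0.2.10", "location": "Off Campus"}) -> True
```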

Documenting the Purpose and Use of Electronic Resources: Experience with the MINES Survey
Brinley Franklin, Vice Provost for University Libraries, University of Connecticut
Rutgers University, June 1, 2007

“It is useless to tell the acquisitions librarian that half the monographs ordered will never be used, unless we can specify which 50% to avoid buying.” (Galvin and Kent, 1977)

Reliance on Vendor Statistics Vendor statistics, while more reliable than in the past, are still maturing.

Measuring Digital Content Use
The most popular current method of measuring usage of electronic resources by libraries is not through web-based usage surveys, but through vendor-supplied data on library patron usage or transaction usage.
Web-based usage surveys are increasingly relevant in the collection of usage data to make collection development and service decisions, to document evidence of usage by certain patron populations, and to collect and analyze performance outputs.
Brinley Franklin and Terry Plum, “Successful Web Survey Methodologies for Measuring the Impact of Networked Electronic Services (MINES for Libraries™),” IFLA Journal 32(1), March 2006.

A web-based transactional survey that collects data on users’ demographics and their purpose of use. It is administered in real time over the course of at least a year using a random moments sampling plan. MINES for Libraries™ has been administered at 40 North American universities in the last four years. More than 100,000 North American networked services users have been surveyed using a standard protocol.

Library User Survey Patron Status

Library User Survey Affiliation

Library User Survey Location

Library User Survey Purpose

Sample Survey Data File Generated
[Sample records from the University of Connecticut survey data file, December 3, 2004. Each record captures the user’s college or affiliation (e.g., Family Studies, Engineering, Business Administration, Non-UConn), purpose of use (e.g., Instruction/Education/Departmental (Non-Funded) Research, Other Activities), the URL of the networked resource accessed, the time and date of use, the user’s location (In the Library, On Campus – Storrs, Off Campus), and patron status (UConn Faculty, UConn Graduate Student, UConn Undergraduate Student, Non-UConn).]
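Records like the ones summarized above are easy to load for analysis. A minimal sketch follows, assuming a tab-delimited export whose columns follow the order described; the filename and the exact column names are assumptions.

```python
# Minimal loader sketch; the filename, delimiter, and column order are assumptions
# based on the fields shown in the sample data file above.
import csv

FIELDS = ["college", "purpose", "url", "time", "date", "location", "status"]

def load_records(path="mines_export.tsv"):
    """Read the survey export into a list of dictionaries keyed by FIELDS."""
    with open(path, newline="") as f:
        return [dict(zip(FIELDS, row)) for row in csv.reader(f, delimiter="\t")]
```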

Demographics by Location of User – U.S. Main Libraries (MINES for Libraries™)
– Inside the Library: n = 9,172
– On Campus, Not in the Library: n = 6,391
– Off-Campus: n = 4,953

Demographics by Location of User – Ontario Council of University Libraries
– Inside the Library: n = 4,047
– On Campus, Not in the Library: n = 7,090
– Off-Campus: n = 9,163

Purpose of Use Are users engaged in coursework, funded (or unfunded) research, public service, patient care, or other activities?

Purpose of Use by Location – U.S. Main Campus Libraries, 2003–2005
– In the Library: n = 9,733
– On-Campus, Not in the Library: n = 9,460
– Off-Campus: n = 7,790
– Overall Use: n = 26,983
*72% of sponsored research usage of electronic resources occurred outside the library; 83% took place on campus.

OCUL Scholars Portal Users – Purpose of Use
In a sample of 20,300 electronic resource uses at OCUL libraries, there were four uses outside the library for each use in the library.

Questions? Learn more about LibQUAL+™, DigiQUAL™, & MINES for Libraries™ at:

Analysis
Web deliverables:
– Crosstabulations in HTML for all OCUL data
– Interactive crosstabs for all of OCUL and for individual institutions
Print deliverables:
– Summary tables for OCUL
– Summary tables for each institution
– Final report
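The crosstabulations listed above can be reproduced from the raw records in a few lines. The sketch below is illustrative only (not the OCUL deliverable pipeline) and reuses the hypothetical load_records helper from the earlier example.

```python
# Illustrative crosstab sketch: row-percentage table of purpose of use by location,
# written out as an HTML page comparable to the web deliverables described above.
import pandas as pd

def purpose_by_location(records):
    df = pd.DataFrame(records)
    table = pd.crosstab(df["location"], df["purpose"], normalize="index") * 100
    return table.round(1)      # row percentages, comparable to the tables that follow

# Example usage:
# purpose_by_location(load_records("mines_export.tsv")).to_html("purpose_by_location.html")
```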

OCUL Scholars Portal Usage Affiliation

Affiliation by Purpose of Use
Affiliation             Coursework  Other Activities  Other Research  Patient Care  Sponsored  Teaching  Total
Applied Sciences        24.0%       7.6%              17.7%           0.6%          46.3%      3.7%      100.0%
Business                34.8%       7.6%              30.0%           0.9%          10.8%      16.0%     100.0%
Education               40.9%       5.4%              17.1%           0.8%          11.8%      24.0%     100.0%
Environmental Studies   43.5%       2.5%              24.0%           0.3%          23.3%      6.3%      100.0%
Fine Arts               56.3%       6.9%              20.6%           1.3%          5.6%       9.4%      100.0%
Humanities              51.5%       10.8%             21.0%           0.5%          9.5%       6.7%      100.0%
Law                     67.5%       6.8%              12.8%           0.9%          2.6%       9.4%      100.0%
Medical Health          29.7%       5.5%              18.4%           8.6%          32.0%      5.7%      100.0%
Other                   51.9%       22.8%             10.9%           2.1%          7.4%       5.0%      100.0%
Sciences                44.6%       9.7%              11.1%           0.4%          31.8%      2.4%      100.0%
Social Sciences         62.6%       4.5%              14.4%           0.7%          13.6%      4.2%      100.0%
Total                   42.0%       7.5%              16.2%           2.4%          26.2%      5.6%      100.0%

User Status by Purpose of Use
User Status             Coursework  Other Activities  Other Research  Patient Care  Sponsored  Teaching  Total
Faculty                 1.5%        4.7%              21.2%           4.4%          42.6%      25.6%     100.0%
Graduate Professional   19.5%       3.9%              25.5%           2.5%          45.4%      3.2%      100.0%
Library Staff           23.5%       24.1%             13.1%           16.5%         17.7%      5.2%      100.0%
Other                   6.0%        35.2%             20.8%           8.7%          26.8%      2.5%      100.0%
Staff                   3.5%        9.5%              20.6%           2.1%          51.6%      12.7%     100.0%
Undergraduate           75.8%       7.8%              7.7%            0.9%          5.9%       1.9%      100.0%
Total                   42.0%       7.5%              16.2%           2.4%          26.2%      5.6%      100.0%

Location by Purpose of Use
Location      Coursework  Other Activities  Other Research  Patient Care  Sponsored  Teaching  Total
Library       52.8%       14.9%             10.8%           1.2%          12.3%      7.9%      100.0%
Off-campus    47.2%       7.0%              17.3%           4.1%          19.9%      4.6%      100.0%
On-campus     29.2%       4.0%              17.9%           0.9%          42.2%      5.7%      100.0%
Total         42.0%       7.5%              16.2%           2.4%          26.2%      5.6%      100.0%

Reason for Use (n = 20,293)
Reason for Use           Frequency  Percent
Important Journal        10,219     50.4%
Recommended Colleague    2,436      12.0%
Reference/Citation       6,090      30.0%
Recommended Librarian    620        3.1%
Course Reading           925        4.6%
Other                    4,388      21.6%

Library Assessment Conference August 4-6, 2008 Seattle, WA, USA