Measuring the Impact of Networked Electronic Services: Developing Assessment Infrastructure for Libraries, State, and Other Types of Consortia


Measuring the Impact of Networked Electronic Services: Developing Assessment Infrastructure for Libraries, State, and Other Types of Consortia

Presented by:
Terry Plum, Simmons GSLIS
Brinley Franklin, University of Connecticut
Martha Kyrillidou, ARL
Gary Roebuck, ARL
Raynna Bowlby, ARL Consultant & Simmons GSLIS
MaShana Davis, ARL
Kristina Justh, ARL

Library Assessment Conference 2008
University of Washington, Seattle, WA

ARL New Measures Toolkit: StatsQUAL®

• LibQUAL+®: a rigorously tested Web-based survey that libraries use to solicit, track, understand, and act upon users' opinions of service quality.
• DigiQUAL®: an online survey for users of digital libraries that measures the reliability and trustworthiness of Web sites. DigiQUAL® is an adaptation of LibQUAL+® to the digital environment.
• MINES for Libraries®: Measuring the Impact of Networked Electronic Resources (MINES) is an online transaction-based survey that collects data on the purpose of use of electronic resources and the demographics of users.
• ARL Statistics™: a series of annual publications that describe the collections, expenditures, staffing, and service activities of Association of Research Libraries (ARL) member libraries.
• ClimateQUAL™: Organizational Climate and Diversity Assessment, an online survey that measures staff perceptions of (a) the library's commitment to the principles of diversity, (b) organizational policies and procedures, and (c) staff attitudes.

What is MINES?
• Action research
• Historically rooted in indirect cost studies
• A set of recommendations for research design
• A set of recommendations for web survey presentation
• A set of recommendations for information architecture in libraries
• A plan for continual assessment of networked electronic resources
• An opportunity to benchmark across libraries

What is MINES? (cont'd)
• Measuring the Impact of Networked Electronic Services (MINES)
• MINES is a research methodology that measures the usage of a library's or consortium's networked electronic resources by specific categories of the patron population.
• MINES is a Web-based survey form, consisting of 5 questions, that is administered at the time of transaction.
• MINES measures:
  – User status and discipline/affiliation (who)
  – Physical location (where)
  – Primary purpose and reason of use (why)
• MINES has been part of the Association of Research Libraries' New Measures & Assessment Initiatives since 2003.
• MINES differs from other electronic resource usage measures that quantify total usage (e.g., Project COUNTER, E-Metrics) or that measure how well a library makes electronic resources accessible (LibQUAL+®).

Early Data Collection Activities

Academic Medical Libraries:
• U of Connecticut Health Center
• U of North Carolina
• U of Texas Medical Branch
• U of Texas Southwestern
• U of Utah
• U of Virginia
• Washington U

Main Universities:
• U of Colorado
• U of Connecticut
• U of North Carolina
• Oregon State U
• U of Utah
• U of Virginia
• Washington U

Data were collected at seven main campus libraries and seven academic health sciences libraries in the U.S. between 2003 and 2005.

Early Data Collection Activities
• More than 45,000 networked electronic services uses were surveyed.
• At each library, the MINES survey was one component of a comprehensive cost analysis study that assigned all library costs to sponsored research; instruction/education/non-sponsored research; patient care; other sponsored activities; and other activities.

Recent Data Collection Activities via ARL
• Ontario Council of University Libraries (OCUL)
• University of Iowa Libraries
• University of Macedonia

Questions Addressed
• How extensively do sponsored researchers use the new digital information environment?
• Are researchers more likely to use networked electronic resources from inside or outside the library?
• Are there differences in usage of electronic information based on the user's location (e.g., in the library; on campus, but not in the library; or off campus)?
• What is a statistically valid methodology for capturing electronic services usage, both in the library and remotely, through web surveys?
• Are particular network configurations more conducive to studies of digital library patron use?

Questions Addressed by the OCUL Implementation
• How extensively do sponsored researchers use OCUL's Scholars Portal? How much usage is for non-funded research, instruction/education, student research papers, and course work?
• Are researchers more likely to use the Scholars Portal from inside or outside the library? What about other classifications of users?
• Are there differences in Scholars Portal usage based on the user's location (e.g., in the library; on campus, but not in the library; or off campus)?
• Could MINES, combined with usage counts, provide an infrastructure to make Scholars Portal usage studies routine, robust, and easily integrated into OCUL's administrative decision-making process for assessing networked electronic resources?

MINES for Libraries ® Survey Form Five Questions and a Comments Box

Methodological Considerations: Experience with the MINES Survey Terry Plum Assistant Dean Simmons GSLIS Library Assessment Conference 2008 University of Washington Seattle, WA

Issues with Web Surveys
• Research design
• Coverage error
  – Unequal access to the Internet
  – Internet users are different from non-users
• Response rate
  – Response representativeness
  – Random sampling and inference
  – Non-respondents
• Data security

MINES Strategy
• A representative sampling plan, including sample size, is determined at the outset. Typically there are 48 hours of surveying over 12 months at a medical library and 24 hours a year at a main library.
• Random-moment, web-based surveys are employed at each site.
• Participation is usually mandatory, negating non-respondent bias, and is based on actual use in real time.
• IRB waiver or approval is obtained.
• Libraries with database-to-web gateways or proxy rewriters offer a comprehensive networking solution for surveying all networked services users during survey periods.
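A random-moment sampling plan like the one described above (e.g., 24 survey hours spread over a year at a main library) can be sketched in a few lines. This is an illustrative sketch only; the function and parameter names are not part of the MINES protocol, and a production plan would typically also stratify by month and time of day.

```python
import random
from datetime import datetime, timedelta

def random_moment_plan(year_start, total_days=364, survey_hours=24, seed=2008):
    """Draw survey hours uniformly at random across the year.

    Returns sorted datetimes marking the start of each one-hour
    survey window during which all networked-services users are
    routed through the survey.
    """
    rng = random.Random(seed)  # fixed seed so the plan is reproducible
    hours_in_period = total_days * 24
    chosen = rng.sample(range(hours_in_period), survey_hours)  # no repeats
    return sorted(year_start + timedelta(hours=h) for h in chosen)

plan = random_moment_plan(datetime(2008, 1, 1))
```

Drawing the windows without replacement and publishing only the seed (not the schedule) keeps the sample representative while preventing staff from anticipating survey periods.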

MINES Strategy (cont'd)
• Placement
  – Point of use: not remembered, predicted, or critical-incident reporting
• Usage rather than user
  – What about multiple usages? Time-out?
  – A cookie or other mechanism with auto-population, or, more recently, counting invisibly with a time-out.
• Distinguish patron association with libraries
  – For example, medical library v. main library.
  – But what if the resources are purchased across campus for all? Then how to get patron affiliation?
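The cookie-with-time-out idea above reduces to a small predicate: survey a transaction only if the patron has not completed the survey within the time-out window, so multiple clicks in one session count as a single usage. The 30-minute window and field handling below are assumptions for illustration, not values prescribed by MINES.

```python
import time

SURVEY_TIMEOUT = 30 * 60  # seconds; treat activity within 30 minutes as one usage

def should_survey(last_surveyed, now=None, timeout=SURVEY_TIMEOUT):
    """Decide whether this transaction should trigger the survey.

    last_surveyed: epoch seconds when the patron last completed the
    survey (e.g., read back from a cookie), or None if never surveyed.
    Counting usages rather than users means the same patron is
    surveyed again once the time-out has elapsed.
    """
    now = time.time() if now is None else now
    return last_surveyed is None or (now - last_surveyed) >= timeout
```

On a successful submission the gateway would write the current time back to the cookie, so rapid follow-up requests pass through unsurveyed.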

Web Survey Design Guidelines

Web survey design guidelines that MINES followed:
• Presentation
  – Simple text for different browsers; no graphics (different browsers render web pages differently)
  – Few questions per screen, or simply few questions
  – Easy to navigate
  – Short and plain
  – No scrolling
  – Clear and encouraging error or warning messages
  – Every question answered in a similar, consistent way (radio buttons, drop-downs)
  – ADA compliant
• Introduction page or paragraph
  – Easy to read
  – Respondents must see definitions of sponsored research
• Questions can be presented in response to answers; for example, if sponsored research was chosen, a follow-up survey could be presented.

Quality Checks
• Target population is the population frame: the patrons who were supposed to be surveyed were surveyed, except in libraries with outstanding open digital collections.
• Check usage against IP. Here, big numbers may not be good; the same users may be seeing the survey too often.
• Alter the order of questions and answers, particularly sponsored research and instruction.
• Spot-check IP against self-identified location.
• Spot-check undergraduates choosing sponsored research (measurement error).
• Check self-identified grant information against actual grants.
• Content validity: discussed with librarians and pre-tested.
• Turn-aways: the number who elected not to fill out the survey.
• Library information architecture: gateway v. HTML pages; there is a substantial difference in results.
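Two of the spot checks above (undergraduates reporting sponsored research, and a campus IP address paired with an off-campus self-report) lend themselves to an automated first pass that queues records for manual review. The record fields and the campus IP prefix below are hypothetical; 192.0.2.0/24 is a documentation-only address range standing in for a real campus block.

```python
def flag_suspect_records(records, campus_prefixes=("192.0.2.",)):
    """Return (record, reason) pairs that merit a manual spot check.

    campus_prefixes: IP prefixes assumed to belong to on-campus
    address space (placeholder value; substitute the real ranges).
    """
    suspects = []
    for r in records:
        # Undergraduates rarely hold sponsored-research funding,
        # so this combination is a likely measurement error.
        if r["status"] == "Undergraduate" and r["purpose"] == "Sponsored Research":
            suspects.append((r, "undergraduate reported sponsored research"))
        # Self-identified location should broadly agree with the IP.
        if r["location"] == "Off Campus" and r["ip"].startswith(campus_prefixes):
            suspects.append((r, "campus IP but off-campus self-report"))
    return suspects
```

Flagged records are candidates for review rather than automatic exclusion, since VPNs and proxies can legitimately blur the IP-to-location mapping.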

Documenting the Purpose and Use of Electronic Resources: Experience with the MINES Survey Brinley Franklin Vice Provost for University Libraries University of Connecticut Library Assessment Conference 2008 University of Washington Seattle, WA

“It is useless to tell the acquisitions librarian that half the monographs ordered will never be used, unless we can specify which 50% to avoid buying.” (Galvin and Kent, 1977)

Reliance on Vendor Statistics

Vendor statistics, while more reliable than in the past, are still maturing.

Measuring Digital Content Use
• The most popular current method of measuring usage of electronic resources by libraries is not web-based usage surveys, but vendor-supplied data on library patron usage or transaction usage.
• Web-based usage surveys are increasingly relevant in the collection of usage data to make collection development and service decisions, to document evidence of usage by certain patron populations, and to collect and analyze performance outputs.

Brinley Franklin and Terry Plum, "Successful Web Survey Methodologies for Measuring the Impact of Networked Electronic Services (MINES for Libraries®)," IFLA Journal 32 (1), March 2006.

Measuring Digital Content Use
• A web-based transactional survey that collects data on users' demographics and their purpose of use. It is administered in real time, over the course of at least a year, using a random-moments sampling plan.
• MINES for Libraries® has been administered at 40 North American universities in the last four years. More than 100,000 North American networked services users have been surveyed using a standard protocol.

Library User Survey

Library User Survey Patron Status

Library User Survey Affiliation

Library User Survey Location

Library User Survey Purpose

Sample Survey Data File Generated

Each logged use records: affiliation; purpose of use; URL accessed; time; date; location; and user status. Representative rows:

Affiliation | Purpose of Use | URL | Time | Date | Location | User Status
Non-UConn | Other Activities | http://homerweb.lib.uconn.edu/cgi-bin/Pwebrecon.cgi?DB=local&PAGE=First | 13:08:41 | 12/3/2004 | Off Campus | Non-UConn
Non-UConn | Instruction/Education/Departmental (Non-Funded) Research | http://magic.lib.uconn.edu/index_real.html | 13:31:29 | 12/3/2004 | Off Campus | Non-UConn
Non-UConn | Other Activities | http://magic.lib.uconn.edu/index_real.html | 12:11:06 | 12/3/2004 | Off Campus | Non-UConn
Engineering | Instruction/Education/Departmental (Non-Funded) Research | http://homerweb.lib.uconn.edu/cgi-bin/Pwebrecon.cgi?DB=local&PAGE=First | 12:04:31 | 12/3/2004 | On Campus - Storrs | UConn Graduate Student
Business Administration | Instruction/Education/Departmental (Non-Funded) Research | http://proquest.umi.com/pqdweb?RQT=318 | 12:16:33 | 12/3/2004 | On Campus - Storrs | UConn Graduate Student
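Records in such a usage log can be read into structured form with a short script. The pipe delimiter and field order here are assumptions for illustration; the actual export format varies by implementation.

```python
import csv
import io

# Assumed field order for the illustrative pipe-delimited export.
FIELDS = ("affiliation", "purpose", "url", "time", "date", "location", "status")

def parse_usage_log(text):
    """Parse pipe-delimited survey records into dictionaries, one per use."""
    reader = csv.reader(io.StringIO(text), delimiter="|")
    return [dict(zip(FIELDS, (cell.strip() for cell in row))) for row in reader if row]

sample = (
    "Business Administration|Instruction/Education/Departmental (Non-Funded) Research|"
    "http://proquest.umi.com/pqdweb?RQT=318|12:16:33|12/3/2004|On Campus - Storrs|"
    "UConn Graduate Student"
)
records = parse_usage_log(sample)
```

Once parsed, the same dictionaries feed directly into the demographic and purpose-of-use tabulations reported later in the deck.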

MINES for Libraries® Demographics by Location of User: U.S. Main Libraries
Inside the Library: n = 9,172
On Campus, Not in the Library: n = 6,391
Off-Campus: n = 4,953

MINES for Libraries® Demographics by Location of User: Ontario Council of University Libraries
Inside the Library: n = 4,047
On Campus, Not in the Library: n = 7,090
Off-Campus: n = 9,163

Purpose of Use Are users engaged in coursework, funded (or unfunded) research, public service, patient care, or other activities?

MINES for Libraries® Purpose of Use by Location: U.S. Main Campus Libraries
In the Library: n = 9,733
On-Campus, Not in the Library: n = 9,460
Off-Campus: n = 7,790
Overall Use: n = 26,983
*72% of sponsored research usage of electronic resources occurred outside the library; 83% took place on campus.

MINES for Libraries® Purpose of Use: OCUL Scholars Portal Users
In a sample of 20,300 electronic resource uses at OCUL libraries, there were four uses outside the library for each use in the library.

Analysis
• Web deliverables:
  – Cross-tabulations in HTML for all institutional data
  – Interactive crosstabs for all institutional data
• Print deliverables:
  – Summary tables
  – Final report

Web Interface

OCUL Scholars Portal Usage Affiliation

OCUL Scholars Portal Usage User Status

OCUL Scholars Portal Usage Location

OCUL Scholars Portal Usage Purpose of Use

Cross Tabulations Affiliation by Purpose of Use

Cross Tabulations: User Status by Purpose of Use

USER STATUS | Coursework | Other Activities | Other Research | Patient Care | Sponsored Research | Teaching | Total
Faculty | 1.5% | 4.7% | 21.2% | 4.4% | 42.6% | 25.6% | 100.0%
Graduate Professional | 19.5% | 3.9% | 25.5% | 2.5% | 45.4% | 3.2% | 100.0%
Library Staff | 23.5% | 24.1% | 13.1% | 16.5% | 17.7% | 5.2% | 100.0%
Other | 6.0% | 35.2% | 20.8% | 8.7% | 26.8% | 2.5% | 100.0%
Staff | 3.5% | 9.5% | 20.6% | 2.1% | 51.6% | 12.7% | 100.0%
Undergraduate | 75.8% | 7.8% | 7.7% | 0.9% | 5.9% | 1.9% | 100.0%
TOTAL | 42.0% | 7.5% | 16.2% | 2.4% | 26.2% | 5.6% | 100.0%

Cross Tabulations: Location by Purpose of Use

LOCATION | Coursework | Other Activities | Other Research | Patient Care | Sponsored Research | Teaching | Total
Library | 52.8% | 14.9% | 10.8% | 1.2% | 12.3% | 7.9% | 100.0%
Off-campus | 47.2% | 7.0% | 17.3% | 4.1% | 19.9% | 4.6% | 100.0%
On-campus | 29.2% | 4.0% | 17.9% | 0.9% | 42.2% | 5.7% | 100.0%
TOTAL | 42.0% | 7.5% | 16.2% | 2.4% | 26.2% | 5.6% | 100.0%
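Row-percentage cross-tabulations like the ones above can be produced directly from parsed survey records. This pure-Python sketch (field names assumed) mirrors what a statistics package would do: count each (row, column) pair and express it as a share of its row total.

```python
from collections import Counter

def crosstab_row_percent(records, row_key, col_key):
    """Cross-tabulate two fields, expressing each cell as a
    percentage of its row total, rounded to one decimal place."""
    cell = Counter((r[row_key], r[col_key]) for r in records)
    row_total = Counter(r[row_key] for r in records)
    return {
        (row, col): round(100.0 * n / row_total[row], 1)
        for (row, col), n in cell.items()
    }

# Toy records, not MINES data, just to show the shape of the output.
uses = [
    {"location": "Library", "purpose": "Coursework"},
    {"location": "Library", "purpose": "Coursework"},
    {"location": "Library", "purpose": "Sponsored Research"},
    {"location": "Off-campus", "purpose": "Coursework"},
]
table = crosstab_row_percent(uses, "location", "purpose")
```

With the full record set, `crosstab_row_percent(records, "location", "purpose")` yields rows that each sum to roughly 100%, matching the layout of the tables above.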

OCUL Scholars Portal Usage: Reason for Use (n = 20,293)

Reason for Use | Frequency | Percent
Other | 4,388 | 21.6%
Course Reading | 925 | 4.6%
Recommended Librarian | 620 | 3.1%
Reference/Citation | 6,090 | 30.0%
Recommended Colleague | 2,436 | 12.0%
Important Journal | 10,219 | 50.4%

Resources
• Web interface
• ARL New Measures and Assessment Initiatives: MINES for Libraries®
• Articles & Presentations

Contact Us

ARL Staff:
• MaShana Davis, Technical Communications Liaison
• Martha Kyrillidou, Director, ARL Statistics and Service Quality Programs
• Gary Roebuck, Technical Operations Manager

Project Management:
• Brinley Franklin, Vice Provost for University Libraries, University of Connecticut Libraries
• Toni Olshen, Business Librarian, Peter F. Bronfman Business Library, York University
• Terry Plum, Assistant Dean for Technology and Director, Simmons Graduate School of Library and Information Science

Questions