VIVA Value Metric Project

VIVA Value Metric Project VIVA Collections Forum 2017 Anne Osterman, VIVA Director

Why a Value Metric Project? Libraries continually search for better, more informed ways to make resource decisions, while budgets are tight or declining and the information universe keeps expanding. A task that is already difficult at the institutional level becomes even more complex at the consortial level, where diversity in members' user populations creates diversity in collection priorities. The consortium needed a system that could compare dissimilar formats: what is the value of an ebook collection compared to a streaming media subscription?

Value Metric Task Force Project Origins In the 2014-2016 biennium, VIVA received a 5% cut and used data to inform its cancellation decisions. Looking forward, the Collections Committee wanted standardized criteria to apply to the evaluation of its resources, so it formed the Value Metric Task Force (VMTF) to develop a consortial approach.

Developing the Value Metric System The VMTF Charge: Design and apply a framework for the coherent and holistic evaluation of VIVA products. Determine the highest collection development priorities for the consortium and examine how these can be translated into quantifiable values. The end result will be an assessment framework and value metric system for the evaluation of shared resources that is reflective of VIVA’s overarching values.

Developing the Value Metric System Membership of the task force was representative of the four major institution types within VIVA, and the group examined priorities for the consortium from an “institution type” perspective. A persona/brainstorming exercise surfaced institutional priorities; instead of user types (undergraduate, faculty), it used institution types (community college, doctoral). Over 40% of the brainstormed priorities were priorities for all 4 institution types, 30% were priorities for 3 institution types, 9% for 2 institution types, and only 7 priorities applied to just 1 institution type.

Developing the Value Metric System The persona/brainstorming exercise identified overlapping priorities. These were used in a survey of member institutions that focused on how institutions valued the identified facets for each specific format.

Developing the Value Metric System For all consortial resource format types, the top two concerns were cost savings and alignment with curriculum. The other ranked facets varied widely by format.

Developing the Value Metric System The group needed accessible data that was both measurable and attainable in order to create a tool that was easy to implement and sustain. It was very important to the group NOT to reinvent the wheel or add tasks to already busy staff, and to ensure that the tool/framework could be adapted at the local level.

Developing the Value Metric System The group conducted a data inventory, which included data such as degree and graduate counts, usage and cost data, etc. It then mapped pre-existing data to answerable questions drawn from the survey-identified areas of need. For each product type the group asked: What data do we already collect? Does this data align with the ways libraries measure value for users? Are there other factors we aren’t collecting that could answer this question?

Value Metric: Putting it All Together The group used the results of the survey to weight the included components according to their relative importance to the consortium. Two kinds of grids (current and prospective) were developed for each format type (databases, ebooks, ejournals, and streaming media). Grids were divided into two parts: demonstrated usefulness to the consortium (e.g. cost per use, alignment with degrees awarded) and VIVA “values” (e.g. an emphasis on open initiatives, COUNTER-compliant usage statistics, usage rights, etc.). Each grid has a potential score of 100, allowing for cross-format comparisons.
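
To illustrate how a grid rolls up to a single 100-point total that can be compared across formats, here is a minimal Python sketch. The criterion names and point caps below are illustrative assumptions, not VIVA's actual grid weights.

# Illustrative sketch: per-criterion scores roll up to a single 0-100 grid total,
# so dissimilar formats (a database vs. a streaming media package) can be compared
# on the same scale. Criterion names and point caps are assumptions, not VIVA's grid.
DATABASE_GRID_MAX = {
    "alignment_with_curriculum": 18,   # demonstrated usefulness criteria
    "cost_effectiveness": 20,
    "interoperability": 14,
    "content_delivery": 20,
    "multidisciplinarity": 8,          # consortial values criteria
    "usage_statistics": 7,
    "technical_issues": 7,
    "open_initiatives": 6,
}
assert sum(DATABASE_GRID_MAX.values()) == 100  # each grid tops out at 100 points
def grid_total(scores: dict) -> int:
    """Clamp each criterion to its cap and sum to a 0-100 grid score."""
    return sum(min(scores.get(name, 0), cap) for name, cap in DATABASE_GRID_MAX.items())
example_product = {
    "alignment_with_curriculum": 15, "cost_effectiveness": 12, "interoperability": 14,
    "content_delivery": 13, "multidisciplinarity": 8, "usage_statistics": 7,
    "technical_issues": 4, "open_initiatives": 3,
}
print(grid_total(example_product))  # 76 out of a possible 100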

Value Metric: Database Example (Member Criteria, common across all formats)
1. Alignment with Curriculum and/or Accreditation Requirements
   a. Resource constitutes a high percentage of VIVA content within the subject area by format (score: 6)
   b. Resource belongs to subject area with high number of degrees awarded
   c. Percentage of total use coming from single public highest-use institution (score: 3)
   d. Percentage of total use coming from public highest-use institution type
   Total category score: 18
2. Cost Effectiveness
   a. Cost Avoidance (score: 5)
   b. Cost-per-Use
   c. Annual Increase (score: 4)
   d. Private Pooled Funds
3. Interoperability with Discovery Systems
   a. Discovery Tools in which Product is Indexed (score: 14)
4. Easy, One-Stop Content Delivery
   a. Platform (score: 7)
   b. Full-Text Availability (score: 13)

Value Metric: Database Example (Consortial Values)
6. Multidisciplinarity
   a. Subject by Call Number (score: 8)
7. Usage Statistics
   a. COUNTER compliant (score: 4)
   b. Institution-Level Statistics (score: 3)
   Total category score: 7
8. Technical Issues
   a. Frequency and Nature of Technical Issues
   b. Vendor Responsiveness (score: 6)
9. Supports Open Initiatives
   a. Demonstrable commitment to open initiatives / exploring alternate open access publishing models

Value Metric: Rubric & Instructions Example

Criterion b. Resource belongs to subject area with high number of degrees awarded
Rubric: < 10% = 0; 10-19% = 1; 20-29% = 2; 30-39% = 3; 40-49% = 4; 50-59% = 5; > 60% = 6
Instructions: To score, please refer to the "Format Breakdown" sheet's mapped subject areas. Filter the "Degree-Type-LC-Mapping" spreadsheet's subjects by those three subjects. This spreadsheet maps (at only the highest level) LC subjects to degree types awarded in Virginia. Add the percentages of relevant degrees and assign the number according to the rubric.

Criterion c. Percentage of total use coming from single public highest-use institution
Rubric: < 20% = 3 points; 20-40% = 2 points; 40-60% = 1 point; > 60% = 0 points
Instructions: Using the consortial usage statistics (http://library.gmu.edu/vivasafe/index.htm), find the total Record Views from public institutions for the most recent complete fiscal year of data. Then find the public institution that had the highest number of Record Views for this resource. The ratio of this individual institution to the whole is the percentage of total use coming from the single public highest-use institution. For example, if a resource had 150 Record Views from GMU and there were 500 total Record Views from all public institutions, the ratio would be 150/500 = 0.3 = 30%. This product would therefore get 2 points in this category.
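
The rubric for criterion (c) maps the highest-use institution's share of total Record Views to points. Here is a minimal Python sketch of that mapping, using the worked example from the slide (150 of 500 Record Views = 30%, scoring 2 points); the function name is my own.

def highest_use_share_points(top_institution_views: int, total_public_views: int) -> int:
    """Criterion (c): the lower the concentration of use in a single public
    institution, the higher the score (rubric thresholds from the slide)."""
    if total_public_views == 0:
        return 0
    share = top_institution_views / total_public_views
    if share < 0.20:
        return 3
    if share < 0.40:
        return 2
    if share < 0.60:
        return 1
    return 0
print(highest_use_share_points(150, 500))  # 150/500 = 30% -> 2 points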

Value Metric: Putting it into Operation Collections Committee Product Managers “tested” the grids by filling them out, and the grids were further refined. VIVA Central then filled out the grids for each product. We created a database to store the grid data and to ease comparison and reporting of different evaluative sections across products. The Collections Committee used the data to identify cancellations to meet a state budget reversion.
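
As a rough sketch of the kind of database used to store grid data and compare an evaluative section across products (the table, column, and product names here are hypothetical, not VIVA's actual schema):

# Hypothetical sketch of storing grid scores and reporting one evaluative
# section across products; names and values are illustrative only.
import sqlite3
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE grid_scores (product TEXT, format TEXT, section TEXT, score INTEGER)")
conn.executemany(
    "INSERT INTO grid_scores VALUES (?, ?, ?, ?)",
    [("Database A", "database", "Cost Effectiveness", 12),
     ("Ebook Collection B", "ebook", "Cost Effectiveness", 17),
     ("Streaming Package C", "streaming", "Cost Effectiveness", 9)])
# Report a single evaluative section across products, highest-scoring first.
for product, score in conn.execute(
        "SELECT product, score FROM grid_scores "
        "WHERE section = 'Cost Effectiveness' ORDER BY score DESC"):
    print(product, score)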

Value Metric: Sample Chart

Next Steps VIVA Central will continue to fill these out for each licensed resource in upcoming years, in consultation with product managers. Although extensive, the framework is designed to be plug and play and is already being adapted by the Virginia Community College System. We will make grid adjustments as appropriate; the grids are meant to be living, not static, documents, and as consortial and state priorities shift, so should the assessment of our resources.

Thank You to the Task Force! Genya O’Gara (VIVA, Chair) Beth Blanton-Kent (University of Virginia) Cheri Duncan (James Madison University) Summer Durrant (University of Mary Washington) Julie Kane (Washington & Lee University) Madeline Kelly (George Mason University) Crystal Newell (Piedmont Virginia Community College)

Credits Noun Project: “Question” by Jessica Lock, CA Noun Project: “Graph” by Chance Smith, US Noun Project: “EReader” by Amelia Edwards, US: from the Reading and eBooks Collection Noun Project: “Measuring” by pictohaven: from the marketing - bold Collection Noun Project: “Choice” by Kirby Wu, TW: from the Business / Enterprise / Management Collection Noun Project: “People” by Gregor Cresnar: from the Business: Marketing Vol. 2 Collection