Improving Efficiencies Through Cost-Benefit Analysis of Metadata Creation
Joyce Celeste Chapman, NCSU Libraries Fellow
Metadata and Digital Object Roundtable: lightning talks, SAA 2010 Annual Meeting (August 11, 2010)

Cost and value of metadata
We assume there is inherent value in the work we do with metadata, yet libraries lack metrics for measuring the cost and value of that metadata.
Problem: unlike for-profits, we cannot model cost against sales.

Operational definitions of “value”
We must identify our own operational definitions of value against which we can evaluate cost. Examples:
– Value as use/circulation
– Value as discovery success
– Value as the ability to operate successfully on the open Web

Users, archival metadata, and value through discovery success
User study: how frequently do users consult particular metadata elements during information discovery (specifically, when judging the relevance of resources in a results list)?
Time study: how long do processors spend creating those same metadata elements?

Methodology
Part I: User study
– 10 advanced archival researchers
– 5 subjective information discovery tasks
Part II: Time study
– 14 collections, 9 processors (5 archivists and 4 catalogers), 2 partner institutions*
– Time data collected to the minute for metadata creation
* NCSU and the Avery Research Center for African American History and Culture, SC
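
A minimal sketch of how per-element timing data recorded to the minute might be aggregated. The log format, field names, and values are hypothetical, not the study's actual instrument (though the Collection A minutes are chosen to match the real numbers reported later in these slides):

```python
from collections import defaultdict

# Hypothetical time-study log: one row per element worked on, with time
# recorded to the minute. Fields and values are illustrative only.
time_log = [
    {"collection": "A", "element": "Biographical Note",    "minutes": 24},
    {"collection": "A", "element": "Abstract",             "minutes": 6},
    {"collection": "A", "element": "Collection Inventory", "minutes": 10},
    {"collection": "A", "element": "Other",                "minutes": 7},
]

def minutes_per_element(log):
    """Total recorded minutes per metadata element across all rows."""
    totals = defaultdict(int)
    for row in log:
        totals[row["element"]] += row["minutes"]
    return dict(totals)

print(minutes_per_element(time_log))
```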

Which elements were studied?
1. Abstract
2. Biographical / Historical Note
3. Scope and Content Note (collection-level)
4. Subject Headings
5. Collection Inventory
6. Other*
* Catch-all category for all other elements; required to analyze ratios.

Disclaimer
We have very few data sets for timing data (14, split among 3 groups). This is not enough to be sure of anything! More data needs to be tracked before we can be confident that the patterns we see are accurate.

Analyzing usability findings
Participant behavior ranked elements in the following order, from most used to least used:
1. Collection Inventory
2. Abstract
3. Subject Headings
4. Scope and Content Note
5. Biographical Note

Behavioral scores by order visited
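
The chart for this slide did not survive transcription. As a hedged sketch of what an order-based behavioral score could look like, the snippet below awards more points to elements visited earlier in a task; the linear weighting and the session data are assumptions for illustration, not the study's published scoring method:

```python
# Hypothetical sessions: ordered lists of elements each participant
# visited during a discovery task. Values are illustrative only.
sessions = [
    ["Collection Inventory", "Abstract", "Subject Headings"],
    ["Abstract", "Collection Inventory", "Scope and Content Note"],
]

def behavioral_scores(sessions):
    """Average order-based score per element: earlier visits score higher
    (linear decay -- an assumed weighting, not the study's method)."""
    points, visits = {}, {}
    for visited in sessions:
        n = len(visited)
        for rank, element in enumerate(visited):
            points[element] = points.get(element, 0) + (n - rank)
            visits[element] = visits.get(element, 0) + 1
    return {e: points[e] / visits[e] for e in points}

print(behavioral_scores(sessions))
```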

Problematic metadata overlap
Some participants were confused by the overlap between the content of the Abstract and the Scope and Content Note.
Of all the instances in which participants navigated to the Abstract, 64% of the time they never subsequently looked at the Scope and Content Note.
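
A sketch of how that 64% figure could be computed from navigation logs, assuming each session is recorded as an ordered list of visited elements (a simplification; the study's actual logging format is not specified here):

```python
def abstract_skip_rate(sessions):
    """Share of navigations to the Abstract that were never followed by
    a visit to the Scope and Content Note in the same session."""
    navigations, skips = 0, 0
    for visited in sessions:
        for i, element in enumerate(visited):
            if element == "Abstract":
                navigations += 1
                if "Scope and Content Note" not in visited[i + 1:]:
                    skips += 1
    return skips / navigations if navigations else 0.0
```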

Timing analysis: average ratios
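
The ratio chart itself did not transcribe. As a rough sketch (with hypothetical minutes), per-collection time shares can be computed using the catch-all "Other" category so that shares sum to 1, then averaged across collections:

```python
def time_ratios(element_minutes):
    """Share of total metadata-creation time per element for one
    collection; the 'Other' catch-all makes the shares sum to 1."""
    total = sum(element_minutes.values())
    return {e: m / total for e, m in element_minutes.items()}

# Hypothetical per-collection minutes, illustrative only.
collections = [
    {"Biographical Note": 24, "Abstract": 6, "Other": 17},
    {"Biographical Note": 10, "Abstract": 8, "Other": 22},
]

def average_ratios(collections):
    """Mean per-element share of time across collections."""
    ratios = [time_ratios(c) for c in collections]
    elements = {e for r in ratios for e in r}
    return {e: sum(r.get(e, 0.0) for r in ratios) / len(ratios)
            for e in elements}

print(average_ratios(collections))
```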

Timing analysis: real time

Cost to Value? Issues and Questions
1. Compared to use, a disproportionately high percentage of time is spent creating Biographical Notes.
– Data acquires meaning in context: do we care that we spend a high percentage of metadata-creation time on the Biographical Note if that translates to real numbers that are not “significant” by our institutional standards?
– We might want to regulate time spent on metadata, but only for metadata creation that exceeds a certain real-time baseline.

Data acquires meaning in context
Examples in real numbers from NCSU:
– Collection A: Biographical Note = 51% of total metadata time = 24 minutes
– Collection B: Biographical Note = 43% of total metadata time = 9.8 hours
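
Similar percentages can hide very different absolute costs: from the figures above, Collection A's total metadata time works out to roughly 47 minutes (24 / 0.51), while Collection B's is roughly 22.8 hours (9.8 / 0.43). A one-line sketch of that conversion:

```python
def total_metadata_time(element_time, element_share):
    """Infer total metadata-creation time from one element's recorded
    time and its share of the total."""
    return element_time / element_share

print(total_metadata_time(24, 0.51))   # Collection A: ~47 minutes
print(total_metadata_time(9.8, 0.43))  # Collection B: ~22.8 hours
```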

Cost to Value? Issues and Questions (continued)
2. The Abstract is high value: users often go there first, and some use only the Abstract and never the Scope and Content Note.
3. Users emphasize the importance of the Collection Inventory. Are we spending enough time there to match its value rating?

Next steps: study other facets of value
This study examined only one facet of value. To form a complete picture of the value of metadata, further studies must be conducted.

Thanks!
For more information, a longer presentation on this study will be given at the SAA Description Section meeting on Friday, 1:00-3:00 pm.
Contact: