1
Metrics & Management: Cost & value of metadata workflows
Joyce Chapman, Triangle Research Libraries Network (N.C.)
SAA 2011, August 27
2
“Return on investment”
External-facing ROI = proving our worth to funders or parent institutions
Internal-facing ROI = metrics for management
3
McKinsey Global Institute report
In the next 5-10 years, the U.S. workforce will face a shortage of 140,000 to 190,000 people with the necessary data analysis skills to fill market demand, and a shortage of 1.5 million managers who know how to make use of data analysis and assessment findings to make effective management decisions. Full report:
4
The problem with ROI
Unlike for-profits, libraries and archives don't measure value via sales
We must create other operational definitions of "value" with which to work
Examples: discovery success, interoperability on the open web, ability to support administrative tasks
5
ROI initiatives at NCSU
The challenge: to manage metadata workflows based on data about cost and benefit to users
There is little prior work in this arena to guide us
6
Projects I’ll discuss today
Evaluating the cost and benefit of manual metadata enhancements on use of digital images
Analyzing how researchers value different parts of the finding aid in the information discovery process, versus the time processors spend creating the same metadata
7
Study #1 Evaluating the effects of manual metadata enhancements
8
Background
30 hours per week of manual metadata creation for ongoing, non-grant digitization
Images have pre-existing, generic metadata automated from finding aids
9
Questions
To what degree is the metadata helping drive users to our online materials?
Is that value worth the cost in staff time?
Should we stop performing manual enhancements in favor of the generic metadata?
10
Methodology
Performance comparison using A/B testing: a single collection was split into two random groups (A and B)
Group A received enhancements; group B had only the generic metadata
Metadata enhancements were timed
After 6 months, we analyzed differences in unique page views (using Google Analytics)
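None of the study's actual tooling is published in these slides; the following is a minimal sketch of how such an A/B split and pageview comparison could be set up in Python, assuming a list of image identifiers and a CSV export of unique pageviews from Google Analytics (the file layout and the column names page_path and unique_pageviews are assumptions for illustration).

```python
import csv
import random
from statistics import mean

def split_ab(image_ids, seed=42):
    """Randomly split a collection of image IDs into two groups (A and B)."""
    ids = list(image_ids)
    random.Random(seed).shuffle(ids)
    midpoint = len(ids) // 2
    return ids[:midpoint], ids[midpoint:]  # A: to be enhanced, B: generic metadata only

def unique_pageviews(csv_path):
    """Read unique pageviews per page from an analytics export
    (hypothetical columns: page_path, unique_pageviews)."""
    views = {}
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            views[row["page_path"]] = int(row["unique_pageviews"])
    return views

def compare_groups(group_a, group_b, views):
    """Mean unique pageviews per image for each group, plus the A:B ratio."""
    mean_a = mean(views.get(i, 0) for i in group_a)
    mean_b = mean(views.get(i, 0) for i in group_b)
    return mean_a, mean_b, (mean_a / mean_b if mean_b else float("inf"))
```

The ratio returned by compare_groups is the kind of figure behind the "four times more unique page views" result reported on the next slide.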
11
Value
Manual metadata enhancements greatly increase traffic to our digital images
Images with manually enhanced metadata received four times more unique page views!
12
28% of search strings contained person names (available only in enhanced metadata)
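Purely as an illustration of how a share like that 28% could be computed, here is a sketch that checks recorded search strings against a list of person names drawn from the enhanced metadata; the inputs shown are toy data, not the study's logs.

```python
def share_with_person_names(search_strings, person_names):
    """Fraction of search strings containing any known person name (case-insensitive)."""
    names = [n.lower() for n in person_names]
    if not search_strings:
        return 0.0
    hits = sum(1 for s in search_strings if any(name in s.lower() for name in names))
    return hits / len(search_strings)

# Toy data (real inputs would come from the search logs and the enhanced metadata)
queries = ["john smith letters", "cotton mill photographs", "jane doe diary"]
names = ["John Smith", "Jane Doe"]
print(f"{share_with_person_names(queries, names):.0%} of search strings contain person names")
```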
13
Cost
Pre-set goal: 5 minutes per image
Study mean: 7 minutes per image
Our administrators weighed the information and made a cost/benefit decision, taking localized context into account
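The 7-minute figure above is simply the mean over the timed enhancement records; a minimal sketch of that calculation and its comparison against the 5-minute goal (the timing values shown are illustrative, not study data):

```python
from statistics import mean

GOAL_MINUTES_PER_IMAGE = 5  # pre-set departmental goal

def mean_enhancement_time(timings_minutes):
    """Mean manual enhancement time per image, and how far it is over or under the goal."""
    observed = mean(timings_minutes)
    return observed, observed - GOAL_MINUTES_PER_IMAGE

# Toy timings for a handful of images (illustrative only)
observed, over_goal = mean_enhancement_time([6.5, 8, 7, 5.5, 8])
print(f"mean: {observed:.1f} min/image ({over_goal:+.1f} min vs. goal)")
```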
14
Cost versus benefit?
Localized context: catalogers are creating metadata enhancements for digital images faster than Special Collections can scan
Decision: Yes, either 5 or 7 minutes per image of staff time is well worth the value of quadrupled unique page views
15
A successful project Instead of relying on the assumption that our existing workflows were of value, we gathered data that showed a concrete correlation between a workflow and increased discovery. In this way, we were able to take a data-driven approach to decision-making around resource allocation.
16
Conclusion
Read the full study here: http://go.ncsu.edu/llzhzy
This study was simple and involved very little staff time or effort
If you would like to undertake a similar study and need help, contact me!
17
Study #2 Cost and value of finding aid metadata in determining relevancy
18
Research gap
How do researchers "value" different archival metadata elements during different phases of the discovery process?
How do archivists "value" (i.e., spend time creating) these same metadata elements?
19
Background
Our study looked at both of these issues in order to determine whether our current finding aid creation practices were in line with both departmental expectations and what users value
20
Methodology
This study used "discovery success" as the operational definition of value
Value: usability testing and interviews with experienced academic researchers
Cost: timing of metadata creation by archival processors and by catalogers retrained to perform archival processing
21
Methodology
Timing data was tracked for all aspects of metadata creation combined (research, authority work, encoding, supervisor editing)
6 months, 14 collections, 9 processors, two partner institutions (NCSU and the Avery Research Institute at the College of Charleston)
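The timing instrument itself isn't shown in these slides; as a rough sketch, logs of this kind could be aggregated per metadata element and per processor group along these lines (the CSV columns processor_group, collection, element, and minutes are assumptions, not the study's actual format).

```python
import csv
from collections import defaultdict

def minutes_by_element(log_path):
    """Sum recorded minutes per finding aid element across all processors and collections."""
    totals = defaultdict(float)
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["element"]] += float(row["minutes"])
    return dict(totals)

def minutes_by_group(log_path):
    """Total metadata creation time per processor group
    (e.g., archivists vs. catalogers retrained as processors)."""
    totals = defaultdict(float)
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["processor_group"]] += float(row["minutes"])
    return dict(totals)
```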
22
Findings
23
Value
Three measures of use/usability: behavioral, perceived, rank
Ranked results (from most to least useful/valuable):
Collection Inventory
Abstract
Subject Headings
Scope & Content Note (collection-level)
Biographical/Historical Note
24
Behavioral scores by order visited
25
Abstract v. Scope Note
Some researchers didn't understand the difference; the content looks similar
Some said they only ever read one
Problem: during the study, in 64% of instances in which a participant read the Abstract, they never subsequently read the Scope and Content Note
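To illustrate how a figure like that 64% could be derived from usability-session logs, here is a minimal sketch; the session structure (an ordered list of element views per participant task) is an assumption, not the study's instrument.

```python
def abstract_without_scope_rate(sessions):
    """Fraction of Abstract reads that were never followed by a Scope and Content Note read.

    `sessions` is a list of ordered element-view lists, one per participant task, e.g.
    [["Abstract", "Collection Inventory"], ["Abstract", "Scope and Content Note"]].
    """
    abstract_reads = 0
    no_scope_after = 0
    for views in sessions:
        if "Abstract" not in views:
            continue
        abstract_reads += 1
        after = views[views.index("Abstract") + 1:]
        if "Scope and Content Note" not in after:
            no_scope_after += 1
    return no_scope_after / abstract_reads if abstract_reads else 0.0
```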
26
Behavioral scores by order visited
27
Behavioral scores by order visited
28
Biographical Note
What causes the very low ranking for use and perceived value of the Biographical Note?
Participants discussed two reasons
29
Trust
1. Have archivists represented the collection adequately? Infrequent footnoting is problematic.
Even if researchers trust the accuracy of the note, its value is decreased because they must reproduce the research anyway; they can't cite finding aids.
30
Research subject
2. Researchers may only read Biographical Notes if their research subject is a person or corporate entity, not a topical subject.
40% of participants claimed they would only read the Biographical Note during this phase of the discovery process if their research topic was the record creator.
31
Findings: "cost" (creation time)
32
Metadata timing ratios
An extraordinarily high percentage of total metadata creation time was spent on the Biographical Note
Mean: 25% (including Collection Inventory in the total), 41% (excluding Collection Inventory)
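Building on the aggregation sketch shown after the methodology slide, percentages like these 25% / 41% means could be derived from per-element time totals as follows; the element names and numbers below are toy values chosen only to reproduce the shape of the finding, not study data.

```python
def element_share(totals, element, exclude=()):
    """Share of total creation time spent on one element, optionally excluding
    other elements (e.g., the Collection Inventory) from the total."""
    denom = sum(t for e, t in totals.items() if e not in exclude)
    return totals.get(element, 0.0) / denom if denom else 0.0

# Toy totals in minutes per element (illustrative only)
totals = {
    "Biographical Note": 250.0,
    "Collection Inventory": 390.0,
    "Scope and Content Note": 150.0,
    "Abstract": 80.0,
    "Subject Headings": 130.0,
}
print(f"incl. inventory: {element_share(totals, 'Biographical Note'):.0%}")
print(f"excl. inventory: {element_share(totals, 'Biographical Note', exclude={'Collection Inventory'}):.0%}")
```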
33
Metadata timing ratios (Collection Inventory excluded to normalize for collection size)
34
Metadata timing ratios
35
Metadata timing ratios
Collection Inventory: how much time is "enough," considering the extremely high value attributed by users?
When we looked at our real time data instead of ratios, we decided our processors could afford to spend more time on Inventory metadata
36
Real time
The two groups of archivists spent very similar amounts of time on metadata
If anything, we were surprised at how little time our processors were spending on metadata
The catalogers were spending much more time on metadata creation than either group of archivists
37
Catalogers’ real time: what does it mean?
Special Collections had not sufficiently emphasized in training how to write a narrative Biographical or Scope and Content Note
No temporal benchmarks had been provided
Supervisors were unable to give any feedback on metadata creation during training because they were not monitoring the process; they only monitored total processing time
38
Cost/value comparison
Cost and value are unequal for Biographical Notes (for the aspect of "value" studied here)
More time could be spent on Collection Inventory metadata to match its high value to users
Confusion around the Abstract / Scope Note requires further thought