Data Collection and Beyond: Assessment at the U.Va. Library
Jim Self, University of Virginia Library
James Madison University, February 10, 2012
Today’s Program
A short history of stats at U.Va.
Adding qualitative info to the mix
Focusing on the user experience
Designing for your user
Questions, comments
Why Measure or Count?
To evaluate
To compare
To improve

Why do we collect data? To find out how we are doing, and to compare ourselves with something: how we were doing last month, last year, or ten years ago; how we are doing compared to another library or another agency; or how we are doing against our goals and targets. And we collect data so we can improve what we do.
When Not to Collect Data
When someone has already done it
When the data will not be useful
When the results do not justify the costs

We don’t want to collect data unless the data are useful and worth the expense, and data collection has costs: the time spent designing an instrument; the time spent in actual data collection (interviewing, handing out surveys, counting); out-of-pocket expenses such as postage, printing, and hiring of personnel; costs to the subjects (respondents) in lost time and convenience; the library’s loss of goodwill; and expenditures for incentives.
Making Data Meaningful
Summarize
Compare
Analyze
Present

A page full of numbers is usually worthless. To make use of the data, you have to work with a manageable number of figures. You have to summarize; then you can compare the summarized numbers, and then analyze the comparisons, looking for patterns and trends. But for the data to be useful, you also need to present the findings in an appropriate and meaningful fashion.
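The summarize-then-compare step can be sketched in a few lines of Python. The survey scores below are invented purely for illustration; they are not actual U.Va. survey data.

```python
from statistics import mean

# Hypothetical survey responses: (year, group, satisfaction on a 1-5 scale).
responses = [
    (2004, "faculty", 4.1), (2004, "faculty", 3.9), (2004, "grad", 3.6),
    (2008, "faculty", 4.4), (2008, "faculty", 4.2), (2008, "grad", 3.9),
]

# Summarize: mean satisfaction per (year, group).
groups = {}
for year, group, score in responses:
    groups.setdefault((year, group), []).append(score)
summary = {key: round(mean(scores), 2) for key, scores in groups.items()}

# Compare: change in faculty satisfaction between survey years.
change = summary[(2008, "faculty")] - summary[(2004, "faculty")]
print(summary)
print(f"Faculty change 2004 -> 2008: {change:+.2f}")
```

Analysis and presentation then work from the small `summary` table rather than the raw page of numbers.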
“…but to suppose that the facts, once established in all their fullness, will ‘speak for themselves’ is an illusion.” Carl Becker Annual Address of the President of the American Historical Association, 1931
Data at the U.Va. Library
Long involvement with ARL statistics
Statistical compilations
Comparisons with peers
Mining existing records
Unit cost calculations
Calculation of turnaround times
Customer surveys
Staff surveys
Performance indicators and targets
Ongoing involvement with VIVA statistics
Management Information Services
MIS committee formed in 1992
Evolved into a department
Coordinates collection of statistics
Publishes annual statistical report
Conducts surveys
Coordinates assessment
Resource for management and staff
Reasons to Collect Data
“No questions arise more frequently in the mind of the progressive librarian than these: Is this method the best? Is our practice adapted to secure the most effective administration? Are we up to the standards set by similar institutions? The success with which we answer them depends much on the success of our administration.”
J.T. Gerould, 1906
Message from the 21st Century
Use data to IMPROVE:
Services
Collections
Processes
Performance
Etc., etc.

Don’t use data to preserve the status quo.
Karin Wittenborg, ALA Annual, Orlando, 2004
U.Va. Library Surveys
Faculty surveys: 1993, 1996, 2000, 2004; response rates 59% to 70%
Student surveys: 1994, 1998, 2001, 2005; separate analysis for graduate and undergraduate students; response rates 43% to 63%
LibQUAL+ 2006: response rates 14% to 24%
Annual surveys, 2008 onward: samples of students and one third of faculty; response rates 29% to 47%
Sharing the Data
U.Va. Library Services and Resources: Overall Importance by Group
[Chart: Overall Satisfaction, University of Virginia Library, 1993-2010; separate series for graduate and undergraduate students]
[Chart: Weekly Visits to a U.Va. Physical Library, 1993-2011; data label: Faculty, 38%]
U.Va: Monitoring the Online Catalog Satisfaction with VIRGO since 1993
The Library’s now-annual user satisfaction survey has measured users’ satisfaction with Virgo and other services on a five-point scale. Looking back to the beginning in 1993, we can see a long, slow decline in satisfaction, which bottomed out about the time Google became a household word. The good news is that satisfaction has bounced back in the last two surveys. Note that this chart starts at 3 rather than 0 in order to give a better idea of the movement over time. Google was named the top search engine by PC Magazine, and many positive reviews followed; in 2000 Google announced the first billion-URL index. Sirsi was implemented after the faculty survey (1996) but before the student survey (1998).
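The charting technique mentioned in the notes, starting the y-axis at 3 rather than 0 so that small movement on a five-point scale stays visible, can be sketched with matplotlib. The years and scores here are invented for illustration, not the actual Virgo survey results.

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

# Hypothetical mean satisfaction scores on a 1-5 scale (illustrative only).
years = [1993, 1996, 2000, 2004, 2008, 2010]
satisfaction = [4.1, 4.0, 3.7, 3.6, 3.8, 4.0]

fig, ax = plt.subplots()
ax.plot(years, satisfaction, marker="o")
# Start the y-axis at 3 rather than 0 so year-to-year movement
# on a five-point scale remains visible.
ax.set_ylim(3, 5)
ax.set_ylabel("Mean satisfaction (1-5 scale)")
ax.set_title("Satisfaction with the online catalog")
fig.savefig("satisfaction.png")
```

A truncated axis like this should always be labeled clearly, as the speaker does here, so the audience knows the scale has been magnified.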
Using Data at U.Va.
Additional resources for the science libraries (1994+)
Redefinition of collection development (1996)
Initiative to improve shelving (1999)
Clemons Library open 24 hours (2000)
Additional resources for the Fine Arts Library (2000)
Support for transition from print to e-journals (2004)
New and improved study space
Increased appreciation of the role of journals (2007)
Re-design of main floor of Clemons Library (2008)
Enhanced usability of discovery tools (2010)
Dedicated space for graduate students (2011)
“To assess, in general, is to determine the importance, size, or value of; to evaluate.”
SPEC Kit 303: Library Assessment, December 2007.
Why Should Libraries Do Assessment?
To learn about our communities
To respond to the needs of our users
To improve our programs and services
To support the goals of the communities
An Assessment-Focused Institution:
Collects, analyzes, and uses data for management, program development, and decision-making
Emphasizes ongoing communication with customers, opportunities for collaboration, qualitative measures, and a circular process of continuous improvement
Common Library Measures
Revenues and expenditures
Community size
Staff size and salaries
Collections size
Circulation and other collections usage
Interlibrary borrowing and lending
Instruction sessions
Service measures (hours, patrons, reference use)
Computer use detail
Web site usage

How many of you collect these statistics?
What is missing? These don’t tell us the VALUE to the user.
The Challenge For Libraries
Traditional statistics:
emphasize inputs (how big and how many)
do not tell our story
do not measure service quality
do not include collaborative partners and services

We need measurements from the perspective of the user, and the culture and the skills to answer a basic question:
What difference do we make to our communities?
Time For A New Assessment Model
A User-Centered Library:
Views services and activities through the eyes of users
Users determine quality
Our services and resources add value for the user

Assess the value provided to the community through:
Online resources and services
In-library resources and services
In-person services outside the library
Partnership with others involved in learning
Contribution to learning, research, and life
Moving Forward…
Continued collection of quantitative data
Expansion of the use of qualitative information
Creation of a user experience team
Integration of web design, development, and assessment