Data Collection and Beyond: Assessment at the U.Va. Library

Similar presentations
Counting resource use: The publisher view
Learner-Centered Information Science and Librarianship Programs Exemplary Standards and Guidelines VI.
Using LibQUAL+ to Develop Accountability with Key Stakeholders Raynna Bowlby Based upon presentation made w/co-presenter Dan O’Mahony (Brown U. Library)
Changes in Technology Use 1997 The first version of the survey didn’t even include a question about computer use. “Used MMC” (Multi Media Center) was added.
Listening To Our Users Queen’s 2010
William Paterson University Five Strategic Areas of Focus at the Cheng Library Fairleigh Dickinson University June 18, 2009 Anne Ciliberti
The Balanced Scorecard and MIS: Strategy Development and Evolution Jim Self Management Information Services University of Virginia Library 20th Pan.
Case Study Team 9. 2 Mission Statement The aim is to support teaching and researching of all students and faculty through the provision of relevant information,
Helping Students Succeed at Identifying Organic Compounds: Optimizing Location and Content of a Guide to the Literature Susan K. Cardinal & Kenneth J.
How Assessment Will Inform Our Future 1. Administration of on-going user surveys and focus groups to enhance reference services 2. Analysis of LibStats.
Two Decades of User Surveys The Experience of Two Research Libraries from 1992 to 2011 Jim Self, University of Virginia Steve Hiller, University of Washington.
How are we doing with assessment? Update from the Information Services Assessment Council March 8, 2006.
The Balanced Scorecard and Collection Management Jim Self University of Virginia Library June 17, 2002.
Creating a User-Centered Culture of Assessment Stella Bentley and Bill Myers University of Kansas EDUCAUSE Southwest Regional Conference 2005.
Assessment: What it means What we’ve done What’s ahead Update from the Information Services Assessment Council March 30, 2006.
Presenter: Ira Bray Monday, September 14, 12:00 noon to 1:00 p.m. Infopeople webinars are supported by the U.S. Institute of.
Assessment Surveys July 22, 2004 Chancellor’s Meeting.
LibQUAL + Surveying the Library’s Users Supervisor’s Meeting March 17, 2004.
LibQUAL + ™ Data Summary An overview of the results of the LibQUAL+™ 2003 survey with comparisons to the 2001 survey.
LibQUAL Tales from Past Participants Vanderbilt University Library Flo Wilson, Deputy University Librarian
Library Assessment in North America Stephanie Wright, University of Washington Lynda S. White, University of Virginia American Library Association Mid-Winter.
The Assessment Environment in North American Research Libraries Stephanie Wright, University of Washington Lynda S. White, University of Virginia 7th Northumbria.
LLC Meeting, August 26: Updates on Current Projects (UC journal access partnership; Integrated Library System (ILS); Reports Submission System).
2011 SAA Annual Meeting Genya O’Gara SPECIAL COLLECTIONS RESEARCH CENTER Engaged! Innovative Engagement and Outreach and Its Assessment.
What does Elsevier count? Use Measures for Electronic Resources: Theory & Practice ALCTS Program, ALA, Chicago Daviess Menefee Director, Library Relations,
Charting Library Service Quality Sheri Downer Auburn University Libraries.
MAC Fall Symposium: Learning What You Want to Know and Implementing Change Elizabeth Yakel, Ph.D. October 22, 2010.
Old.libqual.org What will ARL do for you? What will you do? January 2005 Shrivenham, UK.
Making Library Assessment Work ARL 4th Human Resources Management Symposium Washington, D.C. November 9, 2004 Steve Hiller and Jim Self University.
LIBQUAL+ and Library Summit: The Clemson Experience.
Margaret Martin Gardiner Assessment Librarian The University of Western Ontario LibQUAL+2007 Canada 25 October 2007.
Using LibQUAL+™ Results Observations from ARL Program “Making Library Assessment Work” Steve Hiller University of Washington Libraries ARL Visiting Program.
College Library Statistics: Under Review Teresa A. Fishel Macalester College Iowa Private Academic Libraries March 22, 2007 Mount Mercy College, Iowa.
Continuing the work of the Bill & Melinda Gates Foundation Presented by: Jeff Stauffer WebJunction Service Manager Date: 3 February 2005.
When the Evidence Isn’t Enough: Organizational Factors That Influence Effective and Sustainable Library Assessment Steve Hiller University of Washington.
CAMERON UNIVERSITY LIBRARY LIBRARY SERVICES Fall 2009 Program Quality Improvement Report
LibQUAL+ Finding the right numbers Jim Self Management Information Services University of Virginia Library ALA Conference Washington DC June 25, 2007.
Re-Visioning the Future of University Libraries and Archives through LIBQUAL+ Cynthia Akers Associate Professor and Assessment Coordinator ESU Libraries.
User Needs Assessment to Support Collection Management Decisions Steve Hiller University of Washington Libraries For ALCTS-CMDS.
The New Metrics at U.Va. Jim Self University of Virginia Library ARL Survey Coordinators Meeting Orlando, Florida June 25, 2004.
A POCKET GUIDE TO PUBLIC SPEAKING, 4th Edition, Chapter 9: Locating Supporting Material.
1 Identifying Instruction-Related Research Issues Deborah Lines Andersen School of Information Science and Policy University at Albany June 26, 2004.
Gabrielle Wong HKUST Library
Market Research.
LibQUAL+® Survey Administration American Library Association
Digital Library Development in Australia
BUS 642 Course Experience Tradition / snaptutorial.com
Leading E Competent Schools – Implementing Digital Learning Materials
ELI 2012 Annual Meeting February 15, 2012 Austin, Texas
Bell Ringer List five reasons why you think that some new businesses have almost immediate success while others fail miserably.
Functional Area Assessment
Today’s Agenda The importance of a conversation
Library Assessment Tools & Technology
Building on our tradition of excellence – planning for the future.
CCC Library Strategic Plan
Evaluating the Portal: The Story Behind the Numbers
DWQ Web Transformation
Multi Rater Feedback Surveys FAQs for Participants
Amie Freeman, University of South Carolina
Assessing the Assessment Tool
Cabrillo College’s Ellucian Portal Project
The Value of Scholarly Communications Programming: Perspectives from Three Settings Sarah Beaubien • Scholarly Communications.
Model T(eamwork) in The Aid Office
Using Student Survey Data to Build Campus Collaborations
Participatory Data-Gathering and Community Building
Purpose of EPIC Evaluation Program
Collecting Library Statistics for Management Decision-making
Presentation transcript:

Data Collection and Beyond: Assessment at the U.Va. Library
Jim Self, University of Virginia Library
James Madison University, February 10, 2012

Today's Program
- A short history of stats at U.Va.
- Adding qualitative info to the mix
- Focusing on the user experience
- Designing for your user
- Questions, comments

Why Measure or Count?
- To evaluate
- To compare
- To improve

Why do we collect data? To find out how we are doing. To compare ourselves with something: how we were doing last month, last year, or ten years ago; how we are doing compared to another library or another agency; or how we are doing against our goals and targets. And we collect data so we can improve what we do.

When Not to Collect Data
- When someone has already done it
- When the data will not be useful
- When the results do not justify the costs

We don't want to collect data unless the data are useful and worth the expense. And data collection has costs: the time designing an instrument; the time spent in actual data collection (interviewing, handing out surveys, counting); out-of-pocket expenses (postage, printing, hiring of personnel); and costs to the subjects themselves, including their loss of time and convenience, the library's loss of goodwill, and expenditures for incentives.

Making Data Meaningful
- Summarize
- Compare
- Analyze
- Present

A page full of numbers is usually worthless. To make use of the data, you have to work with a manageable number of figures: summarize first, then compare the summarized numbers, then analyze the comparisons, looking for patterns and trends. But for the data to be useful, you need to present the findings in an appropriate and meaningful fashion. The sketch below walks through this cycle on a small example.
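The following is a minimal sketch of the summarize/compare/present cycle in Python; the gate-count figures and metric names are invented for illustration, not actual U.Va. data.

```python
# Summarize -> compare -> analyze -> present, on hypothetical gate counts.
from statistics import mean

# Raw data: weekly visit counts for two years (hypothetical figures)
gate_counts = {
    2010: [4200, 3900, 4450, 4100, 3800, 4300],
    2011: [3900, 3700, 4050, 3850, 3600, 4000],
}

# Summarize: reduce each year's page of numbers to a single figure
summary = {year: mean(counts) for year, counts in gate_counts.items()}

# Compare and analyze: percentage change between the two years
change = (summary[2011] - summary[2010]) / summary[2010] * 100

# Present: a short, readable statement instead of raw columns of numbers
for year, avg in summary.items():
    print(f"{year}: average weekly visits = {avg:,.0f}")
print(f"Change from 2010 to 2011: {change:+.1f}%")
```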

“…but to suppose that the facts, once established in all their fullness, will ‘speak for themselves’ is an illusion.”
Carl Becker, Annual Address of the President of the American Historical Association, 1931

Data at the U.Va. Library
- Long involvement with ARL statistics
- Statistical compilations
- Comparisons with peers
- Mining existing records
- Unit cost calculations (see the sketch after this list)
- Calculation of turnaround times
- Customer surveys
- Staff surveys
- Performance indicators and targets
- Ongoing involvement with VIVA statistics
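As an illustration of the unit-cost and turnaround-time calculations mentioned above, the sketch below uses invented interlibrary-loan figures; the variable names and amounts are assumptions, not U.Va. data.

```python
# Hypothetical unit-cost and turnaround-time arithmetic.
from datetime import date

# Unit cost: total cost of an operation divided by its transaction count
ill_budget = 118_000.00     # annual ILL staffing and fees (hypothetical)
ill_transactions = 14_750   # filled borrowing requests (hypothetical)
unit_cost = ill_budget / ill_transactions
print(f"Cost per ILL transaction: ${unit_cost:.2f}")

# Turnaround time: days elapsed between request and delivery
requested = date(2011, 10, 3)
delivered = date(2011, 10, 7)
turnaround = (delivered - requested).days
print(f"Turnaround: {turnaround} days")
```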

Management Information Services
- MIS committee formed in 1992
- Evolved into a department, 1996-2000
- Coordinates collection of statistics
- Publishes annual statistical report
- Conducts surveys
- Coordinates assessment
- Resource for management and staff

Reasons to Collect Data
“No questions arise more frequently in the mind of the progressive librarian than these: Is this method the best? Is our practice adapted to secure the most effective administration? Are we up to the standards set by similar institutions? The success with which we answer them depends much on the success of our administration.”
J.T. Gerould, 1906

Message from the 21st Century
Use data to IMPROVE:
- Services
- Collections
- Processes
- Performance
- Etc., etc.
Don't use data to preserve the status quo.
Karin Wittenborg, ALA Annual, Orlando, 2004

U.Va. Library Surveys
- Faculty: 1993, 1996, 2000, 2004; response rates 59% to 70%
- Students: 1994, 1998, 2001, 2005; separate analysis for grads and undergrads; response rates 43% to 63%
- LibQUAL+ 2006: response rates 14% to 24%
- Annual surveys, 2008+: student samples and one third of faculty; response rates 29% to 47% (the response-rate arithmetic is sketched below)
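For readers who want the arithmetic behind figures like these: a response rate is completed responses divided by surveys distributed, and a margin of error can be attached to any reported proportion. The counts below are invented, not taken from the U.Va. surveys.

```python
# Hypothetical response-rate and margin-of-error arithmetic.
import math

sample_size = 3_000   # surveys distributed (hypothetical)
responses = 1_410     # completed surveys (hypothetical)

response_rate = responses / sample_size
print(f"Response rate: {response_rate:.0%}")

# 95% margin of error for a reported proportion (e.g., 80% satisfied),
# using the normal approximation
p = 0.80              # hypothetical proportion answering "satisfied"
z = 1.96              # z-score for 95% confidence
moe = z * math.sqrt(p * (1 - p) / responses)
print(f"Margin of error: +/-{moe:.1%}")
```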

Sharing the Data www.lib.virginia.edu/mis

[Chart: U.Va. Library Services and Resources: Overall Importance by Group]

[Chart: Overall Satisfaction, University of Virginia Library, 1993-2010; series: Grads, Undergrads]

[Chart: Weekly Visits to a U.Va. Physical Library, 1993-2011; callout: 38% Faculty]

U.Va.: Monitoring the Online Catalog (satisfaction with Virgo since 1993)
The Library's now-annual user satisfaction survey has measured users' satisfaction with Virgo and other services on a five-point scale. Looking back to the beginning in 1993, we can see there was a long, slow decline in satisfaction; satisfaction peaked about the time Google became a household word. The good news is that it has bounced back in the last two surveys. Note that this chart starts at 3 rather than 0 in order to give a better idea of the movement over time. For context: Google was named the top search engine by PC Magazine in 1998, many positive reviews appeared in 1999, and in 2000 Google announced the first billion-URL index. Sirsi was implemented in 1996, after the faculty survey (1996) but before the student survey (1998).

Using Data at U.Va.
- Additional resources for the science libraries (1994+)
- Redefinition of collection development (1996)
- Initiative to improve shelving (1999)
- Clemons Library open 24 hours (2000)
- Additional resources for the Fine Arts Library (2000)
- Support for transition from print to e-journals (2004)
- New and improved study space (2005-06)
- Increased appreciation of the role of journals (2007)
- Re-design of main floor of Clemons Library (2008)
- Enhanced usability of discovery tools (2010)
- Dedicated space for graduate students (2011)

“To assess, in general, is to determine the importance, size or value of; to evaluate.”
SPEC Kit 303: Library Assessment, December 2007

Why Should Libraries Do Assessment?
- To learn about our communities
- To respond to the needs of our users
- To improve our programs and services
- To support the goals of the communities
(Image credit: http://www.keepmoneylocal.org/2010/03/buy-local/)

An Assessment-Focused Institution:
- Collects, analyzes, and uses data for management, program development, and decision-making
- Emphasizes ongoing communication with customers, opportunities for collaboration, qualitative measures, and a circular process of continuous improvement

Common Library Measures
- Revenues and expenditures
- Community size
- Staff size and salaries
- Collections size
- Circulation and other collections usage
- Interlibrary borrowing and lending
- Instruction sessions
- Service measures (hours, patrons, reference use)
- Computer use detail
- Web site usage
How many of you collect these statistics? (A sketch of these measures as a simple data record follows this list.)
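To make the list concrete, here is a minimal sketch of these measures as one record per reporting year, with a derived per-capita indicator. Every name and figure in it is hypothetical.

```python
# Common library measures as a simple annual record (hypothetical values).
from dataclasses import dataclass

@dataclass
class AnnualLibraryStats:
    year: int
    expenditures: float       # total expenditures (USD)
    community_size: int       # population served
    staff_fte: float
    collection_size: int      # volumes held
    circulation: int
    ill_borrowed: int
    ill_lent: int
    instruction_sessions: int
    gate_count: int
    website_visits: int

    def circulation_per_capita(self) -> float:
        # A derived indicator that starts to relate size to the community
        return self.circulation / self.community_size

stats_2011 = AnnualLibraryStats(
    year=2011, expenditures=2_400_000, community_size=21_000,
    staff_fte=38.5, collection_size=5_200_000, circulation=310_000,
    ill_borrowed=14_750, ill_lent=12_300, instruction_sessions=420,
    gate_count=1_050_000, website_visits=2_900_000,
)
print(f"Circulation per capita: {stats_2011.circulation_per_capita():.1f}")
```

Even with a derived indicator like this, the record captures inputs and volume, which is exactly the limitation the next slides take up.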

What is missing? These don’t tell us the VALUE to the user

The Challenge for Libraries
Traditional statistics:
- emphasize inputs: how big and how many
- do not tell our story
- do not measure service quality
- do not include collaborative partners and services
We need measurements from the perspective of the user, and we need the culture and the skills to answer a basic question:

What difference do we make to our communities?

Time for a New Assessment Model
User-centered library:
- View services and activities through the eyes of users
- Users determine quality
- Our services and resources add value to the user
Assess the value provided to the community through:
- Online resources and services
- In-library resources and services
- In-person services outside the library
- Partnership with others involved in learning
- Contribution to learning, research, and life

Moving forward…
- Continued collection of quantitative data
- Expansion of the use of qualitative information
- Creation of a user experience team
- Integration of web design, development, and assessment