1
Effective, Sustainable and Practical Library Assessment
Steve Hiller, Director, Assessment & Planning, University of Washington Libraries; ARL Visiting Program Officer
Jim Self, Director, Management Information Services, University of Virginia Library
University of South Carolina University Libraries
December 3, 2007
2
Library Assessment: More than Numbers
Library assessment is a structured process:
To learn about our communities
To respond to the needs of our users
To improve our programs and services
To support the goals of the communities
3
Why Assess?
Accountability and justification
Improvement of services
Comparisons with others
Identification of changing patterns
Marketing and promotion
Opportunity to tell our own story
Using data, not assumptions, to make decisions ("Assumicide!")

Why do we assess? Why do we collect data? To find out how we are doing. To explain ourselves to others; to convince our funding agency that money is being well spent. To improve what we do. To compare ourselves with something: other libraries, other agencies, our competitors, if you will. To compare this year with last year, or with ten years ago. To learn about trends over time: where is demand increasing, and where is it decreasing? What services have fallen from favor? Finally, if the assessment is positive, it can serve as a marketing tool. We can let the world know what a great job we do.
4
The Challenge for Libraries
Traditional statistics are no longer sufficient
  Emphasize inputs: how big and how many
  Do not tell the library's story
  May not align with organizational goals and plans
  Do not measure service quality
Need measurements from the user's perspective
Need the organizational culture and the skills to answer a basic question: What difference do we make to our communities?
5
ARL Sponsored Assessment
Tools
  ARL Statistics
  LibQUAL+®
  MINES for Libraries
Building a Community of Practice
  Library Assessment Conferences
  Service Quality Evaluation Academy
  Library Assessment blog
  Workshops
  Individual Library Consultation (Jim and Steve)
    Making Library Assessment Work (24 libraries in )
    Effective, Sustainable, Practical Library Assessment (6 in 2007)
6
What We Found in Our Visits
Strong interest in using assessment to improve customer service and demonstrate value
Uncertainty on how to establish and sustain assessment
Lack of assessment knowledge among staff
More data collection than data utilization
Effectiveness not dependent on library size or budget
Each library has a unique culture and mission
7
Effective Assessment
Focuses on the customer
Is aligned with library and university goals
Assesses what is important
Is outcomes oriented
Develops criteria for success
Uses appropriate and multiple assessment methods
Uses corroboration from other sources
Provides results that can be used

If you are going to do assessment, you should do it right. You should examine services and products from the viewpoint of the user, and not just the users who are your friends: a diverse and broad sample of users. Results should be measurable, with goals or criteria of some sort. When you finish measuring, it should be obvious whether something is successful. In a serious assessment it is often good to have both quantitative results (numbers, scores, data) and qualitative information (interviews, reports, narratives). If results are unexpected or surprising, can you get corroboration from another source? Maybe something is wrong with the measurement.
8
Sustainable Assessment Needs . . .
Organizational leadership
Sufficient resources
Supportive organizational culture
Identifiable organizational responsibility
Connection to strategic planning and priorities
Iterative process of data collection, analysis, and use
Involvement of customers, staff and stakeholders
9
Practical Assessment
Keep it simple and focused: "less is more"
Know when enough is enough
Use assessment that adds value for customers
Present results that are understandable
Organize to act on results
10
Customer-Centered Library and the Culture of Assessment
All services and activities are viewed through the eyes of the customers
Customers determine quality
Library services and resources add value to the customer

Culture of Assessment: an organizational environment in which decisions are based on facts, research, and analysis, and services are planned and delivered to maximize positive customer outcomes.

It's not about us! It's about the customer.
11
Understanding our Customers: What We Need to Know to Support Our Communities
What are their teaching, learning, and research interests?
How do they work? What's important to them?
How do they find information needed for their work?
How do they currently use library/information services? How would they prefer to do so?
How do they differ from each other in library use/needs?
How does the library add value to their work? How does the library contribute to their success?
12
If It Were Only This Easy!
13
Good Assessment Starts Before You Begin . . . Some Questions to Ask
Define the question
  What do you need to know, and why?
  How will you use the information/results?
Where/how will you get the information?
  Methods used
  Existing data
  New data (where or who will you get it from?)
How will you analyze the information?
Who will act upon the findings?
14
University of Washington (Site of the 2008 Library Assessment Conference!)
Located in beautiful Seattle (metro population 3.2 million)
Comprehensive public research university
  27,000 undergraduate students
  12,000 graduate and professional students (80 doctoral programs)
  4,000 research and teaching faculty
  $800 million annually in federal research funds (2nd in U.S.)
Large research library system
  $40 million annual budget
  150 librarians on 3 campuses
15
University of Washington Libraries Assessment Methods Used
Large-scale user surveys every 3 years ("triennial survey"): 1992, 1995, 1998, 2001, 2004, 2007
In-library use surveys every 3 years beginning 1993
Focus groups/interviews (annually since 1998)
Observation (guided and non-obtrusive)
Usability
Usage statistics/data mining
Information about assessment program available at:
16
Case Study: UW Libraries Review of Support of Bioscience Programs
Reasons for review:
  Better understand how bioscientists work
  Understand significance and value of bioscience and the research enterprise to the University
  Gauge extent and impact of interdisciplinary research
  Understand implications of changes in library use patterns
  Review viability of Libraries organizational structure/footprint
  Strengthen library connection in support of bioscience programs and the research enterprise
17
The Importance of the Research Enterprise
University of Washington operating revenues: $2.4 billion
Research grants: $1 billion
  Health and Human Services: $510 million
  National Science Foundation: $95 million
  Other federal agencies: $190 million
  Industry/Foundations: $100 million
  Other non-federal: $110 million
18
More Than Surveys and Statistics The Qualitative Often Provides the Key
Increased use and importance of such qualitative methods as comments, interviews, focus groups, usability, and observation
Statistical data often can't tell us:
  Who, how, why
  Value, impact, outcomes
Qualitative data provides information directly from users:
  Their language
  Their issues
  Their work
Qualitative data provides context and understanding
19
Biosciences Review Process (2006)
Define scope (e.g., what is "bioscience"?)
Identify and mine existing data sources
  Extensive library assessment data, including usage information
  Institutional and external data, including bibliometric information
Acquire new information through a customer-centered qualitative approach
  Environmental scan
  Interviews (12 faculty)
  Focus groups (6 total: 3 faculty, 2 grad, 1 undergrad)
  Peer library surveys
NO NEW USER SURVEYS
20
Faculty Interview Key Findings
First stop: Google or PubMed Central; also Web of Science
Those with grant support tend to buy books from Amazon
The transaction cost from discovery to delivery is too high
Need to integrate fragmented library systems and processes
Graduate students are self-sufficient in finding information
Faculty who teach undergraduates use libraries differently
Had difficulty coming up with "new services" unprompted
21
Focus Group Themes
Content is the primary link to the library
Identify the library with e-journals; want more titles and backfiles
Print is dead, really dead
  If it's not online, they want it delivered online
Provide library-related services and resources in our space, not yours
Discovery begins outside of library space with Google and PubMed; lesser use of library bibliographic databases
Faculty and many grads go to the physical library as a last resort; too many physical libraries
Lack understanding of many library services and resources
22
Biosciences Task Force Recommendations
Integrate search/discovery tools into users' workflows
Expand/improve information and service delivery options
Make physical libraries more inviting and easier to use
  Consolidate libraries, collections, and service points
  Reduce print holdings; focus on services and work space
Use an integrated approach to collection allocations
Get librarians to work outside library space
Lead/partner in scholarly communications and e-science
Provide more targeted communication and marketing
23
In God We Trust: All Others Must Bring Data
Did themes raised in the interviews/focus groups reflect the bioscience population? The campus community?
The 2007 Triennial Survey as corroborating source
Related questions:
  Mode of access (in-person, remote)
  Resource type importance
  Sources consulted for research
  Primary reasons for using Libraries Web sites
  Libraries contribution to work and academic success
  Useful library services (new and/or expanded)
24
UW Triennial Library Survey: Number of Respondents and Response Rate, 1992-2007

Group          2007         2004         2001    1998        1995         1992
Faculty        1455 (36%)   1560 (40%)   1345    1503        1359 (31%)   1108 (28%)
Grad Student   580 (33%)    627          597     457 (46%)   409 (41%)    560 (56%)
Undergrad      467 (20%)    502 (25%)    497     787 (39%)   463 (23%)    407
27
Change in Off-Campus Remote Use (Percentage using library services/collections at least 2x week)
28
Graduate Student Mode of Access by Academic Area (% using at least 2x week)
29
Where Do They Go? Sources Consulted for Information on Research Topics (Scale of 1 "Not at All" to 5 "Usually")

The open Internet is now used more often by all groups (regardless of academic area) than UW Libraries-provided bibliographic databases to find information on research topics. Undergraduates also rely heavily on open Internet sources such as Wikipedia for basic information.

"If it's not on the Internet, it doesn't exist. My students at all levels behave this way. They also all rely on Wikipedia almost exclusively for basic information." (Associate Professor, English)
30
Faculty: Resource Type Importance by Academic Area (Scale of 1 "not important" to 5 "very important")

Journals were the identifying brand for the library in the focus groups, and you can see that current journals are important to all groups. Note how importance varies by group for the other resource types. Bioscience students also rate older journals as important, while grad students in the humanities/social sciences and physical sciences-engineering find books important. Books were seen as especially useful for review-type information in related fields. Note the relatively low importance given to bibliographic databases.
31
Reasons for Faculty Use of Libraries Web Sites by Academic Area (Use at least 2x per week)
32
Usefulness of New/Expanded Services for Faculty & Grads
33
Libraries Contribution to: (Scale of 1 “Minor” to 5 “Major”)
34
Survey Follow-Up Actions
Probe deeper on specific library contributions to research and student academic success using qualitative methods
  Interviews/focus groups beginning Winter 2008
Review scope and effectiveness of information literacy programs
Develop plan to deliver "print" content to faculty and grad students in their format of choice and in their space
  Pilot test of "scan on demand" begins January 2008
Strengthen our subject librarian liaison efforts to better understand and support research in their areas
Develop standardized toolkit for assessing library connection to the research enterprise
Revisit scholarly communications policy
Integrate library services and resources into user workflows
35
How UW Libraries Has Used Assessment
Extend hours in Undergraduate Library (24/5.5)
Create more diversified student learning spaces
Eliminate print copies of journals
Enhance usability of discovery tools and website
Provide standardized service training for all staff
Stop activities that do not add value
Change/reallocate budgets and staffing
Inform the strategic planning process
Support budget requests to the University
Reduce collections space; add computers and work areas
Open undergraduate library 24 hours
36
How Are We Doing? Overall Satisfaction by Group 1995-2007
We are a great library and our users know it. Faculty and student satisfaction with the Libraries is exceptionally high and keeps rising. Our goal is to maintain and increase satisfaction with the Libraries by providing services and resources that add value to their work and keep them successful. We can do that by continuing to listen to our users, understand their needs, and provide services that address them.

"You guys and gals rock!!!!!! We need to invest in our library system to keep it the best system in America. The tops! My reputation is in large part due to you." (Professor, Forest Resources)
37
UVA MIS Consult
38
Collecting the Data at U.Va.
Customer surveys
Staff surveys
Mining existing records
Comparisons with peers
Qualitative techniques
39
Corroboration
Data are more credible if they are supported by other information
John le Carré's "two proofs"

Earlier I mentioned corroboration. It applies to library assessment, and to international espionage. John le Carré, in his novels about Cold War spying and betrayal, noted that at least "two proofs" were needed to confirm a fact. Let's look at an example of corroboration using data from two sources: student surveys and tallies of library use.
40
UVa Customer Surveys
Faculty: 1993, 1996, 2000, 2004
  Separate analysis for each academic unit
  Response rates 59% to 70%
Students: 1994, 1998, 2001, 2005
  Separate analysis for grads and undergrads
  Undergrad response rates 43% to 50%
  Grad response rates 54% to 63%
LibQUAL+™ in 2006
41
Analyzing U.Va. Survey Results
Two scores for resources, services, and facilities:
  Satisfaction = mean rating (1 to 5)
  Visibility = percentage answering the question
Permits comparison over time and among groups
Identifies areas that need more attention

In each of our surveys, we ask respondents to rate on a 1 to 5 scale a long list of services, resources, and facilities. The list has varied from survey to survey, but it has always been more than 50 and fewer than 100 items. By calculating two scores (satisfaction and visibility) we are able to compare the items from high to low, and to group them for attention. "Low satisfaction / high visibility" items are obvious candidates for attention. "High satisfaction / low visibility" items may be candidates for instruction or publicity. We also are able to compare items over time and between groups: Is satisfaction increasing or decreasing? What about visibility? One example: the visibility of the reference function has declined markedly among undergraduates. On the 1994 student survey, 76% of undergraduates gave a rating to "Answering questions in person." On the 1998 survey, 64% of undergrads gave a rating to the same query. In the 2001 survey, only 39% of undergrads rated this query: "Answering questions by phone, or in person."
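A minimal sketch of how the two scores just described could be computed from raw survey data. The column names, response values, and pandas-based approach are illustrative assumptions, not the Library's actual tooling.

```python
# Sketch: computing satisfaction and visibility from hypothetical survey responses.
# Each row is a respondent; each column is a rated item (1-5); unanswered items are NaN.
import numpy as np
import pandas as pd

responses = pd.DataFrame({
    "Answering questions in person": [5, 4, np.nan, 3, np.nan],
    "Hours of operation":             [4, 4, 5, 3, 4],
})

satisfaction = responses.mean()                # mean rating, skipping blanks
visibility = responses.notna().mean() * 100    # % of respondents who rated the item

summary = pd.DataFrame({"satisfaction": satisfaction.round(2),
                        "visibility_%": visibility.round(0)})
print(summary.sort_values("visibility_%"))
```

Sorting by visibility (or plotting satisfaction against visibility) makes the "low satisfaction / high visibility" candidates for attention easy to spot.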
42
Reference Activity and Visibility in Student Surveys
You can see that the survey visibility results correlate with the decline in reference questions, as tallied and reported each year. It is almost an exact match (r = .98). Among undergraduates the decline in reference use is real and, I would say, undeniable. We have two proofs: corroboration. The number of recorded reference questions has fallen 50%. The number of undergrads answering the reference survey question has fallen 50%.
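As a rough illustration of this "two proofs" check, the sketch below computes a Pearson correlation between the two series. The visibility percentages echo the survey figures quoted above; the reference-question counts are hypothetical, not the actual UVa tallies.

```python
# Sketch: corroborating two independent measures of reference decline.
import numpy as np

# Survey visibility (% of undergrads rating the reference item): 1994, 1998, 2001.
survey_visibility_pct = np.array([76, 64, 39])
# Hypothetical recorded reference questions for the same years (roughly a 50% drop).
reference_questions = np.array([100_000, 82_000, 50_000])

r = np.corrcoef(reference_questions, survey_visibility_pct)[0, 1]
print(f"Pearson r = {r:.2f}")  # a value near 1 means the two sources tell the same story
```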
43
Making the most of LibQUAL
Scan the results by user category
Use thermometer charts
Identify high and low desire areas
Identify the "red zones"
Examine the comments
Compare satisfaction scores with peers
Follow up on problem areas
44
This takes the thermometer graph to a new level of detail, displaying the faculty scores for desired, perceived, and minimum for each of the 22 core questions. This graph presents the same data as the LibQUAL+ radar chart. However, I would assert that this presentation is much more intelligible and much more informative than the radar charts. A chart like this can be easily explained to an administrator or an outsider. Looking at this chart, we see that UVa faculty place a relatively high value on service (the right side of the chart) and believe the library is offering almost optimal service. They place an even higher value on information control and believe the library is barely meeting the minimum, or falling short. Library as place is not important for faculty, and given those low expectations, the library is performing adequately.
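A small sketch of how the minimum/perceived/desired scores behind such a chart can be turned into "red zone" flags. The question wording and numbers are illustrative, not actual UVa results; the adequacy and superiority gaps are the standard LibQUAL+ calculations.

```python
# Sketch: flagging LibQUAL+ "red zones" (perceived service below minimum expectation).
libqual_scores = {
    # question: (minimum, perceived, desired) on the 1-9 LibQUAL+ scale
    "Print and/or electronic journal collections I require": (7.1, 6.8, 8.3),
    "Readiness to respond to users' questions":              (6.5, 7.4, 7.9),
}

for question, (minimum, perceived, desired) in libqual_scores.items():
    adequacy_gap = perceived - minimum      # negative -> red zone
    superiority_gap = perceived - desired   # how far short of the ideal
    flag = "RED ZONE" if adequacy_gap < 0 else "ok"
    print(f"{question}: adequacy {adequacy_gap:+.1f}, "
          f"superiority {superiority_gap:+.1f} [{flag}]")
```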
47
LibQUAL Follow Up on Journals
Examining the comments
Drilling into the data
Talking to faculty and grad students
Corroborating with other data
Comparing with other libraries
48
2006 LibQUAL+™ Results: UVa and ARL Overall Satisfaction

            Undergrad       Grad        Faculty
UVa         7.52            7.48        7.87
ARL range   6.61 to 7.63    6.51 to     5.87 to
ARL mean    7.18            7.16        7.24
49
The Balanced Scorecard: Managing and Assessing Data
The Balanced Scorecard is a layered and categorized instrument that:
  Identifies the important statistics
  Ensures a proper balance
  Organizes multiple statistics into an intelligible framework

Much of our statistical and assessment data is organized using a tool called the Balanced Scorecard.
50
Metrics
Specific targets indicating full success, partial success, and failure
At the end of the year we know if we have met our target for each metric
The metric may be a complex measure encompassing several elements

Each measure, or metric, has a set of two specific targets, so that at the end of the year we know if we have been successful in these areas.
51
What Do We Measure?
Customer survey ratings
Staff survey ratings
Timeliness and cost of service
Usability testing of web resources
Success in fundraising
Comparisons with peers

Here are some examples: the types of metrics we ended up with.
52
Metric U.1.A: Overall Rating in Student and Faculty Surveys.
Target1: A score of at least 4.00 (out of 5.00) from each of the major constituencies.
Target2: A score of at least 3.90 from each of the major constituencies.
FY05 Result: Target1 met. Undergraduates 4.08; Graduate Students 4.13.

Now I will tell you about a few specific metrics: our experience over the past three years.
53
Metric U.4.B: Turnaround time for user requests
Target1: 75% of user requests for new books should be filled within 7 days.
Target2: 50% of user requests for new books should be filled within 7 days.
FY06 Result: Target1 met. 79% filled within 7 days.
54
Metric U.3.A: Circulation of New Monographs
Target1: 60% of newly cataloged monographs should circulate within two years.
Target2: 50% of new monographs should circulate within two years.
FY06 Result: Target1 met. 61% circulated.
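The target logic behind these metrics is simple enough to express in a few lines. Below is a hypothetical sketch, not the Library's actual scorecard system, that evaluates a year's result against the two targets, using metric U.4.B from the earlier slide as the example.

```python
# Sketch: evaluating a Balanced Scorecard metric against its two targets.
# Target1 marks full success, Target2 partial success; below both is failure.
from dataclasses import dataclass


@dataclass
class Metric:
    name: str
    target1: float   # full-success threshold
    target2: float   # partial-success threshold

    def evaluate(self, result: float) -> str:
        if result >= self.target1:
            return "Target1 met (full success)"
        if result >= self.target2:
            return "Target2 met (partial success)"
        return "Targets not met"


# Example drawn from metric U.4.B above (% of new-book requests filled within 7 days).
u4b = Metric("U.4.B: requests for new books filled within 7 days (%)", 75, 50)
print(u4b.evaluate(79))   # -> Target1 met (full success)
```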
55
Using Data for Results at UVa
Additional resources for the science libraries (1994+)
Redefinition of collection development (1996)
Initiative to improve shelving (1999)
Undergraduate library open 24 hours (2000)
Additional resources for the Fine Arts Library (2000)
Support for transition from print to e-journals (2004)
New and improved study space ( )

At U.Va. the surveys have offered support for initiatives. They have highlighted services and resources that need improvement. And they have illuminated the concerns of certain specific areas and constituencies.
56
In Conclusion: Assessment is not…
Free and easy
A one-time effort
A complete diagnosis
A roadmap to the future

Assessment costs money and time, and not just our time; it often costs our users' time as well. If we do a survey or a focus group, we are asking people to spend their time to help us. It is important to remember that: we should not ask our patrons to help us unless the results will be worth their investment of time. We should call on them sparingly. An assessment procedure may identify problems, but it will probably not tell us how to fix them. Typically a more in-depth study is required to solve a problem. Because assessment focuses on the present and the past, it generally does not tell us what we should be doing in the future. Typically the creative, innovative ideas, the revolutionary changes, do not come from assessment.
57
Assessment is…
A way to improve
An opportunity to know our customers
A chance to tell our own story
A positive experience
58
Moving Forward
Keep expectations reasonable and achievable
Strive for accuracy and honesty, not perfection
Assess what is important
Use the data to improve
Keep everyone involved and informed
Focus on the customer
59
For more information…
Steve Hiller
Jim Self
ARL Assessment Service