Effective, Sustainable and Practical Library Assessment
Steve Hiller
Director, Assessment and Planning, University of Washington Libraries
ARL Visiting Program Officer
York, England, 23 June 2008
Building Assessment Capability in Libraries through Consultation Services
The Association of Research Libraries (ARL) project “Making Library Assessment Work” began in 2005 to “assess the state of assessment efforts in individual research libraries, identify barriers and facilitators of assessment, and devise pragmatic approaches to assessment that can flourish in different local environments”
–Funded by participating libraries
–Conducted by Steve Hiller and Jim Self under the aegis of Martha Kyrillidou of ARL
–1.5-day site visit
–Presentation on best practices
–Interviews and meetings
–Report to the library with recommendations
Key Catalysts for Developing an Assessment Consulting Service
–LibQUAL+® results – what to do with them (see the gap-score sketch below)
–E-Metrics data – how to understand them
–New emphasis on outcomes-based assessment from accreditation agencies and associations
–Data-driven university administrations
–Article by Jim and Steve, “From Measurement to Management”... (Library Trends, Summer 2004), which highlighted issues involved with data collection, analysis and use in libraries: a long history of collecting data but little application to management and improvement
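On the first catalyst: LibQUAL+® asks respondents to rate each service item three ways on a 1–9 scale (minimum acceptable, desired, and perceived service level), and analysis typically starts from two gap scores derived from those ratings. A minimal Python sketch, using invented ratings rather than any data from this talk:

```python
# Standard LibQUAL+ gap scores, computed from invented example ratings.
# Each response gives (minimum, desired, perceived) on a 1-9 scale.
responses = [
    (5, 8, 6),
    (6, 9, 5),
    (4, 7, 7),
]

n = len(responses)
mean_min = sum(r[0] for r in responses) / n
mean_desired = sum(r[1] for r in responses) / n
mean_perceived = sum(r[2] for r in responses) / n

# Adequacy gap: perceived minus minimum (positive means service clears the floor).
adequacy_gap = mean_perceived - mean_min
# Superiority gap: perceived minus desired (usually negative; near zero is strong).
superiority_gap = mean_perceived - mean_desired

print(f"Adequacy gap:    {adequacy_gap:+.2f}")
print(f"Superiority gap: {superiority_gap:+.2f}")
```

Items with negative adequacy gaps are the ones that typically prompt follow-up action.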
The University of Virginia
14,000 undergraduates
–66% in-state, 34% out-of-state
–Most notable for liberal arts
–Highly ranked by U.S. News
6,000 graduate students
–Prominent for humanities, law, business
–Plans expansion in sciences
Located in Charlottesville
–Metro population of 160,000
University of Washington
Located in Seattle; metro population 3.2 million
Comprehensive public research university
–27,000 undergraduate students
–12,000 graduate and professional students (80 doctoral programs)
–4,000 research and teaching faculty
$800 million annually in federal research funds (#2 in the U.S.)
Large research library system
–$40 million annual budget
–150 librarians on 3 campuses
Steve and Jim in the Air and on the Road
Whilst holding full-time day jobs at their respective institutions:
–Visited 24 ARL libraries in the U.S. and Canada in 2005–2006
–Succeeded by Effective, Sustainable and Practical Library Assessment in 2007 (open to all libraries)
–14 libraries participating in 2007–2008 (3 outside North America)
The Geographic Distribution of Participants
[Map of participating libraries across the USA and Canada, plus Cape Town, Haifa and York; legend distinguishes ARL participants, non-ARL participants, Steve’s home (University of Washington), Jim’s home (University of Virginia), and other ARL libraries]
Participant Distribution by Rank on the ARL Expenditures-Focused Index (28 Libraries)
[Chart; median rank 43 out of 113 academic libraries]
Library Assessment Process
–Focuses on customer needs, defining measurable “outputs” and offering services that meet those needs
–Collects, analyzes and uses data for management, program development, and decision-making
–Emphasizes ongoing communication with customers, opportunities for collaboration, qualitative measures and a circular process of continuous improvement
MLAW/ESP: Data Collection Methods
Pre-visit
–Survey on assessment activities, needs etc.
–Telephone follow-up
–Mining library and institutional web pages
Visit (1.5 days)
–Presentation on effective assessment
–Group meetings
Follow-up and report
–Pursue leads and additional information
Pre-Visit Survey
–Summary of recent assessment activity
–Important motivators/catalysts
–Organizational structure for assessment
–What has worked well
–Problems or sticking points
–Specific areas to address
–Expectations for this effort
–Inventory of statistics (separate survey)
Most Commonly Used Assessment Methods (30 Libraries)
[Chart]
Commonly Identified Assessment Needs (30 Libraries)
[Chart]
What We Found: Organizational Culture and Structure Are Critical to Success
–Strong customer focus and leadership support are keys to developing effective and sustainable assessment
–Demonstrated interest in using assessment to improve customer service and demonstrate the value of the library
–Effectiveness of an assessment program is not dependent on library size or budget
–Many libraries are uncertain how to establish, maintain, and sustain effective assessment; they need assessment skills
–Each library has a unique culture and mission; no “one size fits all” approach works
Using Data in Decision-Making (from Pfeffer and Sutton, 2006)
What makes it hard to be evidence-based?
–There’s too much evidence
–There’s not enough good evidence
–The evidence doesn’t quite apply
–People are trying to mislead you
–You are trying to mislead yourself
–The side effects outweigh the cure
–Stories are more persuasive anyway
When the Evidence Isn’t Used
Some Reasons Why Libraries Aren’t Evidence-Based
Don’t know what evidence to collect
–Few libraries understand or are skilled in basic research methods
Don’t understand the evidence
–Few library staff have experience in data analysis
Don’t know how to present the evidence
–Difficulty in identifying what is important and actionable
Don’t want to use the evidence
–“We know what’s best for our customers”
Difficulty using the evidence for positive change
–All of the above, plus organizational structure/culture
Organizational Factors That Impede Effective and Sustainable Assessment
–Lack of an “institutional” research infrastructure
–Emphasis on management and service responsibilities
–No assessment advocate within the organization
–Library staff lack research methodology skills
–Library “culture” is skeptical of data
–Librarians have multiple time-consuming responsibilities
–Leadership does not view assessment as a priority
–Library organizational structure is “silo-based”
Common Cognitive Biases Hypothesized to Occur in Libraries (per Jonathan Eldredge)
–Anchoring
–Attribution
–Authority
–Confirmation
–Déformation professionnelle
–Groupthink
–Halo or Horns Effect
–Outcome Bias
–Perseverance of Belief
–Primacy Effects
–Recency Effects
–Selective Perception
–Storytelling
–Wishful Thinking
–Worst-Case Scenario
Biases Common to Libraries We Visit
Déformation professionnelle (professional deformation)
–Viewing a situation through the common perceptions of one’s profession rather than taking a broader perspective
Halo or Horns Effect
–Allowing another person’s positive or negative characteristics to affect perception of that person in other, unrelated contexts
Perseverance of Belief
–Persisting in believing previously acquired information even after it has been discredited
Wishful Thinking
–Assessing a situation incompletely, according to a desired rather than a likely outcome
Worst-Case Scenario
–Emphasizing or exaggerating possible negative outcomes disproportionately to all possible outcomes
Skeptical Staff
“Oh, people can come up with statistics to prove anything, Kent [Brockman]. 14% of people know that.”
“Facts are meaningless. You could use facts to prove anything that’s even remotely true!”
–Homer Simpson
Organizational Indicators of Effective Assessment
–Library leadership/management “truly” supportive
–Customer focus is a shared library value
–Organizational culture receptive to change and improvement
–Assessment responsibility recognized and supported
–Library has a strategic planning process and prioritizes
–Evidence/data used to improve services/programs:
–Web sites (usability)
–Facilities (qualitative methods)
–Serial subscriptions (e-metrics; see the sketch below)
–LibQUAL+® results are followed up
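To make the serials e-metrics bullet concrete: a common starting point is annual cost per use (subscription cost divided by full-text downloads). A minimal Python sketch; the titles, costs and download counts are invented for illustration, not data from any visited library:

```python
# Hypothetical serials cost-per-use calculation; all figures invented.
subscriptions = {
    # title: (annual_cost_usd, full_text_downloads_per_year)
    "Journal A": (4800.00, 1200),
    "Journal B": (2500.00, 95),
    "Journal C": (310.00, 40),
}

for title, (cost, downloads) in subscriptions.items():
    # Guard against zero-use titles rather than dividing by zero.
    cost_per_use = cost / downloads if downloads else float("inf")
    print(f"{title}: ${cost_per_use:,.2f} per download")
```

Titles whose cost per use exceeds, say, a typical interlibrary-loan or document-delivery charge become candidates for cancellation review.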
Evidence of Effective and Sustainable Assessment
–Formal assessment program established
–Institutional research agenda tied to strategic priorities
–Training in research/assessment methodology
–Research balanced with timely decision-making
–Assessment results presented, understood and acted upon
–Results reported back to the customer community
–Library demonstrates the value it provides to its community
What Difference Have MLAW/ESP Made?
–10 libraries have assessment librarians/coordinators
–15 libraries have assessment-related committees
–Most libraries have continued with LibQUAL+® on a cyclical basis and undertaken additional assessments
–Libraries have become more active in their institutional assessment efforts
–Participating libraries are sending 55 staff to the 2008 Library Assessment Conference, with 40% involved in the program
ARL: Building a Community of Practice
Biennial Library Assessment Conference
–220 registrants for the 2006 conference in Charlottesville, VA
–380 registrants for August 2008 in Seattle, WA
Workshops
–Biennial Service Quality Evaluation Academy
–Full-day and half-day workshops
Library Assessment SPEC Kit (December 2007)
Assessment tools
–LibQUAL+® (“Millions served”)
–MINES for Libraries®
–DigiQUAL and more