
Effective, Sustainable and Practical Library Assessment Steve Hiller Director, Assessment and Planning University of Washington Libraries ARL Visiting Program Officer York, England 23 June 2008

Building Assessment Capability in Libraries through Consultation Services
Association of Research Libraries (ARL) project "Making Library Assessment Work" began in 2005 to "assess the state of assessment efforts in individual research libraries, identify barriers and facilitators of assessment, and devise pragmatic approaches to assessment that can flourish in different local environments"
Funded by participating libraries
Conducted by Steve Hiller and Jim Self under the aegis of Martha Kyrillidou of ARL
– 1.5-day site visit
– Presentation and best practices
– Interviews and meetings
– Report to the library with recommendations

Key Catalysts for Developing an Assessment Consulting Service
LibQUAL+® results – what to do with them
E-Metrics data – how to understand them
New emphasis on outcomes-based assessment from accreditation agencies and associations
Data-driven university administrations
Article by Jim and Steve, "From Measurement to Management" (Library Trends, Summer 2004), highlighted issues involved with data collection, analysis and use in libraries
Long history of collecting data but little application to management and improvement

The University of Virginia
14,000 undergraduates
– 66% in-state, 34% out-of-state
– Most notable for liberal arts
– Highly ranked by U.S. News
6,000 graduate students
– Prominent in humanities, law, business
– Plans expansion in sciences
Located in Charlottesville
– Metro population of 160,000

University of Washington
Located in Seattle; metro population 3.2 million
Comprehensive public research university
– 27,000 undergraduate students
– 12,000 graduate and professional students (80 doctoral programs)
– 4,000 research and teaching faculty
$800 million annually in federal research funds (#2 in U.S.)
Large research library system
– $40 million annual budget
– 150 librarians on 3 campuses

Steve and Jim in the Air and on the Road
Whilst holding full-time day jobs at their respective institutions:
Visited 24 ARL libraries in the U.S. and Canada
Succeeded by Effective, Sustainable and Practical Library Assessment in 2007 (open to all libraries)
14 libraries participating (3 outside North America)

The Geographic Distribution of Participants
[Map: ARL and non-ARL participant libraries across the USA and Canada, plus Cape Town, Haifa, and York; home institutions marked – University of Washington (Steve) and University of Virginia (Jim)]

Participant Distribution by Rank on the ARL Expenditures-Focused Index (28 Libraries)
[Chart: median rank 43 out of 113 academic libraries]

Library Assessment Process
Focuses on customer needs, defining measurable "outputs" and offering services that meet those needs
Collects, analyzes and uses data for management, program development, and decision-making
Emphasizes ongoing communication with customers, opportunities for collaboration, qualitative measures and a circular process of continuous improvement

MLAW/ESP: Data Collection Methods
Pre-visit
– Survey on assessment activities, needs, etc.
– Telephone follow-up
– Mining library and institutional web pages
Visit (1.5 days)
– Presentation on effective assessment
– Group meetings
Follow-up and report
– Pursue leads and additional information

Pre-Visit Survey
Summary of recent assessment activity
Important motivators/catalysts
Organizational structure for assessment
What has worked well
Problems or sticking points
Specific areas to address
Expectations for this effort
Inventory of statistics (separate survey)

Most Commonly Used Assessment Methods (30 Libraries)

Commonly Identified Assessment Needs (30 Libraries)

What We Found: Organizational Culture and Structure Are Critical to Success
Strong customer focus and leadership support are keys to developing effective and sustainable assessment
Demonstrated interest in using assessment to improve customer service and demonstrate the library's value
Effectiveness of an assessment program is not dependent on library size or budget
Many libraries are uncertain how to establish, maintain, and sustain effective assessment, and need assessment skills
Each library has a unique culture and mission; no "one size fits all" approach works

Using Data in Decision-Making (from Pfeffer and Sutton, 2006)
What makes it hard to be evidence-based?
– There's too much evidence
– There's not enough good evidence
– The evidence doesn't quite apply
– People are trying to mislead you
– You are trying to mislead you
– The side effects outweigh the cure
– Stories are more persuasive anyway

When the Evidence Isn’t Used

Some Reasons Why Libraries Aren't Evidence-Based
Don't know what evidence to collect
– Few libraries understand or are skilled in basic research methods
Don't understand the evidence
– Few library staff have experience in data analysis
Don't know how to present the evidence
– Difficulty in identifying what is important and actionable
Don't want to use the evidence
– "We know what's best for our customers"
Difficulty using the evidence for positive change
– All of the above, plus organizational structure/culture

Organizational Factors That Impede Effective and Sustainable Assessment
Lack of an "institutional" research infrastructure
Emphasis on management and service responsibilities
No assessment advocate within the organization
Library staff lack research methodology skills
Library "culture" is skeptical of data
Librarians have multiple time-consuming responsibilities
Leadership does not view assessment as a priority
Library organizational structure is "silo-based"

Common Cognitive Biases Hypothesized to Occur in Libraries (per Jon Eldridge)
Anchoring
Attribution
Authority
Confirmation
Déformation professionnelle
Groupthink
Halo or Horns Effect
Outcome Bias
Perseverance of Belief
Primacy Effects
Recency Effects
Selective Perception
Storytelling
Wishful Thinking
Worst-Case Scenario

Biases Common to the Libraries We Visit
Déformation professionnelle (professional deformation)
– Viewing a situation through the common perceptions of one's profession rather than taking a broader perspective
Halo or Horns Effect
– Allowing another person's positive or negative characteristics to affect perception of that person in other, unrelated contexts
Perseverance of Belief
– Persisting in believing previously acquired information even after it has been discredited
Wishful Thinking
– Assessing a situation incompletely, according to a desired rather than a likely outcome
Worst-Case Scenario
– Emphasizing or exaggerating possible negative outcomes disproportionately to all possible outcomes

Skeptical Staff
"Oh, people can come up with statistics to prove anything, Kent [Brockman]. 14% of people know that."
"Facts are meaningless. You could use facts to prove anything that's even remotely true!"
– Homer Simpson

Organizational Indicators of Effective Assessment
Library leadership/management "truly" supportive
Customer focus is a shared library value
Organizational culture receptive to change and improvement
Assessment responsibility recognized and supported
Library has a strategic planning process and prioritizes
Evidence/data used to improve services and programs
– Web sites (usability)
– Facilities (qualitative methods)
– Serial subscriptions (e-metrics)
– LibQUAL+™ results are followed up

Evidence of Effective and Sustainable Assessment
Formal assessment program established
Institutional research agenda tied to strategic priorities
Training in research/assessment methodology
Research balanced with timely decision-making
Assessment results presented, understood and acted upon
Results reported back to the customer community
Library demonstrates the value it provides to the community

What Difference Have MLAW/ESP Made?
10 libraries have assessment librarians/coordinators
15 libraries have assessment-related committees
Most libraries have continued with LibQUAL+® on a cyclical basis and undertaken additional assessments
Libraries have become more active in their institutional assessment efforts
Participating libraries are sending 55 staff to the 2008 Library Assessment Conference, with 40% involved in the program

ARL: Building a Community of Practice
Biennial Library Assessment Conference
– 220 registrants for the 2006 conference in Charlottesville, VA
– 380 registrants for August 2008 in Seattle, WA
Workshops
– Biennial Service Quality Evaluation Academy
– Full-day and half-day workshops
Library Assessment SPEC Kit (December 2007)
Assessment tools
– LibQUAL+® ("Millions served")
– MINES for Libraries®
– DigiQUAL® and more