Effective, Sustainable & Practical Assessment. Steve Hiller, Director, Assessment and Planning, UW; Martha Kyrillidou, Statistics and Service Quality Programs, ARL; Jim Self, Director, Management and Information Services, UVA. Boston University, September 8, 2008

Monday Morning Metrics with Steve: Baseball. Presentation #535, 8 September 2008. Steve Hiller, UW Libraries

Case Study: The Worst Trade Ever Made by the Seattle Mariners – when, who, and why it was the worst. Evaluating performance; defining success and value through statistics and "intangibles."

Library Assessment: More than Numbers. Library assessment is a structured process: to learn about our communities, to respond to the needs of our users, to improve our programs and services, and to support the goals of the communities.

Why Assess? Accountability and justification Improvement of services Comparisons with others Identification of changing patterns Marketing and promotion Opportunity to tell our own story Using data, not assumptions, to make decisions –Assumicide!

The Challenge for Libraries Traditional statistics are no longer sufficient –Emphasize inputs – how big and how many –Do not tell the library’s story –May not align with organizational goals and plans –Do not measure service quality Need measurements from the user’s perspective Need the organizational culture and the skills to answer a basic question: What difference do we make to our communities?

"If you think you're too small to have an impact, try going to bed with a mosquito." – Anita Roddick

Charting User Change: a wide array of user studies is now available. Findings on user behavior: students start with Google; format agnostic; seek convenience; born digital. User expectations: customer service (qualified & helpful staff); library as a place, symbol, refuge; self-sufficiency & control of the information-seeking process; ready access to a wide range of content (e.g., complete runs of journals).

Re-conceptualizing Library Facilities: the changing nature of library usage. Library as physical place, intellectual space, and community center; showcase for recruitment. Re-configuring library facilities: learning commons, collaborative study, social and intellectual center, secondary storage.

ARL-Sponsored Assessment Tools – StatsQUAL: ARL Statistics™ (descriptive, longitudinal, comparative); LibQUAL+®, ClimateQUAL™, DigiQUAL™; MINES for Libraries®, E-metrics; Google Analytics. Building a Community of Practice: Library Assessment Conferences, Service Quality Evaluation Academy (training events), Library Assessment blog. Individual Library Consultation (Jim and Steve): Making Library Assessment Work (24 libraries in ); Effective, Sustainable, Practical Library Assessment (14 in ).

ARL Tools for Library Assessment, resulting from the work of the New Measures and Assessment Initiative (1999): ARL Statistics™ (since ); LibQUAL+® (since 2000); MINES for Libraries™ (since 2003); DigiQUAL® (since 2003); ClimateQUAL™ (since 2007).

Survey Structure (Detail View)

"22 Items and The Box…" – Why the Box is so important: About 40% of participants provide open-ended comments, and these are linked to demographics and quantitative data. Users elaborate on the details of their concerns. Users feel the need to be constructive in their criticisms and offer specific suggestions for action.

MINES for Libraries™ – MINES is a transaction-based research methodology consisting of a web-based survey form and a random moments sampling plan. MINES typically measures who is using electronic resources, where users are located at the time of use, and their purpose of use, in the least obtrusive way. MINES was adopted by the Association of Research Libraries (ARL) as part of the "New Measures" toolkit in May. MINES is different from other electronic resource usage measures that quantify total usage (e.g., Project COUNTER, E-Metrics) or measure how well a library makes electronic resources accessible (LibQUAL+™).
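
MINES itself is administered through ARL; the sketch below is only a rough, hypothetical illustration of the random-moments sampling idea (all function names and parameters are invented, not part of MINES): survey moments are drawn at random across the measurement period, and only e-resource accesses that fall inside those windows would trigger the web survey.

```python
# Rough, hypothetical sketch of a random-moments sampling plan of the kind
# MINES describes. Illustrative only; not ARL's implementation.
import random
from datetime import datetime, timedelta

def draw_survey_moments(start, days=30, moments=24, seed=42):
    """Return sorted, randomly chosen one-hour survey windows within the period."""
    rng = random.Random(seed)
    chosen = rng.sample(range(days * 24), moments)   # pick hours at random
    return sorted(start + timedelta(hours=h) for h in chosen)

def in_survey_moment(access_time, moments, window_hours=1):
    """True if an e-resource access falls inside any survey window."""
    return any(m <= access_time < m + timedelta(hours=window_hours)
               for m in moments)

moments = draw_survey_moments(datetime(2008, 9, 1))
print(in_survey_moment(datetime(2008, 9, 10, 14, 30), moments))
```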

ESP (Effective, Sustainable, Practical assessment) Insights: Strong interest in using assessment to improve. Uncertainty about how to establish and sustain assessment. Lack of assessment knowledge among staff. More data collection than data utilization. Effectiveness not dependent on library size or budget. Each library has a unique culture and mission.

Effective Assessment Focuses on the customer Is aligned with library and university goals Assesses what is important Is outcomes oriented Develops criteria for success Uses appropriate and multiple assessment methods Uses corroboration from other sources Provides results that can be used

Sustainable Assessment Needs: Organizational leadership Sufficient resources Supportive organizational culture Identifiable organizational responsibility Connection to strategic planning and priorities Iterative process of data collection, analysis, and use Involvement of customers, staff and stakeholders

Practical Assessment Keep it simple and focused – “less is more” Know when enough is enough Use assessment that adds value for customers Present results that are understandable Organize to act on results

The University of Virginia 14,000 undergraduates –66% in-state, 34% out –Most notable for liberal arts –Highly ranked by U.S. News 6,000 graduate students –Prominent for humanities, law, business –Plans expansion in sciences Located in Charlottesville –Metro population of 160,000

Collecting the Data at the U.Va. Library Customer Surveys Staff Surveys Mining Existing Records Comparisons with peers Qualitative techniques Long involvement with ARL statistics

Management Information Services MIS committee formed in 1992 Evolved into a department Currently three staff Coordinates collection of statistics Publishes annual statistical report Coordinates assessment Resource for management and staff

“…but to suppose that the facts, once established in all their fullness, will ‘speak for themselves’ is an illusion.” Carl Becker Annual Address of the President of the American Historical Association, 1931

UVa Customer Surveys: Faculty – 1993, 1996, 2000, 2004 – response rates 59% to 70%. Students – 1994, 1998, 2001, 2005 – separate analysis for grads and undergrads – response rates 43% to 63%. LibQUAL+ – response rates 14% to 24%. Annual Surveys 2008 – student samples, one third of faculty – response rates 29% to 47%.

Corroboration: Data are more credible if they are supported by other information. John le Carré's two proofs.

Analyzing U.Va. Survey Results Two Scores for Resources, Services, Facilities –Satisfaction = Mean Rating (1 to 5) –Visibility = Percentage Answering the Question Permits comparison over time and among groups Identifies areas that need more attention
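
As a concrete illustration of this two-score approach, here is a minimal sketch in Python (the data layout and function name are hypothetical, not the U.Va. Library's actual analysis code): satisfaction is the mean rating among respondents who answered an item, and visibility is the percentage of all respondents who answered it.

```python
# Minimal sketch of the two-score analysis described above.
# Assumes hypothetical per-respondent data where None means the respondent
# skipped the item; names and structure are illustrative only.

def item_scores(ratings):
    """Return (satisfaction, visibility) for one survey item.

    ratings: list of 1-5 ratings, or None for respondents who skipped it.
    satisfaction = mean rating among those who answered (1 to 5).
    visibility   = percentage of all respondents who answered the item.
    """
    answered = [r for r in ratings if r is not None]
    if not answered:
        return None, 0.0
    satisfaction = sum(answered) / len(answered)
    visibility = 100.0 * len(answered) / len(ratings)
    return satisfaction, visibility

# Example: 6 respondents, 2 skipped a "reference desk" item.
reference_ratings = [5, 4, None, 3, 5, None]
sat, vis = item_scores(reference_ratings)
print(f"Satisfaction: {sat:.2f} / 5, Visibility: {vis:.0f}%")
# -> Satisfaction: 4.25 / 5, Visibility: 67%
```

Tracking both numbers per item is what permits the comparisons over time and among groups mentioned above: a high rating with low visibility tells a different story than a high rating nearly everyone gave.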

U.Va. Reference Activity and Reference Visibility in Student Surveys

Data Mining Acquisitions Circulation Finance University Records

Investment and Customer Activity University of Virginia Library

The Balanced Scorecard at the U.Va. Library Implemented in 2001 Results tallied FY02 through FY07 Tallying results for FY08 Completing metrics for FY09 Builds upon a rich history of collecting data A work in progress

The Balanced Scorecard Managing and Assessing Data The Balanced Scorecard is a layered and categorized instrument that –Identifies the important statistics –Ensures a proper balance –Organizes multiple statistics into an intelligible framework

Metrics Specific targets indicating full success, partial success, and failure At the end of the year we know if we have met our target for each metric The metric may be a complex measure encompassing several elements
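
To make the target mechanism concrete, here is a hedged sketch (names and structure are invented for illustration, not the Library's scorecard system) that scores a metric result against two targets, full success and partial success, using thresholds that mirror the turnaround-time metric shown below.

```python
# Hedged sketch of scoring a Balanced Scorecard metric with two targets;
# names and structure are illustrative, not U.Va.'s actual system.
from dataclasses import dataclass

@dataclass
class Metric:
    name: str
    target1: float   # threshold for full success
    target2: float   # threshold for partial success

    def score(self, result: float) -> str:
        if result >= self.target1:
            return "Target 1 met (full success)"
        if result >= self.target2:
            return "Target 2 met (partial success)"
        return "Targets not met"

# Example mirroring Metric U.4.B below: % of new-book requests filled
# within 7 days, with Target 1 = 75% and Target 2 = 50%.
turnaround = Metric("U.4.B turnaround time", target1=75.0, target2=50.0)
print(turnaround.score(77.0))   # -> Target 1 met (full success)
```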

What Do We Measure? Customer survey ratings Staff survey ratings Timeliness and cost of service Usability testing of web resources Success in fundraising Comparisons with peers

Metric L.2.B: Retention Rate of Employees Target1: Retain 95% of employees. Target2: Retain 90% of employees. Result FY08: Target1. –95% of employees retained.

Metric U.4.B: Turnaround time for user requests Target1: 75% of user requests for new books should be filled within 7 days. Target2: 50% of user requests for new books should be filled within 7 days. Result FY07: Target1. –77% filled within 7 days.

Metric U.3.A: Circulation of New Monographs Target1: 60% of newly cataloged monographs should circulate within two years. Target2: 50% of new monographs should circulate within two years. Result FY07: Target1. –63% circulated.

Balanced Scorecard U.Va. Library FY2007

Using Data for Results at UVa Additional resources for the science libraries (1994+) Redefinition of collection development (1996) Initiative to improve shelving (1999) Undergraduate library open 24 hours (2000) Additional resources for the Fine Arts Library (2000) Support for transition from print to e-journals (2004) New and Improved Study Space ( ) Increased appreciation of the role of journals (2007)

University of Washington: Located in Seattle; metro population 3.2 million. Comprehensive public research university – 27,000 undergraduate students; 12,000 graduate and professional students (80 doctoral programs); 4,000 research and teaching faculty. $800 million yearly in U.S. research funds (#2). Large research library system – $40 million annual budget; 150 librarians on 3 campuses.

The Basic Question How Does the Library Contribute to the Success of our Researchers and Students? Our assessment priorities: Information seeking behavior and use Patterns of library use User needs Library contribution to learning and research User satisfaction with services, collections, overall What have we learned (short version): Faculty perceive success through collections Grad students through timely access to resources and services Undergrads through library as place for work and community

University of Washington Libraries Assessment Methods Used Large scale user surveys every 3 years (“triennial survey”): 1992, 1995, 1998, 2001, 2004, 2007 In-library use surveys every 3 years beginning 1993 –4000 surveys returned in 2008 Focus groups/Interviews (annually since 1998) Observation (guided and non-obtrusive) Usability Usage statistics/data mining Information about assessment program available at:

UW Triennial Library Survey: Number of Respondents and Response Rate by group (faculty, graduate students, undergraduates) across the survey years.

What Did You Do in the Library Today? (In-Library Use Surveys 2008/2005)

Activity in the Library by Group 2008 Users: 73% UG, 22% Grad, 5% Faculty

Usefulness of New and/or Expanded Services for Undergrads : Library as Place (2007 Triennial Survey)

Library As Place: Using the Results. Libraries are student places – 350-computer lab installed in Undergrad Library, Autumn 1998; hours extended to 24/5.5 in Undergrad Library, 2002; collection footprint reduced; diversified user spaces provided (group, quiet, presentation); student advisory committee provides ongoing feedback; other collaborative student support services added into the library. Upgrade/renovate facilities to meet student needs – furniture that encourages collaboration, more electrical outlets, better lighting and noise control. Plan for major renovation of Undergraduate Library.

What Faculty/Grad Students Told Us – Bioscience Interviews/Focus Groups (2006). Content is the primary link to the library – they identify the library with e-journals and want more titles & backfiles. Provide library-related services and resources in our space, not yours – discovery begins primarily outside of library space, with Google and PubMed; Web of Science is also important; library services/tools are seen as overly complex and fragmented. Print is dead, really dead – if not online, they want digital delivery; too many libraries; they go to the physical library only as a last resort. Uneven awareness of library resources and services.

Sources Consulted for Information on Research Topics (scale of 1 "Not at All" to 5 "Usually")

Off-Campus Remote Use (percentage using library services/collections at least 2x per week): 76% of faculty and 72% of graduate students use library services online at least 2x per week.

Primary Reasons for Faculty Use of Libraries' Web Sites, 2007 (at least 2x per week)

E-Journal Usage at UW – Scholarly Stats 2007: 5 million article requests from 19 vendors/platforms; 7 accounted for nearly 75%:
HighWire Press – 1,200,000 – Biomedical
Science Direct – 975,000 – Science-Medical
JSTOR – 750,000 – Multidisciplinary
Nature – 300,000 – Science-Medical
MetaPress – 250,000 – Science-Medical
Blackwell Synergy – 225,000 – Science-Medical
Ovid – 225,000 – Biomedical
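
As a quick illustration of how such a share can be derived from vendor usage reports, the sketch below simply sums the rounded figures from this slide (illustrative only, not the Libraries' reporting code); with these rounded numbers the top seven platforms come to about 78% of the stated 5 million requests, in the neighborhood of the "nearly 75%" on the slide.

```python
# Minimal sketch: share of total article requests accounted for by the
# top platforms, using the rounded 2007 figures from the slide above.
requests = {
    "HighWire Press": 1_200_000,
    "Science Direct": 975_000,
    "JSTOR": 750_000,
    "Nature": 300_000,
    "MetaPress": 250_000,
    "Blackwell Synergy": 225_000,
    "Ovid": 225_000,
}
total_all_platforms = 5_000_000   # reported total across 19 platforms
top7 = sum(requests.values())
print(f"Top 7 platforms: {top7:,} requests "
      f"({100 * top7 / total_all_platforms:.0f}% of the total)")
# -> Top 7 platforms: 3,925,000 requests (78% of the total)
```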

Importance of Books, Journals, Databases by Academic Area (2007, Faculty; scale of 1 "not important" to 5 "very important")

Bibliographic Database Use: Login Sessions and Change by Database (Art Abstracts, Geobase, MLA, ERIC, Expanded Academic, Web of Science)

Librarian Liaison Satisfaction & Visibility By Selected School (2007 Triennial Survey; Satisfaction on 1 to 5 scale; visibility % who rated)

Usefulness of New/Expanded Services for Faculty & Grads: Integrate into my space

Integrate Library Services & Resources into User Workflows: Follow-Up Actions WorldCat Local became primary catalog+ in 2007 “Scan on demand” pilot began 2008 Increase chat and remote services staffing Increase ILL staffing (due to WorldCat Local) Integrate course reserves & services into “My UW” portal Redesign UW Libraries Homepage Use qualitative methods to gain deeper understanding of user work and behavior Strengthen librarian liaison efforts to academic programs

How UW Libraries Has Used Assessment Extend hours in Undergraduate Library (24/5.5) Create more diversified student learning spaces Enhance usability of discovery tools and website Provide standardized service training for all staff Review and restructure librarian liaison program Consolidate and merge branch libraries Change/reallocate collections budget Change/reallocate staffing Support budget requests to University

Overall Satisfaction by Group. "You guys and gals rock!!!!!! We need to invest in our library system to keep it the best system in America. The tops! My reputation is in large part due to you." – Professor, Forest Resources

Four Useful Assessment Assumptions Your problem/issue is not as unique as you think You have more data/information than you think You need less data/information than you think There are useful methods that are much simpler than you think Adapted from Douglas Hubbard, “How to Measure Anything” (2007)

In conclusion: Assessment is not… Free and easy A one-time effort A complete diagnosis A roadmap to the future

Assessment is… A way to improve An opportunity to know our customers A chance to tell our own story A positive experience

Moving Forward Keep expectations reasonable and achievable Strive for accuracy and honesty—not perfection Assess what is important Use the data to improve Keep everyone involved and informed Focus on the customer

For more information… Steve Hiller Jim Self – – ARL Assessment Service