Turning Results into Action: Using Assessment Information to Improve Library Performance OR WHAT DID YOU DO WITH THE DATA? Steve Hiller Stephanie Wright.

Presentation transcript:

Turning Results into Action: Using Assessment Information to Improve Library Performance OR WHAT DID YOU DO WITH THE DATA? Steve Hiller Stephanie Wright University of Washington Libraries

Data Sources For This Study

ARL SPEC Kit 303, Library Assessment (Dec. 2007)
- Survey sent to ARL libraries in May-June; 73 respondents (60%), nearly all academic
- Self-reported information

ARL Consultation Service: Making Library Assessment Work / Effective, Sustainable and Practical Library Assessment
- 35 libraries visited (32 North American, 28 ARL)
- Observed and confirmed information
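As a quick sanity check on the response rate quoted above, the short Python sketch below restates the arithmetic. The respondent count of 73 comes from the later Data Caveats slide; the number of libraries surveyed is an assumption based on ARL's approximate membership at the time, not a figure from these slides.

```python
# Rough arithmetic behind the SPEC Kit 303 survey figures.
# 73 respondents is taken from the "Data Caveats" slide; the number of
# libraries surveyed (123) is an assumed ARL membership figure, not from the slides.
respondents = 73
libraries_surveyed = 123  # assumption

response_rate = respondents / libraries_surveyed
print(f"Response rate: {response_rate:.0%}")  # ~59%, consistent with the reported 60%
```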

Library Assessment SPEC Kit: Survey Questions
- Impetus for assessment
- Assessment methods used
- Organizational structure for assessment
- Distribution/presentation of assessment results
- Using assessment information (up to 3 actions)
- Professional development needs in assessment

SPEC Survey: Impetus for Assessment
- Desire to know more about your customers: 91%
- Investigation of possible new library services/resources: 71%
- Desire to know more about your processes: 65%
- Desire to identify library performance objectives: 62%
- Need to reallocate library resources: 55%
- Accountability requirements from parent institution: 37%
- Institutional or programmatic accreditation process: 29%

Building Assessment Capability in Libraries through Consultation Services
- Association of Research Libraries (ARL) project began in 2005 as Making Library Assessment Work (MLAW)
- Goals: assess the state of assessment efforts in individual research libraries, identify barriers and facilitators of assessment, and devise pragmatic approaches to assessment that can flourish in different local environments
- Funded by participating libraries
- Conducted by Steve Hiller and Jim Self under the aegis of Martha Kyrillidou of ARL
- In 2007 the name changed to Effective, Sustainable and Practical Library Assessment (ESP) and the service was opened to all libraries

MLAW/ESP: Data Collection Methods
- Pre-visit
  - Survey on assessment activities, needs, etc.
  - Telephone follow-up
  - Mining library and institutional web pages
- Visit (1.5 days)
  - Presentation on effective assessment
  - Group meetings and observation/verification
- Follow-up and report
  - Pursue leads and additional information

ESP Self-Identified Assessment Needs (31 North American Libraries)

Data Caveats
- Different methodological techniques used
- Information gathered at different times
- ESP information confirmed on the ground; SPEC responses self-reported
- The libraries differ (21 of 73 SPEC survey respondents also participated in MLAW/ESP)
- Some bias from libraries self-selecting to participate in ESP and respond to the SPEC survey (likely that more assessment is done in these libraries)

Assessment Methods Used (SPEC / ESP)
- Data collection: 100%
- Web usability testing: 80%
- LibQUAL+® survey: 75% / 100%
- Focus groups/interviews: 75% / 80%
- Facilities use studies: 55% / 60%
- Student instruction evaluations: 55% / 75%
- Observation: 50% / 65%
- Benchmarking and process improvement: 50%
- Other locally developed surveys: 50% / 75%
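To make the percentages above more concrete, the illustrative sketch below converts each rate back into an approximate number of libraries. The bases are assumptions drawn from other slides (73 SPEC respondents, 35 ESP site-visit libraries), and only the rows with two clearly separated values are included.

```python
# Illustrative only: approximate library counts behind the reported usage rates.
# Bases are assumptions taken from other slides: SPEC n=73 respondents,
# ESP n=35 libraries visited.
SPEC_N, ESP_N = 73, 35

methods = {
    # method: (SPEC %, ESP %)
    "LibQUAL+ survey": (75, 100),
    "Focus groups/interviews": (75, 80),
    "Facilities use studies": (55, 60),
    "Student instruction evaluations": (55, 75),
    "Observation": (50, 65),
    "Other locally developed surveys": (50, 75),
}

for method, (spec_pct, esp_pct) in methods.items():
    spec_count = round(SPEC_N * spec_pct / 100)
    esp_count = round(ESP_N * esp_pct / 100)
    print(f"{method}: ~{spec_count} SPEC libraries, ~{esp_count} ESP libraries")
```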

ACTION SCOREBOARDS (Data will be in the paper)
Score scale:
- Widespread
- Occasional to general
- Sometimes to occasional
- Seldom
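The scoreboard slides that follow report this qualitative scale rather than raw counts. As one way to read the scale, the hypothetical helper below maps a share of libraries reporting an action to the four labels above; the numeric thresholds are invented for illustration and are not the cut-offs used in the SPEC or ESP data.

```python
def scoreboard_label(share: float) -> str:
    """Map the share of libraries reporting an action (0.0-1.0) to the
    qualitative scale used on the scoreboard slides. The thresholds are
    illustrative assumptions, not values from the original study."""
    if share >= 0.75:
        return "Widespread"
    if share >= 0.50:
        return "Occasional to general"
    if share >= 0.25:
        return "Sometimes to occasional"
    return "Seldom"

# Example: an action reported by 80% of libraries vs. 10% of libraries
print(scoreboard_label(0.80))  # Widespread
print(scoreboard_label(0.10))  # Seldom
```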

Action Scoreboard: Websites (SPEC / ESP)
- Website redesign
- Change content
- Change library catalog display

Action Scoreboard: Facilities (SPEC / ESP)
- Renovate existing space
- Repurpose existing space
- New furniture/equipment
- Environmental (HVAC, lighting)
- Close libraries/service points
- Signage
- Plan new space

Action Scoreboard: Services (SPEC / ESP)
- Hours
- Service desk staffing
- Service quality
- Instruction
- Process improvement

Action Scoreboard: Collection Development and Management (SPEC / ESP)
- Journal review/decisions
- Going electronic
- Weeding, relocation, storage
- Fund allocations
- Scholarly communication

Action Scoreboard: Organization (SPEC / ESP)
- Organizational climate
- Staff training
- Marketing
- Communications (external)
- Collaborations (external)
- Stop doing specific activities

Overall Action Scoreboard (SPEC / ESP)
- Web site
- Facilities
- Collection development
- Reference services
- Access services
- Instructional services
- Hours
- Organizational changes

Using Assessment Data: Actions
- Lots of data collected, but actions generally limited to either low-hanging fruit or one-time changes:
  – Website improvements (usability)
  – Hours (comments, observation)
  – Collection development/management decisions
  – Facilities (observation, qualitative methods)
- More actions taking place than reported (both to SPEC and MLAW/ESP)
- Little evidence of action in:
  – Instruction/learning outcomes
  – Organizational changes

Organizational Factors That Impede Turning Data into Actions
- Lack of a tradition of using data for improvement
- No assessment advocate within the organization
- Library staff lack research methodology skills
- Weak analysis and presentation of data
- Inability to identify actionable data
- Library culture is skeptical of data
- Leadership does not view assessment as a priority or provide resources
- Library organizational structure is silo-based
- Staff do not have sufficient time

Sustainable Assessment and Actions
- Leadership believes in and supports assessment
- Formal assessment program established
- Institutional research agenda tied to strategic priorities
- Staff training in research/assessment methodology
- Staff have time and resources to follow up
- Research balanced with timely decision-making
- Assessment results presented, understood, and acted upon
- Results reported back to the customer community
- Library demonstrates the value it provides to the community