ACRL Academic Library Trends and Statistics Editorial Board June 27, 2015.

Presentation transcript:

ACRL Academic Library Trends and Statistics Editorial Board June 27, 2015

 Update on the IPEDS Academic Library component
◦ Bob Dugan, University of West Florida
 2015 ACRL survey and instrument
◦ Terri Fishel, Macalester College
 Conversation with the audience about data collection
◦ Mark McCallon, Abilene Christian University
We are all members of the ACRL Academic Library Trends and Statistics Editorial Board.

 first data collection completed in April 2015
 summary of challenges academic libraries encountered with the IPEDS survey:
◦ IPEDS AL component definitions
◦ HR component

 e-books and e-media: counted simultaneous use; excluded e-content from databases
 e-government documents: excluded from counts
 microforms, maps, and non-print: excluded from counts
 circulation: definition restricted essentially to transactions over the circulation desk; included non-returnable ILL, renewals, and reserves

 circulation: no identified means to count/report e-resources such as e-books
 expenditures for salaries and wages: included staff salaries from any identified institutional source, but excluded student assistants if not paid from the library's budget

 libraries are accustomed to reporting FTE counts by library staff classification
 IPEDS counts headcounts in the Human Resources component; uses the BLS SOC staff classification, which does not align with past library surveys; collects gender and ethnicity
 institutional HR departments often did not consult with the library to align data reported for staffing types; headcounts hinder staff benchmarking based upon FTE and typical staff classifications
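To make the headcount/FTE mismatch above concrete, here is a minimal Python sketch, assuming a hypothetical staff list with appointment fractions; all values and classification labels are invented for illustration.

# Hypothetical staff list: headcount vs. FTE (all values invented).
staff = [
    {"classification": "Librarian", "fte": 1.0},
    {"classification": "Librarian", "fte": 0.5},
    {"classification": "Support staff", "fte": 0.75},
    {"classification": "Student assistant", "fte": 0.25},
]
headcount = len(staff)                    # what the IPEDS HR component reports: 4
total_fte = sum(p["fte"] for p in staff)  # what past library surveys reported: 2.5
# FTE by the library's own classifications, which the BLS SOC
# categories used by IPEDS do not map onto cleanly.
fte_by_class = {}
for p in staff:
    fte_by_class[p["classification"]] = fte_by_class.get(p["classification"], 0.0) + p["fte"]
print(headcount, total_fte, fte_by_class)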

 an ACRL, ALA, and ARL Joint Task Force convened in May 2015 to address the issues
◦ ARL and ACRL had five members each, plus the ALA Office of Research & Statistics and the IPEDS AL component survey director
◦ conference calls and a face-to-face meeting in DC on June 19
 the Task Force developed several recommendations for review here at ALA

Library Collections
 count titles rather than simultaneous use
 count titles accessible via the library catalog or discovery system for:
◦ e-books
◦ open access resources
◦ government documents
◦ maps, microforms, and media
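As an illustration of the slide above ("count titles rather than simultaneous use"), a minimal Python sketch, assuming a hypothetical e-book export where the same title appears on one row per platform or license; the field names and figures are invented.

# Hypothetical e-book export: the same title can appear on several rows.
records = [
    {"isbn": "9780000000019", "title": "Intro to Assessment", "seats": 3},
    {"isbn": "9780000000019", "title": "Intro to Assessment", "seats": 3},
    {"isbn": "9780000000026", "title": "Library Metrics", "seats": 1},
]
seat_count = sum(r["seats"] for r in records)    # simultaneous-use counting: 7
title_count = len({r["isbn"] for r in records})  # recommended title counting: 2
print(seat_count, title_count)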

Library Services
 count only initial circulation
◦ exclude renewals and reserves
 count e-book and e-media usage
◦ downloads and views
 use COUNTER metrics if you have them
◦ optional; not required
Library Expenses
 salaries and wages from all institutional sources (may not be acceptable because other IPEDS components capture these expenses)
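A minimal sketch of the recommended circulation definition above, assuming a hypothetical ILS transaction export with a "type" column (column name invented); COUNTER-based e-book/e-media usage would come from vendor reports instead.

import csv

def initial_circulation(path):
    # Count initial checkouts only; renewals and reserve loans are
    # excluded under the Task Force's recommended definition.
    with open(path, newline="") as f:
        return sum(1 for row in csv.DictReader(f) if row["type"] == "checkout")

# Example (hypothetical file): initial_circulation("transactions.csv")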

 change for the next data collection cycle for the IC Header (starts on August 5, 2015)
◦ no longer have to state how much the library expended as a screening question
 we should know by August 2015 which recommendations IPEDS has accepted from the Joint Task Force, if any
 IPEDS spring data collection to start December 9, 2015

 Why the ACRL Survey?
◦ Comparison data
◦ Benchmarking
◦ Identifying trends
◦ If you are not an ARL library, this is your only source
 IPEDS vs. ACRL Survey
◦ Why both?

 IPEDS
◦ institutional survey, not library focused
◦ certain data elements are no longer collected
◦ staffing data are collected from HR, not the library
◦ any changes require approval by OMB; minimum of two years
 ACRL
◦ Flexible, responsive, and adaptive to changing needs in the 21st century
◦ Utilizes NISO standards for many of our definitions; also the ALS (the former NCES Academic Libraries Survey)
◦ Able to release results in less time
◦ We collect the IPEDS data elements and can generate a report for you to share with your IPEDS data keyholder, so you fill out just ONE survey

 Staffing – distinguish from IPEDS in order to collect more specific staffing data
 Salary data – IPEDS limits salary data to just the library budget
◦ IPEDS is institutional; ACRLMetrics is library specific
 New as well as traditional services – services are also a big piece missing from IPEDS
◦ Institutional repositories
◦ Reference consultations
 Utilize new tools made available by new systems
◦ Discovery tools
◦ ERMS

 Accountability
 Assessment
 Best practices
 Peer comparison
 Informed decision-making
 Demonstrating value
 Analyzing trends – local and national
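Peer comparison from the slide above typically reduces to simple ratios computed from survey data; a minimal Python sketch with invented figures, using expenditures per student FTE as one common benchmark.

# Invented survey figures for a library and two hypothetical peers.
peers = {
    "Our Library": {"expenditures": 2_400_000, "student_fte": 6_000},
    "Peer A": {"expenditures": 3_100_000, "student_fte": 9_500},
    "Peer B": {"expenditures": 1_800_000, "student_fte": 4_200},
}
for name, d in peers.items():
    print(f"{name}: ${d['expenditures'] / d['student_fte']:,.2f} per student FTE")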

visited 6/2/2015

From ACRLMetrics, Counting Opinions (report run 6/6/2015)

Permission to use granted by Christina Hillman; Lavery Library 6/5/2015

 Contribute for the "common good"
 Low cost for access to the entire dataset available in ACRLMetrics
 Collecting data on 21st-century libraries that will help us track trends
 We need your feedback on what questions should be added, which definitions need improvement, and what trends we should be investigating

 What do you find difficult and challenging when collecting statistics?
 Is there anything that we could do from our end to help make this less burdensome?