VALE REFERENCE SERVICES COMMITTEE January 5th, 2011


Embracing Change at the Reference Desk: Data Collection Methods and Tools in NJ Academic Libraries

VALE MEMBER LIBRARIES: 52 academic institutions (68 physical locations on the map; some institutions operate more than one library) http://valenj.org/members/member-map

ACRL DEFINITION: "A reference transaction is an information contact that involves the knowledge, use, recommendations, interpretation, or instruction in the use of one or more information sources by a member of the library staff…" "…Duration should not be an element in determining whether a transaction is a reference transaction." (ACRL) http://www.ala.org/rusa/resources/guidelines/definitionsreference

TIMELINE
February 14, 2011: Survey launched; mass e-mail to NJ Directors list
February 23rd: Survey re-sent
March 3rd: Individual e-mails sent
Responses received from 37 libraries (including 5 Rutgers libraries)
Of those that responded, two libraries do not collect reference statistics
30 institutions (58% of VALE members); 35 individual libraries (5 Rutgers libraries)
13 public; 12 community college; 10 private

QUESTIONS ASKED
1) What kind of data is collected in your library (libraries)?
2) How do you define different reference transaction categories?
3) Are statistics collected only at the reference desk?
4) Do you collect the data on a daily basis or as a sample (once a week, once a month)?
5) Do you register each question or rather each reference transaction?
6) Do you use an electronic tool to collect the data? If so, what type of software program is used?
7) How is the collected data being used?

Question 1: Types of data being collected
Three main categories with a variety of terms:
Research: reference, extended reference, demonstration &/or instruction, consultation, professional, research levels
Directional: information, ready reference, quick reference, basic, activity
Technology related: computer assistance, technical support
Other types:
Virtual (electronic) reference, including IM, Meebo, Q&A NJ, text-a-librarian, blog visits, chat, e-mail
Phone
Assignments
Government documents

Question 1 cont.:
Research/Reference - 34 (1 library did not identify this category)
Directional - 31 (5 libraries either do not collect directional encounters or did not specify this category)
Technology - 15
Email/Virtual Ref. - 13
Phone - 12
Gloucester: Directional and research transactions; "within these two categories we further subdivide by in-person, phone, or email. Under this division, we further separate by directional/computer-related and directional/non-computer; research questions we also separate by computer-related and non-computer/print."
Ramapo: Quick Reference questions (less than 10 minutes); In-Depth reference questions (more than 10 minutes); Directional/Technical questions (e.g. use of photocopiers); Reference questions through telephone; Directional/Technical through telephone; Meebo chat reference
Raritan Valley: Three tiers of questions: Basic/Directional; Reference/Demonstration; Research/Instruction. "We also track online questions (chat, email) and phone questions. These questions are recorded twice - once for the transaction (phone or online) and once in the appropriate tier (Basic, Reference, Research)."
TCNJ: Directional, professional, and extended professional questions, each category subdivided by phone, email/chat, or in-person.
William Paterson: Data is collected based on time of day, including walk-in, telephone and chat. "We also collect data based on individual appointments and other non-ref-desk encounters, including email."

OBSERVATIONS: MULTIPLE MODELS IN DATA COLLECTION
Main categories versus tiers or levels (level 1, 2, etc.)
Main categories and subcategories (general ref. vs. extended ref., "2 types of reference")
General categories vs. specific categories (i.e. research assistance and consultation)
Virtual (phone) versus on-site reference
General categories divided into time periods (under 15 minutes, over 20 minutes, etc.)
Outliers: "…data (what type of data?) is collected based on time of day including walk-in, telephone and chat."

Question 2: Definition of Reference Transaction (Category)
Research: more than 5 min.; Extended time: over 15 min.
"Reference transaction - questions that involve the use, recommendation, and interpretation of library resources"
Research means finding data, helping with search skill sets, search criteria - no time limit
"use, recommendation, interpretation, or instruction of one or more information sources"
Research levels from 1 to 4
"information contact that involves the use of one or more information sources"
Consultation = reference question: skill based, strategy based or non-resource
"we try to follow the definitions used in the ACRL statistics gathering to differentiate directional from professional."
Some of the definitions use slight variations on the ACRL definition, which I don't see as a big problem, but some of them use duration or level of difficulty while others do not. For data comparison purposes, however, you can always group more finely-grained data back into larger sets. For example, I could group Rowan's Level 2, 3, and 4 questions into Research, and put our Level 1, Directions and a few others into Directional. So the main issue is that you would have to pre-process some of the data in order to make cross-library comparisons.
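The grouping idea described above (collapsing Rowan's four levels back into Research and Directional for cross-library comparison) can be sketched in a few lines. This is an illustrative sketch only; the category labels and counts are hypothetical, not taken from the survey data.

```python
from collections import Counter

# Hypothetical mapping from one library's local labels to the
# shared coarse buckets used for cross-library comparison.
CATEGORY_MAP = {
    "Level 1": "Directional",
    "Directions": "Directional",
    "Level 2": "Research",
    "Level 3": "Research",
    "Level 4": "Research",
}

def normalize(counts):
    """Collapse local category counts into the shared buckets;
    unmapped labels fall into 'Other'."""
    normalized = Counter()
    for label, n in counts.items():
        normalized[CATEGORY_MAP.get(label, "Other")] += n
    return normalized

# Example with made-up counts for one sample week:
normalize({"Level 1": 40, "Level 2": 25, "Level 3": 10, "Level 4": 2})
```

Pre-processing each library's data through a mapping like this is what makes the finer-grained schemes comparable with the simpler two-category ones.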

Question 3: Place(s) where data is collected

Question 3: Locations - issues
Reference desk was omitted in several responses: "reference service (email and chat) are collected separately," or "Off desk/offices emails, phones, consultations"
Reference encounters (roving?, appointments?)
Phone/e-mail separately at the Ref. Desk
Ref. desk includes in person, phone and e-mail; chat separately
LibGuides and database/e-book use separately, but: Ref. desk, e-mails, chat reference, library blog, phone…
LibGuides/database usage/e-book usage - probably not a complete count; not everyone considered it a reference transaction (is this a reference transaction statistic?)

Question 4: Frequency of data collection
SAMPLE (7 libraries):
Three weeks during semester
3-week period during the Fall semester
Once a week
3 sample weeks spaced out during each semester
2-3 times a semester
Sample week once per year
Four weeks a year
MONTHLY (1 library) - cumulative
DAILY (27 libraries - 77%):
Daily in three intervals
Hourly / Daily
Collected daily, reported by sampling
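Libraries that sample (for example, three sample weeks per semester) typically scale those counts up when reporting. A minimal sketch of that extrapolation, with hypothetical numbers and an assumed 15-week semester:

```python
def estimate_semester_total(sample_week_counts, weeks_in_semester=15):
    """Scale the average weekly transaction count from the sample
    weeks to an estimate for the full semester."""
    avg_per_week = sum(sample_week_counts) / len(sample_week_counts)
    return round(avg_per_week * weeks_in_semester)

# Three hypothetical sample weeks of reference transactions:
estimate_semester_total([120, 95, 110])
```

The trade-off is the usual one: daily collection (the choice of 77% of respondents) avoids this estimation step entirely, while sampling reduces the recording burden at the desk.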

Question 5: Transaction vs. Question
Transactions, BUT (3 out of 23 libraries are not clear):
"If it is a reference transaction then it is recorded as a reference transaction; should additional assistance for another service be needed, that too is recorded under the appropriate topic."
"If the students work on several papers, then questions."
"Each transaction, but it's usually at the discretion of the reference librarian how to record it."

Question 6: Electronic tool to collect data
ELECTRONIC TOOLS IN USE:
Google Tools / Google Docs
In-house Access database
Excel & "spreadsheet" - 9
NONE, but:
"looking into developing home-grown program"
LibAnswers soon
"our data librarian has developed a simple program that can be used at the Reference Desk… which we're looking to implement"
Gimlet trial version
None: 23; Excel: 5; Paper to spreadsheet: 4; Other: 3
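Several respondents described simple homegrown programs for recording transactions at the desk. The sketch below shows the general shape such a tool might take (a one-call logger appending rows to a spreadsheet-readable CSV file); the file name and fields are hypothetical, not from any library's actual system.

```python
import csv
from datetime import datetime
from pathlib import Path

LOG_FILE = Path("ref_stats.csv")  # hypothetical file name
FIELDS = ["timestamp", "category", "mode"]

def log_transaction(category, mode, path=LOG_FILE):
    """Append one reference transaction to a CSV file,
    writing the header row if the file is new."""
    is_new = not path.exists()
    with path.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if is_new:
            writer.writeheader()
        writer.writerow({
            "timestamp": datetime.now().isoformat(timespec="seconds"),
            "category": category,
            "mode": mode,
        })

# One click / one call per transaction at the desk:
log_transaction("Research", "in-person")
```

Because the output is plain CSV, the data opens directly in Excel or Google Docs, which matches how most responding libraries already tally and report.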

Question 7: HOW IS DATA BEING USED

A FEW FORMS/SYSTEMS CURRENTLY IN USE

Question 7: How is this data used?
What kind of:
"Annual reports"
"Reports to various associations"
"We report them to library associations that require them"
"To understand the value of our reference and research services"
"Assessment" (staffing, collection development, instruction?)
VAGUE

WHAT HAVE WE LEARNED FROM THIS QUICK SURVEY?
Results: a snapshot showing how academic libraries define, collect and use reference data
Data is being collected in three main categories with a variety of terms
Definitions of reference transactions vary
The reference desk is still the leading location for collecting data
Daily is the preferred frequency for data collection
Transactions rather than questions
Electronic tools: few, homegrown rather than commercial
Data used for reports, but also for staffing, collection development, assessment
Next survey: clarification of terms necessary!

STEVENS INSTITUTE OF TECHNOLOGY

GEORGIAN COURT

RUTGERS - KILMER

RARITAN VALLEY

DREW UNIVERSITY REFERENCE STATS FORM

Berkeley College Gimlet.us

Gimlet.us

Gimlet.us

COMMERCIAL SOFTWARE ON THE MARKET
LibAnswers: http://www.springshare.com/libanswers/lib.html
Libstats: http://code.google.com/p/libstats/
Desk Tracker: http://www.desktracker.com/
RefTracker: http://www.altarama.com.au/reftrack.htm
Gimlet: http://gimlet.us/

Surveys on Reference Services
Novotny, Eric. "Reference Service Statistics & Assessment." SPEC Kit 268. Washington, DC: Association of Research Libraries, Office of Leadership and Management Services, 2002. ERIC. Web. 4 Jan. 2012.
2009 VALE Reference Services Survey Report, July 2010; prepared by Tony Joachim, edited by Jody Caldwell. http://www.valenj.org/sites/default/files/VALE2009SurveyReport_final_0.pdf

PANEL & DISCUSSION
Panelists:
Denise Brush - Rowan University
Jennifer Heise - Drew University
Pat Dawson - Rider University
Moderator: David McMillan - Caldwell College
THANK YOU
Maria Deptula - Berkeley College