Benchmarking Reference Data Collection


Benchmarking Reference Data Collection
Library Assessment Conference 2018
Results of a National Survey on Reference Transaction Instruments with Recommendations for Effective Practice
Rebecca Eve Graff, Southern Methodist University Libraries, Dallas
Paula R. Dempsey, University of Illinois at Chicago
Adele Dobry, California State University, Los Angeles

Our Survey
- What type of library do you work in?
- Does your library collect data on reference interactions?
- How are data gathered? Please upload your data collection form.
- In the last decade, has your library made substantive changes in what data are recorded?
- What is the most useful data you record?

Comparing Studies

                            SPEC Kit 268 (2002)        RUSA survey (2016)
Population/Sample           ARL members (n=124);       Email recruiting; 232 responses
                            77 responses (62%)         (142 academic, 73 public,
                                                       9 special, 8 other)
Libraries collecting data   96%                        95%
Regular collection          51%                        94%
Method of collection        99% hand tabulated         75% commercial platform
                            25% online data entry      6% hand tabulated
                            4% clicker                 8% online spreadsheet
                            8% other                   11% other

Why Capture Reference Interactions?
- Required by professional association or accreditation agency
- Evidence-based staffing
- Training: What skills are most important?
- Programmatic evaluation
- Demonstrate value to stakeholders

How Is Data Captured?
- Printed form example
- Data capture method
- Commercial apps
- Freeware

Why Survey Now? Collecting Reference Data in an Era of Change
- Transitional time for how we collect data
- Changing reference practices:
  - Fewer, but more complicated questions
  - Collections moving online
  - Lateral, not just linear use of sources
  - Users ask anytime, anywhere
  - Librarians respond from different locations
  - Service points merging
- Technological innovations change what is possible in terms of service

Defining Reference Transactions: RUSA
Information consultations in which library staff recommend, interpret, evaluate, and/or use information resources to help others to meet particular information needs. Reference transactions do not include formal instruction or exchanges that provide assistance with locations, schedules, equipment, supplies, or policy statements.
http://www.ala.org/rusa/guidelines/definitionsreference

Defining Reference Transactions: ARL
Information contact that involves the knowledge, use, recommendations, interpretation, or instruction in the use of one or more information sources by a member of the library staff... Duration should not be an element in determining whether a transaction is a reference transaction... A directional transaction is an information contact that facilitates the logistical use of the library and that does not involve the knowledge, use, recommendations, interpretation, or instruction in the use of any information sources other than those that describe the library, such as schedules, floor plans, and handbooks.
http://www.ala.org/rusa/guidelines/definitionsreference
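The ARL definition amounts to a simple classification rule: a transaction counts as reference only if it involves information sources beyond those that merely describe the library, and duration plays no role. A minimal sketch of that rule, with invented field names chosen purely for illustration:

```python
from dataclasses import dataclass

# Hypothetical encoding of the ARL reference/directional distinction.
# The field names below are illustrative assumptions, not part of the
# ARL definition itself.

@dataclass
class Transaction:
    used_information_sources: bool  # sources beyond schedules, floor plans, handbooks
    duration_minutes: int           # recorded, but deliberately ignored below

def classify(t: Transaction) -> str:
    # Per ARL, duration is not an element in determining whether a
    # transaction is a reference transaction, so only source use matters.
    return "reference" if t.used_information_sources else "directional"
```

Note that a lengthy exchange about floor plans still classifies as directional, while a one-minute database recommendation classifies as reference.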

What Data Do We Gather? Where Were You?
- Is this form labeled or intended for use at one specific reference service point?
- What other labels get used? Circulation, Consolidated Service Desk, Multiple Service Points, Office, Unclear / Other

What Data Do We Gather? Contact Mode
Is contact mode recorded?

What Data Do We Gather? Clearly a Reference Question?
Is there a clear distinction between reference questions and other types of inquiries?

Question Categories Galore

What Data Do We Gather? Qualitative Elements
- Are there any qualitative assessments of the question being asked?
- What qualitative elements describing the interaction are included?

What Data Do We Gather? Value Added Components
(Multiple selections possible)

What Data Do We Gather? Librarian Impact
Is the form designed to capture the value added by interacting with a trained librarian, or the patron's learning outcomes?

Which Would You Rather Use?

Recommendations for Data Collection & Use
- Streamline forms: collect only what you will analyze
- Use consistent definitions across service points and institutions
- Map to value added by librarian
- Map to ACRL Framework for Information Literacy
- Use interactions as evidence for performance review
- Demonstrate contribution to mission
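The recommendations above can be pictured as a streamlined transaction record that captures only the elements discussed on the preceding slides (service point, contact mode, question type), plus a simple tally of the kind that evidence-based staffing decisions draw on. All field names and category values here are illustrative assumptions, not a prescribed schema:

```python
from dataclasses import dataclass
from collections import Counter

# Hypothetical streamlined data-collection record: only fields that
# will actually be analyzed. Category values are invented examples.

@dataclass
class ReferenceRecord:
    service_point: str   # e.g. "consolidated desk", "office", "chat"
    contact_mode: str    # e.g. "in person", "email", "chat"
    question_type: str   # e.g. "reference", "directional"

def tally(records):
    """Aggregate counts per field -- the kind of summary that supports
    evidence-based staffing and reporting to stakeholders."""
    return {
        "by_service_point": Counter(r.service_point for r in records),
        "by_contact_mode": Counter(r.contact_mode for r in records),
        "by_question_type": Counter(r.question_type for r in records),
    }
```

Using shared field definitions like these across service points is what makes counts comparable between desks, and between institutions.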

Bibliography: Reference & Information Literacy
Definitions of Reference
http://www.ala.org/rusa/guidelines/definitionsreference
Definitions of Reference: A Chronological Bibliography
http://www.ala.org/rusa/sites/ala.org.rusa/files/content/sections/rss/rsssection/rsscomm/evaluationofref/refdefbibrev.pdf
Radford University, McConnell Library, Instruction Menu
https://www.radford.edu/content/library/instruction/faculty-request-a-workshop/instruction-menu.html

Bibliography: ARL & ACRL Reports
SPEC Kit 268: Reference Service Statistics & Assessment (September 2002)
https://www.arl.org/focus-areas/statistics-assessment/1772-spec-kit-268-reference-service-statistics-a-assessment-september-2002#.XAGnV9tKjGg
Value of Academic Libraries: A Comprehensive Research Review and Report
http://www.ala.org/acrl/sites/ala.org.acrl/files/content/issues/value/val_report.pdf