NCknows Evaluation Overview

Jeffrey Pomerantz, Lili Luo
School of Information & Library Science, UNC Chapel Hill
@unc.edu

"Tapping the vast reservoir of human knowledge" --Louis Round Wilson, founder, 1931

Charles McClure
School of Information Studies, Florida State University
cmcclure@lis.fsu.edu
Evaluation question

Is collaborative virtual reference an effective way to meet the information needs of North Carolinians?
Secondary evaluation questions

What is required of a library that wishes to offer virtual reference?
What is the value added if different types of institutions work together?
What is the impact on libraries that provide virtual reference service?
Is virtual reference expandable to the whole state?
How will this project increase our knowledge of effective organizational models?
How can the quality of the reference service provided be measured?
Can staff from different types of libraries provide quality reference service to users from other types of libraries?
How will the project further greater use of existing resources such as NC LIVE?
What partnering or leveraging opportunities exist?
Stakeholders

NCknows users
The individual participating libraries
The entire collaborative effort
The State Library of North Carolina

Evaluation from all of these perspectives
Data collection methods

1. Service: statistical analysis
2. Chat sessions: peer review of transcripts
3. Patrons: exit surveys & follow-up phone interviews
4. Librarians: phone interviews
Statistical analysis

What patterns are emerging in the volume of questions received by NCknows?

Usage over time
NCknows librarians vs. 24/7 staff
Net asker / Net answerer
Sessions per month
Sessions per day
Sessions per hour: weekdays
Sessions per hour: weekends
Net asker / Net answerer
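Tabulations like the session-volume and net asker / net answerer figures above can be computed directly from a session log. A minimal sketch, assuming a hypothetical log in which each session records a timestamp, the library the patron came from, and the library that staffed the chat (all field names and data here are illustrative, not the actual NCknows schema):

```python
# Sketch: tabulate chat sessions by hour and compute each library's
# "net asker / net answerer" balance. The session log and its fields
# (timestamp, asking library, answering library) are hypothetical.
from collections import Counter
from datetime import datetime

sessions = [
    # (ISO timestamp, library the patron came from, library that staffed the chat)
    ("2005-03-01T10:15:00", "PublicA", "AcademicB"),
    ("2005-03-01T14:40:00", "AcademicB", "PublicA"),
    ("2005-03-02T10:05:00", "PublicA", "AcademicB"),
]

# Sessions-per-hour distribution (weekday/weekend splits would filter first).
by_hour = Counter(datetime.fromisoformat(ts).hour for ts, _, _ in sessions)

# Net position: sessions asked by a library's patrons minus sessions it staffed.
asked = Counter(asker for _, asker, _ in sessions)
answered = Counter(answerer for _, _, answerer in sessions)
net = {lib: asked[lib] - answered[lib] for lib in set(asked) | set(answered)}
# A positive value marks a net asker; a negative value, a net answerer.
```

The same grouping, keyed on month or day of week instead of hour, yields the other volume charts.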
Transcript peer review

What is the quality of reference service being provided to users by NCknows?

Overall quality of sessions
NCknows librarians vs. 24/7 staff
Academic vs. public librarians
NCknows vs. 24/7: Evidence of user satisfaction (% of responses)

                     NCknows librarian   24/7 staff
Yes                  32.0                17.6
Indirect evidence    32.3                35.9
No                    3.0                 2.3
Indeterminate        32.7                44.3
Academic vs. Public librarians: Complete and correct answer (% of responses)

                           Via academic library        Via public library
                           Academic     Public         Academic     Public
                           librarian    librarian      librarian    librarian
Complete & correct         40.0         62.0           43.1         69.6
Incomplete but correct     36.4         22.5           22.4         23.2
Incomplete and incorrect    3.6          1.4            3.4          2.9
Incorrect                   0.0          0.0            5.2          0.0
No answer provided         20.0         14.1           25.9          4.3
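Percent-of-responses breakdowns like the one above are derived from raw reviewer codes on the transcripts. A minimal sketch of that tabulation, using illustrative codes rather than the actual review data:

```python
# Sketch: turn per-transcript reviewer codes into the percent-of-responses
# figures shown in the tables above. The codes below are illustrative only.
from collections import Counter

codes = [
    "Complete & correct", "Complete & correct",
    "Incomplete but correct", "No answer provided",
]

counts = Counter(codes)
total = sum(counts.values())
percentages = {code: round(100 * n / total, 1) for code, n in counts.items()}
# Each column in the tables above is one such distribution, summing to ~100%.
```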
Patron exit surveys & follow-up interviews

What are the demographics of the user population?
How do users find out about NCknows?
What is the users' level of satisfaction with the reference service being provided by NCknows?
What motivates users to use NCknows?
How do users use the information provided to them?
Role
Answer completeness
Motivation for the question

For a work-related task (49%): business-related and school-related (50/50 split)
For personal reasons (28%)
Known-item search (23%)
Use of the information provided

Use: patron had used the information provided and found it useful (61%)
Partial use: patron had used the information provided and found it partially useful, or had partially used the information provided (24%)
No use: patron had not used the information provided at all (15%)
Discovery of NCknows

Teacher/professor (10%)
Search engine (20%)
Library materials (70%)
Phone interviews with librarians

How has involvement in the NCknows service impacted the participating libraries and librarians?
Training

Most useful: patron-librarian "role playing"
Most in need of review: co-browsing
Policies & Procedures

Scheduling
Handling email follow-ups
Quality control
Tech support

Support:
  NCknows: good
  24/7: OK

Infrastructure:
  Adequate or better in university & large public libraries
  Sometimes less than adequate in community college & small public libraries
Thoughts on chat reference

Inferior to desk reference:
  Lacks "non-verbal" cues
  Conveys less information in more time
  More difficult to conduct an interview

Good for quick answers to well-defined questions

Will continue to develop & evolve as an aspect of traditional reference.
Additional & future work

Cost/benefit analysis
Sustainability
Scalability: to the entire state?
Situational and contextual factors unique to specific libraries that affect quality of chat reference
MISs & databases to relate reference statistics to other library statistics
Longitudinal data
Key evaluation issues

Understanding the importance of evaluation
Ongoing funding/support for evaluation: a "culture of assessment"
How will evaluation data be used?
Quality of data: both the data reported here and in other data collection activities
The big picture

The need to have ongoing evaluation data
Importance of a statewide initiative in digital reference
Understanding impacts and applications
The context of digital reference efforts elsewhere

Ultimately the question is: do the benefits and outcomes outweigh the costs?

Congratulations!
Reports online

ils.unc.edu/~jpom/ncknows/
THANKS! Questions or Comments?

Jeffrey Pomerantz, Lili Luo
School of Information and Library Science, UNC Chapel Hill
@unc.edu

Charles McClure
School of Information Studies, Florida State University
cmcclure@lis.fsu.edu