LibQUAL+ Data Summary
A brief overview of the aggregate results of the LibQUAL+ survey, with specific comparisons of BYU to other institutions.


LibQUAL+ Data Summary
This presentation summarizes the results from LibQUAL+, the web-delivered assessment tool for measuring library service quality. The project is managed by the Association of Research Libraries (ARL) and the libraries and researchers of Texas A&M University, and has been under way since 1999. With assistance from a FIPSE grant from the U.S. Department of Education, the plan is to continue this effort for some time to come. BYU's involvement came about through our association with the Big 12 Consortium of Libraries, of which both we and A&M are members.

LibQUAL+ Goals
LibQUAL+ as a whole:
- Develop tools and protocols for evaluating library service quality
- Develop effective web-based survey delivery mechanisms
- Identify best practices
- Establish an ARL service quality assessment program
BYU expectations:
- Learn how BYU patrons rate the Lee Library
- Benchmark results against other institutions
- Identify where to focus improvements
The principal reason LibQUAL+ was initiated was to move libraries toward more outcome-based assessment efforts. Its principal goals are: 1) develop tools and protocols for evaluating library service quality; 2) develop effective web-based survey delivery mechanisms; 3) identify best practices; and 4) establish an ARL service quality assessment program. Through this effort, the Lee Library at BYU hopes to: 1) gain a better understanding of how the BYU community rates Lee Library services; 2) benchmark our results against those of other institutions; and 3) see where service improvements can be made. We felt strongly that this instrument would be an ideal means of helping us achieve our Strategic Plan as a library. And as you will see from the results, our patrons' perception of our services leads us to believe that we are indeed on the right track in improving their quality. One comment concerning comparisons with other institutions: shortly after a preliminary presentation to the Strategic Planning Committee, ARL requested that institutions presenting LibQUAL+ results from other institutions do so in confidence. Therefore, where specific instances of user perceptions that may be critical of a library are shown, the name of the institution has been removed.

Response Summary
- 43 institutions participated
- Services rated on a scale from 1 to 9
- Three perspectives of service: minimum, desired, perceived
- BYU sampled 3,900 individuals; final sample count 3,702
- Some 30,000 replies study-wide; 20,416 total usable surveys
- 789 usable surveys from BYU, ranking 7th in number of responses
- Effective response rate = 21.3%
Response to the survey was good. In all, 43 institutions participated in the effort. Individuals sampled were sent email with a link to an institution-specific website, where they were asked to rate the quality of library services on a scale of 1 to 9 from three perspectives: minimum, desired, and perceived. BYU sampled 3,900 individuals, comprising 1,800 undergraduate students, 1,200 graduate students, and 900 faculty. Similar samples were taken at the other institutions. Final sample counts varied from institution to institution depending on several factors; our final effective sample size was 3,702 (1,735 undergraduates, 1,138 graduates, and 829 faculty). By the time the survey sites were shut down at the end of April, nearly 30,000 individuals had replied to the email and attempted the survey. Not everyone finished, for various reasons, some technical and some due to the survey's length. Responses that were incomplete, that answered N/A to more than 30% of the items, or that gave inappropriate answers (minimum exceeding desired) were excluded. The results that follow are based on a total of 20,416 usable surveys. Responses ranged from 193 at the University of Mississippi to 1,000 at Miami University in Ohio. BYU had 789 usable surveys, ranking 7th among the 43 participating institutions, which translated to an effective response rate of 21.3%.
Though the response rate in and of itself was low, the demographics collected for the survey indicated that responses were fairly representative across several demographic categories, including rank, discipline, and sex.
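The screening rules and response-rate arithmetic described above can be sketched in a few lines. This is an illustrative reconstruction, not the actual LibQUAL+ screening code; the record layout is an assumption.

```python
# Illustrative sketch of the survey-screening rules described above:
# a response is excluded if it is incomplete, marks N/A on more than 30%
# of the items, or rates its minimum above its desired level on any item.
# The (minimum, desired) pair layout is hypothetical.

def is_usable(completed, items, na_limit=0.30):
    """items: list of (minimum, desired) pairs, with None standing for N/A."""
    if not completed:
        return False
    na_count = sum(1 for pair in items if None in pair)
    if na_count > na_limit * len(items):
        return False
    for minimum, desired in items:
        if minimum is not None and desired is not None and minimum > desired:
            return False  # inappropriate answer: minimum exceeds desired
    return True

# Effective response rate: usable surveys over the final sample size.
usable, sampled = 789, 3702
print(f"{100 * usable / sampled:.1f}%")  # 21.3%, matching the figure above
```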

Library Use on Premises (charts: All Institutions, Brigham Young, Virginia Tech)
One survey question of particular interest at BYU dealt with the use of library resources on the actual premises. Those surveyed were simply asked how often they used resources on library premises: daily, weekly, monthly, quarterly, or never. As the All Institutions chart shows, virtually everyone surveyed tended to use the physical facilities to satisfy their needs, coming to the library at least once a week or month. This was true even at Virginia Tech, a perceived high-tech, wired university, and all the institutions showed similar tendencies. So, if patrons are still coming into the library, to what extent do they use library electronic resources remotely?

Library Use Electronically (charts: All Institutions, Brigham Young, Virginia Tech)
The second use question asked how often the patron used electronic library services remotely. Interestingly, though the numbers varied slightly from institution to institution, ALL of them showed a greater percentage of patrons NEVER using library electronic resources remotely than never using the premises. One might expect, given the rapid development of technology today, that those numbers (premises vs. electronic) would be reversed. One implication of these results is that the necessity of a physical facility is very much evident.

Overall Service Quality and Service Satisfaction (BYU Rank = 5th, 4th, 5th)
Three sets of questions dealt with a patron's overall perception of library service quality, their satisfaction with the affect of service received at the library (how the patron is treated), and their satisfaction with the support the library provides for meeting learning, research, and teaching needs. First, note that the average scores for all three questions (on a Likert scale from 1 to 9, with 1 being very dissatisfied and 9 very satisfied) are above 5.5. The ratings therefore tend to be high: individuals feel on average that their libraries do a pretty decent job of meeting expectations in these three areas. An interesting phenomenon occurred with these results, however. At every institution, affect satisfaction ALWAYS rated highest, with overall quality of service next and support satisfaction lowest. BYU ranked 5th, 4th, and 5th respectively on these questions against the other institutions.

Survey Summary
56 questions about library services, each rated from three perspectives:
- Minimum expected service
- Desired service
- Perceived service
Analysis reduced the practical areas to four:
- Affect: how the patron is treated
- Place: the library facility and environment
- Personal Control: patron self-reliance
- Information Access
The bulk of the survey contained 56 questions covering every imaginable area of library service. As mentioned before, for each question the patron was asked to evaluate the service from three perspectives: the minimum expected level of service, the desired level of service, and finally the perceived level of service they were currently experiencing from their university's library. Each perspective was rated on a Likert scale from 1 to 9. As the analysis proceeded, the areas could be reduced and summarized in four logical groups: 1) Affect of Service, which entails how the patron was treated in the library; 2) Library as Place, meaning the physical facility and its overall feeling and environment; 3) Personal Control, the patron's ability to be self-reliant in the use of materials and services; and 4) Information Access, the patron's ability to obtain what they need at the library.

LibQUAL+ Zone of Tolerance (charts: All Institutions, BYU)
To assess how patrons' perception of service meets the minimum and desired levels across these four areas, a Zone of Tolerance graph was devised to display the concept visually. The gray box in the chart spans the range from the minimum expected service level to the desired level of service; this is the Zone of Tolerance. The perceived level of service is then displayed as a diamond. The idea is for the perceived level to fall well within the Zone of Tolerance, ideally leaning toward the desired service level at the upper part of the box. As these examples show, institutions' user perceptions varied widely, with some perceived levels falling well below the minimum, implying a definite need for improvement. It is also interesting to note that the values tended to be high on a rating scale from 1 to 9. In general, though, as the All Institutions chart shows, the average perceived service level tended toward the lower end of the Zone of Tolerance. BYU was an exception, falling more in the center of the zone. To assess how well BYU rated against the other institutions, the gap between the minimum level and the perceived level was calculated and comparisons made on that basis.

Gap Distribution: Perceived minus Minimum (BYU ranks: overall 2nd; Affect 5th, Place 4th, Personal Control 4th, Information Access 1st)
These charts show the distribution of perceived-minus-minimum gaps for all institutions. In the overall aggregate, BYU ranked second in gap score, with only one institution showing a higher gap. The implication is that patrons perceive the Lee Library as doing a good job of meeting their service expectations. When the total is broken down into the four individual areas of service, BYU ranked 5th, 4th, 4th, and 1st respectively in Affect, Place, Personal Control, and Information Access.
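The gap score behind these rankings is simple arithmetic: the mean perceived score minus the mean minimum score, computed per service dimension. A minimal sketch, using invented dimension means (the actual BYU figures are not reproduced here; only the formula and the four dimension names come from the survey):

```python
# Gap score per service dimension: perceived mean minus minimum mean.
# Positive gaps mean perceived service clears the minimum expectation.

def gap(perceived_mean, minimum_mean):
    return perceived_mean - minimum_mean

# Hypothetical dimension means on the 1-9 Likert scale.
dimensions = {
    "Affect":             (6.2, 7.1),  # (minimum mean, perceived mean)
    "Place":              (5.9, 6.8),
    "Personal Control":   (6.4, 6.9),
    "Information Access": (6.1, 7.0),
}
for name, (minimum, perceived) in dimensions.items():
    print(f"{name}: gap = {gap(perceived, minimum):+.2f}")
```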

LibQUAL+ Antarctic Charts (charts: All, BYU)
Legend: Perceived > Desired = Green; Perceived < Minimum = Red
To be more specific, individual questions can be examined in similar fashion using the Zone of Tolerance idea. The LibQUAL+ "Antarctic Charts" shown above attempt to do this. Each spoke of the chart represents an individual question in the LibQUAL+ study. The questions are grouped by like classification for ease of interpretation (AC = Access to Collections, A = Assurance, E = Empathy, LP = Library as Place, R = Reliability, RS = Responsiveness, T = Tangibles, SR = Self-Reliance, and I = Instruction). The desired service level is represented by the outside edge of the colored portion of the chart, and the minimum service level by the inside edge. If the perceived level consistently falls between desired and minimum, the colors in the Antarctic chart are exclusively yellow and blue; this is evident in the chart in the lower right-hand corner, and largely in the BYU chart as well. If the perceived level exceeds the desired level, the color is green (as in BYU's chart for T-1, question one of the survey, visually appealing facilities). This is good in a sense, but could also suggest that too many resources are being expended in an area where they might be better spent in other, more critical areas. More important is when the perceived level falls below the minimum: this shows as red and is a definite signal that improvement is necessary. Note that BYU has NO red in its chart, while other institutions and the study as a whole tended to show much less blue and more instances of red.
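The legend's color rule can be stated as a tiny classifier. This is a sketch of the logic described above, not code from the LibQUAL+ toolkit, and the scores in the example are invented.

```python
# Color-coding rule for one spoke of an Antarctic (radar) chart, per the
# legend above: green when perceived exceeds desired, red when perceived
# falls below minimum, otherwise the yellow/blue zone-of-tolerance band.

def spoke_color(minimum, desired, perceived):
    if perceived > desired:
        return "green"        # possibly over-resourced in this area
    if perceived < minimum:
        return "red"          # definite signal that improvement is needed
    return "yellow/blue"      # within the zone of tolerance

# BYU's T-1 (visually appealing facilities) is the green case; the scores
# here are illustrative only.
print(spoke_color(minimum=6.0, desired=7.5, perceived=7.8))  # green
```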

LibQUAL+ Overall Summary
- A physical facility for patrons to come to is still important
- Overall quality of service and satisfaction with the library are positive
- Respondents tend to view library services positively
- Perceived service is just above minimum expected service
- BYU ranks high in all areas
From these results, the following points stand out from LibQUAL+ 2001. First and foremost (as it should be in the minds of librarians), patrons still value a physical facility to meet their study, research, and social needs: the library as a place is important. Overall quality of service, as well as satisfaction with how patrons are treated at the library and supported in their academic pursuits, is positive. In general, respondents view library services positively, rating them 6 or better on a scale from 1 to 9. However, patrons' perceived service is, on average, just above the minimum expectation of service; this suggests that though they feel good about the library, there is still room for improvement. And finally, in virtually all areas BYU rates high compared to the other institutions that participated in the study.

LibQUAL+ BYU Summary
Areas of positive note:
- Patrons love the new addition and find it a safe, secure facility
- Library staff make concerted efforts to help patrons
Potential areas for improvement:
- Improve tools that help patrons be more self-sufficient
- Improve the accuracy of library records
For BYU in particular, some positive and negative aspects deserve attention. On the positive side, patrons love the new addition to the Lee Library and find it a safe and secure facility to come to. The library staff also do a wonderful job of helping patrons with their immediate needs. In contrast, a couple of areas could use improvement. One is the tools available at the library (both on the premises and electronically) that help patrons be more self-sufficient: patrons want to know how to do things themselves, without someone holding their hand during the process. Finally, the Lee Library can work to improve the accuracy of its catalog, borrowing, and overdue records.

The Future of LibQUAL+
- The next round of surveys will be conducted in Spring 2002, with an estimated 200 participating institutions
- BYU will not take part, but will assist with a special effort in Fall 2001
- BYU will continue its improvement efforts
- In LibQUAL+ Spring 2003, BYU will compare its results with the benchmarks set in Spring 2001
In conclusion, LibQUAL+, as has been mentioned many times, is a dynamic, ongoing effort. ARL and Texas A&M will run it again with a greatly revised (and reduced) instrument in Spring 2002, and some 200 institutions are anticipated to participate. BYU, however, will not be one of them. The Lee Library wants time to assess the information from this study, together with other ongoing studies, to see where it might improve the quality of services provided to the BYU community. But because of the technical problems associated with this last round of surveys, BYU has volunteered to help A&M and ARL with a special study in the fall to address them, which should improve the delivery of the instrument and the overall rate of usable surveys. In the meantime, BYU will continue its improvement efforts. When the Spring 2003 LibQUAL+ effort gets under way, BYU will be eager to participate again. At that point it will be exciting to assess, against the benchmarks established with the Spring 2001 study, just how much progress has been made in improving library services.