Mining EZProxy Data: User Demographics and Electronic Resources

Presentation transcript:

Mining EZProxy Data: User Demographics and Electronic Resources. Connie Stovall and Ellie Kohler, ARL Assessment Conference, December 2018.

About Virginia Tech: a land-grant institution with 34K students; STEM emphasis, but comprehensive.

About University Libraries: 152 employees, 5 branches, $23.8M budget.

University budgeting environment: Performance-Based Budgeting (PIBB). Demonstrate impact on student success; utilize data. The university moved from incremental budgeting to performance-based budgeting, with 440 metrics in PIBB so far.

Total Materials Budget and VT Electronic Resources [chart].

Connecting ER to Student Success: a pilot project in Summer 2018. First step: user demographics analysis, covering location, college, major, student level, GPA, age, ethnicity, and gender.

How much data is enough data? How do we decide? Can off-campus usage stand in for all database usage? Is summer usage comparable to fall usage? What can EZProxy data tell us that COUNTER reports (JR1, JR5, DB1, DB5, BR1, JR1GOA, etc.) don't? How can we determine whether off-campus usage is an appropriate sample for all campus usage?

Step 1: Compare on- and off-campus usage (by date and time). While we really wanted to look at student usage only, we began by comparing all of the data gathered from EZProxy. Initial dataset wrangling included:
- removing "admin" logins (any login containing "admin"), 0.02% of the total;
- removing automated system checks, counted as part of "on campus" usage, 19.9% of the total;
- changing the timestamp into a recognized date/time.
After reviewing the data (such as what appears on the next few slides), the dataset was filtered to off-campus use only. In the off-campus dataset, remote users were identified by unique login and then merged with university student demographic data; logins with no matching student demographic record were removed (and treated as faculty or staff users). The merge was performed by VT Academic Analysis & Reporting, which returned the data completely anonymized: we can see only unique users, not how heavily each user searched or which databases each user visited.
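
The presentation does not include the cleaning code; the sketch below illustrates the wrangling steps just described, assuming EZProxy logs in a common-log-style layout. The field order, the "admin" filter, the "systemcheck" account name, and the campus IP prefix are all illustrative assumptions, not VT's actual configuration.

```python
import re
import pandas as pd

# Common-log-style line: ip, ident, login, [timestamp], "request", status, bytes
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ (?P<login>\S+) \[(?P<ts>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d+) (?P<bytes>\S+)'
)

def load_ezproxy_log(path: str) -> pd.DataFrame:
    """Parse raw EZProxy log lines into a DataFrame."""
    rows = []
    with open(path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            m = LOG_LINE.match(line)
            if m:
                rows.append(m.groupdict())
    df = pd.DataFrame(rows)
    # Turn the raw timestamp into a recognized date/time.
    df["ts"] = pd.to_datetime(df["ts"], format="%d/%b/%Y:%H:%M:%S %z", errors="coerce")
    return df

def clean(df: pd.DataFrame) -> pd.DataFrame:
    """Drop admin logins and automated system checks."""
    df = df[~df["login"].str.contains("admin", case=False, na=False)]
    df = df[df["login"] != "systemcheck"]  # hypothetical monitoring account name
    return df

def split_on_off_campus(df: pd.DataFrame, campus_prefix: str = "203.0.113."):
    """Split by IP prefix; the prefix here is a documentation placeholder, not VT's real range."""
    on_campus = df["ip"].str.startswith(campus_prefix)
    return df[on_campus], df[~on_campus]
```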

What databases are being used? [chart: off-campus vs. on-campus usage by database: EBSCO, JSTOR, Science Direct, Web of Knowledge, ProQuest, Ancestry (Library Edition)]
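
To tally usage by database from the cleaned log, one rough approach is to map each request's target host to a vendor label, as sketched below. The host fragments in the mapping are illustrative guesses, not the actual platform URLs the Libraries license.

```python
from urllib.parse import urlparse

# Substring-to-label map (illustrative host fragments).
VENDOR_PATTERNS = {
    "ebsco": "EBSCO",
    "jstor": "JSTOR",
    "sciencedirect": "Science Direct",
    "webofknowledge": "Web of Knowledge",
    "proquest": "ProQuest",
    "ancestrylibrary": "Ancestry (Library Edition)",
}

def vendor_from_request(request: str) -> str:
    """Pull the target host out of a request line (e.g. 'GET http://host/path HTTP/1.1') and label it."""
    parts = request.split()
    if len(parts) < 2:
        return "Unknown"
    host = urlparse(parts[1]).netloc.lower()
    for needle, label in VENDOR_PATTERNS.items():
        if needle in host:
            return label
    return "Other"

# Usage, continuing the DataFrames from the previous sketch:
# off_campus["vendor"] = off_campus["request"].apply(vendor_from_request)
# print(off_campus["vendor"].value_counts())
```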

Off-campus usage by location [map/chart].

On-campus usage by location [map/chart].
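
The slides do not say how the location maps were produced; one common approach is IP geolocation, sketched here with the MaxMind geoip2 library. The database path and column names are placeholders.

```python
import geoip2.database  # MaxMind GeoLite2/GeoIP2 city reader

def city_for_ip(reader: geoip2.database.Reader, ip: str) -> str:
    """Return 'City, Region' for an IP, or 'Unknown' if it is not in the database."""
    try:
        resp = reader.city(ip)
        return f"{resp.city.name}, {resp.subdivisions.most_specific.name}"
    except Exception:
        return "Unknown"

# Usage (GeoLite2-City.mmdb is a placeholder path):
# with geoip2.database.Reader("GeoLite2-City.mmdb") as reader:
#     off_campus["location"] = off_campus["ip"].apply(lambda ip: city_for_ip(reader, ip))
#     print(off_campus["location"].value_counts().head(20))
```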

Step 2: Get demographic information and compare (off-campus use to Summer II enrollment demographics).

                    Enrollment   Off-Campus Use
  Undergraduate        77%            36%
  Graduate             17%            62%
  DVM, MD, etc.         4%             2%

Very unbalanced. But is this the best comparison? Maybe graduate students use databases more in the summer, or more in general. This is where domain knowledge is so important. So if this isn't an accurate comparison barometer, what about gender and race?
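
A minimal sketch of this comparison, assuming the anonymized merge has already produced one row per unique off-campus student with a "level" column (an assumed column name); the enrollment shares are the Summer II figures from the table above.

```python
import pandas as pd

# Summer II enrollment shares from the slide.
enrollment_pct = pd.Series(
    {"Undergraduate": 77, "Graduate": 17, "DVM, MD, etc.": 4},
    name="Enrollment %",
)

def usage_pct(users: pd.DataFrame) -> pd.Series:
    """Percent of unique off-campus users at each student level."""
    counts = users["level"].value_counts()
    return (100 * counts / counts.sum()).round(1).rename("Off-Campus Use %")

# Usage, given one row per anonymized unique off-campus student:
# comparison = pd.concat([enrollment_pct, usage_pct(off_campus_users)], axis=1)
# print(comparison)
```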

Gender and Race?

By college? In the scatterplot, items in the lower-left quadrant indicate a close enrollment-to-unique-user ratio for those colleges.
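
A sketch of how such a by-college scatterplot could be drawn, assuming a small summary table with one row per college; the column names are assumptions, not the presenters' actual data model.

```python
import matplotlib.pyplot as plt
import pandas as pd

def plot_college_scatter(summary: pd.DataFrame) -> None:
    """Scatter Summer II enrollment against unique off-campus users, one point per college."""
    fig, ax = plt.subplots()
    ax.scatter(summary["enrollment"], summary["unique_users"])
    for _, row in summary.iterrows():
        ax.annotate(row["college"], (row["enrollment"], row["unique_users"]))
    ax.set_xlabel("Summer II enrollment")
    ax.set_ylabel("Unique off-campus users")
    ax.set_title("Off-campus database use by college")
    plt.show()
```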

What about GPA? (No comparison data from the university.)

                      Graduate   Undergraduate   Total Average
  Avg. Overall GPA      3.74         3.24            3.49
  Avg. Age             31.98        21.61           28.23
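
The GPA and age summary above could be produced from the anonymized merged dataset with a simple group-by, as in this sketch; the column names ("level", "gpa", "age") are assumptions.

```python
import pandas as pd

def gpa_age_summary(users: pd.DataFrame) -> pd.DataFrame:
    """Average GPA and age by student level, plus an overall 'Total Average' column."""
    by_level = users.groupby("level")[["gpa", "age"]].mean()
    overall = users[["gpa", "age"]].mean().rename("Total Average")
    return pd.concat([by_level.T, overall], axis=1).round(2)

# Usage:
# print(gpa_age_summary(off_campus_users))
```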

So what does this mean? It's complicated! But that does not stop us from understanding how off-campus users (students) are using our resources, or which colleges are most and least likely to use them. This may not be the silver bullet for collections decisions. Image used from: https://www.convergentresults.com/single-post/2016/02/07/Paralysis-by-Perfection

Next steps: fall semester data, a usage workaround, and finding awesome insights from the data we have.

Thank you! Ellie Kohler: EllieK@vt.edu Connie Stovall: CJStova@vt.edu