E-Assessment: Evaluating Resources in a Digital World
Bonnie Tijerina, E-Resources Coordinator, GA Tech Library & Information Center
Cory Tucker, Business Librarian, University of Nevada, Las Vegas Libraries
Overview
- What are we assessing?
- Why is assessing critical?
- Challenges of e-collections
- New opportunities in the e-environment
- Effective assessment
What Are We Assessing?
- Resources with a cost involved:
  - Databases and abstracting/indexing tools
  - Electronic journals
  - Digitized newspapers
  - Electronic books
  - E-reference material
- Open access publications
- Born-digital resources
Why Is Assessing Critical?
- Budget:
  - The constantly changing and increasing cost of many e-resources
  - Lack of a baseline for a resource
- Justification:
  - New users, new user needs
  - New products, new platforms
- Cooperative purchasing
Challenges of Assessing: Content
- Overlap of content
- Packages are interdisciplinary
- Bundles may have unnecessary material
- Authority and stability of content
Challenges of Assessing, cont.: Format
- E-only? Print + online?
- Technical issues
- Unique features
- Equity of access: what hardware/software is needed?
Challenges of Assessing, cont.: Access
- Response time from vendors
- Technical issues with the product
- Customer service
- Effective use of technology
Opportunities and Challenges: More Data
- Attempts at standardized use data
- Transaction data
- User behavior on our local resources
Effective Assessment
“Collection analysis is an ongoing process defined both by individual analysis projects and constant attention to collection quality and its responsiveness to the user community.”
– Peggy Johnson, Fundamentals of Collection Development & Management, 2004
Effective Assessment, cont.
- Assessment: How does the resource/collection support our local users?
- Evaluation: Is this a quality resource? How does a resource or collection compare against another resource or collection?
Effective Assessment, cont.
- Library-driven analysis
- Use- and user-centered analysis
- Quantitative
- Qualitative
- Collaborative e-resource management
Conclusions
- Many assessment and evaluation tools
- Involving others inside and outside the library
- Constant change, constant assessment
- Users' needs are central: format and content
Assessment of Electronic Resources for Business
Cory Tucker, Business Librarian
Library-Driven E-resources Assessment
- Budgetary constraints requiring cancellation of e-resources
- Increased funding providing an opportunity to add e-resources
- Changes to curriculum requiring new or different e-resources
- Changes to database interface or platform
- Changes to database content
User-Driven E-resources Assessment
- Usability
- Content
- Information overload
- Information literacy [1]:
  - Access the needed information effectively and efficiently
  - Evaluate information and sources critically

[1] Association of College and Research Libraries, Information Literacy Competency Standards for Higher Education, 2000.
Electronic Databases in Business
- Periodical databases: journals, including abstracts and full-text resources
- Reference databases: company, industry, and market research resources
- Statistical databases: statistical data
Evaluation/Assessment Project
Utilizes:
- Objective and subjective evaluation factors
- Value analysis (matrix) [2] with weighted evaluation factors
- Cost/benefit ratio (actual cost and perceived benefit to users)

[2] Bick, Dawn, and Reeta Sinha. “Maintaining a high quality, cost-effective journal collection.” C&RL News, September 1991: 485-490.
Phase 1: Establish Evaluation Criteria
Determined four broad categories to establish the list of evaluation criteria:
- Usability
- Content
- University curriculum
- Special features
Plus other factors (local, product-specific, etc.)
Assessment Criteria: Access/Usability
- IP access
- EZProxy (or other proxy server)
- Interface design
- Help and training (tutorials and context-specific help)
Assessment Criteria: Journal Content
- Currency
- Backfiles
- Embargo
- Subject coverage
- Length (backfiles, from Vol. 1)
- # Full-text vs. abstract-only
- Source and authority (publishers)
- Rank and impact factor
- # Peer-reviewed
Assessment Criteria: Content
- Other publications (books, dissertations, gov. docs., etc.)
- Case studies
- Company information
- Industry information
- Market research information
- Economic data
Assessment Criteria: University Curriculum & Research
- Core subjects (curriculum)
- User information need
- Programs offered
- Academic level: undergraduate, graduate (Master's and PhD)
- Special areas of research/study
Assessment Criteria: Special Features
- Compatibility with link resolvers (OpenURL)
- Export options
- Available formats
- Compatibility with RefWorks or similar software
- Integration with e-learning software such as WebCT or Blackboard
Assessment Criteria: Other Factors
- ABLD list
- Peer library holdings
- In line with the library's collection policy
- License terms
- Consortial subscriptions
- Pricing
- Vendor relations
- Usage statistics
- Network connection (speed/reliability)
Phase 2: Value Analysis
Value analysis is an organized effort directed at analyzing the function of systems, products, specifications, standards, practices, and procedures for the purpose of satisfying the required function at the lowest total cost of effective ownership, consistent with the requirements for performance, reliability, quality, and maintainability.
- Uses a matrix to rank evaluation criteria by importance (defined locally by the library)
- The ranking helps establish a weighted value for each criterion
- The weighted value helps establish a ‘score’ for each e-resource = perceived benefit
- Enables calculation of a cost/benefit ratio for each resource
Value Analysis Matrix
Pairs of evaluation criteria are compared to determine which of the two is more important to the library. For each pair, rank the difference in importance as:
- Major = 3
- Medium = 2
- Minor = 1
Once ranked, assign a percentage weight to each criterion, as in the sketch below.
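The deck doesn't spell out how the 3/2/1 rankings become percentage weights, so the normalization below (total each criterion's winning points, then divide by the grand total) is one common reading rather than the authors' documented method. A minimal sketch in Python, with invented criteria names and pairwise rankings:

```python
from itertools import combinations

# Hypothetical evaluation criteria; a real matrix would use the
# locally defined criteria from Phase 1.
criteria = ["usability", "journal content", "curriculum fit", "special features"]

# For each pair, record which criterion won and by how much:
# Major = 3, Medium = 2, Minor = 1. These rankings are invented for illustration.
pairwise = {
    ("usability", "journal content"): ("journal content", 2),
    ("usability", "curriculum fit"): ("curriculum fit", 1),
    ("usability", "special features"): ("usability", 3),
    ("journal content", "curriculum fit"): ("journal content", 1),
    ("journal content", "special features"): ("journal content", 3),
    ("curriculum fit", "special features"): ("curriculum fit", 2),
}

# Sum the points each criterion earned across all of its comparisons.
points = {c: 0 for c in criteria}
for pair in combinations(criteria, 2):
    winner, score = pairwise[pair]
    points[winner] += score

# Convert points to percentage weights.
total = sum(points.values())
weights = {c: round(100 * p / total, 1) for c, p in points.items()}
print(weights)  # {'usability': 25.0, 'journal content': 50.0, 'curriculum fit': 25.0, 'special features': 0.0}
```

One quirk of this normalization: a criterion that loses every comparison (here, special features) weights out at zero, so a library might seed every criterion with a baseline point before normalizing.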
Phase 3: Worksheet
Phase 3: Assessment Worksheet
- Complete a worksheet for each database to determine its ‘score’ according to the weighted criteria
- The score incorporates subjective evaluation (e.g., content created by local faculty may trump a non-core subject focus)
- May add common research question(s) to measure how effectively the database provides relevant results
Phase 4: Calculation of the Cost/Benefit Ratio
For each worksheet:
- Sum the weighted values for the criteria to obtain the database's Score
- Multiply each database's Score by its Annual Use to obtain its Benefit:
  (Database Score) × (Annual Use) = Benefit = perceived value of the database
- Divide the Annual Subscription Cost by the Benefit and multiply by 10:
  (Annual Cost) / (Benefit) × 10 = Cost/Benefit Ratio
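Put together, Phases 3 and 4 reduce to a few lines of arithmetic. A minimal sketch using the formulas above, with hypothetical weights, ratings, usage counts, and costs (the real criteria, scales, and figures are set locally); since cost is divided by benefit, a lower ratio indicates better value for the money:

```python
# Percentage weights from the value analysis matrix (sum to 100).
# All names and numbers here are hypothetical illustrations.
weights = {"usability": 25, "journal content": 50, "curriculum fit": 25}

# Worksheet ratings for each database on a local scale (say, 1-5 per criterion).
databases = {
    "Database A": {
        "ratings": {"usability": 4, "journal content": 5, "curriculum fit": 3},
        "annual_use": 12_000,    # searches or sessions per year
        "annual_cost": 9_500.0,  # subscription cost in dollars
    },
    "Database B": {
        "ratings": {"usability": 5, "journal content": 2, "curriculum fit": 4},
        "annual_use": 4_000,
        "annual_cost": 6_200.0,
    },
}

for name, db in databases.items():
    # Phase 3: worksheet Score = sum of (weight x rating) over all criteria.
    score = sum(weights[c] * r for c, r in db["ratings"].items())
    # Phase 4: Benefit = Score x Annual Use; ratio = Annual Cost / Benefit x 10.
    benefit = score * db["annual_use"]
    ratio = db["annual_cost"] / benefit * 10
    print(f"{name}: score={score}, benefit={benefit:,}, cost/benefit={ratio:.4f}")
```

Under these invented numbers, Database A scores 425, for a benefit of 5,100,000 and a ratio of about 0.019, versus roughly 0.048 for Database B, making Database B the first candidate to scrutinize in a cancellation exercise.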
Electronic Resources Evaluation
- Very time-consuming
- A continuous process
- When to do this: annually (near renewal time)? Biannually?
- Up to the library, given staff and time constraints