
1 Ranking and benchmarking for repositories
Janice Chan, Curtin University @icecjan
Peter Green, Curtin University @lgreenpd
Stephen Cramond, University of Melbourne #NotOnTwitter
13 Nov 2015
Image attribution: MicroAssist http://www.microassist.com/e-learning-creative-commons-stock-gallery

2 OVERVIEW OF EXISTING TOOLS

3 CAUL Statistics
– Total number of OA/restricted/metadata-only items
– Number of OA/metadata-only items added during the reporting year
– Number of accesses: metadata-only items
– Number of accesses: OA items
– Inspired by the SCONUL annual statistics collection
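To make the counting in slide 3 concrete, here is a minimal Python sketch of the tallies such a return involves. The item records and field names (`access`, `date_added`) are illustrative assumptions, not CAUL's actual reporting format.

```python
from datetime import date

# Illustrative item records; a real repository would export these from
# its own metadata (the field names here are assumptions).
items = [
    {"access": "open", "date_added": date(2015, 3, 2)},
    {"access": "restricted", "date_added": date(2014, 11, 20)},
    {"access": "metadata-only", "date_added": date(2015, 6, 15)},
]

def caul_counts(items, year):
    """Tally CAUL-style totals: items by access type, plus OA and
    metadata-only items added during the reporting year."""
    totals = {"open": 0, "restricted": 0, "metadata-only": 0}
    added = {"open": 0, "metadata-only": 0}
    for item in items:
        totals[item["access"]] += 1
        if item["date_added"].year == year and item["access"] in added:
            added[item["access"]] += 1
    return totals, added

totals, added = caul_counts(items, year=2015)
print(totals)  # {'open': 1, 'restricted': 1, 'metadata-only': 1}
print(added)   # {'open': 1, 'metadata-only': 1}
```

The issues on the next slide apply directly here: the `access` classification is exactly the "what are we counting?" question.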

4 Issues
– Interpretation and definition: what are we counting?
– Content and collections in IRs differ
– System limitations: not all items can be reported on

5 Ranking Web of Repositories http://repositories.webometrics.info/
– Published twice a year by the Cybermetrics Lab in Spain since 2008
– Indicators

6 Issues
– SEO focused
– Disadvantages several repository platforms
– Changing methodology, lack of stability
– Lack of transparency
– Not measuring usage

7 Open Access Repository Ranking http://repositoryranking.org/
– Open and transparent metric
– Evaluates repositories' assets and services
– Criteria and max scores: Interoperability 25

8 Issues
– Germany, Austria, and Switzerland only
– Not measuring usage and activities
– May increase country coverage in 2016/17, but global coverage is unlikely
– Reviewing criteria in 2016

9 IRUS-UK http://www.irus.mimas.ac.uk/
– Aggregates stats for a broad range of UK and [some] Dutch IRs
– Allows comparison and benchmarking of performance
– COUNTER-compliant, so apples are compared with apples
– Stats by item type
– Includes a thesis downloads report

10 Issues
– Only supports DSpace, EPrints and "Fedora" at present
– Not available to ANZ institutions

11 EMERGING TOOLS

12 "Measuring Up" Project OCLC Research, ARL, MSU, UNM Web analytics focus Will develop standard web metrics for outcomes-based assessment Will demonstrate relationship between IR's, citations and university rankings. Allow libraries can build more effective business cases for IR initiatives.

13 bepress benchmarking model http://digitalcommons.bepress.com/webinars/68/
Percentile rank across three factors:
– Growth: content added
– Breadth: number of collections getting new objects
– Demand: download counts, interest
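The slide names bepress's three factors but not the formula, so the following is only a sketch of the percentile-rank idea: score a repository by the share of peers it matches or exceeds on each factor. The peer numbers are invented for illustration, and the real model's weighting is not described here.

```python
def percentile_rank(value, peer_values):
    """Percentile rank: percentage of peers at or below this value."""
    at_or_below = sum(1 for v in peer_values if v <= value)
    return 100.0 * at_or_below / len(peer_values)

# Invented peer data for the three factors (illustration only).
peers = {
    "growth":  [120, 340, 95, 410, 260],           # items added this year
    "breadth": [4, 12, 7, 20, 9],                  # collections with new objects
    "demand":  [8000, 22000, 5100, 31000, 12500],  # download counts
}
ours = {"growth": 260, "breadth": 9, "demand": 12500}

for factor, value in ours.items():
    print(f"{factor}: {percentile_rank(value, peers[factor]):.0f}th percentile")
```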

14 CrossRef DOI Event Tracker http://crosstech.crossref.org/2015/03/crossrefs-doi-event-tracker-pilot.html
– Tracks events associated with a DOI
– Built on the PLOS ALM code
– Production service available mid-2016
– Might develop an understanding of the relative overall importance of IR downloads within the broader resource-discovery ecosystem
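As a sketch of what consuming such a service might look like once the production API ships, the snippet below polls an event endpoint for a DOI and counts events per source. The endpoint URL, parameters, and response shape are assumptions for illustration, not documented details of the pilot.

```python
import requests

DOI = "10.1234/example"  # placeholder DOI

# Assumed endpoint and response shape -- not a documented pilot API.
resp = requests.get(
    "https://api.eventdata.crossref.org/v1/events",
    params={"obj-id": DOI, "rows": 100},
    timeout=30,
)
resp.raise_for_status()
events = resp.json().get("message", {}).get("events", [])

# Count events per source (e.g. twitter, wikipedia) to see where
# attention to the DOI is coming from.
by_source = {}
for event in events:
    source = event.get("source-id", "unknown")
    by_source[source] = by_source.get(source, 0) + 1
print(by_source)
```

If IR downloads were ever fed into the same event stream, a query like this is what would place them alongside citations, tweets and Wikipedia references.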

15 What is missing?
– Coverage: % of institutional output captured
– Open Access ratio
– Quality (beyond ERA metrics)
– Scope of repository beyond peer-reviewed publications
– Interoperability
– Demonstrating impact and not just usage
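The first two gaps are simple ratios once the inputs exist; the hard part is a trustworthy denominator for total institutional output. A minimal sketch with invented figures:

```python
def coverage_and_oa_ratio(captured, total_output, oa_items):
    """Coverage: % of institutional output captured in the IR.
    OA ratio: % of captured items that are Open Access."""
    return 100.0 * captured / total_output, 100.0 * oa_items / captured

# All figures are invented for illustration.
coverage, oa_ratio = coverage_and_oa_ratio(
    captured=3200, total_output=5000, oa_items=1400
)
print(f"Coverage: {coverage:.1f}%  OA ratio: {oa_ratio:.1f}%")
# Coverage: 64.0%  OA ratio: 43.8%
```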

16 DISCUSSION

