Ranking and benchmarking for repositories
Janice Chan, Curtin
Peter Green, Curtin
Stephen Cramond, University of Melbourne
#NotOnTwitter
Attribution: MicroAssist
13 Nov 2015
OVERVIEW OF EXISTING TOOLS
CAUL Statistics
– Total number of OA/restricted/metadata-only items
– Number of OA/metadata-only items added during the reporting year
– Number of accesses: metadata-only items
– Number of accesses: OA items
– Inspired by the SCONUL annual statistics collection
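A minimal sketch of how counts like these might be derived from a repository export. The file name and column names (access_status, date_added, downloads) are illustrative assumptions, not a real CAUL or repository export format.

# Derive CAUL-style counts from a hypothetical CSV export of repository items.
import csv
from datetime import date

REPORTING_YEAR = 2015

totals = {"open access": 0, "restricted": 0, "metadata-only": 0}
added_this_year = {"open access": 0, "metadata-only": 0}
accesses = {"open access": 0, "metadata-only": 0}

with open("repository_items.csv", newline="") as f:
    for row in csv.DictReader(f):
        status = row["access_status"]   # e.g. "open access", "restricted", "metadata-only"
        totals[status] = totals.get(status, 0) + 1

        # Items added during the reporting year (OA and metadata-only only)
        if status in added_this_year and date.fromisoformat(row["date_added"]).year == REPORTING_YEAR:
            added_this_year[status] += 1

        # Accesses, split by OA vs metadata-only
        if status in accesses:
            accesses[status] += int(row["downloads"])

print(totals, added_this_year, accesses)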
Issues
– Interpretation and definition: what are we counting?
– Content and collections in IRs differ
– System limitations: not able to report on all items
Ranking Web of Repositories
– Published twice a year by the Cybermetrics Lab in Spain since 2008
– Indicators
Issues
– SEO focused
– Disadvantages several repository platforms
– Changing methodology, lack of stability
– Lack of transparency
– Does not measure usage
Open Access Repository Ranking
– Open and transparent metric
– Evaluates repositories' assets and services
– Criteria and maximum scores: Interoperability 25
Issues
– Germany, Austria, and Switzerland only
– Does not measure usage and activities
– Country coverage may increase, but global coverage is unlikely
– Reviewing criteria in 2016
IRUS-UK
– Aggregates statistics for a broad range of UK and [some] Dutch IRs
– Allows comparison and benchmarking of performance
– COUNTER-compliant, so apples are compared with apples
– Statistics by item type
– Includes a thesis downloads report
Issues
– Only supports DSpace, EPrints, and "Fedora" at present
– Not available to ANZ institutions
EMERGING TOOLS
"Measuring Up" Project OCLC Research, ARL, MSU, UNM Web analytics focus Will develop standard web metrics for outcomes-based assessment Will demonstrate relationship between IR's, citations and university rankings. Allow libraries can build more effective business cases for IR initiatives.
bepress benchmarking model
– Percentile rank across 3 factors:
  – Growth: content added
  – Breadth: number of collections getting new objects
  – Demand: download counts, interest
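A minimal sketch of a percentile-rank calculation over the three factors named above. The equal weighting, the peer figures, and the simple average are illustrative assumptions, not the published bepress formula.

# Percentile rank of one repository against a peer set, per factor and overall.
def percentile_rank(value, peer_values):
    """Share of peers whose value is at or below this repository's value."""
    return 100.0 * sum(v <= value for v in peer_values) / len(peer_values)

# Hypothetical peer data: one tuple of (growth, breadth, demand) per repository.
peers = [(120, 14, 50_000), (300, 22, 180_000), (85, 9, 22_000), (410, 31, 240_000)]
ours = (300, 22, 180_000)

ranks = [percentile_rank(ours[i], [p[i] for p in peers]) for i in range(3)]
overall = sum(ranks) / len(ranks)   # simple average across the three factors
print(ranks, overall)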
CrossRef DOI Event Tracker
– Tracks events associated with a DOI
– Built on the PLOS ALM code
– Production service available mid-2016
– Might develop an understanding of the relative overall importance of IR downloads within the broader resource discovery ecosystem
– tracker-pilot.html
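A minimal sketch of the kind of query this service aims to support: pulling the events recorded against a single DOI. The endpoint, parameters, and response fields below follow the later Crossref Event Data API and are an illustrative assumption here, not part of the 2015 pilot; the DOI and email address are placeholders.

# Query events recorded against one DOI (assumed Crossref Event Data endpoint).
import requests

doi = "10.5555/12345678"   # placeholder DOI
resp = requests.get(
    "https://api.eventdata.crossref.org/v1/events",
    params={"obj-id": f"https://doi.org/{doi}", "mailto": "repository@example.edu"},
)
resp.raise_for_status()

for event in resp.json()["message"]["events"]:
    # Each event records which source referenced the DOI, how, and when.
    print(event["source_id"], event["relation_type_id"], event["occurred_at"])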
What is missing?
– Coverage: % of institutional output captured
– Open Access ratio
– Quality (beyond ERA metrics)
– Scope of repository beyond peer-reviewed publications
– Interoperability
– Demonstrating impact, not just usage
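A minimal sketch of the two headline ratios listed above, using hypothetical counts: coverage (share of the institution's output captured in the repository) and the Open Access ratio (share of repository items with an openly accessible full text).

# Coverage and OA ratio from hypothetical annual counts.
institutional_outputs = 4_200   # total outputs reported by the institution
repository_items = 2_940        # of those, items held in the repository
open_access_items = 1_180       # repository items with an OA full text

coverage = repository_items / institutional_outputs
oa_ratio = open_access_items / repository_items

print(f"Coverage: {coverage:.0%}")   # -> Coverage: 70%
print(f"OA ratio: {oa_ratio:.0%}")   # -> OA ratio: 40%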
DISCUSSION