Metrics: What They Are and How to Use Them
Chris Biggs and David Jenkins
Bibliometrics is the quantitative analysis of publication and citation data
Examples of bibliometric measures include citation counts, Journal Impact Factor and H-index
You can get bibliometrics from databases such as Scopus, Web of Science and Google Scholar
Altmetrics look at the mentions and uses of research that are not captured by traditional bibliometrics
Altmetrics include mentions, uses, downloads, views and shares from a wide variety of sources
Metrics are seen as important because they signal that a paper is known and regarded as relevant
Bibliometrics are seen as objective, inexpensive to produce/use, economical in terms of time and scalable
What do you think about metrics?
How are you using metrics?
What do you want to get out of this session?
Metrics can be used on CVs, personal websites and on bid applications
Fawcett, Tom. "An introduction to ROC analysis." Pattern Recognition Letters 27.8 (2006): 861–874.
Citations measure what is cited, not what is of high quality
The Journal Impact Factor is designed to show how highly cited the average article in a journal is
Journal Impact Factor is used as a proxy for importance and quality
JIF (2015) = (No. of citations received in 2015 by articles published in the journal during 2013 and 2014) ÷ (No. of articles published in the journal during 2013 and 2014)
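As a worked illustration, a minimal sketch of the calculation with hypothetical figures (not real journal data):

```python
# Hypothetical JIF calculation; the figures below are illustrative only.
citations_2015 = 300      # citations received in 2015 by the journal's 2013-2014 articles
articles_2013_14 = 120    # articles published in the journal during 2013 and 2014

jif_2015 = citations_2015 / articles_2013_14
print(f"JIF 2015 = {jif_2015:.2f}")  # -> JIF 2015 = 2.50
```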
You can get Journal Impact Factor from Journal Citation Reports
The H-index is designed to measure productivity and impact
“The definition of the index is that a scholar with an index of h has published h papers each of which has been cited in other papers at least h times”
The H-index is criticised because it cannot fairly compare researchers in different disciplines, at different career stages, or with different types of output
The H-index measures consistency rather than outright impact
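To make the definition above concrete, here is a minimal sketch of computing an h-index from a list of per-paper citation counts; the function name and the citation counts are hypothetical, for illustration only.

```python
# Minimal sketch: compute the h-index from per-paper citation counts.
def h_index(citation_counts):
    # Sort counts in descending order; h is the largest rank r such that
    # the r-th most cited paper has at least r citations.
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical researcher with six papers: the third most cited paper has
# 3 citations, the fourth has only 2, so h = 3.
print(h_index([10, 5, 3, 2, 1, 0]))  # -> 3
```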
Altmetrics aim to cover “the evaluation gap”
You can get altmetrics from dedicated services or embedded in databases
Altmetrics can complement bibliometrics
Research can get high metrics for different reasons
Author profile systems
The Benefits:
To make your research known
To increase the chance of citation
To correct attributions
To ensure research is counted in research assessment
To increase the chance of new collaboration
To increase the chance of funding
Different functions have created different systems:
Disambiguation – ORCID / Scopus ID / Researcher ID
Promotion/Visibility/Networking – personal sites / institutional sites / Academia.edu / ResearchGate / Facebook / LinkedIn
Reference management tools – Mendeley
Open Access publishing – Institutional Repository
Research assessment – CRIS / Institutional Repository
Search engines – Google Scholar
Different systems have different modi operandi:
Institutional / personal
Commercial / not-for-profit
Open / closed (… or shall I say, walled garden)
Some profiles already exist, some need to be created, all need to be curated.
Scopus
OU People Profile
ORCID
Google Scholar
ResearchGate
Mendeley
The Uses and Abuses of Metrics
A shortcut to evaluation. Metrics appear to be objective, cost-effective and useful where the evaluator isn't a domain expert.
Their success rests not on the quality of the information they provide but on their convenience: they save the time needed to make a real evaluation.
Funders might use metrics to evaluate bids.
Funders might use metrics to allocate monies.
Metrics were used to support assessment in the 2014 REF.
Metrics have been considered as a route to a more light touch REF.
Metrics and The Stern Review.
Universities might use metrics to recruit and promote.
This environment may encourage unwanted behaviours.
Universities might seek to associate themselves with highly cited researchers.
Journals might encourage authors to lengthen their reference lists in order to increase the journal's Impact Factor.
Individuals will seek to publish in journals with high Impact Factors.
Individuals might seek to increase citations through self-citation and citation cartels.
Individuals might seek to increase their number of publications by salami-slicing research.
Individuals might seek to increase their number of publications and citations as guest or gift authors.
The Metric Tide. "Metrics hold real power: they are constitutive of values, identities and livelihoods."
The Metric Tide. "[We] propose the notion of responsible metrics as a way of framing appropriate uses of quantitative indicators in the governance, management and assessment of research."
Responsible Metrics. The Leiden Manifesto.
Responsible Metrics. San Francisco Declaration on Research Assessment.
What are your thoughts about metrics now?
Any questions?
Thank you!