Library Research Analytics Services: Things to Make and Do with Bibliometrics
DARTS5 ~ 3rd June 2016 Katie Evans, Research Analytics Librarian, University of Bath
Outline What? Why? Where? Things to make and do
Activity: are we change agents? Why a Library Service? Q&A
Research Analytics Service What? Why? Where?
What? Inform … Educate … & Entertain? Facilitate?
(Image credits: 'color line' CC-BY; Brian Everett CC-BY-NC-ND)
My role is to do anything we can usefully or interestingly do with publications and citations data, and to enable others in the University – both academics and professional services – to access and use this data to do things they find useful or interesting. With apologies to the BBC, we can summarise this as: inform, educate and facilitate.
Inform = providing data and analysis, e.g. to profile and benchmark the University's publishing; to evidence our research strengths.
Educate = workshops and one-to-one training for researchers (both doctoral students and academic staff) and professional services staff, e.g. developing your publishing strategy, choosing journals, finding potential collaborators, tracking attention to your work.
Facilitate? Sometimes I give people analysis or training and wonder 'so what?', i.e. will they do anything different as a result? Does the Library have a role in driving/facilitating change?
Why? The University of Bath Library Research Analytics Service was established in 2014 as the result of a 'citations improvement initiative'. One of the motivations for the University to invest in this is to enhance its position and reputation as a research university. The University of Bath does well in national league tables, and did well in REF2014, but would like to do better in the world university rankings. Citations are a major component of these rankings.
Where?
Library: Academic Services; Technical Services; Research Services (Open Access, Research Data Management, Research Analytics).
Beyond the Library: Researchers; Research Office; Planning Office; International Office; Marketing; Archives & Records Management.
I'm based in the Research Services section of the Library. However, I work with colleagues from across the University (and from within the Library!)
Things to Make & Do
Swimming in data? The Wilsdon Report – the report of the HEFCE-commissioned independent review into the role of metrics in research assessment and management, published in summer 2015 – is called 'The Metric Tide'. It describes a rising tide of metrics. Are we swimming in, or overwhelmed by, more data and metrics than we know what to do with?
Which of the following tools do you / the researchers you support use?
- Google Scholar
- Web of Science
- Scopus
- Journal Citation Reports / Journal & Highly Cited Data
- SCImago
- CWTS Journal Indicators
- InCites
- SciVal
- Altmetric
- PlumX
- ResearchGate
Make: a graph Do: track a trend
[Graph: % of UK publications that are internationally co-authored, rising from 25% in 1996 to over 50% in 2015]
Trend: the internationalisation of research and higher education. The percentage of UK publications that include an international co-author has doubled from about 25% in 1996 to over 50% in 2015. Over half of the UK's research publications now include a co-author from outside the UK! And in general, more collaborative research is associated with higher citation rates (but be careful: the numbers show a statistical correlation, but can't show causation). Data from SciVal, 10/5/16.
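The trend above is a simple ratio per year. As a minimal sketch (the record structure and field names here are illustrative, not SciVal's actual export format), the calculation looks like this:

```python
from collections import defaultdict

def international_share(publications, home_country="GBR"):
    """Percentage of publications per year that have at least one
    co-author based outside the home country."""
    totals = defaultdict(int)
    international = defaultdict(int)
    for pub in publications:
        totals[pub["year"]] += 1
        # A paper counts as internationally co-authored if any author
        # affiliation is outside the home country.
        if any(c != home_country for c in pub["author_countries"]):
            international[pub["year"]] += 1
    return {year: 100 * international[year] / totals[year]
            for year in sorted(totals)}

# Toy records: one international paper out of four in 1996,
# three out of four in 2015.
pubs = (
    [{"year": 1996, "author_countries": ["GBR", "GBR"]}] * 3
    + [{"year": 1996, "author_countries": ["GBR", "USA"]}]
    + [{"year": 2015, "author_countries": ["GBR"]}]
    + [{"year": 2015, "author_countries": ["GBR", "DEU"]}] * 3
)
print(international_share(pubs))  # {1996: 25.0, 2015: 75.0}
```

The same shape of calculation underlies the real SciVal figures, just over a much larger dataset.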
Make: benchmarks Do: inform strategy
[Graph: % of publications with an international co-author over time, for Uni A, Uni B and Bath]
This graph shows the percentage of publications that include an international co-author for the University of Bath and two comparator universities over a 10-year period. All three universities have been increasing the proportion of their publications that include an international co-author, but Uni A and Uni B have increased more quickly than Bath. If we look at citation impact, all three universities achieve a higher citation impact on average for their internationally co-authored publications than for domestic publications, but the step up is higher for Uni A and Uni B than for Bath. This sort of analysis is informing the University of Bath's internationalisation strategy and activities. Of course, a lot depends on who you choose as your comparators! Data from SciVal, Oct 2015.
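The "step up" mentioned above is just the gap between two averages. A hedged sketch, with made-up numbers and an illustrative record structure:

```python
def mean_citations(pubs, home="GBR"):
    """Mean citation counts for internationally co-authored vs
    purely domestic publications."""
    intl = [p["citations"] for p in pubs
            if any(c != home for c in p["countries"])]
    dom = [p["citations"] for p in pubs
           if all(c == home for c in p["countries"])]
    avg = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return {"international": avg(intl), "domestic": avg(dom)}

# Illustrative data only: two international and two domestic papers.
pubs = [
    {"countries": ["GBR", "USA"], "citations": 12},
    {"countries": ["GBR", "DEU"], "citations": 8},
    {"countries": ["GBR"], "citations": 5},
    {"countries": ["GBR", "GBR"], "citations": 3},
]
print(mean_citations(pubs))  # {'international': 10.0, 'domestic': 4.0}
```

Comparing this pair of averages across universities gives the benchmarking picture described in the notes.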
Make: benchmarks Do: win research funding
[Bubble chart: 'Highly Cited Papers' (y-axis) vs 'Papers in high impact journals' (x-axis), plotting Research A against subject area benchmarks]
Benchmarking is useful for many different purposes. As well as internal monitoring, researchers and the University as a whole need to present their research strengths for various assessments, e.g. grant applications. In this example, we're using subject area benchmarks to demonstrate the strengths of Research A's publishing track record. Research A has a higher proportion of highly cited articles, articles in high-impact journals and internationally co-authored articles than the average for the subject areas they work in. This analysis was incorporated into a grant application.
Looking at journal articles: the y-axis shows the percentage of articles that are among the top 10% most cited in the world for their year of publication, i.e. highly cited articles. The x-axis shows the percentage of articles published in journals whose SNIP is among the top 10% of all journals indexed by Scopus, i.e. high-impact journals. SNIP is a subject-normalised, citation-based journal metric. Bubble size shows the percentage of articles that include an international co-author. Data from SciVal, 5/1/16.
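The three percentages behind the bubble chart can be sketched as below. The cutoffs (top-10% citation counts per year, top-10% SNIP) are inputs here; in practice they come from the tool's world benchmarks, and all names and numbers in this sketch are illustrative.

```python
def benchmark_shares(articles, citation_cutoffs, snip_cutoff):
    """The three percentages behind the bubble chart:
    highly cited (citations at/above the top-10% cutoff for the
    article's year), high-impact journal (SNIP at/above the world
    top-10% cutoff), and international co-authorship."""
    n = len(articles)
    highly_cited = sum(a["citations"] >= citation_cutoffs[a["year"]]
                       for a in articles)
    high_impact = sum(a["snip"] >= snip_cutoff for a in articles)
    international = sum(a["international"] for a in articles)
    return {
        "highly_cited_pct": 100 * highly_cited / n,
        "high_impact_pct": 100 * high_impact / n,
        "international_pct": 100 * international / n,
    }

articles = [
    {"year": 2014, "citations": 30, "snip": 2.1, "international": True},
    {"year": 2014, "citations": 4,  "snip": 0.9, "international": False},
    {"year": 2015, "citations": 15, "snip": 1.8, "international": True},
    {"year": 2015, "citations": 2,  "snip": 2.5, "international": False},
]
cutoffs = {2014: 20, 2015: 10}  # illustrative top-10% citation cutoffs
shares = benchmark_shares(articles, cutoffs, snip_cutoff=1.5)
print(shares)
# {'highly_cited_pct': 50.0, 'high_impact_pct': 75.0, 'international_pct': 50.0}
```

Note that highly-cited status is judged against the article's own publication year, since older papers have had longer to accumulate citations.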
Make: a literature search Do: improve publishing strategy
So far we've looked at profiling, benchmarking and presenting existing publications. But we want the University's publication profile to improve. That means getting researchers and departments to think about their publishing strategy: are they writing and publishing their research effectively, so that its quality can be easily recognised?

This is a screenshot of Scopus's 'Analyse search results' function (you can do similar analysis in Web of Science). I started with a search for keywords (breakfast AND (obese OR obesity)). Here we're looking at which journals most frequently publish items matching my search terms. You can select a few journals and then move across to Scopus's 'Compare journals' tool to compare citation-based journal metrics for those journals. You can also look at the authors and institutions contributing most frequently to the search results.

This can be a way to get researchers reflecting on where they publish and what the options are, or who they might want to seek collaborations with. Similarly, you can look at which papers in the search results are most cited and use these as a basis for a conversation about writing for publication: how are the publication titles and abstracts written? What can we learn from them?
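The journal breakdown that 'Analyse search results' produces is essentially a frequency count over the result set. A minimal sketch, with hypothetical records standing in for real search results:

```python
from collections import Counter

def analyse_results(results):
    """Mimic the 'sources' breakdown: which journals most often
    publish items matching the search."""
    return Counter(r["journal"] for r in results).most_common()

# Hypothetical results for a (breakfast AND (obese OR obesity)) search.
results = [
    {"title": "Breakfast skipping and obesity", "journal": "Appetite"},
    {"title": "Obesity interventions", "journal": "Obesity Reviews"},
    {"title": "Breakfast habits in adolescents", "journal": "Appetite"},
    {"title": "Diet timing and weight", "journal": "Int J Obes"},
]
print(analyse_results(results))
# [('Appetite', 2), ('Obesity Reviews', 1), ('Int J Obes', 1)]
```

Swap `r["journal"]` for an author or affiliation field and the same one-liner gives the author and institution breakdowns mentioned above.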
Make: a network map Do: reflect on collaborations
VOSviewer is a free software tool for constructing and visualising bibliometric networks, created by Ludo Waltman and Nees Jan van Eck of CWTS, Leiden University, and available from: You can use VOSviewer to make co-authorship network maps (like the example on the slide), but also journal and term maps.

This example is a section of a co-authorship network map for the topic 'bibliometrics', based on authors with at least 5 publications matching the keyword search 'bibliometrics' in Scopus from 2006 onwards. The zoomed-in section shows Ludo Waltman and Nees Jan van Eck, VOSviewer's creators.

I have made maps like this for departments to use to profile co-authorship relations within their department. This can be useful for, e.g., deciding theme groupings, seeing where there's potential for further collaboration, and possibly even for head-hunting (can you recruit someone who would bridge two clusters?)
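Under the hood, a co-authorship network is just weighted edges between authors who appear on the same papers, with a minimum-publications threshold like the one used above. A sketch (illustrative data, not VOSviewer's actual implementation):

```python
from collections import Counter
from itertools import combinations

def coauthorship_edges(papers, min_pubs=2):
    """Weighted co-authorship edges between authors who have at
    least min_pubs papers (a threshold similar to the 'at least 5
    publications' filter used for the slide's map)."""
    counts = Counter(author for paper in papers for author in set(paper))
    keep = {a for a, n in counts.items() if n >= min_pubs}
    edges = Counter()
    for paper in papers:
        # Each pair of retained co-authors on a paper adds 1 to
        # their edge weight.
        for a, b in combinations(sorted(set(paper) & keep), 2):
            edges[(a, b)] += 1
    return dict(edges)

papers = [
    ["Waltman", "van Eck"],
    ["Waltman", "van Eck", "Smith"],
    ["Smith", "Jones"],
    ["Waltman", "Smith"],
]
edges = coauthorship_edges(papers, min_pubs=2)
print(edges[("Waltman", "van Eck")])  # 2: they co-authored two papers
```

Tools like VOSviewer then lay out and cluster this edge list; "Jones", with only one paper, falls below the threshold and is dropped from the map.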
Make: a donut Do: cultivate impact
Altmetric (and other alternative metrics providers such as Plum Analytics) track online attention to research publications, e.g. tweets, news mentions, blog posts, … This example is the Altmetric 'donut' for a paper on research impact by Dr Richard Watermeyer from the University of Bath. I wouldn't use the 'score' as an indicator of quality, but the contextual information – who's talking about the research, what are they saying – is more useful. There's information here that can be used to cultivate, track and (maybe) evidence the societal impact of a piece of research.
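The point about preferring context to the score can be made concrete: pull out the per-source counts rather than the headline number. The field names below follow my understanding of Altmetric's public details API and should be treated as assumptions; the sample record is invented.

```python
def attention_summary(record):
    """Summarise the contextual attention behind an Altmetric-style
    record, ignoring the composite 'score'. Field names are assumed
    from Altmetric's public API, not guaranteed."""
    sources = {
        "tweets": record.get("cited_by_tweeters_count", 0),
        "news stories": record.get("cited_by_msm_count", 0),
        "blog posts": record.get("cited_by_feeds_count", 0),
        "policy documents": record.get("cited_by_policies_count", 0),
    }
    # Keep only the sources that actually mentioned the paper.
    return {name: n for name, n in sources.items() if n > 0}

# Invented sample record, not real data for any paper.
sample = {"cited_by_tweeters_count": 41, "cited_by_msm_count": 3,
          "cited_by_feeds_count": 0, "score": 38.5}
print(attention_summary(sample))  # {'tweets': 41, 'news stories': 3}
```

This kind of breakdown answers "who is talking about the research?", which is the question the notes argue is actually useful.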
Make: the same thing twice! Do: teach responsible metrics
Comparing Web of Science and Google Scholar citation figures for Prof Jonathan Knight, University of Bath Pro-VC Research. When using bibliometrics, it’s important to be clear about where the data comes from – what it does and doesn’t cover.
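One way to teach this point is to put the two sources' figures side by side so the coverage difference is explicit. A small sketch; the numbers are illustrative, not the actual figures from the slide:

```python
def compare_counts(counts_by_source):
    """Line up citation counts for the same author from different
    sources, highest first, to show how coverage differs."""
    lines = []
    for source, count in sorted(counts_by_source.items(),
                                key=lambda kv: kv[1], reverse=True):
        lines.append(f"{source:<16}{count:>8}")
    return "\n".join(lines)

# Illustrative only: Google Scholar typically indexes more document
# types (preprints, theses, reports) than Web of Science, so its
# counts usually run higher for the same author.
print(compare_counts({"Google Scholar": 15200, "Web of Science": 9800}))
```

Neither figure is "wrong"; they count citations from different universes of documents, which is exactly the point about knowing what your data source does and doesn't cover.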
Are we change agents?
Supporting change Supporting research Changing research
Do we want to support research, change research, or support change to research? Or all of the above!
Changes we want … “Recognition & reward for research data”
“Access not money as primary driver in publishing” “Increased academic impact of Uni Bath publishing” Asking around the Research Services team in the University of Bath Library, these are changes we want to see in research communications. There's a tension between getting the best results out of the current system and challenging/changing the current 'publish or perish' culture.
Activity What changes to research (communications) do you want?
What steps can you take towards these?
Why a Library service? A Research Analytics Service is more than just access to a research analytics tool. But why a Library service? (Or at least, why a service within the University’s Professional Services family?)
Responsible metrics “The research community should develop a more sophisticated and nuanced approach to the contribution and limitations of quantitative indicators” “The problem is that evaluation is now led by the data rather than by judgement” Quotes from: Hicks, D., Wouters, P., Waltman, L., de Rijcke, S. & Rafols, I. (2015). Bibliometrics: the Leiden Manifesto for research metrics. Nature, 520(7548), 429–431. doi: 10.1038/520429a; Wilsdon, J., et al. (2015). The Metric Tide: Report of the Independent Review of the Role of Metrics in Research Assessment and Management. DOI: /RG

There are interesting and useful things we can make and do with bibliometrics. Bibliometrics, and bibliometricians/librarians, have a role to play in enhancing the University's research performance. BUT there are dangers and pitfalls to avoid. There's no perfect bibliometric measure of research quality. If the research agenda were driven by bibliometrics alone then there could be undesirable side effects, e.g. discouraging innovative/risky research in new/small/local areas; and stress, anxiety and low morale caused by blunt/heavy-handed performance management. Hence the growing awareness and promotion of the idea of 'responsible metrics'.

This is where a Library-based (or other professional-services-based) specialist research analytics service comes into its own. We can take the lead on working out what responsible metrics use looks like in practice; ensure that this becomes embedded in University processes; and support colleagues across the University to use bibliometrics effectively and responsibly.
Responsible metrics framework
Robust Humble Transparent Diverse Reflexive
'The Metric Tide' proposes these five dimensions as a framework for responsible metrics:
“• Robustness: basing metrics on the best possible data in terms of accuracy and scope;
• Humility: recognising that quantitative evaluation should support – but not supplant – qualitative, expert assessment;
• Transparency: keeping data collection and analytical processes open and transparent, so that those being evaluated can test and verify the results;
• Diversity: accounting for variation by field, and using a range of indicators to reflect and support a plurality of research and researcher career paths across the system;
• Reflexivity: recognising and anticipating the systemic and potential effects of indicators, and updating them in response.” Wilsdon, J., et al. (2015). The Metric Tide: Report of the Independent Review of the Role of Metrics in Research Assessment and Management. DOI: /RG

Robustness: Because I spend a lot of time working with bibliometrics, I have an awareness of, and can advise on, issues around the coverage and quality of different data sources. E.g. if we look at Department X in Scopus, we're seeing roughly Y% of their publications.

Humility: The Leiden Manifesto talks about the importance of measuring performance against research mission. When colleagues ask me for bibliometric reports or training, we spend time discussing what the purpose is – what they're trying to measure and achieve – and then what bibliometrics can and can't contribute to that. When bibliometrics are consistent with expert judgement, this increases confidence in the service we offer.

Transparency: We're taking a very open approach – offering everyone within the University access to research analytics tools and training/support on how to use and interpret them. We do this because (1) it allows other people to make and do interesting and useful stuff with bibliometrics, and (2) it increases transparency: if academic staff see bibliometric analysis I've produced in a report, they know where it comes from and can replicate and examine it for themselves if they want to. If I'm reporting on individuals, I make sure they have the opportunity to check the publication lists I'm working from and to comment on the report. The Library is trusted as a 'neutral' service.

Diversity: Treating different disciplines fairly is one of the biggest challenges for bibliometrics. The expertise I've developed from working day-in, day-out with bibliometrics equips me to navigate these challenges and to support others to do so, e.g. being aware of the quirks and limitations of subject-normalised metrics as well as their strengths, and raising awareness of issues such as gender bias being reflected in publication and citation metrics.

Reflexivity: I am able to share ideas around the University and to iteratively modify the analysis I produce.
Research Analytics Service adds …
Word cloud created with
Q&A, comments, discussion