How can information inform judgments? Nick Fowler, Managing Director, Research Management, Elsevier HEPI conference, London March 31 st 2015.

Elsevier has a unique vantage point on research

Primary publishing: each year, 1.2 million article manuscripts are received by ~2,000 journals (all offer Open Access options); 350,000 new articles are published, in addition to 11M existing articles; 2,000 new books are published; ScienceDirect serves 800M digital article downloads.

Derived and aggregated data: Scopus (55M records, 21,900 titles, 5,000 publishers, 700M citations); SciVal (75 trillion metric values); Pure, a current research information system (>200,000 researchers supported); Mendeley (3M users globally); Grants (7,000 sponsors, 20,000+ active opportunities, ~5M awarded grants); Patents (>93M records, 100 patent offices).

Elsevier has a unique vantage point on research

National research assessment and benchmarking reports: UK REF and UK BIS reports; ERA (Australia); FCT (Portugal); VQR (Italy).
Global university rankings: Times Higher World University Rankings; QS rankings; US News rankings (Arab Region).
Research reports conducted with: UK Royal Society; Science Europe; European Commission; FENS; HBP; Kavli Foundation; RIKEN BSI; World Bank; EuroStemCell; Kyoto University.

Elsevier’s perspective on metrics

Elsevier perspective 1: metrics should complement, not replace, human judgment. Metrics should be used together with peer review and expert opinion.

Elsevier perspective 2: metrics embody human judgment; they are not independent of it. When metrics and peer review or expert opinion give different answers, probe further.

Elsevier perspective 3: “metrics” does not mean only bibliometrics. Metrics also describe activities related to funding, collaboration, commercialisation, and impact. Multiple metrics used together give the richest perspective.

Q. Which metrics? A. Snowball Metrics! Recipes appear in the first recipe book; further recipes were added in the second recipe book.

Common counter-arguments against metrics: greater use of metrics will lead to “gaming” by researchers; metrics aren’t suitable for the Humanities; metrics risk perpetuating biases, such as gender biases.

Counter-argument 1: gaming
Counter-argument: greater use of metrics will lead to “gaming” by researchers.
Counter-counter-argument: well-selected metrics drive positive behaviours; multiple metrics make gaming very hard; metrics plus peer review and expert opinion will detect gaming.

Counter-argument 2: Humanities
Counter-argument: metrics aren’t suitable for the Humanities.
Counter-counter-argument: data sources covering the Humanities are becoming more complete; research metrics are relevant for Humanities researchers.

Counter-argument 3: biases
Counter-argument: metrics risk perpetuating biases, such as gender biases.
Counter-counter-argument: metrics reflect researchers’ activity; metrics can help monitor and eliminate biases.

Consideration 1: cost

Consideration 2: availability of tools and systems

Analysis 1: QR vs number of top 5% publications

Measure | Notes
REF QR / REG (£M) | Mainstream QR / REG awarded for 15/16
Citation pip5 | Number of outputs that fall within the global top 5% of indexed outputs within the relevant subject area, in terms of citations generated

Only those English and Scottish institutions with citation data available are represented. R² =
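A “top 5%” count like pip5 can be sketched as a percentile cut-off over a citation distribution. This is an illustrative assumption about the calculation, not Elsevier’s actual method; the function name, nearest-rank percentile rule, and data are all hypothetical.

```python
# Sketch: count an institution's outputs in the global top 5% by citations.
# The cut-off is the 95th percentile (nearest-rank) of the world distribution.

def top_5_percent_count(institution_citations, world_citations):
    """Number of the institution's outputs whose citation counts
    reach the 95th percentile of the world distribution."""
    ranked = sorted(world_citations)
    cutoff = ranked[max(0, int(0.95 * len(ranked)) - 1)]  # nearest-rank percentile
    return sum(1 for c in institution_citations if c >= cutoff)

world = list(range(100))            # 0..99 citations, illustrative only
institution = [3, 50, 96, 97, 120]  # illustrative institutional outputs
print(top_5_percent_count(institution, world))  # cut-off is 94, so 3 outputs qualify
```

In practice the benchmark set would be restricted to the same subject area and publication year, as the slide’s note about “the relevant subject area” implies.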

Analysis 2: QR vs field-weighted citation impact

Measure | Notes
REF QR / REG (£M) | Mainstream QR / REG awarded for 15/16
Citation fwci | Field-weighted citation impact

Only those English and Scottish institutions with citation data available are represented. R² =
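Field-weighted citation impact (fwci) normalises a paper’s citations by the average citations of comparable papers. A minimal sketch, assuming the common definition (actual citations divided by expected citations for the same field, year, and document type); the benchmark data and function name are illustrative, not Elsevier’s implementation.

```python
# Sketch of field-weighted citation impact (FWCI):
# FWCI = citations received / average citations of comparable papers
# (same field, publication year, and document type). Data are illustrative.
from statistics import mean

def fwci(paper_citations, benchmark_citations):
    """FWCI of one paper, given the citation counts of its benchmark set."""
    expected = mean(benchmark_citations)
    return paper_citations / expected if expected else 0.0

benchmark = [2, 4, 6, 8, 10]       # comparable papers, illustrative counts
print(fwci(12, benchmark))          # 12 citations vs. an average of 6 -> 2.0
```

An fwci of 1.0 means the paper is cited exactly as often as the world average for its benchmark set, which is what makes the metric comparable across fields.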

Analysis 3: REF GPA vs citation GPA per UoA

Measure | Notes
Output GPA | GPA (4:3:2:1) of the Outputs sub-profile
Citation GPA | GPA (4:3:2:1) of the citation data profile (% top 1%, % top 5%, …)

Only those submissions with > 50 indexed articles were included. (x-axis: UoA number)
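The 4:3:2:1 grade point average used above weights each star level’s share of outputs by its star rating. A minimal sketch of that arithmetic; the profile percentages are illustrative, not real REF results.

```python
# Sketch of a REF-style GPA on the 4:3:2:1 scale: each star level's
# percentage of outputs is weighted by its star rating.

def ref_gpa(profile):
    """profile maps star level (4, 3, 2, 1, 0) to % of outputs at that level."""
    return sum(stars * pct for stars, pct in profile.items()) / 100

profile = {4: 30.0, 3: 45.0, 2: 20.0, 1: 5.0, 0: 0.0}  # illustrative profile
print(ref_gpa(profile))  # (4*30 + 3*45 + 2*20 + 1*5) / 100 = 3.0
```

The citation GPA on the slide applies the same 4:3:2:1 weighting, but to a profile built from citation percentiles (% top 1%, % top 5%, …) instead of peer-review grades.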

Conclusions

The case for change is real: costs, and the availability of new tools.
Metrics should complement human judgment, not substitute for it.
Metrics are much more than bibliometrics.
Multiple metrics should be used to address questions of research assessment.
Counter-arguments can be addressed.
Let’s seize the opportunity!