Standards in science indicators
Vincent Larivière
EBSI, Université de Montréal / OST, Université du Québec à Montréal
Standards in science workshop, SLIS, Indiana University, August 11, 2011

Current situation
Since the early 2000s, we have been witnessing:
1) An increase in the use of bibliometrics in research evaluation;
2) An increase in the size of the bibliometric community;
3) An increase in the variety of actors involved in bibliometrics (e.g., no longer limited to the LIS or STS communities);
4) An increase in the variety of metrics for measuring research impact: the h-index (with its dozens of variants), eigenvalue-based indicators, SNIP, SCImago impact indicators, etc.;
5) The end of the ISI monopoly: Scopus, Google Scholar, and several other initiatives (SBD, etc.).

Why do we need standardized bibliometric indicators?
1) The lack of standards is symptomatic of the immaturity of the research field: no paradigm is yet dominant;
2) Bibliometric evaluations are spreading at the level of countries, institutions, research groups, and individuals;
3) Worldwide rankings are spreading and often yield diverging results;
4) Standards reflect the consensus of the community and allow the various measures to be:
   1) Comparable;
   2) Reproducible.

Impact indicators
Impact indicators have been used for quite a while in science policy and research evaluation. Until quite recently, only a handful of metrics were available or compiled by the research groups involved in bibliometrics:
1) Raw citation counts;
2) Citations per publication;
3) Impact factors.
Only one database was used (ISI), and only one normalization was applied: by field (when it was applied at all!).
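To make that single "by field" normalization concrete, here is a minimal sketch with hypothetical numbers; in practice, the world average is computed from the database for each field and publication year.

```python
# Classic field normalization, on hypothetical numbers: divide a paper's
# citation count by the world average for papers of the same field
# (and, in practice, the same publication year).
citations = 12             # citations received by the paper
field_world_average = 8.0  # mean citations of all papers in that field/year
relative_impact = citations / field_world_average
print(relative_impact)     # 1.5 -> cited 50% above the world average
```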

Factors to take into account in the creation of a new standard
1) Field specificities: citation potential and aging characteristics;
2) Field definition: at the level of the journal or of the paper? What about interdisciplinary journals?
3) Differences in the coverage of databases;
4) Distributions vs. aggregated measures;
5) Skewness of citation distributions (use of logs?);
6) The paradox of ratios (0 → 1 → ∞);
7) Averages vs. medians vs. ranks;
8) Citation windows;
9) Unit vs. fractional counting (see the sketch below);
10) Equal or different weight for each citation?
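For item 9, a minimal sketch of the difference between unit and fractional counting, on hypothetical country lists; the same logic applies to authors, departments, or institutions.

```python
# Unit vs. fractional counting of hypothetical papers. Under unit (whole)
# counting, every contributing country gets full credit for a paper;
# under fractional counting, credit is split so totals stay additive.
from collections import defaultdict

papers = [
    ["Canada", "USA"],            # a two-country collaboration
    ["Canada"],                   # a purely Canadian paper
    ["Canada", "France", "USA"],  # a three-country collaboration
]

unit = defaultdict(float)
fractional = defaultdict(float)
for countries in papers:
    for country in countries:
        unit[country] += 1.0                         # full credit to each
        fractional[country] += 1.0 / len(countries)  # credit split equally

print(dict(unit))        # {'Canada': 3.0, 'USA': 2.0, 'France': 1.0}
print(dict(fractional))  # Canada ≈ 1.83, USA ≈ 0.83, France ≈ 0.33
```

Note that the fractional counts sum to the number of papers (3.0), while the unit counts sum to 6.0; this additivity difference matters for any aggregate indicator built on top of the counts.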

Ex. 1: Impact indicators
An example of how a very simple change in the calculation method of an impact indicator can change the results obtained, even when very large numbers of papers are involved. Everything is kept constant: same papers, same database, same subfield classification, same citation window. The only difference is the order of operations in the calculation: average of ratios (AoR) vs. ratio of averages (RoA). Both methods are considered standards in research evaluation. Four levels of aggregation are analyzed: individual researchers, departments, institutions, and countries.
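A minimal sketch of the two orders of operation, using hypothetical citation counts and field expectations; in a real analysis, both values come from the bibliographic database.

```python
# Average of Ratios (AoR) vs. Ratio of Averages (RoA) on the same
# hypothetical papers: (citations received, expected citations, i.e.,
# the world average for the paper's field and year).
papers = [
    (30, 10.0),  # one highly cited paper
    (2,   8.0),
    (5,   5.0),
    (0,   4.0),
]

# AoR: normalize each paper first, then average the ratios.
aor = sum(c / e for c, e in papers) / len(papers)

# RoA: sum citations and expectations first, then divide.
roa = sum(c for c, _ in papers) / sum(e for _, e in papers)

print(f"AoR = {aor:.3f}")  # 1.062
print(f"RoA = {roa:.3f}")  # 1.370 -- same data, different verdict
```

The gap between the two is exactly the discrepancy plotted against unit size in the figures below.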

Figure 1. Relationship between RoA and AoR field-normalized citation indicators at the level of A) individual researchers (≥20 papers), B) departments (≥50 papers), C) institutions (≥500 papers), and D) countries (≥1,000 papers).

Figure 2. Relationship between (AoR − RoA) / AoR and the number of papers at the level of A) individual researchers, B) departments, C) institutions (≥500 papers), and D) countries.

Ex. 2: Productivity measures
Typically, we measure the research productivity of a unit by counting the distinct papers it produced and dividing that count by the unit's total number of researchers. Another method is to assign papers to each researcher of the group and then average their individual outputs. The two counting methods are correlated, but nonetheless yield different results (see the sketch after the figure caption below):

Difference in the results obtained for 1,223 departments (21,500 disambiguated researchers)
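A minimal sketch contrasting the two counting methods on a hypothetical three-person department; co-authorship within the unit is what pulls the two numbers apart.

```python
# Two ways to measure a department's productivity, on hypothetical data.
# Method A: distinct papers of the department divided by headcount.
# Method B: average of each researcher's individual paper count.
authorships = {
    "Alice": {"p1", "p2", "p3"},
    "Bob":   {"p1", "p4"},
    "Carol": {"p1", "p2"},
}

distinct_papers = set().union(*authorships.values())  # {p1, p2, p3, p4}
method_a = len(distinct_papers) / len(authorships)    # 4 / 3 ≈ 1.33

# Each paper co-authored within the department is counted once per author.
method_b = sum(len(p) for p in authorships.values()) / len(authorships)
# (3 + 2 + 2) / 3 ≈ 2.33

print(f"Method A (distinct papers / headcount): {method_a:.2f}")
print(f"Method B (mean individual output):      {method_b:.2f}")
```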