Bibliometrics – an overview

Presentation transcript:

Research impact can be measured in many ways. Quantitative approaches include publication counts, the amount of research income, the number of PhD students, the size of the research group, the number of PI projects, views and downloads of online outputs, the number of patents and licenses obtained, and others. Bibliometrics and citation analysis are only one of these quantitative indicators, but they are widely and increasingly used, and the ability to apply citation analysis meaningfully in the overall assessment of research varies from field to field. Quantitative measures can be contrasted with the main alternative assessment approach: qualitative peer review in its various forms. The balance between bibliometrics and peer review in assessing academic performance, at both the individual and unit level, is currently a "hot topic" being played out locally, nationally and internationally.

This datasheet provides an introductory overview of the field. Other datasheets in the series look in more depth at the key uses of bibliometrics for journal ranking and individual assessment, the main metrics available, and the main data sources and packaged toolkits.

Website: Last edited: 11 Mar 2011

What's in it for me?

Bibliometrics are ways of measuring patterns of authorship, publication and the use of literature. Their attraction stems from the quantitative nature of the results and their perceived efficiency advantage: they produce a variety of statistics quite quickly, in contrast to the resource-intensive nature of peer review as a way of assessing the quality and innovation of intellectual thought. Bibliometrics nonetheless remain highly controversial as proxy indicators of the impact or quality of published research, even in disciplines where coverage of research output in the main citation data sources is good.

Bibliometric analysis often forms one part of a university's assessment strategy, looking at the impact of research at institutional and unit level. The two other key areas where bibliometrics are commonly used are:
1. As evidence to support an individual being considered for promotion, tenure or grant funding.
2. In deciding where to publish research, so as to obtain maximum visibility and citation rates by targeting high-impact titles.
Despite the many shortcomings of bibliometrics (see Issues & Limitations below), university ranking tables also give them considerable weighting in their calculations.

The building blocks

A source dataset. The main source datasets are those of ISI, Scopus and Google Scholar, plus subject-specialist options in some fields. Each collects citation information only from articles in a selected range of publications, and studies have shown the overlap between these sources to be quite modest, so a health warning applies to relying on a single data source, as many research offices tend to do.

Metric tools and techniques applied to the data source. The basic building blocks are techniques such as the h-index, the Journal Impact Factor (JIF) and the Eigenfactor; these formulae transform the raw data into various quantitative evaluations.
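To make the formulae a little more concrete, the following is a minimal Python sketch. It is not taken from any of the products mentioned in this datasheet; the function names and the example figures are invented for illustration. It computes an h-index from a plain list of per-paper citation counts and a simple two-year ratio in the spirit of the Journal Impact Factor.

def h_index(citations_per_paper):
    """Largest h such that at least h papers have at least h citations each."""
    ranked = sorted(citations_per_paper, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

def two_year_impact_ratio(citations_this_year, citable_items_previous_two_years):
    """Citations received this year to items from the previous two years,
    divided by the number of citable items published in those two years."""
    return citations_this_year / citable_items_previous_two_years

if __name__ == "__main__":
    # A researcher with 8 papers and these citation counts has h = 4
    # (4 papers with at least 4 citations each).
    print(h_index([25, 8, 5, 4, 3, 1, 0, 0]))          # -> 4
    # A journal receiving 210 citations in 2010 to the 140 citable items
    # it published in 2008-2009 (figures invented).
    print(round(two_year_impact_ratio(210, 140), 2))   # -> 1.5

In practice these values are not calculated by hand: the source datasets named above compute them from their own curated citation records, and the results differ depending on which dataset is used.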
The main metric types

Publication counts measure productivity but arguably not impact: 28% of the 8,077 items of UCD research indexed in the ISI Citation Indexes received no citations other than self-citations, and overall as much as 90% of the papers published in scientific journals are never cited.

Citation analysis. Most current bibliometric indicators are based on the analysis of citations to publications. The key assumption is that the number of times a publication is cited is meaningful: the more citations, the greater the relevance. There are three main approaches to citation analysis:
1. Citation counts: the total number of citations, the total over a period of time, or citations per paper.
2. More sophisticated counts, such as the number of papers cited more than x times, or the number of citations to the x most cited papers.
3. Normalisation and "crown indicators". Citation counts alone are commonly used, but they mean little unless normalised by some combination of time, journal of publication, broad or narrow field of research, and type of publication. This normalised approach, which allows benchmarking, is currently the most commonly used (a sketch of the idea follows the list of limitations below).

Issues & Limitations

1. In some fields it is not the tradition to cite extensively the work that one's scholarship and research builds on, yet this is the whole principle of the citation analysis system. Seminal research is also often taken for granted and not cited.
2. The data sources are selective and journal-focused, and often do not index the publications where research in a field is typically published and cited: local publications, non-English publications, monographs, conference papers and working papers are poorly covered.
3. Negative citations are counted as valid.
4. The system can be manipulated, for example through self-citation, multiple authorship, splitting outputs into many articles, citing the work of others in one's own research group, and journals favouring highly cited review articles for publication over reports of primary research.
5. Defining the broad or narrow field to benchmark against can dramatically alter the result for an individual or group when using normalised, benchmarked scores. Metrics such as the Scopus SNIP indicator for journals and the "universal" h-index attempt to weight sufficiently to allow cross-discipline comparison.
6. Citation metrics are often used inappropriately, for example using the Journal Impact Factor of a journal to evaluate an individual researcher's output, or comparing h-indices across fields.
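To illustrate the normalisation idea mentioned above, here is a minimal Python sketch under stated assumptions: the field baselines and the papers are invented, and real "crown indicator" implementations derive their expected citation rates from large reference databases broken down by field, year and document type.

# Toy field normalisation: divide a paper's citations by the average
# citations of papers of the same field, year and document type, then
# average those ratios across a unit's output. Baselines are invented.

FIELD_BASELINES = {
    # (field, year, doc_type): mean citations in a hypothetical reference set
    ("cell biology", 2007, "article"): 22.0,
    ("history", 2007, "article"): 1.8,
}

def normalised_score(papers):
    """Mean of actual citations divided by expected citations per paper."""
    ratios = []
    for paper in papers:
        key = (paper["field"], paper["year"], paper["doc_type"])
        expected = FIELD_BASELINES[key]
        ratios.append(paper["citations"] / expected)
    return sum(ratios) / len(ratios)

unit_output = [
    {"field": "cell biology", "year": 2007, "doc_type": "article", "citations": 30},
    {"field": "history", "year": 2007, "doc_type": "article", "citations": 3},
]

# 30/22.0 = 1.36 and 3/1.8 = 1.67; the mean (about 1.5) reads as "roughly 50%
# above the average for comparable papers", whereas the raw counts (30 vs 3)
# would make the historian look far weaker than the biologist.
print(round(normalised_score(unit_output), 2))

The design point is simply that the benchmark, not the raw count, carries the comparison, which is why the choice of field definition in item 5 above can change the result so dramatically.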
"In our view, a quality judgment on a research unit or institute can only be given by peers, based on a detailed insight into content and nature of the research conducted by the group ... impact and scientific quality are by no means identical concepts." (Bibliometric Study of University College Dublin, CWTS, February 2009)

"The terrible legacy of IF is that it is being used to evaluate scientists, rather than journals, which has become of increasing concern to many of us. Judgment of individuals is, of course, best done by in-depth analysis by expert scholars in the subject area. But, some bureaucrats want a simple metric. My experience of being on international review committees is that more notice is taken of IF when they do not have the knowledge to evaluate the science independently." (Alan Fersht, "The most influential journals: Impact Factor and Eigenfactor", PNAS, 28 April 2009, vol. 106, no. 17)

"... We publish in books and monographs, and in peer-reviewed journals. However, we have a range of real requirements that include official reporting to state agencies and authorities, public archaeology and communication in regional and local journals, and interdisciplinary publication across several journals, that most bibliometrics are incapable of measuring." (Academic)