Round Table Discussion: Bibliometric Indicators in Research Evaluation and Policy. Colloque « Évolution des publications scientifiques », Académie des sciences


Presentation transcript:

Round Table Discussion: Bibliometric Indicators in Research Evaluation and Policy. Colloque « Évolution des publications scientifiques », Académie des sciences, May 2007. Pierre Braunstein, Académie des sciences, Institut de Chimie (UMR 7177 CNRS), Université Louis Pasteur, Strasbourg

CHEMISTRY
- Core chemistry, plus numerous interfaces: chemistry/mathematics, chemistry/biology
- Increasing number of international collaborations
- The growing number of journals leads authors to select among them according to criteria such as tradition, a user-friendly submission process, rapidity of publication, etc.
- The highest Impact Factor in chemistry is around 20 (for journals publishing review articles)
- The ranking of authors on a publication is highly variable

Do impact factors reflect what is important or what is fashionable?
- Fields in which much is published generate more citations and therefore higher impact factors.
- The two-year citation window is too short to be truly significant (compare, for example, an article published in December of year N with one published in January of year N+1).
- The ISI classification into "General" vs. "Specialized" journals is far from satisfactory; any ranking based on this split is of little use.
- Total citation counts, or the h-index (the largest number h such that an author has h articles each cited at least h times), give a useful picture of the impact of an author's scientific contributions, particularly for a senior author (but remember the "cold fusion" effect!).
- However, since citation practices and numbers vary considerably from one discipline to another, comparisons become difficult or impossible at scientific interfaces.
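The two indicators discussed on this slide can be made concrete with a short sketch. The function names and the example numbers below are illustrative, not taken from the talk: the two-year impact factor for year N divides citations received in year N (to items from years N-1 and N-2) by the number of citable items published in those two years, and the h-index is the largest h such that an author has h papers each cited at least h times.

```python
def two_year_impact_factor(citations_in_year_n, citable_items_prev_two_years):
    """Journal impact factor for year N: citations received in year N to
    items published in years N-1 and N-2, divided by the number of
    citable items published in those two years."""
    return citations_in_year_n / citable_items_prev_two_years


def h_index(citation_counts):
    """h-index: the largest h such that the author has h papers,
    each cited at least h times."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # still at least `rank` papers with >= `rank` citations
        else:
            break
    return h


# Illustrative values: 400 citations to 100 citable items gives IF = 4.0;
# citation counts [10, 8, 5, 4, 3] give h = 4 (four papers cited >= 4 times).
print(two_year_impact_factor(400, 100))
print(h_index([10, 8, 5, 4, 3]))
```

Note how sensitive both numbers are to the inputs: shifting one highly cited paper just outside the two-year window changes the impact factor, which is exactly the December-vs-January objection raised above.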

- Indicators can be useful, and they will be used anyway; we had better understand their meaning, strengths, and weaknesses in order to improve the system. A European model?
- A calibration system is required, and various benchmarking procedures will minimize misinterpretations.
- Indicators can help identify performance within a similar area, at the national and international levels, particularly for more senior scientists (e.g. most-cited chemists). They have no absolute value.
- Other criteria are needed to identify the younger, most promising scientists: the key role of the community!
- There must be feedback between the evaluators and the individuals, laboratories, or institutions being evaluated, to explain and communicate the source of the indicators. This must be a multicriteria and transparent mechanism.
