ICT trends in scientometrics and their impact on academia

Presentation transcript:

ICT trends in scientometrics and their impact on academia
Sándor Soós, Hungarian Academy of Sciences (MTA), Budapest, Hungary

Scientometrics as a research field
By definition: „the quantitative study of scientific communication" (Björneborn & Ingwersen, 2004)

Methodological frameworks in Scientometrics (…)
Information science, Computer science, Statistics, Econometrics, Economics, Network science

Research directions in Scientometrics
Structural scientometrics: dynamics, development, and structure of S&T (at the cognitive and social level)
Evaluative scientometrics: models and measurement of research performance

First wave: two levels of bibliometrics
Science Citation Index (SCI): 1961, Institute for Scientific Information, Eugene Garfield
Journal Citation Reports (JCR): SCI-derivative, 1975; Impact Factor (IF), 1955, Eugene Garfield
Web of Science (WoS): on-line platform for SCI (+JCR), 2001
Early stage: study of the „development and direction of scientific research, rather than to evaluate its quality" (Wouters et al., HEFCE, 2015) > Professional bibliometrics
„Nevertheless, the SCI's success did not stem from its primary function as a search engine, but from its use as an instrument for measuring scientific productivity, made possible by the advent of its by-product, the SCI Journal Citation Reports (JCR) and its Impact Factor rankings." (Garfield, 2010) > Citizen bibliometrics
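A brief aside on the Impact Factor named above: as published in the JCR it is a two-year ratio. In ad hoc notation (C_Y(J, y) = citations received in year Y by the items journal J published in year y; N(J, y) = number of citable items J published in year y):

    \mathrm{IF}_Y(J) = \frac{C_Y(J, Y-1) + C_Y(J, Y-2)}{N(J, Y-1) + N(J, Y-2)}

For example, a journal's 2010 IF is its 2010 citations to items it published in 2008-2009, divided by the number of citable items it published in 2008-2009.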

Emergence of „citizen bibliometrics"
Recognition of the potential in research evaluation (policy): 1980s, 1990s
„Big" data availability
Quick utilization in research administration
BUT: lack of an underlying theory of performance (impact): „And with the creation of the Science Citation Index, the field of bibliometrics and the use of citation analysis in research evaluation have been driven by the availability of data to the point that it has come to shape the definition of scientific impact." (Haustein, 2016)
„Policy pull + data push"

First wave: „theory-unladenness"
Performance dimension: scientific impact
Theoretical level: sociology of science (theory of citation)
Indicators (measurement): # citations, IF, …

Characterization of „citizen bibliometrics"
„Citizen bibliometrics": bibliometrics-based evaluation activity carried out by non-experts (mostly in science administration)
Category mistakes (JIF misuse in author-level (individual) evaluations, „aggregate IF")
Blind application (context- and problem-insensitive)
Comparing apples with oranges (raw, unnormalized, size-dependent etc. indicators)
Single-number obsession (Hirsch index; a computation sketch follows below)
Metric-based evaluation absolutized
Wave 2 further characteristics:
Selective on access (commercialized indicators vs. open access indicators)
Conservative („obsession" with IF, Hirsch index, raw citation counts)
Self-service
Theoretically non-grounded or unsound
Decontextualized (mechanistic application of metrics, not fitted to the assessment problem)
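Since the Hirsch index recurs throughout the deck, a minimal Python sketch of its definition: the largest h such that h of the author's papers have at least h citations each. The citation counts below are invented for illustration.

    def h_index(citations):
        """Largest h such that h papers have at least h citations each."""
        ranked = sorted(citations, reverse=True)        # most-cited papers first
        h = 0
        for rank, cites in enumerate(ranked, start=1):  # 1-based rank
            if cites >= rank:
                h = rank
            else:
                break
        return h

    # Invented example: seven papers with these citation counts -> h = 4
    print(h_index([25, 8, 5, 4, 3, 0, 0]))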

First wave effect: responses within academia
Pressure and strategic behavior in response to the „IF and Hirsch culture":
Goal displacement
Task reduction
Salami publishing
Strategic collaborations (to boost individual metrics): citation farming
Gaming
Hiring of researchers
Declarations: DORA. „The San Francisco Declaration on Research Assessment (DORA), initiated at the 2012 Annual Meeting of the American Society for Cell Biology by a group of editors and publishers of scholarly journals, recognizes the need to improve the ways in which the outputs of scientific research are evaluated."

First wave effect: responses within academia

Meanwhile in professional bibliometrics
Performance dimension: scientific impact
Theoretical level: sociology of science (theory of citation) + statistical models of the citation process
Indicators (measurement): # citations, IF, … > normalized, „model-based" impact metrics

Meanwhile in professional bibliometrics
Dimensions of performance and commensurability:
Scientific impact: statistical characterization of citation distributions; models of citation dynamics; main factors of citedness (for papers): fields, doctypes, age; normalization for these factors > normalized indicators (MNCS, PP10% etc.; MNCS sketched below)
Output and productivity: size dependencies; activity indexes
Collaboration: co-authorship indicators
Quality: journal metrics (SJR, SNIP, Eigenfactor, AIS etc.)
Profile: interdisciplinarity metrics (IDR)
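To make the normalization step concrete, a minimal sketch of the MNCS idea: each paper's citation count is divided by the expected (average) citation count for its field, publication year, and document type, and the ratios are averaged; a value of 1 means world average. The baselines and papers below are invented; in practice they come from the full citation database.

    def mncs(papers, expected):
        """Mean Normalized Citation Score: average of citations divided by the
        expected citations for each paper's field, year and document type."""
        ratios = [p["citations"] / expected[(p["field"], p["year"], p["doctype"])]
                  for p in papers]
        return sum(ratios) / len(ratios)

    # Invented baselines and papers, for illustration only:
    expected = {("chemistry", 2015, "article"): 12.0,
                ("sociology", 2015, "article"): 3.0}
    papers = [
        {"field": "chemistry", "year": 2015, "doctype": "article", "citations": 24},
        {"field": "sociology", "year": 2015, "doctype": "article", "citations": 3},
    ]
    print(mncs(papers, expected))   # (24/12 + 3/3) / 2 = 1.5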

Second wave: Metrics services
2004: Scopus (Elsevier) and Google Scholar (Google)
Related on-line metrics services (open access):
Publish or Perish (Anne Harzing), built on Google Scholar data: Hirsch index and derivatives (e-index, g-index, hI-index etc.), correcting the Hirsch index for various known biases
SCImago Journal and Country Rank (SCImago group, University of Granada, Spain): SJR metric and its quartile-based presentation (sketched below); a new quality dimension and more sound cross-field comparisons
Initial infiltrations of professional bibliometrics
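The quartile-based presentation reduces to a simple ranking rule: within one subject category, journals are sorted by the metric (SJR here) and the top 25% are labelled Q1, the next 25% Q2, and so on. The sketch below uses one common convention (quartile from rank position, ceil(4*rank/n)); the journal names and SJR values are invented.

    import math

    def quartiles(sjr_by_journal):
        """Label journals Q1..Q4 within one subject category by descending SJR."""
        ranked = sorted(sjr_by_journal.items(), key=lambda kv: kv[1], reverse=True)
        n = len(ranked)
        return {name: "Q" + str(math.ceil(4 * pos / n))
                for pos, (name, _) in enumerate(ranked, start=1)}

    # Invented journals and SJR values for one category:
    category = {"Journal A": 2.4, "Journal B": 1.1, "Journal C": 0.9,
                "Journal D": 0.5, "Journal E": 0.2}
    print(quartiles(category))
    # {'Journal A': 'Q1', 'Journal B': 'Q2', 'Journal C': 'Q3', 'Journal D': 'Q4', 'Journal E': 'Q4'}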

Second wave: Professional metrics services
ISI Thomson Reuters: InCites „research analytics" tool (WoS-based)
Elsevier: SciVal „research analytics" tool (Scopus-based)
On-line, subscription-based metrics services
„Commercialization" of professional bibliometrics

Second wave: Professional metrics services

Second wave: Professional metrics services
SciVal: „Map of research competences"
Science mapping -> „Strategic research intelligence"

Wave 2 outcome: citizen bibliometrics 2.0
„Citizen bibliometrics": bibliometrics-based evaluation activity carried out by non-experts (mostly in science administration)
Category mistakes (JIF misuse in author-level (individual) evaluations, „aggregate IF")
Blind application (context- and problem-insensitive)
Comparing apples with oranges (raw, unnormalized, size-dependent etc. indicators)
Single-number obsession (Hirsch index)
Metric-based evaluation absolutized
Wave 2 potential unexpected outcomes (professional tools „unleashed"):
Selective on access (commercialized indicators vs. open access indicators)
Diversity not accommodated: selective on usage
Conservative („obsession" with IF, Hirsch index, raw citation counts)
Self-service (commensurability violated)
Theoretically non-grounded or unsound
Decontextualized due to misinterpreted „professionalism" (mechanistic application of metrics, not fitted to the assessment problem)

Wave 2 outcome: responses within academia
Pressure and strategic behavior in response to the „metrics culture":
Goal displacement
Task reduction
Salami publishing
Strategic collaborations (to boost individual metrics): citation farming
Gaming
Hiring of researchers
Confusion concerning „publication strategies" (SJR or IF Q1?)
Trust issues: transparency and communicability reduced

Wave 2: ALTmetrics
Altmetrics: a response to the challenges of classical bibliometrics
Predecessor: webometrics (usage and on-line „referencing" of scholarly content on the Web as alternative impact measurement)
Focus of altmetrics: on-line „social" acts related to scholarly content on the web as impact indication (detected posts, blogs, reads etc.)
Priem et al., 2010: the Altmetrics manifesto
Main hypothesized benefits:
Much broader range and more dimensions of impact (social impact!)
No delay in impact manifestation (citations need years to accrue…)
Big data
Much wider coverage than citation databases: a fair business for SSH fields as well

ALTmetrics: Metrics services
„Big data" and technology opportunities

ALTmetrics vs. professional bibliometrics
Performance dimension: scientific impact
Theoretical level: sociology of science (theory of usage) + statistical models of usage
Indicators (measurement): access, appraise, apply counts > normalized, „model-based" impact metrics
Plus: data quality (e.g. persistence)
Haustein, S. (2016). Grand challenges in altmetrics: heterogeneity, data quality and dependencies. Scientometrics, 108(1), 413-423.

ALTmetrics: social impact (Scientometrics 2.0)
Policy push: the measurement of social impact
Is ALTmetrics the answer? (Wouters et al., HEFCE, 2015)

Summing up: history repeating
First wave: policy push; data supply (ISI, WoS); citizen bibliometrics; strategic behavior
Second wave: professional bibliometrics; data and ICT supply (Scopus, GS, WoS, tools, services, ALTmetrics)

Way out: good (collaborative) practices
Hicks, D., Wouters, P., Waltman, L., De Rijcke, S., & Rafols, I. (2015). The Leiden Manifesto for research metrics. Nature, 520(7548), 429-431.