OST Workshop 12 May 2014, Paris The debate on uses and consequences of STI indicators Paul Wouters, Sarah de Rijcke and Ludo Waltman, Centre for Science and Technology Studies (CWTS)

Debate so far

“The variety of available bibliographic databases and the tempestuous development of both hardware and software have essentially contributed to the great rise of bibliometric research in the 80-s. In the last decade an increasing number of bibliometric studies was concerned with the evaluation of scientific research at the macro- and meso-level. Different database versions and a variety of applied methods and techniques have resulted sometimes in considerable deviations between the values of science indicators produced by different institutes.”
— Glänzel (1996), “The need for standards in bibliometric research and technology”, Scientometrics, 35(2), p. 167.

Applications of citation analysis
Citation analysis has four main applications:
– Qualitative and quantitative evaluation of scientists, publications, and scientific institutions
– Reconstruction and modeling of the historical development of science and technology
– Information search and retrieval
– Knowledge organization

Why standards?
– Bibliometrics is increasingly used in research assessment
– Data & indicators for assessment are widely available
– Some methods are blackboxed in database-linked services (Thomson Reuters as well as Elsevier)
– No consensus in the bibliometric community
– Bewildering number of indicators and data options
– The bibliometric knowledge base is not easily accessible
– Ethical and political responsibility is distributed and cannot be ignored

State of affairs on standards
– “End users” demand clarity from bibliometric experts about the best way to assess quality and impact
– Most bibliometric research is focused on creating more diversity rather than on pruning the tree of data and indicator options
– The bibliometric community does not yet have a professional channel to organize its professional, ethical, and political responsibility
– We lack a code of conduct for research evaluation, although assessments may have strong implications for human careers and lives

Three types of standards
– Data standards
– Indicator standards
– Standards for good evaluation practices

Data standards
– Choice of data sources
– Selection of documents from those sources
– Data cleaning, citation matching and linking issues
– Definition and delineation of fields and peer groups for comparison
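To illustrate the data cleaning and citation matching issues listed above, here is a minimal, hypothetical sketch of rule-based reference matching: a cited reference is linked to a source record when the normalized first author, publication year, volume, and first page all agree. Real matching pipelines are considerably more elaborate and more tolerant of errors; the function names and records below are invented for illustration only.

```python
import re
from typing import Optional

def norm_author(name: str) -> str:
    """Crude normalization: lowercase, strip punctuation, keep surname + first initial."""
    cleaned = re.sub(r"[^\w\s]", "", name.lower()).strip()
    parts = cleaned.split()
    return f"{parts[0]} {parts[1][0]}" if len(parts) > 1 else parts[0]

def match_reference(reference: dict, records: list) -> Optional[dict]:
    """Link a cited reference to a source record when author, year, volume and first page agree."""
    for record in records:
        if (norm_author(reference["author"]) == norm_author(record["author"])
                and reference["year"] == record["year"]
                and reference["volume"] == record["volume"]
                and reference["first_page"] == record["first_page"]):
            return record
    return None  # unmatched references are a known source of citation loss

# Invented example: a messy all-caps reference string still matches the clean record.
records = [{"author": "Glanzel, W.", "year": 1996, "volume": 35, "first_page": 167,
            "title": "The need for standards in bibliometric research and technology"}]
cited_ref = {"author": "GLANZEL W", "year": 1996, "volume": 35, "first_page": 167}
print(match_reference(cited_ref, records) is not None)  # True
```

Matching rules like these are one place where database-linked services differ silently, which is exactly why the slide lists them among the data standards that need to be made explicit.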

Indicator standards
– Choice of level of aggregation (nations; programs; institutes; groups; principal investigators; individual researchers)
– Choice of dimension to measure (size, activity, impact, collaboration, quality, feasibility, specialization)
– Transparency of construction
– Visibility of uncertainty and of sources of error
– Specific technical issues:
  – Size: fractionalization; weighting
  – Impact: averages vs percentiles; field normalization; citation window
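The technical choices listed under “Impact” can be made concrete with a small calculation. The following sketch uses invented paper records and field baselines rather than anything from the presentation; it contrasts an average-based indicator (each paper's citations divided by the expected value for its field and year, then averaged) with a percentile-style indicator (the share of papers at or above a field-specific top-10% threshold).

```python
from statistics import mean

# Toy records: (citations, field) for one unit's publications in a single year.
# All numbers are invented for illustration.
papers = [
    (12, "chemistry"), (3, "chemistry"), (0, "chemistry"),
    (25, "sociology"), (1, "sociology"),
]

# Hypothetical world-average citations per paper, by field (the "expected" value).
field_baseline = {"chemistry": 8.0, "sociology": 4.0}

# Average-based indicator: mean of (observed / expected) over all papers.
mncs = mean(c / field_baseline[f] for c, f in papers)

# Percentile-style indicator: share of papers at or above a hypothetical
# field-specific top-10% citation threshold.
top10_threshold = {"chemistry": 20, "sociology": 15}
pp_top10 = mean(1.0 if c >= top10_threshold[f] else 0.0 for c, f in papers)

print(f"average-based score ≈ {mncs:.2f}")   # > 1 means above the field baselines
print(f"share in top 10% ≈ {pp_top10:.0%}")
```

Note how the single highly cited sociology paper dominates the average-based score but counts only once in the percentile indicator; this sensitivity to outliers is one of the reasons the averages-versus-percentiles choice belongs in an indicator standard.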

Standards for good evaluation practices
– When is it appropriate to use bibliometric data and methods?
– The balance between bibliometrics, peer review, expert review, and other assessment methodologies (e.g. Delphi)
– The transparency of the assessment method (from beginning to end)
– The accountability of the assessors
– The way attempts to manipulate bibliometric measures are handled (citation cartels; journal self-citations)
– Clarity about the responsibilities of researchers, assessors, university managers, database providers, etc.

Preconference STI-ENID workshop, 2 September 2014
– Advantages and disadvantages of different types of bibliometric indicators. Which types of indicators are to be preferred, and how does this depend on the purpose of a bibliometric analysis? Should multiple indicators be used in a complementary way?
– Advantages and disadvantages of different approaches to the field-normalization of bibliometric indicators, e.g. cited-side and citing-side approaches.
– The use of techniques for statistical inference, such as hypothesis tests and confidence intervals, to complement bibliometric indicators.
– Journal impact metrics. Which properties should a good journal impact metric have? To what extent do existing metrics (IF, Eigenfactor, SJR, SNIP) have these properties? Is there a need for new metrics? How can journal impact metrics be used in a proper way?
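To make the journal-level discussion concrete, a two-year impact factor of the IF type can be written down in a few lines: citations received in a census year to items published in the two preceding years, divided by the number of citable items in those years. The sketch below uses invented counts for a hypothetical journal; SNIP, SJR and the Eigenfactor modify this basic ratio by normalizing for field citation behaviour or by weighting citations by the citing journal.

```python
def two_year_impact_factor(cites_by_cited_year, items_by_year, census_year):
    """IF-style ratio: citations received in census_year to items published in the
    two preceding years, divided by the number of citable items in those years."""
    cites = cites_by_cited_year[census_year - 1] + cites_by_cited_year[census_year - 2]
    items = items_by_year[census_year - 1] + items_by_year[census_year - 2]
    return cites / items

# Invented counts for a hypothetical journal.
cites_received_2013 = {2011: 480, 2012: 320}   # citations received in 2013, by cited year
items_published = {2011: 210, 2012: 190}       # citable items published per year

print(two_year_impact_factor(cites_received_2013, items_published, 2013))  # 2.0
```

Even this simple ratio hides standards questions: what counts as a citable item in the denominator, and how long a citation window is appropriate for slowly citing fields.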

An example of a problem in standards for assessment

Example: individual-level bibliometrics
– Involves all three forms of standardization
– Not in the first place a technical problem, but does have many technical aspects
– Glänzel & Wouters (2013) presented 10 dos and don'ts of individual-level bibliometrics
– Moed (2013) presented a matrix/portfolio approach
– ACUMEN (2014) presented the design of a web-based research portfolio

The aim is to give researchers a voice in evaluation:
➡ evidence-based arguments
➡ shift to a dialog orientation
➡ selection of indicators
➡ narrative component
➡ Good Evaluation Practices
➡ envisioned as a web service
(Portfolio diagram: expertise, output, influence, narrative)

ACUMEN Portfolio
Career Narrative: links expertise, output, and influence together in an evidence-based argument; the included content is negotiated with the evaluator and tailored to the particular evaluation.
Output: publications, public media, teaching, web/social media, data sets, software/tools, infrastructure, grant proposals.
Expertise: scientific/scholarly, technological, communication, organizational, knowledge transfer, educational.
Influence: on science, on society, on economy, on teaching.
Evaluation Guidelines:
– aimed at both researchers and evaluators
– development of evidence-based arguments (what counts as evidence?)
– expanded list of research output
– establishing provenance
– taxonomy of indicators: bibliometric, webometric, altmetric
– guidance on use of indicators
– contextual considerations, such as stage of career, discipline, and country of residence
Tatum & Wouters | 14 November 2013
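Since the portfolio is envisioned as a web service, its structure can be sketched as a simple data model. The sketch below only mirrors the four components named on the slide (career narrative, expertise, output, influence); the class and attribute names, and the example values, are assumptions for illustration and not the actual ACUMEN implementation.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Evidence:
    """One evidenced claim: an indicator value plus its provenance."""
    indicator: str   # e.g. a bibliometric, webometric or altmetric indicator
    value: str
    source: str      # provenance: where the number or claim comes from

@dataclass
class Portfolio:
    """Sketch of the four portfolio components named on the slide; names are illustrative only."""
    career_narrative: str                                     # evidence-based argument, tailored to the evaluation
    expertise: List[str] = field(default_factory=list)        # scientific, technological, communication, ...
    output: List[str] = field(default_factory=list)           # publications, data sets, software, ...
    influence: List[Evidence] = field(default_factory=list)   # on science, society, economy, teaching

portfolio = Portfolio(
    career_narrative="Links expertise, output and influence together in an evidence-based argument.",
    expertise=["scientific/scholarly", "knowledge transfer"],
    output=["publications", "data sets", "software/tools"],
    influence=[Evidence("citations to key publications", "above field average", "citation database")],
)
print(portfolio.career_narrative)
```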

Portfolio & Guidelines ➡ Instrument for empowering researchers in the processes of evaluation ➡ Taking in to consideration all academic disciplines ➡ Suitable for other uses (e.g. career planning) ➡ Able to integrate into different evaluation systems Tatum & Wouters | 14 November 2013

What type of standardization process do we need?

Evaluation Machines
– Primary function: make stuff auditable
– Mechanization of control – degradation of work and trust? (performance paradox)
– Risks for the evaluand and defensive responses
– What are their costs, direct and indirect?
– Microquality versus macroquality – lock-in
– Goal displacement & strategic behaviour

Citation as infrastructure
– Infrastructures are not constructed but evolve
– Transparent structures taken for granted
– Supported by invisible work
– They embody technical and social standards
– The citation network includes databases, centres, publishers, guidelines

Effects of indicators
– Intended effect: behavioural change
– Unintended effects:
  – Goal displacement
  – Structural changes
– The big unknown: effects on knowledge?
– Institutional rearrangements
– Does quality go up or down?

Constitutive effects
– Limitations of conventional critiques (e.g. ‘perverse or unintended effects’)
– Effects on:
  – Interpretative frames
  – Content & priorities
  – Social identities & relations (labelling)
– Spread over time and levels
– Not a deterministic process
– Democratic role of evaluations