THE OFFICE FOR SCHOLARLY COMMUNICATION / Responsible Metrics at Kent

Presentation transcript:

Why metrics? Numbers are easy...

Why responsible metrics? Numbers can be unhelpful.
“Mine’s a 3!”
This 3? 0, 0, 0, 0, 0.1, 0.2, 0.7, 0.8, 0.83, 3
Or this 3? 3, 5, 12, 24, 67, 89, 93, 105, 213, 1980
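As a hedged aside (the two lists come from the slide above, but the rank calculation is an added illustration, not part of the original deck), a short sketch shows why the bare number 3 is uninformative: it is the largest value in one distribution and the smallest in the other.

```python
# Sketch: where does the value 3 sit in each of the slide's example distributions?
group_a = [0, 0, 0, 0, 0.1, 0.2, 0.7, 0.8, 0.83, 3]
group_b = [3, 5, 12, 24, 67, 89, 93, 105, 213, 1980]

def rank_of(value, values):
    """1-based rank of `value` when the list is sorted from largest to smallest."""
    return sorted(values, reverse=True).index(value) + 1

print(rank_of(3, group_a))  # 1  -- the top of one distribution
print(rank_of(3, group_b))  # 10 -- the bottom of the other
```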

Why responsible metrics? Numbers don’t tell the whole story, but they can help

Why responsible metrics? Bigger isn’t always better... What is most appropriate? Attention? Citation?

Metrics v Effort

San Francisco Declaration on Research Assessment
General Recommendation: Do not use journal-based metrics, such as Journal Impact Factors, as a surrogate measure of the quality of individual research articles, to assess an individual scientist’s contributions, or in hiring, promotion, or funding decisions.
For institutions:
Be explicit about the criteria used to reach hiring, tenure, and promotion decisions, clearly highlighting, especially for early-stage investigators, that the scientific content of a paper is much more important than publication metrics or the identity of the journal in which it was published.
For the purposes of research assessment, consider the value and impact of all research outputs (including datasets and software) in addition to research publications, and consider a broad range of impact measures including qualitative indicators of research impact, such as influence on policy and practice.

The Leiden Manifesto
The Leiden Manifesto brings together accepted but disparate principles of good practice in research evaluation. It represents the “distillation of best practice in metrics-based research assessment so that researchers can hold evaluators to account and evaluators can hold their indicators to account”. The ten principles are:
1. Quantitative evaluation should support qualitative, expert assessment.
2. Measure performance against the research missions of the institution, group or researcher.
3. Protect excellence in locally relevant research.
4. Keep data collection and analytical processes open, transparent and simple.
5. Allow those evaluated to verify data and analysis.
6. Account for variation by field in publication and citation practices.
7. Base assessment of individual researchers on a qualitative judgement of their portfolio.
8. Avoid misplaced concreteness and false precision.
9. Recognize the systemic effects of assessment and indicators.
10. Scrutinize indicators regularly and update them.

What was it designed to measure?
Journals: e.g. JIF, SNIP, SJR measures. From Clarivate Analytics: “The JIF is defined as citations to the journal in the JCR year to items published in the previous two years, divided by the total number of scholarly items, also known as citable items [articles and reviews], published in the journal in the previous two years.”
Researchers: e.g. h-index.
Article attention: e.g. altmetrics and other measures of specific interactions, such as downloads and mentions.
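To make the quoted definitions concrete, here is a minimal sketch (illustrative numbers only, not real journal or researcher data; the function names are assumptions, not from the deck) of the two-year JIF calculation and of the h-index:

```python
# Illustrative sketch only: not real journal or researcher data.

def two_year_jif(citations_in_jcr_year, citable_items_prev_two_years):
    """Citations received in the JCR year to items published in the previous two
    years, divided by the citable items (articles and reviews) published in the
    journal in those two years."""
    return citations_in_jcr_year / citable_items_prev_two_years

def h_index(citation_counts):
    """Largest h such that h outputs each have at least h citations."""
    ranked = sorted(citation_counts, reverse=True)
    return sum(1 for rank, cites in enumerate(ranked, start=1) if cites >= rank)

print(two_year_jif(450, 150))        # 3.0
print(h_index([10, 8, 5, 4, 3, 0]))  # 4 -- four outputs with at least 4 citations each
```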

Appropriate use

Avoiding over-precision: 1.73205081 or 2? Square root of 3? Minutes to lunch?
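A small sketch of the over-precision point (only the square root of 3 comes from the slide; the “minutes to lunch” value is an assumed illustration): the same digits can be justified precision for a mathematical constant but false precision for an everyday estimate or a noisy indicator.

```python
import math

# Square root of 3: every digit is meaningful, so eight decimal places are justified.
print(f"{math.sqrt(3):.8f}")      # 1.73205081

# "Minutes to lunch": quoting the same eight decimal places implies accuracy
# the estimate cannot support; rounding to a whole number is the honest report.
minutes_to_lunch = 1.73205081     # assumed value for illustration
print(round(minutes_to_lunch))    # 2
```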

The time factor

Context is vital. Largest bodies of water...

Metrics may not be appropriate, and that is fine. New measures are emerging all the time, e.g. the Humane Metrics Initiative:
COLLEGIALITY, which can be described as the professional practices of kindness, generosity, and empathy toward other scholars and oneself;
QUALITY, a value that demonstrates one’s originality, willingness to push boundaries, methodological soundness, and the advancement of knowledge both within one’s own discipline and among other disciplines and with the general public as well;
EQUITY, or the willingness to undertake study with social justice, equitable access to research, and the public good in mind;
OPENNESS, which includes a researcher’s transparency, candor, and accountability, in addition to the practice of making one’s research OPEN ACCESS at all stages; and
COMMUNITY, the value of being engaged in one’s community of practice and with the public at large, and also in practicing principled leadership.

For REF 2021
Outputs should be selected for inclusion based on qualitative, expert assessment, with support from quantitative indicators where appropriate.
Metrics used in relation to a research output should relate directly to that output, not to the researcher or the publication it is part of.
Metrics regarding an output should be considered in relation to the context of the article, taking into consideration factors such as career stage, gender, language of publication and date of publication, even within a UOA context.
Ensure metrics reflect the reach of the work: check KAR to ensure all research works are recorded there correctly.
Maximise metrics by maximising the visibility of research: make outputs Open Access (pre-print, green, gold, …) as soon as possible, and report/reference Open Data in the article. Contact osc@kent.ac.uk for specific advice or queries.
Encourage researchers to register for and use an ORCID iD to ensure consistent, reliable attribution of work.
Work with the Research Excellence Team to ensure details in Scopus are accurate; this is particularly important if researchers have recently changed name or institution.
