Looking past the usual metrics: helping researchers demonstrate excellence to support grant applications
Dr Peter Darroch, SciVal Consultant

Halt the avalanche of performance-based metrics
“Bogus measures of ‘scientific quality’ can threaten the peer-review system.”
“The increasing dominance of quantitative research assessment threatens the subjective values that really matter in academia.”
(Colin Macilwain, World View, Nature, Vol 5)

Comment on Snowball Metrics: “I suspect that in practice, however, it will end up being used mainly to exercise yet more control over academic staff, with every aspect of their professional lives tracked on the system.”

Response from Glenn Swaford, Oxford University: “However, his portrayal of Project Snowball does not ring true with us here at Oxford. We are founding participants in this project. Snowball puts HEIs in the driver’s seat. We (not others) elect what to measure and why. Oxford is involved to assist academic-led planning at a unit level. There is, and will be, no ‘central’, much less local, use of these data to assess individuals.”

Coming up…
- Conditions for a good metric, and factors that can affect the value of citation metrics
- A model to help you select appropriate metrics
- Showcasing excellence to support, for example, a grant application
- An example of useful metrics for showcasing a senior researcher
- An example of useful metrics for showcasing a junior researcher
- What currently happens?

Health warning: using metrics is not black and white. This session is a discussion about potential uses. There are always exceptions, so always engage your brain!

Conditions for a good metric (adapted from the Scholarly Kitchen podcast, July 10th: Phil Davis, “Bibliometrics in an Age of Abundance”)
1. Transparent underlying data: can you trace the data? Is there authority and accountability in the data set? (related to 5)
2. External validity: the metric needs a theoretical connection to what you are trying to measure, which is not always clear
3. Reliable: querying several times gives the same or similar result
4. Easy to replicate: some metrics are based on complicated algorithms or calculations
5. Hard to distort: structural and human systems need to be in place to prevent distortion or gaming

Factors that can affect the value of citation metrics
- Variety in the size of entities within the data set
- Several disciplines within the data set
- Multiple publication types within the data set
- Coverage of the data source, by geography and/or discipline
- Ease of manipulation

Accounting for these factors reveals what you are actually after: the quality of performance.

A model to help you select appropriate metrics, based on four questions.

Q1: What am I trying to achieve?
This may be the most difficult part of any process. Your question or goal should drive the data and metrics that you use, not the other way round.

Example questions/goals:
- How can I show I deserve this award/grant?
- Which of several applicants would be a good fit with our existing group?
- How can I show that I should be promoted or get tenure?
- How can I attract more students/researchers to my group?

Researchers will also experience the reverse direction: funders and line managers using metrics as one input into decisions, in addition to opinion and peer review. Note that metrics don’t need to be about top-down evaluation or benchmarking.

Evaluations/showcasing can fall into three types:

1. Distinguishing between performance (“looking for the best”)
   Typical question: will I have the best chance of a positive outcome if I invest in X or in Y?
   Useful approach: an average over all publications, e.g. Citations per Publication

2. Demonstrating excellence (“showing off”)
   Typical question: how can I showcase Z to look the best possible?
   Useful approach: highlight the few top publications in a data set, e.g. Publications in Top Percentiles

3. Modeling scenarios (“fantasy academia”)
   Typical question: what if I…?
   Useful approach: depends…

Q2: What am I looking to evaluate or showcase?
- An institution / a group of institutions / a discipline within an institution
- A country / a group of countries / a discipline within a country
- A research area / a group of research areas
- A researcher / a group of researchers
- A publication set / a group of publication sets
- The awards program of a funding agency
- The effectiveness of a policy
- Etc.

Q3: How will I recognise or demonstrate good performance?
Usually, relative to peers that you have selected:
- Distinguishing: peers equivalent to, and a bit above, your status
- Demonstrating excellence: peers equivalent to, and somewhat below, your status

A few considerations that may affect your selection of peers:
- Size: of publication output, student program, or funding
- Status: recognition, league tables, reputation, influence
- Disciplinary focus
- Geographic location
- Degree of focus on research or teaching
- Comparators for your university, or for a department or academic

Q4: Which metrics could help me make my decision?
This list displays metrics being developed by SciVal.

- Productivity metrics: Scholarly Output; h-indices (h, g, m); Publication Share
- Citation Impact metrics: Citation Count; Citations per Publication; Cited Publications; h-indices (h, g, m); Field-Weighted Citation Impact; Publications in Top Percentiles; Publications in Top Journal Percentiles; Citation Share; Collaboration Impact (geographical); Academic-Corporate Collaboration Impact
- Disciplinarity metrics: Journal Count; Category Count
- Collaboration metrics: Number of Co-authors; Authorship Count; Number of Collaborating Countries; Number of Citing Countries; Collaboration (geographical); Academic-Corporate Collaboration

Showcasing excellence to support, for example, a grant application

Which metrics help me showcase performance?
(The same list of SciVal metrics under development as above.)

Productivity metrics: useful to make big entities look good; unfair if you are comparing entities of different sizes; recent publications have few citations.

Extensive Citation Impact metrics address many needs
(The same list of SciVal metrics under development as above.)

- Citation Count is a “power” metric: useful to make big entities look good, but unfair if you are comparing entities of different sizes, and recent publications have few citations.
- Citations per Publication is a size-normalized metric: useful to compare entities of different sizes, though again recent publications have few citations.
- Field-Weighted Citation Impact is very useful because it accounts for several variables that affect the metric value, and recent values do not drop. But it is not transparent for new users.
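
To make the size-normalization concrete, here is a minimal Python sketch of Citations per Publication and a toy field-weighted ratio. All papers, fields, and world-average baselines are invented for illustration; this is not Elsevier's implementation, which derives its baselines (per field, publication year, and publication type) from the full database.

```python
# Toy illustration of Citations per Publication (CPP) and a simplified
# field-weighted citation ratio. All numbers are invented.

papers = [
    {"citations": 30, "field": "oncology",    "year": 2012},
    {"citations": 4,  "field": "oncology",    "year": 2014},
    {"citations": 12, "field": "mathematics", "year": 2012},
]

# Hypothetical world-average citations for publications of the same field
# and year (the real metric also conditions on publication type).
expected = {
    ("oncology", 2012): 20.0,
    ("oncology", 2014): 5.0,
    ("mathematics", 2012): 4.0,
}

# CPP: total citations divided by number of publications (size-normalized).
cpp = sum(p["citations"] for p in papers) / len(papers)

# Field-weighted impact: each paper's citations divided by the world average
# for comparable papers, averaged over the set. 1.0 = world average.
fwci = sum(
    p["citations"] / expected[(p["field"], p["year"])] for p in papers
) / len(papers)

print(f"Citations per Publication: {cpp:.2f}")        # 15.33
print(f"Field-weighted impact:     {fwci:.2f}")       # (1.50 + 0.80 + 3.00) / 3 = 1.77
```

Note how the mathematics paper, with fewer raw citations than the 2012 oncology paper, contributes the highest field-weighted ratio: the weighting is what lets low-citation fields and recent years compare fairly.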

h-index variants emphasize different strengths (examples given for researchers only)
(The same list of SciVal metrics under development as above.)

- An h-index of 7 means that 7 of a researcher’s papers have each been cited at least 7 times. For researchers, it usefully indicates both productivity and citation impact, but it is not useful for new researchers with few citations.
- The g-index emphasizes and rewards the most highly cited papers, and is always the same as or higher than the h-index. It is good for emphasizing a researcher’s exceptional papers, but not useful for average researchers (where h = g) or for new researchers.
- The m-index is the h-index per year of publishing activity. It levels the playing field for researchers with different career lengths, but it is not useful for researchers who have had a career break.
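
All three variants are easy to compute from a sorted citation list. A minimal sketch, using a hypothetical researcher's per-paper citation counts and career start year:

```python
# h-, g- and m-index from a list of per-paper citation counts.
# The citation data and career years are hypothetical.

def h_index(citations):
    """Largest h such that h papers have each been cited at least h times."""
    ranked = sorted(citations, reverse=True)
    return max((rank for rank, c in enumerate(ranked, 1) if c >= rank), default=0)

def g_index(citations):
    """Largest g such that the top g papers together have >= g^2 citations."""
    ranked = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(ranked, 1):
        total += c
        if total >= rank * rank:
            g = rank
    return g

def m_index(citations, first_pub_year, current_year):
    """h-index divided by years of publishing activity (the m-quotient)."""
    years = max(current_year - first_pub_year + 1, 1)
    return h_index(citations) / years

cites = [48, 22, 15, 9, 8, 7, 7, 1, 0]   # hypothetical researcher
print(h_index(cites))                    # 7: seven papers cited >= 7 times each
print(g_index(cites))                    # 9: rewarded by the two heavily cited papers
print(m_index(cites, 2008, 2015))        # 0.875: h spread over 8 years of activity
```

Here g > h because of the two heavily cited papers; and a researcher who started publishing in 2013 with the same h would have a much higher m.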

Not all Citation Impact metrics need the data set to have citations!
(The same list of SciVal metrics under development as above.)

- Publications in Top Journal Percentiles is a good metric for engaging researchers. It is also useful early in a strategy or career, because publications do not need their own citations; however, publications are judged on the average performance of the journal.
- Publications in Top Percentiles is useful to distinguish between entities whose averages are similar, and to show off. It is not always inclusive of average entities, and time is needed for citations to be received.
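
A minimal sketch of the Publications in Top Percentiles idea: count how many of an entity's papers clear the citation threshold for the world's top 10% most-cited papers of the same year. The thresholds and paper data below are invented; in practice the thresholds are derived from the whole database.

```python
# Share of papers in the world's top 10% by citations.
# Thresholds are hypothetical cut-offs per publication year.

top10_threshold = {2012: 25, 2013: 15, 2014: 6}

papers = [(2012, 30), (2012, 11), (2013, 18), (2014, 2), (2014, 9)]  # (year, citations)

in_top10 = sum(cites >= top10_threshold[year] for year, cites in papers)
share = 100 * in_top10 / len(papers)
print(f"{in_top10} of {len(papers)} papers ({share:.0f}%) are in the top 10%")
# -> 3 of 5 papers (60%) are in the top 10%
```

Note that the threshold shrinks for recent years (a 2014 paper needs only 6 citations here), but brand-new papers still need some time to receive citations before the metric is meaningful.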

Topical Collaboration metrics have broad value
(The same list of SciVal metrics under development as above.)

Collaboration metrics need only the affiliation information that authors have included on their publications; they do not need any citations. They are very useful, for example, at the start of a new strategy or early in a researcher’s career, when publications exist but too little time has passed for citation-based metrics to be reliable.
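
Because collaboration metrics need only affiliation data, they can be computed the moment a paper is indexed. A minimal sketch, with invented affiliation data and a hypothetical home country, of counting collaborating countries and internationally co-authored papers:

```python
# Citation-free collaboration metrics from author affiliation countries.
# Papers, titles and country codes are invented for illustration.

papers = [
    {"title": "Paper A", "countries": {"GB", "DE", "US"}},
    {"title": "Paper B", "countries": {"GB"}},
    {"title": "Paper C", "countries": {"GB", "CN"}},
]

home = "GB"  # hypothetical home country of the researcher/institution

# Distinct collaborating countries across the whole publication set.
collaborating = set().union(*(p["countries"] for p in papers)) - {home}

# Papers with at least one non-home affiliation (international collaboration).
international = sum(1 for p in papers if p["countries"] - {home})

print(f"Collaborating countries: {sorted(collaborating)}")          # ['CN', 'DE', 'US']
print(f"International collaboration: {international}/{len(papers)} papers")
```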

Showcasing the performance of a senior researcher
- There is likely a large body of work available to showcase, so publication volume and total citation counts, as well as the associated indices (h- and g-index), should work well
- Metrics to consider: Citations per Publication; top citation percentiles; top journal percentiles; h- and g-index; collaboration metrics
- Collaboration metrics demonstrate the expertise of the collaborators who support your research, and demonstrate your network and reach

Showcasing the performance of a junior researcher
- There is potentially a smaller body of work available to showcase
- Simple counts are perhaps not so useful, due to the lower volume and perhaps not enough time to accumulate the necessary citations
- The h- and g-index are potentially not so useful either, but consider the m-index
- Metrics to consider: Citations per Publication; top citation percentiles; top journal percentiles; m-index; collaboration metrics
- Collaboration metrics demonstrate the expertise of the collaborators who support your research, and demonstrate your network and reach

A model for selecting metrics, with four questions:
1. What am I trying to achieve? What question am I trying to answer?
2. What am I evaluating or showcasing?
3. How will I recognise good performance?
4. Which metrics will help me?

What currently happens?
- How do academics at your institution currently showcase their expertise (grant applications, PDR discussions, promotion)?
- Do they, or you, use citations or any metrics?
- How do you currently support academics in showcasing their expertise?

Thank you for your attention

Snowball Metrics are a subset of SciVal metrics
(The same list of SciVal metrics under development as above.)

Snowball Metrics are endorsed by distinguished universities. They are a manageable, convenient way to start using benchmarking data in university strategy. Other metrics allow more sophisticated analysis, and are useful for other entities.