How good is the research base? New approaches to research indicators
Colloque de l'Académie des sciences "Évolution des publications scientifiques", Paris, May 2007

What is Evidence?
- Research performance analysis and interpretation
  - Founded 2000; grew from government and HE research management
  - Studies for e.g. UUK, HEFCE, OST, Defra, EC and universities
  - Annual OSI PSA target indicators for UK science and engineering
  - Research funding and impact studies across the research base
  - Current work for Austria, New Zealand and Sweden
- Quantitative research analysis products
  - Overview of the complete research process: funding, activity and outputs in detailed, structured and mapped databases, with data reconciled to subject areas and institutions
  - Higher Education Research Yearbook (5th edition)
  - Indicator applications, publication databases, research profiling

Linking indicators to management information
- 'Average impact' is a good bibliometric index, but it is not sufficient
  - A great tool for reporting but not for action
  - The average is a metric; the distribution is a picture
- Data are skewed, so the average is not central (illustrated in the sketch below)
  - Many papers are uncited and a few papers are very highly cited
- The new approach looks at where the spread of performance falls
  - Activity is located within the distribution by more than a single metric
  - Thresholds help in describing the peak of performance
- This improves descriptive power, information content and management value
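
A minimal sketch of the skew point, using invented citation counts (none of these numbers come from the talk): the mean is dragged upward by a few highly cited papers, so most papers sit below the "average impact".

    # Why the average of a skewed citation distribution is not central.
    # The citation counts below are invented for illustration.
    import statistics

    citations = [0, 0, 0, 1, 1, 2, 2, 3, 4, 6, 10, 55]

    mean = statistics.mean(citations)      # dragged upward by one outlier
    median = statistics.median(citations)  # a more "typical" paper

    below = sum(1 for c in citations if c < mean)
    print(f"mean = {mean:.1f}, median = {median}")                   # mean = 7.0, median = 2.0
    print(f"{below} of {len(citations)} papers sit below the mean")  # 10 of 12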

Traditional impact indicators are excellent for international reports

Bibliometrics track the increase in the UK share of world citations in response to research assessment

Impact index is coherent across UK grade levels (data are for core science disciplines; grades at RAE96)

Chemistry: two alternative bibliometric indices, both of which correlate with our mapping to RAE grade. Each data point is an institution.
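
As a hedged sketch of the kind of check behind these correlation slides (this is not Evidence Ltd's actual method, and the grades and index values below are invented), one can rank-correlate institution-level indices against RAE grades, here with scipy's spearmanr:

    # Do institution-level bibliometric indices track RAE grades?
    # All numbers are invented; this only shows the mechanics.
    from scipy.stats import spearmanr

    # One row per institution: (RAE grade mapped to a number, index A, index B).
    institutions = [
        (5.5, 1.42, 1.38),   # 5* department
        (5.0, 1.31, 1.29),
        (4.0, 1.10, 1.15),
        (4.0, 1.05, 0.98),
        (3.5, 0.91, 0.95),   # 3a department
        (3.0, 0.84, 0.80),   # 3b department
    ]
    grades = [row[0] for row in institutions]

    for name, col in [("index A", 1), ("index B", 2)]:
        values = [row[col] for row in institutions]
        rho, p = spearmanr(grades, values)
        print(f"{name}: Spearman rho = {rho:.2f} (p = {p:.3f})")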

Bibliometric impact is related to RAE2001 grade for UoA14 Biology

Assumed distribution of "research performance"

Actual distribution of data values
The variables for which we have metrics are skewed and therefore difficult to picture in a simple way.

Simplify the data picture
Scale data relative to a benchmark, then categorise (this could be done for any data set):
- All journal articles
  - Uncited articles (take out the zeroes)
  - Cited articles
    - Cited less often than the benchmark
    - Cited more often than the benchmark
      - Cited more often, but less than twice as often
      - Cited more than twice as often
        - Cited more often, but less than four times as often
        - Cited more than four times as often

Categorise the impact data
This grouping is the equivalent of a log2 transformation, and there is no place for zero values on a log scale; a sketch of the grouping in code follows.
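
A minimal sketch of that grouping, under naming and boundary conventions of my own (the slides do not spell these out): rebase each paper's citation count against a benchmark such as the world average, keep uncited papers as their own category, and bin the cited papers in doublings.

    # Rebase citation counts against a benchmark, then bin in doublings.
    # Category labels and exact bin boundaries here are my own choices.
    from collections import Counter

    def impact_category(citations: int, benchmark: float) -> str:
        if citations == 0:
            return "uncited"               # zero has no place on a log scale
        ratio = citations / benchmark      # impact rebased to the benchmark
        if ratio <= 1.0:
            return "below benchmark"
        if ratio <= 2.0:
            return "1x-2x benchmark"
        if ratio <= 4.0:
            return "2x-4x benchmark"
        return "over 4x benchmark"

    def impact_profile(papers, benchmark):
        """Share of papers per category: a picture, not a single number."""
        counts = Counter(impact_category(c, benchmark) for c in papers)
        return {cat: n / len(papers) for cat, n in counts.items()}

    # Bins above the benchmark double in width at each step, i.e. unit
    # steps in log2(ratio): hence "the equivalent of a log2 transformation".
    print(impact_profile([0, 0, 1, 2, 3, 5, 9, 21], benchmark=4.0))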

UK Impact Profile™ [10 years; 680,000 papers]
[Chart: the distribution of UK papers across citation categories, annotated with the average (RBI = 1.24), the median, the mode of the cited papers, the overall mode, and a possible "threshold of excellence".]

Implications
- Is the UK research base as good as we thought?
  - YES: the average is unchanged
  - But what lies beneath just became apparent: the 'peak' of high impact is very concentrated
- Evaluate the Impact Profile™ methodology
  - Do other countries look similar? Yes, we profiled the USA as well
  - Does it work by year and by subject? See Scientometrics, Vol. 72, No. 2 (2007), 325–344
  - How can we apply it?

Impact Profiles™ for subjects & sites: molecular biology

Impact Profiles™ for international institutes
[Chart: one profile per location: USA, EMBL, UK, France, Japan.]

Where does this take us?
- New metrics would advance utility and application
  - 'Average impact' is not indicative of the distribution
  - Should we also use e.g. the median and the mode?
  - Index the proportion of activity at thresholds of excellence (above world average, more than 4x world average, etc.)?
- Weight the categories to produce a single metric (sketched below)
  - A single metric is still useful for reporting
  - How do we treat the uncited?
  - How much do we value the truly exceptional?
- A picture has value in itself
  - It has descriptive power beyond a simple index
  - It enables rapid and transparent comparisons for the less expert
  - It helps us locate specific activity within a profile
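
One way to read "weight the categories", reusing the category labels from the earlier sketch: collapse a profile into a weighted sum of category shares. The weights below are arbitrary placeholders; how to score uncited work and how much to reward the exceptional are exactly the open questions raised on this slide.

    # Collapse an impact profile into one number via category weights.
    # These weights are illustrative only, not a recommended scheme.
    WEIGHTS = {
        "uncited": 0.0,            # open question: should uncited score zero?
        "below benchmark": 1.0,
        "1x-2x benchmark": 2.0,
        "2x-4x benchmark": 4.0,
        "over 4x benchmark": 8.0,  # open question: value of the exceptional?
    }

    def composite_score(profile: dict) -> float:
        """profile maps category -> share of papers (shares sum to 1)."""
        return sum(WEIGHTS[cat] * share for cat, share in profile.items())

    # Weighting the shape of the distribution separates two hypothetical
    # units that a single 'average impact' figure might not distinguish.
    unit_a = {"uncited": 0.2, "below benchmark": 0.4, "1x-2x benchmark": 0.3,
              "2x-4x benchmark": 0.1, "over 4x benchmark": 0.0}
    unit_b = {"uncited": 0.4, "below benchmark": 0.3, "1x-2x benchmark": 0.1,
              "2x-4x benchmark": 0.1, "over 4x benchmark": 0.1}
    print(composite_score(unit_a), composite_score(unit_b))  # 1.4 1.7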

How good is the research base? New approaches to research indicators