1
THE ROLE OF CITATION ANALYSIS IN RESEARCH EVALUATION
Philip Purnell
September 2010
2
HOW DO WE EVALUATE RESEARCH?
- Research grants – number and value
- Prestigious awards – Nobel Prizes
- Patents – demonstrating innovative research
- Faculty – number of post-graduate researchers
- Citation analysis – publication and citation counts, normalised by benchmarks
- Peer evaluation – expensive, time-consuming and subjective
3
A BRIEF HISTORY OF THE CITATION INDEX
- Concept first developed by Dr Eugene Garfield – Science, 1955
- The Science Citation Index (1963)
  – SCI print (1960s)
  – Online with SciSearch in the 1970s
  – CD-ROM in the 1980s
  – Web interface (1997): Web of Science
- Content enhanced:
  – Social Sciences Citation Index (SSCI)
  – Arts & Humanities Citation Index (AHCI)
- The Citation Index was primarily developed for information retrieval; the development of electronic media and powerful searching tools has increased its use and popularity for research evaluation
4
WEB OF SCIENCE JOURNAL SELECTION POLICY
Why do we select journals?
5
WHY NOT INDEX ALL JOURNALS?
- 40% of the journals account for 80% of the publications and 92% of cited papers
- 4% of the journals account for 30% of the publications and 51% of cited papers
6
HOW TO DECIDE WHICH JOURNALS TO INDEX
- Approx. 2,000 journals evaluated annually; 10-12% accepted
- Thomson Reuters editors:
  – Information professionals
  – Librarians
  – Experts in the literature of their subject area
[Diagram: journals under evaluation are assessed for journal 'quality' before entering Web of Science]
7
THOMSON REUTERS JOURNAL SELECTION POLICY
- Publishing standards – peer review, editorial conventions
- Editorial content – addition to knowledge in a specific subject field
- Diversity – international and regional influence of authors, editors, advisors
- Citation analysis – editors' and authors' prior work
8
GLOBAL RESEARCH REPRESENTATION: WEB OF SCIENCE COVERAGE

Region               # Journals   Share
Europe                    5,573     49%
North America             4,251     38%
Asia-Pacific                965      9%
Latin America               272      2%
Middle East/Africa          200      1%

Language   # Journals   Share
English         9,114     81%
Other           2,147     19%
9
SUMMARY: CONSISTENCY IS THE KEY TO VALIDITY
- Analyses are based on authoritative, consistent data from the world's leading provider of research evaluation solutions
- Thomson Reuters has developed a selection policy over the last 50 years designed to hand-pick the relevant journals containing the core content over the full range of scholarly disciplines
- This has created a large set of journals containing comparable papers and citations
- Thomson Reuters has always had one consistent editorial policy: index all journals cover-to-cover, index all authors and index all addresses. This unique consistency makes Web of Science the only suitable data source for citation analysis
10
GOVERNMENTS AND INSTITUTIONS USING TR DATA FOR EVALUATION (INCL.)
- Germany: IFQ, Max Planck Society, DKFZ, MDC
- US: NSF – biennial Science & Engineering Indicators report (since 1974)
- Netherlands: NWO & KNAW
- France: Min. de la Recherche, OST - Paris, CNRS
- United Kingdom: King's College London; HEFCE
- European Union: EC's DG XII (Research Directorate)
- Canada: NSERC, FRSQ (Quebec), Alberta Research Council
- Australia: Australian Academy of Science, government lab CSIRO
- Japan: Ministry of Education, Ministry of Economy, Trade & Industry
- People's Republic of China: Chinese Academy of Science
- Times Higher Education: World University Rankings (from 2010)
11
EVALUATING COUNTRIES
12
SCIENTIFIC RESEARCH IMPACT IN CENTRAL EUROPE
Source: Thomson Reuters InCites
13
OUTPUT AND PRODUCTIVITY: BULGARIAN RESEARCH 1998-2008
14
COMPARATIVE IMPACT IN SELECTED FIELDS BETWEEN COUNTRIES
Source: Thomson Reuters InCites
15
BULGARIAN RESEARCH: RELATIVE PRODUCTIVITY BY FIELD
22% of Bulgarian papers are in Chemistry; <1% are in Psychiatry
Source: Thomson Reuters InCites
16
EVALUATING INSTITUTIONS
17
Source: Thomson Reuters North America University Science Indicators
18
CITATIONS PER PAPER: MATHEMATICS
Source: Thomson Reuters InCites
19
COMPARISON OF TOP MATHEMATICS INSTITUTES AROUND THE WORLD
Source: Thomson Reuters InCites
20
WITH WHOM DOES OUR FACULTY COLLABORATE?
Source: Thomson Reuters InCites
21
WHICH COLLABORATIONS ARE THE MOST VALUABLE?
Collaborations with these institutions have produced highly cited papers within their subject fields
Source: Thomson Reuters InCites
22
EVALUATING JOURNALS
23
CALCULATING THE 2009 IMPACT FACTOR: JOURNAL OF CONTAMINANT HYDROLOGY
Citations in 2009:
- to items published in 2008 = 153
- to items published in 2007 = 239
- sum = 392
Citable items:
- published in 2008 = 97
- published in 2007 = 98
- sum = 195
Impact factor = 392 / 195 = 2.01
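The two-year impact factor arithmetic above can be sketched in a few lines (a minimal illustration using the Journal of Contaminant Hydrology figures from this slide):

```python
# 2009 impact factor: citations in 2009 to items from the two prior years,
# divided by the number of citable items published in those two years.
citations_received_2009 = {2008: 153, 2007: 239}  # citations in 2009, by source year
citable_items = {2008: 97, 2007: 98}              # items published, by year

impact_factor = sum(citations_received_2009.values()) / sum(citable_items.values())
print(f"2009 impact factor = {impact_factor:.2f}")  # 2009 impact factor = 2.01
```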
24
JOURNAL IMPACT FACTOR: SELECTED CHEMISTRY JOURNALS
Source: Thomson Reuters Journal Citation Reports
25
USING THE IMPACT FACTOR: EVALUATING JOURNALS
- Appropriate use: to evaluate journals within a subject field
- Misuse:
  – Comparison of journals from different fields
  – Evaluation of individual articles
  – Evaluation of an institution or researcher
26
USING THE IMPACT FACTOR
MISUSE: EVALUATING INDIVIDUAL PAPERS
30% of articles in Food Policy were not cited at all
Journal impact factor = 2.01
27
BENCHMARK YOUR PAPERS AGAINST GLOBAL AVERAGES: IS THIS A HIGHLY CITED PAPER?
- Hematology articles from this year have been cited 18.83 times on average
- Articles published in 'Blood' from 2004 have been cited 34.30 times on average
- This article is ranked in the 12.92nd percentile in its field by citations
- This paper has received 40/34.30 = 1.17 times the expected citations for this journal
- This paper has received 40/18.83 = 2.12 times the expected citations for this subject category
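The benchmarking arithmetic above reduces to two ratios of the paper's citation count against its baselines (a sketch using the figures from this slide):

```python
# Normalise a paper's citation count against its journal and subject-category
# baselines; a ratio above 1 means the paper outperforms the expectation.
paper_citations = 40
journal_baseline = 34.30   # mean citations, 'Blood' articles from 2004
category_baseline = 18.83  # mean citations, Hematology articles from the same year

journal_ratio = paper_citations / journal_baseline
category_ratio = paper_citations / category_baseline
print(f"vs journal:  {journal_ratio:.2f}x")   # vs journal:  1.17x
print(f"vs category: {category_ratio:.2f}x")  # vs category: 2.12x
```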
28
EVALUATING INDIVIDUALS
29
HOW CAN WE COMPARE RESEARCHERS?
Author A: 60 papers. Author B: 117 papers
Source: Thomson Reuters InCites
30
OBTAIN MULTIPLE MEASURES
31
RECOGNIZE THE SKEWED NATURE OF CITATION DATA
- Citation distributions are always skewed: a few highly cited papers; the majority cited little or not at all
- Distribution type: always distorted by human decisions, e.g. criticality
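The skew described above is why a simple average misleads: a handful of highly cited papers dominates the mean. A small sketch with invented citation counts (synthetic data, for illustration only) makes the point:

```python
# Synthetic citation counts for 100 papers: most cited little or not at all,
# two cited very heavily. The mean is pulled far above the median.
counts = [0]*30 + [1]*25 + [2]*20 + [5]*15 + [20]*8 + [150]*2

mean = sum(counts) / len(counts)
median = sorted(counts)[len(counts) // 2]
print(f"mean = {mean:.2f}, median = {median}")  # mean = 6.00, median = 1
```

Benchmarks normalised by field and year (as in the earlier slides) are one way to work around this, rather than comparing raw averages.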
32
SUMMARY (I): TREAT AS A SCIENTIFIC STUDY
- Ask whether the results are reasonable
- Follow scientific process for evaluating data
- Apply scientific skepticism
33
SUMMARY (II): HOW DO WE EVALUATE RESEARCH?
- Research grants – number and value
- Prestigious awards – Nobel Prizes
- Patents – demonstrating innovative research
- Faculty – number of post-graduate researchers
- Citation analysis – publication and citation counts, normalised by benchmarks
- Peer evaluation – expensive, time-consuming and subjective
34
THANK YOU
Philip Purnell
September 2010