THE ROLE OF CITATION ANALYSIS IN RESEARCH EVALUATION
Philip Purnell
September 2010
HOW DO WE EVALUATE RESEARCH?
Research grants
– Number and value
Prestigious awards
– Nobel Prizes
Patents
– Demonstrating innovative research
Faculty
– Number of post-graduate researchers
Citation analysis
– Publication and citation counts
– Normalised by benchmarks
Peer evaluation
– Expensive, time-consuming and subjective
A BRIEF HISTORY OF THE CITATION INDEX
Concept first developed by Dr Eugene Garfield
– Science, 1955
The Science Citation Index (1963)
– SCI in print (1960s)
– Online with SciSearch in the 1970s
– CD-ROM in the 1980s
– Web interface as Web of Science (1997)
Content enhanced:
– Social Sciences Citation Index (SSCI)
– Arts & Humanities Citation Index (AHCI)
The Citation Index
– Primarily developed for information retrieval
– The development of electronic media and powerful search tools has increased its use and popularity for research evaluation
WEB OF SCIENCE JOURNAL SELECTION POLICY
Why do we select journals?
WHY NOT INDEX ALL JOURNALS?
40% of the journals account for:
– 80% of the publications
– 92% of cited papers
4% of the journals account for:
– 30% of the publications
– 51% of cited papers
HOW TO DECIDE WHICH JOURNALS TO INDEX
Approx. journals evaluated annually
– 10-12% accepted
Thomson Reuters editors
– Information professionals
– Librarians
– Experts in the literature of their subject area
[Diagram: journals under evaluation are assessed for journal 'quality' before entering Web of Science]
THOMSON REUTERS JOURNAL SELECTION POLICY
Publishing standards
– Peer review, editorial conventions
Editorial content
– Addition to knowledge in a specific subject field
Diversity
– International and regional influence of authors, editors and advisors
Citation analysis
– Editors' and authors' prior work
GLOBAL RESEARCH REPRESENTATION: WEB OF SCIENCE COVERAGE

Region               Journals in Web of Science   Share
Europe               5,573                        49%
North America        4,251                        38%
Asia-Pacific         965                          9%
Latin America        272                          2%
Middle East/Africa   200                          1%

Language             Journals in Web of Science
English              %
Other                %
SUMMARY: CONSISTENCY IS THE KEY TO VALIDITY
– Analyses are based on authoritative, consistent data from the world's leading provider of research evaluation solutions
– Over the last 50 years Thomson Reuters has developed a selection policy designed to hand-pick the relevant journals containing the core content across the full range of scholarly disciplines
– This has created a large set of journals containing comparable papers and citations
– Thomson Reuters has always applied one consistent editorial policy: index all journals cover-to-cover, index all authors and index all addresses
– This consistency makes Web of Science the only suitable data source for citation analysis
GOVERNMENTS AND INSTITUTIONS USING TR DATA FOR EVALUATION (INCLUDING)
– Germany: IFQ, Max Planck Society, DKFZ, MDC
– Netherlands: NWO & KNAW
– France: Min. de la Recherche, OST (Paris), CNRS
– United Kingdom: King's College London; HEFCE
– European Union: EC DG XII (Research Directorate)
– US: NSF biennial Science & Engineering Indicators report (since 1974)
– Canada: NSERC, FRSQ (Quebec), Alberta Research Council
– Australia: Australian Academy of Science; government lab CSIRO
– Japan: Ministry of Education; Ministry of Economy, Trade & Industry
– People's Republic of China: Chinese Academy of Sciences
– Times Higher Education: World University Rankings (from 2010)
EVALUATING COUNTRIES
SCIENTIFIC RESEARCH IMPACT IN CENTRAL EUROPE
Source: Thomson Reuters InCites
OUTPUT AND PRODUCTIVITY: BULGARIAN RESEARCH
COMPARATIVE IMPACT IN SELECTED FIELDS BETWEEN COUNTRIES
Source: Thomson Reuters InCites
BULGARIAN RESEARCH: RELATIVE PRODUCTIVITY BY FIELD
– 22% of Bulgarian papers are in Chemistry
– <1% of Bulgarian papers are in Psychiatry
Source: Thomson Reuters InCites
EVALUATING INSTITUTIONS
Source: Thomson Reuters North America University Science Indicators
CITATIONS PER PAPER: MATHEMATICS
Source: Thomson Reuters InCites
COMPARISON OF TOP MATHEMATICS INSTITUTES AROUND THE WORLD
Source: Thomson Reuters InCites
WITH WHOM DOES OUR FACULTY COLLABORATE?
Source: Thomson Reuters InCites
WHICH COLLABORATIONS ARE THE MOST VALUABLE?
Collaborations with these institutions have produced highly cited papers within their subject fields
Source: Thomson Reuters InCites
EVALUATING JOURNALS
CALCULATING THE 2009 IMPACT FACTOR: JOURNAL OF CONTAMINANT HYDROLOGY
Citations in 2009
– to items published in 2008 = 153
– to items published in 2007 = 239
– Sum = 392
Citable items
– published in 2008 = 97
– published in 2007 = 98
– Sum = 195
2009 Impact Factor = 392 / 195 = 2.01
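The same arithmetic can be sketched in a few lines of Python. This is only an illustration of the two-year calculation above, using the figures from the slide; it is not a Thomson Reuters tool, and the function name is made up for the example.

```python
# A minimal sketch of the two-year Journal Impact Factor calculation,
# using the Journal of Contaminant Hydrology figures shown above.

def impact_factor(citations_by_year, items_by_year):
    """Citations in the census year to items from the two prior years,
    divided by the number of citable items published in those years."""
    return sum(citations_by_year.values()) / sum(items_by_year.values())

citations_2009 = {2008: 153, 2007: 239}  # citations received in 2009
citable_items = {2008: 97, 2007: 98}     # citable items published

print(round(impact_factor(citations_2009, citable_items), 2))  # prints 2.01
```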
JOURNAL IMPACT FACTOR: SELECTED CHEMISTRY JOURNALS
Source: Thomson Reuters Journal Citation Reports
USING THE IMPACT FACTOR: EVALUATING JOURNALS
Appropriate use
– To evaluate journals within a subject field
Misuse
– Comparison of journals from different fields
– Evaluation of individual articles
– Evaluation of an institution or researcher
USING THE IMPACT FACTOR: MISUSE FOR EVALUATING INDIVIDUAL PAPERS
– Journal Impact Factor = 2.01
– Yet 30% of articles in Food Policy were not cited at all
BENCHMARK YOUR PAPERS AGAINST GLOBAL AVERAGES: IS THIS A HIGHLY CITED PAPER?
– Hematology articles from the same year have been cited 18.83 times on average
– This article is ranked in the 12.92nd percentile in its field by citations
– Articles published in Blood from 2004 have been cited 34.30 times on average
– This paper has received 40 / 34.30 = 1.17 times the expected citations for its journal
– This paper has received 40 / 18.83 = 2.12 times the expected citations for its subject category
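A minimal sketch of the benchmarking arithmetic above, assuming the slide's baseline figures; the function and variable names are illustrative only and not part of any InCites API.

```python
# Illustrative paper-level benchmarking using the slide's figures.
# A ratio above 1 means the paper is cited more than expected.

def relative_impact(actual_citations, expected_citations):
    """Ratio of a paper's citations to a baseline (journal or field average)."""
    return actual_citations / expected_citations

paper_citations = 40
journal_baseline = 34.30    # average citations to Blood articles from 2004
category_baseline = 18.83   # average citations to Hematology articles

print(round(relative_impact(paper_citations, journal_baseline), 2))   # 1.17
print(round(relative_impact(paper_citations, category_baseline), 2))  # 2.12
```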
EVALUATING INDIVIDUALS
HOW CAN WE COMPARE RESEARCHERS?
– Author A: 60 papers
– Author B: 117 papers
Source: Thomson Reuters InCites
OBTAIN MULTIPLE MEASURES
RECOGNIZE THE SKEWED NATURE OF CITATION DATA
Citation distributions are always skewed (see the sketch below)
– A few highly cited papers
– The majority cited little or not at all
Distribution type
– Always distorted
– Human decision (e.g. criticality)
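To illustrate the point, here is a toy Python example with hypothetical citation counts (not Web of Science data): a couple of highly cited papers pull the mean far above the median, which is why averages alone can mislead.

```python
# Toy example of a skewed citation distribution: a few highly cited papers
# pull the mean far above the median, so an average (like a journal Impact
# Factor) does not describe a typical paper. Counts are hypothetical.
from statistics import mean, median

citations = [0, 0, 0, 1, 1, 2, 2, 3, 4, 5, 8, 12, 45, 110]

print(mean(citations))                  # about 13.8, inflated by two outliers
print(median(citations))                # 2.5, a more typical paper
print(sum(c == 0 for c in citations))   # 3 papers were never cited
```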
SUMMARY (I): TREAT AS A SCIENTIFIC STUDY
– Ask whether the results are reasonable
– Follow the scientific process for evaluating data
– Apply scientific skepticism
SUMMARY (II): HOW DO WE EVALUATE RESEARCH?
Research grants
– Number and value
Prestigious awards
– Nobel Prizes
Patents
– Demonstrating innovative research
Faculty
– Number of post-graduate researchers
Citation analysis
– Publication and citation counts
– Normalised by benchmarks
Peer evaluation
– Expensive, time-consuming and subjective
THANK YOU
Philip Purnell
September 2010