Citation Counting, Citation Ranking, and h-Index of HCI Researchers: Scopus vs. WoS
Lokman I. Meho and Yvonne Rogers
Network and Complex Systems, March 24, 2008

Why citation analysis?
– Study the evolution of scientific disciplines
– Examine and/or map the social, economic, political, and intellectual impact of scientific research
– Assist in certain decisions (promotion, tenure, hiring, grants, collaboration, etc.)

Research problem
– To date, most citation-based research has relied exclusively on data obtained from the Web of Science database
– The emergence of Scopus and Google Scholar has raised many questions about relying on Web of Science alone

Literature review
– Whether to use Scopus and/or Web of Science as part of a mapping or research assessment exercise may be domain-dependent; more in-depth studies are needed to verify the strengths and limitations of each source
– Scopus covers 84% of all journal titles indexed in Web of Science; Web of Science covers 54% of all journal titles indexed in Scopus
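
A quick back-of-the-envelope sketch (not from the slides) of what these reciprocal coverage figures imply: if the shared journal set is 84% of the WoS titles and 54% of the Scopus titles, Scopus must index roughly 1.5 times as many journals as WoS, which is consistent with the database sizes quoted later in the deck. Variable names are illustrative only.

```python
# Sketch: what the reciprocal coverage percentages imply about relative size.
# Assumes both percentages refer to the same set of shared journal titles.
wos_covered_by_scopus = 0.84   # fraction of WoS titles also in Scopus
scopus_covered_by_wos = 0.54   # fraction of Scopus titles also in WoS

# shared = 0.84 * wos_titles = 0.54 * scopus_titles
ratio = wos_covered_by_scopus / scopus_covered_by_wos   # scopus_titles / wos_titles
print(f"Scopus/WoS journal ratio ~ {ratio:.2f}")         # ~1.56

# Consistent with the ~9,000 WoS vs ~14,000 Scopus journals on the Databases slide.
print(f"Implied Scopus size for 9,000 WoS journals: {9000 * ratio:.0f}")  # ~14,000
```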

Research questions
– How do the two databases compare in their coverage of the HCI literature and the literature that cites it, and what are the reasons for the differences?
– What impact do the differences in coverage between the two databases have on the citation counts, citation rankings, and h-index scores of HCI researchers?
– Should one or both databases be used for determining the citation counts, citation rankings, and h-index scores of HCI researchers?

Significance/value of study
– Determine whether citation searching in HCI should be extended to both Scopus and Web of Science or limited to one of them
– Will help people who use citation analysis for research evaluation and mapping exercises justify their choice of database

Databases
Web of Science
– Approximately 9,000 journals, going back to 1955
– Books in series and an unknown number of conference proceedings, including LNCS, LNAI, and LNM
Scopus
– 14,000 journals, going back to 1996 for citations
– 500 conference proceedings
– 600 trade publications

Methods
Sample
– 22 top HCI researchers from the Equator Interdisciplinary Research Collaboration, a six-year project funded by the UK's Engineering and Physical Sciences Research Council
Publications (n=1,440, mainly conference papers and journal articles)
– 594 (41%) were covered by Scopus
– 296 (21%) were covered by Web of Science
– 647 (45%) were covered when the two databases were combined
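
Reading the three coverage figures together (a sketch, assuming the 647 figure refers to works indexed in at least one of the two databases, which is how the system-based h-index on a later slide uses it), inclusion-exclusion gives the implied number of works indexed in both:

```python
# Coverage of the sample's publications, as reported on the slide.
total_works = 1440
in_scopus = 594    # 41%
in_wos = 296       # 21%
in_either = 647    # 45%, assumed to mean "indexed in at least one database"

# Inclusion-exclusion: works indexed in BOTH databases.
in_both = in_scopus + in_wos - in_either
print(in_both)                                                               # 243
print(f"{in_either / total_works:.0%} of the works are indexed somewhere")   # 45%
```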

Methods, cont'd
Searching methods used to identify citations to the 1,440 items published/produced by the sample members:
– Scopus: (1) exact match of each item in the "References" field; (2) the "More" tab; and (3) "Author" search results plus "Cited by"
– Web of Science: cited reference search
Citation information was parsed by author, publication type, year, source name, institution, country, and language
Source names were manually standardized, and missing institutional affiliation and country information (3%) was gleaned from the web
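
The parsing and standardization step can be pictured with a minimal sketch. The CSV layout, field names, and the name-standardization map below are hypothetical illustrations, not the authors' actual tooling (the slide notes that the standardization was done manually).

```python
import csv
from collections import Counter

# Hypothetical lookup collapsing variant source names onto one canonical form.
SOURCE_MAP = {
    "Pers Ubiquit Comput": "Personal and Ubiquitous Computing",
    "Personal Ubiquitous Comput.": "Personal and Ubiquitous Computing",
    "Int J Hum-Comput St": "International Journal of Human-Computer Studies",
}

def load_citations(path):
    """Yield citation records with the fields the study tallies
    (author, publication type, year, source, institution, country, language),
    normalizing the source name on the way through."""
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            row["source"] = SOURCE_MAP.get(row["source"], row["source"])
            yield row

# Example tally of citations per standardized source name (hypothetical file):
# counts = Counter(r["source"] for r in load_citations("citations_export.csv"))
```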

Methods, cont'd
Data from both databases were cross-examined for accuracy
h-index
– Definition, strengths, and limitations
– System-based counting method (takes into account only indexed works, n=647)
– Manual-based counting method (takes into account all 1,440 works published/produced by the sample)
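
For reference, the h-index is the largest h such that the author has h works each cited at least h times. A minimal sketch of the two counting variants named above is below; the citation counts are made-up illustrations, and the only difference between the variants is which set of works feeds the calculation (the 647 indexed works vs. all 1,440 works).

```python
def h_index(citation_counts):
    """Largest h such that at least h works have h or more citations each."""
    ranked = sorted(citation_counts, reverse=True)
    h = 0
    for position, cites in enumerate(ranked, start=1):
        if cites >= position:
            h = position
        else:
            break
    return h

# Illustrative numbers only (not taken from the study).
indexed_works = [45, 30, 22, 9, 6, 2]          # citations to database-indexed works
all_works = indexed_works + [5, 4, 1, 0, 0]    # manual count adds the non-indexed works

print(h_index(indexed_works))  # system-based h-index: 5
print(h_index(all_works))      # manual h-index: 5 here; it can never be lower
```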

Results: Distribution of unique and overlapping citations
– Scopus: n=6,919 (93%); Web of Science: n=4,011 (54%)
– Unique to Scopus: 3,428 (46%); unique to Web of Science: 520 (7%); found in both: 3,491 (47%)
– WoS ∪ Scopus = 7,439*
*Excludes 255 citations from WoS published before 1996
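
All percentages on this slide are relative to the union of 7,439 citations; a small sketch reproduces them from the three regions of the overlap diagram (variable names are just for illustration).

```python
# Regions of the overlap diagram, as reported on the slide.
unique_to_scopus, unique_to_wos, in_both = 3428, 520, 3491

union = unique_to_scopus + unique_to_wos + in_both        # 7,439
scopus_total = unique_to_scopus + in_both                 # 6,919
wos_total = unique_to_wos + in_both                       # 4,011

for label, n in [("Scopus total", scopus_total), ("WoS total", wos_total),
                 ("overlap", in_both), ("unique to Scopus", unique_to_scopus),
                 ("unique to WoS", unique_to_wos)]:
    print(f"{label}: {n} ({n / union:.0%})")   # 93%, 54%, 47%, 46%, 7%
```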

Results: Reasons for the significant differences
Note: 76% of all citations found in conference proceedings were unique to a single database, compared with 34% of citations found in journals

Results: Quality of Scopus unique citing journals
Partial list of the top 20 citing journals; the number in parentheses is the journal's rank by Scopus-derived impact factor, and NA marks titles without a JCR Impact Factor:
1. Presence: Teleoperators and Virtual Environments (10)
– International Journal of Human-Computer Studies (7)
– Interacting with Computers (17)
– Computer Supported Cooperative Work (4; JCR: NA)
5T. Cyberpsychology & Behavior (13)
5T. IEEE Pervasive Computing (2)
5T. Personal and Ubiquitous Computing (12; JCR: NA)
8. Behaviour & Information Technology (19)
– Journal of the American Society for Information Science and Technology (6)
– Human-Computer Interaction (1)
– ACM Transactions on Computer-Human Interaction (3; JCR: NA)
19T. New Review of Hypermedia and Multimedia (22; JCR: NA)

Results: Quality of Scopus's citing conference proceedings (top 9 citing titles)
The number in parentheses is the title's rank by impact factor (source: Scopus); "not indexed" means the proceedings are not covered by Web of Science:
1. ACM Conference on Human Factors in Computing Systems (not indexed in WoS; IF rank 1)
2. ACM Conference on Computer Supported Cooperative Work (not indexed in WoS)
3. UbiComp: Ubiquitous Computing, Proceedings (LNCS)
4. IEEE International Conference on Pervasive Computing and Communications, PerCom (not indexed in WoS; IF rank 3)
5. Proceedings of SPIE - The International Society for Optical Engineering (not indexed in WoS)
6. ACM Conference on Human-Computer Interaction with Mobile Devices and Services, MobileHCI (LNCS)
7. IEEE Virtual Reality Conference (not indexed in WoS; IF rank 5)
8. ACM Conference on Hypertext and Hypermedia (not indexed in WoS; IF rank 4)
9. ACM Conference on Designing Interactive Systems, DIS (not indexed in WoS; 45 citations)

Differences in citation counting and ranking of individual researchers (top 12)
Researchers, in the order listed on the slide, with the difference between their Scopus and Web of Science citation counts expressed as a percentage of the WoS count:
– Rogers* (63%), Benford* (106%), Rodden* (86%), De Roure* (81%), Gaver* (65%), Friday* (86%), Schmidt (84%), Gellersen* (90%), Cheverst (66%), Steed* (65%), Chalmers* (62%), Crabtree (140%)
TOTAL: Web of Science 4,011 citations; Scopus 6,919; difference 2,908 (73%); union of WoS and Scopus 7,439
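
The "difference" percentages above appear to be the extra Scopus citations expressed as a share of the Web of Science count; the TOTAL row bears this reading out.

```python
wos_total, scopus_total = 4011, 6919

extra = scopus_total - wos_total
print(extra)                            # 2,908
print(f"{extra / wos_total:.0%}")       # 73%, matching the TOTAL row
```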

Differences in mapping scholarly impact of individual researchers: an example (Benford)
Top citing authors (64% mismatch):
– Web of Science: Pilar Herrero (10), Chris Greenhalgh (6), Ling Chen (5), Jin Zhang (5), Paul Luff (4), Minh Hong Tran (4)
– Scopus: Pilar Herrero (13), Ling Chen (10), Andy Crabtree (10), Azzedine Boukerche (8), Carl Gutwin (7)
Top citing sources (80% mismatch):
– Web of Science: Presence: Teleoperators and Virtual Environments (57), UbiComp (21), International Journal of Human-Computer Studies (15), Interacting with Computers (14), Personal and Ubiquitous Computing (11)
– Scopus: CHI Conference (58), Presence: Teleoperators and Virtual Environments (57), Int. Conf. on Collaborative Virtual Environments (32), Computer Supported Cooperative Work (31), IEEE Virtual Reality Conference (22)

Differences in mapping scholarly impact of individual researchers, cont'd (Benford)
Top citing institutions* (67% mismatch):
– Web of Science: University of Nottingham (33), University of Sussex (14), Lancaster University (11), Universidad Politécnica de Madrid (10), King's College London (8)
– Scopus: University of Nottingham (80), University of Ottawa (23), University College London (21), Zhejiang University (19), Fraunhofer-Gesellschaft (16), Georgia Institute of Technology (16), Lancaster University (16)
Top citing countries (40% mismatch):
– Web of Science: United Kingdom (158), United States (127), Germany (30), Japan (28), Australia (25)
– Scopus: United Kingdom (312), United States (234), China (69), Japan (65), Canada (52)
*The percentage of mismatch would have been higher had we removed citations from the home institution of the researcher
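
The slides do not spell out how "% mismatch" is computed. One reading that reproduces all four figures above (64%, 80%, 67%, and 40%) is the share of entries that appear in only one of the two top-lists, out of all entries across both lists; a sketch using the top-citing-countries lists:

```python
def mismatch(wos_top, scopus_top):
    """Share of entries appearing in only one of the two lists
    (one plausible reading of the slide's '% mismatch')."""
    wos, scopus = set(wos_top), set(scopus_top)
    disagreeing = len(wos ^ scopus)               # symmetric difference
    return disagreeing / (len(wos) + len(scopus))

wos_countries = ["United Kingdom", "United States", "Germany", "Japan", "Australia"]
scopus_countries = ["United Kingdom", "United States", "China", "Japan", "Canada"]
print(f"{mismatch(wos_countries, scopus_countries):.0%}")   # 40%
```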

Differences in average h-index

Difference in h-index of individual researchers
Partial list of the top 10 researchers: Benford*, Rodden*, Gaver*, De Roure*, Rogers*, Steed*, Gellersen*, Schmidt, Chalmers*, Cheverst. For each, the slide tabulated the h-index computed from Web of Science, from Scopus, and from the union of the two, using both the system-based and the manual counting methods, together with a percentage difference; the individual scores did not survive in this transcript.

Comparison of h-index between GS and WoS+Scopus
For each researcher: h-index from the union of WoS and Scopus (rank), h-index from Google Scholar (rank), and the Google Scholar score's percentage difference over the union score:
– Benford*: 24 (1) vs. 38 (1T), 58%
– Rodden*: 21 (2) vs. 38 (1T), 81%
– De Roure*: 19 (4) vs. 27 (4T), 42%
– Rogers*: 17 (5) vs. 27 (4T), 59%
– Cheverst: 13 (9T) vs. 25 (6T), 92%
– Gellersen*: 15 (7T) vs. 25 (6T), 67%
– Steed*: 16 (6) vs. 25 (6T), 56%
– Schmidt: 15 (7T) vs. 24 (9), 60%
– Friday*: 13 (9T) vs. 23 (10), 77%
– Chalmers*: 13 (9T) vs. 21 (11), 62%
– Crabtree: 13 (9T) vs. 20 (12), 54%
– Brown: 10 (14T) vs. 18 (13), 80%
– Fitzpatrick*: 10 (14T) vs. 17 (14), 70%
– Muller*: 9 (17T) vs. 15 (15T), 67%
– Weal: 10 (14T) vs. 14 (17), 40%
– Randell: 9 (17T) vs. 13 (18), 44%
– Barkhuus: 6 (21T) vs. 8, 33%
– Price: 6 (21T) vs. 8, 33%
Values for Gaver*, Stanton-Fraser (difference: 36%), Izadi, Schnädelbach, and the AVERAGE row did not survive in this transcript.

Conclusions and implications
– In HCI, conference proceedings constitute a major channel of written communication
– Most of these proceedings are published by ACM and IEEE, and also by Springer in the form of LNCS and LNAI
– Scopus should be used instead of WoS for citation-based research and evaluation in HCI

Conclusions and implications, cont'd
– The h-index should be calculated manually rather than relying on system-generated scores
– Researchers can no longer limit themselves to WoS just because they are familiar with it, have access to it, or because it is the more established data source
– A challenge is to systematically explore citation data sources to determine which one(s) are better suited to which research domains

Conclusions and implications, cont'd
Principles of good bibliometrics research:
– Analysis should be carried out only by professionals with a theoretical understanding and thorough technical knowledge of the databases, retrieval languages, and the abbreviations, concepts, and terminology of the domain under investigation
– Analysis should only be used in accordance with the established principles of "best practice" of professional bibliometrics
– If used for research assessment purposes, citation-based information should only be used in conjunction with qualitative, peer review-based information

Thank You. Questions?
Full paper available at:
Network and Complex Systems, March 24, 2008