
Central R&D assessment indicators: Scientometric and Webometric Methods
Peter Ingwersen, Royal School of LIS, Denmark, 2010 – pi@iva.dk – http://www.iva.dk/pi
Oslo University College, Norway

Agenda
- Scientific communication: classic & present models
- Scientometrics: publication analyses; publication point evaluation (the ’Norwegian model’)
- Citation analyses: crown indicators (research profile weighting); Hirsch index (h-index)
- Webometrics: Web impact factors; issue tracking and mining
- Concluding remarks
Ingwersen 2011

Scientific communication 1 – Classic model (prior to Web / open access)
[Flow diagram: research ideas & activities → technical research reports → peer-reviewed conference papers and journal articles → library indexes, domain databases, citation databases; unpublished, non-peer-reviewed informal communication runs alongside; time axis]

Scientific communication 1 – Present model (incl. Web / open access)
[Flow diagram: research ideas & activities → technical research reports and working papers → institutional repositories and open-access journals → peer-reviewed conference papers and journal articles → full-text domain databases (Web of Science, Scopus), Google (Scholar) and academic web search engines; unpublished, public, non-peer-reviewed material runs alongside; time axis]

Scientific communication 2 – What ’is’ scientific information?
- Searchable on the open Web (confidence in the information source?): blogs, teaching material, working papers, research reports, student output
- Partly searchable on the open Web (qualified knowledge source, domain dependent): conference papers, posters, abstracts (peer reviewed); collaboratory round tables; open access journals (peer reviewed) and institutional repositories (duplicates/versions)
- Restricted access (authoritative source): journal articles (peer reviewed); research monographs

Examples of publication analysis
- Ranking the most productive countries, journals, institutions or universities, departments or groups in a field (exponential, Bradford-like distributions)
- Counting scientific publications by academic field/discipline and by country, region, university, department
- Counting numbers of articles OVER TIME (time series)

Typical time series 1981-2005 Ingwersen 2011

Productivity Growth Ingwersen 2011

Publication growth, all fields, 1981-2006 (1981-85 = index 1): China = 14,114 papers; EU = 736,616; USA = 887,039; India = 65,250 (98,598) publications.

Publication success ‘points’, as done in Norway:
- Articles in journals considered among the 20 % best in a field: 3 points
- Articles in other (peer-reviewed) journals: 1 point
- Conference papers (peer reviewed): 0.7 points
- Monographs (international publisher): 8 points
- Monographs (other publishers): 5 points
Fractional counts; the points are used for funding distribution. Covers all research areas, incl. the humanities, and all document types.
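As a sketch, the point scheme above can be expressed as a small lookup table. The function and key names are my own, and fractional counting over authors is an illustrative assumption:

```python
# Norwegian publication point scheme, values from the list above.
POINTS = {
    ("journal", 2): 3.0,     # among the top 20 % journals in the field
    ("journal", 1): 1.0,     # other peer-reviewed journals
    ("conference", 1): 0.7,  # peer-reviewed conference papers
    ("monograph", 2): 8.0,   # international publisher
    ("monograph", 1): 5.0,   # other publishers
}

def publication_points(kind, level, n_authors=1):
    """Points for one publication, fractionally counted over its authors."""
    return POINTS[(kind, level)] / n_authors
```

For example, a top-level journal article plus a half share of a conference paper would yield 3.0 + 0.35 points.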

One cannot use the publication points for DIRECT comparison between universities or countries, nor apply them to individual researchers. A recent detailed article on the system: Schneider, J.W. (2009). An outline of the bibliometric indicator used for performance-based funding of research institutions in Norway. European Political Science, 8(3), 364-378. Ingwersen India 2010

However: Publication Point Indicators established! Elleby, A., & Ingwersen, P. Publication point indicators: A comparative case study of two publication point systems and citation impact in an interdisciplinary context. Journal of Informetrics, 4 (2010): 512-523. doi:10.1016/j.joi.2010.06.001 Ingwersen India 2010

Publication Point Indicators 2
Comparing the vector of ideal cumulated PP (the expected success gain) with the actually obtained PP for the same publication types provides a ratio that can be normalized: nPPI, the normalized Publication Point Index. Comparisons between institutions can be made at specific ranges of publication vector values through their nPPI.

Cumulated Publ. Point Indicator the DIIS example (n=70) Ingwersen Tutorial 2011

Citation analyses
- Diachronic (forward in time) or synchronous (back in time, like the ISI JIF)
- Observing how older research is received by current research (ISI + Scopus: always peer-reviewed sources)
- Citation indicators: time series (as for publications); citation impact (crown indicators); citedness

Absolute Citation Impact Ingwersen 2011

‘Crown indicators’ Normalized impact-indicators for one unit (center/university/country) in relation to research field globally: JCI : Journal Crown Indicator FCI : Field Crown Indicator – both provide an index number Ingwersen 2011

Journal Crown Indicator
The ratio between the real number of citations received by all journal articles of a unit from a given year, and the diachronic citation impact of the same journals used by the unit over the same period (= the expected impact). ONE WOULD NEVER APPLY THE ISI JIF here, since it only signifies the AVERAGE (international) impact of an article and is computed synchronously. [Timeline: 2007-2011]

Journal Impact Factor – ISI
Synchronous method. For 2010 (analysis done February-April 2011): 1) all citations given in 2010 to journal X’s articles, notes and letters, 2) published in the previous two years, 2008-2009. [Timeline: 2007-2011]
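The two-year window above can be written out as a toy calculation (the journal and all counts are hypothetical):

```python
def journal_impact_factor(citations_this_year, citable_items_prev_two_years):
    """Synchronous ISI-style JIF: citations received in year Y to a journal's
    articles, notes and letters from years Y-1 and Y-2, divided by the number
    of citable items the journal published in those two years."""
    return citations_this_year / citable_items_prev_two_years

# Hypothetical journal X: 420 citations in 2010 to its 2008-2009 output
# of 200 citable items gives a 2010 JIF of 2.1.
jif_2010 = journal_impact_factor(420, 200)
```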

Field Crown Indicator - FCI Normalisation must be weighted in relation to the observed unit’s publication profile: Like a ’shadow’ unit (country) An example of this weighting for India: Ingwersen 2011

Research profile as weight for impact calculation (as ‘shadow country’) Ingwersen 2011

Research profile (China) as weight for impact calculation (as ‘shadow country’) Ingwersen 2011

A small European country with very different profile Ingwersen 2011

Example of research profile with FCI-index score Ingwersen 2011

Summary: different indicators, one given period
- (Σc/Σp) / (ΣC/ΣP) – globally normalized impact: fine to use for a single field; averaged over all subject areas it is quick ’n’ dirty, since every area gets the same weight. Thus:
- Σc / Σ(C/P_area × p_area) = FCI: the standard Field Crown Indicator for the ’profile’ of subject areas of a local unit (country/university), applied as a global profile, like a kind of ’shadow unit’. Made as a ratio of sums of citations over publications (weights). (If done instead as a sum of ratios divided by the number of fields, all fields weigh equally.)
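The FCI formula above can be sketched as follows (the function name, field names and all numbers are hypothetical):

```python
def field_crown_indicator(unit_citations, unit_papers_by_field, world_cpp_by_field):
    """FCI = sum(c) / sum(C/P_area * p_area): the unit's total citations divided
    by the citations expected if each of its papers performed at the world
    citations-per-paper rate of its field (the 'shadow unit' weighting)."""
    expected = sum(world_cpp_by_field[field] * papers
                   for field, papers in unit_papers_by_field.items())
    return unit_citations / expected

# Hypothetical unit: 900 citations; 100 physics papers (world rate 4.0
# citations/paper) and 50 biology papers (world rate 8.0) -> expected 800.
fci = field_crown_indicator(900, {"physics": 100, "biology": 50},
                            {"physics": 4.0, "biology": 8.0})
```

An FCI above 1 means the unit is cited more than its own field mix would predict.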

Ageing of journals or articles
Cited half-life, diachronic: accumulate citations forward in time by year.

Year:      1990  91  92  93  94  95  96  97  98  99  00  01  02
Citations:    2  12  20  25  30  17  12  10   0   3   1   0   0
Cumulated:    2  14  34  59  89 106 118 128 128 131 132 132 132

Half-life = 132/2 = 66, reached after ca. 4.2 years
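The worked example above can be reproduced with a short routine. This is a sketch; linear interpolation within the crossing year is assumed, which matches the slide's "ca. 4.2 years":

```python
def diachronic_cited_half_life(citations_by_year):
    """Years from publication needed to accumulate half of all citations,
    linearly interpolated within the year where the half-way point falls."""
    half = sum(citations_by_year) / 2
    cumulated = 0
    for year, cites in enumerate(citations_by_year, start=1):
        if cumulated + cites >= half:
            return year - 1 + (half - cumulated) / cites
        cumulated += cites

# The slide's example: citations per year from 1990 onwards.
citations = [2, 12, 20, 25, 30, 17, 12, 10, 0, 3, 1, 0, 0]
half_life = diachronic_cited_half_life(citations)  # ca. 4.2 years
```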

Ageing of journals or articles – 2 Ingwersen Tutorial 2011

Hirsch Index (2005)
A composite index of publications and citations for a unit (person, group, dept. …): h is the largest number such that h of the unit’s publications have each received at least h citations. A person’s h-index of 13 implies that among all his/her publications there are 13 that have each obtained at least 13 citations. The index depends on the research field and the age of the researcher; it can be normalized in many ways.
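A minimal sketch of the definition (the example citation counts are hypothetical):

```python
def h_index(citation_counts):
    """Largest h such that h publications each have at least h citations."""
    h = 0
    for rank, cites in enumerate(sorted(citation_counts, reverse=True), start=1):
        if cites >= rank:
            h = rank  # the rank-th most-cited paper still has >= rank citations
        else:
            break
    return h

h_index([25, 19, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 2])  # 13
```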

Criticism of citation analyses
- Formal influences not cited; informal influences not cited
- Biased citing; different types of citations
- Self-citing – may indeed improve external citations!
- Variations in citation rate related to type of publication, nationality, time period, and size and type of speciality – normalization?
- Technical limitations of citation indexes and domain databases
- Multiple authorship – fractional counting / article level

Reply: van Raan (1998)
Different biases equalize each other: if researchers simply do most of their referencing ”in a reasonable way”, and if a sufficient number of citations are counted, reliable patterns can be observed. It is very unlikely that all researchers demonstrate the same biases (e.g. that all researchers consciously cite research which does not pertain to their field).

Google Scholar
- Ranks by citations, not PageRank
- Contains conference papers and journal articles (??)
- Workable for computer science and engineering (and information science)
- Requires a lot of clean-up!
- Apply Publish or Perish (http://www.harzing.com/pop.htm) for better analysis on top of GS
- Google Scholar may provide the h-index for persons

infor-/biblio-/sciento-/cyber-/webo-/metrics (L. Björneborn & P. Ingwersen 2003): informetrics, bibliometrics, scientometrics, cybermetrics, webometrics

Link terminology – basic concepts (L. Björneborn & P. Ingwersen 2003)
- B has an outlink to C; outlinking ~ reference
- B has an inlink from A; inlinked ~ citation
- B has a selflink; selflinking ~ self-citation
- A has no inlinks; non-linked ~ non-cited
- E and F are reciprocally linked
- A is transitively linked with H via B – D; H is reachable from A by a directed link path
- A has a transversal link to G: a short cut
- C and D are co-linked from B, i.e. have co-inlinks or shared inlinks ~ co-citation
- B and E are co-linking to D, i.e. have co-outlinks or shared outlinks ~ bibliographic coupling
[Diagram nodes: A B C D E F G H]
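The co-inlink and co-outlink relations above can be sketched on a small hypothetical graph (only a few of the slide's edges are modelled):

```python
# Hypothetical directed link graph using the slide's node letters:
# each key is a page, each value the set of pages it out-links to.
links = {
    "A": {"B"},
    "B": {"C", "D"},
    "E": {"D", "F"},
    "F": {"E"},
}

def inlinks(graph, node):
    """Pages linking to `node` (the link analogue of being cited)."""
    return {src for src, targets in graph.items() if node in targets}

def co_inlinked(graph, a, b):
    """Pages linking to both a and b (the analogue of co-citation)."""
    return inlinks(graph, a) & inlinks(graph, b)

def shared_outlinks(graph, a, b):
    """Targets linked from both a and b (the analogue of bibliographic coupling)."""
    return graph.get(a, set()) & graph.get(b, set())
```

Here `co_inlinked(links, "C", "D")` gives `{"B"}` and `shared_outlinks(links, "B", "E")` gives `{"D"}`, matching the slide's C/D co-inlink and B/E co-outlink examples.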


[Chart from www.internetworldstats.com]

Search engine analyses
See e.g. Judith Bar-Ilan’s excellent longitudinal analyses, and Mike Thelwall et al.’s several case studies.
Scientific material on the Web:
- Lawrence & Giles (1999): approx. 6 % of Web sites contain scientific or educational contents
- Increasingly, the Web is a web of uncertainty
- Allen et al. (1999) – biology topics from 500 Web sites assessed for quality: 46 % of sites were ”informative” – but 10-35 % inaccurate, 20-35 % misleading, 48 % unreferenced

The Web Impact Factor (Ingwersen, 1998)
- Intuitively (naively?) believed to be similar to the Journal Impact Factor
- Demonstrates recognition by other web sites – or simply impact – not necessarily quality
- Central issues: are web sites similar to journals, and web pages similar to articles? Are in-links similar to citations – or simply road signs? What is really calculated?
- DEFINE WHAT YOU ARE CALCULATING: site or page IF

The only valid webometric tool: Yahoo Site Explorer
If one enters (old, valid) commands like Link:URL, Domain:topdomain (edu, dk) or Site:URL in Yahoo Search, one is transferred to http://siteexplorer.search.yahoo.com/new/ – or find it directly via that URL. The same facilities are available in click-mode, starting from a given URL: finding ‘all’ web pages in a site, and ‘all’ inlinks to that site/those pages – also without selflinks. This makes it possible … Ingwersen 2010 Åbo

… to calculate Web Impact Factors. But one should be prudent in interpretation: note that external inlinks are the best indicator of recognition. Take care with how many sub-domains (and pages) are included in the click analysis. Results can be downloaded.

Possible types of Web-IF
- E-journal Web-IF: calculated by in-links, or as the traditional JIF (citations)
- Scientific web site IF (by link analyses): national – regional (some URL problems); institutions – single sites; other entities, e.g. domains
- Best denominator: number of staff – or simply use external inlinks
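A minimal sketch of the site-level calculation, assuming the "external in-links per page" variant (all numbers hypothetical):

```python
def web_impact_factor(external_inlinks, n_pages):
    """Site-level Web-IF: external in-links per page of the site.
    Counting external in-links only (no self-links) follows the advice
    above that they are the best indicator of recognition."""
    return external_inlinks / n_pages

# Hypothetical site with 350 external in-links across 140 pages.
wif = web_impact_factor(350, 140)  # 2.5
```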

Web links like citations?
Kleinberg (1998), bridging citation weights and Google’s PageRank:
- Hubs ~ review articles: have many outlinks (references) to …
- Authority pages ~ influential (highly cited) documents: have many inlinks from hubs!
- Typical Web index pages = homepages with self-inlinks = tables of contents

Reasons for outlinking …
- Outlinks mainly for functional purposes: navigation – interest spaces …
- Pointing to authority in certain domains? (Latour: rhetorical reasons for references/links)
- Normative reasons for linking? (Merton)
- Do we have negative links? We do have non-linking (commercial sites)

Some additional reasons for providing links
In part analogous to providing references (recognition), and, among others:
- emphasising one’s own position and relationships (professional, collaboration, self-presentation, etc.)
- sharing knowledge, experience, associations …
- acknowledging support, sponsorship, assistance
- providing information for various purposes (commercial, scientific, education, entertainment)
- drawing attention to questions of individual or common interest and to information provided by others (the navigational purpose)

Other differences between references, citations & links
The time issue: ageing of sources is different on the Web. Birth, maturity and obsolescence happen faster; decline and death of sources occur too – but marriages, divorce, re-marriage, death & resurrection and similarly liberal phenomena are also found on the Web! (Wolfgang Glänzel)

Issue tracking – Web mining
- Adequate sampling requires knowledge of the structure and properties of the population – the Web space to be sampled
- Issue tracking of known properties/issues may help
- Web mining the unknown is more difficult, due to: the dynamic, distributed & diverse nature; the variety of actors and minimum of standards; the lack of quality control of contents
- Web archeology – study of the past Web

Nielsen BlogPulse
Observes blogs worldwide, providing:
- Trend search – development of terms/concepts over time – user selection!
- Featured trends – predefined categories
- Conversation tracker – blog conversations
- BlogPulse profiles – blog profiles
Look into: http://www.blogpulse.com/tools.html

[Screenshot: BlogPulse Trend Search (Home > Tools)]

Concluding remarks: the future
With open access we can foresee a nightmare concerning the tracking of qualified and authoritative scientific publications outside the citation indexes, because of the lack of bibliographic control (what is the original, vs. parallel and spin-off versions and junk?) over many institutional repositories – mixed on the Web with all other document types incl. blogs (Web 2.0) – Google Scholar (?) … Google Books (?)

Concluding remarks
- One should be somewhat cautious with Web-IF applications without careful sampling via robots, due to their incomprehensiveness and questions about what they actually signify
- One might also investigate further the behavioural aspects of providing and receiving links, to understand what the impact might mean and how/why links are made
- Understand the Web space structure better
- Design workable robots, downloading & local analyses

References
Allen, E.S., Burke, J.M., Welch, M.E., Rieseberg, L.H. (1999). How reliable is science information on the Web? Nature, 402, 722.
Björneborn, L., Ingwersen, P. (2004). Towards a basic framework for webometrics. Journal of the American Society for Information Science and Technology, 55(14), 1216-1227.
Brin, S., Page, L. (1998). The anatomy of a large-scale hypertextual web search engine. Computer Networks and ISDN Systems, 30(1-7), 107-117.
Elleby, A., Ingwersen, P. (2010). Publication point indicators: A comparative case study of two publication point systems and citation impact in an interdisciplinary context. Journal of Informetrics, 4, 512-523. doi:10.1016/j.joi.2010.06.001
Hirsch, J.E. (2005). An index to quantify an individual’s scientific research output. PNAS, 102, 16569-16572.

References 2
Jepsen, E.T., Seiden, P., Ingwersen, P., Björneborn, L., Borlund, P. (2004). Characteristics of scientific Web publications: Preliminary data gathering and analysis. Journal of the American Society for Information Science and Technology, 55(14), 1239-1249.
Lawrence, S., Giles, C.L. (1999). Accessibility and distribution of information on the Web. Nature, 400, 107-110.
Li, X.M., Thelwall, M., Musgrove, P., Wilkinson, D. (2003). The relationship between the WIFs or inlinks of Computer Science Departments in UK and their RAE ratings or research productivities in 2001. Scientometrics, 57(2), 239-255.
Moed, H. (2005). Citation Analysis in Research Evaluation. Springer.
Thelwall, M., Harries, G. (2004). Do the Web sites of higher rated scholars have significantly more online impact? JASIST, 55(2), 149-159.
SearchEngineShowdown.com
Van Raan, A.: see http://www.cwts.nl/TvR/ for publications.