Bibliometrics in higher education: the impact on libraries Keith Webster, University of Queensland, Australia; Dr Berenika M. Webster, CIBER, Univ. College London

Outline
Bibliometrics and university libraries
–Structure of disciplines
–Collection management
–Research evaluation
Bibliometric techniques
Sample projects

Structure of disciplines
Domain analysis
Patterns of scholarship

Collection management
Patterns of use
Core journals in discipline
International comparison

Research evaluation
Approaches to research assessment: experiences from the UK, NZ and Australia
Peer assessment and metrics
Bibliometrics
–Assessment tools
–Challenges of the bibliometric approach
–How to increase the quality and impact of published outputs

What do governments want for their money?
Economic outcomes
–increase wealth creation & prosperity
–improve the nation’s health, environment & quality of life
Innovation
R&D from the private sector
Improved competitiveness
Less “curiosity-driven” activity

Research Assessment Exercise (UK)
Started in the mid-1980s to determine the size of research funding allocated by the HEFCEs to universities. The early 1990s brought a shift from quantity to quality.
60 subject panels examine, for quality, selected outputs of UK academics. Previous funding, evidence of esteem, PhD completions and research environment are also taken into consideration.
Grades are awarded to submission units; these correlate with bibliometric measures (Charles Oppenheim’s research).
It will be scrapped; new systems will be a hybrid of a “basket of metrics” and peer review (???)
Changed behaviour of scholars:
–Increased volume of publications (salami slicing?)
–Increased quality of outputs (numbers of 5 and 5* departments increased)

Performance-Based Research Fund (PBRF), New Zealand
Introduced in early 2003.
12 panels grade individuals in 41 subject categories.
The portfolio has three components:
–Publications (4 best and the rest)
–Peer esteem
–Contribution to research environment
The 2003 round produced lower scores than expected and great variability between subjects (from 4.7 in philosophy to 0.3 in nursing). 40% of submitted staff were judged research inactive (grade R).
Scores for institutions are calculated from these and used, together with PhD completion and external funding metrics, to calculate awards to institutions.
The 2006 round incorporates “breaks” for new researchers.

Research Quality Framework (RQF), Australia
To be introduced in 2007 (?)
Quality includes the intrinsic merit of original research and academic impact. This relates to recognition of the originality of the research by peers and its impact on the development of the same or related discipline areas within the community of peers.
–Assessment: peer review and metrics
Impact is the use of the original research outside the peer community, typically not reported in the traditional peer-reviewed literature. It relates to recognition by qualified end-users that quality research has been successfully applied to achieve social, cultural, economic and/or environmental outcomes.
–Assessment: expert opinion and narratives, or metrics?

RQF evidence portfolio
Four “best” outputs (with rationale for selection)
List of all published outputs in the last six years
Statement of “early impact” as assessed by qualified end-users (description of outcome; identification of beneficiaries; metrics illustrating benefits; linkage between claimant and beneficiary)
Context statement for the research group

Evaluation: peer vs. metrics
Evaluate/reward past performance
Measure short- to medium-term impacts
Established use for the evaluation of academic impact (within the group of peers)
No established/accepted procedures for evaluation of impact outside academe

Peer review
“Traditional” assessment is outputs-based peer review:
–a panel of experts “reads” submissions or outputs
–not always transparent (selection; evaluation criteria)
–subjective (the concept of quality is very difficult to objectivise)
–relies on proxies (e.g. ranking/prestige of the publishing journal, institutional affiliation of authors, etc.)
–composition of panels will affect their judgements
–lacks comparability between panels
–“punishes” innovative or multidisciplinary research
–an “old boys’” club disadvantaging young researchers
–assesses what has already been assessed by journals’ peer-review processes
–time- and resource-intensive

Metrics
A range of unobtrusive and transparent measures; these can be:
–Input-based
 Income
 Success rates
–Output-based
 Long-term impact (patents, IP, etc.)
 Bibliometric measures (volume and impact/quality)

Problems with metrics (1)
Input-based:
–Retrospective
–Creates a “Matthew effect”
–Disadvantages younger researchers (without a previous track record)
–Undervalues research done “on the string”

Problems with metrics (2)
Output-based:
–Citations as a measure of intellectual influence?
–Comparing like with like (benchmarking)
–Inadequate tools
–Social sciences and humanities (in search of new metrics)
–Research on the periphery (local citation indices)
–New models of communication (capturing them)
–Impacts on the wider community (“tangible” outcomes)
–Lack of expertise in analysis and interpretation (education)

Indicators of quality as measured using published outputs
Number of publications
Citation counts to these publications (adjusted for self-citations)
–what citation “window” should be used? 4, 5, 10 years?
Citations per publication
Percentage of uncited papers
Impact factors (of publishing journals)
Diffusion factor (of citing journals) – a profile of the users of the research (who, where, when and what)
“Impact factor” of a scholar – the Hirsch index (h-index): the largest number h of papers with at least h citations each. Your h-index is 75 if you have written at least 75 papers with at least 75 citations each.
Note: these should not be seen as “absolute” numbers, but always interpreted in the context of the discipline, research type, institution profile, seniority of the researcher, etc.
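Several of the output indicators above can be computed directly from a list of per-paper citation counts. A minimal sketch, with hypothetical data and standard-library Python only (the function names are illustrative, not from any of the tools discussed):

```python
from statistics import mean

def h_index(citations):
    """Largest h such that at least h papers have h or more citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # `rank` papers so far each have >= `rank` citations
        else:
            break
    return h

def output_indicators(citations):
    """Simple per-author indicators from per-paper citation counts."""
    n = len(citations)
    if n == 0:
        return {"publications": 0, "total_citations": 0,
                "citations_per_publication": 0.0,
                "percent_uncited": 0.0, "h_index": 0}
    return {
        "publications": n,
        "total_citations": sum(citations),
        "citations_per_publication": mean(citations),
        "percent_uncited": 100.0 * sum(1 for c in citations if c == 0) / n,
        "h_index": h_index(citations),
    }

# Hypothetical citation counts for one author's papers
papers = [24, 11, 8, 4, 3, 0, 0]
print(output_indicators(papers))
```

As the slide's note warns, the numbers this produces only become meaningful when benchmarked against the discipline and career stage.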

Compare like with like!
Applied research attracts fewer citations than basic research.
There are differences in citation behaviour between disciplines (e.g. papers in organisational behaviour attract 5 times as many citations as papers in accounting).
The highest-IF journal in immunology is Ann Rev Immun (IF 47.3), mean for category 4.02; in the health care and services category it is Milbank Q. (IF 3.8), mean for category …
Matthew effect.
Benchmarking must be done using comparable variables!
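The like-with-like point can be made concrete by normalising a paper's citation count against its field's mean citation rate. A hedged sketch with invented numbers (the field means below are hypothetical, chosen to echo the slide's ~5x gap between organisational behaviour and accounting):

```python
# Hypothetical mean citations per paper, by field (invented for illustration)
FIELD_MEAN_CITES = {"accounting": 4.0, "organisational behaviour": 20.0}

def relative_citation_rate(citations, field):
    """Citations normalised by the field mean: 1.0 == field average."""
    return citations / FIELD_MEAN_CITES[field]

# Raw counts suggest the OB paper has more impact (40 vs. 10 citations)...
acct = relative_citation_rate(10, "accounting")              # 10 / 4  = 2.5
ob = relative_citation_rate(40, "organisational behaviour")  # 40 / 20 = 2.0
print(acct, ob)
```

After normalisation the accounting paper outperforms its field by more than the organisational behaviour paper does: raw counts alone would have ranked them the wrong way round.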

Tools available
Publication and citation data:
–Web of Science
–SCOPUS
–Google Scholar
–Local/national databases
Other bibliometric indicators:
–Journal Citation Reports (JCR)
–Other indicator databases (national, essential, university, institutional)
–ISIHighlyCited.com

WoS and Scopus: subject coverage (% of total records)
[Table comparing subject coverage of WoS and SCOPUS; Google Scholar coverage unknown. Source: Jacso, 2005]

Web of Science
Covers around 9,000 journal titles and 200 book series, divided between SCI, SSCI and A&HCI.
Electronic backfiles available to 1900 for SCI, the mid-50s for SSCI and the mid-70s for A&HCI.
Very good coverage of the sciences; patchy on the “softer” sciences, social sciences, and arts and humanities.
US- and English-language-biased.
Full coverage of citations.
Name disambiguation tool.
Limited downloading options.

Scopus
Positioning itself as an alternative to ISI.
More journals from smaller publishers and open access (13,000 journal titles; 600 open access journals; 750 conference proceedings; 600 trade publications; 2.5 million “quality” web pages).
Source data back to …
Excellent for the physical and biological sciences; poor for the social sciences; does not cover humanities or arts.
Better international coverage (60% of titles are non-US).
Not much of a backfile (e.g. citation data for the last decade only).
Not “cover to cover” and not up to date.
Easy to use for searching source publications; clumsy for searching cited publications.
Citation tracker works on up to 1,000 records only.
Limited downloading options.

Google Scholar
Coverage and scope?
Inclusion criteria?
Very limited search options
No separate cited-author search
Free!

[Chart: Taiwan in the three sources]

Social sciences
Communication in the social sciences:
–Books
–National journals
–International journals
–Non-scholarly outputs (culture-creating role)
“International” tools inadequate
Creation of “national” and “regional” tools:
–Chinese Social Sciences Citation Index
–Polish Sociology Citation Index
–Serbian Social Sciences Index
–Polish Citation Index for Humanities
–European Humanities Citation Index

Science on the periphery
Poor coverage in international databases
International benchmarking difficult
Impossible for local evaluations
Development of national sources of data
While volume is growing, citation rates lag behind

Capturing new forms of communication
Use of online journals (deep log analysis)
Traffic on websites and downloads
In-links to websites

Impact of research: the example of biomedicine
Informing policies (citations in guidelines, government policy, development of medicines)
Building capacity (training; development)
Relationship between research, health outcomes and cost savings
Healthier workforce

Citations in clinical guidelines
47 Clinical Guidelines and 64 Health Technology Appraisals prepared by the UK’s National Institute for Clinical Excellence (NICE) were analysed for citations:
–they cited 25% of UK papers (2.5 times more than expected)
–the majority were clinical papers
–the median lag between publication of a research paper and its citation was 3.3 years
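The lag statistic reported above is just the median of per-citation gaps. A small illustrative sketch (the function and the year pairs are hypothetical, not data from the NICE study):

```python
from statistics import median

def median_citation_lag(pairs):
    """Median gap in years between a paper's publication and its citation.

    `pairs` is a list of (publication_year, citation_year) tuples,
    one per citation found in a guideline.
    """
    return median(cite - pub for pub, cite in pairs)

# Hypothetical (paper published, cited in guideline) year pairs
pairs = [(1998, 2001), (1999, 2003), (2000, 2002), (1997, 2002)]
print(median_citation_lag(pairs))  # median of lags [3, 4, 2, 5]
```

A median is preferred over a mean here because citation lags are typically right-skewed: a few very old papers would otherwise dominate the average.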

Relative commitment to research and burden of disease

How to increase the “quality” of your publications
Publish in the right journals (prestige; importance to the discipline; impact factor vs. diffusion factor)
Publish in English
Write review articles
Engage in basic research
Become a journal editor (Lange, 1997)
Acquire a co-author (preferably from the US or UK)
Get external funding (from different sources)
Make your outputs available in open access (own website, institutional and subject repositories) (Antelman, 2004; Harnad, various)
“Advertise” your publications on listservs and discussion groups
Make and maintain professional/social contacts with others in your research area (Rowlands, 2000)