
1 Bibliometrics in higher education: the impact on libraries
Keith Webster, University of Queensland, Australia
Dr Berenika M. Webster, CIBER, University College London

2 Outline
Bibliometrics and university libraries
–Structure of disciplines
–Collection management
–Research evaluation
Bibliometric techniques
Sample projects

3 Structure of disciplines
Domain analysis
Patterns of scholarship

4 Collection management
Patterns of use
Core journals in the discipline
International comparison

5 Research evaluation
Approaches to research assessment: experiences from the UK, NZ and Australia
Peer assessment and metrics
Bibliometrics
–Assessment tools
–Challenges of the bibliometric approach
–How to increase the quality and impact of published outputs

6 What do governments want for their money?
Economic outcomes
–increase wealth creation and prosperity
–improve the nation's health, environment and quality of life
Innovation
R&D from the private sector
Improved competitiveness
Less "curiosity-driven" activity

7 Research Assessment Exercise (UK)
Started in the mid-1980s to determine the size of research funding allocated by the HEFCEs to universities. The early 1990s brought a shift from quantity to quality.
60 subject panels examine selected outputs of UK academics for quality. Previous funding, evidence of esteem, PhD completions and the research environment are also taken into consideration.
Grades are awarded to submission units; these correlate with bibliometric measures (Charles Oppenheim's research).
It will be scrapped in 2008. The new system will be a hybrid of a "basket of metrics" and peer review (?).
Changed behaviour of scholars:
–Increased volume of publications (salami slicing?)
–Increased quality of outputs (the number of 5 and 5* departments has increased)

8 Performance-Based Research Fund (PBRF), New Zealand
Introduced in early 2003.
12 panels grade individuals in 41 subject categories.
The portfolio has three components:
–Publications (4 best and the rest)
–Peer esteem
–Contribution to the research environment
The 2003 round produced lower scores than expected and great variability between subjects (from 4.7 in philosophy to 0.3 in nursing). 40% of submitted staff were judged research inactive (grade R).
Scores for institutions are calculated from these and used, together with PhD completion and external funding metrics, to calculate awards to institutions.
The 2006 round incorporates "breaks" for new researchers.

9 Research Quality Framework (RQF), Australia
To be introduced in 2007 (?)
Quality includes the intrinsic merit of original research and its academic impact: recognition of the originality of the research by peers, and its impact on the development of the same or related discipline areas within the community of peers.
–Assessment: peer review and metrics
Impact is the use of the original research outside the peer community, which will typically not be reported in the traditional peer-reviewed literature: recognition by qualified end-users that quality research has been successfully applied to achieve social, cultural, economic and/or environmental outcomes.
–Assessment: expert opinion and narratives, or metrics?

10 RQF evidence portfolio
Four "best" outputs (with rationale for selection)
List of all published outputs in the last six years
Statement of "early impact" as assessed by qualified end-users (description of outcome; identification of beneficiaries; metrics illustrating benefits; linkage between claimant and beneficiary)
Context statement for the research group

11 Evaluation: peer vs. metrics
Evaluate/reward past performance
Measure short- to medium-term impacts
Established use for the evaluation of academic impact (within the group of peers)
No established/accepted procedures for the evaluation of impact outside academe

12 Peer review
"Traditional" assessment is outputs-based peer review:
–a panel of experts "reads" submissions or outputs
–not always transparent (selection; evaluation criteria)
–subjective (the concept of quality is very difficult to objectify)
–relies on proxies (e.g. ranking/prestige of the publishing journal, institutional affiliation of authors, etc.)
–the composition of panels affects their judgements
–lacks comparability between panels
–"punishes" innovative or multidisciplinary research
–an "old boys' club" disadvantaging young researchers
–assesses what has already been assessed by journals' peer-review processes
–time and resource intensive

13 Metrics
A range of unobtrusive and transparent measures, which can be:
–Input-based
  Income
  Success rates
–Output-based
  Long-term impact (patents, IP, etc.)
  Bibliometric measures (volume and impact/quality)

14 Problems with metrics (1)
Input-based
–Retrospective
–Creates a "Matthew effect"
–Disadvantages younger researchers (without a previous track record)
–Undervalues research done "on a shoestring"

15 Problems with metrics (2)
Output-based
–Citations as a measure of intellectual influence?
–Comparing like with like (benchmarking)
–Inadequate tools
–Social sciences and humanities (in search of new metrics)
–Research on the periphery (local citation indices)
–New models of communication (capturing them)
–Impacts on the wider community ("tangible" outcomes)
–Lack of expertise in analysis and interpretation (education)

16 Indicators of quality as measured using published outputs
Number of publications
Citation counts to these publications (adjusted for self-citations)
–what citation "window" should be used? 4, 5, 10 years?
Citations per publication
Percentage of uncited papers
Impact factors (of publishing journals)
Diffusion factor (of citing journals): a profile of the users of the research (who, where, when and what)
"Impact factor" of a scholar: the Hirsch index (h-index)
–the largest number h such that h of your papers have at least h citations each
–your h-index is 75 if at least 75 of your papers have 75 or more citations each (see the sketch after this slide)
Note: these should not be treated as "absolute" numbers, but always interpreted in the context of the discipline, research type, institution profile, seniority of the researcher, etc.
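Since the h-index is defined above only in words, a minimal sketch of the calculation in Python may help; the function name and the citation counts are hypothetical and not part of the original presentation.

def h_index(citation_counts):
    """Return the largest h such that h papers have at least h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for one author's papers:
papers = [112, 54, 31, 20, 9, 8, 8, 3, 1, 0]
print(h_index(papers))  # 7: seven papers have at least 7 citations each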

17 Compare like with like!
Applied research attracts fewer citations than basic research.
There are differences in citation behaviour between disciplines (e.g. papers in organisational behaviour attract 5 times as many citations as papers in accounting).
The highest-IF journal in immunology is Annual Review of Immunology (IF 47.3) against a category mean of 4.02; in the health care and services category it is Milbank Quarterly (IF 3.8) against a category mean of 1.09.
Matthew effect.
Benchmarking must be done using comparable variables!
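To make "compare like with like" concrete, here is a minimal sketch, in Python, of a field-normalised citation score (citations per paper divided by the mean citations per paper for the same field and year). The baseline values and paper data are invented for illustration; this is a standard bibliometric normalisation, not a tool described in the presentation.

# Hypothetical mean citations per paper by (field, publication year):
FIELD_BASELINE = {
    ("immunology", 2003): 12.4,
    ("health care & services", 2003): 3.1,
}

def normalised_impact(papers):
    """Mean ratio of actual to expected citations; 1.0 = field average."""
    ratios = [
        p["citations"] / FIELD_BASELINE[(p["field"], p["year"])]
        for p in papers
    ]
    return sum(ratios) / len(ratios)

unit_a = [{"field": "immunology", "year": 2003, "citations": 25}]
unit_b = [{"field": "health care & services", "year": 2003, "citations": 6}]
print(round(normalised_impact(unit_a), 2))  # 2.02: twice the field average
print(round(normalised_impact(unit_b), 2))  # 1.94: comparable impact despite far fewer raw citations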

18 Tools available
Publication and citation data
–Web of Science
–SCOPUS
–Google Scholar
–Local/national databases
Other bibliometric indicators
–Journal Citation Reports (JCR)
–Other indicator databases (national, essential, university, institutional)
–ISIHighlyCited.com

19 WoS and Scopus: subject coverage (% of total records)
[Chart: subject coverage of WoS and SCOPUS as a percentage of total records; Google Scholar coverage unknown (?). Source: Jacso, 2005]

20 Web of Science
Covers around 9,000 journal titles and 200 book series, divided between SCI, SSCI and A&HCI.
Electronic backfiles available to 1900 for SCI, the mid-1950s for SSCI and the mid-1970s for A&HCI.
Very good coverage of the sciences; patchy on the "softer" sciences, social sciences, and arts and humanities.
US- and English-language biased.
Full coverage of citations.
Name disambiguation tool.
Limited downloading options.

21 Scopus
Positioning itself as an alternative to ISI.
More journals from smaller publishers and open access (13,000 journal titles; 600 open access journals; 750 conference proceedings; 600 trade publications; 2.5 million "quality" web pages).
Source data back to 1960.
Excellent for the physical and biological sciences; poor for the social sciences; does not cover the humanities or arts.
Better international coverage (60% of titles are non-US).
Not much of a backfile (e.g. citation data for the last decade only).
Not "cover to cover" and not up to date.
Easy to use when searching for source publications; clumsy when searching cited publications.
Citation tracker works for up to 1,000 records only.
Limited downloading options.

22 Google Scholar
Coverage and scope?
Inclusion criteria?
Very limited search options
No separate cited-author search
Free!

23 Taiwan in the three sources, 1996-2005

24 Social sciences
Communication in the social sciences:
–Books
–National journals
–International journals
–Non-scholarly outputs (culture-creating role)
"International" tools inadequate
Creation of "national" and "regional" tools:
–Chinese Social Sciences Citation Index
–Polish Sociology Citation Index
–Serbian Social Sciences Index
–Polish Citation Index for Humanities
–European Humanities Citation Index

25 Science on the periphery
Poor coverage in international databases
International benchmarking is difficult
Impossible for local evaluations
Development of national sources of data
While volume is growing, citation rates lag behind

26 Science on the periphery

27 Capturing new forms of communication
Use of online journals (deep log analysis)
Traffic on websites and downloads (see the sketch below)
In-links to websites
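As an illustration of download counting, one ingredient of the log analysis mentioned above (not the authors' actual method), here is a minimal Python sketch that tallies apparent full-text downloads per article from a web-server access log. The log format and the URL pattern containing the article identifier are assumptions.

import re
from collections import Counter

# Assumed URL pattern: the article identifier appears as /article/<id>/fulltext.pdf
ARTICLE_RE = re.compile(r'"GET /article/(?P<id>[^/]+)/fulltext\.pdf HTTP')

def count_downloads(log_path):
    """Count apparent full-text downloads per article identifier."""
    downloads = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = ARTICLE_RE.search(line)
            if match and '" 200 ' in line:  # count successful requests only
                downloads[match.group("id")] += 1
    return downloads

# Example usage (hypothetical file name):
# for article_id, n in count_downloads("access.log").most_common(10):
#     print(article_id, n)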

28 Impact of research: the example of biomedicine
Informing policies (citations in guidelines, government policy, development of medicines)
Building capacity (training; development)
Relationship between research, health outcomes and cost savings
Healthier workforce

29 Citations in clinical guidelines
47 Clinical Guidelines and 64 Health Technology Appraisals prepared by the UK's National Institute for Clinical Excellence (NICE) were analysed for citations:
–25% of the cited papers were UK papers (2.5 times more than expected)
–the majority were clinical papers
–the median lag between publication of a research paper and its citation was 3.3 years

30 Relative commitment to research and burden of disease

31 How to increase the "quality" of your publications?
Publish in the right journals (prestige; importance to the discipline; impact factor vs. diffusion factor)
Publish in English
Write review articles
Engage in basic research
Become a journal editor (Lange, 1997)
Acquire a co-author (preferably from the US or UK)
Get external funding (from different sources)
Make your outputs available via open access (own website, institutional and subject repositories) (Antelman, 2004; Harnad, various)
"Advertise" your publications on listservs and discussion groups
Make and maintain professional/social contacts with others in your research area (Rowlands, 2000)

