A Bibliometric Comparison of the Research of Three UK Business Schools
John Mingers, Kent Business School
March 2014

1. Overview
- Introduction
- The Leiden Methodology
- Criticisms of the Leiden Methodology
- The Empirical Study – Three UK Business Schools: overview of publications and citations; differences between fields; results of the methodology
- Conclusions

2. Introduction
- There has been an ever-increasing desire to monitor and measure the quality of a department's research.
- Apart from peer review, the main method for doing this is bibliometrics, generally based on measuring the citations received by published papers (citations per paper – CPP).
- However, there are major differences between the citation patterns of different fields, and these need to be taken into account in any comparison. This means that the data must be normalised with respect to field and time.
- The Leiden ranking methodology (LRM) is one of the best-developed methodologies for comparing departments. It has been used mainly in the sciences, where Web of Science (WoS) coverage is better. This study is one of the first tests on social science departments.

3. The Leiden Methodology (crown indicator)
1. Collect all papers from a department over a specific time window, e.g. 5 years.
2. Use the Web of Science to find the citations for each paper (assuming it is included in WoS).
3. Calculate how many citations such a paper would be expected to receive given its field and year of publication:
a. WoS has field lists of all the journals in a particular field. From these we can find the total citations to papers published in that field in that year and divide by the number of papers, giving the field citations per paper (FCS).
b. We can calculate a similar figure just for the set of journals the department actually publishes in (JCS).
4. Total the actual number of citations for all papers and the expected number of citations for all papers. Dividing the first by the second gives the crown indicator, i.e. the average citations per paper relative to the field average (CPP/FCS).
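As an illustration of steps 2–4, the Python sketch below computes the raw CPP, the crown indicator CPP/FCSm and the JCSm/FCSm ratio from a small hypothetical set of papers. The per-field and per-journal expected citation rates are assumed to have already been looked up from WoS; the data layout and values are illustrative only, not the actual Leiden implementation.

# A minimal sketch of the crown indicator calculation (assumed data, not the Leiden software).
# Each paper records its observed citations plus the expected citations per paper
# for its WoS field-and-year (step 3a) and for its journal-and-year (step 3b).
papers = [
    {"cites": 12, "field_cpp": 4.0, "journal_cpp": 5.0},
    {"cites": 3,  "field_cpp": 4.0, "journal_cpp": 2.5},
    {"cites": 0,  "field_cpp": 1.5, "journal_cpp": 1.0},
]

n = len(papers)
total_cites = sum(p["cites"] for p in papers)
expected_field = sum(p["field_cpp"] for p in papers)      # denominator for FCSm
expected_journal = sum(p["journal_cpp"] for p in papers)  # denominator for JCSm

cpp = total_cites / n                                # raw citations per paper
crown = total_cites / expected_field                 # CPP/FCSm: > 1 means above the field average
journal_quality = expected_journal / expected_field  # JCSm/FCSm: journal set vs the field

print(f"CPP = {cpp:.2f}, CPP/FCSm = {crown:.2f}, JCSm/FCSm = {journal_quality:.2f}")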

5. The value may be:
> 1 – the department is doing better than the field
= 1 – it is average
< 1 – it is below average
Very good departments may be 2 or 3 times the average.
6. We can also calculate JCS/FCS, which shows whether the department is targeting better or worse journals than the field as a whole.
7. This methodology has been criticised for its method of calculation – should you sum the citations before dividing, or do the division for each paper and then average? The first method can cause biases, e.g. in favour of publications in fields with high citation numbers, and it is now accepted that the second method is correct (a toy example follows below).
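To make point 7 concrete, here is a small, purely hypothetical Python example contrasting the two calculations: summing before dividing (the original crown indicator, CPP/FCSm) versus averaging per-paper ratios (the mean normalised citation score, MNCS, reported later in Table 5). The figures are invented for illustration.

# Two papers: one from a high-citation field, one from a low-citation field.
# Each tuple is (observed citations, expected citations for its field and year).
papers = [(40, 20.0),   # high-citation field, exactly 2x its field average
          (1, 0.5)]     # low-citation field, also exactly 2x its field average

# Method 1: sum first, then divide (the original crown indicator, CPP/FCSm)
crown = sum(c for c, e in papers) / sum(e for c, e in papers)   # 41 / 20.5 = 2.0

# Method 2: divide per paper, then average (MNCS, now the accepted form)
mncs = sum(c / e for c, e in papers) / len(papers)              # (2.0 + 2.0) / 2 = 2.0

print(crown, mncs)
# Both give 2.0 here, but raise the first paper to 60 citations and the
# sum-first indicator jumps to about 2.98 while MNCS moves only to 2.5:
# the high-citation field dominates the first calculation.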

4. Criticisms of the LRM
1. The use of WoS for collecting citations:
- WoS does not include books, either as targets or for their citations
- Until recently it did not include conference papers
- It does not include many journals
- Coverage differs across disciplines – science is good (> 90%), social science is mediocre (30%–70%), and the arts and humanities are poor (around 20%)
2. The field lists used from WoS are ad hoc and not transparent.
3. Bibliometric methods should not be used by themselves but only in combination with peer review and judgement.
4. Are citations a good measure of quality, or just of impact?

5. The Study – Three UK Business Schools
1. Three business schools were selected, primarily because of the availability of the data, but they are reasonably representative:
A. A new school, but in a very prestigious university
B. A new school in a traditional university undergoing rapid expansion
C. An older school in a modern university aiming to become more research intensive

TABLE 1 RESEARCH OUTPUTS FOR THREE SCHOOLS
Rows: School A, School B, School C. Columns: years covered by outputs; staff in the 2008 UK RAE; authors involved in outputs; total outputs; total journal papers; total outputs in Web of Science. (Cell values not preserved in the transcript.)

TABLE 3 A COMPARISON OF RESEARCH OUTPUTS ACROSS SCHOOLS
For each school (A, B, C) the table gives the number of publications and the number of journals in each ISI WoS subject area: Agriculture*; Business; Business, Finance; Computer Science*; Economics; Engineering; Environmental Sciences; Environmental Studies; Ethics; Food Science & Technology; Geography; Health Care Sciences & Services; Management; Mathematics, Applied; Operations Research & Management Science; Pharmacology & Pharmacy; Planning & Development; Political Science; Public Administration; Social Sciences*; Others (< 10 publications per field); and journal papers not in WoS. (Cell values not preserved in the transcript.)

TABLE 4 EXPECTED CITATIONS PER PAPER BY FIELD AND PERIOD
             Business   Economics   Management
% change     78%        28%         60%
(Per-period values not preserved in the transcript.)

TABLE 5 LEIDEN INDICATOR FOR THREE UK SCHOOLS FOR SEVERAL TIME PERIODS
Columns: P, C, CPP, total papers of journals, total cites of journals, JCSm, CPP/JCSm, CPP/FCSm, MNCS, JCSm/FCSm. Rows: Schools A, B and C for each of five time periods. (Numerical values not preserved in the transcript.)

6. Main Results
1. Raw (un-normalised) CPP: A is always better and is rising. B and C alternate but are also rising, especially in the period before the RAE.
2. JCS/FCS (quality of the journal set): generally around or below 1, showing the journal set is average for the field, but it rises for A (0.99 to 1.23), showing that A is improving the quality of the journals it targets.
3. CPP/FCS (the crown indicator): all schools are above 1.0, so better than the field average, but not hugely so (not 2 or 3 times). A actually fell before rising. Could this be because it was targeting better journals and so competing against a stronger field?
4. MNCS (the alternative calculation): only marginal differences.

5. Do we gain anything by the normalisation?
- A has improved the quality of its journal set.
- A appeared to improve over all years, but in fact fell back, so this was really a field effect.
- It would allow us to compare schools with very different subject mixes, or departments from different subjects.
6. Comparison with the 2008 RAE results (GPA out of 4.0): A – 3.05, B – 2.45, C – 2.50. So there is good agreement.

7. Conclusions
1. The Leiden methodology can be applied in the social sciences, and to business schools, but it requires a huge amount of data collection and processing; it would have to be automated.
2. The results did provide a limited amount of extra value over the raw scores.
3. But there are major problems in using WoS:
- Only 20% of the schools' outputs could actually be evaluated.
- The WoS field categories are poorly defined.
4. At the moment the LRM is NOT suitable for assessing business schools' research performance. Perhaps Google Scholar could be used, although it is unreliable and has no field categories.