
1  A Bibliometric Comparison of the Research of Three UK Business Schools
John Mingers, Kent Business School
j.mingers@kent.ac.uk
March 2014

2  1. Overview
- Introduction
- The Leiden Methodology
- Criticisms of the Leiden Methodology
- The Empirical Study: Three UK Business Schools
- Overview of Publications and Citations
- Differences between Fields
- Results of the Methodology
- Conclusions

3  2. Introduction
- There has been an ever-increasing desire to monitor and measure the quality of a department's research.
- Apart from peer review, the main method for doing this is bibliometrics, generally based on measuring the citations received by published papers (citations per paper, CPP).
- However, there are major differences between the citation patterns of different fields, and these need to be taken into account in any comparison. This means that the data must be normalised with respect to field and time.
- The Leiden ranking methodology (LRM) is one of the best-developed methodologies for comparing departments. It has been used mainly in the sciences, where Web of Science (WoS) data is better. This study is one of the first tests with social science departments.
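For reference, the basic measure underlying everything that follows is the department's mean citations per paper (the formula is written out here for clarity; the slides use the abbreviation only):

\[ \mathrm{CPP} = \frac{1}{n} \sum_{i=1}^{n} c_i \]

where $c_i$ is the number of citations received by the $i$-th of the department's $n$ papers.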

4  3. The Leiden Methodology (crown indicator)
1. Collect all papers from a department over a specific time window, e.g., 5 years.
2. Use the Web of Science to find the citations for each paper (assuming it is included in WoS).
3. Calculate how many citations such a paper would be expected to receive given its field and year of publication:
   a. WoS has field lists of all the journals in a particular field. From these we can find the total citations to papers published in that field in that year, and divide by the number of papers, giving the field citations per paper (FCS).
   b. We can calculate a similar figure just for the set of journals the department actually publishes in (JCS).
4. Total the actual number of citations for all papers and the expected number of citations for all papers. Dividing the first by the second gives the crown indicator, i.e., the average citations per paper relative to the field average (CPP/FCS).

5  (The Leiden Methodology, continued)
5. The value may be:
   - > 1: the department is doing better than the field
   - = 1: it is average
   - < 1: it is below average
   Very good departments may be 2 or 3 times the average.
6. We can also calculate JCS/FCS, which shows whether the department is targeting better or worse journals than the field as a whole.
7. This methodology has been criticised for its method of calculation: should you sum the citations before dividing, or do the division for each paper and then average? The first method can cause biases, e.g., in favour of publications in fields with high citation numbers, and it is now accepted that the second method is correct. (A sketch of the two calculations follows below.)
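To make point 7 concrete, the two orders of calculation can be written out (notation added here: $c_i$ is the actual and $e_i$ the expected, field- and year-normalised citation count of paper $i$):

\[ \mathrm{CPP/FCS} = \frac{\sum_{i=1}^{n} c_i}{\sum_{i=1}^{n} e_i} \qquad\qquad \mathrm{MNCS} = \frac{1}{n} \sum_{i=1}^{n} \frac{c_i}{e_i} \]

A minimal Python sketch on invented numbers (the citation counts below are made up purely to show how the two indicators diverge; they are not from the study):

```python
# Crown indicator (sum first, then divide) vs MNCS (divide first, then average).
papers = [
    # (actual citations, expected citations given the paper's field and year)
    (10, 2.0),   # a paper in a high-citation field
    (1,  0.5),   # a paper in a low-citation field
    (3,  1.0),
]

cites = [c for c, e in papers]
expected = [e for c, e in papers]

# Original crown indicator: total actual over total expected citations.
cpp_fcs = sum(cites) / sum(expected)

# MNCS: normalise each paper by its own expectation, then average.
mncs = sum(c / e for c, e in papers) / len(papers)

print(f"CPP/FCS = {cpp_fcs:.2f}")   # 14 / 3.5  = 4.00
print(f"MNCS    = {mncs:.2f}")      # (5+2+3)/3 = 3.33
```

The paper from the high-citation field dominates both sums, pulling CPP/FCS above MNCS; this is exactly the bias that led to the second method being accepted as correct.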

6  4. Criticisms of the LRM
1. The use of WoS for collecting citations:
   - WoS does not include books, either as targets or for their citations.
   - Until recently it did not include conference papers.
   - It does not include many journals.
   - Coverage is differential across disciplines: science is good (>90%), social science is mediocre (30%-70%), and arts and humanities are poor (20%).
2. The use of field lists from WoS, which are ad hoc and not transparent.
3. Bibliometric methods should not be used by themselves, but only in combination with peer review and judgement.
4. Are citations a good measure of quality, or just of impact?

7  5. The Study: Three UK Business Schools
1. Three business schools were selected, primarily because of the availability of the data, but they were reasonably representative:
   A. New school, but in a very prestigious university
   B. New school in a traditional university undergoing rapid expansion
   C. Older school in a modern university aiming to become more research intensive

TABLE 1: RESEARCH OUTPUTS FOR THE THREE SCHOOLS

School | Years covered by outputs | Staff in the 2008 UK RAE | Authors involved in outputs | Total outputs | Total journal papers | Total outputs in Web of Science
School A | 1981-2008 | 45 | 816 | 1933 | 705 | 403
School B | 1984-2009 | 39 | 675 | 1455 | 629 | 309
School C | 1980-2008 | 39 | 461 | 1212 | 548 | 292
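Table 1 already quantifies the coverage problem picked up again in the conclusions: dividing each school's WoS outputs by its total outputs gives

\[ \tfrac{403}{1933} \approx 21\%, \qquad \tfrac{309}{1455} \approx 21\%, \qquad \tfrac{292}{1212} \approx 24\% \]

so only about a fifth of each school's outputs can be evaluated through WoS at all.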

8  TABLE 3: A COMPARISON OF RESEARCH OUTPUTS ACROSS SCHOOLS
(for each school: number of publications | number of journals)

ISI WoS subject area | A pubs | A jnls | B pubs | B jnls | C pubs | C jnls
Agriculture* | - | - | - | - | 30 | 10
Business | 76 | 34 | 18 | 3 | 21 | 9
Business Finance | 14 | 5 | 11 | 4 | 2 | 2
Computer Science* | 17 | 10 | 3 | 3 | 20 | 14
Economics | 60 | 33 | 37 | 21 | 59 | 23
Engineering | 14 | 10 | 17 | 7 | 20 | 10
Environmental Sciences | 25 | 9 | 2 | 2 | 11 | 6
Environmental Studies | - | - | 6 | 3 | 12 | 7
Ethics | 10 | 2 | - | - | 1 | 1
Food Science Technology | - | - | 2 | 2 | 15 | 4
Geography | 10 | 4 | 7 | 5 | 4 | 4
Health Care Sciences & Services | 3 | 2 | 32 | 6 | - | -
Management | 135 | 46 | 79 | 29 | 78 | 24
Mathematics Applied | 12 | 7 | 6 | 2 | 30 | 13
Operations Research & Management Science | 20 | 10 | 6 | 3 | 60 | 14
Pharmacology Pharmacy | 1 | 1 | 10 | 2 | - | -
Planning Development | 14 | 7 | 10 | 5 | 3 | 4
Political Science | - | - | 15 | 5 | 2 | 2
Public Administration | 4 | 4 | 11 | 5 | 9 | 4
Social Sciences* | 2 | 1 | 71 | 4 | 10 | 6
Others (<10 publications/field) | 30 | 54 | 38 | 90 | 24 | 44
Journal papers not in WoS | 302 | - | 320 | - | 256 | -

9  TABLE 4: EXPECTED CITATIONS PER PAPER BY FIELD AND PERIOD

Period | Business | Economics | Management
2001-2004 | 0.73 | 0.83 | 1.05
2002-2005 | 0.84 | 0.88 | 1.16
2003-2006 | 0.99 | 0.92 | 1.29
2004-2007 | 1.06 | 0.95 | 1.38
2005-2008 | 1.30 | 1.06 | 1.70
% change | 78% | 28% | 60%
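The % change row is simply the relative growth between the first and last periods; for Business, for example:

\[ \frac{1.30 - 0.73}{0.73} \approx 0.78 = 78\% \]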

10  TABLE 5: LEIDEN INDICATORS FOR THREE UK SCHOOLS OVER SEVERAL TIME PERIODS
(P = papers, C = citations)

Period | School | P | C | CPP | Total papers of journals | Total cites of journals | JCSm | CPP/JCSm | CPP/FCSm | MNCS | JCSm/FCSm
2001-2004 | A | 108 | 193 | 1.79 | 17,324 | 15,682 | 0.91 | 1.97 | 1.95 | 2.03 | 0.99
2001-2004 | B | 60 | 94 | 1.57 | 11,359 | 9,532 | 0.84 | 1.87 | 1.69 | 1.70 | 0.91
2001-2004 | C | 56 | 55 | 0.98 | 11,155 | 8,975 | 0.80 | 1.22 | 1.07 | 1.03 | 0.87
2002-2005 | A | 124 | 242 | 1.95 | 17,528 | 18,907 | 1.08 | 1.81 | 1.91 | 1.90 | 1.05
2002-2005 | B | 50 | 53 | 1.06 | 10,454 | 10,154 | 0.97 | 1.09 | 1.03 | 1.07 | 0.95
2002-2005 | C | 70 | 51 | 0.73 | 13,008 | 10,954 | 0.84 | 0.87 | 0.72 | 0.73 | 0.83
2003-2006 | A | 121 | 235 | 1.94 | 20,712 | 24,585 | 1.19 | 1.64 | 1.74 | 1.67 | 1.06
2003-2006 | B | 61 | 80 | 1.31 | 11,809 | 12,224 | 1.04 | 1.27 | 1.15 | 1.20 | 0.91
2003-2006 | C | 75 | 101 | 1.35 | 13,970 | 13,661 | 0.98 | 1.38 | 1.22 | 1.24 | 0.88
2004-2007 | A | 143 | 299 | 2.09 | 24,110 | 32,521 | 1.35 | 1.55 | 1.77 | 1.73 | 1.14
2004-2007 | B | 65 | 102 | 1.57 | 12,661 | 14,724 | 1.16 | 1.35 | 1.31 | 1.34 | 0.97
2004-2007 | C | 94 | 150 | 1.60 | 16,389 | 16,785 | 1.02 | 1.56 | 1.34 | 1.40 | 0.96
2005-2008 | A | 118 | 346 | 2.93 | 23,642 | 41,520 | 1.76 | 1.67 | 2.05 | 2.01 | 1.23
2005-2008 | B | 60 | 162 | 2.70 | 10,111 | 12,954 | 1.28 | 2.11 | 1.87 | 1.97 | 0.89
2005-2008 | C | 79 | 190 | 2.41 | 16,114 | 20,943 | 1.30 | 1.85 | 1.70 | 1.71 | 0.92
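The derived columns of Table 5 can be checked from the raw counts. A minimal Python sketch using the School A, 2001-2004 row (all inputs come from the table itself; FCSm is not shown directly, so it is recovered from the CPP/FCSm column):

```python
# Recompute the derived indicators for one row of Table 5 (School A, 2001-2004).
P, C = 108, 193                                  # papers and their citations
journal_papers, journal_cites = 17_324, 15_682   # totals for the journal set

cpp = C / P                             # citations per paper           -> 1.79
jcsm = journal_cites / journal_papers   # mean cites of the journal set -> 0.91
cpp_jcsm = cpp / jcsm                   # CPP relative to journal set   -> 1.97

cpp_fcsm = 1.95                         # crown indicator, from the table
fcsm = cpp / cpp_fcsm                   # implied field citation score

print(f"CPP = {cpp:.2f}, JCSm = {jcsm:.2f}, CPP/JCSm = {cpp_jcsm:.2f}")
print(f"JCSm/FCSm = {jcsm / fcsm:.2f}")  # -> 0.99, matching the last column
```

MNCS (2.03 in this row) cannot be recomputed this way, since it requires the per-paper expected citation counts, which the table does not contain.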

11  6. Main Results
1. Raw (un-normalised) CPP: A is always better and is rising. B and C alternate, but are also rising, especially in 2005-2008 (before the RAE).
2. JCS/FCS (quality of the journal set): generally around or below 1, showing the journal set is average for the field, but it rises for A (0.99 to 1.23), showing that A is improving the quality of its journals.
3. CPP/FCS (the crown indicator): all schools are above 1.0, so better than the field average, but not hugely so (a very good department would be 2 or 3). A actually fell before rising. Could this be because it was targeting better journals and competing against a stronger field?
4. MNCS (the alternative calculation): only marginal differences from CPP/FCS.

12  (Main Results, continued)
5. Do we gain anything by the normalisation?
   - A has improved the quality of its journal set.
   - A appeared to improve over all years, but in fact fell back, so the apparent improvement was really a field effect.
   - It would allow us to compare schools with very different subject mixes, or departments from different subjects.
6. Comparison with the 2008 RAE results (GPA out of 4.0): A scored 3.05, B scored 2.45, and C scored 2.50, so there is good agreement with the bibliometric ranking.

13  7. Conclusions
1. The Leiden methodology can be applied in the social sciences to business schools, but it requires a huge amount of data collection and processing; it would have to be automated.
2. The results did provide a limited amount of extra value over the raw scores.
3. But there are major problems in using WoS:
   - Only 20% of the schools' outputs could actually be evaluated.
   - The WoS field categories are poorly defined.
4. At the moment the LRM is NOT suitable for assessing business schools' research performance. Perhaps Google Scholar could be used, although it is unreliable and has no field categories.

