THE REF AND BIBLIOMETRICS Presentation at Northampton University, 3/2/09 Charles Oppenheim Loughborough University

MY CREDENTIALS
Have undertaken research on the links between RAE results and bibliometrics since the mid-1990s
Member of the committee advising HEFCE on the use of bibliometrics in the REF, and on the pilot exercise comparing it with the 2008 RAE

ONE IMPORTANT QUESTION
Is the RAE a way of evaluating past performance, a way of predicting future performance, or a way of working out how much QR money to dish out?
The three are not identical, yet the RAE tries to be all three
Evaluating past output (plus PhD completions, research income achieved, etc.) does the first; evaluating RA5, the statement of future research plans, does the second

THE REF
Announced by Gordon Brown when he was Chancellor of the Exchequer (so it is clear that the motivation is cost-cutting)
To be metrics-based – details left to HEFCE et al. to sort out
HEFCE itself was evidently surprised by the announcement

THE REF
HEFCE commissioned expert advice on the use of bibliometrics and consulted the community on key elements of the REF
Large number of responses; consultation outcomes published on the HEFCE website
Significant modification announced in April 2008: a combination of metrics-based indicators, including bibliometrics where appropriate, as well as input from expert panels for all subjects

THE PILOT
Trial run of the bibliometrics approach using RAE 2008 data; ongoing right now
Main purpose of the pilot is to assess two things: do the bibliometric results correlate with the actual RAE results, and what are the administrative and technical burdens on HEIs in doing the pilot?
Broad results will be published; participating HEIs will get detailed results, to be retained for a short period and only for the purpose of feeding back to HEFCE any errors or issues

THE REF PILOT IN PRACTICE
Collect ALL papers written by staff submitted to the 2008 RAE by selected HEIs in the selected subject areas
Assign the papers to somewhere between 100 and 250 subject categories (probably two runs, one with the smaller and one with the larger number of categories)
Calculate: average number of citations per article; the same again, but ignoring the top and bottom 25% of results; percentage of articles uncited (see the sketch below)
Calculate: world average number of citations per article in the chosen subject area over the chosen time period
Calculate: percentage of articles from the HEI that are above the world average
N.B. Subject areas are based on the journal title and where it is assigned by Thomson Reuters; non-journal items are ignored (for the pilot, only the hard sciences and life sciences are being examined)
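A rough, purely illustrative sketch of the arithmetic described above, assuming a plain list of citation counts per submitted article and an externally supplied world average; the function name, the trimming rule and the example figures are invented for illustration, not HEFCE's specification:

```python
# Illustrative sketch only: the indicator definitions are assumptions based
# on the slide text, not HEFCE's actual pilot methodology.

def hei_indicators(citation_counts, world_average):
    """Pilot-style indicators for one HEI in one subject area."""
    n = len(citation_counts)
    ordered = sorted(citation_counts)

    # Average number of citations per article
    mean_citations = sum(ordered) / n

    # The same again, ignoring the top and bottom 25% of results
    trim = n // 4
    trimmed = ordered[trim:n - trim] or ordered
    trimmed_mean = sum(trimmed) / len(trimmed)

    # Percentage of articles that are uncited
    pct_uncited = 100 * sum(1 for c in ordered if c == 0) / n

    # Percentage of articles above the world average for the subject area
    pct_above_world = 100 * sum(1 for c in ordered if c > world_average) / n

    return {
        "mean": mean_citations,
        "trimmed_mean": trimmed_mean,
        "pct_uncited": pct_uncited,
        "pct_above_world_average": pct_above_world,
    }

# Example: ten articles from one department, assumed world average of 4.0
print(hei_indicators([0, 0, 1, 2, 3, 5, 6, 8, 12, 30], world_average=4.0))
```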

FURTHER CALCULATIONS
Do the same, BUT (variants sketched below):
Ignore all review articles (identified by algorithm)
Add in/exclude papers published during any previous employment not at this HEI
Exclude papers by Category C staff (medicine)
Restrict to the 6 papers with the highest number of citations
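Continuing the hypothetical sketch above, these variant runs amount to simple filters applied before the indicators are recomputed; the record layout and the review flag are invented for illustration:

```python
# Hypothetical variant runs, reusing hei_indicators() from the earlier sketch.
# The (citations, is_review) record layout is invented for illustration.
papers = [(30, True), (12, False), (8, False), (6, False), (5, False),
          (3, False), (2, False), (1, False), (0, False), (0, False)]

# Ignore all review articles
non_reviews = [c for c, is_review in papers if not is_review]

# Restrict to the 6 papers with the highest number of citations
top_six = sorted(non_reviews, reverse=True)[:6]

print(hei_indicators(top_six, world_average=4.0))
```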

FINALLY
See which of the combinations provides the best correlation with the actual RAE results
HEFCE will digest the results and will then probably follow the best combination in running the real REF

HOW WILL THE REAL REF WORK?
Department submits (probably) all papers published by (probably) all staff over a certain time period for review (the time period will depend on the subject area; shorter for fast-moving subjects)
There are issues around checking who is employed by the HEI and around maintaining a master list of publications – all of this will force HEIs to get their management information in order

NEXT STAGE
HEFCE counts the numbers of citations to all the papers and totals them up using WoS and/or SCOPUS (for the pilot, it is just WoS)
HEFCE assigns papers to a subject area
HEFCE calculates the world average number of citations per paper per year for that subject area
A profile, along the lines of RAE 2008, will then be created of the proportion of papers from the department that are uncited, below the world average, at the world average and above the world average; maybe by percentiles (a sketch of such a profile follows below)
Decisions are yet to be made about excluding certain publications from these calculations, e.g. papers in “popular” outlets, or review papers (characterised by the number of citations in that article)
Followed by a round of peer review (“light touch” for STM, heavier touch for arts/humanities) to amend profiles in the light of the particular circumstances of the department/subject area
The profile still forms just one component of the final REF assessment of a UoA – PhDs, research income, etc., still get considered
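A minimal sketch of what such a profile might look like in code, assuming four bands and a simple ±10% tolerance for “at world average”; both the bands and the tolerance are illustrative assumptions, not the actual REF specification:

```python
# Illustrative only: the four bands and the 10% tolerance for "at world
# average" are assumptions for the example, not REF rules.

def citation_profile(citation_counts, world_average, tolerance=0.10):
    """Return the percentage of a department's papers in each band."""
    bands = {"uncited": 0, "below world average": 0,
             "at world average": 0, "above world average": 0}
    low = world_average * (1 - tolerance)
    high = world_average * (1 + tolerance)

    for c in citation_counts:
        if c == 0:
            bands["uncited"] += 1
        elif c < low:
            bands["below world average"] += 1
        elif c <= high:
            bands["at world average"] += 1
        else:
            bands["above world average"] += 1

    n = len(citation_counts)
    return {band: 100 * count / n for band, count in bands.items()}

# Example: a department's papers against an assumed world average of 4.0
print(citation_profile([0, 1, 3, 4, 4, 5, 7, 9, 15, 22], world_average=4.0))
```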

WHY BIBLIOMETRICS?
Civil servants clearly felt that this would provide a cheap and reliable method of evaluating research
But, following up the One Important Question, it is backward-looking only and does not evaluate future research strategy
There are other issues as well, as we shall see!

CHEAP AND RELIABLE?
I’m partly to blame for this
In a series of articles published since 1997, I have demonstrated the statistically significant correlation between RAE results and citation counts – and have argued that citation counting could and should be used as a cheap and reliable substitute for expensive and subjective peer review
It’s possible (I don’t know) that Treasury civil servants read my articles and were persuaded by them

IF THIS IS WHAT THE CIVIL SERVANTS DID…
…then they were being naïve
I made it clear that to reliably undertake such studies, you needed subject experts to carry out the analyses manually
Instead, the Treasury instructed HEFCE to go for a purely algorithmic approach

THE EVIDENCE
All studies carried out so far have shown a statistically significant correlation between RAE scores and citation counts (see the illustrative test below)
Subjects evaluated include: archaeology; business studies; genetics; library and information management; engineering; music; psychology
So, the whole gamut of pure science, engineering, social sciences and humanities – but not medicine yet
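The kind of check involved is straightforward to reproduce for a single unit of assessment; the sketch below uses Spearman rank correlation, a common choice for ordinal RAE grades, on invented numbers (the published studies differ in data and detail):

```python
# Illustrative sketch with invented numbers: real studies use the actual
# RAE scores and citation counts of departments in a unit of assessment.
from scipy.stats import spearmanr

# Hypothetical departments: RAE score and total citation count
rae_scores = [5.0, 4.0, 5.5, 3.0, 4.0, 5.0, 2.0, 3.5]
citations  = [420, 310, 530, 150, 280, 450,  90, 200]

rho, p_value = spearmanr(rae_scores, citations)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```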

THE CORRELATIONS ARE HARDLY SURPRISING
Citation counts are a measure of impact
And impact is closely related to quality
Nonetheless, the two concepts are not synonymous
We don’t really know what the RAE peer panels were evaluating; “international standard” research = international impact?

BUT IF THE CIVIL SERVANTS WERE NAÏVE, SO ARE CRITICS OF CITATION ANALYSIS
A long-familiar catalogue of criticisms, aptly called “fairy tales” by Ton van Raan, head of CWTS in Leiden, the organisation managing the REF pilot:
– ISI’s Web of Knowledge has poor coverage of the humanities, computer science, conferences, monographs…
– Poor coverage of non-English-language sources
– Co-authors only included post-2000
– People with the same surname and initials
– The same person using different names, e.g. after marriage

MORE FAIRY TALES
There are also the issues of…
– Clerical errors by ISI
– Citing for the wrong reasons, e.g. to impress referees, or because material is conveniently to hand
– Not all influences being cited
– Mistakes by the author in citing, e.g. in the title or the author surname
– Deliberately controversial or erroneous articles designed to attract negative citations
– Self-citation
– Mutual citation within a group (“citation clubs”)
– Deliberately choosing high Impact Factor journals to improve citation counts
– Journal editors forcing authors to cite references from their journal

TYPICAL OF THE NAÏVE/UNINFORMED COMMENTS
Ron Johnston, former VC of Essex University, in THE, 8/5/08, p. 24:
“ISI data cannot be readily downloaded to be normalised to produce reliable measures”
“No evidence that citation scores and RAE scores are correlated”
“Evaluation can be done only by peer review”

YES, IT IS TRUE THAT…
WoS is not strong in its coverage of humanities journals
Not strong on non-English sources
The humanities, engineering and computer science are less dependent on journals than other subject areas
But the correlations are still there!

WHAT ABOUT THE REST?
Citing for the wrong reasons: rare and not statistically significant
Mis-citing: a fairly constant problem in all subject areas – no impact overall
Deliberately controversial articles: no increase in overall citations
Self-citation: no statistically significant effect
Mutual citation within a group: no evidence of this
Choice of high Impact Factor journals: article quality counts, not the IF

POSSIBLE ALTERNATIVE SOURCES
SCOPUS – a serious contender; better coverage than WoS of engineering, conferences, etc., and more global in coverage
Easier to analyse the data as well, for various technical reasons – less cleaning up needed
Main downside – currently untested, and the database does not go back that far
Likely to be a global deal (as with Web of Knowledge) so that HEIs can access SCOPUS at reasonable cost
Google Scholar – the data is very dirty and contains duplication; the data structure is not suited to citation analysis; these points ruin its great potential for a wide range of subjects

A KEY POINT
No matter how convincing the objective arguments might be, if people don’t “buy into” the concept, there will be problems
Most academics simply don’t believe citation counts are an adequate substitute for peer review
So the current approach to the REF, combining bibliometrics with peer review, makes a lot of sense

WHERE WE HAVE ENDED
Civil servants were naïve to think simple citation counts would do the trick
Many academics are naïve in believing that citation counts cannot work in their subject area
The proposed new REF gives us the best of both worlds
But what weighting for bibliometrics and peer review?
Will a new Government scrap the REF altogether?

REF VERSUS RAE
REF – all the data is in the public domain, so anyone can replicate the calculations and check that they have been done correctly; the numbers are “objective”
RAE – decisions taken behind closed doors
HEFCE knows the use of bibliometrics is controversial, and is determined to involve stakeholders at all stages of the pilot and the implementation of the REF

ANY QUESTIONS?