The future of the British RAE: the REF (Research Excellence Framework). Jonathan Adams.

Research Assessment Exercise - timeline: 1980s - policy on concentration and selectivity; 1986 - 1st Research Selectivity Exercise; 1989 - modified and formalised as the RAE; 1992 - polytechnics access research funding, enter a streamlined RAE; 1996 and further cycles - higher quality thresholds for funding; 2008 - new Roberts profiling format.

The shift to metrics. Evolution - the RAE is peer review of an evidence portfolio, including data on outputs, training and grants funding; RAE2008 profiling adds emphasis to the data. Discontinuity - the Treasury's 2007 announcement was disruptive, from many perspectives. Compromise - HEFCE consultations shifted emphasis away from the gross simplification, and restored peer review.

Research assessment must support the UK's enhanced international research status. Is the assessment dividend beginning to plateau? Has the RAE delivered all it can?

If there is a shift to metrics, then disproportionate change should be avoided

Research performance - indicators, not metrics. [Diagram: inputs (funding, numbers) enter the research "black box" and outputs (publications) emerge over time; research quality is what we want to know, but inputs and outputs are what we have to use.]

How can we judge possible metrics? Relevant and appropriate - are metrics correlated with other performance estimates? Do metrics really distinguish excellence as we see it? Are these the metrics the researchers would use? Cost effective - data accessibility, coverage, cost and validation. Transparent, equitable and stable - is it clear what the metrics do? Are all institutions, staff and subjects treated equitably? How do people respond, and can they manipulate metrics? Once an indicator is made a target for policy, it starts to lose the information content that initially qualified it to play such a role.

Three proposed data components: research funding; research training; research output - the key quality measure. All have multiple components, PLUS peer review.

HEFCE favours bibliometrics: impact is related to RAE2001 grade (data for UoA14 Biology).

Impact index is coherent across UK grade levels (data for core science disciplines, grade at RAE96).

HEFCE favours bibliometrics: impact is related to RAE2001 grade (data for UoA14 Biology). The residual variance is very great.

What is the right impact score? Correct counts - 25% of cites are to non-SCI outputs. Proliferating versions - how do you collate? Collaboration vs fractional citations - fractional citation counts would work against trends and policy. Self citation - does it matter? It is part of the sociology of research. Normalisation strategies. Clustering into subject groups.
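
The collaboration point above can be made concrete. A minimal sketch, with invented paper records, of how whole counting and fractional counting diverge for collaborative work:

```python
# Invented paper records: number of collaborating institutions and
# the citations each paper received.
papers = [
    {"institutions": 1, "citations": 40},
    {"institutions": 4, "citations": 100},  # big collaboration
    {"institutions": 2, "citations": 10},
]

# Whole counting: every participating institution gets full credit.
whole = sum(p["citations"] for p in papers)

# Fractional counting: credit divided by the number of institutions,
# which works against highly collaborative (often highly cited) work.
fractional = sum(p["citations"] / p["institutions"] for p in papers)

print(whole, fractional)  # 150 70.0
```

Under fractional counting the collaborative paper's contribution shrinks from 100 to 25, which is why the slide notes it would cut against policy encouraging collaboration.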

[Diagram: TOTAL INSTITUTIONAL OUTPUT divides into non-print material, unpublished or client reports etc., and published publications.]

INSTITUTIONAL PUBLICATIONS: books and chapters; conference proceedings; journal articles (will be in WoS within 2-3 months).

INSTITUTIONAL PUBLICATIONS: journals covered by Thomson WoS and/or Scopus, versus articles in journals not covered by Thomson WoS and/or Scopus, or not covered at the time of publication.

INSTITUTIONAL PUBLICATIONS in journals covered by Thomson WoS and/or Scopus. [Diagram: timeline from 2001 to the 2007 census date defines the census period.]

INSTITUTIONAL PUBLICATIONS (journals covered by Thomson WoS and/or Scopus): all papers with an institutional address published by staff and students employed or in training during the census period, including papers with that address published by staff who left or retired before the census date.

INSTITUTIONAL PUBLICATIONS: all papers with an institutional address published by staff and students employed or in training during the census period, plus papers without that institutional address published by staff recruited during the period.

[Diagram: PAPERS BY ADDRESS versus PAPERS BY AUTHOR. Papers published during the census period by staff while at the institution, versus papers published during the census period by staff present at the census date; leavers and recruits shift papers between the two counts.]
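
The leavers-and-recruits bookkeeping on this slide can be sketched with set logic; the paper identifiers are invented for illustration:

```python
# Invented paper identifiers for one institution's census window.
by_address = {"p1", "p2", "p3"}   # papers carrying the address
papers_by_leavers = {"p3"}        # authors gone before the census date
papers_by_recruits = {"p4"}       # written before joining, other address

# Papers-by-author credits staff present at the census date, so it
# drops leavers' papers and pulls in recruits' earlier papers.
by_author = (by_address - papers_by_leavers) | papers_by_recruits

print(sorted(by_author))  # ['p1', 'p2', 'p4']
```

The two census rules therefore count different paper sets for the same institution, which is why the choice between them matters for assessment.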

Quality differentiation: do you assess total activity or selected papers? (data for UoA18 Chemistry)

The average does not describe the profile. Two units in the same field differ markedly in average normalised citation impact (2.39 vs. 1.86) because of an exceptionally high outlier in one group, but the groups have similar profiles.
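
A toy illustration with invented citation-impact scores: one outlier separates the averages (chosen here to echo the slide's 2.39 vs. 1.86) while the medians of the two profiles are identical:

```python
# Invented citation-impact scores for two units of five papers each.
unit_a = [0.5, 1.0, 1.5, 2.0, 6.95]  # one exceptionally high paper
unit_b = [0.5, 1.0, 1.5, 2.0, 4.30]

mean_a = sum(unit_a) / len(unit_a)
mean_b = sum(unit_b) / len(unit_b)

# The medians show that the bulk of the two profiles is the same.
median_a = sorted(unit_a)[len(unit_a) // 2]
median_b = sorted(unit_b)[len(unit_b) // 2]

print(round(mean_a, 2), round(mean_b, 2))  # 2.39 1.86
print(median_a, median_b)                  # 1.5 1.5
```

This is the case for reporting a profile (or a robust statistic) rather than a single mean for a skewed distribution.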

Distribution of data values - income. [Chart: range shown from minimum to maximum.]

Distribution of data values - impact The variables for which we have data are skewed and therefore difficult to picture in a simple way

Simplifying the data picture. Scale data relative to a benchmark, then categorise (could do this for any data set). All journal articles fall into: uncited articles (take out the zeroes); articles cited less often than the benchmark; articles cited more often but less than twice as often; articles cited two to four times as often; articles cited more than four times as often.

Categorising the impact data. This grouping is the equivalent of a log2 transformation. There is no place for zero values on a log scale.
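
The benchmark-relative bands can be sketched as a small function; the benchmark of 8 cites per paper used in the examples is an assumption for illustration:

```python
import math

def impact_category(citations, benchmark):
    """Assign a paper to a benchmark-relative band, mirroring the
    slide's grouping: uncited papers are held out (zero has no place
    on a log scale), and the cited bands are a floor of
    log2(citations / benchmark)."""
    if citations == 0:
        return "uncited"
    ratio = citations / benchmark
    if ratio < 1:
        return "below benchmark"
    band = math.floor(math.log2(ratio))  # 0 -> 1-2x, 1 -> 2-4x, ...
    if band == 0:
        return "1-2x benchmark"
    if band == 1:
        return "2-4x benchmark"
    return ">=4x benchmark"

# With an assumed world-average benchmark of 8 cites per paper:
print(impact_category(0, 8))   # uncited
print(impact_category(5, 8))   # below benchmark
print(impact_category(12, 8))  # 1-2x benchmark
print(impact_category(40, 8))  # >=4x benchmark
```

The doubling boundaries (1x, 2x, 4x) are exactly the integer steps of the log2 scale, which is why the grouping and the transformation are equivalent for cited papers.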

UK ten-year profile: 680,000 papers, average RBI = 1.24. [Chart: distribution marked with the mode, the mode of cited papers, the median, and a possible threshold of excellence.]

Profiles are informative and work well across institutions and subjects

HEIs – 10 year totals smoothed Absolute volume would add a further element for comparisons

HEIs – 10 year totals by volume

Normalisation strategy will affect the outcome (data for UoA13 Psychology).
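
A minimal sketch of why the baseline matters, with invented counts and baselines: the same raw citation count can land above or below "world average" depending on the field it is rebased against:

```python
# Invented counts and baselines: the same 10 citations rebase to very
# different normalised impact depending on the reference set chosen.
papers = [
    {"cites": 10, "field": "bio-psychology"},
    {"cites": 10, "field": "social-psychology"},
]
world_average = {"bio-psychology": 20.0, "social-psychology": 5.0}

for p in papers:
    p["normalised"] = p["cites"] / world_average[p["field"]]

print([p["normalised"] for p in papers])  # [0.5, 2.0]
```

For a mixed unit such as Psychology, whether papers are rebased against a biological or a social-science baseline can flip the apparent result.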

Subject clustering needs to fit UK research. [Tree diagram: similarity in the frequency with which journals were submitted to RAE1996, grouping the units of assessment (from Clinical Lab Sciences through Chemistry, Engineering and Economics to Celtic Studies) into clusters - Medical, Bio-Med, Environment, Physical, Maths, Engineering, Social, and Arts & humanities.]

How should we map data to disciplines - i.e. what is "Chemistry"? (Thomson subject categories shown.)

How well do metrics respond to variation? Subject differences - can we accept differences in criteria and balance between clusters? What about divergence within clusters? How do metrics support the growth of interdisciplinarity? How can emerging (marginal?) research groups be recognised? Differences in mode - where is the balance between basic and applied research? Differences in people - career breaks, career development.

How well do metrics represent different HEIs? Output coverage by articles on Thomson Reuters databases

What will it cost? Data costs - core data: how much, from whom? Data cleaning and validation - pilot studies are elucidating this, and the task is big. Requirements on institutions - pilot studies will elucidate this. System development. System maintenance. Will it cover institutional quality assurance?

Other issues. Census period - what about synchrony and sequence? Weighting indicators - ERA will weight research training at 0; need to weight within types as well as between. Interface between quantitative (indicators) and qualitative (peer review) - role of panel members; risk of mis-match.

Do outputs hang together with income and training? We can tell you… You are the REF. Check it out now: RAE2008.com

How can we judge possible metrics? Relevant and appropriate - YES: technical correctness of metrics is not a problem, but there is a lot of work to do in refining and comparing options. Cost - MAYBE: data accessibility is not a problem, but we have yet to scope full system requirements. So is there a problem? Are all subjects, HEIs, staff and modes treated equitably? What will 50,000 intelligent people start to do? Goodhart's Law - for how long will the metrics track excellence? Researchers must decide, not metricians (RMM, 1997). The devil is in the detail: get involved.

REF pilot projects: 20+ institutions (July 08); collect and collate databases, reconciling authors to staff (Oct 08); compare Thomson and Scopus coverage; collate and normalise citation counts (Dec 08); run evaluations of alternative methodologies; disseminate outcomes and consult (Mar 09).

Over 8,000 people participated in recent PBRF rounds (50,000 in the RAE). Thomson recorded fewer than 5,000 articles per year recently (100,000 for the UK). That is less than one article per NZ researcher per year.

Implications for Aotearoa New Zealand. Relative data coverage - balance of regional journals; international = trans-Atlantic; the relevance of citations. Scale factors and relative load - fixed costs. Community size and anonymity. Compatibility of stakeholder and researcher views on assessment outcomes.
