Snowball Metrics – providing a robust methodology to inform research strategy – but do they help? John Green, Chair of the Snowball Steering Group, formerly Chief Operating Officer, Imperial College London.

Presentation transcript:

Snowball Metrics – providing a robust methodology to inform research strategy – but do they help?
John Green, Chair of the Snowball Steering Group; formerly Chief Operating Officer, Imperial College London; now Life Fellow, Queens' College Cambridge (jtg11@cam.ac.uk)
Anna Clements, University of St Andrews, UK; Board member, euroCRIS; Deputy Chair, CASRAI-UK; Chair, Pure UK Strategy Group
Peter Darroch, Elsevier, Snowball Metrics Project Manager

Snowball Metrics address university-driven benchmarking
Universities need standard metrics to benchmark themselves and know their position relative to peers, so they can strategically align resources to their strengths and weaknesses.
Snowball Metrics UK Project Partners
The Snowball Metrics project started about four years ago, following a report led by the Chair of the Snowball Metrics Steering Committee, John Green, when he was Chief Operating Officer at Imperial College London. One of the things the report highlighted was the need for standard metric definitions, so that universities can benchmark themselves against their peers and align their strategies more effectively to their strengths and weaknesses. The slide shows the original nine partners in the project. The university partners account for around 50% of UK research output and research spend; Elsevier, as a partner, provided overall project management, the capability to test the feasibility of the metrics, and help in communicating and distributing the programme through its global networks.

Snowball Metrics approach
Vision: Snowball Metrics enable benchmarking by driving quality and efficiency across higher education's research and enterprise activities, regardless of system and supplier.
Bottom-up initiative: universities define and endorse metrics to generate a strategic dashboard. The community is their guardian.
Draw on all data: university, commercial and public.
Ensure that the metrics are system- and tool-agnostic.
Build on existing definitions and standards where possible and sensible.
The Snowball Metrics project aims to publish a series of free-to-use metrics that become the international standard endorsed by research-intensive universities, so that they can build and monitor their strategies. Importantly, it is a bottom-up, or sector-led, initiative in which universities agree a single method of calculating metrics about their own performance, so they can compare themselves against each other in an apples-to-apples way across the whole research workflow, from inputs, through processes, to outputs and outcomes. Snowball Metrics are not proprietary: they do not depend on a particular data source or supplier, and they draw on the combination of data sources available to universities, whether institutional, third-party or commercial. They should also be system- and tool-agnostic and build on existing standards where possible, which is where the CASRAI work is obviously relevant.

Globalizing Snowball Metrics
US: University of Michigan, University of Minnesota, Northwestern University, University of Illinois at Urbana-Champaign, Arizona State University, MD Anderson Cancer Center, Kansas State University
Australia / New Zealand: University of Queensland, University of Western Australia, University of Auckland, University of Wollongong, University of Tasmania, Massey University, The University of Canberra, Charles Darwin University
Interest and support from: Japan RU11 metrics group, Association of Pacific Rim Universities (APRU), European Commission for H2020, Fundação para a Ciência e a Tecnologia (FCT) in Portugal
Snowball Metrics started in the UK, but the desire is to globalize the metrics. The US group has been partially successful: a few universities remain interested, but uptake has not been as quick as hoped, for many reasons. We do, however, have around 12 US-specific versions of metrics in which the subject areas have been mapped to the NSF HERD survey subject areas. The Australia / New Zealand group has been slow and not very successful to this point. We have also had interest from the other organizations listed. UK and US universities could now benchmark themselves, but key to further success is also our work with standards bodies such as CASRAI and euroCRIS. One issue may be that there is currently no way for universities to see and share each other's metrics.

The output of Snowball Metrics
"Recipes" – agreed and tested metric methodologies – are the output of Snowball Metrics.
From the Statement of Intent:
Agreed and tested methodologies are, and will continue to be, shared free of charge.
None of the project partners will at any stage apply any charges for the methodologies.
Any organization can use these methodologies for their own purposes, public service or commercial.
www.snowballmetrics.com/metrics
Statement of Intent available at http://www.snowballmetrics.com/wp-content/uploads/Snowball-Metrics-Letter-of-Intent.pdf

The Snowball Metrics Landscape
A mutually agreed and tested basket of metric 'recipes':
Research Inputs: Research Applications Volume; Awards Volume; Success Rate
Research Process: Income Volume; Market Share
Research Outputs and Outcomes
  Publications and citations: Scholarly Output; Citation Count; Citations per Output; h-index; Field-Weighted Citation Impact; Outputs in Top Percentiles; Publications in Top Journal Percentiles
  Collaboration: Collaboration Impact; Collaboration Field-Weighted Citation Impact; Collaboration Publication Share; Academic-Corporate Collaboration; Academic-Corporate Collaboration Impact
  Impact: Altmetrics; Public Engagement; Academic Recognition
  Enterprise Activities / Economic Development: Academic-Industry Leverage; Business Consultancy Activities; Contract Research Volume; Intellectual Property Volume; Intellectual Property Income; Sustainable Spin-Offs; Spin-Off-Related Finances
  Postgraduate Education*: Research Student Funding; Research Student to Academic Staff Ratio; Time to Approval of Doctoral Degree; Destination of Research Student Leavers
* In feasibility testing, to be published in the 3rd recipe book due 2016
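Several of the publication recipes in this basket are ratios rather than raw counts. As an illustration only, and not the published recipe text, a field-weighted citation impact of the general kind listed above can be sketched as the average, over an institution's outputs, of actual citations divided by the citations expected for comparable outputs (same field, year and publication type). The data layout and function names below are assumptions made for the sketch.

```python
# Illustrative sketch of a field-weighted citation impact of the kind listed
# in the basket above. The data layout and function names are hypothetical,
# not the published Snowball Metrics recipe.

def field_weighted_citation_impact(outputs, expected_citations):
    """outputs: list of dicts with 'citations', 'field', 'year', 'type'.
    expected_citations: dict mapping (field, year, type) to the average
    citation count for comparable outputs (assumed to come from a
    bibliometric data source)."""
    ratios = []
    for o in outputs:
        expected = expected_citations.get((o["field"], o["year"], o["type"]))
        if expected:  # skip outputs with no comparable baseline
            ratios.append(o["citations"] / expected)
    return sum(ratios) / len(ratios) if ratios else None

# Example: one paper cited at twice the expected rate, one at 0.4 times it.
outputs = [
    {"citations": 10, "field": "Chemistry", "year": 2014, "type": "article"},
    {"citations": 2,  "field": "Chemistry", "year": 2014, "type": "article"},
]
expected = {("Chemistry", 2014, "article"): 5.0}
print(field_weighted_citation_impact(outputs, expected))  # 1.2
```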

Denominators

Towards global standards for benchmarking
Snowball Metrics denominators should enable global benchmarking as far as possible. We do not yet know how this will look, but one possibility is:
Common core: where benchmarking against global peers can be conducted. The aim is to make this as large as possible.
Shared features: where benchmarking between Countries 1 and 2 can be supported by shared or analogous data, e.g. regional benchmarking.
National peculiarity: supports benchmarking within Country 1, but not globally, i.e. national benchmarking.
(Illustrative only, testing underway: UK metrics, Japan metrics, Australia / New Zealand metrics.)
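One way to picture this denominator model is as a mapping from each country's native classification onto whatever common core the partners agree. The scheme names and categories below are placeholders, not agreed Snowball denominators; this is only a sketch of the "common core plus national peculiarity" idea.

```python
# Hypothetical sketch of the "common core / national peculiarity" idea for
# denominators. Scheme names and categories are placeholders, not agreed
# Snowball Metrics denominators.

COMMON_CORE = {"Physical Sciences", "Life Sciences", "Social Sciences"}

# Each country maps its native subject scheme onto the common core where it
# can; anything left unmapped is usable only for national benchmarking.
COUNTRY_1_SCHEME = {
    "Chemistry cost centre": "Physical Sciences",
    "Clinical medicine cost centre": "Life Sciences",
    "Country-specific professional category": None,  # national peculiarity
}

def core_category(country_scheme, native_label):
    """Return the common-core category for a native label, or None if the
    label is a national peculiarity that cannot be benchmarked globally."""
    mapped = country_scheme.get(native_label)
    return mapped if mapped in COMMON_CORE else None

print(core_category(COUNTRY_1_SCHEME, "Chemistry cost centre"))                  # Physical Sciences
print(core_category(COUNTRY_1_SCHEME, "Country-specific professional category")) # None
```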

Snowball Metrics eXchange
Institutions A and B add metric values to the SMP, e.g. Scholarly Output = 1,376.
Institutions A and B agree to share encrypted metric values and to manage entitlements in the SMX (values are held only as ciphertext).
Institutions A and B retrieve or update metric values in the SMC, e.g. Scholarly Output = 1,376.

Snowball Metrics eXchange: data flow
Data sources and systems on the input side (databases, InCites, SciVal, Symplectic, Pure, Converis, spreadsheets) supply metric values to the Snowball Metrics eXchange, which holds them in encrypted form; the same range of systems on the output side retrieves the values an institution is entitled to see, e.g. Scholarly Output = 1,376.

Snowball Metrics eXchange: how it works
Any institution that is using Snowball Metrics can become a member of the exchange.
Institutional members are responsible for generating Snowball Metrics according to the recipes.
An institution can be a member of one or more 'benchmarking clubs'.
Institutions choose what to share.
The exchange service encrypts all metrics, and only entitled institutions can decrypt them.
The data underlying the metrics will never be exchanged.
A CRIS could act as both provider and client, communicating directly with the exchange APIs (see the sketch below).
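To make the "encrypt, entitle, decrypt" idea concrete, here is a minimal sketch of what a CRIS acting as provider and client might do. The key handling and entitlement model shown are assumptions; the slides do not specify how the SMX is implemented.

```python
# Hypothetical sketch of the exchange flow: an institution encrypts a metric
# value, shares it with a benchmarking club, and an entitled peer decrypts it.
# Key distribution and entitlement checks are assumptions, not the specified
# Snowball Metrics eXchange design.
import json
from cryptography.fernet import Fernet  # symmetric encryption (pip install cryptography)

# Assume the benchmarking club has agreed a shared key out of band.
club_key = Fernet.generate_key()

def publish_metric(metric_name, value, year, key):
    """Encrypt a metric value, as an institution's CRIS might do before
    posting it to the exchange."""
    payload = json.dumps({"metric": metric_name, "value": value, "year": year})
    return Fernet(key).encrypt(payload.encode())

def read_metric(token, key):
    """Decrypt a metric value retrieved from the exchange; only members
    holding the club key (i.e. entitled institutions) can do this."""
    return json.loads(Fernet(key).decrypt(token).decode())

token = publish_metric("Scholarly Output", 1376, 2014, club_key)
print(token[:20], "...")             # ciphertext, as held by the exchange
print(read_metric(token, club_key))  # {'metric': 'Scholarly Output', 'value': 1376, 'year': 2014}
```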

Case study – St Andrews: review of institutional KPIs in 2014
Made use of Snowball Metrics, including Outputs in Top Percentiles and Success Rate.
Advantages of using Snowball Metrics: tried and tested by the Snowball partners; "benchmark-ready"; specific questions answered.
It was important that the metrics had been agreed by an expert group from the partner institutions and then tested with real data. We want to be in a position to benchmark, either one-to-one via exchange of data or as part of the SMX. We were asking the same questions internally here at St Andrews, so it was good to have something to refer to; why reinvent the wheel?

Success Rates: questions
What do we count? Record three states: success / pending / rejection. Use the most up-to-date price available, a pragmatic decision: we all have this in our systems, whereas we don't all keep the requested price, for example.
When do we count it? Record against the year of application, so rates can change retrospectively. Agree a 12-month write-off across all funders; we need a workable rule for benchmarking, and this was a sensible compromise.
Price awarded versus requested: we hold both in our systems, but several of the project partners only have data available for the price actually awarded and cannot link back to the originally requested price. As more institutions improve the links between their pre- and post-award systems this should change, and Snowball notes that the recipe will need revisiting.
When do we count the success or failure? Our finance team were reluctant at first to accept a recipe under which rates could change retrospectively. However, Snowball Metrics are indicators to help measure progress towards institutional objectives and to benchmark with peers; they are not subject to accounting rules. Indeed, a Success Rate is a percentage, and for it to make sense the success or failure has to be recorded against the application. With an agreed 12-month write-off period, the most we need to do is look back 12 months to get the final picture, which is another advantage of that agreement. A sketch of the calculation follows.
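As an illustration of the counting rules just described (not the published recipe text), the sketch below records each application against its year of application, treats anything still pending after the agreed 12 months as written off, and reports the success rate as a percentage. The field names, and the choice to count written-off applications as unsuccessful, are assumptions made for the sketch.

```python
# Illustrative sketch of a success-rate calculation under the rules discussed
# above: count against year of application, apply a 12-month write-off.
# Field names, and treating written-off applications as unsuccessful, are
# assumptions rather than the published Snowball Metrics recipe.
from datetime import date

applications = [
    {"applied": date(2013, 3, 1), "outcome": "success"},
    {"applied": date(2013, 6, 1), "outcome": "rejection"},
    {"applied": date(2013, 9, 1), "outcome": "pending"},  # older than 12 months: written off
    {"applied": date(2014, 2, 1), "outcome": "pending"},  # too recent to count either way
]

def success_rate(apps, year, today):
    """Success rate (%) for applications made in `year`, with pending
    applications older than 12 months written off as unsuccessful."""
    successes = counted = 0
    for a in (a for a in apps if a["applied"].year == year):
        months_old = (today.year - a["applied"].year) * 12 + (today.month - a["applied"].month)
        if a["outcome"] == "pending" and months_old < 12:
            continue  # still within the write-off window, not counted yet
        counted += 1
        successes += a["outcome"] == "success"
    return 100.0 * successes / counted if counted else None

print(success_rate(applications, 2013, date(2014, 10, 1)))  # 33.3...% (1 success out of 3 counted)
print(success_rate(applications, 2014, date(2014, 10, 1)))  # None: nothing countable yet
```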

Success Rates: example
An example of how the Success Rates indicator can be represented.

So do metrics help?
It is not a choice between all metrics or no metrics; it is not a black-and-white decision.
Metrics can provide data points on which to build, using expert opinion (peer review) to delve deeper and to deal with outliers.
Metrics are not a replacement for human judgement; they complement it.
Universities already widely adopt metrics and tools.
We value objective, normalized, universal information that enables meaningful comparisons. After all, academia is an evidence-based activity!
Metrics are not the antithesis of peer review: (biblio)metrics incorporate decisions made by peer review, e.g. whether to publish, what to cite.
But metrics are not just bibliometrics; there are many measures that can and should be used.
First define the question; then pick the metrics to answer it.
Snowball Metrics Project Partners*: University of Oxford, University College London, University of Cambridge, Imperial College London, University of Bristol, University of Leeds, Queen's University Belfast, University of St Andrews, Elsevier
* Ordered according to productivity 2011 (data source: Scopus)

Snowball Metrics http://www.snowballmetrics.com/
John Green (jtg11@cam.ac.uk), Anna Clements, Peter Darroch