Snowball Metrics: A standard for research benchmarking between institutions
Dr Lisa Colledge, Director of Research Metrics, Elsevier, and Snowball Metrics Program Manager

Snowball Metrics recap

The origins of Snowball Metrics
A competitive award won by Imperial College London and Elsevier to investigate the state of research management in the UK. Clear trends were voiced:
“Unless you have [data] you cannot make informed decisions; you would be acting based on opinions and hearsay.”
“[There is little] thought leadership and knowledge development around best practice.”
“The principal drivers for our systems are often external… but they shouldn’t be… a research strategy should… be developed… to respond to our strengths and the external environment; our systems should be defined to run our business.”
Report available at: management1.pdf

University recommendations
1. Universities and funders should work more collaboratively.
2. Universities should develop stronger relationships with suppliers.
“Universities should work together more to make their collective voice heard by external agencies.”
“The lack of a long-term vision makes it hard to… co-operate within a university let alone across the sector.”
“Suppliers do not know what research offices do on a daily basis.”
“How educated are we at asking suppliers the right questions?”
“Someone needs to take ownership of the process: it is impossible to please all of the people all of the time, so somebody needs to be strong enough to stand behind decisions and follow through.”
“It would be great if the top five [universities] could collaborate.”

Snowball Metrics project partners
UK group
US group: University of Michigan, University of Minnesota, Northwestern University, University of Illinois at Urbana-Champaign, Arizona State University, MD Anderson Cancer Center, Kansas State University
Australia/New Zealand group: University of Queensland, University of Western Australia, University of Auckland, University of Wollongong, University of Tasmania, Massey University, University of Canberra, Charles Darwin University

Snowball Metrics in a nutshell
Vision: Snowball Metrics enable benchmarking by driving quality and efficiency across higher education’s research and enterprise activities, regardless of system and supplier.
Bottom-up initiative: universities define and endorse metrics to generate a strategic dashboard. The community is their guardian.
Draw on all data: university, commercial and public.
Ensure that the metrics are system- and tool-agnostic.
Build on existing definitions and standards where possible and sensible.

Main roles and responsibilities
Everyone covers their own costs.
Universities:
– Agree the metrics to be endorsed as Snowball Metrics
– Determine methodologies that generate the metrics in a commonly understood manner, to enable benchmarking regardless of systems
Elsevier:
– Ensures that the methodologies are feasible
– Distributes the outputs using global communications networks
– Provides day-to-day project management of the global program
Outside the remit of the Snowball Metrics program:
– The nature and quality of data sources used to generate Snowball Metrics
– The provision of tools to enable the generation and use of Snowball Metrics

The output of Snowball Metrics
“Recipes” – free, agreed and tested metric methodologies – are the output of Snowball Metrics.
From the Statement of Intent:
Agreed and tested methodologies… are and will continue to be shared free-of-charge.
None of the project partners will at any stage apply any charges for the methodologies.
Any organization can use these methodologies for their own purposes, public service or commercial.
Statement of Intent available at: Letter-of-Intent.pdf
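To make the idea of a recipe concrete, here is a minimal sketch, assuming a hypothetical structure: the kind of elements a metric methodology pins down. The field names and example values are illustrative, not taken from the recipe book.

```python
# A minimal sketch, assuming a hypothetical structure: the field names and
# example values are illustrative, not taken from the recipe book.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Recipe:
    """A metric methodology: what is counted, from where, and how it can be sliced."""
    name: str
    numerator: str                      # what is counted
    denominator: Optional[str] = None   # optional normalisation basis
    data_sources: list = field(default_factory=list)
    slicing_dimensions: list = field(default_factory=list)

awards_volume = Recipe(
    name="Awards Volume",
    numerator="value of grants awarded",
    data_sources=["institutional grants system"],
    slicing_dimensions=["year", "funder type", "discipline"],
)
```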

Exchange of Snowball Metrics
Metrics are exchanged, never data: “I’ll show you mine if you show me yours.”
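A minimal sketch of this principle: each partner computes the metric from records that never leave the institution, and only the aggregated value is shared. The function, record fields and partner name are hypothetical.

```python
# Hypothetical sketch: the metric is computed from records that never leave
# the institution, and only the aggregated value is shared with partners.
def scholarly_output(publication_records, year):
    """Count publications for one year from data held in-house."""
    return sum(1 for rec in publication_records if rec["year"] == year)

local_records = [                      # stays inside the institution
    {"title": "Paper 1", "year": 2014},
    {"title": "Paper 2", "year": 2013},
]

exchange_payload = {                   # the only thing that crosses the boundary
    "institution": "University A",     # hypothetical partner name
    "metric": "Scholarly Output",
    "year": 2014,
    "value": scholarly_output(local_records, 2014),
}
```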

News since the euroCRIS conference
New and enhanced recipes
CERIFied recipes
The CASRAI-Snowball Metrics project
Internationalisation of the recipes

Recipe book version 2
Recipes in the first recipe book; recipes added in the second recipe book.
Enormous flexibility in understanding performance is possible from this “basket” of standard metrics, especially with the “slicing and dicing” that can be applied, as sketched below.
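As an illustration of the “slicing and dicing”, the sketch below aggregates one hypothetical metric along two dimensions; the data and dimension names are invented for the example, not drawn from the recipes.

```python
# Invented data: one standard metric sliced along two dimensions.
import pandas as pd

outputs = pd.DataFrame([
    {"year": 2013, "discipline": "Physics",   "outputs": 120},
    {"year": 2013, "discipline": "Chemistry", "outputs": 95},
    {"year": 2014, "discipline": "Physics",   "outputs": 131},
    {"year": 2014, "discipline": "Chemistry", "outputs": 102},
])

by_year = outputs.groupby("year")["outputs"].sum()              # slice by year
by_discipline = outputs.groupby("discipline")["outputs"].sum()  # slice by discipline
```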

Enhanced recipes

Enhanced in version 2: detail added on the output types included and excluded.

New: economic development recipes

Economic development: a “basket of metrics” gives an accurate picture of performance
Contract Research
Spin-Off-Related Finances
IP Volume / Licenses
Cumulative Active Patents

New: altmetrics recipe

Altmetric: Scholarly Activity flavour
Institution – total counts
Normalised by FTE
Normalised by outputs
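The three views amount to simple arithmetic; a sketch with invented figures:

```python
# Invented figures: the three views of the Scholarly Activity flavour.
total_posts = 5400        # posts in scholarly tools mentioning the institution's outputs
researcher_fte = 1200.0   # research staff, full-time equivalent
output_count = 3000       # outputs in the same period

per_fte = total_posts / researcher_fte     # 4.5 posts per FTE
per_output = total_posts / output_count    # 1.8 posts per output
```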

Snowball Altmetrics: 4 flavours
Scholarly Activity: posts in scholarly tools
Social Activity: social media posts
Scholarly Commentary: comments in scholarly tools
Mass Media: references from newspapers etc.
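One way to picture the four flavours is as a mapping from altmetric event sources to flavour labels. The example sources below are assumptions for illustration, not the official source lists behind each flavour.

```python
# Illustrative mapping only: the example sources are assumptions, not the
# official source lists behind each flavour.
FLAVOURS = {
    "Scholarly Activity":   {"reference_manager_save", "repository_bookmark"},
    "Social Activity":      {"tweet", "facebook_post"},
    "Scholarly Commentary": {"blog_comment", "post_publication_review"},
    "Mass Media":           {"newspaper_mention", "broadcast_mention"},
}

def flavour_of(event_source):
    """Return the flavour an altmetric event counts towards, if any."""
    for flavour, sources in FLAVOURS.items():
        if event_source in sources:
            return flavour
    return None

assert flavour_of("tweet") == "Social Activity"
```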

Why the interest in altmetrics?
Measures of engagement with the wider scholarly community – Scholarly Activity and Scholarly Commentary.
Measures of engagement outside the academic sphere – Social Activity and Mass Media.
Part of the Snowball Metrics landscape that aims to provide the most complete picture of research performance possible.
“I firmly believe that altmetrics offer a unique chance to bring Stem and non-Stem together again. How better to build on the current process of the evaluation of research (and the recent introduction in the UK of impact assessment) than by gathering, publishing and analysing data on scholarly influence among the audience beyond the scholarly sphere?”
“I would argue that altmetrics alone in their current form cannot be used to judge the quality of research or its output… Nevertheless, pending improvements to the underlying data sources, it is likely that altmetrics will play a crucial role in informing the research assessment and impact agenda…”
Quotes from “Expanding altmetrics to include policy documents will boost its reputation”, Juergen Wastl, Head of Research Information, University of Cambridge, available at network/blog/2014/jul/23/expanding-altmetrics-include-policy-documents-boost-reputation?commentpage=1

CERIFied recipes

Description of the CERIF XML specification
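The CERIF XML specification itself is not reproduced in the slides; as a rough illustration of the idea, the sketch below builds a metric record as XML. The element names are placeholders, not the actual CERIF tags.

```python
# A rough illustration only: element names are placeholders, not the actual
# CERIF tag names from the specification.
import xml.etree.ElementTree as ET

measurement = ET.Element("Measurement")
ET.SubElement(measurement, "Indicator").text = "Scholarly Output"  # which recipe
ET.SubElement(measurement, "OrgUnit").text = "University A"        # hypothetical institution
ET.SubElement(measurement, "Year").text = "2014"
ET.SubElement(measurement, "Value").text = "3000"

print(ET.tostring(measurement, encoding="unicode"))
```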

Artefacts of the specification

CASRAI profiles

CASRAI project
“The project to prepare the Snowball Metrics Recipe Book in the CASRAI format (exchange files based on a common vocabulary) will transform the current recipes into CASRAI terms, objects and fields, and will conduct a wider review circle on these terms and objects to gather comment and suggested improvements and additions to both the CASRAI standards and the Snowball recipes. The final outputs will be one or more exchange files and a proposed integration/harmonization with current dictionary terms plus any new terms derived from Snowball Metrics.”
The project will also deliver “a streamlined process for the expression of new Snowball Metrics recipes / modified Snowball Metrics recipes as CASRAI standards”.
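As a sketch of what an exchange file built on a common vocabulary might look like, the snippet below writes one metric row to CSV; the column headings follow the spirit of the project but are not the actual CASRAI terms.

```python
# Hypothetical sketch of a metric exchange file; the column headings follow
# the spirit of a common vocabulary but are not the actual CASRAI terms.
import csv

rows = [
    {"Institution": "University A", "Metric": "Scholarly Output",
     "Year": "2014", "Value": "3000"},
]

with open("snowball_exchange.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["Institution", "Metric", "Year", "Value"])
    writer.writeheader()
    writer.writerows(rows)
```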

The first profile is currently being developed for review.

Internationalisation of the recipes

Researcher definition version 1
Version 1 was very UK-focused, with no generic global version; feedback indicated that it is difficult for the global community to relate it to their national situations.

Researcher definition: iteration by the US working group
Any faculty or staff member with PI privileges who reports >0% research on annual federal effort reporting.
This definition includes:
– Researchers who engage in “traditional” lab work and publish papers
– Researchers doing non-lab-based research: clinicians doing even a small amount of research are counted, but not the many clinicians who do no research
– People who do not have certified-PI-enabled status but who are eligible for it, and have time allocated to research of any kind
– Librarians and professional research staff, e.g. research associates who are performing research solely with internal or philanthropic funds
– Postdoctoral fellows
– Visiting faculty / researchers
This definition excludes trainees (undergraduates, graduate students and postdoctoral fellows) without PI privileges. A sketch of this logic as code follows.
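The inclusion and exclusion rules above can be read as a simple predicate; a sketch, with hypothetical field names standing in for whatever the local HR and effort-reporting systems actually record.

```python
# A sketch only: field names are hypothetical stand-ins for whatever the
# local HR and effort-reporting systems actually record.
def is_researcher(person):
    """US iteration: PI privileges (held or eligible) plus >0% research effort,
    excluding trainees who do not hold PI privileges."""
    has_pi_status = person["pi_privileges"] or person["pi_eligible"]
    does_research = person["research_effort_pct"] > 0
    trainee_without_pi = person["is_trainee"] and not person["pi_privileges"]
    return has_pi_status and does_research and not trainee_without_pi

clinician = {"pi_privileges": True, "pi_eligible": True,
             "research_effort_pct": 5, "is_trainee": False}
assert is_researcher(clinician)  # even a small amount of research counts
```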

The UK and US groups agreed version 2, published in recipe book version 2.

National data structures and international benchmarking?
The UK uses HESA cost centres; the US uses NSF HERD categories.
Desirable characteristics of a discipline structure for benchmarking:
– Researchers can be assigned to a discipline
– Up-to-date (used for national reporting)
– Commonly understood
– No strategic angle
– Relevant outside the organization
Whether each national structure meets all of these criteria remains an open question.

Mapping to a shared class is work in progress.
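A sketch of what such a mapping could look like once agreed: pairs of (national scheme, local label) resolved to a shared class. All scheme and class labels here are invented to show the shape of the mapping, not the outcome of the work in progress.

```python
# Illustrative only: all scheme and class labels here are invented to show
# the shape of a (national scheme, local label) -> shared class mapping.
SHARED_CLASS = {
    ("HESA cost centre", "Physics"): "Physical Sciences",
    ("HESA cost centre", "Chemistry"): "Physical Sciences",
    ("NSF HERD category", "Physical sciences"): "Physical Sciences",
}

def shared_discipline(scheme, label):
    """Resolve a national discipline label to the shared class, if mapped."""
    return SHARED_CLASS.get((scheme, label))

assert shared_discipline("HESA cost centre", "Physics") == "Physical Sciences"
```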

THANK YOU FOR YOUR TIME AND ATTENTION!