IREG Forum on University Rankings 16-17 May 2013 Dr Lisa Colledge Snowball Metrics Program Director

Snowball Metrics
IREG Forum on University Rankings, 16-17 May 2013
Dr Lisa Colledge, Snowball Metrics Program Director

Snowball Metrics are…
- Endorsed by a group of distinguished UK universities to support their strategic decision making
- Tried and tested methodologies that are available free of charge to the higher education sector
- Absolutely clear, unambiguous definitions that enable apples-to-apples comparisons, so universities can benchmark themselves against their peers to judge the excellence of their performance

Snowball Metrics are unique because:
- Universities drive this bottom-up
- Academia–industry collaboration

Trends in Research Management
- Growing recognition of the value of data/metrics to inform and monitor research strategies, to complement but not replace existing methods: "Unless you have [data] you cannot make informed decisions; you would be acting based on opinions and hearsay."
- Frustration over the lack of a manageable set of standard metrics for sensible measurement: "[There is little] thought leadership and knowledge development around best practice."
- Frequent similar data requests from external bodies looking at performance in a way that is not necessarily of most value to universities themselves: "The principal drivers for our systems are often external… but they shouldn't be… a research strategy should… be developed… to respond to our strengths and the external environment; our systems should be defined to run our business."

University-driven (bottom-up) benchmarking is very important
- This report recommended that universities and funders should work more collaboratively, and develop stronger relationships with suppliers
- Universities need to benchmark themselves to know their position relative to their peers, so they can strategically align resources to their strengths and weaknesses
- Universities should work together more to make their collective voice heard by external agencies: "The lack of a long-term vision makes it hard to… co-operate within a university, let alone across the sector."
- "Suppliers do not know what research offices do on a daily basis. How educated are we at asking suppliers the right questions?"

Snowball Metrics addresses university-driven benchmarking
- "Someone needs to take ownership of the process: it is impossible to please all of the people all of the time, so somebody needs to be strong enough to stand behind decisions and follow through."
- "It would be great if the top five [universities] could collaborate"
- Snowball Metrics Project Partners

The project partners…
- Agree a pragmatic approach from the point of view of the research office
- Endorse metrics to generate a dashboard that supports university strategy
- Draw on and combine university, proprietary, and third-party / public data
- Ensure that the metrics can be calculated, and in the same way, by universities with different systems and data structures

Main roles and responsibilities
Everyone is responsible for covering their own costs.

University project partners:
- Agree the metrics to be endorsed as Snowball Metrics
- Determine methodologies to generate the metrics in a commonly understood manner to enable benchmarking, regardless of systems

Elsevier:
- Ensure that the methodologies are feasible when applied to real data, prior to publication of the recipes to share with the sector
- Distribute the recipes using our communications networks
- Provide day-to-day project management of the global program

Outside the remit of the Snowball Metrics program:
- Nature and quality of data sources used to generate Snowball Metrics
- Provision of tools to enable the global sector to generate and use Snowball Metrics

Snowball Metrics are feasible

Metrics can be size-normalised

Metrics can be sliced and diced

Recipe Book shares the methods with the sector free of charge
First set of Snowball Metrics (university and discipline levels only):
- Input Metrics: Applications Volume, Awards Volume
- Process Metrics: Income Volume, Market Share
- Output Metrics: Scholarly Output, Citation Count, h-index, Field-Weighted Citation Impact, Publications in Top Percentiles, Collaboration
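Two of the output metrics above have widely used definitions that can be sketched in a few lines of Python. This is an illustrative sketch of the standard definitions only, not the official Snowball Metrics recipes; the function names and the `expected` input (world-average citations for papers of the same field, age, and type) are assumptions for the example.

```python
def h_index(citations):
    """h-index: the largest h such that h papers each have at least h citations."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank  # the paper at this rank still has enough citations
        else:
            break
    return h

def field_weighted_citation_impact(citations, expected):
    """FWCI (standard definition): mean ratio of each paper's citations to the
    citations expected for papers of the same field, age, and document type.
    A value of 1.0 means cited exactly at the world average."""
    return sum(c / e for c, e in zip(citations, expected)) / len(citations)
```

For example, a researcher with papers cited [10, 8, 5, 4, 3] times has an h-index of 4, and papers cited [4, 2] times against expected values [2, 2] have an FWCI of 1.5, i.e. 50% above the world average.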

Elsevier and Snowball Metrics
Declaration from the project partners (extracts from Statement of Intent, October 2012):
- Agreed and tested methodologies… are and will continue to be shared free of charge
- None of the project partners will at any stage apply any charges for the methodologies
- Any organisation can use these methodologies for their own purposes, public service or commercial

Universities are also requesting the provision of calculated metrics:
- Some organisations do not want to use the recipe book themselves, and are approaching Elsevier for help to implement and use Snowball Metrics
- Elsevier charges for our support in this, and can offer the metrics in a Custom Report, and in Pure (Current Research Information System)
- We plan to continue to build commercial tools to help any universities who want to adopt Snowball Metrics but prefer not to generate them in house

Global benchmarking
Vision: Snowball Metrics drive quality and efficiency across higher education's research and enterprise activities, regardless of system and supplier.

Numerators (metric areas):
- Research Inputs: research applications, research awards
- Research Processes: research income, contract turnaround times
- Research Outputs and Outcomes: publications & citations, collaboration (co-authorship), impact / esteem
- Research Post-Graduate Education: post-graduate research, post-graduate experience, completion rates
- Enterprise Activities: industrial income and engagement, industry research income, patenting, licensing income, spin-out generation / income

Denominators (normalise for size): people, organisations, themes / schemes
Slice and dice: researchers, role, institution, institutional unit, external groupings, funder type, award type, subject area / keywords
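The numerator / denominator model above amounts to a simple calculation: filter the records by a slicing dimension, sum the numerator, and divide by a size denominator. A minimal sketch, assuming hypothetical award records and a headcount table (the field names and figures are invented for illustration, not part of any Snowball Metrics recipe):

```python
# Hypothetical award records with a value (numerator) and slicing attributes.
awards = [
    {"value": 120_000, "funder_type": "industry", "unit": "Chemistry"},
    {"value": 80_000,  "funder_type": "charity",  "unit": "Chemistry"},
    {"value": 200_000, "funder_type": "industry", "unit": "Physics"},
]
# Denominator: people (researcher headcount per institutional unit).
headcount = {"Chemistry": 40, "Physics": 50}

def awards_volume(records, unit, funder_type=None):
    """Awards volume for one unit, size-normalised per researcher,
    optionally sliced by funder type."""
    sliced = [r for r in records
              if r["unit"] == unit
              and (funder_type is None or r["funder_type"] == funder_type)]
    return sum(r["value"] for r in sliced) / headcount[unit]

print(awards_volume(awards, "Chemistry"))              # all funders
print(awards_volume(awards, "Chemistry", "industry"))  # sliced by funder type
```

The same pattern applies to any numerator (publications, income, patents) over any denominator (people, organisations, themes), which is what makes the recipes system-independent.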

Achieving the vision
- Continue to agree, test, and share new metrics to illuminate research and enterprise activities
- Facilitate adoption by the sector by translating the metrics into standard data formats that can be easily understood by systems
- Ensure that Snowball Metrics support global benchmarking as far as possible

Vision: Snowball Metrics drive quality and efficiency across higher education's research and enterprise activities, regardless of system and supplier.

Global vs national standards for benchmarking
- Snowball Metrics start life with a national perspective (currently UK)
- The aim is to promote all aspects of Snowball Metrics as far as possible to a global standard

[Venn diagram of UK metrics, Country 1 metrics, and Country 2 metrics; illustrative only, testing underway]
- Common core, where benchmarking against global peers can be conducted; the aim is to make this as big as possible
- Shared features, where benchmarking between Countries 1 and 2, but not the UK, can be conducted, e.g. regional benchmarking
- A national peculiarity can support benchmarking within Country 1, but not globally, i.e. national benchmarking

Possible end point of a metric (illustrative only, testing underway)
Increasing circle of peers for benchmarking:
- One version enabling global benchmarking, e.g. discipline represented by a universal journal classification
- Multiple versions enabling regional benchmarking, e.g. discipline represented by the subject mapping of a regional body
- Multiple versions enabling national benchmarking, e.g. discipline represented by the UK's HESA Cost Centres

THANK YOU FOR YOUR ATTENTION!
Contact: Dr Lisa Colledge