EU Expert Group Altmetrics

Similar presentations
ENTITIES FOR A UN SYSTEM EVALUATION FRAMEWORK 17th MEETING OF SENIOR FELLOWSHIP OFFICERS OF THE UNITED NATIONS SYSTEM AND HOST COUNTRY AGENCIES BY DAVIDE.

URBACT II Building Healthy Communities 1 st Steering Group Meeting Brussels, 9-10 June 2008 An overview.
Delivering effective enterprise education: the role of learning design and technology Professor Pauric McGowan University of Ulster Dr Richard Blundel.
Centre for Irish and European Security "Societal Security R&D” Perspectives, Conclusions and Recommendations from Workshop of July 1st 2010 Purpose of.
Evaluating public RTD interventions: A performance audit perspective from the EU European Court of Auditors American Evaluation Society, Portland, 3 November.
“I can announce today that I have asked HEFCE to undertake a review of the role of metrics in research assessment and management. The review will consider.
Options for Evaluating Research: Inputs, Outputs, Outcomes Michele Garfinkel Manager, Science Policy Programme ICSTI ITOC Workshop 19 January 2015, Berlin.
T H O M S O N S C I E N T I F I C Editorial Development James Testa, Director.
HEFCEmetrics. “I can announce today that I have asked HEFCE to undertake a review of the role of metrics in research assessment and management. The.
The evaluation of research units at HCERES
Regulatory Transparency and Interaction with the Government Dr. Konstantin Petrov Head of Section, Policy and Regulation.
ITGS Standard Level Mr Gavin Johnson. ITGS The Diploma Programme information technology in a global society (ITGS) course is the study and evaluation.
HORIZON 2020 The EU Framework Programme for Research and Innovation Societal Challenge 6 Topics under DG CONNECT H3 responsibility European Commission,
Deep Impact [Factor] Altmetrics from a PhD perspective Jon Tennant PhD student, Imperial College London Seeking employment.
2 Journals in the arts and humanities: their role and evaluation Professor Geoffrey Crossick Warden Goldsmiths, University of London.
Session Chair: Peter Doorn Director, Data Archiving and Networked Services (DANS), The Netherlands.
2012 National Partnerships Schools’ Forum Margery Evans CEO, AITSL ~ Leadership for Learning ~
ESPON Seminar 15 November 2006 in Espoo, Finland Review of the ESPON 2006 and lessons learned for the ESPON 2013 Programme Thiemo W. Eser, ESPON Managing.
Making Good Use of Research Evaluations Anneli Pauli, Vice President (Research)
Paul Griffiths and Roland Simon Wrap-up presentation What has the EMCDDA learned ?
María Amor Barros del Río Gender as content in research in Horizon 2020 GENDER AS CONTENT IN RESEARCH IN HORIZON 2020 CAPACITY BUILDING WORKSHOP FOR RESEARCHERS.
Strategic Approaches to Improving Ethical Behavior
ESF Member Organisation Forum Science in Society Relationships Improving interaction with society – urge for strategy & action ESOF2012 session.
Sha.p.e.s. Conference – Sevilla 1st October 2013 INNODEC/Interreg IIIC IN-EUR/Interreg IVC Measuring INnovation among EURopean Subregions ADVANCED LOCAL.
LIVING LAB OF GLOBAL CHANGE RESEARCH
Principles of Good Governance
Research Indicators for Open Science
Metrics What they are and how to use them
Eric Peirano BRIDGE Support Team, Technofi
Demonstrating Scholarly Impact: Metrics, Tools and Trends
MGMT 452 Corporate Social Responsibility
Disciplinary structure and topical complexity in SSH—the IMPACT EV mission Sándor Soós, András Schubert, Zsófia Vida.
Professor Harry Scarbrough
Observations and Lessons from OECD/NEA Activities in Stakeholder Involvement Summary of Experience from the NEA Committee on Radiation Protection and.
Accounting (Foundation)
MGT-491 QUANTITATIVE ANALYSIS AND RESEARCH FOR MANAGEMENT
Evaluating Better Care Together
Environmental Health Management (EN481)
EUROPEAN COMMISSION DG Employment and Social Affairs Jerome Vignon
DRAFT Standards for the Accreditation of e-Learning Programs
Open Science Dr. Dr.Phil. Rene VON SCHOMBERG
FDA Guidance for Industry and FDA Staff Summary of Public Notification of Emerging Postmarket Medical Device Signals (“Emerging Signals”) Effective: December.
Business environment in the EU Prepared by Dr. Endre Domonkos (PhD)
European Network on teacher Education Policies
Altmetrics 101 LITA Altmetrics & Digital Analytics Webinar
What Does Responsible Metrics Mean?
Metrics: a game of hide and seek
EU Reference Centres for Animal Welfare
OpenUP – Revisiting Peer Review, Dissemination & Impact Measurement
THE OFFICE FOR SCHOLARLY COMMUNICATION/ Responsible Metrics at Kent
EOSC Governance Development Forum
Information Technology (IT)
Communication and Consultation with Interested Parties by the RB
Overview of working draft v. 29 January 2018
Evaluation for open science - towards a new evaluation protocol
Towards Excellence in Research: Achievements and Visions of
Extractive Industries and Water Governance in the Nile Basin, now and in 2030: Lessons from Upstream around Lake Victoria in Tanzania   Donald Kasongi,
SwafS Ethics and Research Integrity
An Introduction to STAGES
Workshop 1: PROJECT EVALUATION
THE INSPECTION SYSTEM AND THE SCHOOL EXTERNAL EVALUATION
ORIENTATION TO THE FINNISH CULTURE AND EDUCATIONAL SYSTEM
Raising the bar Meeting Europe’s future challenges
SwafS Ethics and Research Integrity
Applied Software Project Management
Bibliometric Services at the Masaryk University
6th Framework Programme on Research
Research on Climate Change on Water, including Natural Hazards Contribution to SSG discussions and science-policy interfacing Philippe QUEVAUVILLER European.
Achieving coexistence with large carnivores in the EU
Socio-Economic Impact of ESS
Presentation transcript:

EU Expert Group Altmetrics. Next-generation altmetrics: responsible metrics and evaluation for open science

EU expert group members: James Wilsdon, University of Sheffield (chair); Judit Bar-Ilan, Bar-Ilan University; Robert Frodeman, University of Texas; Elizabeth Lex, Graz University of Technology; Isabella Peters, Leibniz Information Centre for Economics; Paul Wouters, Leiden University. Team Leader, Open Science Policy Coordination and Development: Rene von Schomberg.

Aims /1
- Assess the role of (alt)metrics in research evaluation
- Consider how altmetrics can be developed for open science
- Engage stakeholders
- Consider the implications of metrics for: diversity and equality; interdisciplinarity; research cultures; gaming

Aims /2
- Examine the implications of: emerging social networks; research information systems; citation profiles
- Explore altmetrics for impacts and research actions in Horizon 2020 and in the next framework programme
- Consider the required data infrastructures

The Leiden Manifesto
- Quantitative evaluation should support expert assessment.
- Measure performance in accordance with the research mission.
- Protect excellence in locally relevant research.
- Keep data collection and analytical processes open, transparent and simple.
- Allow for data verification.
- Account for variation by field in publication and citation practices.
- Data should be interpreted taking into account the difficulty of credit assignment in the case of multi-authored publications.
- Base assessment of individual researchers on qualitative judgment.
- False precision should be avoided (e.g. the JIF).
- Systemic effects of the assessment and the indicators should be taken into account, and indicators should be updated regularly.

Responsible metrics (The Metric Tide)
- Robustness: basing metrics on the best possible data in terms of accuracy and scope;
- Humility: recognizing that quantitative evaluation should support, but not supplant, qualitative, expert assessment;
- Transparency: keeping data collection and analytical processes open and transparent, so that those being evaluated can test and verify the results;
- Diversity: accounting for variation by field, using a variety of indicators to reflect and support a plurality of research & researcher career paths;
- Reflexivity: recognizing the potential & systemic effects of indicators and updating them in response.

Measuring is changing
- What counts as excellence is shaped by how we measure and define “excellence”
- What counts as impact is shaped by how we measure and define “impact”
- Qualities and interactions are the foundation for “excellence” and “impact”, so we should understand those more fundamental processes first
- We need different indicators at different levels in the scientific system to inform wise management that strikes the right balance between trust and control
- Context is crucial for effective data standardization

Report outline
- Introduction
- Metrics: technical state of the art
- Use of metrics in policy and practice
- Data infrastructures and open standards
- Cultures of counting, ethics and research
- Next-generation metrics: the way forward

Traditional metrics
- Based on citation and publication counts, which are not sufficient on their own
- Citations take time to accumulate
- The journal impact factor (IF) is often used as a proxy for citation counts
- h-index (see the sketch below)
- Disciplinary differences in publication and citation culture
- Ignore societal impact
- DORA, Leiden Manifesto, The Metric Tide
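The h-index mentioned above is defined as the largest number h such that a researcher has h publications with at least h citations each. A minimal Python sketch of that definition, using invented citation counts rather than any data from the report:

```python
def h_index(citations):
    """Largest h such that h publications have at least h citations each."""
    ranked = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # this paper still has enough citations for its rank
        else:
            break
    return h

# Hypothetical example: six papers with these citation counts give an h-index of 3.
print(h_index([12, 7, 3, 2, 1, 0]))  # -> 3
```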

Altmetrics
- Intend to capture and measure additional aspects of scholarly information
- Altmetrics Manifesto: new forms of communication reflect and transmit scholarly impact; they expand our view of what impact looks like
- Altmetric platforms: increased visibility of researchers/publications; exposing research to the public; involving the public; discussion/commenting; readers vs. authors
- Altmetric events can be measured/counted (see the sketch below)
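The last point, that altmetric events can be measured and counted, is essentially a tallying operation. A small Python sketch, using invented event records and field names rather than any particular altmetric platform's API:

```python
from collections import Counter

# Hypothetical altmetric event records for two publications (invented data).
events = [
    {"doi": "10.1000/xyz123", "source": "twitter"},
    {"doi": "10.1000/xyz123", "source": "mendeley"},
    {"doi": "10.1000/xyz123", "source": "blog"},
    {"doi": "10.1000/abc456", "source": "twitter"},
]

# Tally events per publication and per (publication, source) pair;
# this counting step underlies most altmetric indicators.
events_per_doi = Counter(e["doi"] for e in events)
events_per_source = Counter((e["doi"], e["source"]) for e in events)

print(events_per_doi)     # Counter({'10.1000/xyz123': 3, '10.1000/abc456': 1})
print(events_per_source)  # e.g. ('10.1000/xyz123', 'twitter') -> 1
```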

Altmetrics - challenges
- Coverage
- Transparency
- Validity
- Dynamics
- Disciplinary differences
- Gaming
- Acceptance (research community, decision makers)