Research Indicators for Open Science


Research Indicators for Open Science
Nordic Workshop on Data Citation Policies and Practices: How to Make it Happen?
23 November 2016, Helsinki
Anu Nuutinen, Senior Science Adviser, Academy of Finland
anu.nuutinen(at)aka.fi
© ACADEMY OF FINLAND 2016

Academy of Finland in a nutshell
Key public funding agency for scientific research and a major player in science policy in Finland
- To support scientific research and research careers
- To develop research environments
Four research councils: Biosciences and Environment; Culture and Society; Natural Sciences and Engineering; Health
Finnish Research Infrastructure Committee
Strategic Research Council
140 employees
Funding in 2015: €405m

Open science at the Academy of Finland
The Academy is committed to the Finnish Open Science and Research Roadmap 2014–2017, which aims to improve the overall quality and impact of research and to promote good scientific practice.
- Results must be made public
- Openly available publications required; funding for publishing costs provided
- Opening of data and methods required (depending on research ethics & law)
- Data management plans required
- Recommended archives and storage services: FSD, FIN-CLARIN, CERN Zenodo, EUDAT, AVAA, Etsin, IDA
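
As an illustration of what opening data via one of the recommended services can look like in practice, here is a minimal sketch against Zenodo's documented REST deposit API. The function name, file name and metadata values are hypothetical, and the Academy does not prescribe this particular workflow; a real deposit also requires a personal access token and a deliberate final publish step.

```python
from pathlib import Path

import requests  # third-party: pip install requests

ZENODO_API = "https://zenodo.org/api/deposit/depositions"

def deposit_dataset(token, filepath, title, creators):
    """Create a Zenodo deposition, upload one data file and attach minimal
    dataset metadata. Publishing is left as a separate, deliberate step."""
    params = {"access_token": token}
    # 1. Create an empty deposition
    deposition = requests.post(ZENODO_API, params=params, json={}).json()
    # 2. Upload the file into the deposition's file bucket
    bucket_url = deposition["links"]["bucket"]
    with open(filepath, "rb") as fh:
        requests.put(f"{bucket_url}/{Path(filepath).name}", data=fh, params=params)
    # 3. Attach minimal metadata describing the dataset
    metadata = {"metadata": {"title": title, "upload_type": "dataset",
                             "description": title, "creators": creators}}
    requests.put(f"{ZENODO_API}/{deposition['id']}", params=params, json=metadata)
    return deposition["id"]

# Hypothetical usage (requires a personal Zenodo access token):
# deposit_dataset(TOKEN, "measurements.csv", "Example dataset",
#                 [{"name": "Researcher, Example"}])
```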

Openness is a key principle in science
- Promotes the reproducibility of scientific results
- In practice, openness improves the overall quality and impact of research
- Open science forms part of the principles of good scientific practice
- The goal is to make research publications, data and material, metadata and methods widely available for further use
- The Academy is committed to considering the promotion of open science as one of the criteria in funding decisions
http://www.aka.fi/en/funding/responsible-research/open-science/

The proposed actions advancing open science must be described in the grant proposal
- Research plan: publication plan; description, justification, collection and use of material; ethical issues
- Data management plan: management, storage, access, opening and rights of the research data (or a brief account if there is no data)
Research organisations and research infrastructures are expected to support open access publishing and the delivery of open research data and methods

Evaluation of grant proposals and funding decisions
Evaluation of quality: ethical aspects and open science (no rating)
- The international review panels are asked to give their assessment of the planned open science activities:
  - Are there any ethical issues involved and, if so, how are they taken into account?
  - What is the intended level of open access to research results?
  - Is the data management plan worked out in a sufficient way?
Funding decisions
- The Academy takes into account the scientific evaluation as well as science policy objectives
- The science policy objectives also include the promotion of open science

Scientific reporting after the funding period
- The follow-up of the proposed open science activities is planned to be implemented in the Academy's new scientific reporting as well
- At the moment bibliometric analyses are based on scientific publications; the available data are not comprehensive enough to analyse data citations
- Publication data can be mapped to Web of Science-based data, which allows a small-scale bibliometric analysis of the scientific impact of Academy-funded research
- BUT: the quality of the reported publication data is of pivotal importance: correct and complete bibliographic information!
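
To make the mapping step concrete, the following is a minimal sketch of the record linkage involved: reported publications are matched to citation-database records first by DOI and then by a normalised title and publication year. The record structure (`doi`, `title`, `year` fields) is a hypothetical input format, not the Academy's actual reporting schema.

```python
import re

def normalise_doi(doi):
    """Lower-case a DOI and strip common URL prefixes so records can be joined on it."""
    if not doi:
        return None
    doi = doi.strip().lower()
    return re.sub(r"^https?://(dx\.)?doi\.org/", "", doi)

def normalise_title(title):
    """Reduce a title to lower-case alphanumerics for fallback matching."""
    return re.sub(r"[^a-z0-9]+", " ", title.lower()).strip()

def match_publications(reported_pubs, citation_db_records):
    """Match reported publications to citation-database records,
    first by DOI, then by (normalised title, publication year)."""
    by_doi = {normalise_doi(r["doi"]): r for r in citation_db_records if r.get("doi")}
    by_title = {(normalise_title(r["title"]), r["year"]): r for r in citation_db_records}
    matches = []
    for pub in reported_pubs:
        rec = by_doi.get(normalise_doi(pub.get("doi")))
        if rec is None:
            rec = by_title.get((normalise_title(pub["title"]), pub["year"]))
        matches.append((pub, rec))  # rec is None when no match was found
    return matches
```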

On the path towards analysing data citations: what are we assessing?
- Amount and types of activity (quantity, publication volume)
- Scientific impact (visibility, interest within the scientific community)
- Collaboration (e.g. international collaboration, national collaboration, no collaboration)
- Quality: the role of peer review
Impact indicators alone do not provide a reliable overall picture of the level of research, but they add a useful perspective to the analysis of scientific impact.
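
One way to operationalise the collaboration dimension is to classify each output from its author affiliations. A minimal sketch, assuming each output carries a list of (organisation, country) affiliation pairs (a hypothetical input format):

```python
def collaboration_type(affiliations):
    """Classify an output as international, national or no collaboration.
    `affiliations` is a list of (organisation, country) pairs, one per author affiliation."""
    countries = {country for _, country in affiliations}
    organisations = {org for org, _ in affiliations}
    if len(countries) > 1:
        return "international collaboration"
    if len(organisations) > 1:
        return "national collaboration"
    return "no collaboration"

print(collaboration_type([("Aalto University", "FI"), ("KTH", "SE")]))  # international collaboration
print(collaboration_type([("Aalto University", "FI"), ("VTT", "FI")]))  # national collaboration
print(collaboration_type([("Aalto University", "FI")]))                 # no collaboration
```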

Citation analyses: what needs to be taken into account?
- Disciplinary differences: various types of research data; disciplinary differences in citation practices
- Citation databases (citation indexing services): overall coverage of data outputs; automated tracking of data outputs is a necessary requirement; unique identifiers (DOIs)
- Credit must be assigned correctly to the researchers who produced the data output; data cleaning is especially important in organisation-level analyses
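
Automated tracking of data outputs typically starts from their persistent identifiers. As an illustration (not a description of the Academy's own tooling), the sketch below resolves a dataset DOI through the public DataCite REST API and extracts the fields needed for crediting creators and counting citations; the completeness of the returned citation count depends on the underlying index.

```python
import json
import urllib.request

DATACITE_API = "https://api.datacite.org/dois/"

def lookup_data_output(doi):
    """Fetch metadata for a dataset DOI from the DataCite REST API and return
    the fields needed for crediting creators and counting citations."""
    with urllib.request.urlopen(DATACITE_API + doi) as response:
        attributes = json.load(response)["data"]["attributes"]
    return {
        "doi": doi,
        "title": attributes["titles"][0]["title"],
        "creators": [c.get("name") for c in attributes.get("creators", [])],
        "publisher": attributes.get("publisher"),
        "year": attributes.get("publicationYear"),
        # citation counts are only as complete as the underlying citation index
        "citation_count": attributes.get("citationCount"),
    }

# Hypothetical usage with a placeholder dataset DOI:
# print(lookup_data_output("10.5281/zenodo.1234567"))
```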

Research indicators: methodological issues
- Normalisation of the number of citations: by research field and publication year; comparison to the international level in the research field; self-citations are often removed when analysing scientific impact
- Counting method: whole counting of publications, or fractionalisation between countries, organisations and research fields
- The fractional counting method leads to a more proper field normalisation of impact indicators and fairer comparisons between organisations active in different fields
- Importance of methodological research on research indicators
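
To make the normalisation and counting choices concrete, the sketch below computes a simple field- and year-normalised citation score and aggregates it per organisation with fractional weights instead of whole counts. The input format and baseline values are hypothetical; real analyses rely on database-specific field classifications and reference values.

```python
def normalised_citation_score(pub, world_baselines):
    """Citations of a publication divided by the world average number of
    citations for publications in the same field and publication year."""
    baseline = world_baselines[(pub["field"], pub["year"])]
    return pub["citations"] / baseline

def fractional_mncs(publications, world_baselines, organisation):
    """Mean normalised citation score for one organisation, with each
    publication weighted by the organisation's fraction of the publication
    (fractional counting) rather than counted as a whole."""
    weighted_sum, total_weight = 0.0, 0.0
    for pub in publications:
        fraction = pub["org_fractions"].get(organisation, 0.0)
        if fraction == 0.0:
            continue
        weighted_sum += fraction * normalised_citation_score(pub, world_baselines)
        total_weight += fraction
    return weighted_sum / total_weight if total_weight else None

# Toy example: one field/year baseline and two publications, one shared 50/50.
baselines = {("ecology", 2015): 8.0}
pubs = [
    {"field": "ecology", "year": 2015, "citations": 16, "org_fractions": {"Org A": 1.0}},
    {"field": "ecology", "year": 2015, "citations": 4, "org_fractions": {"Org A": 0.5, "Org B": 0.5}},
]
print(fractional_mncs(pubs, baselines, "Org A"))  # (1.0*2.0 + 0.5*0.5) / 1.5 = 1.5
```

Weighting both the numerator and the denominator by the organisation's fraction is what makes the comparison fairer across organisations active in different fields: no single whole-counted, highly cited publication can dominate the average.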

Conclusions
- Openness improves the overall quality and impact of research
- Disciplinary differences need to be acknowledged when designing and applying research indicators
- Methodological research on research indicators is needed
- Responsible use of research indicators is very important
- Metrics have their limitations! They need to be acknowledged and taken into account when interpreting the indicators and drawing conclusions.