Open Science
Dr. Dr. phil. René von Schomberg
Team Leader, Open Science Policy Coordination and Development
European Commission, DG Research & Innovation, Unit A.6 – Data, Open Access and Foresight
Open Science: a new approach to the research process
Based on cooperative work and new ways of diffusing and sharing knowledge, using digital technologies and new collaborative tools
A systemic change to the way science is organised and research is carried out
It affects virtually all components of doing science and research, from conceptual work to publishing, from empirical research to data analysis
Shifting the focus from "publishing as fast as possible" to "sharing knowledge as early as possible"
2014: public consultation on 'Science 2.0: Science in Transition'
Open Science – opening up the research process
[Diagram: the research cycle – conceptualisation, data gathering, analysis, publication, review – with open practices attached to each stage: data-intensive science, citizen science, open lab notebooks and workflows, open code, open data, pre-prints, open annotation, open access, science blogs, alternative reputation systems, and collaborative bibliographies.]
An emerging ecosystem of services and standards It's real!
[Diagram: the same research cycle, with example services and standards for each open practice: Sci-starter.com (citizen science), Runmycode.org (open code and open workflows), open data, open access, Openannotation.org (open annotation), Impact Story (alternative reputation systems), scientific blogs, and collaborative bibliographies.]
[Chart: respondents' ranking of needs, from the lowest need (1) to the highest need (11).]
Five lines of potential policy actions
Fostering and creating incentives for Open Science
Removing barriers to Open Science
Mainstreaming and further promoting Open Access policies
Developing research infrastructures for Open Science
Embedding Open Science in society as a socio-economic driver
Open Science: key issues
The European Open Science Cloud
Advancing Open Access and data policies
Alternative systems to evaluate the quality and impact of research
Text and data mining
Towards better, more efficient and more open science
Fostering research integrity
Making science more inclusive: citizen science
Governance of the European Open Science Cloud
[Diagram: three-layer architecture of the European Open Science Cloud.
Governance layer: bottom-up governance, federation, legacy and sustainability, leverage of Member State investment, trust.
Data and service layer: big data analytics, data fusion across disciplines, high-performance computing, data access and re-use, IPR and privacy protection, data manipulation and export, data discovery and catalogue, scale of scientific activity (data-driven science).
Infrastructure layer: high-speed connectivity, super-computing, data storage.
Users range from lead scientific users to the long tail of science, across disciplines: applied engineering, physics, life sciences, earth sciences, economics, social sciences, humanities, citizen science and others.]
Source: DG Research and Innovation (2015)
Growth of Open Access Repository Mandates and Policies
[Chart: number of open access policies adopted per quarter, broken down by policy-maker type: research organisation, sub-unit of research organisation, multiple research organisations, funder, and funder plus research organisation.]
Open Science: From Open Access to Open Scholarly Communication
[Diagram: the research workflow – discovery, analysis, writing, publication, outreach, assessment – with public or private initiatives (e.g. Elsevier, Springer Nature, Digital Science, Google, Wikimedia) offering specific services to researchers at every level, alongside a layer of "commons": new initiatives allowing the scholarly process to be carried out differently.]
Towards ‘better science’ – Good, efficient and Open Science
[Diagram: three clusters of changes towards good, efficient and open science.
GOOD (research governance changes): declaring competing interests, replication and reproducibility, meaningful assessment, effective quality checks, credit where it is due, no fraud or plagiarism.
EFFICIENT (technical changes and standards): connected tools and platforms, no publication size restrictions, null-result publishing, speed of publication, (web) standards and identifiers, semantic discovery, re-usability, versioning.
OPEN (economic and copyright changes): open peer review, open (lab) notes, plain language, open drafting, open access, CC0/CC BY licensing.]
Open Science Policy Platform and European Open Science Agenda
May 2016 Competitiveness Council: "NOTES the establishment of the Open Science Policy Platform by the Commission, which aims at supporting the further development of the European Open Science policy and promoting the uptake by stakeholders of best practices, including issues such as adapting reward and evaluation systems, alternative models for open access publishing and management of research data (including archiving), altmetrics, guiding principles for optimal reuse of research data, development and use of standards, and other aspects of open science such as fostering research integrity and developing citizen science"
Commissioner Moedas will report to the Council twice a year on the Platform's progress (the Platform consists of 25 key stakeholders from European branch organisations).
Optimal re-use of Research Data
Follow-up by stakeholders, the Commission and Member States:
1. As of 2017, open data is the default option under H2020; data management plans will be mandatory
2. Evaluation of Member States' advances on open data will be necessary
3. An Expert Group on FAIR data will advise DG RTD in the course of 2016
Competitiveness Council:
1. Make research data produced by H2020 open by default
2. Encourage Member States to promote data stewardship and implement data management plans
3. Encourage Member States and the Commission to follow the FAIR principles in research programmes and funding mechanisms
European Open Science Cloud
Competitiveness Council: "CALLS on the Commission, in cooperation with Member States and stakeholders, to explore appropriate governance and funding frameworks"
European Commission follow-up to the April 2016 Communication on a European Cloud Initiative: the Commission will need a roadmap for funding the European Open Science Cloud by the end of 2016, which requires consultation of Member States.
Open Science Policy Platform
[Diagram: the Open Science Policy Platform in context.
European Open Science Agenda: OA publishing models, FAIR open data, Science Cloud, alternative metrics, rewards and careers, education and skills, citizen science, research integrity, and more.
ERA and framework conditions for actors: European Charter for Researchers, Code of Conduct for Research Integrity, Charter for Access to Research Infrastructures, and more.
DSM and framework conditions for data: copyright and TDM, data protection, free flow of data, and more.
Wide input from stakeholders: ad-hoc meetings and workshops, an e-platform with the wider community, reports and independent expert advice.
Expert groups providing opinions: on the open science cloud, on altmetrics, on alternative business models for OA publishing, and on FAIR open data.]
European Open Science Policy Agenda (1)
Foster Open Science by creating incentives, e.g.:
Establish an Open Science Policy Platform
Promote best practices
Launch a European Open Science Monitor
Promote a discussion on evaluation criteria for research, in preparation for the next framework programme
All potential actions under consideration:
Establish a policy platform at European level and a self-regulation/clearinghouse mechanism for addressing Open Science issues
Co-develop Open Science policy (promote best practices, research integrity, citizen science, etc.)
European Open Science Agenda (2)
Remove barriers, e.g.:
European copyright and data protection revisions: foresee appropriate exceptions for research activities (TDM)
Develop 'alternative' metrics
Propose a European "code of conduct"
Address low open-data skills amongst researchers and the underuse of professional support (librarians, repository managers, etc.)
Encourage research on, and collaborative development of, 'alternative' metrics – and ultimately their wider use: a growing concern is that researchers should (also) be assessed against more pertinent criteria than the traditional journal Impact Factor. This is particularly relevant to supporting the careers of young scientists.
Propose a European "code of conduct" setting out the general principles and requirements for how Open Science should affect the roles, responsibilities and entitlements of researchers and of their employers.
Next-generation altmetrics: responsible metrics and evaluation for open science
Flash report of the EU Expert Group on Altmetrics, first release September 2016
EU expert group members
James Wilsdon, University of Sheffield (chair); Judit Bar-Ilan, Bar-Ilan University; Robert Frodeman, University of North Texas; Elizabeth Lex, Graz University of Technology; Isabella Peters, Leibniz Information Centre for Economics; Paul Wouters, Leiden University
Aims /1
assess the changing role of (alt)metrics in research evaluation
consider how altmetrics can be developed to support open science
engage stakeholders
consider the implications of metrics for: diversity and equality, interdisciplinarity, research cultures, gaming
Aims /2
examine the implications of: emerging social networks, research information systems, citation profiles
develop a framework for responsible metrics for research qualities and impacts, for the evaluation of Horizon 2020 and for wider use in the next framework programme
consider the required data infrastructures
Across the research community, the description, production and consumption of ‘metrics’ remains contested and open to misunderstandings.
Quantitative evaluation should support expert assessment.
Measure performance in accordance with the research mission; protect excellence in locally relevant research.
Keep data collection and analytical processes open, transparent and simple; allow for data verification.
Account for variation by field in publication and citation practices.
Interpret data taking into account the difficulty of credit assignment in the case of multi-authored publications.
Base assessment of individual researchers on qualitative judgement; avoid false precision.
Take the systemic effects of assessments and indicators into account, and update indicators regularly.
Responsible metrics
Robustness: basing metrics on the best possible data in terms of accuracy and scope;
Humility: recognising that quantitative evaluation should support – but not supplant – qualitative, expert assessment;
Transparency: keeping data collection and analytical processes open and transparent, so that those being evaluated can test and verify the results;
Diversity: accounting for variation by field, using a variety of indicators to reflect and support a plurality of research and researcher career paths;
Reflexivity: recognising the potential and systemic effects of indicators and updating them in response.
Ambitions for Open Science
More comprehensive measurement of traditional scientific publications (e.g. Mendeley)
Recognising and capturing the diversity of scientific output, including new forms (e.g. software and blogs)
Opening up the whole scientific publication system (open access) and more interactive communication
Opening up the very core of knowledge creation and its role in higher education and innovation (participatory science)
Measuring is changing
What counts as excellence is shaped by how we measure and define "excellence"
What counts as impact is shaped by how we measure and define "impact"
Qualities and interactions are the foundation of "excellence" and "impact", so we should understand those more fundamental processes first
We need different indicators at different levels of the scientific system to inform wise management that strikes the right balance between trust and control
Context is crucial for effective data standardization
Call for evidence /1
strong support for the development and study of open metrics and altmetrics
metrics should complement, not replace, human judgement of quality
altmetrics are not yet ready for routine use in assessment
the EU should help develop public-sector-based metrics
diversity is a key criterion for metrics
Call for evidence /2
portfolios of metrics for societal interaction and impact are urgently needed
open standards for data and indicator infrastructure
context should prevail over technical standards
reflexive protection against gaming strategies
strong support for The Metric Tide and Leiden Manifesto principles
portfolios of indicators to support open science
Report outline
Metrics: technical state of the art
Use of metrics in policy and practice
Data infrastructures and open standards
Cultures of counting, ethics and research
Next-generation metrics: the way forward
More information and updates on the progress of the expert panel can be found here: m?pg=altmetrics_eg
To conclude with some problems…
Do good metrics for science equal good metrics for Open Science?
The impact of research is becoming more important, but what is a good impact?
Metrics can never directly measure 'impact' and 'excellence' (whatever the definition); are metrics not more useful for what they were not created for?
Final thesis: responsible metrics resemble responsible research (see Von Schomberg, A Vision of Responsible Research and Innovation)