OpenUP – Revisiting Peer Review, Dissemination & Impact Measurement

1 OpenUP – Revisiting Peer Review, Dissemination & Impact Measurement
OPEN SCIENCE -- KNOWLEDGE FOR ALL, Lisbon, 29 March 2016. Wolfram Horstmann, University of Göttingen, Göttingen State and University Library (with contributions by Birgit Schmidt)

2 Traditional disseminate-review-impact processes: how are these affected by Open Scholarship? Related concepts: usage, citations, (open) peer review, social networks, blogs, reporting, altmetrics, article-level metrics (ALM), h-index, Journal Impact Factor (JIF), Source Normalized Impact per Paper (SNIP), SCImago Journal Rank (SJR)

3 Impact assessment: Some History
The impact-factor 'obsession' is still ongoing. The Journal Impact Factor was developed from 1955 onwards and introduced in 1972 to help librarians select journal subscriptions; it has been as much embraced as attacked for its simplicity. Main criticisms: it defines impact at the journal rather than the article level, it suggests a linear scale of research quality, and it is prone to 'gaming'. Numerous other metrics and service providers are now available, e.g. CWTS Journal Indicators, SCImago Journal Rank, h-index and altmetrics.
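For orientation, the Journal Impact Factor criticized above is a simple two-year ratio (standard definition; the example figures below are hypothetical):

\mathrm{JIF}_Y = \frac{C_Y(Y-1) + C_Y(Y-2)}{N_{Y-1} + N_{Y-2}}

where C_Y(X) is the number of citations received in year Y by items the journal published in year X, and N_X is the number of citable items published in year X. A journal whose 100 citable items from 2014 and 2015 were cited 200 times in 2016 would thus have a 2016 JIF of 2.0, a journal-level average that says nothing about any individual article.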

4 Impact assessment: Some history
Interviewer: "I often hear ... that any prestigious journal should have a JIF of '2' or higher..." Eugene Garfield: "It is simply an arbitrary number; I can't justify it scientifically."


6 Reviewing assessment practice
DORA, the San Francisco Declaration on Research Assessment (American Society for Cell Biology, December 2012), recommends not using journal-based metrics to judge the quality of individual papers. The Leiden Manifesto (April 2015, by a group of research assessment experts) gives best-practice recommendations for metrics-based research assessment, summarized in 10 principles, e.g.:
#1 Quantitative evaluation should support qualitative, expert assessment.
#4 Keep data collection and analytical processes open, transparent and simple.
#5 Allow those evaluated to verify data and analysis.
#6 Account for variation by field in publication and citation practices.
#10 Scrutinize indicators regularly and update them.

7 The Metric Tide. A growing tide of metrics, driven by growing pressures for audit and evaluation, demand for more strategic intelligence, and competition within and between institutions. The research community uses metrics, but with too much emphasis on narrow indicators. Peer review, despite its flaws and limitations, still has widespread support across disciplines. Inappropriate indicators create perverse incentives, and indicators only meet their potential if underpinned by an open and interoperable infrastructure. Responsible metrics = robust + transparent + supplementing qualitative assessment + accounting for variation by field + recognising and anticipating the systemic and potential effects of indicators + updating indicators in response.

8 OpenUP project facts. Opening up new methods, indicators and tools for peer review, dissemination of research results, and impact measurement. Project start: June 2016 (duration: 30 months). Partners: Public Policy and Management Institute (PPMI), coordinator; University of Göttingen; University of Athens; University of Amsterdam; Know-Center, Graz; Austrian Institute of Technology; German Centre for Higher Education Research and Science Studies (DZHW); Frontiers Media; CNR.

9 The review-disseminate-assess process
What's missing? A common understanding of new methods, implementation good practices and policy implications, with a view towards Responsible Research and Innovation (RRI) principles. What's to be done in OpenUP? Facilitate an open dialogue and define a framework, validate mechanisms through pilots, offer trainings, and provide practical policy guidelines and recommendations.

10 Implementation (diagram): peer review, innovative dissemination, and metrics & indicators are validated through community-driven pilots and feed into policy recommendations.

11 Objectives: engage the community and raise awareness of quality Responsible Research and Innovation (RRI) review-disseminate-assess processes; explore, analyse and promote (open) peer review mechanisms; explore and promote innovative methods of dissemination and communication; define research metrics and indicators for different stakeholders; validate the OpenUP framework with community-driven pilots; produce cohesive policy recommendations to complement the European RRI activities.

12 OpenAIRE Open Access Infrastructure for Research in Europe
Fostering and furthering Open Scholarship in Europe and beyond. A human network and a digital network: 50 partners, including data centres, universities, libraries and repositories from all EU countries and beyond.

13 A European research information infrastructure
A mini EU-CRIS system. The OpenAIRE platform aggregates metadata and full text from data providers (literature repositories, OA journals, data repositories, CRIS systems, Zenodo) and processes them through validation, cleaning, de-duplication, inference and linking of organisations, projects, authors, datasets and publications. On top of this it offers platform services such as discovery, APIs, crowdsourcing, funding information, monitoring, reporting, classification, clustering, analysis, evaluation, usage data and impact.
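As an illustration of the de-duplication step listed above, here is a minimal, hypothetical Python sketch that merges harvested publication records on a normalised DOI, falling back to a normalised title. It is not OpenAIRE's actual implementation; all names (Record, dedupe_records) are invented for this example.

# Minimal, hypothetical sketch of metadata de-duplication
# (illustration only; not OpenAIRE's actual pipeline).
from dataclasses import dataclass
from typing import Optional

@dataclass
class Record:
    title: str
    doi: Optional[str]
    source: str  # e.g. the repository the record was harvested from

def _key(rec: Record) -> str:
    # Prefer the DOI as the de-duplication key; fall back to a normalised title.
    if rec.doi:
        return "doi:" + rec.doi.strip().lower()
    return "title:" + " ".join(rec.title.lower().split())

def dedupe_records(records: list[Record]) -> list[Record]:
    # Keep the first record seen for each key; later duplicates are dropped.
    seen: dict[str, Record] = {}
    for rec in records:
        seen.setdefault(_key(rec), rec)
    return list(seen.values())

harvested = [
    Record("Open peer review in practice", "10.1234/xyz", "repo-A"),
    Record("Open Peer Review in Practice", " 10.1234/XYZ", "repo-B"),  # same DOI, different case
    Record("A study without a DOI", None, "repo-C"),
]
print(len(dedupe_records(harvested)))  # prints 2

In a real aggregator the key would combine several cleaned fields, and conflicting metadata from the merged records would be reconciled rather than simply discarded.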

14 New modes of research and scholarly communication
COAR aims to facilitate this vision by bringing together research repositories as part of a global infrastructure, linking them across continents and around the world, enabling new forms of research and supporting new models of scholarly communication. This involves standards, policies, infrastructure, strategy, interoperability, support, awareness, integration… Research libraries have a central role to play. Inspired by Tim Berners-Lee.

15 Our vision
A global knowledge infrastructure, built upon a network of open access digital repositories, will enhance the provision, visibility and use of research outputs.

16 Factsheet. COAR e.V. is a registered not-for-profit association of repository initiatives under German law; office seat: Göttingen, DE; host: Göttingen State and University Library. Founded in Ghent, Belgium, on October 21, 2009 (28 members), it evolved out of the European DRIVER project (EC, FP7). One annual meeting of all members with a General Assembly. Members and partners (April 2015): about 100 member institutions from over 35 countries on all continents, plus 8 partner organizations. Elected Executive Board: Chairperson: Eloy Rodrigues, University of Minho, Portugal; Vice Chairperson: Carmen-Gloria Labbé, Cooperación Latinoamericana de Redes Avanzadas (CLARA), Uruguay; Treasurer: Márta Virágos, University and National Library of Debrecen, Hungary; Oya Rieger, Cornell University, USA; Wolfram Horstmann, Göttingen State and University Library, Germany; Daisy Selematsela, National Research Foundation, South Africa.

17 Summary & Conclusion Observations on research assessment Open Science
A rising tide of impact maesurement Critique of unidimensional indicators New and qualitative indicators on the rise Open Science ... requires new impact measures to assess new methods ... provides the opportunities to apply them Helping initiatives underway OpenUP explores pilots, practice and policies OpenAIRE provides information infrastructure COAR extends towards global reach

18 Outlook: Impact is multi-dimensional
Source: BBSRC / UK

