How to make the most of your publications in the humanities?

Presentation transcript:

How to make the most of your publications in the humanities? FOSTER-DARIAH workshop January 21, Berlin Görögh Edit

Agenda Open peer review in the context of open science Open peer review – alternative peer review tools and services Peer review for data journals in the humanities

Significance of open science Responding to the current state of scholarly communication: slow, redundant, wasteful; driven by commercial interests; chaotic state of copyright. Crisis of science: access, reproducibility, serials, evaluation; the illusion of scientific freedom. Melanie Imming, Jon Tennant. (2018). http://doi.org/10.5281/zenodo.128557 @melimming; quote from @chartgerink

Open Science Training Handbook. https://book.fosteropenscience.eu/

Success of an OA publishing platform: quality control and moderation; certification and reputation; motivation and engagement. https://publons.com/static/Publons-Global-State-Of-Peer-Review-2018.pdf Tennant, J. et al. Thinking outside the black box of peer review

Peer review reevaluated Anonymous, closed/opaque, selective (participation). How different is the principle of peer review from its practice? How do web technologies change our expectations of scholarly communication (publishing, peer review)? Can these technologies change the critical state of peer review? Can the strong connection between peer review and journal publishing be broken? Objective: mapping and promoting (open) peer review mechanisms

Open peer review Open identities Open reports Open participation Open interaction Open pre-review manuscripts Open final-version commenting Open platforms Open peer review is an umbrella term for a number of overlapping ways that peer review models can be adapted in line with the aims of Open Science. Ross-Hellauer, 2017, “What is open peer review? A systematic review”, F1000Research. DOI: 10.12688/f1000research.11369.2

Open identities Authors and reviewers aware of each other’s identity Open reports Review reports published alongside relevant article Open participation Wider community able to contribute to review process Open interaction Direct discussion between author(s)/reviewers, and/or between reviewers Open pre-review manuscripts Manuscripts/pre-prints available online in advance of peer review Open final-version commenting Review or commenting on final “version of record” publications. Open platforms (“decoupled review”) Review is facilitated by a different organizational entity than the venue of publication

Combinations: 122 definitions of open peer review analyzed (Ross-Hellauer, 2017)

Open identities Positives: increases the quality of reports; fosters transparency and helps avoid conflicts of interest; more civil language (in review and response). Negatives: difficulty in giving and taking critical feedback (reviewers might blunt their opinions for fear of reprisals, esp. from senior peers); labor-intensive process.

Open reports Positives: feedback improves the work and provides contextual information; giving better feedback increases review quality; enables credit and reward for review work; helps train young researchers in peer reviewing. Negatives: higher refusal rates amongst potential reviewers; time-consuming and more demanding process; fear of being exposed (esp. for early career researchers).

Open participation Positives: expands the pool of reviewers (including non-traditional research actors); supports cross-disciplinary dialogue; increases the number of reviewers; being part of the debate. Negatives: time issue, i.e. difficulties motivating commentators to take part and deliver useful critique; self-selecting reviewers tend to leave less in-depth responses; feedback from non-competent participants. T. Ross-Hellauer / OPR How & Why / PEERE Training School, Split, May 2018 and E. Görögh / OPR workshop results / DARIAH 2018, Paris, May 2018

Peer review and commenting: pre-publication peer review and commenting; post-publication peer review; collaborative peer review; interactive peer review; decoupled peer review. Open Science Training Handbook. https://book.fosteropenscience.eu/

Alternative peer review tools and services Publishers Publishing platforms Independent review services Repository-based review platforms & tools Review/annotation applications OpenUP. D3.1 Practices, evaluation and mapping: Methods, tools and user needs. http://openup-h2020.eu/wp-content/uploads/2017/01/OpenUP_D3.1_Peer-review-landscape-report-1.pdf

Publishing platforms Interactive peer review Collaborative peer review Post-publication peer review

Decoupled peer review

Preprint-based publishing Should researchers publish their findings before peer review? By Ivan Oransky and Adam Marcus, May 27, 2016

Annotation/commenting tools Any scientist can publish an assessment of the publications that she/he has read lately in less than one minute, by going to epistemio.com, searching for the publication, and adding a rating. Ratings and reviews can be either anonymous or signed, according to the author's choice. Epistemio hosts these ratings and reviews free of charge and provides them under an open access licence. The Hypothesis Project is a new effort to implement an old idea: a conversation layer over the entire web that works everywhere, without needing implementation by any underlying site.
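As an aside, public Hypothesis annotations are exposed through an open search API, so a reviewer or editor can pull the conversation attached to a given article programmatically. The Python sketch below is a minimal illustration assuming the public https://api.hypothes.is/api/search endpoint; the article URL is a placeholder, not a real reviewed paper.

```python
# Minimal sketch: fetch public Hypothesis annotations for an article URL
# via the Hypothesis search API. The article URL below is a placeholder.
import requests

def fetch_public_annotations(article_url, limit=20):
    """Return public Hypothesis annotations attached to article_url."""
    response = requests.get(
        "https://api.hypothes.is/api/search",
        params={"uri": article_url, "limit": limit},
        timeout=10,
    )
    response.raise_for_status()
    return response.json().get("rows", [])

if __name__ == "__main__":
    for note in fetch_public_annotations("https://example.org/my-article"):
        # Each row carries the annotation text, its author, and a timestamp.
        print(note["created"], note["user"], "-", note.get("text", "")[:80])
```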

Redefining the roles Changing role of editors, growing responsibility of authors, proactive reviewer stance, involvement of peers. Traditional peer review: gatekeeping function as a content filter; typically a closed system with a secretive and selective process; organised around journals; a non-accountable, editor-controlled "black box of peer review"; structurally limited (2-3 people). Open, collaborative peer review: constructive review in which quality control is achieved by consensus; self-organised, open and unrestricted communities; unrestricted content types and formats; elected "moderators" accountable to communities; semi-automated matching of content to reviewers.

Growing demands 1. Transparency Ross-Hellauer T, Deppe A, Schmidt B (2017) Survey on open peer review: Attitudes and experience amongst editors, authors and reviewers. PLoS ONE 12(12): e0189311. https://doi.org/10.1371/journal.pone.0189311 Stančiauskas, V. and Banelytė, V. (2017). OpenUP survey on researchers' current perceptions and practices in peer review, impact measurement and dissemination of research results. Accessed on May 3, 2017: https://doi.org/10.5281/zenodo.556157

Growing demands 2. Incentives to review: crediting peer review (Publons, Peerage of Science); peer review in academic promotion. 3. Training young scholars

OPR in Humanities This visualisation shows journal policies on (1) reviewers signing their peer reviews and (2) peer reviews being published. The information is sourced from Publons. The size of the bubbles on the plot corresponds to the number of journals in each subject for which Publons contains journal information. (2017) https://publons.com/blog/who-is-using-open-peer-review/

Humanities data journal SHARING: using CC licenses to advance reuse. CREDIT: getting credit for curating and publishing data. PEER REVIEW: applying community-based quality checks. REUSE: standardized metadata enabling reuse of data. OPEN: endorsing open science and FAIR principles to publish data.

Defining data Data: something to be measured, collected, reported, and analyzed. Data in the humanities: a digital, selective, machine-actionable construction of the object of humanistic inquiry. Two types of data in the humanities: 1. big data (relatively unstructured, messy and implicit, relatively large in volume, and varied in form); 2. smart data (semi-structured or structured, clean and explicit, as well as relatively small in volume and of limited heterogeneity) (Schöch, 2017). The contextualization of data is needed to understand research data management.

Data storage and sharing Schöpfel, J. and Prost, H. (2016). Research data management in social sciences and humanities: A survey at the University of Lille (France). LIBREAS. Library Ideas, 29.

Data sharing standards FAIR guiding principles for research data stewardship: a set of principles focused on ensuring that research objects are reusable, rendering data and services Findable, Accessible, Interoperable, and Reusable. FAIR simply describes the qualities or behaviours required of data resources to achieve optimal discovery and reuse. Data Citation Principles cover the purpose, function and attributes of citations and recognize the dual necessity of creating citation practices that are both understandable by humans and machine-actionable. Importance: data should be considered legitimate, citable products of research. Access: data citations should facilitate access to the data themselves and to associated metadata, documentation, code, and other materials. Persistence: unique identifiers, and metadata describing the data and its disposition, should persist. Interoperability and flexibility: data citation methods should be sufficiently flexible to accommodate the variant practices among communities, but should not differ so much that they compromise the interoperability of data citation practices across communities.
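To make the "machine-actionable" requirement concrete: a DOI minted for a dataset can be dereferenced by software as well as by readers. The Python sketch below assumes the standard DOI content-negotiation service behind doi.org and reuses the Zenodo DOI cited on the previous slide purely as an example; any DataCite or Crossref DOI would behave the same way.

```python
# Minimal sketch: retrieve a machine-actionable and a human-readable citation
# for a dataset DOI via DOI content negotiation.
import requests

DOI_URL = "https://doi.org/10.5281/zenodo.556157"  # example DOI from this deck

# Machine-actionable metadata (CSL JSON)
meta = requests.get(
    DOI_URL,
    headers={"Accept": "application/vnd.citationstyles.csl+json"},
    timeout=10,
).json()
print(meta.get("title"), "-", meta.get("publisher"))

# Human-readable citation in a chosen style
citation = requests.get(
    DOI_URL,
    headers={"Accept": "text/x-bibliography; style=apa"},
    timeout=10,
).text
print(citation.strip())
```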

Data evaluation Technical and subject-area review includes assessment of: data logic; consistency; formatting; non-proprietary (i.e., open-sourced/accessible) formats; plausibility; high quality; handling and reuse; units of measurement; quality of the collection method; presence of any anomalies. Criteria for evaluating data: Do the description and data make sense? Do the authors adequately explain the data's utility to the community? Are the protocol/references for generating the data adequate? Data format: is it standard for the field? Potentially reusable? Does the article follow the required data article template? Is the data well documented? Shaklee, P. and Cousijn, H. (2015). Can data be peer-reviewed? https://www.elsevier.com/connect/can-data-be-peer-reviewed Enago Academy. (2018). Should Data Sets Be Peer Reviewed? https://www.enago.com/academy/should-data-sets-be-peer-reviewed/
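For illustration only, a few of the technical checks listed above (open format, consistent structure, documented columns, missing values) can be automated before a human reviewer ever sees the dataset. The Python sketch below is a hypothetical helper, not any journal's actual pipeline; the file name and column list in the usage comment are made up.

```python
# Illustrative technical checks for a tabular dataset (hypothetical helper).
import csv
from pathlib import Path

OPEN_FORMATS = {".csv", ".tsv", ".json", ".xml", ".txt"}

def technical_check(path, expected_columns):
    """Return a list of issues found by basic technical checks."""
    issues = []
    path = Path(path)
    if path.suffix.lower() not in OPEN_FORMATS:       # non-proprietary format?
        issues.append(f"proprietary or unexpected format: {path.suffix}")
    with path.open(newline="", encoding="utf-8") as handle:
        rows = list(csv.reader(handle))
    if not rows:
        return issues + ["file is empty"]
    header, body = rows[0], rows[1:]
    if set(header) != set(expected_columns):           # matches the documentation?
        issues.append("column headers differ from the data documentation")
    if any(len(row) != len(header) for row in body):   # consistent row structure?
        issues.append("rows have inconsistent numbers of fields")
    empty = sum(1 for row in body for value in row if value.strip() == "")
    if empty:                                          # missing values to explain
        issues.append(f"{empty} empty cells should be explained in the documentation")
    return issues

# Example use (hypothetical file and columns):
# print(technical_check("survey_responses.csv", ["respondent_id", "year", "answer"]))
```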

Elements The data journal framework should include the following attributes: assignment of persistent identifiers (PIDs) to datasets; peer review of data; metadata information and technical checks; links to related outputs (journal articles); facilitation of data citation; standards compliance; discoverability (indexing of the data).
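As a rough illustration of how these attributes come together, the Python snippet below sketches a minimal dataset record carrying a PID, review status, links to related outputs, a citation string, and indexing information. All field names, identifiers, and values are hypothetical placeholders rather than a real data journal schema (real journals typically use richer standards such as DataCite metadata).

```python
# Illustrative sketch: a minimal dataset record covering the attributes above.
# Every identifier and value here is a placeholder.
dataset_record = {
    "pid": "https://doi.org/10.xxxx/example-dataset",   # persistent identifier (placeholder)
    "title": "Survey responses on open peer review practices",
    "creators": [{"name": "Doe, Jane", "orcid": "0000-0000-0000-0000"}],  # placeholder ORCID
    "metadata_checked": True,                            # metadata information and technical check
    "peer_review": {"status": "accepted", "type": "open report"},
    "related_outputs": [                                 # links to related journal articles
        {"relation": "isSupplementTo", "identifier": "https://doi.org/10.xxxx/article"}
    ],
    "citation": "Doe, J. (2019). Survey responses on open peer review practices [Data set].",
    "indexed_in": ["DataCite", "OpenAIRE"],              # discoverability
}

for key, value in dataset_record.items():
    print(f"{key}: {value}")
```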

Research data publication workflow RDA-WDS Publishing Data Workflows Working Group (WG) has developed a data publication process: Austin, C.C., Bloom, T., Dallmeier-Tiessen, S. et al. (2017). Int J Digit Libr 18: 77. https://doi.org/10.1007/s00799-016-0178-2

Benefits Visibility of research; acknowledgement of work (DOI); linking data to published results; complying with the H2020 data mandate; enhancing the findability of data (metadata); finding new collaborations and new research topics; adding to the researcher's profile (ORCID, OpenID, VIAF)

Thank you. goeroegh@sub.uni-goettingen.de