Persistent Identifiers, Discoverability, and Open Science

Fiona Murphy (1), Kerstin Lehnert (2), Brooks Hanson (3), and David Mellor (4)
(1) Institute for Environmental Analytics, University of Reading, Reading, UK
(2) Lamont-Doherty Earth Observatory, Columbia University, NY, USA
(3) American Geophysical Union, Washington DC, USA
(4) Center for Open Science, Charlottesville, VA, USA

Transparency and Openness Promotion (TOP) Guidelines

Eight Standards:
- Citation Standards
- Data Transparency
- Analytical Methods and Code Transparency
- Research Material Transparency
- Design and Analysis Transparency
- Preregistration of Studies
- Preregistration of Analysis Plans
- Replication

Three Levels: Disclosure, Requirement, Verification

Abstract

Early in 2016, the American Geophysical Union (AGU) announced that it was incorporating ORCID iDs into its submission workflows. This was accompanied by a strong statement supporting the use of other persistent identifiers, such as IGSNs and Crossref's open 'funding data' registry. This was partly in response to funders' desire to track and manage their outputs. However, the more compelling argument, and the reason why the AGU has also signed up to the Center for Open Science's Transparency and Openness Promotion (TOP) Guidelines, is that ultimately science and scientists will be the richer for these initiatives, owing to increased opportunities for interoperability, reproducibility and accreditation. The AGU has appealed to the wider community to engage with these initiatives, recognizing that, unlike the introduction of Digital Object Identifiers (DOIs) for articles by Crossref, full, enriched use of persistent identifiers throughout the scientific process requires buy-in from a range of scholarly communications stakeholders. At the same time, across the general research landscape, initiatives such as Project CRediT (a contributor roles taxonomy), Publons (reviewer acknowledgements) and the forthcoming Crossref DOI Event Tracker are contributing to our understanding and accreditation of contributions and impact.
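Persistent identifiers such as ORCID iDs are designed so that software can check them for transcription errors: the final character of an ORCID iD is an ISO 7064 MOD 11-2 check digit. As an illustration only (this sketch is not part of the poster, and the function names are our own), a validator might look like:

```python
import re

def orcid_check_digit(base_digits: str) -> str:
    """Compute the ISO 7064 MOD 11-2 check character for the
    first 15 digits of an ORCID iD."""
    total = 0
    for d in base_digits:
        total = (total + int(d)) * 2
    result = (12 - total % 11) % 11
    return "X" if result == 10 else str(result)

def is_valid_orcid(orcid: str) -> bool:
    """Validate an ORCID iD such as '0000-0002-1825-0097'
    (the sample iD used in ORCID's own documentation)."""
    if not re.fullmatch(r"\d{4}-\d{4}-\d{4}-\d{3}[\dX]", orcid):
        return False
    digits = orcid.replace("-", "")
    return orcid_check_digit(digits[:15]) == digits[15]
```

A submission system embedding ORCID iDs in its workflow could run such a check before storing the identifier, catching mistyped iDs at the point of entry.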
More specifically for earth science and scientists, the cross-functional Coalition for Publishing Data in the Earth and Space Sciences (COPDESS) was formed in October 2014 and is working to 'provide an organizational framework for Earth and space science publishers and data facilities to jointly implement and promote common policies and procedures for the publication and citation of data across Earth Science journals'. Clearly, the judicious integration of standards, registries and persistent identifiers, such as ORCID iDs and International Geo Sample Numbers (IGSNs), into the research and research-output processes is key to the success of this venture. However, these also give rise to a number of logistical, technological and cultural challenges. This poster seeks to identify these challenges and advance our understanding of them. The authors are keen to build knowledge from the gathering of case studies (successful or otherwise) and to hear from potential collaborators, in order to develop a robust structure that will empower both earth science and earth scientists and enable more nuanced, trustworthy, interoperable research in the near future.

Signatories [author note: shall we have a list of signatories to either or both of COPDESS and TOP?]
- American Astronomical Society
- American Geophysical Union
- American Meteorological Society
- Biological and Chemical Oceanography Data Management Office, Woods Hole Oceanographic Institution (BCO-DMO)
- Center for Open Science
- CLIVAR and Carbon Hydrographic Data Office (CCHDO)
- Community Inventory of EarthCube Resources for Geosciences Interoperability (CINERGI)
- Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI)
- Continental Scientific Drilling Coordination Office (CSDCO)
- COOPEUS
- Copernicus Publications
- Council of Data Facilities
- Dryad
- Elsevier
- European Geosciences Union
- Geochemical Society
- Geological Data Center of Scripps Institution of Oceanography
- Geological Society of America
- Geological Society of London
- GFZ German Research Centre for Geosciences
- ICSU World Data System
- Incorporated Research Institutions for Seismology (IRIS)
- Interdisciplinary Earth Data Alliance (IEDA)
- International Continental Drilling Program (ICDP)
- John Wiley and Sons
- LacCore: National Lacustrine Core Facility
- Magnetics Information Consortium (MagIC)
- Mineralogical Society of America
- Neotoma Paleoecology Database
- National Snow and Ice Data Center
- Nature Publishing Group
- Nordicana D
- OpenTopography
- Paleobiology Database
- Paleontological Society
- Proceedings of the National Academy of Sciences
- Rolling Deck to Repository (R2R) Program
- Science
- Springer
- UNAVCO

Case Studies

Case Study 1 [author note: I believe Kerstin described a good example of journal editors removing data citations because they were unaware of the ability to include them and the need to do so.]

Case Study 2 [author note, Fiona: the Research Resource Initiative is a positive story, although it doesn't deal with earth sciences (it's neuroscience)?]

[To Kerstin: are there any data rescue stories that could be added here?]

Problem Statement

The traditional research paper is the only widely accreditable research output. Because of this, researchers face few incentives to make other parts of their research accessible. The data used in publications can often be hard to access, and data collected but never used in a publication are even harder to use. Other parts of the research lifecycle face similar obstacles: if they are not included in the final publication, they will be lost.
Solutions
- Funders need to recognize a wide range of research outputs (including data collection, management, planning and enrichment, reviews, software, etc.) as contributing to the knowledge canon. The goals of reproducibility and interoperability should be woven into research funding calls, and producers of these outputs should be rewarded accordingly.
- Altmetrics should form part of the assessment framework in a transparent form.
- Researchers should be encouraged to submit a range of outputs for assessment.
- Editors need to know how data can be cited.
- Publishers need to inform authors and editors about data publication, draw up and enact practical policies, and work with partners to ensure robust links between publications and related outputs housed on other platforms.
- All stakeholders need to share best practices for citing data and other materials.

COPDESS Suggested Author Instructions and Best Practices for Journals

The Coalition on Publishing Data in the Earth and Space Sciences (COPDESS) develops and recommends best practices for journal author instructions around data and identifiers as a resource to the community. These best practices are consistent with, and based on, the COPDESS Statement of Commitment and have been developed with guidance from participants in COPDESS. They cover:
- Data Policy Statement
- Data Citation
- Sample Citation and Identification
- Crossref Funder Registry
- ORCID iDs

Data Policy Statement: XXX journal has endorsed the Statement of Commitment of the Coalition on Publishing Data in the Earth and Space Sciences (COPDESS). All data and software necessary to understand, evaluate, replicate, and build upon the reported research must be made available and accessible at the time of publication as far as possible. Data should, to the greatest extent possible, be stored in appropriate domain repositories that are widely recognized and used by the community, follow leading practices for data curation, and can provide additional data services.
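To make the 'Sample Citation and Identification' component concrete, here is a minimal sketch of how a production system might assemble a data citation from repository metadata. It follows the common 'Authors (Year): Title. Repository. DOI' pattern; the function name and the example record are illustrative assumptions, not a COPDESS specification:

```python
def format_data_citation(authors, year, title, repository, doi):
    """Assemble a data citation string from repository metadata.

    Pattern used (illustrative, not prescribed by COPDESS):
        Authors (Year): Title. Repository. https://doi.org/DOI
    """
    author_str = "; ".join(authors)
    return f"{author_str} ({year}): {title}. {repository}. https://doi.org/{doi}"

# Hypothetical dataset record (names and DOI are placeholders):
citation = format_data_citation(
    authors=["Lehnert, K.", "Murphy, F."],
    year=2016,
    title="Example geochemical dataset",
    repository="Interdisciplinary Earth Data Alliance (IEDA)",
    doi="10.1234/example",
)
```

Emitting the DOI as a resolvable https://doi.org/ link, rather than a bare identifier, is what lets reference-extraction tools and the Crossref/DataCite infrastructure maintain robust links between the article and the dataset.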
Best Practices
- Include a prominent statement about data availability and link to COPDESS.
- Encourage editors, reviewers, and staff to evaluate data and affirm availability.
- The statement above is one minimal version of data availability requirements. Other examples are provided here:
- Software availability should be included as part of data availability.

Logistical Challenges

This is not simply a technical challenge, although it has technical aspects. For reproducibility and interoperability to be meaningful and well understood, appropriate standards need to be agreed. Among these, persistent identifiers (PIDs) could well form a key part. However, in order to do so, they need to be understood by the respective research communities. In addition, workflows need to be optimized (and as far as possible automated) for accurate use and, if there are implications for workloads, consideration given to resourcing and training requirements.

The Center for Open Science
- Technology to enable change
- Training to enact change
- Incentives to embrace change

Adoption and Implementation: Next Steps
- Work with funders to incentivize the sharing, publication and re-use of data.
- Work with relevant societies, publishers and individual journal editors to ensure these policies are widely disseminated and, where necessary, that community responses can refine or adapt them to become more suitable for specific disciplines.
- Training for researchers and peer reviewers.

