
1 RDA-WDS Publishing Data IG Data Bibliometrics Working Group

2 Why do we need Data Bibliometrics?
- Essentially, we need to understand the impact and value of the data being shared and distributed, much as we gather and capture the value of traditional article or journal publishing.
- Assessment is something that every domain, every researcher, every funder, and every administration is interested in.
- The lack of a framework for assessment is a barrier to greater data sharing. There is no answer to the question: "What is in it for me to share my data?"

3 The Goal of the Data Bibliometrics WG
- The overall objective of this working group is to conceptualize data metrics and corresponding services that are suitable for overcoming existing barriers.
- These new metrics and services are thus likely to:
  - initiate a cultural change among scientists,
  - encourage more and better data citations,
  - augment the overall availability and quality of research data,
  - increase data discoverability and reuse,
  - facilitate the reproducibility of research results.

4 About the Working Group
- Chair(s): Sarah Callaghan, Todd Carpenter, John Kratz, Kerstin Lehnert
- Working group members: 63 from more than 20 countries.
- Activities: conducted a survey and other data gathering, described the landscape, and identified key areas for focus.

5 Summary of survey of current status/opinions on data bibliometrics
- What do you currently use to evaluate the impact of data?
- What is currently missing and/or needs to be created for bibliometrics for data to become widely used? (n=92)
  1) Standards
  2) Data citation
  3) Consistent use of PIDs/DOIs
  4) Culture change / "A belief that they are valid"

6 Types of metrics

Data citation
  Pros:
  1. Most advanced method for data
  2. Fits in with existing metrics
  3. Operational systems already exist
  4. Researchers "understand" what citation counts mean
  Cons:
  1. Treats datasets as special cases of articles, while data is far more dynamic
  2. Lag in getting counts due to publication delays
  3. Citation counts are overloaded with connotations of "quality" when they don't actually measure that

Repository download statistics
  Pros:
  1. Quick response
  2. May give a better understanding of usage (depending on registration/reporting requirements)
  3. Each repository can collect its own statistics
  4. Technology is already in use and is mature
  Cons:
  1. Not centralized, so if data is in more than one repository, counts need to be amalgamated
  2. Nonstandard measures of counting across repositories
  3. Extra information on intended use is not captured if data is completely open
  4. Can't be used to determine usage for large datasets where server-side processing is the norm

Social media
  Pros:
  1. Quick response
  2. Captures interest by nonacademic communities
  Cons:
  1. It is uncertain how social media mentions map to usage: tweets can indicate useful case studies, but a large number of tweets doesn't correlate with usefulness to the community
  2. Not valued by researchers at present

Reference manager bookmarks
  Pros:
  1. Correlate with citations
  2. Quicker to determine than citation counts
  Cons:
  1. No standards for importing data citations into reference managers
  2. Data repositories don't make it easy (e.g., using a one-click button) to import citations
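The amalgamation problem noted under repository download statistics can be made concrete with a small sketch: merging per-repository download counts for datasets held in more than one place, keyed by DOI. This is purely illustrative; the function name, the report format, and the example DOIs and counts are assumptions, not part of any operational system, and the merged totals would remain only indicative because repositories count downloads in nonstandard ways.

```python
from collections import defaultdict

def amalgamate_downloads(reports):
    """Merge per-repository download counts keyed by dataset DOI.

    `reports` is a list of dicts, one per repository, each mapping
    DOI -> download count. Counts for the same DOI are summed, but
    because counting methods differ across repositories, the merged
    totals are indicative rather than a standardized metric.
    """
    totals = defaultdict(int)
    for report in reports:
        for doi, count in report.items():
            totals[doi] += count
    return dict(totals)

# Hypothetical reports from two repositories holding the same dataset:
repo_a = {"10.5072/example.1": 120, "10.5072/example.2": 40}
repo_b = {"10.5072/example.1": 75}

merged = amalgamate_downloads([repo_a, repo_b])
# -> {"10.5072/example.1": 195, "10.5072/example.2": 40}
```

A centralized service would additionally need to normalize what "a download" means per repository before summing; the simple addition above glosses over exactly the standardization gap the slide identifies.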

7 Who do these metrics affect?
- Data producers: want to know who else is using the data they've produced and how they're using it.
- Research funders: want a transparent, unbiased way of evaluating how important or useful a dataset is to the community.
- Repository managers: want to know which datasets they hold are of the most value to the community, and want to encourage data producers to deposit data.
- Data users: want to discover quickly and easily which are the "best" datasets for addressing their research problems.

8 Comparison of related metrics usage

9 Endorsements/Adopters
- National Information Standards Organization (NISO)
  - Draft recommendations for new assessment forms
  - Definitions and use cases document
  - Assessment for non-traditional outputs, such as research data and software
  - Data Collection Code of Practice
  - Group C draft released on Feb 25, 2016; Group A draft due by March 4, 2016; Group B draft due by March 17, 2016
  - Seeking public comment and reaction to these drafts
- One of the core needs identified by the Bibliometrics WG survey was the need for standards in this space.

10 Endorsements/Adopters
- California Digital Library: Making Data Count (NSF funded)
  - Partners: California Digital Library, PLOS, DataONE
  - Project page: mdc.lagotto.io
  - Data-level metrics prototype: dlm.datacite.org
  - Software: github.com/lagotto/lagotto
- JISC: Giving Researchers Credit for their Data (GRCD)
  - In the early stages of its inception and Phase 1 steps, GRCD relied on evidence produced by the WGs to prove the need for the service.
  - Successful in raising funds for a further phase.
  - This will enable them to go live, extend their engagement efforts, and investigate the potential to develop the app into a service in its own right.

11 Endorsements/Adopters
- CASRAI Dataset Level Metrics Group
  - Initiating efforts to deliver outputs that support database- and vendor-neutral interoperability of information about research data between repositories, publishers, academic administrators, funding agencies, and researchers
  - Coordinating with the NISO Altmetrics B Working Group
- re3data.org schema
  - Includes properties, at the data repository level, for citationReference, metrics, and citationGuidelineURL

12 How You Can Endorse
- Publishers, data centers, researchers, and administrators all need assessment metrics.
- Most of our outputs are informational resources, but there are some adoptions and implementations.
- Contact those specific projects to get engaged.

13 Next Steps and Contact Information
- Thinking on the topic of assessment will continue to evolve and develop over time.
- You can always join and participate in the Data Publishing IG, where the ongoing discussion will continue.
- For more information:
  - Kerstin Lehnert - lehnert@ldeo.columbia.edu
  - John Kratz - johnkratzcdl@gmail.com
  - Todd Carpenter - tcarpenter@niso.org
  - Sarah Callaghan - sarah.callaghan@stfc.ac.uk

