RDA-WDS Publishing Data IG Data Bibliometrics Working Group

2. Why do we need Data Bibliometrics?
 Essentially, we need to understand the impact and value of the data being shared and distributed, much as we gather and capture the value of traditional article or journal publishing.
 Assessment is something that every domain, every researcher, every funder, and every administration is interested in.
 The lack of a framework for assessment is a barrier to greater data sharing: there is currently no answer to the question "What is in it for me to share my data?"

3. The Goal of the Data Bibliometrics WG
 The overall objective of this working group is to conceptualize data metrics and corresponding services that are suitable for overcoming existing barriers.
 These new metrics and services are thus likely to:
  initiate a cultural change among scientists,
  encourage more and better data citations,
  augment the overall availability and quality of research data,
  increase data discoverability and reuse, and
  facilitate the reproducibility of research results.

4. About the Working Group
 Chairs: Sarah Callaghan, Todd Carpenter, John Kratz, Kerstin Lehnert
 Working group members: 63, from more than 20 countries
 Activities: conducted a survey and other data gathering, described the landscape, and identified key areas for focus

5. Summary of survey of current status and opinions on data bibliometrics
 Questions asked:
  What do you currently use to evaluate the impact of data?
  What is currently missing and/or needs to be created for bibliometrics for data to become widely used? (n=92)
 Top answers to the second question:
  1) Standards
  2) Data citation
  3) Consistent use of PIDs/DOIs (see the sketch below)
  4) Culture change / "a belief that they are valid"
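Two of the top survey answers, data citation and consistent use of PIDs/DOIs, are already partly actionable with existing infrastructure. As a minimal illustration (not a WG output), the sketch below formats a human-readable data citation for a DataCite DOI using the content-negotiation service at doi.org; the DOI shown is a placeholder, and citation-style support can vary by registration agency.

# Minimal sketch: format a data citation from a DOI via content negotiation.
# Assumes the dataset has a registered DOI; the DOI below is a placeholder.
import requests

def cite_dataset(doi: str, style: str = "apa") -> str:
    """Ask the doi.org content-negotiation service for a formatted citation."""
    resp = requests.get(
        f"https://doi.org/{doi}",
        headers={"Accept": f"text/x-bibliography; style={style}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.text.strip()

if __name__ == "__main__":
    print(cite_dataset("10.5061/dryad.example"))  # placeholder DOI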

6. Types of metrics

Data citation
 Pros:
  1. Most advanced method for data
  2. Fits in with existing metrics
  3. Operational systems already exist
  4. Researchers "understand" what citation counts mean
 Cons:
  1. Treats datasets as special cases of articles, while data is far more dynamic
  2. Lag in getting counts due to publication delays
  3. Citation counts are overloaded with connotations of "quality", which they don't actually measure

Repository download statistics
 Pros:
  1. Quick response
  2. May give a better understanding of usage (depending on registration/reporting requirements)
  3. Each repository can collect its own statistics
  4. Technology is already in use and is mature
 Cons:
  1. Not centralized, so if data is held in more than one repository, counts need to be amalgamated (see the sketch below)
  2. Nonstandard measures of counting across repositories
  3. Extra information on intended use is not captured if data is completely open
  4. Can't be used to determine usage for large datasets where server-side processing is the norm

Social media
 Pros:
  1. Quick response
  2. Captures interest from nonacademic communities
 Cons:
  1. It is uncertain how social media mentions map to usage: tweets can indicate useful case studies, but a large number of tweets doesn't correlate with usefulness to the community
  2. Not valued by researchers at present

Reference manager bookmarks
 Pros:
  1. Correlate with citations
  2. Quicker to determine than citation counts
 Cons:
  1. No standards for importing data citations into reference managers
  2. Data repositories don't make it easy (e.g., with a one-click button) to import citations
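The first two cons of download statistics (decentralized counts and nonstandard counting) are worth making concrete. The sketch below is a hypothetical illustration, not a WG deliverable: it amalgamates download counts for one dataset mirrored in several repositories, each of which reports usage under a different, invented field layout.

# Hypothetical sketch: amalgamating download counts for a dataset that is
# mirrored in several repositories, each reporting usage in its own format.
# Repository names and field names are invented for illustration.

RAW_REPORTS = [
    {"repo": "repo-a", "downloads": 1403},             # simple count
    {"repo": "repo-b", "stats": {"total_gets": 258}},  # nested, different key
    {"repo": "repo-c", "monthly": [40, 55, 61]},       # per-month counts
]

def extract_count(report: dict) -> int:
    """Normalize one repository's report to a single download count."""
    if "downloads" in report:
        return report["downloads"]
    if "stats" in report:
        return report["stats"]["total_gets"]
    if "monthly" in report:
        return sum(report["monthly"])
    raise ValueError(f"Unknown report format from {report.get('repo')}")

total = sum(extract_count(r) for r in RAW_REPORTS)
print(f"Amalgamated downloads: {total}")  # 1403 + 258 + 156 = 1817

Without a shared code of practice (see the NISO work later in this deck), every aggregator has to maintain this kind of per-repository translation logic by hand.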

7. Who do these metrics affect?
 Data producers: want to know who else is using the data they've produced and how they're using it.
 Research funders: want a transparent, unbiased way of evaluating how important or useful a dataset is to the community.
 Repository managers: want to know which of the datasets they hold are of the most value to the community, and want to encourage data producers to deposit data.
 Data users: want to discover quickly and easily which are the "best" datasets for addressing their research problems.

8. Comparison of related metrics usage [chart-only slide; no further text in the transcript]

9. Endorsements/Adopters
 National Information Standards Organization (NISO)
  Draft recommendations for new assessment forms:
   Definitions and use cases document
   Assessment for non-traditional outputs, such as research data and software
   Data Collection Code of Practice
  Group C draft released on Feb 25, 2016; Group A draft due by March 4, 2016; Group B draft due by March 17, 2016
  Seeking public comment and reaction to these drafts
 One of the core needs identified by the Bibliometrics WG survey was the need for standards in this space.

10. Endorsements/Adopters
 California Digital Library: Making Data Count (NSF funded)
  Partners: California Digital Library, PLOS, DataONE
  Project page: mdc.lagotto.io
  Data-level metrics prototype: dlm.datacite.org (a query sketch follows below)
  Software: github.com/lagotto/lagotto
 JISC: Giving Researchers Credit for their Data (GRCD)
  In the early stages of its inception and its Phase 1 steps, GRCD relied on evidence produced by the WG to demonstrate the need for the service
  Successful in raising funds for a further phase
  This will enable the project to go live, to extend its engagement efforts, and to investigate the potential to develop the app into a service in its own right
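The Lagotto software behind the Making Data Count prototype exposes its collected events over a JSON API. The sketch below is a hedged illustration only: the endpoint path, query parameter, and response shape are assumptions, so consult the Lagotto documentation (github.com/lagotto/lagotto) for the actual interface.

# Hedged sketch: querying a data-level metrics (DLM) service for one dataset.
# The endpoint path and response fields are assumptions for illustration.
import requests

DLM_API = "https://dlm.datacite.org/api/works"  # assumed endpoint

def dataset_metrics(doi: str) -> dict:
    """Fetch the usage/citation events recorded for a dataset DOI."""
    resp = requests.get(DLM_API, params={"ids": f"doi:{doi}"}, timeout=30)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(dataset_metrics("10.5061/dryad.example"))  # placeholder DOI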

11. Endorsements/Adopters
 CASRAI Dataset Level Metrics group
  Initiating efforts to deliver outputs that support database- and vendor-neutral interoperability of information about research data between repositories, publishers, academic administrators, funding agencies, and researchers
  Coordinating with the NISO Altmetrics B working group
 re3data.org schema
  Includes properties at the data-repository level, such as citationReference, metrics, and citationGuidelineURL (see the sketch below)
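To show what these schema properties look like in practice, here is a hedged sketch that fetches one repository's re3data record and pulls out its citationGuidelineURL. The API path and repository identifier are assumptions based on re3data's public API, and the element names follow the re3data schema; check re3data.org for the current API and schema version.

# Hedged sketch: reading citation-related properties from a re3data record.
# API path and repository ID are assumptions for illustration.
import xml.etree.ElementTree as ET
import requests

REPO_ID = "r3d100000001"  # placeholder re3data repository identifier
url = f"https://www.re3data.org/api/v1/repository/{REPO_ID}"  # assumed path

resp = requests.get(url, timeout=30)
resp.raise_for_status()
root = ET.fromstring(resp.content)

# The schema places properties in a versioned namespace, so match on the
# local element name rather than hard-coding a namespace URI.
for el in root.iter():
    if el.tag.endswith("citationGuidelineURL"):
        print("Citation guideline:", el.text)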

12. How You Can Endorse
 Publishers, data centers, researchers, and administrators all need assessment metrics.
 Most of our outputs are informational resources, but there are some adoptions and implementations.
 Contact those specific projects to get engaged.

13. Next Steps and Contact Information
 Thinking on the topic of assessment is still evolving and will continue to develop over time.
 You can always join and participate in the Data Publishing IG, where the ongoing discussion will continue.
 For more information:
  Kerstin Lehnert -
  John Kratz -
  Todd Carpenter -
  Sarah Callaghan -