Barriers to Data Sharing: New Evidence from a U.S. Survey Amy Pienta George Alter Jared Lyle University of Michigan IASSIST 2010 June 2, 2010 3:45-5:15pm.


Barriers to Data Sharing: New Evidence from a U.S. Survey. Amy Pienta, George Alter, Jared Lyle, University of Michigan. IASSIST 2010, June 2, 2010, 3:45-5:15pm, Auditorium

Acknowledgements: Supported by the Office of Research on Research Integrity (National Library of Medicine) and the National Science Foundation, and also by NICHD and the Library of Congress.

Research Questions
Does data sharing lead to greater research productivity?
What types of factors (PIs, grants, institutions) are associated with data sharing and research productivity?
Net of structural characteristics of grant awards, does data sharing lead to greater research productivity?
Is there one metric of research productivity that shows the value of data sharing?

NSF Data Sharing Policy National Science Foundation Important Notice 106 (April 17, 1989) states: "[NSF] expects investigators to share with other researchers, at no more than incremental cost and within a reasonable time, the primary data, samples, physical collections, and other supporting materials created or gathered in the course of the research. It also encourages awardees to share software and inventions or otherwise act to make such items or products derived from them widely useful and usable."

NIH Data Sharing Policy The NIH expects and supports the timely release and sharing of final research data from NIH-supported studies for use by other researchers. Starting with the October 1, 2003 receipt date, investigators submitting an NIH application seeking $500,000 or more in direct costs in any single year are expected to include a plan for data sharing or state why data sharing is not possible.

Research Productivity & Data
Traditionally conceptualized as one's publications and citations of that original work.
Data sharing as a measure of research productivity.
Data life cycle: moving from the original purpose of the data to uses beyond those originally intended.
Primary and secondary publications.

Prior Results - LEADS Data

Source                    # Records Reviewed   # Social Science Data
Recent NSF (1976+)        17,194               2,537
Historic NSF (Pre-1976)   96,403               4,019
NIH (1972+)               172,196              6,381
Total                     285,793              12,937

Sources of Information: NSF and NIH.

Sample Inclusion Criteria-LEADS Social science and/or behavioral science Original or primary data collection proposed, including assembling a database from existing (archival) sources

PI Survey
Conducted in 2009 as a web survey.
Awards beginning ,217 responses (24.9% response rate).
86.6% of PIs report having collected social science data.

Publication Measures
Self-reported total publications (.92 correlation with the staff-collected measure).
Secondary publications: number of publications without any research team member as an author.
Primary publications: number of publications that include the PI as one of the authors.
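The two derived measures above amount to a check against the research team's author list. A minimal sketch of that classification, using an invented team roster and publication records (not the study's data):

```python
# Hypothetical illustration of the two derived publication measures.
# The team roster and records are invented for the example.
team = {"PI", "CoPI", "Analyst"}        # PI plus research team members

publications = [
    {"title": "Original findings",  "authors": {"PI", "CoPI"}},
    {"title": "Outside reanalysis", "authors": {"Smith", "Jones"}},
    {"title": "Follow-up study",    "authors": {"Analyst", "Smith"}},
]

# Secondary publications: no research team member among the authors.
secondary = [p for p in publications if not (p["authors"] & team)]
# Primary publications: the PI appears as one of the authors.
primary = [p for p in publications if "PI" in p["authors"]]

print(len(primary), len(secondary))  # -> 1 1
```

Note that the two categories are not exhaustive: a paper by a team member other than the PI, co-authored with outsiders, counts as neither.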

Data Sharing Status Formal data sharing through a data archive or repository (11.5%) Informal data sharing (website, upon request) (44.6%) Not shared (43.9%)

PI Characteristics
Gender: 48.1% female
Race: 86.8% white
Age: mean 43.4
Rank: senior 54.3%; junior 25.7%; non-faculty 20%
Discipline: psychology/health 62.5%; core social science 25.5%; other 12%
Mean number of federal grants: 6.2

Institutional Characteristics
Carnegie classification: research university 78.7%; non-research 6.5%; PRO 12.4%; other 2.5%
U.S. region: Northeast 36%; Midwest 23.7%; South 21.6%; West 18.7%

Grant Characteristics
NSF 27.3% versus NIH 72.7%
Mean duration: 3.1 years

Bivariate Results: who shares data (in an archive)?
Women
Senior faculty
Core social science (especially compared to psychology/health)
Northeast (especially compared to the South)
Research universities and PROs
NSF grantees (22.4%, compared to 7.4% of NIH grantees)

Median Publication Counts

Analytic Models
Negative binomial regression models of publication counts, offset by the time between the initial award and the survey year.
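The offset term means the models effectively describe an annualized publication rate rather than a raw count. A minimal stdlib sketch of that idea with made-up grant records (not the study's data, and not the full negative binomial model):

```python
# Sketch of the exposure-offset idea behind the models: offsetting a
# count by log(years) amounts to modeling publications per year of
# exposure. The grant records below are invented for illustration.
from statistics import median

# (publications, years between initial award and survey, sharing status)
grants = [
    (12, 6, "archived"), (9, 5, "archived"),
    (8, 6, "informal"), (6, 4, "informal"),
    (3, 5, "not shared"), (4, 8, "not shared"),
]

def annual_rate(pubs, years):
    # Publications per year of exposure -- the quantity the log(years)
    # offset represents in a negative binomial count model.
    return pubs / years

by_status = {}
for pubs, years, status in grants:
    by_status.setdefault(status, []).append(annual_rate(pubs, years))

for status, rates in sorted(by_status.items()):
    print(status, round(median(rates), 2))
```

In the full model, exponentiated coefficients on the sharing indicators give rate ratios relative to grants whose data were not shared.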

Results - Total Publication Count
3 times as many publications when data are archived; 2 3/4 times as many when data are shared informally.
Not explained by PI, institutional, or grant award characteristics.
Older PIs (at time of award): greater research productivity (RP).
Health/psychology: less RP (vs. core social science).
Data produced at non-research universities: less RP (vs. research universities).
Longer awards: more RP.

Results - Secondary Publication Count
12 times as many secondary publications when data are archived; 10 3/4 times as many when data are shared informally.
Not explained by PI, institutional, or grant award characteristics (though the effect is reduced by about half).
Older PIs (at time of award): greater RP.
Health/psychology and others: less RP (vs. core social science).
PRO: more RP (vs. research universities).

Results - Primary Publications
2 times as many primary publications when data are archived; 2 times as many when data are shared informally.
Not explained by PI, institutional, or grant award characteristics.
Older PIs (at time of award): greater RP.
Health/psychology: more RP (vs. core social science).
Non-research universities and PROs: less RP (vs. research universities).
NIH: more RP than NSF.
Longer awards: more RP.

Conclusions
Data sharing is relatively infrequent across the social and behavioral sciences.
When data are shared, formally or informally, research productivity is higher. This holds even after PI, award, and institutional characteristics are accounted for.
When data are archived, the return on the investment is highest.
Secondary publication counts are the most sensitive to data sharing, but all outcome measures show advantages to data sharing.

Limitations
Causality: it is unclear whether data sharing actually leads to more primary publications.
Size of award is not controlled for.

Future Work
Evaluation of ARRA funding at NSF and NIH.
Natural experiment: ARRA award period compared to prior years.
Publication outcomes and other non-publication outcomes.