Shifting the research culture toward openness and reproducibility


Shifting the research culture toward openness and reproducibility Brian Nosek University of Virginia -- Center for Open Science http://briannosek.com/ -- http://cos.io/

Incentives for individual success are focused on getting it published, not getting it right Nosek, Spies, & Motyl, 2012

Norms vs. Counternorms (Anderson, Martinson, & DeVries, 2007)
- Communality (open sharing with colleagues) vs. Secrecy
- Universalism (research evaluated only on its merit) vs. Particularism (research evaluated by reputation and past productivity)
- Disinterestedness (motivated by knowledge and discovery, not personal gain) vs. Self-interestedness (treat science as a competition with other scientists)
- Organized skepticism (consider all new evidence, theory, and data, even if it contradicts one's prior work) vs. Organized dogmatism (invest career in promoting one's own theories, findings, and innovations)
- Quality (seek quality contributions) vs. Quantity (seek high volume)

Anderson, Martinson, & DeVries, 2007

Psychological conflict of interest: what is good for me, the scientist, versus what is good for science.

Signals: Making Behaviors Visible Promotes Adoption
Badges: Open Data, Open Materials, Preregistration
Psychological Science (Jan 2014); Kidwell et al., 2016, PLOS Biology

[Chart: percentage of articles reporting that data was available, 0-40% scale]

Options with Badges
- Adopt!
- Collaborate on a randomized trial: 100 journals, randomly assigned to offer badges or not, assessed after a year

http://cos.io/top

TOP Guidelines
- Data citation
- Design transparency
- Research materials transparency
- Data transparency
- Analytic methods (code) transparency
- Preregistration of studies
- Preregistration of analysis plans
- Replication

Data sharing levels
1. Article states whether data are available and, if so, where to access them.
2. Data must be posted to a trusted repository. Exceptions must be identified at article submission.
3. Data must be posted to a trusted repository, and reported analyses will be reproduced independently prior to publication.

Some TOP signatory organizations: AAAS/Science, American Academy of Neurology, American Geophysical Union, American Heart Association, American Meteorological Society, American Society for Cell Biology, Association for Psychological Science, Association for Research in Personality, Association of Research Libraries, Behavioral Science and Policy Association, BioMed Central, Committee on Publication Ethics, Electrochemical Society, Frontiers, MDPI, PeerJ, Pensoft Publishers, Public Library of Science, The Royal Society, Springer Nature, Society for Personality and Social Psychology, Society for a Science of Clinical Psychology, Ubiquity Press, Wiley

Options with TOP
- Become a signatory!
- Collaborate on an education/adoption campaign with editors
- Randomized trial

Context of Justification: confirmation; data-independent hypothesis testing; p-values interpretable.
Context of Discovery: exploration; data-contingent hypothesis generation; p-values NOT interpretable.
PREREGISTRATION marks the boundary between the two. Presenting exploratory results as confirmatory increases the publishability of results at the cost of their credibility.
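The point about data-contingent tests can be made concrete with a minimal simulation (not from the slides; the function name and parameters are illustrative). Under a true null hypothesis, a valid p-value is uniform on (0, 1), so a single preregistered test is "significant" about 5% of the time. But if a researcher explores several data-contingent hypotheses and reports only the best one, the chance of at least one spurious "significant" result grows well beyond the nominal 5%:

```python
import random

random.seed(42)

def false_positive_rate(n_tests, n_sims=100_000, alpha=0.05):
    """Fraction of simulated studies reporting at least one
    'significant' result when every null hypothesis is true.
    Under the null, each p-value is Uniform(0, 1)."""
    hits = 0
    for _ in range(n_sims):
        # Explore n_tests data-contingent hypotheses, report the best p-value
        best_p = min(random.random() for _ in range(n_tests))
        if best_p < alpha:
            hits += 1
    return hits / n_sims

print(false_positive_rate(1))    # ~0.05: one preregistered, data-independent test
print(false_positive_rate(10))   # ~0.40: best of ten exploratory tests reported
```

With ten exploratory tests the expected rate is 1 - 0.95^10, roughly 40%, which is why a reported p < .05 is only interpretable when the hypothesis was fixed before seeing the data.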

Preregistration Challenge http://cos.io/prereg/

Options with the Prereg Challenge
- Make more Elsevier journals eligible
- Co-marketing to promote engagement with the Challenge at eligible journals

Registered Reports (http://cos.io/rr): Design, then PEER REVIEW, then Collect & Analyze, Report, Publish. Review of the introduction and methods occurs prior to data collection, and the paper is published regardless of outcome. Peer review focuses on the quality of methods, rewarding accuracy of reporting over "beauty," enabling publication of negative results, and encouraging replications. Committee chair: Chris Chambers.

Options with Registered Reports
- Adopt
- Special issues as pilots
- Randomized trial (grant application pending)

The Kindergartener’s Guide to Improving Research

1. Show your work 2. Share

http://osf.io

OpenSesame

More will be added soon, and there is interest from 50+ research universities and organizations.

What can you do?
- OSF: http://osf.io/
- Prereg Challenge: http://cos.io/prereg
- Training: https://cos.io/our-services/training-services/
- Adopt TOP Guidelines: http://cos.io/top/

SHARE 2016 Community Meeting: hackathon July 11-12 and community meeting July 12-13, in Charlottesville, VA at the Center for Open Science (COS). Themes:
1. Pairing automatic enhancement with expert curation, and the creation of tools to support these efforts
2. Pedagogy to develop expert curation of local data and technical skills to get SHARE data into local services (and then give back to SHARE)
3. Accessing (meta)data across the research workflow

If you're unfamiliar with SHARE or the free, open dataset we are creating, you can read more at share-research.org. The meeting includes a hackathon and a working meeting, and you are welcome to register for one or both. We welcome a diversity of skills, skill levels, backgrounds, and interests at the hackathon; that diversity will make for a better, more impactful event. Please consider attending both events, whether or not you are an (experienced) programmer.

There is no registration fee, but participants are asked to cover their own travel and hotel costs in Charlottesville. There is a room block at the Omni Charlottesville at $149/night if booked by May 27; make your reservation by calling the Omni directly and mentioning the Center for Open Science. A limited budget for travel support is available if needed. http://www.omnihotels.com/hotels/charlottesville

Please fill out the registration form by Friday, April 15: https://arl.formstack.com/forms/share_community_meeting

These slides are shared at: https://osf.io/es6xz/ [take a picture]
Email: support@cos.io or nosek@virginia.edu