Webinar on increasing openness and reproducibility. April Clyburne-Sherin, Reproducible Research Evangelist.


Today’s webinar
1. Plan for reproducibility before you start: create a study plan, set up a reproducible project, preregistration
2. Keep track of things: version control, documentation, connect your research services
3. Contain bias: reporting
4. Archive + share your materials

What is the problem?

What is reproducibility?
The scientific method: observation, question, hypothesis, prediction, testing, analysis, replication.
Types of reproducibility: computational, empirical, and conceptual.

What are the barriers?

Why practice reproducibility?
The idealist: shoulders of giants!
- Validates scientific knowledge
- Allows others to build on your findings
- Improved transparency
- Increased transfer of knowledge
- Increased utility of your data + methods
The pragmatist:
- Increased efficiency
- Reduces false leads based on irreproducible findings
- Data sharing citation advantage (Piwowar 2013)
“It takes some effort to organize your research to be reproducible… the principal beneficiary is generally the author herself.” - Schwab & Claerbout

Today’s webinar
1. Plan for reproducibility before you start: create a study plan, set up a reproducible project, preregistration
2. Keep track of things: version control, documentation, connect your research services
3. Contain bias: reporting
4. Archive + share your materials

1. Plan for reproducibility before you start: create a study plan
Create a study plan before you gather your data. Begin documentation early; it shows the evolution of the study.
How? Specify:
- Research questions + hypotheses
- Study design: type of design, sampling, power and sample size, randomization (see the power-analysis sketch below)
- Variables measured: meaningful effect size
- Variables constructed: data processing
- Data management
- Analyses
- Sharing
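To make the "power and sample size" step concrete, here is a minimal R sketch using the pwr package. The effect size (d = 0.5), alpha, and power target are illustrative assumptions, not values from the webinar.

# Minimal power-analysis sketch for the study plan (assumed values, adjust to your design)
library(pwr)  # install.packages("pwr") if needed

# Per-group sample size to detect an assumed medium effect (d = 0.5)
# with 80% power at alpha = .05 in a two-sample t-test
plan <- pwr.t.test(d = 0.5, sig.level = 0.05, power = 0.80, type = "two.sample")
ceiling(plan$n)  # round up and record this n in the study plan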

1. Plan for reproducibility before you start: set up a reproducible project
Set up a centralized location for project management. Organization is especially important for collaboration:
- Easily find the most recent file version
- Eases transitions between lab members
- Allows for backup and version control
How? (See the folder-scaffold sketch below and the OSF slides that follow.)
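One lightweight way to start that centralized project is a scripted folder scaffold, so every collaborator sees the same layout from day one. This is a sketch only; the folder names are assumptions, not a layout prescribed by the webinar or by the OSF.

# Sketch: create a consistent project skeleton (hypothetical folder names)
project <- "my-study"
dirs <- file.path(project, c("data/raw", "data/processed", "code", "materials", "docs"))
invisible(lapply(dirs, dir.create, recursive = TRUE, showWarnings = FALSE))
writeLines(c("# my-study", "See docs/ for the study plan."), file.path(project, "README.md"))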

The Open Science Framework (osf.io): free, open source

1. Set-up a reproducible project

Persistent Citable Identifiers

1. Put data, materials, and code on the OSF
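If you prefer scripted uploads, the osfr package (rOpenSci's R client for the OSF API) can create a project and push files to it. A minimal sketch follows; the token, project title, and file paths are placeholders.

# Sketch: create an OSF project and upload materials with osfr
library(osfr)  # install.packages("osfr") if needed
osf_auth("YOUR_PERSONAL_ACCESS_TOKEN")  # placeholder; create a token in your OSF account settings

project <- osf_create_project(title = "Reproducibility webinar demo")    # hypothetical title
osf_upload(project, path = c("data/raw/survey.csv", "code/analysis.R"))  # placeholder paths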

1. Giving contributors access

1. Creating a wiki

1. Adding organizational structure - components
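Components can also be added from R. Continuing the hedged osfr sketch above, the GUID, component titles, and paths are again placeholders.

# Sketch: mirror the study's structure with OSF components
library(osfr)
project   <- osf_retrieve_node("abc12")                   # placeholder 5-character OSF GUID
materials <- osf_create_component(project, title = "Materials")
data_comp <- osf_create_component(project, title = "Data")
osf_upload(data_comp, path = "data/processed/clean.csv")  # placeholder path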

How can you make your research reproducible?
1. Plan for reproducibility before you start: create a study plan, set up a reproducible project, preregistration
2. Keep track of things: version control, documentation, connect your research services
3. Contain bias: reporting
4. Archive + share your materials

1. Plan for reproducibility before you start: preregistration
Preregister your study plan before you look at your data:
- Distinguishes a priori design decisions from post hoc ones
- Counters selective reporting and outcome reporting bias
- Preregistration of all study plans helps counter publication bias
Preregister your analysis plan before you look at your data:
- Defines your confirmatory analyses
- Decreases researcher degrees of freedom

Publication bias: Fanelli D (2010) “Positive” Results Increase Down the Hierarchy of the Sciences. PLoS ONE 5(4): e10068.

Researcher degrees of freedom Simmons, Nelson, & Simonsohn (2012)

1. How to preregister

Preregistration

How can you make your research reproducible?
1. Plan for reproducibility before you start: create a study plan, set up a reproducible project, preregistration
2. Keep track of things: version control, documentation, connect your research services
3. Contain bias: reporting
4. Archive + share your materials

2. Keep track of things: version control
Track your changes. Everything created manually should use version control.
- Tracks changes to files, code, and metadata
- Allows you to revert to old versions
- Make incremental changes: commit early, commit often
- Tools: Git / GitHub / Bitbucket (see the sketch below)
- Version control for data: metadata should be version controlled too
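The usual route is command-line Git (git init, git add, git commit). As one R-based alternative, here is a minimal sketch with the git2r package; the folder, file, and identity values are placeholders.

# Sketch: "commit early, commit often" from R via git2r
library(git2r)  # install.packages("git2r") if needed

# Create a placeholder file so the example is self-contained
dir.create("my-study/code", recursive = TRUE, showWarnings = FALSE)
writeLines("# analysis placeholder", "my-study/code/analysis.R")

repo <- init("my-study")                                               # git init
config(repo, user.name = "Jane Doe", user.email = "jane@example.org")  # placeholder identity
add(repo, "code/analysis.R")                                           # git add
commit(repo, message = "Add analysis script")                          # git commit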

2. Version control

2. Keep track of things: documentation
Document everything done by hand. Anything done by hand, or not automated from data and code, should be precisely documented.
- Document your software environment (e.g., dependencies, libraries, sessionInfo() in R); see the sketch below
- Use README files
- Make raw data read-only: you won’t edit it by accident, and it forces you to document or script your data processing
- Document in code comments
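Two of these habits can be scripted directly, as in this hedged R sketch: writing sessionInfo() out to a text file, and flipping raw data to read-only. The file paths are placeholders, and the chmod call applies to Unix-like systems.

# Sketch: record the software environment alongside the project
writeLines(capture.output(sessionInfo()), "session-info.txt")  # R version, packages, platform

# Sketch: make raw data read-only so it is never edited by accident
Sys.chmod("data/raw/survey.csv", mode = "0444")  # placeholder path; octal mode, Unix-like systems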

2. Keep track of things: connect your research services

The OSF connects the services researchers use.

3. Contain bias: reporting
Report transparently + completely.
Transparently means:
- Readers can use the findings
- Replication is possible
- Users are not misled
- Findings can be pooled in meta-analyses
Completely means:
- All results are reported, no matter their direction or statistical significance
How?
- Use reporting guidelines
- Avoid HARKing: Hypothesizing After the Results are Known
- Report all deviations from your study plan
- Report which decisions were made after looking at the data

4. Archive + share your materials: the Open Science Framework
Share your materials. Where doesn’t matter; that you share matters.
- Get credit for your code, your data, and your methods
- Increase the impact of your research

The OSF merges public and private workflows.

Incentives for openness: file downloads


How can you make your research reproducible?
1. Plan for reproducibility before you start
- Create a study plan: begin documentation at study inception
- Set up a reproducible project: centralize and organize your project management
- Preregistration: preregister your study + analysis plan
2. Keep track of things
- Version control: track your changes
- Documentation: document everything done by hand
- Connect your research services: track all your materials in one place
3. Contain bias
- Reporting: report transparently + completely
4. Archive + share your materials
- Where doesn’t matter. That you share matters.

OSF for Meetings: a free poster + presentation service

OSF for Institutions: integration with local services

Technology to enable change. Training to enact change. Incentives to embrace change.

Free stats + methods training

Technology to enable change. Training to enact change. Incentives to embrace change.

Transparency and Openness Promotion (TOP) Guidelines: guidance on open policies
1. Data citation
2. Design transparency
3. Research materials transparency
4. Data transparency
5. Analytic methods (code) transparency
6. Preregistration of studies
7. Preregistration of analysis plans
8. Replication

Badges: making behaviors visible promotes adoption

Case Study: Psychological Science

The Preregistration Challenge: $1,000 per preregistration


Registered Reports

Registered Reports: a list of journals offering Registered Reports (just to name a few)

Technology to enable change. Training to enact change. Incentives to embrace change.

Reproducible research practices and the OSF: feedback on how we could support you more is welcome.