Improving Openness and Reproducibility of Scientific Research. David Mellor, PhD, Project Manager, Center for Open Science.

Presentation transcript:

Improving Openness and Reproducibility of Scientific Research. David Mellor, PhD, Project Manager, Center for Open Science.

Norms vs. counternorms (Merton, 1942):
- Communality (open sharing) vs. secrecy (closed)
- Universalism (evaluate research on its own merit) vs. particularism (evaluate research by reputation)
- Disinterestedness (motivated by knowledge and discovery) vs. self-interestedness (treat science as a competition)
- Organized skepticism (consider all new evidence, even against one's prior work) vs. organized dogmatism (invest career promoting one's own theories and findings)
- Quality vs. quantity


Anderson, Martinson, & DeVries, 2007

Incentives for individual success are focused on getting it published, not getting it right. (Nosek, Spies, & Motyl, 2012)

Problems:
- Flexibility in analysis
- Selective reporting
- Ignoring nulls
- Lack of replication
(Sterling, 1959; Cohen, 1962; Lykken, 1968; Tukey, 1969; Greenwald, 1975; Meehl, 1978; Rosenthal, 1979)

Figure by FiveThirtyEight.com; Silberzahn et al., 2015

"A Garden of Forking Paths" (Jorge Luis Borges; Gelman & Loken)
"Does X affect Y?" Exclude outliers? Control for year? Median or mean?
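
To make the problem concrete, here is a minimal simulation (not from the slides; all choices are illustrative) of how exploring several defensible analysis paths on pure-noise data and reporting only the best one inflates the false-positive rate well above the nominal 5%:

```python
# Minimal sketch (illustrative): simulate pure-noise data, then try four
# defensible analysis paths -- outliers kept or dropped, split at the mean
# or at the median -- and keep the best p-value.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_sims, n = 2000, 50
false_positives = 0

for _ in range(n_sims):
    x = rng.normal(size=n)  # "predictor": pure noise
    y = rng.normal(size=n)  # "outcome": unrelated to x by construction
    pvals = []
    for drop_outliers in (False, True):
        xi, yi = x, y
        if drop_outliers:
            keep = np.abs(yi - yi.mean()) < 2 * yi.std()
            xi, yi = xi[keep], yi[keep]
        for cut in (np.mean(xi), np.median(xi)):
            below, above = yi[xi <= cut], yi[xi > cut]
            pvals.append(stats.ttest_ind(below, above).pvalue)
    if min(pvals) < 0.05:  # report only the "best" path
        false_positives += 1

print(f"False-positive rate across forking paths: {false_positives / n_sims:.1%}")
```

Even with only four paths per dataset, the realized false-positive rate lands well above the nominal 5%, which is the flexibility-in-analysis problem the preceding slides describe.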

Franco, Malhotra, & Simonovits, 2015, SPPS

Reported tests (122): median p-value = .02; median effect size (d) = .29; % p < .05 = 63%
Unreported tests (147): median p-value = .35; median effect size (d) = .13; % p < .05 = 23%
(Franco, Malhotra, & Simonovits, 2015, SPPS)

Estimating Reproducibility: Increasing Depth, Increasing Breadth

Reproducibility in other fields:
- Health Sciences: Leslie McIntosh, Cynthia Hudson-Vitale
- Developmental Psychology: Michael Frank
- Ecology: Emilio Bruna
- Computer Science: Christian Collberg, Todd Proebsting

Solution? Appeal to intentions, values, and goals. “Hey You! Behave by your values! Be objective!”

Incentives for individual success are focused on getting it published, not getting it right. (Nosek, Spies, & Motyl, 2012)

Rewards: what is published, and what is not?
Published: novel results, positive results, clean results.
Not published: replications, negative results, mixed evidence.

Evidence to encourage change. Technology to enable change. Training to enact change. Incentives to embrace change.

Infrastructure. Metascience. Community.

Infrastructure

Technology to enable change

Collaboration, Documentation, Archiving

Version Control

Merges, Public-Private Workflows

File downloads: Incentives for Openness

File downloads, Forks

Persistent Citable Identifiers

Registration

Connecting the workflow is critical to enabling change

The research workflow: Develop idea → Design study → Acquire materials → Collect data → Store data → Analyze data → Interpret findings → Write report → Publish report → Search and discover

The same workflow, with tools such as OpenSesame connecting into each stage.
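
These connections are also exposed programmatically: the OSF has a public REST API at api.osf.io. A minimal sketch of reading one public project's metadata (the project ID "abc12" is a hypothetical placeholder):

```python
# Minimal sketch: read one public project's metadata from the OSF v2 API.
# The project ID "abc12" is a hypothetical placeholder; substitute any
# public OSF project's ID.
import requests

OSF_API = "https://api.osf.io/v2"
node_id = "abc12"  # hypothetical

resp = requests.get(f"{OSF_API}/nodes/{node_id}/")
resp.raise_for_status()
attrs = resp.json()["data"]["attributes"]

print("Title:       ", attrs["title"])
print("Public:      ", attrs["public"])
print("Date created:", attrs["date_created"])
```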

Community

Training to enact change

Free training on how to make research more reproducible

Incentives to embrace change

Transparency & Openness Promotion Guidelines:
- Agnostic to discipline
- Low barrier to entry
- Modular

Transparency & Openness Promotion Guidelines
Eight standards:
1. Data citation
2. Design transparency
3. Research materials
4. Data
5. Analytical methods
6. Preregistered studies
7. Preregistered analysis plans
8. Registered Reports
Three tiers: Disclose, Require, Verify

Signatories: 539 journal signatories, 59 organizational signatories. Learn more at

Signals: Making Behaviors Visible Promotes Adoption

[Figure: % of articles reporting that data was available]
[Figure: of articles reporting available data, the % that was accessible, correct, usable, and complete]

The $1,000,000 Preregistration Challenge

Exploratory research:
- Finds unexpected trends
- Pushes knowledge into new areas
- Results in a testable hypothesis

Confirmatory research:
- Puts a hypothesis to the test
- Does not allow data to influence the hypothesis
- Results are held to the highest standard of rigor

A preregistration specifies in advance: research questions, data collection methods, variables, statistical tests, and handling of outliers.
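
One way to make those commitments concrete is to freeze the analysis as a script before any data exist. A minimal sketch, with hypothetical file and column names:

```python
# Hypothetical preregistered analysis script, written and frozen before
# data collection: hypothesis, variables, outlier rule, and test are all
# fixed in advance, so the observed data cannot influence these choices.
import pandas as pd
from scipy import stats

ALPHA = 0.05       # significance threshold, fixed in advance
OUTLIER_SD = 3.0   # preregistered rule: drop outcomes > 3 SD from the mean

df = pd.read_csv("trial_data.csv")  # hypothetical data file

# Apply the preregistered outlier rule identically to all observations
z = (df["outcome"] - df["outcome"].mean()) / df["outcome"].std()
df = df[z.abs() <= OUTLIER_SD]

# Preregistered test: independent-samples t-test, treatment vs. control
treatment = df.loc[df["group"] == "treatment", "outcome"]
control = df.loc[df["group"] == "control", "outcome"]
result = stats.ttest_ind(treatment, control)

print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}, "
      f"significant at alpha = {ALPHA}: {result.pvalue < ALPHA}")
```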

Likelihood of Null Effects of Large NHLBI Clinical Trials Has Increased over Time (Kaplan & Irvin, 2015): the positive-result rate dropped from 57% to 8% after preregistration became required for clinical trials.

Registered Reports: Design → PEER REVIEW → Collect & Analyze → Report → Publish

Who Publishes Registered Reports? (just to name a few) See the full list and compare features: osf.io/8mpji

Find this presentation