Improving Openness and Reproducibility of Scientific Research
David Mellor, PhD, Project Manager
Norms vs. Counternorms (Merton, 1942)
Communality (open sharing) vs. Secrecy (closed)
Universalism (evaluate research on its own merit) vs. Particularism (evaluate research by reputation)
Disinterestedness (motivated by knowledge and discovery) vs. Self-interestedness (treat science as a competition)
Organized skepticism (consider all new evidence, even against one's prior work) vs. Organized dogmatism (invest career promoting one's own theories, findings)
Quality vs. Quantity
Anderson, Martinson, & DeVries, 2007
Incentives for individual success are focused on getting it published, not getting it right Nosek, Spies, & Motyl, 2012
Problems Flexibility in analysis Selective reporting Ignoring nulls Lack of replication Sterling, 1959; Cohen, 1962; Lykken, 1968; Tukey, 1969; Greenwald, 1975; Meehl, 1978; Rosenthal, 1979
Figure by FiveThirtyEight.com; Silberzahn et al., 2015
A Garden of Forking Paths (Jorge Luis Borges; Gelman and Loken)
"Does X affect Y?" Exclude outliers? Control for year? Median or mean?
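The talk itself contains no code, but a minimal simulation makes the forking-paths point concrete: even when X has no true effect on Y, trying every combination of defensible choices (exclude outliers or not, control for year or not, mean-based or median-based test) frequently yields at least one p < .05. The sketch below is illustrative only; the three forks, sample size, and number of simulations are my own assumptions, not from the slides.

```python
# A minimal sketch of the "garden of forking paths": with no true effect,
# exploring every combination of analysis choices often produces at least
# one nominally significant result.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def forked_pvalues(n=100):
    """Return p-values from every combination of three analysis choices."""
    x = rng.integers(0, 2, n)            # two groups; no true effect on y
    year = rng.integers(2000, 2010, n)   # an arbitrary covariate
    y = rng.normal(size=n)               # outcome, independent of x

    pvals = []
    for drop_outliers in (False, True):
        for control_year in (False, True):
            for use_median in (False, True):
                yy, xx, yr = y, x, year
                if drop_outliers:                      # "Exclude outliers?"
                    keep = np.abs(yy - yy.mean()) < 2 * yy.std()
                    yy, xx, yr = yy[keep], xx[keep], yr[keep]
                if control_year:                       # "Control for year?"
                    slope, intercept = np.polyfit(yr, yy, 1)
                    yy = yy - (slope * yr + intercept)
                if use_median:                         # "Median or mean?"
                    p = stats.mannwhitneyu(yy[xx == 0], yy[xx == 1]).pvalue
                else:
                    p = stats.ttest_ind(yy[xx == 0], yy[xx == 1]).pvalue
                pvals.append(p)
    return pvals

hits = sum(min(forked_pvalues()) < .05 for _ in range(1000))
print(f"At least one 'significant' path in {hits / 10:.1f}% of null datasets")
```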
Franco, Malhotra, & Simonovits, 2015, SPPS
Reported tests (122): median p-value = .02, median effect size (d) = .29, % p < .05 = 63%
Unreported tests (147): median p-value = .35, median effect size (d) = .13, % p < .05 = 23%
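To make the contrast between reported and unreported tests concrete, here is a toy simulation (my own illustration, not Franco et al.'s data or method) of why selective reporting inflates the published record: when only p < .05 tests are written up, reported effect sizes systematically overestimate the true effect. The true effect, sample size, and number of studies are arbitrary assumptions.

```python
# Toy simulation of selective reporting: report only significant tests and
# the "published" effect sizes come out much larger than the true effect.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
true_d, n = 0.15, 50                 # small true effect, modest sample size

reported, unreported = [], []
for _ in range(5000):
    a = rng.normal(0, 1, n)
    b = rng.normal(true_d, 1, n)
    p = stats.ttest_ind(a, b).pvalue
    d = (b.mean() - a.mean()) / np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
    (reported if p < .05 else unreported).append(d)

print(f"true d = {true_d}")
print(f"median d among 'reported' (p < .05) tests:    {np.median(reported):.2f}")
print(f"median d among 'unreported' (p >= .05) tests: {np.median(unreported):.2f}")
```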
Estimating Reproducibility: Increasing Depth, Increasing Breadth
Reproducibility in other fields: Health Sciences (Leslie McIntosh, Cynthia Hudson-Vitale), Developmental Psychology (Michael Frank), Ecology (Emilio Bruna), Computer Science (Christian Collberg, Todd Proebsting)
Solution? Appeal to intentions, values, and goals. “Hey You! Behave by your values! Be objective!”
Incentives for individual success are focused on getting it published, not getting it right Nosek, Spies, & Motyl, 2012
Rewards: What is published? Novel results, positive results, clean results. What is not? Replications, negative results, mixed evidence.
Evidence to encourage change. Technology to enable change. Training to enact change. Incentives to embrace change.
Infrastructure, Metascience, Community
Infrastructure
Technology to enable change
Collaboration Documentation Archiving
Version Control
Merges; Public-Private Workflows
Incentives for Openness: File downloads, Forks
Persistent Citable Identifiers
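As an aside on what "persistent, citable" buys in practice: a DOI can be resolved to current citation metadata through standard content negotiation at doi.org, so a citation keeps working even if the hosting URL changes. Below is a minimal sketch under that assumption; the DOI shown is a placeholder, not a real identifier.

```python
# Resolve a DOI to citation metadata via doi.org content negotiation.
import requests

def doi_metadata(doi):
    """Fetch citation metadata (CSL JSON) for a DOI from doi.org."""
    resp = requests.get(
        f"https://doi.org/{doi}",
        headers={"Accept": "application/vnd.citationstyles.csl+json"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

meta = doi_metadata("10.17605/OSF.IO/XXXXX")   # placeholder DOI, substitute a real one
print(meta.get("title"), meta.get("issued"))
```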
Registration
Connecting the workflow is critical to enabling change
The research workflow: Develop idea → Design study → Acquire materials → Collect data → Store data → Analyze data → Interpret findings → Write report → Publish report → Search and discover
The same workflow, with tools supporting each stage (e.g., OpenSesame)
Community
Training to enact change
Free training on how to make research more reproducible
Incentives to embrace change
Transparency & Openness Promotion Guidelines: agnostic to discipline, low barrier to entry, modular
Transparency & Openness Promotion Guidelines
Eight Standards:
1. Data Citation
2. Design transparency
3. Research materials
4. Data
5. Analytical methods
6. Preregistered studies
7. Preregistered analysis plans
8. Registered Reports
Three Tiers: Disclose, Require, Verify
Signatories: 539 journal signatories, 59 organizational signatories. Learn more at
Signals: Making Behaviors Visible Promotes Adoption
[Chart: % of articles reporting that data was available (y-axis 0–40%)]
[Chart: of articles reporting that data was available, the share that was reportedly available vs. accessible vs. correct vs. usable vs. complete (y-axis 0–100%)]
The $1,000,000 Preregistration Challenge
Exploratory research: finds unexpected trends, pushes knowledge into new areas, results in a testable hypothesis.
Confirmatory research: puts a hypothesis to the test, does not allow data to influence the hypothesis, results are held to the highest standard of rigor.
A preregistration specifies: research questions, data collection methods, variables, statistical tests, outliers.
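A hedged sketch of what preregistering those elements can look like in practice (my illustration, not an OSF template): every choice that could fork the analysis is frozen in a plan before data collection, and the confirmatory analysis later applies exactly that plan. All variable names, rules, and numbers below are assumptions for the example.

```python
# Freeze the analysis plan before any data exist, then run only that plan.
import json
import numpy as np
import pandas as pd
from scipy import stats

PLAN = {
    "question": "Does X affect Y?",
    "dependent_variable": "y",
    "independent_variable": "condition",
    "test": "two-sided Welch t-test",
    "alpha": 0.05,
    "outlier_rule": "drop y more than 3 SD from the mean",
    "planned_n_per_group": 120,
}

def run_preregistered_analysis(df: pd.DataFrame, plan: dict) -> dict:
    """Apply exactly the preregistered outlier rule and test -- nothing else."""
    y, g = plan["dependent_variable"], plan["independent_variable"]
    df = df[np.abs(df[y] - df[y].mean()) <= 3 * df[y].std()]
    a = df.loc[df[g] == "treatment", y]
    b = df.loc[df[g] == "control", y]
    t, p = stats.ttest_ind(a, b, equal_var=False)
    return {"t": float(t), "p": float(p), "significant": p < plan["alpha"]}

# Register the frozen plan (e.g., in a time-stamped registry) before data collection.
print(json.dumps(PLAN, indent=2))
```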
Likelihood of Null Effects of Large NHLBI Clinical Trials Has Increased over Time (Kaplan and Irvin, 2015): the positive result rate dropped from 57% to 8% after preregistration became required for clinical trials.
Registered Reports: Design → Peer Review → Collect & Analyze → Report → Publish
Who Publishes Registered Reports? (just to name a few) See the full list and compare features: osf.io/8mpji
Find this presentation