1
Improving Openness and Reproducibility of Scientific Research
David Mellor, PhD, Project Manager at COS
david@cos.io | @EvoMellor
2
Norms and counternorms (Merton, 1942):
Communality (open sharing) vs. secrecy (closed)
Universalism (evaluate research on its own merit) vs. particularism (evaluate research by reputation)
Disinterestedness (motivated by knowledge and discovery) vs. self-interestedness (treat science as a competition)
Organized skepticism (consider all new evidence, even against one's prior work) vs. organized dogmatism (invest one's career in promoting one's own theories and findings)
Quality vs. quantity
7
Anderson, Martinson, & DeVries, 2007
10
Incentives for individual success are focused on getting it published, not getting it right (Nosek, Spies, & Motyl, 2012).
11
Problems: flexibility in analysis, selective reporting, ignoring null results, lack of replication (Sterling, 1959; Cohen, 1962; Lykken, 1968; Tukey, 1969; Greenwald, 1975; Meehl, 1978; Rosenthal, 1979)
12
Figure by FiveThirtyEight.com; Silberzahn et al., 2015
13
A Garden of Forking Paths (Jorge Luis Borges; Gelman and Loken). "Does X affect Y?" Exclude outliers? Control for year? Median or mean?
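As a rough illustration of why these forks matter (this simulation is not part of the original slides), here is a minimal Python sketch: even when X has no true effect on Y, trying a handful of defensible analysis paths and reporting only the best-looking one pushes the false-positive rate above the nominal 5%.

```python
# Minimal sketch (illustrative only): trying several "forking paths" and
# keeping the best result inflates false positives, even with no true effect.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_sims, n = 5000, 50
false_positives = 0

for _ in range(n_sims):
    x = rng.normal(size=n)              # predictor with no true effect
    y = rng.normal(size=n)              # outcome independent of x
    year = rng.integers(2000, 2010, n)  # an arbitrary covariate

    p_values = []
    for trim_outliers in (False, True):
        for control_year in (False, True):
            xs, ys, yr = x, y, year
            if trim_outliers:           # forking path: exclude |y| > 2 SD
                keep = np.abs(ys) < 2 * ys.std()
                xs, ys, yr = xs[keep], ys[keep], yr[keep]
            if control_year:            # forking path: partial out year
                ys = ys - np.polyval(np.polyfit(yr, ys, 1), yr)
            _, p = stats.pearsonr(xs, ys)
            p_values.append(p)

    if min(p_values) < 0.05:            # report only the "best" analysis
        false_positives += 1

print(f"False-positive rate with selective reporting: {false_positives / n_sims:.2%}")
```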
14
Franco, Malhotra, & Simonovits, 2015, SPPS
15
Reported tests (122): median p-value = .02, median effect size (d) = .29, 63% with p < .05 (Franco, Malhotra, & Simonovits, 2015, SPPS)
16
Reported tests (122): median p-value = .02, median effect size (d) = .29, 63% with p < .05
Unreported tests (147): median p-value = .35, median effect size (d) = .13, 23% with p < .05
(Franco, Malhotra, & Simonovits, 2015, SPPS)
17
Estimating Reproducibility: increasing depth and increasing breadth
21
Reproducibility in other fields:
Health sciences: Leslie McIntosh, Cynthia Hudson-Vitale
Developmental psychology: Michael Frank
Ecology: Emilio Bruna
Computer science: Christian Collberg, Todd Proebsting
22
Solution? Appeal to intentions, values, and goals. “Hey You! Behave by your values! Be objective!”
23
Incentives for individual success are focused on getting it published, not getting it right (Nosek, Spies, & Motyl, 2012).
24
Rewards:
What is published? Novel results, positive results, clean results.
What is not? Replications, negative results, mixed evidence.
27
Evidence to encourage change Technology to enable change Training to enact change Incentives to embrace change
28
Infrastructure, Metascience, Community
29
Infrastructure
30
Technology to enable change
33
Collaboration Documentation Archiving
35
Version Control
37
Merges Public-Private Workflows
39
Incentives for Openness: file downloads
40
File downloads and forks
41
Persistent Citable Identifiers
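As a hedged sketch of what a persistent identifier buys in practice (the DOI below is a placeholder, not one from the slides): because DOIs resolve through doi.org, which supports content negotiation, a registered dataset or materials package can be cited programmatically.

```python
# Minimal sketch: turning a persistent identifier (DOI) into a formatted
# citation via doi.org content negotiation. The DOI suffix is a placeholder;
# substitute the identifier minted for your own project or dataset.
import requests

doi = "10.17605/OSF.IO/XXXXX"  # hypothetical placeholder DOI
response = requests.get(
    f"https://doi.org/{doi}",
    headers={"Accept": "text/x-bibliography; style=apa"},
    timeout=10,
)
response.raise_for_status()
print(response.text)  # an APA-formatted citation for the registered object
```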
43
Registration
45
Connecting the workflow is critical to enabling change
46
Research workflow: search and discover, develop idea, design study, acquire materials, collect data, store data, analyze data, interpret findings, write report, publish report
47
The same workflow with tools connected at each stage: search and discover, develop idea, design study, acquire materials, collect data, store data, analyze data, interpret findings, write report, publish report. OpenSesame is one such tool.
48
Community
50
Training to enact change
51
Free training on how to make research more reproducible http://cos.io/stats_consulting
53
Incentives to embrace change
54
Transparency & Openness Promotion (TOP) Guidelines: agnostic to discipline, low barrier to entry, modular
55
Transparency & Openness Promotion Guidelines. Eight standards: 1. Data citation, 2. Design transparency, 3. Research materials, 4. Data, 5. Analytical methods, 6. Preregistered studies, 7. Preregistered analysis plans, 8. Registered Reports
56
Transparency & Openness Promotion Guidelines. Eight standards: 1. Data citation, 2. Design transparency, 3. Research materials, 4. Data, 5. Analytical methods, 6. Preregistered studies, 7. Preregistered analysis plans, 8. Registered Reports. Three tiers: Disclose, Require, Verify.
59
Signatories: 539 journal signatories, 59 organizational signatories. Learn more at http://cos.io/top
60
Signals: Making Behaviors Visible Promotes Adoption
62
[Charts: percentage of articles reporting that data was available, followed by a breakdown of whether that data was reportedly available, accessible, correct, usable, and complete.]
67
The $1,000,000 Preregistration Challenge
68
Exploratory research: finds unexpected trends, pushes knowledge into new areas, results in a testable hypothesis.
69
Confirmatory research: puts a hypothesis to the test, does not allow the data to influence the hypothesis, results are held to the highest standard of rigor.
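One way to keep these two modes honest, sketched here with hypothetical data (not drawn from the slides), is to explore freely on one split of the data, commit to a single hypothesis and test, and then run that pre-specified test exactly once on the held-out split.

```python
# Minimal sketch: separating exploratory from confirmatory analysis by
# splitting the data. The dataset and hypothesis here are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
group_a = rng.normal(loc=0.0, scale=1.0, size=200)
group_b = rng.normal(loc=0.3, scale=1.0, size=200)

# Exploratory half: look at anything, generate a testable hypothesis.
explore_a, confirm_a = group_a[:100], group_a[100:]
explore_b, confirm_b = group_b[:100], group_b[100:]
print("exploratory means:", explore_a.mean(), explore_b.mean())

# Confirmatory half: one pre-specified test, run exactly once.
t, p = stats.ttest_ind(confirm_a, confirm_b)
print(f"confirmatory t-test: t = {t:.2f}, p = {p:.3f}")
```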
70
https://cos.io/prereg
72
A preregistration specifies: research questions, data collection methods, variables, statistical tests, handling of outliers.
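For a concrete picture of what those elements look like when written down in advance, here is a hypothetical sketch (the field names are illustrative, not the actual Preregistration Challenge form):

```python
# Hypothetical sketch of the elements a preregistration pins down before
# any data are collected; field names are illustrative, not an official form.
preregistration = {
    "research_question": "Does X affect Y?",
    "data_collection": {"sample_size": 200, "stopping_rule": "fixed n, no peeking"},
    "variables": {"independent": ["X"], "dependent": ["Y"], "covariates": ["year"]},
    "statistical_test": "two-sided independent-samples t-test, alpha = 0.05",
    "outlier_rule": "exclude observations more than 3 SD from the group mean",
}

for field, value in preregistration.items():
    print(f"{field}: {value}")
```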
73
"Likelihood of Null Effects of Large NHLBI Clinical Trials Has Increased over Time" (Kaplan & Irvin, 2015): the positive result rate dropped from 57% to 8% after preregistration became required for clinical trials.
74
Registered Reports: Design, peer review (Stage 1), Collect & Analyze, Report, peer review (Stage 2), Publish
76
Who Publishes Registered Reports? (just to name a few) See the full list and compare features: osf.io/8mpji
77
http://osf.io/8mpji
78
Find this presentation at https://osf.io/ | david@cos.io | @EvoMellor | @OSFramework