Improving Openness and Reproducibility of Scientific Research
Brian Nosek, University of Virginia and Center for Open Science
Norms vs. Counternorms
Communality (open sharing) vs. Secrecy (closed)
Universalism (evaluate research on its own merit) vs. Particularism (evaluate research by reputation)
Disinterestedness (motivated by knowledge and discovery) vs. Self-interestedness (treat science as a competition)
Organized skepticism (consider all new evidence, even against one's prior work) vs. Organized dogmatism (invest career promoting one's own theories and findings)
Quality vs. Quantity
Anderson, Martinson, & DeVries, 2007
Incentives for individual success are focused on getting it published, not getting it right (Nosek, Spies, & Motyl, 2012).
Problems: flexibility in analysis, selective reporting, ignoring null results, lack of replication (Sterling, 1959; Cohen, 1962; Lykken, 1968; Tukey, 1969; Greenwald, 1975; Meehl, 1978; Rosenthal, 1979).
[Figure] Silberzahn et al., 2015. Figure credit: fivethirtyeight.com
Franco, Malhotra, & Simonovits, 2015, SPPS:
                          Reported tests (122)   Unreported tests (147)
Median p-value            .02                    .35
Median effect size (d)    .29                    .13
% p < .05                 63%                    23%
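To make the selective-reporting pattern concrete, here is a toy simulation, not the Franco et al. data: many underpowered studies of the same small true effect are run, and only the tests with p < .05 are "reported". The study counts, effect size, and sample size below are invented for illustration; the point is only that the reported subset shows a smaller median p-value and a larger median effect size than the full set, the same qualitative pattern as the table above.

```python
# Toy simulation of selective reporting (illustrative numbers only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_studies, n_per_group, true_d = 1000, 30, 0.2  # invented parameters

ds, ps = [], []
for _ in range(n_studies):
    treatment = rng.normal(true_d, 1.0, n_per_group)
    control = rng.normal(0.0, 1.0, n_per_group)
    _, p = stats.ttest_ind(treatment, control)
    pooled_sd = np.sqrt((treatment.var(ddof=1) + control.var(ddof=1)) / 2)
    ds.append((treatment.mean() - control.mean()) / pooled_sd)  # Cohen's d
    ps.append(p)

ds, ps = np.array(ds), np.array(ps)
reported = ps < 0.05  # the subset a selective reporter would publish

print(f"all tests:     median p = {np.median(ps):.2f}, median d = {np.median(ds):.2f}")
print(f"reported only: median p = {np.median(ps[reported]):.3f}, median d = {np.median(ds[reported]):.2f}")
```

Running this shows the reported subset with a much smaller median p and an inflated median effect size, even though every study estimates the same true effect.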
Positive result rate dropped from 57% to 8% after preregistration was required.
97% of original studies vs. 37% of replications reported positive results (Open Science Collaboration, 2015, Science).
Barriers
1. Perceived norms (Anderson, Martinson, & DeVries, 2007)
2. Motivated reasoning (Kunda, 1990)
3. Minimal accountability (Lerner & Tetlock, 1999)
4. I am busy (Me & You, 2016)
Technology to enable change
Training to enact change
Incentives to embrace change
Signals: Making Behaviors Visible Promotes Adoption
[Figure: % of articles reporting that data was available]
[Figure: % of articles reporting that data was available, broken down into reportedly available, available, correct data, usable data, complete data]
Two Modes of Research
Context of discovery: exploration, data contingent, hypothesis generating
Context of justification: confirmation, data independent, hypothesis testing
Preregistration Purposes
1. Discoverability: the study exists
2. Interpretability: distinguish exploratory and confirmatory approaches
Mistaking exploratory for confirmatory increases the publishability of results but decreases their credibility.
Positive result rate dropped from 57% to 8% after preregistration was required.
Are you okay with someone you love receiving a treatment based on clinical trials that were not preregistered?
Preregistration Challenge
Registered Reports
Design → peer review → Collect & Analyze → Report → peer review → Publish
Journals offering Registered Reports: AIMS Neuroscience; Attention, Perception, & Psychophysics; Cognition and Emotion; Comprehensive Results in Social Psychology; Cortex; Drug and Alcohol Dependence; eLife; European Journal of Neuroscience; Experimental Psychology; Journal of Accounting Research; Journal of Business and Psychology; Journal of Personnel Psychology; Journal of Media Psychology; Nutrition and Food Science Journal; Perspectives on Psychological Science; Royal Society Open Science; Social Psychology; Stress and Health; Work, Aging, and Retirement
Infrastructure Strategy
Solve problems researchers have now; enable an easy transition to openness.
My Problems
I lose stuff. I forget stuff.
Open Science Framework (OSF): a scholarly commons connecting the entire research lifecycle (search and discover, develop idea, design study, acquire materials, collect data, store data, analyze data, interpret findings, write report, publish report).
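OSF projects are also exposed through a public REST API (api.osf.io/v2), so project metadata and files can be pulled directly into an analysis pipeline. Below is a minimal read-only sketch in Python using the requests library; the project id "abc12" is a made-up placeholder, and the endpoint shapes reflect my reading of the OSF API v2, so they should be checked against the current documentation.

```python
# Minimal read-only sketch against the public OSF REST API (v2).
import requests

OSF_API = "https://api.osf.io/v2"
PROJECT_ID = "abc12"  # hypothetical placeholder; substitute a real public OSF project id

# Fetch basic metadata for a public project (an OSF "node").
node = requests.get(f"{OSF_API}/nodes/{PROJECT_ID}/", timeout=30)
node.raise_for_status()
print(node.json()["data"]["attributes"]["title"])

# List files stored under the default 'osfstorage' provider for that node.
files = requests.get(f"{OSF_API}/nodes/{PROJECT_ID}/files/osfstorage/", timeout=30)
files.raise_for_status()
for item in files.json()["data"]:
    attrs = item["attributes"]
    print(attrs["kind"], attrs["name"])
```

Reads against public projects need no credentials; authenticated writes require an OSF personal access token sent as a bearer token.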
OpenSesame (an experiment builder that integrates with OSF)
Ecosystem: funders, societies, universities, publishing
What can you do?
Researchers: try out OSF; take the Prereg Challenge
Editors: badges, Registered Reports, TOP
Departments: OSF/reproducibility workshops
Libraries: connect your institutional repository (IR) to OSF, or use OSF as an IR
All: conferences (OSF4Meetings)
Brian Nosek