Scientific Utopia: Improving Openness and Reproducibility
Brian Nosek, University of Virginia / Center for Open Science
New Challenges?
- Low power
- Overabundance of positive results
- Questionable research practices
- Ignoring null results
- Lack of replication
- Problems with NHST
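To make "low power" concrete, here is a minimal sketch (not from the talk; the effect size and sample size are illustrative assumptions) that computes the power of a two-sided one-sample z-test for a small-to-medium effect at a typical sample size:

```python
from math import sqrt
from statistics import NormalDist

def z_test_power(d: float, n: int, alpha: float = 0.05) -> float:
    """Power of a two-sided one-sample z-test (known sigma = 1)
    for true standardized effect size d and sample size n."""
    nd = NormalDist()
    z_crit = nd.inv_cdf(1.0 - alpha / 2.0)   # critical value, e.g. 1.96 for alpha = .05
    shift = d * sqrt(n)                       # noncentrality of the test statistic
    # Probability the statistic lands beyond either critical value:
    return (1.0 - nd.cdf(z_crit - shift)) + nd.cdf(-z_crit - shift)

# A small-to-medium effect (d = 0.3) with n = 20 observations:
print(round(z_test_power(d=0.3, n=20), 2))  # → 0.27
```

Power of roughly 0.27 means that even when the effect is real, most such studies will fail to detect it, and the conventional 0.8 target would require several times more data.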
Long-standing Challenges
- Sterling (1959)
- Cohen (1962)
- Lykken (1968)
- Tukey (1969)
- Greenwald (1975)
- Meehl (1978)
- Rosenthal (1979)
Solutions (in brief)
- Disclosure, openness
- Distinguish confirmatory vs. exploratory
- Replication
- Accumulate evidence
- Narrow use of NHST
Problems ✔  Solutions ✔  Implementation ✘
Central issue
Incentives for individual success are focused on getting it published, not on getting it right.
(Nosek, Spies, & Motyl, 2012, Perspectives on Psychological Science)
Implementation Challenges
- Perceived norms (Anderson, Martinson, & DeVries, 2007)
- Temporal construal (Trope & Liberman, 2003)
- Motivated reasoning (Kunda, 1990)
- Minimal accountability (Lerner & Tetlock, 1999)
- I am busy (Me & You, 2013)
(Brief) History of COS
- January: Incorporated as non-profit
- March: $5.25 million grant
- April: 2nd grant
- May: 6 staff + 4 interns hired
- Today: Occupying offices in Charlottesville, Virginia
Mission
Improve Openness, Integrity, and Reproducibility of Scientific Research
Implementation Strategy
1. Support existing workflow with useful services
2. Enable good practices
   – Openness
   – Preregistration
   – Replication
3. Nudge incentives with both top-down and bottom-up interventions
   – Disclosure requirements
   – Submission options
   – Badges
   – Altmetrics
   – Grants
1. Support existing workflow with useful services
Open Science Framework Demo Jeff Spies
Version control
- Grant applications
- IRB submission
- Creating materials
- Personal file-system
- Collaboration
- Scripting
- Data collection
- Analysis tools
- Figure creation
- Data storage
- Manuscript preparation
- Review/publishing system
- Article and material discovery
2. Enable Better Practices
- Openness
- Registration
- Replication
Open Science Framework Demo Jeff Spies
Registration
3. Nudge incentives, top-down and bottom-up
- Disclosure requirements
- Submission options
- Badges
- Altmetrics
- Grants
Psychological Science Disclosure Requirements (Jan 2014)
1. We reported the total number of observations that were excluded (if any) and the reasons for doing so.
2. We reported all independent variables or manipulations, whether successful or failed.
3. We reported all dependent variables or measures.
4. We reported how we determined our sample size.
New Journal Submission Options
- Preregistered review: review of introduction and methods, published regardless of outcome
- Focused on replications (so far)
- Already adopted by: Perspectives on Psychological Science, Social Psychology, Cortex, Frontiers in Cognition
Badges (Psychological Science, Jan 2014)
- Open Data
- Open Materials
- Preregistration
Altmetrics to incentivize openness
Open Science Framework
Potential Intervention Points
- Government policy and regulation
- Funders
- Scientific societies
- Universities/libraries
- Journals
- Community
- Individual researchers
Replications worth > 0
- No need to replicate everything
- Retain innovation, add verification
- Culture change: journals make some space
Social Psychology, Frontiers in Cognition, Perspectives on Psychological Science, Cortex
Design
– Peer review prior to data collection
– Simultaneous replications
Features
– Incentives for doing replications
– Avoid researchers' degrees of freedom
– Minimize quality drift
[Diagram: Do Research → Write Report → Publish Report → Add to Knowledge, contrasting "gate" and "open" models of publication]
Open Science Framework (OSF)
For collaboration, documentation, archiving, sharing, registration
- Respects and integrates workflow
- Replaces ad hoc archiving with a shared solution
- Merges private and public workflows
- Connects infrastructure
- Integrates top-down and bottom-up solutions
- Incentivizes openness
Incentives for openness
Bottom-up – slow, but comprehensive
– Researcher's workflow becomes part of the public record
– Methods of reputation building with openness
Top-down – fast, but narrow
– Funders: require disclosure from grantees
– Journals: badges for openness, registration, disclosure
Academic Steroids Help Manufacture Beauty
1. Selectively report successes, dismiss failures as pilots
2. Multiple manipulations and measures, report subset
3. Multiple analysis strategies, report subset
4. Data peeking
5. No direct replication
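"Data peeking" (optional stopping) is the most quantifiable of these practices. The sketch below — an illustration, not from the talk, with batch sizes and simulation counts chosen arbitrarily — simulates a researcher who tests after every batch of data and stops as soon as p < .05, even though the true effect is exactly zero:

```python
import random
from statistics import NormalDist

def z_pvalue(xs) -> float:
    """Two-sided one-sample z-test p-value, assuming known sigma = 1."""
    n = len(xs)
    z = (sum(xs) / n) * (n ** 0.5)
    return 2.0 * (1.0 - NormalDist().cdf(abs(z)))

def false_positive_rate(sims=4000, peeks=(10, 20, 30, 40, 50), seed=42) -> float:
    """Fraction of null-effect 'studies' that reach p < .05 when the
    researcher peeks after each batch and stops at the first success."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(sims):
        data = []
        for n in peeks:
            # Collect data up to the next peek point (true effect = 0).
            data.extend(rng.gauss(0.0, 1.0) for _ in range(n - len(data)))
            if z_pvalue(data) < 0.05:
                hits += 1
                break  # stop and "publish" at the first significant peek
    return hits / sims

print(false_positive_rate())  # noticeably above the nominal 5% error rate
```

Five peeks are enough to roughly double or triple the nominal 5% false-positive rate; preregistering the sample size (or using a proper sequential design) removes this inflation.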
Registration
- Constrains researchers' degrees of freedom
- Clarifies the distinction between confirmatory and exploratory research
- Ironically, will increase the valuation of exploratory research
#4: Registration
Existing solutions are:
– Discipline-specific
– Appended to the workflow
– Constrained to a point in time
– Content-restrictive
OSF registration is none of those. Instead:
– Templating
– Integrated in the workflow
– Any time, any frequency
– Registers everything
Cohen (1994)
"David Bakan said back in 1966 that his claim that 'a great deal of mischief has been associated' with the test of significance 'is hardly original,' that it is 'what everybody knows' .... If it was hardly original in 1966, it can hardly be original now."
Open Science Framework (OSF)
For collaboration, documentation, archiving, sharing, registration
- Respects and integrates workflow
- Incentivizes openness
- Replaces ad hoc archiving with a shared solution
- Merges private and public workflows
- Integrates top-down and bottom-up solutions
- Breaks disciplinary silos
- Connects infrastructure
[Diagram: research cycle — Design a study → Conduct a study → Analyze data → Draft article → Submit for publication → Peer review → Article published → New research inspired]
Supporting tools: Grant application system; IRB submission system; Creation of materials; Personal file-system; Collaborator sharing; Data collection system; Analysis tools; Figure creation; Data storage; Manuscript preparation; Review/publishing system; Article and material discovery