Scientific Utopia: Improving Openness and Reproducibility Brian Nosek University of Virginia Center for Open Science.



New Challenges?
– Low power
– Overabundance of positive results
– Questionable research practices
– Ignoring null results
– Lack of replication
– Problems with NHST
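The first two bullets are linked: when studies are underpowered, the minority that do reach significance systematically overestimate the true effect. A minimal simulation (illustrative only, not from the slides; the effect size and sample size are assumptions) makes this concrete:

```python
# Simulate many underpowered two-group studies with a small true effect,
# then look only at the "significant" ones -- as journals tend to.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
true_d = 0.3    # small true effect (Cohen's d)
n = 20          # per-group sample size -> low power
sims = 5000

sig_effects, n_sig = [], 0
for _ in range(sims):
    a = rng.normal(0.0, 1.0, n)
    b = rng.normal(true_d, 1.0, n)
    t, p = stats.ttest_ind(b, a)
    if p < 0.05:
        n_sig += 1
        sig_effects.append(b.mean() - a.mean())

power = n_sig / sims
print(f"power ~ {power:.2f}")  # well below the 0.80 convention
print(f"mean effect among significant results ~ {np.mean(sig_effects):.2f}")  # inflated above 0.3
```

Selective publication of the significant subset therefore produces both an overabundance of positive results and exaggerated effect sizes.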

Long-standing Challenges: Sterling (1959), Cohen (1962), Lykken (1968), Tukey (1969), Greenwald (1975), Meehl (1978), Rosenthal (1979)

Solutions (in brief)
– Disclosure, openness
– Distinguish confirmatory vs. exploratory
– Replication
– Accumulate evidence
– Narrow use of NHST

Problems ✔ Solutions ✔ Implementation ✗

Central issue: incentives for individual success are focused on getting it published, not getting it right (Nosek, Spies, & Motyl, 2012, PPS)

Implementation Challenges
– Perceived norms (Anderson, Martinson, & DeVries, 2007)
– Temporal construal (Trope & Liberman, 2003)
– Motivated reasoning (Kunda, 1990)
– Minimal accountability (Lerner & Tetlock, 1999)
– I am busy (Me & You, 2013)

(Brief) History of COS
– January: Incorporated as a non-profit
– March: $5.25 million grant
– April: 2nd grant
– May: 6 staff + 4 interns hired
– Today: Occupying offices in Charlottesville, Virginia

Mission Improve Openness, Integrity, and Reproducibility of Scientific Research

Implementation Strategy
1. Support existing workflow with useful services
2. Enable good practices
– Openness
– Preregistration
– Replication
3. Nudge incentives with both top-down and bottom-up interventions
– Disclosure requirements
– Submission options
– Badges
– Altmetrics
– Grants

1. Support existing workflow with useful services

Open Science Framework Demo Jeff Spies

Version control
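A hypothetical sketch of what this slide's feature buys a researcher: the OSF versions project files automatically, but the same idea can be shown with plain git (the file name, commit messages, and identity below are placeholders for the demo):

```shell
# Track revisions of an analysis plan so every change is preserved and attributable.
cd "$(mktemp -d)"
git init -q
git config user.email "demo@example.com"   # placeholder identity for the demo
git config user.name "Demo Researcher"

echo "planned n = 50 per condition" > analysis_plan.txt
git add analysis_plan.txt
git commit -qm "Initial analysis plan"

echo "planned n = 100 per condition" > analysis_plan.txt
git commit -aqm "Revise planned sample size"

git log --oneline   # both revisions of the plan remain in the history
```

The point of building this into the research workflow is that the record accrues as a side effect of normal work, rather than requiring a separate archiving step.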

– Grant applications
– IRB submission
– Creating materials
– Personal file-system
– Collaboration
– Scripting
– Data collection
– Analysis tools
– Figure creation
– Data storage
– Manuscript preparation
– Review/publishing system
– Article and material discovery

2. Enable Better Practices
– Openness
– Registration
– Replication

Open Science Framework Demo Jeff Spies

Registration

3. Nudge incentives, top-down and bottom-up
– Disclosure requirements
– Submission options
– Badges
– Altmetrics
– Grants

Psychological Science Disclosure Requirements (Jan 2014)
1. We reported the total number of observations that were excluded (if any) and the reasons for doing so.
2. We reported all independent variables or manipulations, whether successful or failed.
3. We reported all dependent variables or measures.
4. We reported how we determined our sample size.

New Journal Submission Options
– Preregistered review: review of introduction and methods, published regardless of outcome
– Focused on replications (so far)
– Already adopted: Perspectives on Psychological Science, Social Psychology, Cortex, Frontiers in Cognition

Badges (Psychological Science, Jan 2014)
– Open Data
– Open Materials
– Preregistration

Altmetrics to incentivize openness Open Science Framework

POTENTIAL INTERVENTION POINTS
– Government policy and regulation
– Funders
– Scientific societies
– Universities/libraries
– Journals
– Community
– Individual researchers

Replications worth > 0
– No need to replicate everything
– Retain innovation, add verification
– Culture change: journals make some space

Social Psychology, Frontiers in Cognition, Perspectives on Psychological Science, Cortex
Design
– Peer review prior to data collection
– Simultaneous replications
Features
– Incentives for doing replications
– Avoid researchers' degrees of freedom
– Minimize quality drift

Do Research → Write Report → Publish Report → Add to Knowledge
GATE vs. OPEN

Scientists' Tools

Open Science Framework (OSF)
– For collaboration, documentation, archiving, sharing, registration
– Respects and integrates workflow
– Replaces ad hoc archiving with a shared solution
– Merges private and public workflows
– Connects infrastructure
– Integrates top-down and bottom-up solutions
– Incentivizes openness

Incentives for openness
Bottom-up (slow, but comprehensive)
– Researcher's workflow becomes part of the public record
– Methods of reputation building with openness
Top-down (fast, but narrow)
– Funders: require disclosure of grantees
– Journals: badges for openness, registration, disclosure

ACADEMIC STEROIDS HELP MANUFACTURE BEAUTY
1. Selectively report successes, dismiss failures as pilots
2. Multiple manipulations and measures, report a subset
3. Multiple analysis strategies, report a subset
4. Data peeking
5. No direct replication
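Item 4 is easy to demonstrate. A minimal simulation (illustrative only; the peek schedule and sample sizes are assumptions): test repeatedly as data accumulate and stop at the first p < .05, and the false positive rate climbs well above the nominal 5% even when the null is exactly true.

```python
# "Data peeking": check for significance every 10 observations and stop
# (and "report") at the first significant result, under a true null.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
sims, start, step, n_max = 2000, 10, 10, 100

false_positives = 0
for _ in range(sims):
    a = rng.normal(size=n_max)   # both groups drawn from the SAME distribution
    b = rng.normal(size=n_max)
    for n in range(start, n_max + 1, step):
        _, p = stats.ttest_ind(a[:n], b[:n])
        if p < 0.05:             # stop at the first significant peek
            false_positives += 1
            break

rate = false_positives / sims
print(f"false positive rate with peeking ~ {rate:.2f}")  # well above the nominal 0.05
```

Preregistering the sample size and stopping rule removes exactly this degree of freedom.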

Registration
– Constrain researchers' degrees of freedom
– Clarify the distinction between confirmatory and exploratory research
– Ironically, will increase the valuation of exploratory research

#4: Registration
Existing solutions are:
– Discipline-specific
– Appended to the workflow
– Constrained to a point in time
– Content-restrictive
OSF registration is none of those. Instead:
– Templating
– Integrated in the workflow
– Any time, any frequency
– Registers everything

Cohen (1994): “David Bakan said back in 1966 that his claim that ‘a great deal of mischief has been associated’ with the test of significance ‘is hardly original,’ that it is ‘what everybody knows.’ …. If it was hardly original in 1966, it can hardly be original now.”

Open Science Framework (OSF)
– For collaboration, documentation, archiving, sharing, registration
– Respects and integrates workflow
– Incentivizes openness
– Replaces ad hoc archiving with a shared solution
– Merges private and public workflows
– Integrates top-down and bottom-up solutions
– Breaks disciplinary silos
– Connects infrastructure

Research lifecycle: Design a study → Conduct a study → Analyze data → Draft article → Submit for publication → Peer review → Article published → New research inspired
Supporting systems
– Grant application system
– IRB submission system
– Creation of materials
– Personal file-system
– Collaborator sharing
– Data collection system
– Analysis tools
– Figure creation
– Data storage
– Manuscript preparation
– Review/publishing system
– Article and material discovery