Practical Steps for Increasing Openness and Reproducibility

Presentation transcript:

Practical Steps for Increasing Openness and Reproducibility Courtney Soderberg Statistical and Methodological Consultant Center for Open Science

Infrastructure Metascience Community

Unfortunately, it has become apparent over the last few years that perhaps the answer to that question is not all that much. Now, there have been some very prominent cases in the past few years of outright fraud, where people have completely fabricated their data, but I’m not talking about those cases. What I’m talking about is the general sense that many scientific findings in a wide variety of fields don’t replicate, and that the published literature has a very high rate of false positives in it. So if a large proportion of our published results aren’t replicable, and are potential false positives, are we actually accumulating knowledge about real phenomena? I would suggest that the answer to this question is ‘no’, or at least that we haven’t accumulated as much knowledge as we would like to believe.

What is reproducibility?
Computational reproducibility: if we took your data and code/analysis scripts and reran them, we could reproduce the numbers/graphs in your paper.
Methods reproducibility: we have enough information to rerun the experiment or survey the way it was originally conducted.
Results reproducibility/replicability: we use your exact methods and analyses, but collect new data, and we get the same statistical conclusion.
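Computational reproducibility, at its simplest, means one deterministic script regenerates every number in the paper from the archived data. A minimal sketch (the seed, data, and statistic below are hypothetical stand-ins, not from any particular study):

```python
# Sketch of a computationally reproducible analysis: a fixed seed and a
# single script mean anyone can regenerate the reported number exactly.
import random
import statistics

def run_analysis(seed=2016):
    """Simulate loading archived data and producing a reported statistic."""
    rng = random.Random(seed)  # fixed seed: reruns are bit-identical
    data = [rng.gauss(100, 15) for _ in range(50)]  # stand-in for archived data
    return round(statistics.mean(data), 2)          # the "number in the paper"

# Rerunning the script reproduces the same value every time.
assert run_analysis() == run_analysis()
```

The design point is that the reported value is a function of the archived inputs alone, with no hand-edited intermediate steps.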

There’s more to it than sharing discrete objects. Think about using this as an opportunity to increase transparency by capturing the entire workflow, and to do so while connecting the tools and services that make up the parts of the workflow, not requiring people to change all of their practices at once, and providing immediate efficiencies and value to researchers as they comply with requirements. Easy, right? Obviously not.

Why should you care? Increases efficiency of your own work Can reduce false leads Data sharing citation advantage “It takes some effort to organize your research to be reproducible…the principal beneficiary is generally the author herself” – Schwab & Claerbout

Current Barriers
Statistical: ignoring null results; researcher degrees of freedom
Transparency: poor documentation; lack of openness

Steps Create a structured workspace

Research Management Planning What are you going to store? Where and how are you going to store it? Who will have access to it? When will they have access to it?

Research Management Planning What are you going to store? Where and how are you going to store it? Who will have access to it? When will they have access to it? PLAN AHEAD Checklists and common structures are your best friend!
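Checklists and common structures can be automated: a tiny script that lays down the same project skeleton for every study keeps an entire lab consistent. The folder names below are one plausible convention, not a COS-prescribed layout:

```python
# Sketch: create an identical project skeleton for every study, so
# collaborators always know where data, code, and materials live.
# The folder names are an illustrative convention, not a standard.
from pathlib import Path

SKELETON = [
    "data/raw",        # original data, never edited by hand
    "data/processed",  # produced only by scripts in code/
    "code",
    "materials",
    "docs",
]

def create_project(root):
    """Build the standard folder layout under `root`; return its contents."""
    root = Path(root)
    for sub in SKELETON:
        (root / sub).mkdir(parents=True, exist_ok=True)
    (root / "README.txt").write_text("Who collected what, when, and how.\n")
    return sorted(p.relative_to(root).as_posix() for p in root.rglob("*"))
```

A one-line `create_project("study-03")` at the start of each study is the "plan ahead" step made mechanical.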

Ledgerwood Lab Experimental Archive https://ucdavis.app.box.com/s/f8hn7rqtwwf6aa6hjtkthdbiuehup312

Research Management Planning Could do this internally on a lab server or personal computer, but: it makes eventual sharing more work; it is unclear how stable/accessible that will be in the long run; and cross-lab/institution collaborations are harder.

Open Science Framework http://osf.io free, open source

Steps Create a structured workspace Create a research plan Pre-registration

Pre-registration Before conducting a study registering: The what of the study: General information about what you are investigating and how Research question Population and sample size General design Variables you’ll be collecting, or dataset you’ll be using
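The "what" of a study can even be captured as a machine-checkable record before any data are collected. The field names below are illustrative, not an official OSF registration template:

```python
# Sketch: a preregistration as structured data, with a completeness check
# run before data collection. Field names are illustrative only.
REQUIRED_FIELDS = [
    "research_question", "population", "sample_size",
    "design", "variables",
]

def missing_fields(prereg):
    """Return the required fields that are absent or left blank."""
    return [f for f in REQUIRED_FIELDS if not prereg.get(f)]

prereg = {
    "research_question": "Does X increase Y?",
    "population": "Adults aged 18-65, online panel",
    "sample_size": 200,
    "design": "2-group between-subjects experiment",
    "variables": ["X (manipulated)", "Y (5-item scale)"],
}

assert missing_fields(prereg) == []           # complete: ready to register
assert missing_fields({}) == REQUIRED_FIELDS  # nothing specified yet
```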

Pre-registration Study pre-registration decreases file-drawer effects Helps with discovery of unpublished, usually null findings

Figure 1. Positive Results by Discipline. Fanelli D (2010) “Positive” Results Increase Down the Hierarchy of the Sciences. PLoS ONE 5(4): e10068. doi:10.1371/journal.pone.0010068


Steps Create a structured workspace Create a research plan Pre-registration Pre-analysis plan for confirmatory research

Pre-analysis plan Like a pre-registration: detail the analyses planned for confirmatory hypothesis testing. Decreases researcher degrees of freedom.

Researcher Degrees of Freedom All data processing and analytical choices made after seeing and interacting with your data Should I collect more data? Which observations should I exclude? Which conditions should I compare? What should be my main DV? Should I look for an interaction effect?

False positive inflation This is also assuming your initial false-positive rate was .05, which may not be true given that we often work with somewhat unreliable measures and low-powered studies, and don’t often treat stimuli as random factors, all of which can also increase false-positive rates. Now you may be saying to yourself, well, people don’t really do this, do they? Simmons, Nelson, & Simonsohn (2012)
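The inflation Simmons and colleagues describe is easy to simulate: run a null experiment, but grant yourself one degree of freedom, say, a second DV you can fall back on, and count how often at least one test comes out "significant". A rough sketch (a two-sample z-test as a stand-in for the usual t-test; all parameters are illustrative):

```python
# Sketch: one researcher degree of freedom (peeking at a second DV)
# inflates the false-positive rate above the nominal .05,
# even though there is no true effect anywhere.
import math
import random

def p_value(a, b):
    """Two-sided p for a difference in means (normal approximation)."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    va = sum((x - ma) ** 2 for x in a) / (n - 1)
    vb = sum((x - mb) ** 2 for x in b) / (n - 1)
    z = (ma - mb) / math.sqrt(va / n + vb / n)
    return math.erfc(abs(z) / math.sqrt(2))

rng = random.Random(0)

def simulate(extra_dv, sims=2000, n=30):
    """False-positive rate under the null, with or without a backup DV."""
    hits = 0
    for _ in range(sims):
        group_a = [rng.gauss(0, 1) for _ in range(n)]
        group_b = [rng.gauss(0, 1) for _ in range(n)]  # null: no effect
        sig = p_value(group_a, group_b) < .05
        if extra_dv:  # flexibility: also check an independent second outcome
            dv2_a = [rng.gauss(0, 1) for _ in range(n)]
            dv2_b = [rng.gauss(0, 1) for _ in range(n)]
            sig = sig or p_value(dv2_a, dv2_b) < .05
        hits += sig
    return hits / sims

one_dv = simulate(extra_dv=False)
two_dvs = simulate(extra_dv=True)
assert two_dvs > one_dv  # the extra "choice" buys extra false positives
```

With more such choices (optional stopping, flexible exclusions, covariates), the rates compound, which is the point of the slide.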

Solution: Pre-registered analyses Before data are collected, specify: sample size; data processing and cleaning procedures; exclusion criteria; statistical analyses. Registered in a read-only format so it can’t be changed. Decreases RDF, so p-values are more face valid. Registration holds you accountable to yourself and to others; similar to the model used in clinical trials.

Exploratory vs. Confirmatory Analyses Exploratory: interested in exploring possible patterns/relationships in data to develop hypotheses. Confirmatory: have a specific hypothesis you want to test. Pre-registration of analyses clarifies which are exploratory and which are confirmatory.

Steps Create a structured workspace Create a research plan Pre-registration Pre-analysis plan for confirmatory research Archive materials from study

Steps Create a structured workspace Create a research plan Pre-registration Pre-analysis plan for confirmatory research Archive materials from study Analyze and document analyses

Steps Create a structured workspace Create a research plan Pre-registration Pre-analysis plan for confirmatory research Archive materials from study Analyze and document analyses Share study data, code, materials

What to share Sharing is a continuum Data underlying just results reported in a paper Data underlying publication + information about other variables collected Data underlying publication + embargo on full dataset All data collected for that study

Why you might want to share Journal/Funder mandates Increase impact of work Recognition of good research practices

Signals: Making Behaviors Visible Promotes Adoption Badges Open Data Open Materials Preregistration Psychological Science (Jan 2014)

Also have an initiative to get journals to adopt review of pre-registered reports of publications, so not even publication decisions are data dependent, but are made instead on the soundness of theory, study design, and analyses.

The $1,000,000 Preregistration Challenge Endorse TOP Guidelines Badges for Open Practices Registered Reports Another incentive for researchers to try out preregistration.

https://cos.io/prereg

Registered Reports Design Collect & Analyze Report Publish PEER REVIEW Review of intro and methods prior to data collection; published regardless of outcome Beauty vs. accuracy of reporting Publishing negative results Conducting replications Peer review focuses on quality of methods


Registered Reports 43 journals so far Adv in Mthds & Practices in Psyc Sci AIMS Neuroscience Attention, Percept., & Psychophys Cognition and Emotion Cognitive Research Comp. Results in Social Psychology Cortex Drug and Alcohol Dependence eLife Euro Journal of Neuroscience Experimental Psychology Human Movement Science Infancy Int’l Journal of Psychophysiology Journal of Accounting Research Journal of Business and Psychology Journal of Cogn. Enhancement Journal of Euro. Psych. Students Journal of Expt’l Political Science Journal of Personnel Psychology Journal of Media Psychology Leadership Quarterly Management & Org Review Nature Human Behaviour Nicotine and Tobacco Research NFS Journal Nutrition and Food Science Journal Perspectives on Psych. Science Royal Society Open Science Social Psychology Stress and Health Work, Aging, and Retirement Review of intro and methods prior to data collection; published regardless of outcome Beauty vs. accuracy of reporting Publishing negative results Conducting replications Peer review focuses on quality of methods https://cos.io/rr/, Committee Chair: Chris Chambers

Steps Create a structured workspace Create a research plan Pre-registration Pre-analysis plan for confirmatory research Archive materials from study Analyze and document analyses Share study data, code, materials

Feeling overwhelmed? Start small Increase step by step Don’t have to do all pieces at once

How to make this more efficient? Have conversations with collaborators early What is our data management plan? What/when will we share? Be consistent across studies If an entire lab has the same structure, then it’s easier to find things Document from the beginning

Where to get help:
Reproducible research practices? stats-consulting@cos.io
The OSF? support@cos.io
Have feedback for how we could support you more? contact@cos.io or feedback@cos.io

Registered Reports Committee COS helps coordinate a committee of editors and others interested in the format, led by Chris Chambers (Cardiff University). The RR committee maintains best practices for RRs and tracks use of RRs by journals. Variations include an open submission format for any preregistered research, RRs focused on replication with open calls for participating labs, and special issues focused on RRs. The committee also provides resources to the community: guidelines and templates for editors, and comparisons of features by journals for authors and researchers. Goal: easy adoption.

Early Adopters AIMS Neuroscience Attention, Perception & Psychophysics Comparative Political Studies Comprehensive Results in Social Psychology Cortex Drug and Alcohol Dependence eLife Experimental Psychology Frontiers in Cognition Journal of Business and Psychology Nutrition and Food Science Journal Perspectives on Psychological Science Social Psychology Work, Aging, and Retirement


Data Availability in Psychological Science 10x increase from 2013 to 2015

Connecting silos to enable increased discovery

http://osf.io/share

Localized connections are now possible with the new OSF API. API Docs: https://api.osf.io/v2/docs/
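The OSF API v2 follows the JSON:API convention, with results in a top-level "data" list and fields under "attributes". A hedged sketch of consuming such a response (the payload below is a handmade stand-in shaped like a JSON:API reply, not live output from api.osf.io):

```python
# Sketch: pulling project titles out of an OSF-API-v2-style response.
# The payload is a handmade JSON:API-shaped stand-in, not real API output.
import json

sample_response = json.loads("""
{
  "data": [
    {"id": "abc12", "type": "nodes",
     "attributes": {"title": "Replication Study", "public": true}},
    {"id": "def34", "type": "nodes",
     "attributes": {"title": "Pilot Materials", "public": false}}
  ]
}
""")

def public_titles(response):
    """Titles of nodes flagged public in a JSON:API-style payload."""
    return [node["attributes"]["title"]
            for node in response["data"]
            if node["attributes"].get("public")]

assert public_titles(sample_response) == ["Replication Study"]
```

In practice the payload would come from an HTTP GET against the documented endpoints; see the API docs for the actual routes and fields.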

http://osf.io/ free, open source Share data, share materials, show the research process. If a result is confirmatory, make it clear; if a discovery is exploratory, make it clear. Demonstrate the ingenuity, perspiration, and learning across false starts, errant procedures, and early hints. It doesn’t have to be written in painstaking detail in the final report; just make it available.

Put data, materials, and code on the OSF

Manage access and permissions

Automate versioning with hashes
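Content hashes are what make that versioning trustworthy: if a single byte of a data file changes, its digest changes, so a recorded hash pins down exactly which version of the data an analysis used. A small sketch using SHA-256 (the chunked-read pattern is a generic idiom, not OSF's internal implementation):

```python
# Sketch: fingerprint a file's contents so any change to the data
# produces a different version identifier.
import hashlib

def file_digest(path, chunk_size=1 << 16):
    """SHA-256 hex digest of a file, read in chunks to bound memory use."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()
```

Storing `file_digest("data/raw/survey.csv")` alongside a preregistration or analysis log lets anyone verify later that they are rerunning the analysis on the identical bytes.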

Share your work

OSF Now Extend beyond core features by connecting to other tools and services in the workflow. This allows for more incremental change while making significant gains in automation, efficiency, and reproducibility. Immediate value from this reinforces more changes.

OSF Soon Authoring; new storage options, some that can run under your desk; analytic workflow; lab notebook; data management plans; institutional platforms like VIVO; publishing platforms like OJS and Ubiquity Press; OpenSesame. 29 grants to develop open tools and services: https://cos.io/pr/2015-09-24/

Get a persistent identifier https://osf.io/tvyxz/

See the impact File downloads

