1
Achieving Open Science
Brian Nosek, University of Virginia -- Center for Open Science. Shifting the scholarly culture toward open access, open data, and open workflows is partly an incentives problem, partly an infrastructure problem, and partly a coordination problem. The Center for Open Science (COS; https://cos.io) is a non-profit technology and culture-change organization working on all three. The central components of COS's strategy are [1] fostering a commercial environment that monetizes service delivery rather than controlling access to content, [2] providing free, open, public-goods infrastructure that scholarly communities brand and operate according to their local norms, and [3] coordinating across disciplinary and stakeholder silos to align scholarly practices with scholarly values.
2
Changing scientific culture is a coordination problem
The scholarly ecosystem: universities, publishing, funders, societies
3
Technology to enable open
Training to enact open
Incentives to embrace open
Improving scientific ecosystem
4
Open Science Framework
OpenSesame
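The Open Science Framework (OSF) is the public infrastructure piece of this strategy, and it exposes a public REST API. The sketch below, which is an illustration not taken from the slides, queries the OSF v2 /nodes/ endpoint to list a few public projects; the endpoint, pagination parameter, and response fields are assumed from the OSF API documentation and worth verifying against the current version.

```python
# A minimal sketch of querying the OSF public API for public projects.
# Assumes the OSF v2 REST API at https://api.osf.io/v2/ (JSON:API format);
# check endpoint names and response fields against the current docs.
import requests


def list_public_projects(limit=5):
    """Fetch a few public OSF nodes and print their titles and creation dates."""
    resp = requests.get(
        "https://api.osf.io/v2/nodes/",
        params={"page[size]": limit},  # JSON:API-style pagination
        timeout=30,
    )
    resp.raise_for_status()
    payload = resp.json()
    for node in payload.get("data", []):
        attrs = node.get("attributes", {})
        print(attrs.get("title"), "-", attrs.get("date_created"))


if __name__ == "__main__":
    list_public_projects()
```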
5
Munafò et al., 2017, Nature Human Behaviour
6
Example: Preregistration Challenge http://cos.io/prereg
Training example: scholarly paper, guided workflow, and available support (help guides, consulting).
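To make the guided workflow concrete, here is a hypothetical sketch of the kinds of commitments a preregistration records before data collection. The field names are illustrative only and are not the actual Preregistration Challenge template.

```python
# A hypothetical sketch of fields a preregistration might record before data
# collection; names and structure are illustrative, not the official template.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Preregistration:
    title: str
    hypotheses: List[str]            # stated before any data are seen
    sample_size: int                 # planned N, justified in advance
    stopping_rule: str               # when data collection ends
    primary_outcome: str
    analysis_plan: str               # statistical tests specified in advance
    exclusion_criteria: List[str] = field(default_factory=list)


plan = Preregistration(
    title="Example study",
    hypotheses=["Condition A outperforms condition B on task accuracy"],
    sample_size=200,
    stopping_rule="Stop after 200 complete responses",
    primary_outcome="Task accuracy",
    analysis_plan="Two-sample t-test, alpha = .05, two-tailed",
    exclusion_criteria=["Incomplete sessions"],
)
print(plan.title, "-", len(plan.hypotheses), "preregistered hypothesis(es)")
```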
7
Incentives across the ecosystem: universities, publishing, funders, societies
8
Incentives for individual success are focused on getting it published, not getting it right
Nosek, Spies, & Motyl, 2012
9
Norms vs. counternorms:
Communality (open sharing with colleagues) vs. secrecy (closed).
Universalism (research evaluated only on its merit) vs. particularism (research evaluated by reputation and past productivity).
Disinterestedness (motivated by knowledge and discovery, not personal gain) vs. self-interestedness (treating science as a competition with other scientists).
Organized skepticism (consider all new evidence, theory, and data, even if it contradicts one's prior work or point of view) vs. organized dogmatism (investing one's career in promoting one's own theories, findings, and innovations).
Quality (seek quality contributions) vs. quantity (seek high volume).
10
University incentives: hiring, promotion, and tenure
11
Journal and Funder Incentives
Badges, standards, and innovation in publishing
12
Signals: Making Behaviors Visible Promotes Adoption
Badges for Open Data, Open Materials, and Preregistration. Introduced by Psychological Science (January 2014); 18 adopting journals. Kidwell et al., 2016.
13
[Figure: percentage of articles reporting data available in a repository, y-axis 0–40%]
15
TOP Guidelines
Data citation
Design transparency
Research materials transparency
Data transparency
Analytic methods (code) transparency
Preregistration of studies
Preregistration of analysis plans
Replication
16
Some TOP Signatory Organizations
AAAS/Science, American Academy of Neurology, American Geophysical Union, American Heart Association, American Meteorological Society, American Society for Cell Biology, Association for Psychological Science, Association for Research in Personality, Association of Research Libraries, Behavioral Science and Policy Association, BioMed Central, Committee on Publication Ethics, Electrochemical Society, Frontiers, MDPI, PeerJ, Pensoft Publishers, Public Library of Science, The Royal Society, Springer Nature, Society for Personality and Social Psychology, Society for a Science of Clinical Psychology, Ubiquity Press, Wiley
17
Registered Reports https://cos.io/rr/
Workflow: design → peer review → collect & analyze → report → publish. Review of the introduction and methods occurs prior to data collection, and accepted papers are published regardless of outcome. Addresses: beauty vs. accuracy of reporting, publishing negative results, conducting replications, and peer review that focuses on the quality of methods. Committee chair: Chris Chambers; 49 adopting journals as of February 2017.
18
Society incentives: norms vs. counternorms (see the list above)
19
Anderson, Martinson, & De Vries, 2007
20
The Kindergartener’s Guide to Improving Research
21
1. Show your work
2. Share
22
Technology to enable open
Training to enact open
Incentives to embrace open
Metascience to evaluate open
Improving scientific ecosystem
23
COS Strategic Plan: https://osf.io/x2w9h. These slides.
My general substantive interest is in the gap between values and practices. The work I am discussing today is a practical application of that interest to the gap between scientific values and practices: in particular, the challenges I face when trying to advance scientific knowledge and my career at the same time, and how my scientific practices can be adapted to meet my scientific values.