David Mellor, osf.io/qthsf, @EvoMellor. Building infrastructure to connect, preserve, speed up, and improve scholarship.

Presentation transcript:

David Mellor, osf.io/qthsf, @EvoMellor. Building infrastructure to connect, preserve, speed up, and improve scholarship. Center for Open Science, cos.io, @OSFramework

Mission: To increase the openness, integrity, and reproducibility of scholarship. Strategy: evidence to encourage change, incentives to embrace change, infrastructure to enable change. Improving the scientific ecosystem.

Infrastructure Metascience Community

The combination of a strong bias toward statistically significant findings and flexibility in data analysis results in irreproducible research. Fanelli D (2010) "Positive" Results Increase Down the Hierarchy of the Sciences. PLoS ONE 5(4): e10068. doi:10.1371/journal.pone.0010068

The Garden of Forking Paths: the combination of a strong bias toward statistically significant findings and flexibility in data analysis results in irreproducible research. Control for time? Exclude outliers? Median or mean? "Does X affect Y?" (Gelman and Loken, 2013)
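The forking-paths problem can be made concrete with a small simulation (not part of the talk; the analysis "paths" below are invented for illustration): on pure noise, reporting the best result across several defensible analysis choices produces "significant" findings more often than the nominal 5% rate.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def forked_p(x, y):
    """Smallest p-value across several defensible analysis choices."""
    ps = [stats.ttest_ind(x, y).pvalue]                      # plain t-test
    xc = x[np.abs(x - x.mean()) < 2 * x.std()]               # exclude outliers?
    yc = y[np.abs(y - y.mean()) < 2 * y.std()]
    ps.append(stats.ttest_ind(xc, yc).pvalue)
    ps.append(stats.mannwhitneyu(x, y,                       # rank-based instead of means?
                                 alternative="two-sided").pvalue)
    return min(ps)

n_sims, n = 2000, 30
single = forked = 0
for _ in range(n_sims):
    x, y = rng.normal(size=n), rng.normal(size=n)            # true effect is zero
    single += stats.ttest_ind(x, y).pvalue < 0.05
    forked += forked_p(x, y) < 0.05

print(f"single path: {single / n_sims:.3f}, forking paths: {forked / n_sims:.3f}")
```

Because the single pre-specified path is one of the forks, the forked false-positive rate can only be higher; the gap is the cost of undisclosed flexibility.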

The combination of a strong bias toward statistically significant findings and flexibility in data analysis results in irreproducible research. p-values: 97% of original studies were "significant"; 37% of replications were "significant".

Incentives for individual success are focused on getting it published, not getting it right (Nosek, Spies, & Motyl, 2012).

Barriers: perceived norms (Anderson, Martinson, & DeVries, 2007); motivated reasoning (Kunda, 1990); minimal accountability (Lerner & Tetlock, 1999); "I am busy" (Me & You, 2016). We can understand the nature of the challenge with existing psychological theory. For example: 1. The goals and rewards of publishing are immediate and concrete; the rewards of getting it right are distal and abstract (Trope & Liberman). 2. I have beliefs, ideologies, and achievement motivations that influence how I interpret and report my research (motivated reasoning; Kunda, 1990). And even if I try to resist this motivated reasoning, I may simply be unable to detect it in myself, even when I can see those biases in others. 3. Which biases might influence me? Pick your favorite; mine in this context is the hindsight bias. 4. What's more, we face these potential biases in a context of minimal accountability: what you know of my laboratory work is only what you get in the published report. 5. Finally, even if I am prepared to accept that I have these biases and am motivated to address them so that I can get it right, I am busy. So are you. If I introduce a whole bunch of new things that I must now do to check and correct for my biases, I will kill my productivity and that of my collaborators. So the incentives lead me to think that my best course of action is just to do the best I can and hope that I'm doing it okay.

Incentives to embrace change: improving the scientific ecosystem.

cos.io/prereg

Preregistration increases credibility by specifying in advance how data will be analyzed, thus preventing biased reasoning from affecting data analysis. cos.io/prereg

Registered Reports: peer review before results are known, to reduce bias and increase rigor. cos.io/rr

Technology to enable change: improving the scientific ecosystem.

http://osf.io/ is free and open source. Share data, share materials, and show the research process: make a confirmatory result clear as confirmatory, and make an exploratory discovery clear as exploratory. Demonstrate the ingenuity, perspiration, and learning across false starts, errant procedures, and early hints. This doesn't have to be written in painstaking detail in the final report; just make it available.

Collaboration. Documentation. Archiving. A content management and collaboration system: a free service to connect, curate, and search all aspects of the research project. We don't want to repeat ourselves: Dataverse, S3, figshare, Dropbox. Service vs. application: interface with the service, build new applications. This is right in line with SHARE-NS; opening and unlocking this data allows for innovation. From day one, we've been very excited about the SHARE partnership: the community and expertise it brings to the table, as well as a shared mission. From a technical perspective, this project fits very much in line with what we are building and, more importantly, how we are building it.

Put data, materials, and code on the OSF. Quite simply, it is a file repository that allows any sort of file to be stored, and many types to be rendered in the browser without any special software. This is very important for increasing accessibility to research.

Manage access and permissions

Automate versioning with hashes
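As a sketch of what hash-based versioning buys (illustrative only, not OSF's actual implementation): hashing file contents yields a stable identifier, deduplicates identical uploads, and records a new version only when the bytes actually change.

```python
import hashlib

def content_id(data: bytes) -> str:
    """Content-addressed id: identical bytes always map to the same id."""
    return hashlib.sha256(data).hexdigest()

store: dict[str, bytes] = {}  # hash -> blob (deduplicated storage)
versions: list[str] = []      # ordered version history for one file

def save(data: bytes) -> str:
    h = content_id(data)
    store.setdefault(h, data)          # unchanged content adds no new blob
    if not versions or versions[-1] != h:
        versions.append(h)             # record a new version only on change
    return h

v1 = save(b"draft of analysis plan")
v2 = save(b"draft of analysis plan")   # re-upload of identical bytes
v3 = save(b"revised analysis plan")
```

Re-uploading identical bytes returns the same identifier and leaves the history untouched, which is what makes the versioning automatic rather than manual.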

Connects Services Researchers Use

OpenSesame

Registration

Template Forms

Share your work

Persistent Citable Identifiers

SHARE (http://share.osf.io/): Gather, Notify; Providers, Consumers. This is what SHARE does: taking metadata about research from different kinds of digital repositories, normalizing it, and providing a feed, an API that can be queried, and a database that can be searched. That was Phase I: the data-processing pipeline, the workflow for building harvesters, and the development of the schema. Phase II is cleaning, enhancing, and linking.
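The harvest-and-normalize step can be sketched as a per-source transformer that maps each repository's metadata onto one shared schema (the field names and source record shapes below are invented for illustration; this is not SHARE's actual schema):

```python
SHARED_FIELDS = {"title", "contributors", "date", "source"}

def from_dataverse(rec: dict) -> dict:
    # Hypothetical Dataverse-style record shape
    return {"title": rec["name"], "contributors": rec["authors"],
            "date": rec["publicationDate"], "source": "Dataverse"}

def from_figshare(rec: dict) -> dict:
    # Hypothetical figshare-style record shape
    return {"title": rec["title"],
            "contributors": [a["full_name"] for a in rec["authors"]],
            "date": rec["published_date"], "source": "figshare"}

TRANSFORMERS = {"dataverse": from_dataverse, "figshare": from_figshare}

def normalize(source: str, rec: dict) -> dict:
    """Map a source-specific record onto the shared schema."""
    out = TRANSFORMERS[source](rec)
    assert set(out) == SHARED_FIELDS   # every source yields the same fields
    return out
```

Adding a new provider then means writing one transformer, while every consumer keeps querying a single, uniform schema.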

The OSF is one vision of an application framework

Toolkit Ecosystem Data OSF

Community Services Interfaces Toolkit Ecosystem Data OSF

Content Experts | Schol Comm Experts | Technical Experts. Three layers (let experts be experts, reduce redundancy and cost, accelerate innovation): top = content and interfaces -> researchers care; middle = services -> schol comm innovators care; bottom = toolkit -> developers care. Toolkit, Ecosystem, Data.

http://osf.io/preprints/

osf.io/registries

Connecting, preserving, speeding up, and improving scholarship. These slides are shared at: https://osf.io/ [Take a picture!] Thank you! David@cos.io @EvoMellor