OSG Rob Gardner • University of Chicago

Presentation transcript:

OSG
Advanced Cyberinfrastructure Research & Education Facilitators Virtual Residency, Oklahoma University, June 4, 2015
Rob Gardner • University of Chicago
OSG User Support and Campus Grids area coordinator
rwg@uchicago.edu

Open Science Grid: an HTC supercomputer, built as an aggregate of individual clusters totaling more than 100k cores.

Open Science Grid: HTC supercomputer, 2014 stats
- 67% the size of XD, 35% of Blue Waters
- 2.5 million CPU hours/day, 800M hours/year, 125M hours/year provided opportunistically
- >1 petabyte of data/day
- 50+ research groups, thousands of users
- XD service provider for XSEDE
Source: Rudi Eigenmann, Program Director, Division of Advanced Cyberinfrastructure (ACI), NSF CISE; CASC Meeting, April 1, 2015

Science that has benefited
- Biological & medical sciences
- Nuclear theory
- Chemistry
- Molecular dynamics
- Materials
- Genetics
- Drug studies

OSG virtual organization computation by site: 125 million opportunistic hours provided to individual projects.

Is HTC for you?
- Does your science workflow involve computations that can be split into many independent jobs?
- Can individual jobs be run on a single processor (as opposed to a single large-scale MPI job simultaneously utilizing many processors)?
- Can your applications be interrupted and restarted at a later time, or on another processor or computing site?
- Is your application software “portable”?
- How does computing fit into your science workflow? Are results needed immediately after submission?
- Do your computations require access to, or produce, huge amounts of data?
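To make the first question concrete, here is a minimal sketch of what "many independent jobs" looks like in practice. The program name (analyze) and the input/output naming (input_0.dat, input_1.dat, ...) are hypothetical; the point is that each task reads one input, writes one output, and shares no state with the others, so tasks can run in any order, on any site, and be rerun if interrupted.

#!/bin/bash
# run_task.sh -- process exactly one input file, identified by an index argument
# (names are illustrative; "analyze" stands in for your own application)
set -e
IDX=$1                                   # task index supplied by the scheduler
./analyze input_${IDX}.dat > output_${IDX}.txt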

OSG Connect service: OSG as a campus cluster
- Login host
- Job scheduler
- Software
- Storage
The same four elements a campus cluster provides, but backed by the distributed resources of the OSG.
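The entry point is the login host. A hedged sketch of the first step, assuming an OSG Connect account; the hostname below reflects the service as described around 2015 and should be confirmed against current documentation:

$ ssh yourusername@login.osgconnect.net   # hostname is an assumption; check the OSG Connect docs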

Submit jobs to OSG with HTCondor
- Simple HTCondor submission; complexity hidden from the user
- No grid certificates required
- Uses HTCondor ClassAd and glidein technology
- DAGMan and other workflow tools
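A minimal sketch of a simple HTCondor submission from the login host, reusing the hypothetical run_task.sh wrapper above and assuming 100 numbered input files; file names, resource requests, and the job count are illustrative, not prescriptive:

# Write a minimal HTCondor submit description
# (the quoted heredoc keeps $(Process) literal so HTCondor, not the shell, expands it)
cat > analyze.sub <<'EOF'
executable           = run_task.sh
arguments            = $(Process)
transfer_input_files = analyze, input_$(Process).dat
output               = job_$(Process).out
error                = job_$(Process).err
log                  = analyze.log
request_memory       = 1GB
queue 100
EOF

# Submit all 100 independent jobs with a single command
condor_submit analyze.sub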

Software & tools on the OSG
- Distributed software file system
- Special module command; identical software on all clusters:
  $ switchmodules oasis
  $ module avail
  $ module load R
  $ module load namd
  $ module purge
  $ switchmodules local
- Common tools & libs, curated on demand, continuously
- HTC apps in the XSEDE campus bridging yum repo
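In batch jobs, the same module setup typically goes inside the job's wrapper script rather than being typed interactively, so every worker node configures identical software before running the payload. A sketch, assuming a hypothetical R script analysis.R as the payload:

#!/bin/bash
# job_wrapper.sh -- select the distributed software catalog, load what the job needs, run the payload
# (depending on the site, you may first need to source the environment setup that provides module/switchmodules)
switchmodules oasis          # use the distributed software file system's module catalog
module load R                # the same R build is available on every connected cluster
Rscript analysis.R "$@"      # hypothetical payload script; arguments passed through from the submit file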

Storage service: “Stash”
- Provides quasi-transient storage for job input/output data
- POSIX access from the login host
- Globus Online server for managed transfers from campus data services
- Personalized HTTP service endpoint
- Handles data sets up to 10 TB/user
- Connected to a 100 Gbps Science DMZ (Internet2, ESnet)

Easy, reliable upload of data to Stash for processing on OSG, and download of output data back to campus.
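A hedged sketch of moving data in and out of Stash with standard tools from a campus machine. Both the login hostname and the stash path are assumptions based on the slide's description of POSIX access on the login host; for large or many-file transfers, the Globus Online endpoint mentioned above is the better fit. Confirm paths and hostnames with current OSG Connect documentation:

# Upload an input data set from campus to Stash (hostname and paths are illustrative)
scp -r project_inputs/ yourusername@login.osgconnect.net:~/stash/project_inputs/

# After the jobs finish, pull the outputs back to campus
scp -r yourusername@login.osgconnect.net:~/stash/project_outputs/ ./results/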

Quotable: Searching for Symmetry Orbits & Rainbow Cliques in Group Theory

From: Anton Betten <betten@math.colostate.edu>
Date: Wed, Sep 3, 2014 at 10:47 AM
Subject: OSG runs

Hi Suchandra, just a quick note that everything is going extremely well with my computations on the OSG. I start 10K jobs one day and the next day they are done. Fantastic! I am probably more than 50% done by now, and I am trying hard to complete before my next conference in Germany (in less than two weeks). If I can report that the classification is finished and confirm the published result, that would be great. Thanks, Anton

Anton Betten, Ph.D.
Associate Professor
Department of Mathematics
Colorado State University
Fort Collins, CO 80523-1874, U.S.A.

“My recommendation for other scientists looking into using the OSG is don’t be intimidated, find someone to work with, and don’t hesitate to reach out… The competence and concern of the staff, and their ability to communicate with non-computer scientists are a huge help. It seems like OSG really has a service mentality.” Patrick Reeves, The National Center for Genetic Resources Preservation (NCGRP), U.S. Department of Agriculture

http://osgconnect.net/

Communities or campuses, as a service
- CI Connect: services for campus grids (Duke, UChicago)
- CMS Connect (users from 50 schools), led by Notre Dame

Summary and Conclusions
- For researchers: OSG Connect provides a quick “on-ramp” to resources in the Open Science Grid
- For campus providers: OSG Connect offers a lightweight service to connect campus HPC centers to the OSG; a hosted “Compute Element” is available for larger sites
- For communities & campus grids: CI Connect makes it easy to share resources and collaborate

opensciencegrid @rwg