Use of Condor on the Open Science Grid Chris Green, OSG User Group / FNAL Condor Week, April 30 2008.

Presentation transcript:

Use of Condor on the Open Science Grid. Chris Green, OSG User Group / FNAL. Condor Week, April 30, 2008.

April 30, 2008, Condor Week. Chris Green, OSG User Group / FNAL.

Slide 1: What is OSG?
Links: OSG home page; VORS resource map and information; VDT (Virtual Data Toolkit) home page; current use of OSG.
- A collection of mostly US-based scientific / academic sites sharing computing and storage resources via a common software stack.
- "Virtual Organizations" (VOs): trust point for authorization; role-based personalities.
- Works with multiple underlying batch systems (Condor, the PBS family, LSF, SGE).
- Job submission and management based around Globus / Condor-G.

Slide 2: OSG facts and figures
- 83 registered computing resources.
- 30 registered VOs.
- Usage breakdown shown for the week 2008/04/19 – 2008/04/25.

Slide 3: Survey of Condor use on OSG, out of the box:
- Condor-G for inter-site job transfer via Globus/GRAM: GT2 submission via Condor-G is still (by far) the most common method of grid job submission on OSG.
- Task scheduling for site health monitoring.
- One of several batch systems supported on OSG.
- "ManagedFork" job management.
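The GT2-via-Condor-G path described above can be sketched as a grid-universe submit description file. This is a minimal illustrative sketch, not taken from the talk; the gatekeeper hostname, jobmanager name, and file names are all placeholder assumptions:

```
# Hedged sketch of a Condor-G grid-universe submit file targeting a
# GT2/GRAM gatekeeper. "gatekeeper.example.edu" and the jobmanager
# name are illustrative placeholders, not real OSG endpoints.
universe            = grid
grid_resource       = gt2 gatekeeper.example.edu/jobmanager-condor
executable          = analysis.sh
transfer_executable = true
output              = job.out
error               = job.err
log                 = job.log
queue
```

Submitting this with condor_submit hands the job to Condor-G, which manages the remote GRAM submission and tracks the job in the local queue like any other Condor job.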

Slide 4: Survey of Condor use on OSG, external projects:
- Glidein / WMS: "pilot" job submission and management.
- FermiGrid: job forwarding and "campus grid" management.
- OSGMM / ReSS: job forwarding and attribute-based matchmaking across multiple OSG sites.
- "condorview": enhanced job monitoring and control (not the web-based statistics client of the same name).
- Complex workflows (e.g. LIGO: Pegasus/DAGMan).
- Gratia: the accounting system leverages Condor features where available: condor_history, PER_JOB_HISTORY_DIR, the user DN.
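The Condor hooks that Gratia leans on are ordinary configuration knobs. A minimal condor_config sketch of the per-job history feature mentioned above; the directory path is an assumed example, not a documented OSG default:

```
# condor_config fragment: in addition to the rotating history file read by
# condor_history, write one ClassAd file per completed job into a directory
# that an accounting probe (such as Gratia's) can consume and delete.
# The path below is an illustrative choice.
PER_JOB_HISTORY_DIR = /var/lib/gratia/data
```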

Slide 5: More detail: Glidein/WMS
The Workload Management System (Igor Sfiligoi, FNAL) uses Condor glideins: a startd submitted as a grid job (a "pilot") makes remote batch nodes look like local ones. Two main components:
- One or more glidein factories: manage the available grid sites and submit pilot jobs.
- One or more VO frontends: receive payload submissions from users for distribution to sites.
Pilots receive user payloads as distributed by the VO frontends.
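The glidein idea, a startd submitted as a grid job, can itself be sketched as another grid-universe submit file. This stands in for what a glidein factory generates; the gatekeeper, wrapper script name, and collector address are illustrative assumptions:

```
# Hedged sketch of a glidein "pilot" submission. The grid job's payload is
# a wrapper script that configures and launches a condor_startd on the
# remote worker node; that startd then reports to the VO's collector and
# accepts user payloads as if it were a local execute node.
# Hostnames and the script name are placeholders.
universe      = grid
grid_resource = gt2 gatekeeper.example.edu/jobmanager-pbs
executable    = glidein_startup.sh
arguments     = -collector vo-frontend.example.edu:9618
output        = pilot.out
error         = pilot.err
log           = pilot.log
queue
```

The design choice this illustrates: because the pilot, not the user job, is what crosses the site boundary, the user's payload later runs under a uniform Condor environment regardless of the site's own batch system.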

Slide 6: More detail: Glidein/WMS (architecture diagram).

Slide 7: More detail: Glidein/WMS
- Uses GCB for firewall / NAT management.
- Intra-VO priority management.
- Works with glExec, an application running on the worker nodes that handles authorization and UID mapping for payloads, giving per-user accountability to the site.
- Unaffected by the grid site's choice of batch manager.
- v1.0 released Dec. 2007; v1.1 released Jan. 2008.
- In use by CDF and MINOS (FNAL); being commissioned for CMS.

Slide 8: More detail: "condorview" (Michael Thomas, Caltech)
- Graphical tool for browsing and managing a Condor queue.
- Hooks to vacate and kill jobs.
- Hooks to ssh into the job directory on a worker node and print out the process tree.
- Uses condor_q, condor_config_val, and condor_fetchlog.
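The utilities this slide lists are standard Condor command-line tools. A sketch of the kinds of calls such a GUI might wrap; the job id, hostname, and which exact commands the tool runs are illustrative assumptions, not taken from the talk:

```
# Illustrative command-line equivalents of a queue-browsing GUI's actions.
condor_q -long 1234.0               # dump the full ClassAd of one job (example id)
condor_config_val SCHEDD_LOG        # look up a daemon's configured log location
condor_fetchlog somehost STARTD     # fetch a remote daemon's log over the wire
condor_vacate_job 1234.0            # checkpoint/evict a running job
condor_rm 1234.0                    # remove ("kill") a job from the queue
```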

Slide 9: More detail: condorview (screenshot).

Slide 10: More detail: condorview (screenshot).

Slide 11: Concluding statements
- Condor is essential to the OSG; Condor use underpins the connectivity of sites within the OSG.
- Close ties: Miron is an OSG PI; the VDT team is at Wisconsin; new Condor features are often a result of OSG needs.
- Widely used on OSG, with many novel uses of, and applications built on, Condor features.
- More details in later talks!