LIGO Plans for OSG
J. Kent Blackburn
LIGO Laboratory, California Institute of Technology
Open Science Grid Technical Meeting, UCSD, December 15-17, 2004

LSC Data Grid
The LIGO Scientific Collaboration's Data Grid:
- Nine clusters (CIT, MIT, LHO, LLO, UWM, PSU, AEI, ISI, Birmingham)
- Close to 2,000 CPUs within the LSC
- Condor is the primary tool (see the submit-file sketch below)
- LDAS used for data reduction, database, and other analyses
- Learn more at …
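Since Condor is the primary scheduler on these clusters, most LSC jobs enter the grid as ordinary Condor submit description files. A minimal sketch of one such file follows; the executable name, arguments, and file names are hypothetical placeholders, not an actual LSC analysis configuration.

    # Illustrative Condor submit description file (vanilla universe).
    # Executable, arguments, and file names are placeholders.
    universe   = vanilla
    executable = inspiral_search
    arguments  = --config inspiral.ini
    output     = inspiral.out
    error      = inspiral.err
    log        = inspiral.log
    queue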

Issues with the LSC Data Grid
- Many LIGO analysis groups take a local approach to using the LSC Data Grid
- Concrete "DAG" workflows have been the workhorse, targeting specific sites (see the sketch below)
- This culture developed out of the particulars of the analysis methods and the nature of the compute resources, e.g.:
  - Where is the data I need most likely to be?
  - Where are the compute nodes the fastest?
- Scientific results from the inspiral & pulsar analyses are limited by available compute resources
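For context, a concrete DAGMan workflow is just a .dag file that names pre-written Condor submit files and their dependencies; because those submit files are tailored to one cluster, the whole workflow is bound to that site. A minimal sketch, with hypothetical job and file names:

    # Each .sub file is an ordinary Condor submit description file
    # already tailored to a specific LSC cluster.
    JOB  DATAFIND  datafind.sub
    JOB  INSPIRAL  inspiral.sub
    JOB  TRIGBANK  trigbank.sub
    PARENT DATAFIND CHILD INSPIRAL
    PARENT INSPIRAL CHILD TRIGBANK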

Supercomputing 2004
- Deployed an "inspiral" analysis across the LSC Data Grid
- Used Pegasus to "plan execution" across this distributed grid
- First use of an abstract "DAX" in a LIGO analysis (see the fragment below)
- Included use of the LSU cluster
- Considered very successful by the LSC
- Encountered transfer-client timeouts due to the large number of connections to any single GridFTP server; a solution is currently under development by the Pegasus team
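By contrast with a concrete DAG, the abstract DAX that Pegasus plans from describes the workflow in terms of logical transformations and logical files only, leaving site selection and physical file locations to the planner. The fragment below is a rough sketch of that idea; the namespace, job names, file names, and attribute details are invented for illustration and follow only the general shape of the VDS-era DAX schema.

    <adag name="inspiral-workflow">
      <job id="ID000001" namespace="ligo" name="datafind" version="1.0">
        <uses file="segments.txt" link="input"/>
        <uses file="frames.cache" link="output"/>
      </job>
      <job id="ID000002" namespace="ligo" name="inspiral" version="1.0">
        <uses file="frames.cache" link="input"/>
        <uses file="triggers.xml" link="output"/>
      </job>
      <!-- ID000002 runs only after ID000001 completes -->
      <child ref="ID000002">
        <parent ref="ID000001"/>
      </child>
    </adag>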

Going Beyond SC04
- SC04 demonstrated that non-localized use of the LSC Data Grid by LSC analysis groups is possible!
- Pegasus will soon efficiently support LIGO's dataset challenges through bundled transfer support on a single connection
- In January, a workshop on Pegasus is planned for the LSC to bootstrap other analysis groups on using "DAX" workflows on a distributed grid

Migration to Grid3
- The January workshop will also include a tutorial on using Grid3
- The goal is to carry out the inspiral analysis utilizing Grid3 when possible
- Hope to deploy the stochastic analysis across the LSC Data Grid and onto Grid3 as well
- LIGO plans to build up in-house technical expertise for running on Grid3 (see the example below)
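One low-cost way to start building that expertise is hands-on submission through the Globus gatekeepers that front Grid3 sites. A first sanity check might look like the following; the gatekeeper hostname is a made-up example, not a real Grid3 site.

    # Obtain a short-lived grid proxy from the user's certificate,
    # then run a trivial job at a (hypothetical) Grid3 gatekeeper.
    grid-proxy-init
    globus-job-run gatekeeper.example.edu/jobmanager-condor /bin/hostname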

On to OSG
- Based on experience running on Grid3 in late winter 2005, the plan is to migrate the inspiral analysis, and the stochastic analysis if available, onto OSG0 once it is up and available in the spring