Summary of OSG Activities by LIGO and LSC
LIGO NSF Review, November 9-11, 2005
Kent Blackburn, LIGO Laboratory, California Institute of Technology
LIGO DCC: G E

OSG Organization
– Albert Lazzarini currently sits on the Executive Board
– Kent Blackburn currently sits on the OSG Council
– Kent Blackburn is LIGO liaison to OSG and participates in many OSG activity groups
– PSU, UWM, and Caltech also providing effort to support OSG site administration and application development

OSG Production Grid
– Evolved out of Grid3
– More than 50 sites on the OSG Production Grid
  – UWM & PSU from LIGO (~10% of these); LDG compute cycles made available
– Over CPUs
– Used to support VO production runs for simulations and data analysis
– To date, only one release of the software stack for use on production sites

OSG Integration Testbed
– Roughly 30 sites currently in the OSG Integration Testbed
  – Caltech from the LSC
– Almost 5000 CPUs (some shared with production sites)
– Used as a test-bed to validate new OSG software stacks and develop new VO applications

LIGO’s ITB Cluster
– Located on the 6th floor of Millikan Library
– Eight dual-CPU nodes
– 1.6 TB of disk space
– Used to develop LIGO applications for the OSG Production Grid
– Used to test new releases of the OSG software stack (and propose future changes supporting LIGO’s OSG application development)

LIGO Applications on OSG
– Targeted binary inspiral searches
– Effort began in preparation for the July 2005 OSG Consortium meeting in Milwaukee
– On October 1st, succeeded in running the binary inspiral application on LIGO’s OSG Integration Testbed cluster using a test DAG of ~800 DAG nodes
– Two weeks later, succeeded in running on LIGO’s OSG production sites
– Have now run the binary inspiral application at three CMS OSG production sites with mixed results:
  – 1.25 to 19 hours to transfer 38 GB of LIGO data
  – 0.73 to 3.98 hours to compute
  – Requires customization of the OSG software stack beyond the current production “cache”
– Running jobs to fill queues for Supercomputing 2005
– Plan to run binary inspiral DAGs with 16,000 DAG nodes on OSG in the near future (a minimal DAG sketch follows below)
  – Need to overcome data delivery and storage issues on non-LIGO sites
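The inspiral workflows referred to above are Condor DAGMan DAGs in which many analysis jobs depend on data first being staged to the remote site. The Python snippet below is a minimal sketch of that structure only, not LIGO’s actual pipeline code; the file names, submit descriptions, and job count are hypothetical.

```python
# Hedged sketch: write a toy Condor DAGMan input file with the fan-out
# structure described above. All names and counts are hypothetical.

N_ANALYSIS_JOBS = 8  # the real searches used ~800 to 16,000 DAG nodes

def write_dag(path="inspiral_toy.dag"):
    with open(path, "w") as dag:
        # One job that stages LIGO frame data to the remote OSG site.
        dag.write("JOB stage_data stage_data.sub\n")
        for i in range(N_ANALYSIS_JOBS):
            job = f"inspiral_{i:04d}"
            # Each analysis job reuses the same submit description,
            # parameterized by a per-job segment variable.
            dag.write(f"JOB {job} inspiral.sub\n")
            dag.write(f'VARS {job} segment="{i}"\n')
            # Analysis may only start once the data have been staged.
            dag.write(f"PARENT stage_data CHILD {job}\n")

if __name__ == "__main__":
    write_dag()
    print("Wrote inspiral_toy.dag; submit with condor_submit_dag inspiral_toy.dag")
```

The staging-then-analysis dependency is the point of the sketch: it is why the transfer times quoted above can mask the per-job compute times.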

Virtual Organization
– PSU hosts LIGO’s OSG VOMS server
  – Until this July, LIGO was under the iVDGL VO’s VOMS
– Approximately 100 LIGO certificates registered (a hedged proxy example follows below)
– Two production sites:
  – UW-Milwaukee
  – PSU
– One integration testbed site:
  – Caltech
  – Some interest in having a second site
– Currently only 3 non-LIGO OSG production sites recognize LIGO’s VO for running on their sites
  – Pitfall: new VOMS information is tied to OSG releases
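For context, a user’s membership in the LIGO VO is asserted through a short-lived VOMS proxy issued against the VOMS server described above. The snippet below is a hedged illustration using the standard VOMS client commands wrapped in Python; the VO name "ligo" and the proxy lifetime are assumptions, not a documented LIGO procedure.

```python
# Hedged illustration: obtain and inspect a VOMS proxy for the LIGO VO.
# Assumes the VOMS client tools are installed and a grid certificate is
# registered with the VO; the VO name "ligo" is an assumption.
import subprocess

def get_ligo_proxy(vo="ligo", hours=24):
    # Create a proxy credential carrying the VO's attributes.
    subprocess.run(
        ["voms-proxy-init", "-voms", vo, "-valid", f"{hours}:00"],
        check=True,
    )
    # Show the proxy's subject, attributes, and remaining lifetime.
    subprocess.run(["voms-proxy-info", "-all"], check=True)

if __name__ == "__main__":
    get_ligo_proxy()
```

Sites that have not imported the LIGO VO’s VOMS information will still reject such a proxy, which is the coupling to OSG releases noted above.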

OSG Challenges
– Ease of porting LIGO data analysis applications from the LDG to OSG: the goal is for this to be transparent
– Efficiency of utilization of other OSG VO sites
  – The LDG benefits tremendously from co-locating data and computing (a data-staging sketch follows below)
  – Additional compute cycles from OSG come coupled to the inefficiencies other VOs face in utilizing such cycles
– More frequent releases of the software stack to better support VO and application needs
– Down-the-road issues:
  – Funding
  – Reorganization
  – Storage Management
  – Interoperability with TeraGrid and EGEE
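The co-location point above is where the transfer times on the earlier slide come from: at non-LIGO sites the frame data must first be pulled over the wide-area network before any computing starts. As a rough sketch only, the snippet below stages a frame file to a remote site’s storage with GridFTP; the host names, paths, and file name are invented for illustration, and a valid grid proxy is assumed.

```python
# Hedged sketch: stage a LIGO frame file to a remote OSG site via GridFTP.
# Host names, paths, and the file name are hypothetical.
import subprocess

SRC = "gsiftp://ligo-data.example.edu/frames/H-example-873700000-32.gwf"
DST = "gsiftp://osg-site.example.edu/scratch/ligo/"

def stage_frame(src=SRC, dst=DST, streams=4):
    # -p requests parallel TCP streams, which helps on high-latency links.
    subprocess.run(["globus-url-copy", "-p", str(streams), src, dst], check=True)

if __name__ == "__main__":
    stage_frame()
```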

OSG Summary
– LIGO and the LSC have contributed significantly to the OSG process, providing personnel at all levels, from oversight to administrators, developers, and applications
  – LIGO and the LSC plan to continue to contribute
  – The LIGO Data Grid is critical to LIGO data analysis; the migration and evolution of both the LDG and OSG will determine the degree of polymorphism
– OSG has only released one production software stack
  – LIGO’s VOMS server did not exist at the time of this release, so most OSG sites are not LIGO-aware, with VO information so strongly coupled to the software releases
  – LIGO has worked with individual production sites to get the LIGO VO supported in the interim so that LIGO applications can be developed and tested on the OSG
– LIGO has now developed an application based on the binary inspiral search code for running on the OSG
  – Requires patches beyond the official OSG software stack
  – Data transmission times significantly mask the CPU benefits for the LIGO application