LIGO Data Analysis System (LDAS)
PAC 11 Meeting, California Institute of Technology
November 29, 2001
James Kent Blackburn, LIGO Laboratory, Caltech
LIGO-G E


Three remaining modules under active development:
– dataConditionAPI: first module to be tested in an MDC; used extensively in later MDCs and Engineering Runs. Infrastructure is nearly complete; only support for state information and cleanup of metadata remain. Plenty of signal-processing functionality is still to be implemented.
– diskCacheAPI: coupled to the selection of archival data-storage technology(ies) to be used. Basic functions needed in support of the MDCs and Engineering Runs are temporarily implemented in the frameAPI! We anticipate removing them from the frameAPI later this year. The first “beta” release of LDAS will follow once this API is implemented.
– dataIngestionAPI: basic functions were needed with the beginning of the Engineering Runs; scripts external to LDAS were developed to carry out these functions. The script recording data to tape at the sites can be controlled and monitored using LDAS’s controlMonitorAPI.
Current release of LDAS: LDAS (alpha)

Code Development Trends (more LSC support & testing!)
[Chart: new-code rate over time; 41% decrease in the new-code rate since MDC1]

Problem Tracking
[Chart: problem-report tracking statistics (93%)]

LDAS systems up and running around the world!
LIGO Laboratory:
– Caltech: development system, test system, main system soon!
– Hanford
– Livingston
– MIT
LSC Institutions:
– University of Wisconsin–Milwaukee †
– Penn State University †
– University of Texas at Brownsville †
– Australian National University †
– University of Florida †
LDAS staff work closely with others setting up LDAS systems.
† Sites with LDAS databases (IBM’s DB2)

The system web interface provides a window onto all sites.

Participation with the LSC
– Weekly dataConditioning group meetings (CIT, ANU, PSU, UTB, UF, MIT)
– Weekly Message Passing Interface (MPI) group meetings (CIT, UWM, UTB, LHO)
– Participate in LSC Software Coordination meetings (AL, KB, SA)
– LDAS staff active in LSC Upper Limits search groups (AL, KB, SA, PS, PC, LW, MB, PE)
– Participate in Upper Limits chairs meetings (AL, KB, SA, PS)
– Participate in the Detector Characterization group (AL, PS, KB)
– Also active in non-LSC-specific areas (gravitational-wave network analysis, GriPhyN, and LIGO’s own CDS/GDS)
– Joint LSC/LDAS tutorial on developing analysis codes for the Diagnostic Monitoring Tool, wrapperAPI, and dataConditionAPI last June at Caltech

Participation in Engineering Runs
– E1 (ldas-0_0_10): supported ingestion of triggers from the Hanford Diagnostic Monitor Tool (DMT) using LDAS commands.
– E2 (ldas-0_0_12): supported ingestion of triggers (~7×10^5) from the LHO DMT and archiving of frames to tape.
– E3 (ldas-0_0_15): supported ingestion of triggers from the LHO and LLO DMTs and archiving of frames to tape. The E3 release also allowed frame channel data to be analyzed using the dataConditionAPI’s signal-processing tools and stored in the database: power spectral densities & cross spectral densities.
– E4 (ldas-0_0_16): supported ingestion of triggers from DMTs and archiving of frames to tape.
– E5 (ldas-0_0_19): supported ingestion of triggers from DMTs and archiving of frames to tape.
– E6 (ldas-0_0_22): supported ingestion of triggers from DMTs and archiving of frames to tape. An LDAS/LSC driver script polled the database for locked segments, then submitted dataPipeline jobs to conduct an unmodeled burst search: 95 segments, each 180 seconds long, were analyzed to produce burst events in the database!

Cross spectral densities generated online with LDAS in the E3 run
[Plot: coherence between the single-arm length signal & the difference between the end- and vertex-station seismic channels]
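The spectral quantities on this slide (PSDs, CSDs, and the coherence derived from them) can be sketched with standard tools. The channel construction, sample rate, and segment length below are illustrative stand-ins, not LDAS API calls:

```python
import numpy as np
from scipy import signal

fs = 2048.0                                 # assumed sample rate, Hz
t = np.arange(0, 16, 1.0 / fs)              # 16 s of data
rng = np.random.default_rng(0)

# Two synthetic channels sharing a common (seismic-like) disturbance
# plus independent sensor noise in each channel.
common = rng.normal(size=t.size)
arm = common + 0.5 * rng.normal(size=t.size)
seis = common + 0.5 * rng.normal(size=t.size)

# Welch power spectral densities and the cross spectral density
f, Pxx = signal.welch(arm, fs=fs, nperseg=1024)
f, Pyy = signal.welch(seis, fs=fs, nperseg=1024)
f, Pxy = signal.csd(arm, seis, fs=fs, nperseg=1024)

# Magnitude-squared coherence: |Pxy|^2 / (Pxx * Pyy), bounded in [0, 1];
# values near 1 indicate the two channels share a common source.
coh = np.abs(Pxy) ** 2 / (Pxx * Pyy)
```

A coherence well above the noise floor across a frequency band is what the E3 plot exhibited between the arm-length and differenced seismic channels.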

Participation in MDCs (mock data challenges)
– dataConditionAPI MDC1: August 2000, release ldas-0_0_10. Tested the managerAPI & dataConditionAPI.
– Message Passing Interface MDC2: December 2000, release ldas-0_0_12. Tested the managerAPI, mpiAPI, wrapperAPI, & the LSC’s LAL and LALwrapper.
– Database MDC3: January 2001, releases ldas-0_0_12 through ldas-0_0_15. Tested the managerAPI, metaDataAPI, lightWeightAPI, & database table design.
– Inspiral MDC4: May 2001, release ldas-0_0_17. Tested the managerAPI, frameAPI, dataConditionAPI, mpiAPI, wrapperAPI, eventMonitorAPI, metaDataAPI, & the LSC’s inspiral dynamically loaded shared object (dso) search codes. Data products were inserted into the database and into ilwd data objects.
– Joint burst/stochastic MDC5: September 2001, release ldas-0_0_20. Tested the managerAPI, frameAPI, dataConditionAPI, mpiAPI, wrapperAPI, eventMonitorAPI, metaDataAPI, lightWeightAPI, & the LSC’s burst and stochastic dso search codes. Data products were inserted into the database, frame files, and ligo_lw (xml) data objects.
– Periodic MDC6: November 2001, release ldas-0_0_22. Tested the managerAPI, frameAPI, dataConditionAPI, mpiAPI, wrapperAPI, eventMonitorAPI, metaDataAPI, lightWeightAPI, & the LSC’s periodic-source dso search codes. Data products were inserted into the database, frame files, and ligo_lw (xml) data objects.

MDC influence on problem reports
[Chart: problem-report timeline annotated with MDC1–MDC6 and Engineering Runs ER1–ER6]

LIGO and GriPhyN
– LDAS is working closely with the GriPhyN collaboration on LIGO-related components.
– The current prototype combines several LDAS systems, using Globus components to carry forward user requests. The first prototype was demonstrated at the SC 2001 conference.
  – For each request for data, it determines whether the data already exist and, if so, where; if not, how to produce them using LDAS.
  – Data are moved between the LDAS systems at UWM and CIT to carry out a request if computation is needed.
– The next phase of the prototype will integrate components from continuous-wave (periodic-source) searches. Data products will involve the common “Short Fourier Transform” (SFT) frames produced by search codes running on the LDAS Beowulf clusters.
– Security of LDAS systems and data is assured by Globus proxy credentials and GridFTP over secure socket connections.
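The request-handling logic described above (reuse an existing replica when one exists, otherwise compute the product and register it) can be sketched in plain Python. The class, function, and file names here are hypothetical illustrations, not the actual Globus or LDAS interfaces:

```python
class ReplicaCatalog:
    """Hypothetical stand-in for a grid replica catalog: it maps a
    logical file name (lfn) to the set of sites holding a copy."""

    def __init__(self):
        self._replicas = {}                     # lfn -> set of site names

    def register(self, lfn, site):
        self._replicas.setdefault(lfn, set()).add(site)

    def locate(self, lfn):
        return sorted(self._replicas.get(lfn, set()))


def handle_request(catalog, lfn, local_site, compute):
    """Return (site, source): serve from an existing replica if possible,
    otherwise run the computation (e.g. an LDAS dataPipeline job) locally
    and register the new data product in the catalog."""
    sites = catalog.locate(lfn)
    if sites:
        # Prefer a local copy; otherwise transfer from the first holder.
        site = local_site if local_site in sites else sites[0]
        return site, "replica"
    compute(lfn)
    catalog.register(lfn, local_site)
    return local_site, "computed"
```

In the SC 2001 prototype the equivalent decision determined whether a request could be satisfied from existing data at UWM or CIT, or required a new LDAS computation.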

Open Issues
– GriPhyN activities are starting to ramp up across the collaboration (and across the ocean): Caltech, UWM, PSU, UTB, GEO (UK, Germany); VIRGO, others.
– There are different potentials for how GriPhyN evolves from demos to real infrastructure:
  – extensions of LDAS;
  – Home model (a possibility for periodic searches);
  – GEO model (Java-based TRIANA);
  – Grand Challenge model (CACTUS).
– No coherent body in the LSC provides guidance on how to proceed. We want to prevent redundant/conflicting efforts, given that we have so few people to work on infrastructure.

Open Issues (continued)
– In LIGO the tier capacity ratio is close to [1 : 1 : 1], unlike the HEP model, where it is close to [1 : 1/N_Tier2 : 1/N_Tier3].
  – How Tier 2 centers evolve actually impacts the LIGO Lab significantly!
  – Tier 2 centers are hardware-heavy and people-light.
– Unforeseen (in ) was the propagation of Lab LDAS systems to collaboration sites; this has pluses and minuses:
  (+) user experience/expertise is diffusing quickly;
  (+) feedback: faster problem identification and resolution;
  (−) the Lab support function has grown greater than anticipated;
  (−) it is becoming progressively more difficult to update LDAS/LAL releases due to inertia and lack of people.
– Tier 2 centers either will not or cannot adopt a pure LDAS hardware model; a heterogeneous environment complicates maintenance of LDAS/LAL releases.

Open Issues (unavoidable tension)
– 20 terabytes of archival data at Caltech already!
  – The collaboration is starting to implement analysis capabilities that will demand data transmission/movement.
  – This pressures certain LDAS functionality (scope) to be added against the original timeline.
– How data are reduced as they filter down the tier levels is still undefined by the collaboration; worst case => everything everywhere.
– Individuality vs. conformity of Tier 2 design/functionality: identify a subset of each Tier 2 that mirrors the LIGO configuration of LDAS; the fraction/partition can be changed according to load.
– Bandwidth limits the model of all data everywhere: data reduction needs to waterfall through the tier hierarchy; logarithmic data reduction is required for optimization.
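The waterfall-style reduction through the tier hierarchy can be illustrated with successive anti-aliased decimation of a channel. The sample rates and reduction factors below are illustrative only, not the collaboration's actual reduced-data-set definition:

```python
import numpy as np
from scipy import signal

def reduce_for_tier(x, factor):
    """Downsample a channel by an integer factor with an anti-alias
    FIR filter (illustrative of reduced-data-set decimation)."""
    return signal.decimate(x, factor, ftype="fir", zero_phase=True)

fs = 16384                              # full-rate channel, Hz
t = np.arange(fs) / fs                  # one second of data
x = np.sin(2 * np.pi * 10.0 * t)        # a 10 Hz line, safely below the
                                        # Nyquist rate of every reduced set

tier1 = reduce_for_tier(x, 4)           # 16384 Hz -> 4096 Hz
tier2 = reduce_for_tier(tier1, 4)       # 4096 Hz -> 1024 Hz
```

Each factor-of-4 step cuts the data volume by 4× while preserving the low-frequency content, which is the sense in which roughly logarithmic reduction across tiers keeps the bandwidth requirement manageable.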